Review

Autonomous Driving in Agricultural Machinery: Advancing the Frontier of Precision Agriculture

Automotive Engineering Research Institute, Jiangsu University, Zhenjiang 212013, China
* Author to whom correspondence should be addressed.
Actuators 2025, 14(9), 464; https://doi.org/10.3390/act14090464
Submission received: 26 July 2025 / Revised: 11 September 2025 / Accepted: 15 September 2025 / Published: 22 September 2025

Abstract

Increasing global food production to address challenges from population growth, labor shortages, and climate change necessitates a significant enhancement of agricultural sustainability. Autonomous agricultural machinery, a recognized application of precision agriculture, offers a promising solution to boost productivity, resource efficiency, and environmental sustainability. This study presents a systematic review of autonomous driving technologies for agricultural machinery based on 506 rigorously selected publications. The review emphasizes three core aspects: navigation reliability assurance, motion control mechanisms for both vehicles and implements, and actuator fault-tolerance strategies in complex agricultural environments. Applications in farmland, orchards, and livestock farming demonstrate substantial potential. This study also discusses current challenges and future development trends. It aims to provide a reference and technical guidance for the engineering implementation of intelligent agricultural machinery and to support sustainable agricultural transformation.

1. Introduction

Agriculture plays a fundamental role in global food supply, economic stability, and sustainable development. However, it is projected that the world’s population will reach 10 billion by 2050, necessitating a substantial increase in agricultural productivity to meet rising food demands [1]. Throughout history, advancements in agricultural productivity have primarily resulted from structural changes and have undergone four major transformations to date. The technological features characterizing each stage of agricultural development are illustrated in Figure 1. Among these, traditional cultivation methods from ancient times to the late 19th century are commonly classified as Agriculture 1.0. During this period, agriculture relied heavily on localized tools and intensive manual labor, resulting in a characteristically labor-intensive production model. Although this method has low production efficiency, a standardized workflow for agricultural operations has been established, thereby providing a logical foundation for subsequent operations involving agricultural robots.
During the First Industrial Revolution, with the widespread adoption of power machinery, agricultural production advanced to the 2.0 stage in the early 20th century. The combined agricultural tool system, driven by steam engines and internal combustion engines, began replacing traditional manual tools, which significantly increased the grain output per unit of labor. This transformation laid the groundwork for the autonomy of agricultural robots in terms of power and mechanical structure, and it also directly promoted the development of mechanical guidance devices. Patent records indicate that mechanical guiding devices based on furrow trajectory recognition were already being developed during this period. By the late 1930s, facilitated by advances in electromechanical control systems, the first complete circular farming system was deployed. This system utilized piano wire deployed from a central spool to construct an accurate mechanical path planning network, thereby providing a foundational engineering practice for the autonomous driving of agricultural machinery [2].
From the 1960s to the 1990s, advances in embedded systems, software development, and communication technologies catalyzed the automation of agricultural production systems. This landmark technological shift has been recognized by scholars as the Third Agricultural Revolution (Agriculture 3.0). This marked a critical breakthrough period for the autonomy of agricultural robots. The core contribution during this period was defined by breakthroughs in positioning and sensing technologies. Specifically, the engineering deployment of the Global Positioning System (GPS) provided the necessary positioning support for the autonomous navigation of agricultural machinery, thereby enabling precise positioning for large-scale field operations. By the 1980s, the integration of computers and image sensors culminated in the development of early field robots with visual-based navigation systems [3,4]. As GPS and computer vision technologies continued to mature, the positioning accuracy of autonomous driving agricultural machinery reached centimeter-level precision and has been widely adopted in agricultural operations including sowing, fertilizing, and harvesting. This evolution led to the emergence of automated systems such as tractor navigation platforms, unmanned orchard spraying technologies, and crop harvesting guidance systems [5]. These devices are capable of continuous and high-precision operation, which marked a shift from single-function machinery to autonomous operation systems for agricultural robots.
With the rapid advancements in artificial intelligence and information technology in the 21st century, the emergence of the Fourth Industrial Revolution has completely transformed the form of agricultural activities. Agricultural production has entered the 4.0 era, often referred to as the digital agriculture era. During this stage, agricultural robots have been endowed with the core capabilities of intelligence and autonomy [6,7]. In this context, the acceleration of population growth and urbanization has led to a sustained increase in global food demand, while environmental challenges such as climate change, farmland degradation, and water scarcity have further constrained agricultural development potential [8,9]. Additionally, the agricultural sector is experiencing a shortage of labor and an aging workforce, partly because younger generations are increasingly disinterested in agricultural work. Compared with office employment, agricultural work is more physically demanding and typically offers lower income. Shifting societal and cultural values have further diminished the attractiveness of agricultural careers [10,11]. These factors collectively exert a negative impact on agricultural productivity and output. Therefore, it is imperative to develop innovative and sustainable strategies to boost agricultural productivity and capacity.
To address this challenge, precision agriculture (PA) has been introduced as a transformative management approach. It is defined as “a management strategy that collects, processes, and analyzes temporal, spatial, and individual plant and animal data, and integrates this information with other sources to support management decisions based on estimated variability, thereby enhancing resource use efficiency, productivity, quality, profitability, and the sustainability of agricultural production” [12]. As early as the 1990s, PA emerged in parallel with the development of GPS, GIS, and various data collection tools. Pierre C. Robert described it as an information revolution driven by new technologies [13]. PA facilitates the accurate collection of data related to crop conditions, weather, soil, and the environment through the use of GPS, sensors, data processing systems, and automation, thereby enabling reliable, data-driven decision-making. Autonomous driving agricultural machinery—comprising unmanned aerial vehicles, unmanned ground vehicles, and autonomous navigation robots—represents one of the most recent and practical advancements within PA. These systems integrate core technologies including positioning, perception, control, and actuation. They are capable of autonomous navigation across diverse agricultural settings—including farmlands, orchards, and livestock facilities—and can perform a wide range of precise operational tasks such as sowing, fertilizing, weeding, pest control, harvesting, livestock feeding, environmental monitoring, and facility cleaning. This capability allows them to address numerous operational gaps that traditional machinery cannot fill, thereby substituting human labor in large-scale and repetitive agricultural tasks [14,15,16]. In the long run, the advantages of autonomous agricultural robots in sustainability and economy render them highly valuable for future agricultural development [17], and this assertion has been empirically verified. From a cost perspective, Sørensen et al. conducted an economic study on mechanical weed-removing robots, reporting that when the weed-removing efficiency of organic agricultural machinery robots reached 100%, labor could be reduced by 85% and 60% in organic sugar beet and carrot cultivation, respectively. Furthermore, an efficiency level of 75% was projected to reduce labor costs by 50% [18]. Moreover, experimental results from Pedersen et al. indicated that the utilization of autonomous spraying robots is projected to reduce operating costs by 24% [19]. Regarding operational efficiency, Hussain et al. proposed that a 20 W laser weed-removing robot requires 23.7 h (excluding charging) or 35.7 h (including charging) to process a 1-acre plot, whereas traditional manual labor typically spans several days [20]. Research by Al-Amin et al. indicates that autonomous agricultural cluster robots can operate for up to 22 h per day (including 2 h of maintenance) compared to traditional machinery which typically operates for only 10 h [21]. In terms of environmental benefits, agricultural autonomous robots not only mitigate the reliance on pesticides but also minimize soil compaction. For instance, the SprayBox crop protection robot in the United States, which integrates 50 nozzles and a complex computer system, can achieve millimeter-level precision for weed removal. This results in an approximate 95% reduction in chemical herbicides compared to traditional spraying technology [22]. 
Similarly, the SwarmBot autonomous fertilizing robot, developed by an Australian team, utilizes machine learning algorithms to acquire crop physiological data, enabling precise fertilizer application as needed and thereby preventing environmental pollution caused by excessive fertilizer application [23]. Additionally, the 28 kW small autonomous agricultural equipment employed by Al-Amin et al. is considerably lighter than traditional 221 kW large machinery, which reduces soil compaction. Furthermore, the combination of strip intercropping and robot operation modes can further safeguard the biodiversity and soil health of the agricultural ecosystem [24].
At present, although numerous scholarly articles address the key technologies and application scenarios of specific categories of autonomous driving agricultural machinery, several limitations in research perspectives remain. These studies often fail to deeply examine the distinctive characteristics of autonomous driving agricultural machinery compared to other autonomous systems, lack comparative analyses of core technologies such as navigation control and machine operation control, and seldom address coping strategies for frequent actuator failures under complex agricultural conditions. To address these gaps, this paper conducts a systematic review of the evolution and practical application of key technologies in the domain of autonomous driving agricultural machinery and further investigates the unique challenges specific to this field. The structural framework of the study is presented in Table 1. The main contributions of this paper are as follows:
(1) A novel systematic framework for key technologies is introduced. Unlike previous reviews, which often treat technologies in isolation, an integrated framework is proposed that systematically categorizes and analyzes the core technologies of autonomous agricultural machinery.
(2) A pioneering comparative analysis between navigation and implement control is presented. This review is among the first to explicitly compare and contrast the motion control mechanisms used for vehicle navigation and implement operations. It highlights their distinct design requirements and the associated technical challenges, which are often overlooked in existing literature focused primarily on platform mobility.
(3) An in-depth focus on actuator fault tolerance in agricultural contexts is provided. Moving beyond conventional reviews that concentrate on perception and navigation, this paper offers a dedicated investigation into fault detection and fault-tolerant control strategies for actuators. This addresses a critical gap in ensuring reliability and safety under the harsh and unpredictable conditions inherent to agricultural operations.
(4) A multi-dimensional review of applicability across diverse agricultural scenarios is presented. This study offers a comprehensive evaluation of the applicability and practical deployment of various autonomous agricultural machines across diverse environments, including farmlands, orchards, and livestock farming. Their performance is assessed from multiple dimensions, such as technical feasibility, operational efficiency, and environmental adaptability.
The rest of this paper is organized as follows: Section 2 presents the literature review methodology. Section 3 analyzes the key technologies of autonomous driving in agricultural machinery. Section 4 discusses the application status of autonomous driving in agricultural machinery. Section 5 describes the existing challenges and future development trends. Finally, the conclusions are summarized.

2. Literature Review and Analysis Methods

2.1. Overview of the Search Strategy

In this study, the literature was primarily retrieved from ScienceDirect, IEEE Xplore, and Web of Science. The search keywords included “agricultural automatic driving,” “agricultural unmanned vehicle,” “agricultural UAV,” “agricultural robot,” “orchard robot,” “livestock and poultry breeding robot,” “agricultural machinery positioning,” “agricultural machinery perception,” “agricultural machinery control,” and “agricultural actuator.” Inclusion criteria were established to encompass peer-reviewed journal articles, conference proceedings, and relevant book chapters published between 1 January 2011 and 30 May 2025. A small number of publications released before 2011 were also included to provide background information or clarify specific technical concepts. Duplicate entries retrieved from multiple databases were removed. Titles and abstracts of the retrieved articles were then screened to identify papers matching the research scope, yielding a total of 506 documents. Each document was exported in TXT format and converted into structured full-text records. A synonym substitution file and the literature dataset were imported into VOSviewer 1.6.20 for bibliometric analysis, and keyword co-occurrence visualizations were generated. To more intuitively illustrate the temporal and geographic distribution of research on autonomous driving agricultural machinery, the relevant data from VOSviewer were subsequently imported into Origin 2025 and Scimago Graphics 1.0.51 for visualization.

2.2. Literature Classification and Focus Analysis

Table 2 illustrates the distribution of agricultural machinery reference types cited in this review. In this context, a single reference may encompass multiple key technologies, thereby allowing its classification into several categories simultaneously. “Perception and Vision” constitutes the most concentrated research area, accounting for 35.3%. This concentration is primarily driven by advancements in deep learning technology, with the objective of addressing core challenges in target recognition and understanding within unstructured agricultural environments. Research related to “Actuators and Fault-Tolerant Control” also comprises a relatively high proportion (24.4%). This indicates a growing academic concern regarding the conversion of intelligent decisions into precise and reliable physical actions. However, the sub-direction of “Fault Diagnosis and Fault-Tolerant Control” remains in its nascent stages of development, representing a critical research avenue for achieving future system robustness. Secondly, regarding application scenarios, the research exhibits characteristics highly correlated with agricultural production structures. The “Farmland” scenario accounts for over half of the research (50.4%). Its large-scale and regular characteristics allow for technological application throughout the entire life cycle of agricultural production. In contrast, the “Fruit Orchard and Greenhouse” scenario (30.7%) features research predominantly focused on resolving perception and operational challenges associated with high-value-added crops in complex environments, thus entailing higher technical requirements. Conversely, “Livestock Breeding,” as an emerging automated application scenario (14.4%), despite its currently smaller research base, is demonstrating robust growth potential, signifying a future expansion of research scope. Finally, regarding research maturity, a significant “innovation–application” gap currently exists. Although over half (55.7%) of the research has progressed to the prototype experimental verification stage, thereby demonstrating technological feasibility, only a small proportion (7.7%) has been successfully commercialized. This indicates that the majority of technical solutions continue to confront substantial engineering challenges, including those related to cost, reliability, applicability, and system integration. Bridging the transition from “usable” to “user-friendly” and ultimately to market adoption remains an urgent imperative for the entire field.

2.3. Literature Selection Bias

It should be noted that the literature search methodology employed in this study contains potential selection biases, which may consequently impact the comprehensiveness and objectivity of the derived research conclusions. Firstly, at the database selection level, this study excluded open access resources, such as Google Scholar, and multidisciplinary databases, including Scopus and SpringerLink. This directly resulted in the omission of relevant cross-disciplinary research. Secondly, regarding keyword design, the core terms were not comprehensively covered by their synonyms and various combinations. This led to limitations in the search scope, consequently hindering the comprehensive capture of research focused on specific technical directions. Thirdly, concerning the time range and literature type selection, the core literature predominantly focused on publications after 2011, thereby lacking coverage of fundamental agricultural robot technologies from 2000 to 2010. This omission impedes a complete tracing of the technological evolution process. Concurrently, while prioritizing journal and conference papers, this study neglected patent literature, industry reports, and promotional reports from commercial or research institutions. This oversight resulted in an insufficient comprehensive analysis of the “academic research–industrial implementation” nexus and a lack of multi-dimensional support for evaluating technology commercialization potential and global regulatory advancements. Finally, regarding the literature inclusion criteria, this study did not explicitly incorporate non-English studies into its search scope. Non-English-speaking countries, such as Germany, Japan, and China, offer significant reference value in research areas including agricultural robot safety standards, orchard picking technology, and Beidou navigation applications. This oversight may diminish the comprehensiveness and objectivity of the conclusions, particularly concerning cross-disciplinary result integration, technological evolution analysis, and the assessment of commercialization potential. Future research should therefore optimize the search strategy by expanding database coverage and integrating diverse literature types.

2.4. Year of Publication

Figure 2 shows the volume of publications related to autonomous driving in agricultural machinery over the past fifteen years. As research into precision agriculture and smart farming continues to expand globally, research efforts on autonomous farming machinery have intensified. Driven by the rising global food demand and an aging agricultural workforce, the adoption of automated agricultural machinery equipped with autonomous navigation and precision operation capabilities has emerged as both a feasible and inevitable trend to replace human labor in agricultural tasks.

2.5. Country

Figure 3 illustrates the global distribution of contributing countries and their collaborative networks in the field of autonomous driving agricultural machinery research. The figure reveals that scholars from China and the United States rank highest in publication output, suggesting strong market demand and substantial research investments in precision agriculture and intelligent production in both countries. Additionally, Australia, Japan, India, Brazil, and several European nations have also made notable contributions. These countries either possess large populations with high food demand or exhibit advanced agricultural systems with a strong emphasis on intelligent production, thereby promoting the development of autonomous driving agricultural machinery.

2.6. Keywords

Figure 4 shows the keyword co-occurrence network in all studies. VOSviewer software is employed to conduct a comprehensive analysis of keywords extracted from 506 selected papers. To reduce redundancy, some keywords with similar meanings are consolidated, such as “robotics” and “robots”, “smart farming” and “smart agriculture”, and “unmanned aerial vehicle” and “uav”. The keyword frequency threshold is set to 5, and generic terms such as “design” and “system” are excluded, resulting in a final set of 88 relevant keywords. In the visualization, the size of each circle represents keyword frequency, with larger circles indicating higher frequencies. Different colors denote distinct clusters, and arcs between circles indicate co-occurrence relationships. The frequent appearance of terms such as “robots”, “machine vision”, “navigation”, and “deep learning” indicates a diversification of autonomous driving agricultural machinery functions and a continuous enhancement in both navigation performance and intelligence level.

3. Key Technologies

Autonomous driving of agricultural machinery is a complex systems engineering task that rests on four key technologies: positioning, perception, control, and actuators, as shown in Figure 5. These technologies operate collaboratively to ensure the autonomous driving and autonomous operation of agricultural machinery and equipment. Specifically, the key technologies can be summarized as follows: (1) positioning technology is the cornerstone of agricultural machinery autonomous driving, providing accurate position, speed, and time information; (2) perception technology endows agricultural machinery with “vision” and “touch,” helping it capture information about the surrounding environment and the target operation object; (3) motion planning and control technology acts as the “brain” of agricultural machinery autonomous driving, responsible for planning motion paths and formulating control strategies; (4) as the “muscle” of agricultural machinery, actuator technology drives the movement of mechanical equipment and completes specific operational tasks.

3.1. Positioning Technology

High-precision positioning serves as a prerequisite for enabling autonomous operation of agricultural machinery. This section provides a systematic review of positioning technologies applied in agricultural environments. Based on their technical principles and reference systems, these methods can be broadly categorized into three types: absolute positioning, relative positioning, and integrated positioning.

3.1.1. Absolute Positioning

According to the different reference coordinate systems, positioning techniques can be divided into absolute positioning and relative positioning. Absolute positioning determines the position coordinates of agricultural autonomous driving machinery relative to the global coordinate system by using a Global Navigation Satellite System (GNSS) including the Global Positioning System (GPS) and the Beidou Navigation Satellite System (BDS), as well as beacon methods [25]. A GNSS can provide positioning accuracy of 2–4 m, and some studies have successfully applied it to the autonomous driving of agricultural machinery [26,27]. With the help of the differential calculation principle of real-time kinematic (RTK) technology, a GNSS can further obtain position information accurate to the centimeter level. RTK-GNSS relies on wireless communication between the reference receiver (fixed base station) and the mobile receiver (mobile station) installed on the autonomous driving mechanical equipment to achieve precise positioning. The distance between the two is usually within 10 km [28]. Specifically, RTK technology transmits the phase of the GNSS signal received by the fixed base station to the mobile station. The mobile station compares the received phase with the phase of the satellite signal it observes, thereby obtaining the spatial position coordinates of the mobile station. Generally, if the positioning accuracy is required to reach the centimeter level, the base station and the mobile station need to receive signals from at least five satellites simultaneously [29,30]. Chou et al. [31] developed an autonomous agricultural vehicle based on the RTK-GPS module. They installed a mobile station antenna on the front frame of the multi-functional all-terrain vehicle and a base station antenna on the top of the tripod. Without introducing interruptions, the navigation position errors were all controlled within a range of 20 mm. Xiong et al. [32] designed an autonomous orchard spraying robot based on RTK-BDS. The results of the field experiment show that the offset error of the system positioning is proportional to the driving speed: the average offset error is about 0.03 m, while at a driving speed of 2 km/h, the maximum offset error can reach about 0.13 m. Perez-Ruiz et al. conducted geospatial mapping of crops by installing a single real-time RTK-GPS system on tractors. The error of the final drawn crop map in the trajectory direction is 2.67 cm, and 95% of the plants are located within a circular radius of 5.58 cm at the map position [33]. Based on the high positioning accuracy of RTK-GNSS, some studies suggest that a multi-receiver system can be adopted to further determine the heading angle of autonomous driving agricultural machinery [34,35,36]. However, integrating GPS-GNSS into autonomous driving agricultural machinery incurs high costs, especially with multiple RTK-GNSS systems [29,37]. To promote the application of autonomous driving agricultural machinery, it is necessary to adopt low-cost and compact RTK-GNSS modules. Valente et al. [38] compared two open-source, low-cost, single-frequency RTK-GNSS systems, namely Emlid Reach RTK (ER-RTK) and NavSpark RTK (NS-RTK). Based on all the test results, ER-RTK GNSS and NS-RTK GNSS achieved fixed solutions of 94.0% and 71.5%, respectively, and the former also achieved higher accuracy. 
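To make the dual-receiver heading idea concrete, the following minimal sketch (with illustrative antenna positions expressed in a local east-north-up frame) shows how a heading angle can be derived from the baseline between two RTK antennas mounted along the vehicle axis; it is a simplified illustration under these assumptions rather than a production implementation.

```python
import math

def heading_from_dual_antenna(front_enu, rear_enu):
    """Estimate vehicle heading (degrees from north, clockwise) from the
    ENU positions of two RTK antennas mounted along the vehicle axis."""
    de = front_enu[0] - rear_enu[0]   # east offset between antennas (m)
    dn = front_enu[1] - rear_enu[1]   # north offset between antennas (m)
    heading = math.degrees(math.atan2(de, dn))  # 0 deg = north, 90 deg = east
    return heading % 360.0

# Example: front antenna about 1.8 m almost due east of the rear antenna
print(heading_from_dual_antenna((10.0, 5.2), (8.2, 5.0)))  # ~83.7 deg
```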
Although GNSS technology has the significant advantage of all-temporal coverage, it may be affected by driving speed, radio frequency interference, obstacles, atmospheric conditions, and satellite geometry, and is not suitable for use indoors and in orchards with dense foliage [32,39,40,41].
To achieve absolute positioning in the indoor environment, active and passive beacon technologies have been introduced. The former, such as ultra-wideband (UWB) technology, actively transmits signals by deploying fixed beacon nodes, while the latter is often placed on the floor, ceiling, or around facilities, and requires autonomous driving agricultural machinery to obtain the position information of beacons through active perception (such as vision, lidar, RFID reading and writing, magnetic sensors, etc.) [42,43,44]. Active beacon technology has the advantages of high positioning accuracy and strong real-time performance, but it requires high installation and maintenance costs and is limited by the environmental structure and various obstacles [45]. In contrast, passive beacon technology has a lower initial installation cost and is suitable for installation in structured, smooth-surfaced livestock and poultry houses. However, this technology still requires regular maintenance to prevent corrosion, wear, and dust accumulation [42]. Therefore, the existing beacon technology still has a relatively high maintenance cost and is only suitable for use in large farms.

3.1.2. Relative Positioning

Relative positioning usually relies on technologies such as machine vision sensors, light detection and ranging (LiDAR), inertial measurement units (IMUs), gyroscopes, accelerometers, magnetometers, and wheel odometers for heading and position calculation. The advantages and disadvantages of these technologies are shown in Table 3 and Table 4. Compared with absolute positioning, relative positioning can also estimate the attitude (roll, pitch, and yaw) of autonomous driving machinery, and its accuracy does not degrade when satellite signals are lost. Machine vision sensors are a passive navigation and positioning method. They identify reference objects such as landmarks and crops through cameras and collect continuous image sequences of them, thereby measuring the relative position and direction between the autonomous driving agricultural machinery and the reference objects [46]. The process of analyzing continuous images based on feature matching and tracking techniques to estimate the motion of autonomous driving machinery is called visual odometry (VO). Combining image processing methods (such as the Hough transform and Census transform) [47,48,49], optimization algorithms (such as Kalman filtering) [50], object detection algorithms (such as the YOLO family) [51,52], semantic segmentation algorithms [53,54], and control algorithms (such as PID control and fuzzy control) [49] can further improve the accuracy of recognition and positioning [55,56].
LiDAR sensors can be divided into 2D LiDAR and 3D LiDAR [66,67]. The former provides only a single plane of ranging data, is prone to occlusion by weeds or leaves, and may lose detailed 3D information [68]. Therefore, researchers suggest using sensors with a wider vertical field of view, such as 3D LiDAR or rotating 2D LiDAR, to obtain richer information [69,70]. Similar to visual odometry, the process of estimating machinery motion states from LiDAR perception information using point cloud registration is called LiDAR odometry (LO). LO and VO are usually used as the front-end modules of Simultaneous Localization and Mapping (SLAM) technology, providing relative observations of machinery motion and generating local maps. On this basis, SLAM technology uses optimization algorithms or filters to refine the pose trajectory and map of the machinery and applies a loop closure mechanism to eliminate accumulated errors; this stage is usually referred to as the back end. Finally, the SLAM algorithm constructs a real-time environmental map from the optimized results [71]. Typical visual SLAM algorithms include ORB-SLAM [72] and LSD-SLAM [73], while common LiDAR SLAM algorithms are represented by LeGO-LOAM [74] and Cartographer [75]. Table 5 presents the performance and characteristics of these algorithms in both structured and unstructured agricultural environments.
Wheel odometry is commonly used in autonomous agricultural vehicles; it infers changes in vehicle position and heading during movement by analyzing wheel rotation. Given the drawbacks of wheel odometry, autonomous positioning requires that the vehicle’s weight distribution be uniform and that the wheels remain in full, even contact with the ground to minimize the risk of wheel slip [62]. Accelerometers sense acceleration directly by measuring the force acting on a proof mass, while gyroscopes determine the device’s motion attitude from the conservation of angular momentum and the measured angular velocity. The inertial measurement unit (IMU) combines an accelerometer and a gyroscope and measures the device’s attitude about three axes. Researchers usually combine it with global navigation satellite systems, magnetometers, or machine vision, or apply sensor calibration procedures, to improve the estimation accuracy of heading angles [59,76]. Magnetometers are used to calibrate heading angles because they can determine the absolute heading of an object by measuring the strength and direction of the Earth’s magnetic field [61].
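As an illustration of wheel-based dead reckoning, the following sketch propagates the pose of a differential-drive vehicle from per-wheel travel distances; the track width and wheel increments are hypothetical values, and real systems must additionally compensate for slip as noted above.

```python
import math

def update_pose(x, y, theta, d_left, d_right, track_width):
    """Dead-reckon a differential-drive pose update from per-wheel
    travel distances (m) over one sampling interval."""
    d_center = (d_left + d_right) / 2.0          # distance of vehicle midpoint
    d_theta = (d_right - d_left) / track_width   # change in heading (rad)
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta) % (2.0 * math.pi)
    return x, y, theta

# Example: both wheels roll about 0.10 m, the right slightly more (gentle left turn)
print(update_pose(0.0, 0.0, 0.0, 0.10, 0.12, track_width=1.5))
```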

3.1.3. Fusion Positioning

In real agricultural scenarios, positioning technology must also cope with complex environmental interference such as occlusion, terrain undulation, and lighting fluctuations. Under such conditions, a single sensor has inherent limitations and cannot provide accurate position and attitude information for autonomous agricultural machinery. Fusing multiple information sources (sensor data) can effectively address this problem.
To address the issue of GNSSs being affected by the obstruction of fruit tree canopies, Fei et al. proposed the error-state Kalman filtering (ESKF) fusion strategy, which combines non-holonomic constraints (NHCs) to limit the lateral and vertical velocity errors of the IMU and corrects the IMU drift through periodic/non-periodic zero-speed updates (ZUPT). Eventually, the average positioning error is controlled within 0.15 m, with the maximum error being approximately 0.3 m, and the system remains stable even when the satellite signal is briefly blocked [77]. Leanza et al. used adaptive Kalman filtering and linear regression to correct raw IMU/GPS data, achieving a full-path relative error of 1.9% and a straight-line segment error of 0.7% in a sloping vineyard scene with loose soil and severe vegetation obstruction. Compared to the uncorrected IMU/GPS combination, this method improves by more than 85% [61].
In response to the performance limitations of visual sensors under harsh lighting conditions, a visual–inertial fusion scheme was proposed by Fu et al. Thermal imaging images were optimized using adaptive bilateral filtering and Sobel gradient enhancement, with Sage–Husa adaptive filtering integrated to mitigate non-Gaussian noise. Under normal lighting, low lighting, and low-light jitter scenarios, the positioning error was reduced by 58.69%, 57.24%, and 60.23%, respectively, relative to the traditional IEKF algorithm. Furthermore, in real-world complex lighting environments, the root mean square error (RMSE) of the absolute trajectory error (ATE) was controlled within 0.2826 m [78].
Regarding the LiDAR point cloud occlusion problem in dense foliage environments, a LiDAR SLAM point cloud mapping scheme was designed by Qu et al., and the map was optimized using the GNU Image Manipulation Program (GIMP) to eliminate noise points. The average positioning error in outdoor tests was only 0.205 m [79]. Separately, a probabilistic sensor model was constructed by Hiremath et al. through the combination of LiDAR and particle filtering. Even in scenarios featuring curved crop rows and uneven plant gaps, the robot’s heading error was controlled within 2.4°, and its lateral deviation was controlled within 0.04 m, enhancing its adaptability to semi-structured farmland [80]. Furthermore, terrain undulation can exacerbate LiDAR-induced positioning errors. Indoor–outdoor comparison experiments conducted by Qu et al. demonstrated that the average lateral error during 15 m indoor navigation was 0.1717 m, whereas the error increased to 0.237 m for the same navigation distance in outdoor terrain with undulations. Complementary filtering integrating an inertial measurement unit (IMU) and wheel speed odometry is recognized as an optimization method for addressing this issue [79].
Most of the above studies adopt filtering (such as Kalman filtering and particle filtering) to conduct multi-sensor fusion. In addition, another commonly used fusion method is optimization (such as pose graphs and factor graphs) [81,82]. Among them, the filtering method updates the state estimates and covariance matrices in a recursive manner in real time to achieve efficient real-time positioning effects [83], while the optimization method uses optimization techniques to minimize positioning errors and thereby achieve global optimization [84]. The application scenarios of these two methods in the field of autonomous driving of agricultural machinery are shown in Table 6. The filtering algorithms and optimization algorithms mentioned in Table 6 and their performances in different environments are presented in Table 7 and Table 8, respectively.
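To illustrate the filtering approach in its simplest form, the sketch below performs one predict-update cycle of a linear Kalman filter that propagates a 2D position with an odometry increment and corrects it with a GNSS fix; the noise covariances are illustrative assumptions, and practical systems such as the ESKF schemes cited above use richer state vectors.

```python
import numpy as np

def kf_fuse(x, P, u_odom, z_gnss, Q, R):
    """One cycle of a linear Kalman filter that predicts 2D position from an
    odometry displacement and corrects it with a GNSS fix (all in metres)."""
    # Predict: propagate the position with the odometry increment
    x_pred = x + u_odom
    P_pred = P + Q                              # process noise grows the uncertainty
    # Update: blend in the GNSS observation (H = I for a direct position fix)
    K = P_pred @ np.linalg.inv(P_pred + R)      # Kalman gain
    x_new = x_pred + K @ (z_gnss - x_pred)
    P_new = (np.eye(2) - K) @ P_pred
    return x_new, P_new

x, P = np.zeros(2), np.eye(2) * 0.5
Q, R = np.eye(2) * 0.02, np.eye(2) * 0.09       # odometry vs. GNSS noise (m^2)
x, P = kf_fuse(x, P, u_odom=np.array([0.5, 0.1]),
               z_gnss=np.array([0.55, 0.08]), Q=Q, R=R)
print(x)
```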
Multi-sensor fusion technology also has extensive applications in the field of SLAM. Compared with single-sensor SLAM, multi-sensor SLAM algorithms show higher robustness and positioning accuracy. Peng et al. tested three SLAM algorithms in a caged chicken house environment. The results show that the SLAM algorithm using only 2D LiDAR has excessive deviation, resulting in ineffective map construction. The mapping deviation of the SLAM algorithm that fuses 2D LiDAR, wheel odometry, and IMU sensing data based on particle filters is 15 degrees, whereas the mapping deviation of the SLAM algorithm that fuses the same three types of sensor data based on graph optimization is only 3 degrees [96]. Jiang et al. fused IMU data with 3D LiDAR data to construct an autonomous navigation scheme for orchard spraying robots based on multi-sensor devices and SLAM technology; the robot’s average lateral navigation error and average heading angle deviation do not exceed 16 cm and 8°, respectively [97]. Zhu et al. used binocular cameras to obtain crop boundary lines as navigation references and then integrated visual SLAM and inertial guidance information to achieve real-time positioning of unmanned harvesters [98]. To solve the positioning problem of unmanned agricultural machinery in the absence of GNSS assistance, Zhao et al. proposed an efficient and adaptive LiDAR-visual-inertial odometry SLAM system. The test results show that the proposed model maintains accurate and stable positioning performance over long durations in various typical agricultural scenarios (greenhouses, farmlands, engineering buildings) [99]. Based on sensor measurement data, the SLAM algorithm can construct real-time environmental maps. The map matching method can then be adopted to match the map drawn by the SLAM algorithm with a pre-constructed high-precision map, thereby achieving more accurate positioning. Map matching is also a fusion positioning method, as it combines the relative positioning information and absolute positioning information generated by sensors. This method is usually applicable to indoor environments with relatively fixed internal structures, such as livestock and poultry breeding houses. Joffe et al. [100] adopted ultrasonic beacon technology to obtain absolute positioning information, calculated the motion posture of the autonomous driving robot through sensors such as wheel odometers and IMUs, and developed an autonomous egg picking robot with a positioning accuracy within 2 cm.

3.2. Perception Technology

Perception technology is crucial for autonomous agricultural machinery, endowing it with the ability to “see” and “touch” its surroundings. It enables the system to acquire vital information about the environment and target operational objects, thereby facilitating precise navigation and operation. This section elaborates on key environmental perception technologies, which are essential for tasks such as object positioning and identification, status monitoring, and environmental assessment.
In the field of autonomous driving of agricultural machinery, various types of perception devices have been developed at present for capturing data such as light, humidity, temperature, odor, gas concentration, chemical substance concentration, images, and tactile and force feedback [101,102,103]. Images, as the main data type, have the advantages of rich signal information, easy acquisition, and non-contact collection, and are widely used in the navigation and operation processes of autonomous driving agricultural machinery. Therefore, here we mainly introduce common image perception devices (imaging technologies). Imaging technology can be classified according to different criteria. According to the differences in spectral resolution (spectral width) or the number of bands, it can be classified into visible-spectrum imaging (VSI), multispectral imaging (MSI), and hyperspectral imaging (HSI) [104]. From the perspective of imaging dimensions, they can be further classified into 2D cameras and 3D cameras (depth cameras) [105]. Table 9 and Table 10 show the characteristics, advantages, and disadvantages of the above-mentioned imaging technologies, and Figure 6 presents several commonly used perception technologies. Among these technologies, visible-light-spectrum imaging exhibits lower costs, ease of deployment, and support for plug-and-play three-dimensional perception, making it suitable for crop recognition, localization, and simple modeling. Although highly susceptible to lighting conditions, visible-light-spectrum imaging maintains high overall cost-effectiveness, rendering it well-suited for small- and medium-sized agricultural scenarios. Multispectral imaging (MSI) equipment features a moderate price point and enables the acquisition of spectral information from bands including the red edge and near infrared (NIR). MSI facilitates vegetation index analysis and crop health monitoring; however, it imposes specific requirements for data processing, limiting its applicability to farmers with adequate technical expertise, sufficient funding, and demands for high-efficiency operations to implement precision agricultural management. By contrast, hyperspectral imaging (HSI) delivers highly detailed spectral information, making it suitable for component analysis and precise crop disease identification. However, HSI equipment remains costly, requires complex data processing, and exhibits environmental sensitivity. HSI’s applicability to small-scale farming operations is relatively limited; instead, it is well-suited for large-scale agricultural production systems or research fields with stringent precision requirements.

3.2.1. Visible-Spectrum Imaging Technology

Typical visible-light-spectrum imaging techniques mainly include RGB cameras and RGB-D cameras. Both use filter arrays to capture RGB images. Such images can clearly distinguish the target object from background objects and have therefore been widely used in many studies for identifying objects with different characteristics and detecting key features of the target object [106,107,108,109]. In addition, RGB-D cameras can also generate depth images that encode the distance information of the surrounding environment. In a typical grayscale rendering, darker pixels indicate closer targets and lighter pixels indicate more distant ones, with intermediate gray values corresponding to the physical distance between the target and the sensor. By combining this depth information with the 2D coordinates or RGB information of the target object, the target can be located [110,111]. Positioning based on 2D coordinates requires coordinate transformation methods to obtain 3D coordinates, while positioning based on RGB information requires segmentation of the generated full-scene point cloud [112,113].
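As a minimal illustration of the coordinate transformation mentioned above, the following sketch back-projects a pixel with a depth reading into 3D camera-frame coordinates using the pinhole model; the intrinsic parameters are hypothetical values that would normally come from camera calibration.

```python
def pixel_to_camera_xyz(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with a metric depth reading into 3D
    camera-frame coordinates using the pinhole camera model."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return x, y, depth_m

# Example with illustrative RGB-D intrinsics (fx, fy, cx, cy in pixels)
print(pixel_to_camera_xyz(u=400, v=260, depth_m=1.2,
                          fx=615.0, fy=615.0, cx=320.0, cy=240.0))
```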

3.2.2. Multispectral Imaging Technology

Given that multispectral imaging technology is capable of acquiring data across multiple spectral bands—and that different materials exhibit distinct reflectance and absorption properties for radiation across these bands—spectral bands with the highest discriminatory power for specific feature information are typically selected by researchers for target recognition and feature analysis [114]. Within the field of autonomous driving for agricultural machinery, the most commonly utilized spectral bands primarily include the red-edge, near-infrared (NIR), and infrared (IR) bands.
The red-edge and near-infrared (NIR) bands exhibit high sensitivity to variations in green vegetation and soil moisture content. As a result, these spectral regions are frequently employed in precision agriculture for non-destructive and non-invasive analyses, such as estimating vegetation indices (e.g., NDVI, NDRE) [115,116,117], drought indices (e.g., PDI, MPDI) [118,119], and chlorophyll concentration [120,121], as well as monitoring crop growth and health [122,123]. Chlorophyll strongly reflects light in the red-edge and NIR bands, while water molecules exhibit absorption in these bands [124]. Consequently, higher vegetation density and chlorophyll content correspond to increased spectral reflectance, whereas higher soil moisture content leads to reduced reflectance in these bands [125,126]. In addition, NIR wavelengths can induce molecular vibrations in C–H, N–H, O–H, S–H, and C=O bonds, which are key components of organic matter. This property makes NIR widely applicable for soil component analysis [127,128].
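For concreteness, the sketch below computes NDVI from co-registered near-infrared and red reflectance bands; the reflectance values are synthetic, and analogous indices such as NDRE substitute the red-edge band for the red band.

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalized Difference Vegetation Index from co-registered NIR and red
    reflectance bands; values near +1 indicate dense, healthy vegetation."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Example: a 2x2 patch of reflectances (vegetation has high NIR, low red)
nir = np.array([[0.60, 0.55], [0.20, 0.18]])
red = np.array([[0.08, 0.10], [0.15, 0.16]])
print(ndvi(nir, red))   # high values in the vegetated top row
```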
Infrared (IR) radiation is an important form of energy exchange between objects and the outside world; all objects above absolute zero emit infrared energy. By converting the invisible IR radiation in the environment into visible two-dimensional images, the spatial distribution of temperature differences in the environment can be visually presented [129]. This is the principle of thermal imaging technology. The technology is non-invasive, non-contact, non-destructive, and highly applicable, and can form images even in the absence of light [130]. Since plant temperature is closely related to physiological processes such as transpiration, water metabolism, and stress response, thermal imaging technology is widely used in crop maturity and bruising analysis [131,132], water stress monitoring [133], pest and disease detection, and yield estimation [130]. Infrared thermal imagers can be roughly divided into two types: cooled and uncooled. Cooled infrared thermal imagers can resolve finer temperature differences, but they are large, expensive, and energy-intensive. Uncooled infrared thermal imagers have the advantages of small size, light weight, and affordability, and are suitable for installation on autonomous driving agricultural machinery [134]. However, the low contrast of thermal images tends to introduce significant errors when generating orthophoto images. In this regard, some researchers have developed effective calibration algorithms to improve the accuracy of uncooled thermal imagers [135,136].

3.2.3. Hyperspectral Imaging Technology

Hyperspectral imaging (HSI) is capable of capturing reflectance, transmittance, or radiance information from a target object across dozens to hundreds of narrow spectral bands, which is then compiled into a series of images and subsequently integrated into a three-dimensional hyperspectral data cube. These three dimensions correspond to the two spatial dimensions of the scene and the spectral dimension representing various wavelengths [137,138]. In recent years, HSI technology has been widely employed in crop type identification [139,140], crop component analysis [141,142,143], and the detection of weeds, pests, and diseases [144,145,146]; however, it still faces challenges including high costs, large data volumes, and environmental sensitivity. To address these challenges, principal component analysis (PCA) and continuous wavelet transform (CWT) can be utilized to identify and aggregate redundant data across multiple channels [123,139].
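As a simple illustration of such dimensionality reduction, the following sketch applies PCA to a synthetic hyperspectral cube by treating each pixel spectrum as one sample; the band count and image size are arbitrary, and real pipelines typically precede this step with radiometric calibration.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic hyperspectral cube: 64 x 64 pixels, 120 narrow bands
cube = np.random.rand(64, 64, 120)

# Flatten spatial dimensions so each pixel becomes one 120-band spectrum
pixels = cube.reshape(-1, cube.shape[-1])

# Keep the handful of components that carry most of the spectral variance
pca = PCA(n_components=10)
reduced = pca.fit_transform(pixels)            # shape (4096, 10)
reduced_cube = reduced.reshape(64, 64, 10)     # back to image layout
print(pca.explained_variance_ratio_.sum())
```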

3.2.4. Two-Dimensional Cameras

In the field of agriculture, 2D cameras (monocular cameras) can not only perform object recognition [147,148] and phenotypic analysis [149,150], but also obtain positioning information of objects through perspective transformation calculations [151]. To address the shortcomings of 2D cameras, some researchers have developed effective depth estimation algorithms that recover the distance information of objects from two-dimensional images. These algorithms fall into two categories: those based on machine learning and those based on deep learning. The former estimates depth by solving for the unknown parameters of an assumed model or by searching for similar images in a large depth dataset [152,153], while the latter uses classification methods to predict the depth values of pixels in the image [154].

3.2.5. Three-Dimensional Camera

Common 3D cameras primarily encompass stereo vision cameras, structured light (SL) cameras, and time-of-flight (ToF) cameras. Stereo vision cameras typically consist of two or more monocular cameras, which capture and analyze disparities in images acquired from different viewpoints to generate high-resolution depth information [105,155]. However, the stereo matching and calibration processes for such cameras are frequently complex and time-intensive [156,157]. Structured light (SL) cameras are visual sensors equipped with one or more monocular cameras and a projector, which utilize projected light patterns to measure the three-dimensional (3D) shape of objects. Specifically, the projector emits a grid pattern onto the target object’s surface, while the camera captures the reflected pattern image [158]. This perception technology enables rapid acquisition of depth information for the target object and is unaffected by the object’s texture features. However, increases in measurement distance and intense lighting conditions result in reduced accuracy. Time-of-flight (ToF) cameras, including light detection and ranging (LiDAR) systems, measure the flight time of light pulses by emitting continuous beams toward the target object, receiving the reflected light, and subsequently calculating the object’s distance or depth [105]. Relative to stereo vision cameras and SL cameras, the measurement accuracy of ToF cameras is unaffected by object features or changes in distance. Therefore, ToF cameras are well-suited for 3D measurement tasks requiring high accuracy and robust motion stability at scale. However, ToF cameras exhibit relatively low resolution and higher power consumption. Furthermore, the speed of laser light renders direct detection of optical ToF practically unfeasible. Therefore, researchers typically acquire 3D measurement data by detecting the phase shift of modulated light waves [159]. Within the agricultural sector, 3D cameras have been widely employed for crop localization [86,160,161], crop detection [162,163,164], crop phenotypic analysis [165,166,167], and livestock monitoring [168,169,170].
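A minimal numerical illustration of the phase-shift principle is given below: the range follows from the measured phase shift of the amplitude-modulated signal, with an unambiguous range limited by the modulation frequency; the 20 MHz frequency and 90° phase shift are example values.

```python
import math

C = 299_792_458.0  # speed of light (m/s)

def tof_distance(phase_shift_rad, mod_freq_hz):
    """Range from the phase shift of an amplitude-modulated ToF signal:
    d = c * delta_phi / (4 * pi * f_mod). Unambiguous up to c / (2 * f_mod)."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# Example: 20 MHz modulation, 90 degree measured phase shift -> ~1.87 m
print(tof_distance(math.radians(90), 20e6))
```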

3.2.6. Fusion of Multi-Perception Technologies

Each individual sensor has its own limitations. By using advanced data processing methods, data from multiple sensors can be fused together, thereby improving the accuracy and reliability of perception, especially under challenging conditions or in the presence of occlusion [171,172]. Different sensor combinations have different applicable scenarios. Multi-sensor combinations including LiDAR are typically used for positioning and measurement operations. For example, Kang et al. [173] located fruits by fusing the depth information of LiDAR and the color information of RGB cameras. The experimental results show that even under the condition of strong afternoon sunlight, the sensor can achieve accurate and robust positioning. The standard deviations of positioning at 0.5 m, 1.2 m, and 1.8 m are 0.253, 0.230, and 0.285, respectively. Zhang et al. [174] collected long-range images of large areas of crops by equipping unmanned aerial vehicles with LiDAR and multispectral cameras, thereby achieving the estimation of crop yields. Multi-sensor combinations that incorporate multispectral or hyperspectral imaging techniques are typically employed for large-scale monitoring of crop or environmental conditions. For instance, Dash et al. [175] evaluated the moisture conditions of operation-intensive farmland based on the thermal imaging and multispectral effects of unmanned aerial vehicles. Javidan et al. [176] detected the disease infection of tomatoes based on RGB and hyperspectral image data. Multi-sensor combinations containing RGB cameras are typically used for feature recognition or phenotypic analysis of the operation objects. Bhole et al. [177] used RGB images and thermal images to identify Holstein cattle in cattle herds. Gutierrez et al. [178] used RGB cameras and hyperspectral line scanning cameras to estimate the maturity of mangoes.
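As a schematic example of such LiDAR-camera fusion, the sketch below projects LiDAR points into the image plane so that each point can be associated with RGB color; the intrinsic and extrinsic parameters are hypothetical, and the frames are assumed to be aligned by prior calibration.

```python
import numpy as np

def project_lidar_to_image(points, R, t, K):
    """Project LiDAR points (N, 3) into pixel coordinates so each point can be
    coloured with the co-registered RGB image. R and t map the LiDAR frame
    into the camera frame; K holds the pinhole intrinsics."""
    pts_cam = points @ R.T + t              # points expressed in the camera frame
    keep = pts_cam[:, 2] > 0.1              # keep points in front of the lens
    pts_cam = pts_cam[keep]
    uv_h = pts_cam @ K.T                    # homogeneous pixel coordinates
    uv = uv_h[:, :2] / uv_h[:, 2:3]         # perspective division
    return uv, pts_cam[:, 2]                # pixel positions and their depths

# Illustrative intrinsics/extrinsics; frames assumed aligned by calibration
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([0.0, -0.05, 0.02])
points = np.array([[0.30, -0.10, 2.0],      # e.g., a fruit roughly 2 m ahead
                   [-0.20, 0.05, 1.5]])
print(project_lidar_to_image(points, R, t, K))
```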

3.2.7. Deep Learning Training Methods for Environments with Scarce Labels

Methods based on deep learning play an effective role in solving various complex perception tasks. However, to fully utilize their potential, a large amount of labeled data is required. In the agricultural environment, obtaining large-scale and high-quality labeled data is not only costly but also time-consuming. To address this issue, semi-supervised learning (SSL) and domain transfer techniques have become effective alternatives, enabling high-performance visual perception tasks with limited labeled samples. Among them, semi-supervised learning (SSL) significantly reduces the reliance on labeling by simultaneously utilizing a small amount of labeled data and a large amount of unlabeled data. For example, Zhang et al. proposed a semi-supervised weed detection method based on a diffusion model. This model effectively integrates generated features with real features through an attention mechanism and a semi-diffusion loss function, achieving a performance of 0.94 in accuracy, 0.90 in recall rate, and 0.92 in mAP@50 compared to the fully supervised method DETR, which improves the accuracy and recall rate by approximately 10% and 8%, respectively [179]. Li et al. proposed a semi-supervised few-shot learning method, which can adaptively select pseudo-labeled samples from the confidence interval to help fine-tune the model, improving the accuracy of few-shot classification. The results show that the average improvement rate of the proposed single semi-supervised method is 2.8%, and the average improvement rate of the iterative semi-supervised method is 4.6% [180]. Benchallal et al. developed a new deep learning framework based on the ConvNeXt encoder combined with a semi-supervised learning method with consistency regularization, effectively utilizing labeled and unlabeled data. This method achieved high classification accuracy on public datasets such as DeepWeeds and 4-Weeds, outperforming the fully supervised model trained with only limited labeled data [181].
Another effective method is domain transfer techniques, especially suitable for pre-training models in the source domain (such as laboratory environment) and then transferring to the target domain (such as real fields). For example, Ashour et al. used bamboo (source domain) for pre-training in the sugarcane quality estimation task, and then adapted it to the sugarcane target domain through transfer learning, achieving an average estimation error of 4.5% and 5.9% for bamboo and sugarcane, respectively, using only sparse labeled data. This method avoids the high cost of frame-by-frame labeling and can train deep networks relying only on integrated labels over time [182]. Moreover, transfer learning can also reduce the computational resources and data required for training deep learning models by leveraging pre-trained models already trained on large datasets and fine-tuning them on a smaller dataset. It can also reduce the risk of overfitting in deep learning and improve the accuracy of model predictions and generalization ability in different environments [183]. For example, Simhadri et al. applied transfer learning to 15 pre-trained CNN models to achieve automatic recognition of rice leaf diseases. The results showed that the InceptionV3 model performed the best, with an average accuracy of 99.64%, while the worst was the AlexNet model, which had an average accuracy of 97.35% [184]. Buchke et al. applied transfer learning to tomato leaf disease image datasets with sizes of 3000, 8000, and 10,000, achieving impressive accuracy rates of 97.3%, 99.2%, and 99.5%. The experimental results show that transfer learning can achieve excellent performance even in small- and medium-sized datasets [185].
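The following PyTorch sketch illustrates the transfer learning pattern described above, fine-tuning only the classification head of an ImageNet-pretrained backbone on a small, hypothetical 10-class leaf-disease dataset; it is a minimal sketch under these assumptions, not the pipeline used in the cited studies.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a backbone pre-trained on ImageNet and freeze its feature extractor
model = models.resnet18(weights="IMAGENET1K_V1")
for param in model.parameters():
    param.requires_grad = False

# Replace the final layer for a small, hypothetical 10-class leaf-disease set
num_classes = 10
model.fc = nn.Linear(model.fc.in_features, num_classes)

# Only the new head is optimized, so a modest labeled dataset can suffice
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of 3x224x224 images
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, num_classes, (8,))
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(float(loss))
```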

3.3. Motion Planning and Control Technology

Motion planning technology usually builds environmental maps from positioning and perception data and then plans safe and efficient motion paths. Control technology then determines the specific dynamic parameters, using inverse kinematics calculations or control algorithms, to perform stable and accurate trajectory tracking [186,187]. In agricultural machinery autonomous driving, motion planning and control cover both vehicle navigation and machinery operation. Vehicle navigation enables agricultural machinery to move between different operation sites, underpinning the completion of large-scale automated operations. Machinery operation uses robotic arms and end effectors to complete specific agricultural tasks such as plowing, sowing, fertilizing, weeding, picking, feeding livestock and poultry, and environmental cleaning, enabling precise actions such as feeding, cutting, and grasping.

3.3.1. Motion Planning

Motion planning technology is key to achieving autonomous motion and precise operation in agricultural machinery autonomous driving. The rationality of a motion planning algorithm depends on how well it accounts for various constraints, including obstacles, coverage rate, and operational efficiency [188,189,190]. For the motion planning of machinery operations, the posture of the target object and the motion coordination among the mechanical arms (in multi-arm robots) must also be considered to improve the success rate and efficiency of the operation [191,192,193]. According to environmental awareness and the state of obstacles, motion planning can be divided into global planning and local planning. Global planning is usually static and can be completed offline; it performs path planning based on the known, fixed obstacle positions and environmental structures in the environmental map. Global planning thus determines the logic by which autonomous agricultural machinery independently performs operation tasks and affects the efficiency of task execution. However, actual operation may involve unknown or constantly changing scenarios, such as sudden obstacles, moving people, or livestock and poultry. In such cases, local (dynamic) motion planning can effectively avoid collisions and complete agricultural operation tasks, because it adjusts online in real time according to the actual situation [96,194,195]. Both global and local path planning consist of two stages: front-end path search and back-end path optimization. The purpose of path search is to quickly find a feasible path between the start and goal points that satisfies specific constraints, while path optimization refines the path generated at the front end to make it more continuous, efficient, and smooth [196,197].
A wide array of path search algorithms has been developed to address various navigation requirements. Based on their underlying technical principles, these algorithms can be categorized into four types: graph-search-based, sampling-based, optimization-based, and learning-based algorithms [198]. The key features and representative algorithms of each category are summarized in Table 11 and Table 12. Furthermore, according to specific task demands, path planning algorithms can be classified into point-to-point, multi-objective, and complete-coverage path planning methods [96]. Point-to-point algorithms are commonly applied in navigation strategies for fixed-route autonomous robots or beacon-guided systems operating in semi-structured agricultural environments, where precise stopping at predefined locations is essential for task execution [199]. Multi-objective path planning aims to determine optimal routes that connect multiple target points, thereby enhancing operational efficiency and reducing energy consumption. These target points are often discrete and variable. For instance, in motion planning for autonomous feeding robots, routes must be dynamically adjusted to account for dispersed feed buckets and uncertain animal movement patterns in semi-intensive farms, which may lead to varying levels of feed availability at each feeding point [200]. In harvesting operations, autonomous picking robots must plan optimal picking trajectories for robotic arms to ensure all ripe fruits on a tree are harvested swiftly and efficiently [201,202]. Complete-coverage path planning focuses on ensuring full spatial coverage for operations such as sowing, monitoring, and cleaning [203,204]. Applications like floor egg collection or weeding also require a comprehensive inspection of the operating environment to determine exhaustive and efficient movement trajectories [205,206,207]. However, most conventional path planning algorithms primarily account for geometric constraints, often neglecting kinematic and safety considerations. To address this, back-end path optimization methods refine the initially generated paths by incorporating additional constraints to produce executable, smooth, and collision-free trajectories [208]. Some advanced algorithms, such as Kinodynamic RRT* [209], Hybrid A* [210], and the Covariant Hamiltonian Optimization for Motion Planning (CHOMP) algorithm [211], explicitly consider kinematic constraints during the path generation phase to ensure feasibility. Despite the progress in this field, path planning algorithms still face significant challenges in unstructured and dynamic agricultural environments. Common issues include limited accuracy, low computational efficiency, and difficulties in trajectory convergence. Additionally, many algorithms insufficiently address external factors such as terrain variability, vehicle structure, machine dynamics, and energy consumption [212,213]. Recent studies have proposed hybrid approaches that combine multiple algorithms to improve adaptability, real-time performance, and planning robustness [214].
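As a concrete example of the graph-search-based category, the sketch below runs an A* front-end path search on a small occupancy grid; the grid layout and Manhattan heuristic are illustrative, and a real planner would still pass the result to a back-end optimizer to enforce kinematic and safety constraints.

```python
# Minimal A* front-end path search on a 2-D occupancy grid (graph-search-based category).
# Grid layout and heuristic are illustrative only.
import heapq

def astar(grid, start, goal):
    """Return a list of grid cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    def h(cell):                                   # Manhattan-distance heuristic
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])
    open_set = [(h(start), start)]
    came_from, g_cost = {start: None}, {start: 0}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:                            # reconstruct path by walking parents back
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = g_cost[cur] + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_set, (ng + h(nxt), nxt))
    return None

# 0 = free cell, 1 = obstacle (e.g., a tree row); start at top-left, goal at bottom-right.
field = [[0, 0, 0, 0],
         [1, 1, 0, 1],
         [0, 0, 0, 0],
         [0, 1, 1, 0]]
print(astar(field, (0, 0), (3, 3)))
```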
Future research can further explore two promising directions: (1) integrating conventional path planning algorithms with end-to-end learning-based approaches to develop high-performance, dedicated computing platforms for path planning, and (2) leveraging multi-robot collaboration to enhance the efficiency and scalability of agricultural task execution [215].

3.3.2. Motion Control

After obtaining the planned path, autonomous driving agricultural machinery still relies on control technology to track the trajectory stably and accurately. The motion control technology of autonomous driving agricultural machinery has been widely studied. Control methods can be divided into two types: model-based and model-free. Model-based control methods, such as pure pursuit control, backstepping control, model predictive control (MPC), and adaptive control, build robust controllers on the kinematic or dynamic models of the vehicle or machinery to achieve precise control of autonomous agricultural machinery. However, these methods usually rely on high-precision sensors to obtain real-time status information. When autonomous agricultural machinery operates in complex and unstructured environments, it is prone to problems such as sensor or actuator failure, which directly degrade the performance of the control algorithm. Most model-based control methods also remain limited by parameter uncertainty, inaccurate modeling, unknown noise, and interference [216,217], which poses challenges for achieving high-precision trajectory tracking. In contrast, model-free control methods can adapt to environmental changes and reject external disturbances through learning or feedback mechanisms. Furthermore, this type of control algorithm responds faster to dynamic changes in the system and is suitable for complex systems and real-time control tasks [218]. Model-free control methods mainly include fuzzy control, proportional–integral–derivative (PID) control, sliding mode control (SMC), and neural network control. However, these methods have their own limitations, such as insufficient accuracy and stability. Table 13 and Table 14 show the applicable scenarios and characteristics of each control algorithm.
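To make the model-based category concrete, the following sketch implements the classic kinematic pure pursuit steering law for a bicycle-model vehicle; the wheelbase, lookahead distance, and reference path are illustrative values only.

```python
# Minimal kinematic pure pursuit steering law for a bicycle-model vehicle.
# Wheelbase, lookahead distance, and the reference path are illustrative values.
import math

def pure_pursuit_steering(pose, path, lookahead=2.0, wheelbase=1.5):
    """pose = (x, y, heading [rad]); path = list of (x, y) waypoints.
    Returns the front-wheel steering angle in radians."""
    x, y, theta = pose
    # Choose the first waypoint at least one lookahead distance ahead of the vehicle.
    target = path[-1]
    for wx, wy in path:
        if math.hypot(wx - x, wy - y) >= lookahead:
            target = (wx, wy)
            break
    # Express the target in the vehicle frame (lateral offset only is needed).
    dx, dy = target[0] - x, target[1] - y
    local_y = -math.sin(theta) * dx + math.cos(theta) * dy
    ld = math.hypot(dx, dy)
    # Pure pursuit law: curvature kappa = 2 * lateral offset / lookahead distance^2.
    kappa = 2.0 * local_y / max(ld ** 2, 1e-6)
    return math.atan(wheelbase * kappa)

path = [(i * 1.0, 0.2 * i) for i in range(20)]      # a gently curving crop-row path
print(pure_pursuit_steering((0.0, -0.5, 0.0), path))
```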
Facing the diverse and changeable agricultural operation environment, traditional control methods have certain limitations. In this regard, scholars have attempted to improve and combine traditional control algorithms or to develop more accurate and stable ones. To address the challenges agricultural unmanned aerial vehicles (UAVs) face due to wind disturbances, payload variations, and propeller failures during ultra-low-altitude phenotypic remote sensing and precision spraying, Wang et al. [219] proposed an adaptive composite disturbance rejection control algorithm. Considering variations in terrain conditions, Kraus et al. [220] introduced a nonlinear model predictive control strategy for autonomous tractor navigation, achieving an average deviation of 0.60 m during headland turning on damp and uneven grassland. Tang et al. [221] developed a linear active disturbance rejection control method optimized using an improved particle swarm optimization algorithm to mitigate the impact of environmental disturbances such as stones and potholes on agricultural machinery stability. Additionally, Yang et al. [96] applied adaptive and model predictive controllers in floor-based poultry houses to reduce wheel slippage caused by loose bedding materials. Several studies have implemented fault-tolerant control algorithms to address actuator faults in agricultural vehicles, thereby improving the stability of path tracking control. These methods effectively reduce the worst-case discrepancies between the desired and actual control inputs, while enhancing the fault tolerance and the robustness of system fault detection and diagnosis. More specific details are discussed in the following section [222,223]. Moreover, autonomous driving systems for agricultural machinery must coordinate the control of both mobile platforms and operational equipment. Yue et al. [224] proposed a multi-level coordinated control strategy to enable dynamic collaborative obstacle avoidance and trajectory planning for towed implements and mobile platforms. A notable aspect of motion control in agricultural machinery autonomy is the precision required in task execution. Combining the requirements of the operation tasks with the characteristics of the operation objects is crucial for improving the success rate of the operation and reducing crop losses. To address the problem that the picking action of autonomous harvesting robots tends to cause fruit drop damage, Sun et al. [225] used PID control to track the target position of the robotic arm and proposed adaptive input shaping control to suppress the vibration of the end effector. Yin et al. [226], considering the fragile nature of mushroom fruiting bodies as the operation object, adopted model predictive control as the low-level position and speed tracker and admittance control as the high-level controller, which enhanced the compliance of the robot picking operation. Chen et al. [227] proposed a force-feedback dynamic control method with a slip detection function. By using an ultrasonic sensor to detect the distance between the end effector and the operation object, and then using the feedback information to adjust the servo output torque, the success rate of the operation was enhanced.
Table 13. Applicable scenarios of different control algorithms.
Category | Algorithm | Applicable Scenarios | Example Reference
Model-Based Control | Pure Pursuit Control | Low-speed path tracking in structured environments such as dry farmland and livestock barns (e.g., AGV linear navigation, agricultural vehicle straight-line guidance). | [228,229]
Model-Based Control | Backstepping Control | Multi-degree-of-freedom coupled systems (e.g., UAV attitude control, robotic arm trajectory tracking). | [230,231]
Model-Based Control | MPC | High-precision dynamic trajectory tracking (e.g., cornering maneuvers, dynamic obstacle avoidance), multi-constraint optimization tasks (e.g., spraying, sorting operations). | [232,233]
Model-Based Control | Adaptive Control | Systems with uncertain parameters (e.g., agricultural machinery or robotic arms with varying payloads). | [234,235]
Model-Free Control | Fuzzy Control | Unstructured environments (e.g., muddy farmland, poultry house robot navigation), robotic arm vibration suppression. | [197,236]
Model-Free Control | PID | Steady-state environments (fixed-trajectory tracking), low-speed low-disturbance scenarios (e.g., cage chicken house inspection). | [43,237]
Model-Free Control | SMC | High-disturbance environments (e.g., paddy field skidding), scenarios requiring rapid response (e.g., emergency vehicle obstacle avoidance). | [238,239]
Model-Free Control | Neural Network Control | Complex dynamic tasks (e.g., lettuce sorting, apple picking), end-to-end control (e.g., autonomous driving vision navigation). | [240,241]
Table 14. Characteristics of different control algorithms.
Algorithm | Advantages | Disadvantages
Pure Pursuit Control | ① Simple implementation with low computational cost; ② Well-suited for low-speed structured path tracking | ① Performance highly sensitive to lookahead distance parameter tuning; ② Degraded performance under dynamic disturbances
Backstepping Control | ① Naturally adapted for nonlinear systems; ② Flexible hierarchical design, compatible with adaptive/robust strategies | ① High computational complexity requiring recursive derivation; ② Moderate dependence on model accuracy
MPC | ① Multi-step predictive optimization with explicit constraint handling; ② Adaptable to dynamic changes and complex paths | ① High computational overhead requiring real-time solvers; ② Performance degradation under model mismatch
Adaptive Control | ① Online parameter adjustment for time-varying disturbance rejection; ② No requirement for prior disturbance boundary information | ① Convergence depends on parameter estimation accuracy; ② Difficulty in stability analysis for complex systems
Fuzzy Control | ① No requirement for precise mathematical models, relies on expert knowledge; ② Strong capability in handling nonlinearities | ① Time-consuming rule base design process; ② Limited adaptability without sufficient data validation
PID | ① Simple structure, easily implemented into engineering; ② Low computational cost, suitable for embedded systems | ① Poor performance in nonlinear/time-varying systems; ② Dependence on manual parameter tuning with weak robustness
SMC | ① Strong robustness against disturbances and parameter variations; ② Fast convergence, well-suited for nonlinear systems | ① High-frequency chattering in traditional SMC implementations; ② Requirement for disturbance boundary information (mitigated in improved versions)
Neural Network Control | ① Model-free, data-driven approach; ② Capable of handling high-dimensional nonlinearities and dynamic changes | ① Requirement for large amounts of training data; ② High computational resource demands and poor interpretability
In practical applications, autonomous driving motion control algorithms for agricultural machinery continue to present inherent challenges. First, because agricultural robots typically operate in outdoor or harsh environments, their hardware configurations frequently face limited computing resources and insufficient energy supply. The feasibility of control algorithms in resource-constrained environments must therefore be thoroughly evaluated during their selection and implementation. These robots are predominantly equipped with embedded systems, which are characterized by limited processor performance, memory capacity, and battery endurance. Thus, a judicious balance between accuracy and efficiency must be achieved when selecting control algorithms [242]. Model-based control methods, although capable of providing high-precision control, incur substantial computational costs and depend on high-precision sensors; consequently, their implementation on devices with limited computing resources is challenging. For example, while model predictive control (MPC) can handle multi-constraint optimization problems, its online solution demands considerable computing power and typically requires a high-performance processor, which is often unfeasible given the inherent power and cost constraints of agricultural robots. In contrast, model-free control methods are more readily deployable on devices with limited computing capability. Specifically, PID control, with its simple structure and low computational burden, is particularly well-suited for fixed-trajectory tracking and scenarios characterized by low speed and minimal interference. However, PID control shows clear limitations in nonlinear and time-varying systems, including the need for manual parameter tuning and weak robustness. Neural network control can handle high-dimensional nonlinear problems; nevertheless, its training and inference are computationally intensive and difficult to execute in real time on low-power microcontrollers [243]. Therefore, in practical applications, acceptable tracking performance with limited resources is frequently achieved through the use of simplified models, reduced controller complexity (e.g., simplifying multi-degree-of-freedom models to equivalent low-dimensional models), or the integration of lightweight algorithms (such as fuzzy PID and adaptive sliding mode control). Additionally, to effectively address the uncertainties and dynamic changes characteristic of agricultural environments without incurring excessive computational and storage overheads, algorithms should be equipped with online learning or adaptive capabilities [242]. In the future, with the ongoing development of edge computing hardware and the advancement of algorithm lightweighting technologies, more complex control strategies are expected to be deployable on resource-constrained agricultural robot platforms. Second, although diverse agricultural autonomous driving motion control algorithms have been developed for various operational scenarios, most have been validated only under low-speed, small-curvature experimental conditions. Consequently, control schemes for high-speed, high-curvature, and other complex conditions remain a notable gap [244,245].
Third, owing to the absence of a unified data communication protocol, research concerning coordinated control between mobile platforms and operational equipment remains constrained [246]. Finally, the dynamic and kinematic models of the agricultural autonomous driving chassis are indispensable for the efficacious design of control algorithms. Future research is recommended to investigate further multi-parameter fusion intelligent control technologies for each subsystem of the agricultural autonomous driving chassis [247,248].

3.4. Actuator Technology

Actuators are fundamental to autonomous agricultural machinery, translating control system signals (electrical, hydraulic, or pneumatic) into mechanical motion to enable diverse robotic tasks. Their performance and reliability are paramount, directly impacting the precision, efficiency, and safety of agricultural operations. Owing to their varied functional requirements, actuators are broadly classified by application: drive actuators for locomotion and vehicle control, and manipulator actuators for intricate interaction tasks (e.g., grasping and cutting). The challenging agricultural environment often leads to actuator failures, necessitating robust fault detection and isolation (FDI) systems and effective fault-tolerant control strategies for continuous, reliable, and safe operation.

3.4.1. Drive Actuator

A drive actuator is tasked with converting the instructions of the control system into physical actions, subsequently actuating the steering motor and throttle controller for vehicle navigation tasks, as well as engaging the robotic arm and end effector to execute the machinery’s operational tasks. In agricultural vehicle autonomous driving, the actuator’s design must fully account for the operational requirements of various tasks, including working speed, driving torque, accuracy, and maximum handling weight. Traditional actuators primarily comprise three categories: electric, hydraulic, and pneumatic, as delineated in Table 15.
The motor actuator is the most commonly used type of drive actuator. It converts electrical energy directly into mechanical motion, using the force and torque generated by the motor to perform operational actions. This type of actuator has the advantages of easy implementation, high control accuracy, good environmental adaptability, convenient maintenance, and good reliability. Furthermore, by bypassing intermediate energy conversion stages, the motor actuator achieves higher efficiency and lower energy consumption than other drives [258]. The motor types commonly used in agricultural machinery autonomous driving include DC, AC, stepper, and servo motors [259]. DC motors are driven by the rotational force produced from DC electricity and offer large low-speed torque, fast start–stop response, good speed regulation, and simple control. They are suitable for steering control systems and for driving small autonomous agricultural vehicles. However, the wear rate of the brushes and armature of this type of motor is relatively high. Brushless DC motors have higher efficiency and longer service life, but their cost is also higher [258]. In contrast, AC motors have lower maintenance costs and higher reliability [66]; they are therefore better suited to driving threshing rollers in high-power, continuous-load equipment such as combine harvesters [260]. Unlike the continuous rotation of DC and AC motors, stepper motors rotate to the required angle by advancing a fixed step for each electrical pulse signal, enabling simple and precise position control and staged motion control. However, they are typically limited to low-speed operation and step-wise positioning tasks [261]. The servo motor is a high-precision actuator based on closed-loop control, which can adjust position or speed through feedback and is a research hotspot in precision agriculture [262]. In vehicle navigation, it is usually used to precisely control the traveling distance and turning angle of the vehicle to avoid collisions or damage to crops [263,264]. In machinery operation, most recent studies have constructed hand–eye servo control systems by integrating servo motors with robotic arms, end effectors, and visual perception devices. Such a system can obtain multi-angle visual information by adjusting the position of the visual sensor and can also adjust the movement trajectory of the machinery according to the posture and position of the operation object, thereby achieving complex, flexible, and non-destructive agricultural operation control [265,266,267,268,269]. In conclusion, electric motor actuators have been widely applied in autonomous driving of agricultural machinery. However, in agricultural environments with high dust levels and complex terrain, these actuators still face problems such as power device burnout, bearing failure, and encoder malfunction. Regular fault detection and systematic maintenance are necessary to extend their service life.
A hydraulic actuator provides power, adjustment, and assistance for agricultural machinery through hydraulic pressure and flow. A typical hydraulic system includes a hydraulic pump, a hydraulic cylinder, a hydraulic motor, a storage tank, and control valves [270]. These components can provide higher pressure and greater torque output, enabling the system to complete heavy agricultural tasks in a smaller size [253,271]. Furthermore, the hydraulic cylinder and the hydraulic motor provide mechanical energy for linear motion and rotational motion, respectively, which facilitates complex and flexible motion control of autonomous agricultural machinery [255]. Despite this progress, hydraulic actuators still lag behind motor actuators in control accuracy. In addition, hydraulic actuators have a higher failure rate; common faults fall into four categories: insufficient power of the hydraulic system, an unstable hydraulic drive process, leakage or component damage, and excessively high system temperature with oil deterioration [272].
A pneumatic actuator achieves motion control using compressed gas. A pneumatic system is mainly composed of a pneumatic motor, an oscillating gas actuator, a gas compressor or vacuum pump, a gas storage tank, and control valves. Pneumatic actuators are generally used to grasp light objects or perform light agricultural operations. So far, mechanical equipment based on pneumatic actuators has been developed for operations such as planting, fertilization, plant disease diagnosis, and pesticide spraying [273]. Although pneumatic actuators have the advantages of cleanliness and safety, fast response, simple maintenance, and light weight, the compressibility of air makes precise motion control challenging. In this regard, Pi et al. [274] simulated the muscle characteristics and movement principles of octopus tentacles and designed a flexible pneumatic actuator that can drive the extension, shortening, and bending actions of the end effector. Gao et al. [256] proposed a pneumatic finger-end actuator for cherry tomato picking, which can imitate human picking actions to harvest tomato fruits. Their results show that accurately identifying the location of the fruits is crucial for improving the picking success rate. Therefore, future research can further enhance the performance of pneumatic actuators by improving fruit recognition and positioning accuracy and by enhancing the adaptability of end effectors. Furthermore, since the reliability of pneumatic actuators is highly dependent on the quality of the compressed gas, insufficient air pressure and pipeline leakage can easily lead to actuator failure and thus to control failure. For such equipment, it is necessary to install a real-time air pressure monitoring system to achieve fault early warning.
With the continuous advancement of precision agriculture, autonomous driving of agricultural machinery will be applied to increasingly complex operation scenarios and tasks, and traditional actuators can no longer meet the diverse operational requirements. Recent studies have developed novel actuators based on new working principles, such as magnetostrictive actuators [275], piezoelectric actuators [276], fluid variable pressure actuators [277,278,279], shape memory material actuators [280,281], ultrasonic actuators [282], and optical actuators [283]. These innovative drive technologies break through the physical limitations of traditional mechanical transmission and are expected to enhance the intelligence level of agricultural equipment and to push autonomous agricultural machinery towards lightweight, high-precision, non-destructive operation and multimodal collaboration.
Given that autonomous agricultural machinery typically operates for extended durations in complex and unstructured outdoor environments, drive actuators are susceptible to adverse effects from factors such as rain, direct sunlight, vibration, and repetitive operations, thereby leading to frequent failures [284,285,286]. Typical failures encompass hydraulic system leaks, insufficient power, electrical system sensor malfunctions, and damage to electromagnetic valve coils. For example, in the application of agricultural trailers for braking, the hydraulic system often experiences pressure loss due to seal failure, which can result in environmental pollution and a complete loss of braking capability. In response to this, Kisiel et al. proposed the utilization of additional purification filters, dryers, and air tanks to enhance system reliability and the activation of an emergency braking mechanism in the event of pipe rupture to prevent complete failure [287]. Additionally, electrical actuators, such as the electromagnetic valves in automatic irrigation systems, are susceptible to high-humidity environments, where they may experience issues such as decreased coil insulation performance and abnormal current reception, potentially leading to valve actuation failure. For such failures, lightweight detection methods based on power consumption signals demonstrate considerable application potential. For instance, Ahmed et al. monitored the power consumption state of the electromagnetic valve using an ammeter sensor, and through the combination of digital filtering and threshold analysis, high-precision fault classification was achieved without reliance on complex machine learning models. The experimental results indicated that the proposed method achieved an average accuracy of 99.9% and 99.7% for detecting abnormal control of two-channel electromagnetic valves [288]. Moreover, autonomous agricultural machinery may also experience various actuator failures, such as instances of nozzle non-opening and incorrect machine status, which can severely compromise operational quality. In consideration of the diverse issues that may arise within agricultural machinery systems, Conesa-Munoz et al. developed a supervision system based on a distributed and multi-level architecture. This system is capable of analyzing all information provided by the vehicle’s sensors and subsystems in real time and alerting the user upon the detection of a fault (including actuator failure), thereby facilitating more effective fault management and repair [289].
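As a simplified illustration of such power-consumption-based monitoring, the sketch below filters a current trace with a moving average and flags samples that leave an expected band; the nominal current, tolerance, and window length are assumed values and are not taken from the cited solenoid-valve study.

```python
# Minimal signal-based actuator fault detection from a current measurement:
# a moving-average filter suppresses noise, and a fixed band flags abnormal draw.
# Nominal current, tolerance, and window length are illustrative assumptions.
import numpy as np

def detect_current_fault(samples, nominal_amps=0.45, tolerance=0.10, window=10):
    """Return sample indices where the filtered current leaves the expected band."""
    samples = np.asarray(samples, dtype=float)
    kernel = np.ones(window) / window
    filtered = np.convolve(samples, kernel, mode="valid")   # moving-average filter
    deviation = np.abs(filtered - nominal_amps)
    # Report indices at the centre of each filter window.
    return np.flatnonzero(deviation > tolerance) + window // 2

# Simulated current trace: normal draw for 100 samples, then an open-coil drop to ~0 A.
trace = np.concatenate([0.45 + 0.02 * np.random.randn(100), 0.02 * np.random.randn(50)])
fault_indices = detect_current_fault(trace)
print("fault detected" if fault_indices.size else "no fault", fault_indices[:5])
```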

3.4.2. Manipulator Actuator

A manipulator actuator is defined as a mechanical arm system employed to perform physical interaction tasks in agricultural environments. Its core function involves the achievement of precise and adaptive operations on crops through the integration of multi-degree-of-freedom mechanical arms and highly customized end effectors. Unlike drive actuators that only provide power output, manipulator actuators prioritize the execution of a series of complex tasks, including recognition, positioning, grasping, and cutting, within unstructured and dynamically changing agricultural scenarios. Their performance is a direct determinant of the success rate and applicability of agricultural robots [290,291,292].
The mechanical arm serves as the kinematic framework of the manipulator actuator and is typically constituted by a series of links and joints. Its structural configuration directly affects the system's workspace, motion flexibility, and load capacity. Depending on its configuration, the mechanical arm may be categorized into three primary types: serial, parallel, and coordinate. Serial mechanical arms adopt an open-chain joint structure, exhibiting a large workspace and high flexibility. They are extensively utilized in applications such as fruit harvesting in orchards and greenhouse crop management [293]. Parallel mechanical arms connect the moving platform and the static platform through multiple motion chains, thereby possessing high stiffness, large load capacity, and high precision. They are instrumental in tasks such as fruit sorting and seedling transplantation within high-speed operation scenarios [294]. Some studies have designed hybrid manipulators that integrate serial and parallel structures. For instance, Li et al. developed a serial–parallel hybrid-joint manipulator for a citrus harvesting robot, which expanded the workspace while simultaneously improving stiffness and precision [295]. Coordinate-type mechanical arms (including Cartesian, cylindrical, and polar coordinate robots) are mechanisms designed to execute linear, curvilinear, and/or other motion trajectories within a spatial coordinate system. This type of mechanical arm exhibits a simple structure and high positioning accuracy, making it particularly suitable for fixed-point operation tasks in structured greenhouse environments [293]. Additionally, based on the number of mechanical arms, manipulators can be further categorized into single-arm systems and multi-arm collaborative systems. Multi-arm systems achieve higher operational efficiency through cooperative control strategies, including task allocation and collision avoidance planning. For example, in a dual-arm tomato or apple harvesting system, the mechanical arms can simultaneously perform fruit positioning and cutting, thereby substantially enhancing overall harvesting efficiency [296].
The end effector, defined as the component directly interacting with the crop, necessitates a design tailored to specific operation objects and task requirements. According to their operation mode, end effectors can be categorized into pure removal and grasping removal types. Pure-removal-type end effectors consist of cutting tools such as scissors and blades and are typically employed for harvesting, pruning, and weeding operations [297,298]. Grasping-removal-type end effectors include grasping and removal components and can be further subcategorized into vacuum suction, soft grasping, and mechanical grasping types. The vacuum suction type is appropriate for non-destructive harvesting of delicate fruits (e.g., tomatoes and strawberries), achieving non-contact grasping through negative pressure adsorption. While this effectively reduces mechanical damage, it is susceptible to interference from fruit surface attachments and exhibits unstable adhesion effects on complex-shaped, soft, and fragile fruits [299]. Soft grasping devices imitate the structure and movement of biological muscles. These are usually driven by pneumatic or hydraulic methods and achieve adaptive wrapping-based grasping of crops through material elastic deformation, rendering them particularly suitable for operations on irregularly shaped or fragile crops. Common soft grasping end effectors include pneumatic soft hands, cable soft hands, and shape memory alloy soft hands. Mechanical-gripping-type end effectors are mostly rigid structures, making them suitable for the grasping and transportation of objects with certain strength, such as branches. However, their gripping force needs to be precisely controlled through force sensors or control algorithms to prevent damage to crops [213].
From a material perspective, manipulator actuators may be classified into rigid, flexible, and rigid–flexible coupling structures. Rigid actuators are mostly composed of metals or high-strength engineering plastics, exhibiting stable structures, strong load-bearing capacity, and high control accuracy. However, they demonstrate poor adaptability and are susceptible to surface damage or abrasion when interacting with irregularly shaped crops [300]. Flexible actuators, on the other hand, utilize soft materials such as silicone, which can effectively reduce operational damage to fragile crops. However, they also face challenges such as complex control, poor durability, low load-bearing capacity, and stringent air-tightness requirements [301,302]. The hybrid structure of rigidity and flexibility combines the structural stability of rigid materials with the adaptability of flexible materials. For example, a rigid structure may be employed in the main body of the robotic arm to ensure load-bearing capacity, while flexible materials or soft robotic mechanisms are integrated into the end grasping unit, thereby enabling higher levels of operational fault tolerance and precision in complex agricultural scenarios. This has become an important research direction in the design of agricultural robot actuators.
Agricultural manipulators also face various failure challenges in practical applications. Positioning errors can exceed 50 mm outdoors due to environmental disturbances such as wind, vibration, and changes in lighting. Solutions involve the use of fault-tolerant end effectors, the improvement of stereo vision algorithms, and the integration of multi-sensor information (such as IMU and encoders) for real-time correction [303,304]. Obstacle interference constitutes another prevalent issue, particularly in dense crop environments where the robotic arm is prone to colliding with branches, leaves, and other fruits. Increasing degrees of freedom can improve motion flexibility, or real-time motion planning algorithms such as RRT* can be introduced for dynamic obstacle avoidance [213,272]. Fruit damage is often caused by excessive grasping force or inaccurate operation positions. To mitigate this, the deployment of flexible end designs, force-feedback control, and bionic grasping strategies can significantly reduce the damage rate. Mechanical wear is frequently observed during long-term, high-load operations, especially in harsh environments. It is recommended to use corrosion-resistant materials, adopt modular design for maintenance, and introduce predictive maintenance mechanisms based on sensor data [272]. Finally, workspace limitations often prevent the robotic arm from reaching peripheral or concealed fruits. The addition of mobile platforms, the utilization of telescopic structures, or the development of adaptive control algorithms based on real-time pose estimation can expand their operational range [305,306].

3.4.3. Actuator Fault Detection and Fault-Tolerant Control

The malfunction of an actuator can prevent it from completing its predetermined function within the specified time or from meeting expected constraints, consequently leading to a significant reduction in the control accuracy and operational efficiency of agricultural machinery, and potentially triggering safety accidents [307]. To ensure the reliability and safety of autonomous agricultural machinery, a collaborative design approach is imperative at both the hardware and software levels. At the hardware level, redundant design of actuators (such as multiple motor drives [223,308]) may be implemented to facilitate function reconstruction through redundant units when a specific actuator fails, or to adapt to diverse operational requirements through different actuator groups [309]. At the software level, a high-precision fault detection and isolation system (FDI) and effective fault-tolerant control strategies are essential for maintaining the stable operation of the system.
The fault detection and isolation system (FDI) relies on sensors to obtain key status information of the actuator and performs fault discrimination through real-time comparison with standard model signals. Once the signal deviation exceeds the preset threshold, a fault alarm is triggered [310,311]. For different types of actuators (electric, hydraulic, pneumatic), corresponding sensors should be selected (e.g., current, hydraulic, or pressure sensors) and signal processing techniques should be used to suppress noise and improve data reliability [288]. The existing FDI methods mainly include four types: (1) Model-based methods, which rely on system dynamics to calculate residuals; these possess low computational complexity but are only suitable for systems of limited complexity. (2) Signal-based methods, which utilize frequency or vibration characteristics to identify anomalies; these do not require precise modeling but exhibit poor noise resistance. (3) Data-driven methods, which employ dimensionality reduction and statistical/machine learning techniques to mine potential fault patterns; these are suitable for complex systems but are highly dependent on data quality and scale. (4) Knowledge-based methods, which draw on expert experience and rule libraries to achieve fault discrimination and assessment; although capable of simulating human reasoning, they incur high knowledge acquisition costs [289,312,313,314,315].
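A minimal example of the model-based category is sketched below: a unicycle kinematic model predicts the next pose from the commanded velocities, and a residual larger than a preset threshold flags a possible actuator fault (or severe wheel slip). The model, threshold, and numerical values are illustrative assumptions.

```python
# Minimal model-based residual check for an actuator/drive fault. A unicycle kinematic
# model predicts the next pose from the commanded velocities; a large deviation of the
# measured pose from the prediction is flagged. Threshold and model are illustrative.
import math

def predict_pose(pose, v_cmd, w_cmd, dt):
    """Propagate (x, y, heading) with commanded linear and angular velocity."""
    x, y, th = pose
    return (x + v_cmd * math.cos(th) * dt,
            y + v_cmd * math.sin(th) * dt,
            th + w_cmd * dt)

def residual_fault(pose_prev, pose_meas, v_cmd, w_cmd, dt, threshold=0.15):
    """Return (is_fault, residual) based on the position prediction error in metres."""
    x_p, y_p, _ = predict_pose(pose_prev, v_cmd, w_cmd, dt)
    residual = math.hypot(pose_meas[0] - x_p, pose_meas[1] - y_p)
    return residual > threshold, residual

# Example: wheel slip or a drive fault makes the tractor advance less than commanded.
prev = (0.0, 0.0, 0.0)
measured = (0.05, 0.0, 0.0)          # moved 5 cm although about 0.3 m was commanded
print(residual_fault(prev, measured, v_cmd=1.5, w_cmd=0.0, dt=0.2))
```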
After a fault is identified, the fault-tolerant control strategy can adjust the system according to the fault’s type and degree in order to maintain continuous operation [316]. Fault-tolerant control is divided into active fault-tolerant control (AFTC) and passive fault-tolerant control (PFTC). AFTC relies on FDI or observers to obtain fault information in real time and it dynamically reconfigures the control system or adjusts the controller parameters. Common observers include error observers [317], extended state observers [318], sliding mode observers [319], high-gain observers [320], and disturbance observers [321]. These observers are capable of effectively estimating system states and fault parameters, as well as compensating for faults and external disturbances. AFTC technology has been widely applied in the field of agricultural machinery autonomous driving, for example, in trajectory tracking control of unmanned multi-rotor systems [322,323] and unmanned ground vehicles [324,325]. However, AFTC exhibits a high dependency on real-time and accurate fault information [326]. In contrast, PFTC is an offline design control strategy that does not rely on real-time fault information, nor does it require FDI support and online controller reconfiguration. Consequently, it offers lower latency, stronger robustness, and a simpler control architecture [327]. However, its design needs to cover all potential fault modes and requires the development of corresponding fault-tolerant strategies based on the fault probability distribution [328]. In recent years, research has optimized controller design through probability constraint relaxation strategies, wherein robustness constraints are appropriately relaxed for low-probability faults to expand the parameters’ feasible domain. This approach thereby enhances overall system performance while maintaining the core fault-tolerant capability, as exemplified by dynamically switching fault-tolerant strategies based on the probability of actuator faults [329] or fault-tolerant controllers [330].
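The following highly simplified simulation sketches the AFTC idea for a single first-order actuator channel: an observer-style estimator tracks a lumped additive fault from the prediction error, and the nominal command is augmented to compensate. Gains, time constant, and the fault profile are illustrative assumptions, not a reproduction of any cited controller.

```python
# Highly simplified active fault-tolerant control (AFTC) sketch for one actuator channel:
# a first-order speed loop with an additive actuator fault. An observer-style estimator
# tracks the lumped fault from the prediction error, and the nominal command is augmented
# to compensate. All values are illustrative only.
dt, tau = 0.05, 0.8          # sample time [s], actuator time constant [s]
kp, l_gain = 2.0, 4.0        # proportional gain, fault-estimator gain
v, v_ref = 0.0, 1.0          # actual and desired forward speed [m/s]
f_hat = 0.0                  # estimated additive actuator fault

for k in range(200):
    fault = -0.6 if k > 100 else 0.0           # fault appears halfway through the run
    u_nom = v_ref + kp * (v_ref - v)           # nominal feedforward + proportional control
    u = u_nom - f_hat                          # augment the command to cancel the fault
    v_pred = v + dt * (u + f_hat - v) / tau    # model prediction using the fault estimate
    v = v + dt * (u + fault - v) / tau         # true plant response with the real fault
    f_hat += l_gain * (v - v_pred)             # update the estimate from the prediction error

print(round(v, 3), round(f_hat, 3))            # speed returns near v_ref and f_hat ≈ fault
```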
These fault-tolerant control strategies mainly target drive actuators; however, in the field of agricultural robots, the fault-tolerant design of manipulator actuators is equally crucial. To address the visual positioning errors and actuator deviations caused by disturbances (such as wind and mechanical vibrations) in unstructured agricultural environments, Zou et al. proposed a fault-tolerant design method for the fruit picking end effector based on visual positioning errors. By treating random positioning errors as "fault errors" and incorporating them into the mechanism design, combined with modeling of the clamping mechanism–tool relationship and fault-tolerant mathematical programming, the end effector can stably complete the operation within the error range (z-direction error ≤ 60.1 mm, x-direction error ≤ 17.2 mm). This design achieved indoor and outdoor success rates of over 84% and 78%, respectively, in lychee and citrus picking experiments [303]. Xie et al. further emphasized that the end effector should be robust to machine errors (e.g., visual positioning deviations, mechanical arm motion errors) and capable of absorbing errors through flexible materials and adaptive grasping strategies (such as enveloping grasping and vacuum suction compensation), thereby preventing operational failure due to position deviations [272]. For example, in apple picking, the utilization of flexible materials for the end effector or a flexible–rigid coupling structure can compensate for positioning and grasping errors within a certain range, leading to a significant reduction in the fruit damage rate and an improvement in the picking success rate.
Although fault-tolerant control for single actuator failures has been widely studied, when agricultural machinery operates continuously in the field, concurrent failures of multiple actuators may be encountered due to vibrations, environmental erosion, and similar factors. Traditional fault-tolerant methods struggle to maintain the continuous operation of the system [331]. To address this issue, scholars have proposed several novel solutions. For example, Wang et al. [331] modeled multiple faults as composite disturbances and utilized a multi-fault nonlinear observer to achieve aircraft tracking control. Ma et al. [332] designed a distributed adaptive direct fault compensation strategy for fault-tolerant formation control under multiple actuator failures. Furthermore, Wang et al. [333] investigated the synchronization problem of memristive neural networks (MNNs) in the presence of multiple actuator failures and proposed a state feedback synchronization method based on FTC. In addition, the agricultural autonomous driving system is required to exhibit actuator failure adaptability and high-precision control performance within a limited time. By combining event-triggered control, finite-time control, and fault-tolerant technology, the stability, resource utilization, and finite-time convergence ability of the system can be significantly enhanced [334,335]. However, the application of this direction in agricultural fault-tolerant control remains insufficient and warrants further exploration. Future research should place a greater emphasis on the overall fault-tolerant performance of the actuator system, integrating hardware redundancy, intelligent materials (such as flexible actuators), and advanced control algorithms (such as learning-based fault-tolerant control) to enhance the reliability and intelligent adaptability of agricultural robots in complex unstructured environments. At the same time, increased emphasis should be directed toward research on fault-tolerant control under multi-actuator collaborative failures, thereby promoting the development of agricultural robots towards higher levels of autonomy and intelligence.

4. Application Status

As an autonomous operation device with multiple degrees of freedom, the agricultural autonomous driving mechanical system integrates the capabilities of perception, decision-making, control, and execution, while the multi-machine collaborative system also has efficient communication capabilities. By integrating advanced technologies such as artificial intelligence, cloud computing, and big data, researchers have developed multi-functional autonomous driving agricultural machinery. According to the different objects of operation, these devices can be classified into four major categories: autonomous driving for farmland, autonomous driving for orchards, autonomous driving for livestock and poultry breeding, and autonomous driving for aquaculture. In addition, agricultural production is a long-term cycle, involving multiple functional links such as sowing, planting, cultivation, harvesting, processing, monitoring, transportation, environmental cleaning, and health protection. Therefore, autonomous driving agricultural machinery can also be classified according to functional types.

4.1. Autonomous Driving in Farmland

Farmland, as the core carrier of the agricultural production system, undertakes the production and planting functions of grain, feed, and economic crops. The traditional farmland production operation mode relies on a large number of laborers. Currently, the shortage of laborers has become a key bottleneck restricting the production efficiency of farmland. Using autonomous driving in farmland to replace humans in performing agricultural production tasks can not only effectively address the challenge of food shortages but also promote the intelligent transformation process of agricultural production. Farmland autonomous driving refers to an intelligent equipment system that realizes fully unmanned operations in unstructured farmland environments by integrating advanced technologies such as environmental perception, decision-making planning, and execution control [336]. At present, the main application scenarios of autonomous driving in farmland cover intelligent operation platforms throughout the entire production cycle, including tillage and land preparation, sowing, field management, monitoring, and harvesting, forming a closed-loop operation chain from soil tillage to crop harvesting, as shown in Figure 7.

4.1.1. Autonomous Driving for Farming and Land Preparation

Farming and land preparation, as fundamental links in the agricultural production chain, have operation processes with the significant characteristic of highly repetitive physical labor. The application of autonomous driving technology can not only effectively reduce the labor intensity of farmers, but also improve farming efficiency and operation quality, providing key technical support for digital agriculture [337]. Therefore, researchers at home and abroad are committed to developing more intelligent autonomous driving machinery for farming and land preparation. The AgBot series of equipment launched by the Dutch company AgXeed, as the world’s first mass-produced autonomous driving machinery for farmland, integrates a Global Positioning System, sensors, and an optical obstacle recognition system and supports the continuous adjustment of a width ranging from 1.8 to 3.0 m. Its working device can be folded onto the top of the unmanned vehicle and can be connected to the tractor through an independent double-wheel axle or traction rod to meet different working requirements [338]. Tamaki et al. [339] developed an agricultural autonomous driving system composed of three devices, specifically for large-scale rice cultivation, in order to enhance the automation level of agricultural production and reduce the reliance on manual labor. Matsuo et al. [340] designed autonomous tillage machinery based on the traditional tractor platform whose working capacity is comparable to that of a manually controlled tractor and realizes unmanned rotational tillage in a rectangular field with the help of a navigation system. Furthermore, Zheng et al. [341] developed an intelligent autonomous driving device for paddy field rotary tillage on flat land based on the dual antennas of the Global Navigation Satellite System, and optimized the design of its hydraulic control system to enhance the operational stability and accuracy.

4.1.2. Autonomous Driving for Sowing

Sowing is the key and primary link in farmland production. Sowing autonomous driving agricultural machinery can sow seeds at accurate positions, thus saving farmers’ time and financial costs [342]. Bhimanpallewar et al. [343] designed and tested a robot for automatic sowing and micro-dose fertilization based on robot technology and Internet of Things technology. It can not only accurately quantify the number of seeds to be sown but can also control the distance between different sowing points. Kumar et al. [344] developed an intelligent sowing autonomous driving device, which controlled the movement and sowing of the mechanical system through a mobile application, achieving a fully automated sowing process. Zhu et al. [345] designed a combined operation machine for stratified fertilization and sowing of winter wheat with wide seedling strips, which can meet the agricultural demands of rotary tillage and land preparation, the stratified application of fertilizers, and precise sowing in the farmland environment. The autonomous sowing equipment developed by Shaikh et al. is equipped with advanced obstacle detection functions and AI cameras, and is capable of automatically navigating through fields and conducting autonomous sowing [346]. Griepentrog et al. [347] developed an autonomous sowing machine using pre-stored map data, achieving grid-like sowing and perforated sowing, and improving the accuracy of sowing. Santhi et al. [348] proposed an agricultural autonomous driving robot based on sensors and vision, which can navigate on any farmland and perform seed sowing operations simultaneously. Azmi et al. [349] developed a low-cost automated system for crop seeding in farmland. This system consists of a mobile chassis module and a precision seeding component, demonstrating all-terrain adaptability while maintaining the ability of continuous seedling implantation. Experimental data show that, compared with the manual sowing method, this system can achieve an increase of more than 35% in operational efficiency, fully realize the autonomous navigation and sowing functions, and effectively reduce labor costs.

4.1.3. Autonomous Driving for Field Management

Field management encompasses key agricultural operations such as irrigation, fertilization, weeding, and pest control. These tasks span the majority of the crop growth cycle and constitute a significant portion of total production costs. Automated irrigation systems regulate water flow via computer-controlled mechanisms to provide adequate hydration for crops and ensure their normal metabolic functions. This technology facilitates precise water usage, thereby enhancing water use efficiency and addressing the challenges posed by water scarcity [350]. Cheng et al. [351] proposed an electric spraying robotic arm mounted on unmanned agricultural machinery for large-scale irrigation tasks, which significantly reduces labor input. Autonomous irrigation equipment further integrates advanced path planning and autonomous navigation technologies, enabling stable and accurate irrigation in farmlands with complex topography [305]. By deploying sensor and actuator nodes, such systems can dynamically regulate water discharge based on soil moisture and temperature data and flexibly control actuator valves, thus contributing to resource conservation. For instance, Lamsen et al. [352] developed an autonomous irrigation device that detects soil dryness via humidity sensors and modulates water flow through an Arduino microcontroller, effectively minimizing water waste. Similarly, Bodunde et al. [353] introduced an intelligent mobile sprinkler irrigation robot based on ZigBee wireless communication. This system overcomes the spatial limitations of fixed irrigation installations, reduces crop damage, and helps maintain optimal soil moisture balance.
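As a minimal sketch of how such sensor-driven irrigation control can be structured, the following hysteresis rule opens and closes a valve based on soil-moisture readings; the thresholds and the valve interface are hypothetical placeholders rather than the design of any cited system.

```python
# Minimal hysteresis-based irrigation control sketch. Moisture thresholds and the
# set_valve() interface are hypothetical placeholders, not a cited system's design.
DRY_THRESHOLD = 30.0   # % volumetric soil moisture below which irrigation starts (assumed)
WET_THRESHOLD = 45.0   # % soil moisture above which irrigation stops (assumed)

def set_valve(open_valve: bool) -> None:
    """Placeholder for the actuator interface (e.g., a relay or solenoid driver)."""
    print("valve", "OPEN" if open_valve else "CLOSED")

def irrigation_step(moisture: float, valve_open: bool) -> bool:
    """Update the valve state with hysteresis to avoid rapid on/off switching."""
    if not valve_open and moisture < DRY_THRESHOLD:
        valve_open = True
    elif valve_open and moisture > WET_THRESHOLD:
        valve_open = False
    set_valve(valve_open)
    return valve_open

# Example with a sequence of simulated soil-moisture readings (%).
state = False
for reading in [50, 42, 33, 28, 31, 40, 47, 49]:
    state = irrigation_step(reading, state)
```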
Fertilizing autonomous driving agricultural machinery can provide essential nutrients according to the growth status of crops, promoting their growth and development. Cruz Ulloa et al. [354] conducted selective fertilization by using multiple sensors such as laser, multispectral, and RGB to distinguish green and red cabbages in farmland. Considering the waste of fertilizers and the environmental pollution caused by excessive or unbalanced use of fertilizers, some researchers have developed autonomous driving devices for precise fertilization. Mao et al. [355] developed an agricultural robot for autonomous spraying and fertilization, which effectively avoided the residue and waste of chemical fertilizers caused by surface injection and improved the effective utilization rate of chemical fertilizers. The autonomous driving fertilization robot proposed by Zhu et al. [356] uses computer vision technology to obtain the category and phenotypic information of plants, and then formulates corresponding fertilization decisions to achieve efficient and accurate variable fertilization. An autonomous driving robot named SwarmBot developed by an Australian team can obtain physiological data of crops during movement based on machine learning algorithms, thereby applying an appropriate amount of fertilizer [23]. Similarly, an agricultural autonomous driving robot launched by Small Robot, a British sustainable agriculture startup, also analyzes the physiological data of crops through intelligent sensing technology and artificial intelligence algorithms to implement variable fertilization strategies [357].
Autonomous driving agricultural machinery for weeding mainly includes three types: chemical spraying, mechanical weeding, and thermal weeding (Table 16). Chemical weeding robots use directional spraying of herbicides to kill or inhibit the growth of weeds. AgBot II is an agricultural autonomous vehicle with an automatic navigation function developed by researchers from Queensland University of Technology. It is capable of detecting and classifying weeds in fields and spraying chemical herbicides to kill them [358]. To minimize the use of chemical herbicides and environmental pollution and avoid weed resistance as much as possible, recent studies have controlled nozzles through artificial intelligence technology to precisely trace the application of herbicides to the identified target weeds [359,360]. The SprayBox crop protection robot developed by Verdant Company of the United States is composed of 50 nozzles and a complex computer system. It can target individual weeds and crops at a speed of 20 times per second and spray herbicides with millimeter accuracy. Compared with traditional spraying techniques, it saves about 95% of the usage of chemical herbicides [22]. The autonomous driving weeding robot developed by the Swiss company EcoRobotix achieves precise positioning and weed identification through GPS RTK (real-time kinematic), cameras, and sensors, and uses two articulated arms to spray micro-doses of herbicide [361]. In addition, drones have strong applicability in chemical spraying for weeding. The unmanned aerial vehicle Hylio AG-130 (Hylio, Katy, TX, USA) developed in the United States is capable of applying herbicides to large-scale farmland through a high-precision spraying system [362]. The unmanned aerial vehicle EA-30X-Pro (EFT Aerial Robotics, Shenzhen, Guangdong, China) developed in China is equipped with binocular environmental perception technology, which can achieve automatic image calibration of target crops and precise low-altitude fixed-point spraying [363]. As environmentally friendly intelligent agricultural machines, mechanical weeding robots usually utilize advanced technologies or sensors to identify and distinguish crops from weeds, and adopt mechanical end effectors to cut, pull out, burn, and bury weeds in order to eradicate or interrupt their growth. They can not only effectively reduce labor costs but also contribute to sustainable agriculture [364,365]. Tillett et al. [366] guided agricultural vehicles along crops through a machine vision system and hydraulic drive discs and used steering hoes with interrow tillage blades and two rotating cutting discs to remove weeds within the crop rows. The field trial results show that the equipment causes very low damage to crops and can effectively remove 62–87% of the weeds within a 240-mm radius around the crops. The intelligent weeding equipment developed by Sori et al. [367] adopts a dual-wheel drive structure, integrates touch sensors and rotational orientation sensors, and can remove weeds by stirring the soil and inhibit their growth by blocking sunlight. ETH Zurich has developed an autonomous robot named Rowesys for dealing with dense weeds, which is capable of pulling weeds out of crop rows [368]. Although mechanical weeding autonomous driving robots have promoted the development of environmentally friendly agriculture and improved the efficiency of agricultural production, the mechanical tools such as hoes and rotating blades they use may cause damage to nearby crops.
Furthermore, the soil disturbance during operation may harm beneficial soil organisms [369,370]. Thermal weeding eliminates weeds with heat-based end effectors such as flame, laser, and other heat treatments. It can destroy membrane proteins in plant cells or inhibit the growth of the apical meristem of weeds without harming crops or soil organisms, and it causes fatal damage to some weed species; it is therefore regarded as a promising means of weed control [371,372]. Zhao et al. designed an autonomous laser weeding device for strawberry fields based on the YOLOv8-pose computer vision framework, achieving precise identification of weed growth points, autonomous navigation, and laser weeding, and enabling efficient weeding without damaging strawberry seedlings [373]. The Dino autonomous robot developed by the French company Naio Technologies integrates cameras and GPS monitoring devices; it can move autonomously among crops and precisely eradicate weeds with lasers [374]. The self-propelled weeding system LaserWeeder, developed by Carbon Robotics in the United States, uses the company's Carbon AI deep learning system to locate and eliminate weeds with sub-millimeter accuracy [375]. Xiong et al. separated and eliminated weeds with an autonomous platform equipped with machine vision and laser indicators, achieving a success rate of up to 97% [376].
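Across the chemical and thermal systems above, one recurring control step is mapping an image-space weed detection onto the actuator (a boom nozzle or a laser turret) that covers it and timing the pulse for the vehicle's forward motion. The following minimal sketch illustrates that mapping for a nozzle boom; the boom geometry, camera-to-boom offset, ground speed, and class labels are illustrative assumptions, not parameters of any of the cited systems.

```python
# Hypothetical boom geometry: 50 nozzles evenly spaced across the camera's field of view.
NUM_NOZZLES = 50
IMAGE_WIDTH_PX = 1920          # camera resolution (assumed)
GROUND_SPEED_MPS = 1.5         # forward speed of the sprayer (assumed)
CAMERA_TO_BOOM_M = 0.60        # distance from the imaged strip to the nozzle bar (assumed)

def schedule_nozzles(detections):
    """Convert weed detections [(x_center_px, y_center_px, class_name), ...]
    into (nozzle_index, delay_s) activation commands."""
    commands = []
    for x_px, _y_px, cls in detections:
        if cls != "weed":      # act only on targets classified as weeds
            continue
        # Map the image column to the nozzle whose footprint covers it.
        nozzle = int(x_px / IMAGE_WIDTH_PX * NUM_NOZZLES)
        nozzle = min(max(nozzle, 0), NUM_NOZZLES - 1)
        # Delay the pulse until the weed passes under the boom.
        delay = CAMERA_TO_BOOM_M / GROUND_SPEED_MPS
        commands.append((nozzle, round(delay, 3)))
    return commands

# Example: two weeds and one crop plant detected in the current frame.
print(schedule_nozzles([(150, 400, "weed"), (960, 410, "crop"), (1700, 390, "weed")]))
# -> [(3, 0.4), (44, 0.4)]
```

In a laser-based system the same detection would instead be converted into turret pan and tilt angles, but the detection-to-actuator timing logic is analogous.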
Autonomous machinery for pest control is similar in operational nature to that for weeding: it safeguards crop health mainly by detecting pests and diseases and then spraying chemical pesticides or releasing natural enemies of the pests. Martin et al. developed an autonomous agricultural robot integrating navigation, detection, and operation to deal with pests at an early stage and avoid outbreaks [377]. As with autonomous weeding, drones offer significant advantages for pest control in complex working environments. Addressing the uneven distribution of pests in fields, Iost et al. [378] described the application of unmanned aerial vehicle (UAV) technology to pest control in farmland: a sensing UAV acquires and processes plant reflectance data to generate digital maps, while an operating UAV precisely releases natural enemies or sprays pesticides according to those maps. The research of Mao et al. likewise shows that using multiple UAVs to spray pesticides in cotton fields enables targeted pest control and prevents pests from spreading rapidly [379]. Although drones are widely used in weeding and pest control, challenges remain, including unclear optimal operating parameters, poor penetration of the crop canopy, low droplet coverage, and heterogeneous droplet distribution. To ensure control efficacy, future research needs to further explore the specific influence of parameters such as spraying height and spraying speed [380].
Table 16. Comparison of autonomous agricultural machines for weeding.
Weeding Type | Technical Principle | Performance | Ref.
Chemical spraying | Directional herbicide spraying to inhibit weed growth | ① Overall success rate of weed detection and classification exceeds 90%. ② Weed control operation costs reduced by approximately 90%. | [358]
Chemical spraying | AI-controlled nozzles for precise micro-dosing of herbicides | ① Spraying frequency: 20 times per second, with millimeter-level precision. ② Throughput: over 500,000 plants per hour. ③ Herbicide usage reduced by 95%. | [22]
Chemical spraying | GPS-RTK, camera, and sensor-based weed localization and identification with articulated arms for micro-dose herbicide spraying | Herbicide reduction: 95%. | [381]
Chemical spraying | High-precision large-scale herbicide spraying | - | [362]
Chemical spraying | Automatic calibration of target crop images and low-altitude fixed-point precise spraying | ① Control response time: ≤0.2 s. ② Spraying flow rate: 0.5–10 L/min. ③ Spraying radius accuracy: ≤0.5 m. ④ Operational efficiency improved by over 30%. | [363]
Mechanical weeding | Computer vision-based weed identification with steering-hoe weeding | Weed reduction within a 240 mm radius around crops: 62–87%. | [366]
Mechanical weeding | Soil tillage and sunlight blocking to suppress weeds | Compared to traditional farming: ① Rice plant height in weed-controlled areas: 920 mm vs. 760 mm. ② Single-plant rice grain weight: four-fold increase (46.5 g vs. 9.5 g). ③ Yield per unit area: 20,440 g/m2. | [367]
Mechanical weeding | Computer vision-based weed identification with rotating rack for cutting, uprooting, and burying weeds | Plow depth: 2 cm below ground. | [368]
Mechanical weeding | Static and dynamic hoes for weed chopping between and within rows | - | [382]
Thermal weeding | Computer vision-based weed identification with laser weeding | ① Mean average precision (mAP) for area and point target detection: 88.5% and 85.0%, respectively. ② Weed control rate: 92.6%. ③ Seedling damage rate: 1.2%. | [373]
Thermal weeding | Deep learning system for weed detection/identification with laser weeding | ① Annual weed control cost reduction: 80%. ② Weed kill rate: 99% for tested weed types. | [383]
Thermal weeding | Camera + GPS monitoring with laser precision weeding | - | [374]
Thermal weeding | Machine vision-based weed identification with laser pointer for targeted weed elimination | ① Average weeding position error: 1.97 mm (standard deviation: 0.88 mm). ② Hit rate: 97%. | [376]

4.1.4. Autonomous Driving for Crop Harvesting

Traditional crop harvesting requires a large amount of labor, especially during seasons when harvesting tasks are frequent. Autonomous agricultural machinery for crop harvesting is usually capable of navigating autonomously in the farmland environment, accurately identifying crops, distinguishing their maturity levels, and harvesting, storing, and transporting crops without causing damage [384]. Existing research mainly develops autonomous harvesting equipment by upgrading existing agricultural machinery such as tractors or combine harvesters [385]. These devices are typically composed of three parts: a machine vision system for crop perception, a platform for moving between different crops, and a robotic arm or end effector for harvesting [386]. Kurita et al. [387] applied image processing technology to combine harvesters, using machine vision to position the harvester nozzles at the crop positions and achieve automatic harvesting and loading of grain. Zhang et al. [388] introduced a multi-vehicle operation mode in which each autonomous tractor can independently load, unload, and transport the harvested crops and can also work collaboratively in large fields to reduce the risk of collision and improve operational efficiency. Geng et al. [389] developed an autonomous corn harvesting robot; at normal travel and harvesting speeds, the average crop cutting deviation was 0.063 m and the corn kernel loss rate was only 0.76%. Notably, with the continuous improvement of agricultural automation and increasingly complex operation scenarios, developing high-precision, high-efficiency autonomous robots for crop harvesting has become crucial [390]. Li et al. [391] developed an autonomous harvester based on deep learning algorithms that detects and avoids obstacles in the field in real time during movement, with a collision avoidance success rate of 96.6%. Pooranam et al. [392] invented a swarm harvester that achieves large-scale automated harvesting, threshing, and cleaning, significantly improving harvesting efficiency. Wang et al. [393] designed a trajectory planning algorithm for autonomous crop harvesting that effectively reduces the large overshoot and long convergence time caused by navigation errors and enhances driving stability.
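Several of the harvesting studies above hinge on keeping the machine on its planned path despite navigation errors. A common baseline for such trajectory tracking is pure pursuit with a speed-dependent look-ahead distance; the sketch below is a generic illustration of that idea under assumed wheelbase and gain values, not the controller used in any of the cited works.

```python
import math

def pure_pursuit_steering(pose, path, speed, wheelbase=2.5, k_lookahead=0.8, min_ld=1.0):
    """Pure-pursuit steering angle with a speed-proportional look-ahead distance.
    `pose` = (x, y, heading); `path` = list of (x, y) waypoints. Gains are assumed."""
    x, y, yaw = pose
    ld = max(k_lookahead * speed, min_ld)            # look-ahead grows with speed

    # Pick the first waypoint at least `ld` away from the vehicle.
    target = path[-1]
    for wx, wy in path:
        if math.hypot(wx - x, wy - y) >= ld:
            target = (wx, wy)
            break

    # Angle of the target in the vehicle frame, then the pure-pursuit law.
    alpha = math.atan2(target[1] - y, target[0] - x) - yaw
    return math.atan2(2.0 * wheelbase * math.sin(alpha), ld)

# Example: tractor at the origin heading along +x, tracking a gently curving row.
row = [(i * 1.0, 0.02 * i**2) for i in range(30)]
delta = pure_pursuit_steering(pose=(0.0, 0.0, 0.0), path=row, speed=3.0)
print(f"Steering command: {math.degrees(delta):.1f} deg")
```

Scaling the look-ahead with speed is one simple way to trade tracking tightness at low speed against stability at higher speed, which is the regime highlighted as challenging later in Section 5.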

4.1.5. Summary of Performance and Benefits

Autonomous agricultural machinery demonstrates significant improvements in efficiency, precision, and resource optimization across various field operations. These systems reduce labor dependency while enhancing task-specific outcomes such as planting accuracy, weed control, and crop management. For instance, robotic transplanting and seeding systems achieve high-speed operations with minimal deviation errors, while automated weeding and spraying technologies drastically reduce herbicide usage (up to 95% savings) and operational costs (80% reduction annually). Precision navigation systems enable millimeter-level accuracy in tasks like fertilization and harvesting, with real-time data integration ensuring adaptive performance in diverse terrains. Table 17 synthesizes key research projects and commercial products, highlighting their operational benefits and technological capabilities.

4.2. Autonomous Driving in Orchards

Compared with farmland, orchards have complex terrain and a semi-structured nature, which makes automated operation with large-scale machinery impractical [394,395]. Recently, the implementation of precision agriculture technology has been regarded as the best solution, as it can maximize yield while maintaining the sustainability of the orchard environment [396]. However, the fruit tree industry still relies heavily on labor, and implementing precision agriculture in orchards has further increased this demand. Therefore, developing autonomous orchard robots capable of autonomous navigation is crucial for improving orchard production efficiency [397]. Although greenhouses differ greatly from orchards as environments, the nature of their operations is similar in certain respects; research on autonomous agricultural machinery in orchards and greenhouses is therefore discussed together in this section. At present, common autonomous devices in orchards mainly include those for transplanting, monitoring, pruning, and picking, as shown in Figure 8.

4.2.1. Autonomous Driving for Transplanting

Autonomous transplanting machinery is a key production tool in modern agricultural planting. It is mainly used to place seedlings at target positions in the orchard, enabling rapid planting and reducing the labor intensity of farmers [398,399]. The typical transplanting process includes picking, punching, and planting. Past studies usually used transplanters to complete all of these steps, and extensive research has been conducted on the design of key mechanisms and the optimization of structural parameters to improve the degree of automation [400,401,402,403]. Accuracy and stability are important indicators of transplanting performance. In this regard, Jin et al. [404] proposed a control method for the transplanting manipulator based on a fuzzy PID control strategy, effectively solving the problem of low seedling picking accuracy caused by external disturbances. Han et al. [405] constructed a multi-task robotic transplanting work unit that automatically picks up and plants seedlings, achieving an overall transplanting success rate of 90%. Paradkar et al. [406] used an autonomous device with a mechanical arm for seedling transplantation; with the aid of an LDR (light-dependent resistor)-LED (light-emitting diode) sensing unit, the equipment achieved intelligent sensing of potted seedlings and precise control of machine movement. The transplanting trajectory of seedlings has a direct impact on yield. The traditional single-degree-of-freedom transplanter is limited by its structure and cannot plan and control the transplanting trajectory, which is insufficient for the actual needs of orchard planting [407]. In this regard, Liu et al. proposed a two-degree-of-freedom sweet potato transplanting robot whose robotic arm can follow various complex transplanting trajectories and adopt corresponding strategies for different terrain types, enabling multiple transplanting methods for sweet potatoes and improving the transplanting success rate and yield [408]. Yang et al. [409] developed a new spatial three-degree-of-freedom parallel transplanting robot that plans an appropriate transplanting trajectory and motion control according to the requirements of the transplanted seedlings; experimental results show that even under high-speed motion with an acceleration of 30 m/s2, the transplanting success rate still reaches 95.3%. When transplanting seedlings, it is equally crucial to consider their survival characteristics. However, most existing autonomous transplanters lack self-perception capabilities and cannot identify the survival characteristics of transplanted seedlings or select high-quality seedlings for planting. As a result, the quality of the planted seedlings varies greatly, and seedlings with poor survival characteristics are prone to dying, reducing yield [407]. In response, Yue et al. [410] considered the vulnerable nature of seedlings, adopted multi-sensor detection technology to monitor the transplanting process, and designed a flexible pneumatic end effector to buffer and optimize the seedling picking operation.
Even at a high-speed planting frequency of 90 plants per row per minute, the average success rate of seedling picking and placement still reaches 97.3%. By constructing a transplanting machine vision system and designing a selective intelligent seedling collection framework based on deep learning, Li et al. [411] reduced the missed-planting rate by 9.91% and increased the average robustness score of the seedlings by 18.92%.
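Jin et al. [404] report a fuzzy PID strategy for the transplanting manipulator. The fragment below is only a minimal gain-scheduling sketch in that spirit: instead of a full fuzzy rule base, it blends two assumed gain sets according to the magnitude of the position error, which captures the basic idea of adapting PID gains to the operating condition. All numeric values are illustrative assumptions.

```python
import numpy as np

class GainScheduledPID:
    """Minimal PID whose gains are blended from two rule sets
    ("small error" vs. "large error") -- a crude stand-in for a fuzzy PID."""

    def __init__(self, dt=0.01):
        self.dt = dt
        self.integral = 0.0
        self.prev_error = 0.0
        # Assumed gain tables: (Kp, Ki, Kd) for small and large errors.
        self.gains_small = (2.0, 0.5, 0.05)
        self.gains_large = (6.0, 0.1, 0.20)

    def _blend(self, error, e_max=0.05):
        # Membership of |error| in the "large" set, clipped to [0, 1].
        mu_large = min(abs(error) / e_max, 1.0)
        small = np.array(self.gains_small)
        large = np.array(self.gains_large)
        return (1.0 - mu_large) * small + mu_large * large

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        kp, ki, kd = self._blend(error)
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return kp * error + ki * self.integral + kd * derivative

# Example: drive a seedling-gripper axis from 0 mm toward a 20 mm target.
pid = GainScheduledPID()
position, command_log = 0.0, []
for _ in range(5):
    u = pid.update(setpoint=0.020, measurement=position)
    position += 0.002        # toy plant response (assumed)
    command_log.append(round(u, 3))
print(command_log)
```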

4.2.2. Autonomous Driving for Fruit Tree Pruning

Fruit tree pruning is an indispensable activity in orchard management; it involves removing some vegetative organs to control the size and shape of fruit trees and to adjust their structural composition [412]. Seasonal pruning not only helps regulate crop load and maintain the balance of fruit trees, but also reduces management and harvest costs. By retaining strong, fruit-bearing buds, pruning effectively improves the overall quality of orchard production [413,414]. As a labor-intensive task, pruning accounts for more than 20% of the pre-harvest costs of fruits such as cherries, apples, and pears, and this share grows as labor costs rise [415]. With the continuous expansion of orchard area and the increasing shortage of labor, pruning has become more time-consuming, further reducing the market competitiveness of farm products. It is therefore crucial to seek mechanized alternatives. Mounting pruners on tractors is regarded as a useful mechanization method, but it lacks selectivity during pruning and tends to reduce yields for certain types of fruit trees, so it is mostly applied in standardized orchards [416]. Furthermore, the differences among fruit trees and the unstructured orchard environment further increase the difficulty of standardized, mechanized pruning. Developing autonomous robots capable of autonomous evaluation and selective pruning is a promising approach [417,418]. Autonomous pruning robots must be capable of perceiving and collecting environmental information, formulating specialized pruning strategies, precisely moving the cutting tools to the cutting point while avoiding other obstacles, and controlling the motion process based on path planning results [419]. At present, the relevant research mainly focuses on perception and reconstruction [414]. For instance, Botterill et al. [418] built an automatic grapevine pruning system around a computer vision system: it determines the pruning position by identifying the plant structure and reconstructs the vine model by matching image features while jointly optimizing the robot trajectory. You et al. [419] developed a pruning robot comprising an industrial manipulator, an RGB-D camera configuration, and a custom pneumatic cutter, capable of accurately perceiving the 3D state of the branches to be pruned and of planning and executing a series of cuts. Achieving full automation of fruit tree pruning requires the ability to make pruning decisions independently [420]. Therefore, Williams et al. [413] proposed an autonomous grape pruning system that scans grapevines and generates 3D models to make high-quality pruning decisions. Westling et al. [420] proposed a data-driven framework for analyzing the impact of pruning decisions, which identifies the branches and leaves that impair light distribution and suggests pruning positions, thereby increasing light distribution through the tree canopy by 25.15%. This framework provides an important reference for the development of autonomous pruning equipment.

4.2.3. Autonomous Driving for Monitoring

The yield and fruit quality of fruit trees are affected by various environmental factors such as weather, humidity, and soil conditions, and these factors change constantly over the long growth period. Farmers need to monitor each of these elements and the health status of the trees regularly to ensure high-quality fruit production [421,422]. Traditional agricultural monitoring is a costly, labor-intensive, and time-consuming task: it involves multiple operational links such as pruning, irrigation, fertilization, and pesticide spraying, and farmers can only monitor through manual visual inspection, manual sampling, and recording of crop conditions [423,424]. In orchards covering vast areas, most farmers cannot precisely monitor the condition of every tree, so they usually extrapolate sparse observations to the entire orchard and adopt uniform management measures, which tends to lead to inaccurate estimates and a decline in product quality [425]. Precision agriculture provides a key solution to this problem, using new technologies such as remote sensing, computer vision, artificial intelligence, and the Internet of Things to automate data collection and evaluation. In recent years, unmanned aerial vehicles (UAVs) and unmanned ground vehicles have been widely used in orchard monitoring, most of them equipped with high-resolution sensors and high-performance computers [426]. UAVs are not restricted by site conditions, offer outstanding spatial, spectral, and temporal resolution at relatively low operating cost, and can quickly collect large-scale orchard data. Scholars therefore mostly use UAVs to monitor farmland dynamics, crop growth, pest and disease threats, the water and fertilizer environment of farmland, and the impact of natural disasters [427,428]. Beniaich et al. [429] evaluated the vegetation coverage and soil erosion of olive orchards through UAV aerial images, thereby promoting the sustainable development of the orchard environment. Abdulridha et al. [430] detected citrus canker in orchards using UAV-based hyperspectral imaging and classified infections into asymptomatic, early, and late stages, with classification accuracies of 94%, 96%, and 100%, respectively. Stefas et al. [431] proposed a UAV system with obstacle detection and avoidance functions that can navigate safely and autonomously collect yield data in orchards. Although drones can provide information about the orchard environment and the overall condition of crops, they fall short in terms of accuracy; they also suffer from low flight autonomy, poor load capacity, short battery life, and a tendency to crash, so they are not suitable for large orchards with dense vegetation [432,433]. In recent years, the use of unmanned ground vehicles and wheeled mobile robots in orchard monitoring has increased significantly. They can not only navigate orchards with high autonomy and safety, but also conduct multi-angle, precise sampling of crops by carrying various sensors and sampling tools [434,435].
Xaud et al. [436] developed an autonomous mobile robot system that navigates autonomously and performs mapping and sample collection in densely vegetated orchards using multiple onboard sensors. Shafiekhani et al. [437] introduced a low-cost architecture for monitoring robots consisting of two parts, autonomous ground vehicles and mobile observation towers, capable of conducting plant phenotyping with high spatial and/or temporal resolution. Milella et al. [438] enhanced the safety and operational efficiency of unmanned agricultural vehicles on unstructured terrain by equipping them with multiple sensors to enable accurate soil and topographic mapping. However, navigating orchards with complex terrain remains challenging for unmanned ground vehicles, and their high monitoring precision inevitably comes at the cost of efficiency. UAVs and unmanned ground vehicles are therefore complementary in orchard monitoring, and combining the two tools is a direction worth exploring where the implementation cost permits. The “Prosperity” project, funded by the European Community’s “Horizon 2020” initiative, is dedicated to developing a new operational model for air–ground collaborative monitoring: small autonomous drones perform aerial monitoring, while multi-purpose agricultural unmanned ground vehicles carry out targeted interventions on the ground [439]. Although this field has demonstrated great application potential, related research is still in its infancy, and future work can make further breakthroughs in this direction.
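Much of the UAV-based monitoring described above reduces, at the pixel level, to computing a vegetation index from multispectral bands and aggregating it over the orchard. The sketch below shows this for NDVI and a simple vegetation-cover fraction; the band values, threshold, and tile size are synthetic assumptions for illustration and would in practice come from the UAV's calibrated multispectral imagery.

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalized Difference Vegetation Index, computed per pixel."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

def vegetation_cover_fraction(nir, red, threshold=0.4):
    """Fraction of pixels whose NDVI exceeds a (crop-dependent, assumed) threshold."""
    return float(np.mean(ndvi(nir, red) > threshold))

# Example with a synthetic 100x100 multispectral tile.
rng = np.random.default_rng(0)
nir_band = rng.uniform(0.2, 0.9, (100, 100))   # stand-in for the NIR reflectance band
red_band = rng.uniform(0.05, 0.5, (100, 100))  # stand-in for the red reflectance band
print(f"Vegetation cover: {vegetation_cover_fraction(nir_band, red_band):.1%}")
```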

4.2.4. Autonomous Driving for Picking

Autonomous fruit picking machinery is typically equipped with autonomous navigation, image recognition, and environmental modeling capabilities. It detects fruits and determines their position and orientation using computer vision and positioning systems and manipulates robotic arms or end effectors through automated control systems to complete picking tasks. These autonomous picking robots are generally classified into two categories according to their picking mechanism: ground-based robotic arms with end effectors and aerial harvesting by unmanned aerial vehicles (UAVs). The former offers high flexibility and mobility, enabling dynamic adjustment of the field of view and effective collision avoidance; moreover, by employing different numbers and types of end effectors, these robots can implement diverse picking strategies that minimize fruit damage and enhance picking efficiency [440,441,442]. For instance, Ye et al. developed a six-degree-of-freedom lychee picking robot that rapidly estimates a collision-free picking posture and motion path using a binocular vision system installed at the end effector of the mechanical arm [266]. Williams et al. [443] introduced an autonomous kiwi fruit harvesting robot consisting of four sub-modules, each composed of a mechanical arm specially designed for kiwi fruit picking and a new type of end effector that picks fruit from the canopy effectively and non-destructively. The picking robot EVE, developed by Ripe Robotics in Australia, is equipped with an articulated mechanical arm and a straw-shaped end effector; it picks target fruits by negative-pressure suction and is particularly suitable for firm fruits such as apples and peaches [444]. The autonomous apple picker developed by FFRobotics coordinates 12 robotic arms, each equipped with a three-finger rotary end effector, and its productivity is 10 times that of ordinary manual picking [445]. More specifically, autonomous picking robots can be further classified by the number of robotic arms, the type of actuators, the mounting structure, and the picking method; the characteristics of the various types are shown in Table 18 and Table 19. However, this arm-based picking method is constrained by orchard terrain and the height of the fruit trees [266,446]. In contrast, drones can operate in complex terrain and reach fruit trees of different heights and picking ranges, so a small number of studies have also attempted to develop autonomous fruit picking devices based on UAVs. The orchard picking UAV FAR developed by Tevel Aerobotics is a typical example: it identifies fruit size and maturity with on-board cameras and vision algorithms and picks fruit with its grasping arm [447]. However, UAVs are prone to encountering obstacles when crossing fruit tree canopies, and their endurance and load capacity have always been key obstacles to efficient picking [448]. Recently, some studies have begun to explore these limitations of UAVs in more depth.
For instance, Li et al. considered the influence of branches around the fruit on UAV safety and proposed a method for UAV collision detection and flight path planning based on LiDAR data, achieving efficient picking of target fruits [449]. Although various types of autonomous picking robots have been developed and significant progress has been made, these devices are still at an early stage of development and have not yet achieved full intelligence and autonomy, making it difficult for them to meet the requirements of commercial application. To meet the diverse production demands of orchards, next-generation autonomous picking equipment should be able to move flexibly, perceive continuously, and make autonomous decisions in complex orchard environments, thereby replacing manual picking more effectively. This requires comprehensive consideration of the orchard environment, the distribution of fruit, and the specific requirements of the task. Essentially, the systems-engineering challenges stem from the difficulty of adapting the algorithms and mechanical design parameters of the basic functions to the characteristics of the picking task. The key algorithms currently used for autonomous picking, such as SLAM, path planning, stereo vision, and deep learning, remain vulnerable to interference from environmental factors such as canopy occlusion and uneven lighting, which reduces positioning and recognition accuracy. Furthermore, these algorithms usually consume large amounts of computing resources and have long running times, which limits harvesting efficiency. Efficient collaboration among functional modules is also crucial for the overall practicality of the equipment, yet current research mostly focuses on isolated functional modules while neglecting picking as an integrated, system-level task. Compared with the extensive discussion of basic functional algorithms, research on key parameters in mechanical design is relatively scarce. Future work should further focus on optimizing and improving the performance indicators of autonomous picking equipment [386].
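A core geometric step shared by the arm-based pickers above is converting a detected fruit's pixel coordinates and measured depth (from stereo matching or an RGB-D sensor) into a 3D position the arm can reach. The sketch below applies the standard pinhole-camera back-projection; the intrinsics, detection coordinates, and hand-eye transform mentioned in the comments are placeholder assumptions, not values from any cited system.

```python
import numpy as np

def pixel_to_camera_frame(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with measured depth into the camera frame
    using the pinhole model; intrinsics come from camera calibration."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Example with assumed intrinsics for a 1280x720 RGB-D camera.
fx = fy = 920.0
cx, cy = 640.0, 360.0
fruit_px = (830, 295)           # detected fruit centre in the image (illustrative)
fruit_depth = 0.85              # metres, from the depth map or stereo matching
p_cam = pixel_to_camera_frame(*fruit_px, fruit_depth, fx, fy, cx, cy)
print("Fruit position in camera frame [m]:", np.round(p_cam, 3))
# The arm controller would then transform p_cam into its base frame
# using a calibrated hand-eye transform before planning a grasp.
```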

4.2.5. Summary of Performance and Benefits

Autonomous agricultural machinery demonstrates exceptional capabilities in enhancing precision, efficiency, and sustainability within orchard and greenhouse environments. Leveraging advanced technologies such as computer vision, multi-sensor integration, and robotic manipulation, these systems optimize critical tasks including transplanting, pruning, monitoring, and harvesting. For instance, transplanting robots achieve success rates exceeding 97% under high-speed operations, while vision-based pruning systems improve light distribution by over 25% and reduce labor dependency. Harvesting robots exhibit remarkable efficiency gains, with collaborative platforms achieving picking cycle times of 1.8 s per fruit and productivity increases of up to 10-fold. Despite variations in environmental complexity (e.g., success rates drop in unpruned natural settings), these technologies consistently reduce operational costs, minimize crop damage, and enable continuous large-scale operations. Table 20 synthesizes key research breakthroughs and commercial solutions, highlighting their performance metrics and technological innovations.

4.3. Autonomous Driving in Livestock and Poultry Breeding

Livestock and poultry breeding plays a key role in meeting mankind's growing food demands and is an important part of agricultural production. Under the modern intensive agricultural model, large-scale, high-density breeding has replaced traditional free-range farming [462]. Farmers must spend more time and energy maintaining the normal and healthy operation of livestock and poultry houses, including animal feeding, environmental cleaning, monitoring, and livestock collection. Furthermore, farmers may unintentionally become vectors of pathogen transmission, leading to the spread of diseases [463]. Automated equipment helps relieve some of this operational pressure but still cannot take part in certain complex agricultural tasks. The rapid development of autonomous driving and robotics provides solutions for the intelligent execution of various livestock and poultry breeding tasks. Compared with crop production, large-scale livestock and poultry breeding is usually carried out in structured houses, which makes autonomous driving easier to achieve. Equipped with advanced sensing, navigation, recognition, and decision-making technologies, autonomous livestock and poultry breeding robots can move safely and operate autonomously within the houses, effectively improving production efficiency and the quality of environmental hygiene [101]. Common autonomous agricultural machinery for livestock and poultry breeding is shown in Figure 9.

4.3.1. Autonomous Driving for Animal Feeding

In large-scale livestock and poultry breeding, timely and accurate feed delivery is crucial for eliminating feed waste, reducing environmental pollution, and preventing the spread of diseases [464,465]. Manual operation is limited in terms of refined nutritional regulation and timed feeding. Autonomous feeding robots can not only solve these problems effectively but also reduce labor costs and improve production efficiency. In actual breeding scenarios, the free movement of livestock and poultry easily pushes feed out of the feeding area, causing feed waste or accumulation and spoilage. Autonomous feeding equipment has therefore emerged as a critical tool in animal husbandry, enabling timely redistribution of feed that accumulates outside designated feeding zones and ensuring its uniform distribution. Tian et al. [466] designed an autonomous feeding robot for cattle sheds that uses three-dimensional LiDAR for navigation; to improve the robot's ability to avoid moving cattle and other obstacles within the cowshed, the authors proposed a navigation path optimization and obstacle avoidance algorithm based on an improved artificial potential field method. Additionally, the Dutch company Lely integrated ultra-wideband ranging and Bluetooth communication into its Lely Juno autonomous feed pusher, significantly enhancing operational efficiency and safety [467]. In recent years, some enterprises have developed rail-guided autonomous feeding robots and achieved initial commercial application. Such equipment travels precisely along a preset track to the feeding point and automatically returns to the charging station; it has a simple structural design, low maintenance cost, and stable control. Typical products include the track feeding robot ROVER launched by the Canadian company Rovibec [468], the automatic feed pusher RANGER [469], and the suspended-track-guided feeding robot Triomatic HP 2300 produced by the Dutch company Trioliet [470]. These devices have demonstrated remarkable efficiency in feeding large livestock such as pigs, cattle, and sheep, but they require significant modification of livestock sheds to plan and install the running tracks; moreover, they offer limited refinement and flexibility in feed delivery. In contrast, ground-based autonomous vehicles, aided by multi-sensor devices, can make effective use of the space within livestock and poultry houses, plan driving routes flexibly, and meet the demand for precise feeding. Karn et al. [471] developed an autonomous feeding vehicle for cattle that travels along a pre-determined path and places feed at the feed fence, ensuring accurate, timely, and adequate feeding of each animal. The autonomous rabbit feeding robot developed by Jiang et al. operates at a maximum speed of 0.20 m/s to minimize stress responses in rabbits; it demonstrates horizontal and vertical positioning deviations of 5.3 mm and 7.6 mm, respectively, and maintains a feed quantification error of 4.3%, indicating relatively high feed utilization efficiency [464].
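Tian et al. [466] base their obstacle avoidance on an improved artificial potential field. The sketch below implements only the textbook attractive/repulsive form of that method, with assumed gains and influence radius, to show how a feeding robot's next step can be derived from the goal position and nearby obstacles; it is not the improved algorithm from the cited work.

```python
import numpy as np

def apf_step(position, goal, obstacles, k_att=1.0, k_rep=0.5, d0=1.5, step=0.1):
    """One gradient-descent step of a basic attractive/repulsive potential field.
    `obstacles` is a list of 2D points (e.g., cow positions from LiDAR clustering)."""
    position = np.asarray(position, dtype=float)
    goal = np.asarray(goal, dtype=float)

    force = k_att * (goal - position)                 # attraction toward the feeding point
    for obs in obstacles:
        diff = position - np.asarray(obs, dtype=float)
        d = np.linalg.norm(diff)
        if 1e-6 < d < d0:                             # repulsion only inside influence radius d0
            force += k_rep * (1.0 / d - 1.0 / d0) / d**2 * (diff / d)
    return position + step * force / (np.linalg.norm(force) + 1e-9)

# Example: robot at (0, 0), feeding point at (8, 0), one cow standing near the path.
pos = np.array([0.0, 0.0])
for _ in range(100):
    pos = apf_step(pos, goal=[8.0, 0.0], obstacles=[[4.0, 0.2]])
print("Position after 100 steps:", np.round(pos, 2))
```

The well-known weakness of the basic form, local minima near closely spaced obstacles, is exactly what improved variants such as the one in [466] aim to mitigate.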

4.3.2. Autonomous Driving for Monitoring

Among autonomous livestock and poultry breeding robots, monitoring robots are the area with the most accumulated research, covering both environmental conditions and animal health status. In the past, these tasks were mainly carried out by inspectors, who had to patrol the livestock sheds many times every day and observe and record the behaviors and living conditions of the animals [472,473]. This labor-intensive process not only consumes a great deal of human and material resources but also increases the risk of cross-infection. Furthermore, manual observation is affected by subjective factors such as individual experience and fatigue, making judgment deviations difficult to avoid and limiting the objectivity and accuracy of the monitoring results [474,475]. The introduction of high-precision sensor technology has made remote continuous monitoring practical. At present, there is a large body of research on deploying various sensors in livestock and poultry houses, but few studies on the comprehensive inspection of livestock and poultry conditions using autonomous robots [101,476,477,478]. In practice, the former often requires high-density sensor arrays, which significantly increases investment costs; moreover, the environmental information obtained in this way may deviate significantly from the real environment, which is not conducive to accurate control [96]. By carrying the sensors necessary for safe livestock and poultry production, autonomous robots can reduce the number of devices required to perceive different positions and achieve all-round, precise monitoring without blind spots; they are also believed to have the potential to promote animal movement, increase feed conversion rates, and improve the quality of livestock and poultry products [479]. Environmental monitoring robots collect parameters such as air quality, temperature, humidity, light intensity, and carbon dioxide levels, providing valuable data for administrators to make scientific decisions and helping to maintain and continuously optimize the breeding environment [480]. The autonomous ground vehicle developed by Qi et al. [62] is equipped with nine-DOF MEMS IMU sensors and can simultaneously measure temperature, relative humidity, and dust. Octopus Biosafety has designed XO, a multi-task autonomous robot [481] that can collect environmental data such as temperature, humidity, carbon dioxide, and light intensity, provide early warnings, and complete tasks such as cleaning and bedding processing. Based on LiDAR, remote sensing, sensor fusion, and communication technologies, the autonomous inspection robot developed by Quan et al. can detect and bypass obstacles on its path in real time and accurately collect images of cage-reared chickens together with environmental data [475]. Animal health monitoring robots are usually used to identify sick and dead animals, which helps prevent disease epidemics in time and monitor animal body condition [482].
The Poultry Patrol autonomous robot monitors animals with several types of integrated cameras and identifies sick and dead birds through remote monitoring and video recording, providing early warnings for breeders. However, this image-based recognition method struggles to distinguish dead chickens from healthy chickens that are lying down [483]. In this regard, the autonomous robot developed by Charoen Pokphand Group identifies sick and dead chickens with thermal imagers, which sense temperature directly; the temperature signatures of sick and dead chickens differ markedly from those of healthy chickens [484]. Considering the deficiencies of infrared thermal imaging in terms of noise, contrast, and spatial resolution, Sensyn Robotics, Inc., designed an autonomous patrol robot that fuses infrared thermal images with visible-light images; its detection rate and false positive rate for dead chickens were 93% and 0.3%, respectively [485]. However, sensor recognition accuracy may be affected by stocking density, animal activity, and occlusion by obstacles. The results of Ma et al. show that integrating deep learning for real-time detection and recognition together with auxiliary lighting systems in autonomous robots can effectively overcome the interference caused by high-density breeding environments and obstacles [486]. Integrating flexible mechanical arms and grippers into autonomous robots is another possible solution, as interacting with objects can assist and facilitate recognition [487,488,489]. However, robotic arms suited to the challenging livestock and poultry breeding environment are still lacking. In addition, future research needs to explore more advanced sensors and hardware as well as more capable monitoring algorithms to enhance accuracy and stability in complex environments while minimizing disturbance to the animals.
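The thermal-imaging robots above exploit the fact that a dead bird cools toward ambient temperature. The sketch below only flags unusually cold pixels relative to the frame's median as candidate regions; the temperature offset, minimum region size, and synthetic frame are assumptions, and a practical system would add connected-component analysis and fusion with visible-light detections, as in the cited work.

```python
import numpy as np

def flag_cold_regions(thermal_frame_c, delta_c=3.0, min_pixels=40):
    """Flag cold areas in a radiometric thermal frame (degrees C).
    A dead bird cools toward ambient, so pixels well below the frame's
    median temperature are treated as candidates. Thresholds are assumed."""
    median_temp = np.median(thermal_frame_c)
    cold_mask = thermal_frame_c < (median_temp - delta_c)
    # Very crude region test: require enough cold pixels in the frame.
    return cold_mask if cold_mask.sum() >= min_pixels else np.zeros_like(cold_mask)

# Example: synthetic 120x160 frame at ~40 degrees C with a cooler 10x10 patch.
frame = np.full((120, 160), 40.0) + np.random.default_rng(1).normal(0, 0.3, (120, 160))
frame[60:70, 80:90] = 32.0
mask = flag_cold_regions(frame)
print("Candidate cold pixels:", int(mask.sum()))
```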

4.3.3. Autonomous Driving for Environmental Cleaning

Under the intensive livestock and poultry breeding model, pathogenic infection has become one of the core threats to biosecurity. Regular cleaning of the breeding environment helps prevent air pollution and the proliferation of pathogens and is the most fundamental, comprehensive, and effective epidemic prevention measure in livestock and poultry breeding [490,491]. Cleaning tasks mainly include dry cleaning, wet cleaning, waste clearance, daily disinfection, and the handling of dead and diseased animals; the specific form of cleaning depends on the breeding mode and the housing environment. In bedded houses, animals excrete directly onto the bedding, causing it to become damp and lumpy, which encourages bacterial growth and the generation of toxic gases such as ammonia, methane, and nitrous oxide. The bedding therefore needs to be regularly wiped, disinfected, turned over, and replaced [96]. To reduce the growth of bacteria and microorganisms, some studies have proposed slotted or porous floors to separate animals from feces [492]. However, this breeding mode also requires regular manure removal and floor cleaning, because approximately 91% of the harmful gases in livestock and poultry houses originate from the contaminated floor [493,494]. These tasks may threaten the health of the keepers who perform them. Therefore, the development and application of autonomous disinfection robots are of vital importance for efficient and safe livestock and poultry production. Octopus XO is capable of autonomous positioning, navigation, and movement within the poultry house and can clean the bedding by scraping it and spraying disinfectant solution [481]. Birds Eye Robotics has designed a spiked wheel system that better breaks up clumped bedding and turns it over during movement to promote ventilation [495]. Laying new bedding over the old is another way to keep it dry and clean: INATECO's autonomous bedding robot precisely identifies damp bedding areas using infrared thermal imaging and multi-sensor technology and lays new bedding in those areas [496]. To address the insufficient self-cleaning effect of slotted floors, some studies have developed autonomous robots equipped with scrapers and cleaning equipment [497]. Doerfler et al. [493] used robot scrapers to clean the walkways, aiming to provide clean, dry walking and lying areas for dairy cows; within one year, the proportion of cows suffering from clinical mastitis decreased by 2.42%. Ebertz et al. [494] made minor technical modifications to a universal robot scraper and applied it for the first time in a pigsty; it removed most excrement, kept the floor clean and non-slip, and maintained its cleaning effect for six hours. In addition to waste disposal and local cleaning, the entire livestock and poultry house also needs regular overall cleaning and disinfection. For example, the autonomous vehicle developed by Feng et al. is equipped with a monitoring unit, a control unit, and a disinfectant spraying unit and can move along a predetermined path under "magnet-RFID" navigation.
Test results show that at a liquid flux of 1200 mL/min, the average droplet diameter and average deposition density at a maximum spray distance of 6.3 m are 231.09 µm and 186 droplets/cm2, respectively, which fully meets the disinfection requirements of high-flow, long-distance spraying [490]. The self-propelled, radio-controlled high-pressure cleaning robot developed by Rabaud is equipped with four cleaning devices and six rotating nozzles; with a cleaning height of up to 5 m, it can clean livestock and poultry houses thoroughly and without blind spots [498]. By integrating monitoring functions or collaborating with monitoring robots, autonomous robots can quickly remove and handle diseased and dead animals. Robots for this task usually have sweeping mechanical arms and conveyor belts and can temporarily store the animals they pick up. The picking robot developed by Birds Eye Robotics removes dead chickens with a rotating shovel structure. Liu et al. designed a dead chicken cleaning robot equipped with a cleaning turntable, conveyor belt, storage space, and tracked chassis; using cameras and the YOLO v4 algorithm, it identified dead chickens with an accuracy of 95.24% and successfully moved them to the internal storage area [499]. At present, most research in this field still focuses on floor-rearing environments, and robots for handling dead animals need to be developed for other breeding environments (such as cage rearing). Furthermore, designing handling robots around the characteristics of different animals can effectively improve the success rate and efficiency of removal [500,501]. In conclusion, autonomous cleaning robots provide an automated, intelligent solution for replacing humans in the daily cleaning of livestock and poultry environments. Combined with sensors and deep learning, they can also achieve refined operations and improve cleaning efficiency. However, existing autonomous cleaning robots are still poorly integrated with cutting-edge technologies, and accurately determining the areas that need cleaning and the amount of disinfectant to use remains a challenge.

4.3.4. Autonomous Driving for Livestock Production Collection

In the domain of livestock product collection, numerous studies have focused on the development of autonomous floor egg collection systems, resulting in the emergence of effective egg picking robots. Hens frequently lay eggs outside designated nesting areas, such as in dimly lit corners or secluded spaces within chicken coops. If not collected promptly, these floor eggs are susceptible to pecking by other hens or contamination by bacteria, leading to spoilage [502]. To address this challenge, Chang et al. [503] developed an intelligent mobile robot equipped with a computer vision platform capable of accurately identifying and locating eggs under various climatic conditions in outdoor settings. The robot can autonomously collect, sort, and store eggs without causing physical damage, making it well-suited for automatic egg collection in free-range farms. Similarly, Vroegindeweij et al. introduced the PoultryBot, which is capable of autonomous navigation over distances of up to 3000 m. It can avoid both stationary obstacles and moving hens within high-density poultry house environments to locate scattered floor eggs. However, experimental evaluations revealed that only 46% of the floor eggs were successfully collected, while 16% were completely missed—largely due to proximity mismatches between the eggs and the collection mechanism. These findings underscore the need to refine the navigation strategy to better align with the orientation of the collection device and enhance the control loop’s responsiveness to enable timely adjustments [504]. To improve identification accuracy and collection efficiency, Joffe et al. [100] implemented a deep learning algorithm based on the Faster R-CNN architecture to detect floor eggs. They used a suction-cup-type end effector to enable non-destructive egg collection and mounted a camera on the robotic arm integrated with a visual servo algorithm to dynamically adjust the picking posture. Experimental results showed that the system achieved a floor egg collection success rate of 91.6%. While these studies have demonstrated the promising potential of autonomous egg picking robots, most systems still lack optimization tailored to the specific environmental and biological characteristics of poultry houses and eggs. Limitations remain in terms of recognition algorithms, navigation strategies, and collection mechanisms, leading to decreased accuracy and reliability under non-standard conditions [504,505]. Beyond egg collection, automated agricultural machinery for other livestock product collection has made preliminary advances. In the area of automated shearing, Trevelyan et al. proposed a system integrating sensor-based control, trajectory adaptation, and online strategy planning to facilitate robotic wool trimming [506]. Regarding automated milking, Gaworski et al. developed a mobile tractor-based milking device. While the system has the potential to improve milking efficiency, it also incurs additional operational costs due to the complexity of device cleaning and maintenance requirements [507]. Nevertheless, due to the inherent complexity of livestock operations, technologies for automatic milking and shearing remain in the early stages of research and development. These technologies face challenges such as high costs, stringent technical requirements, and ongoing maintenance needs. 
Future research should focus on enhancing the design and deployment of autonomous agricultural machinery for livestock product collection, aiming to reduce labor costs and boost operational efficiency.

4.3.5. Summary of Performance and Benefits

Autonomous agricultural machinery significantly enhances efficiency, animal welfare, and resource management in livestock and poultry farming through precision operations and intelligent automation. These systems reduce labor dependency by automating repetitive tasks such as feeding, cleaning, and monitoring while improving operational accuracy. For example, robotic feeding vehicles cut labor time by 25%, and automated manure cleaning reduces clinical mastitis incidence by 2.42%. Precision navigation ensures minimal deviation (e.g., <7.6 mm in feeding robots), and computer vision technologies achieve high recognition rates for tasks like egg collection (94.7–97.6%) and dead animal detection (93% accuracy). Commercial solutions further integrate real-time monitoring and multi-sensor technologies to optimize environmental control and hygiene maintenance. Table 21 details key research and commercial innovations, highlighting their performance metrics and technological advantages.

4.4. UAV-UGV Cooperative Operations

In agricultural fields, orchards, and livestock breeding scenarios, significant progress has been achieved in the application of autonomous agricultural machinery. However, single autonomous machines are increasingly showing their limitations in meeting the large-scale, high-precision demands of industrialized agriculture. UAVs, with their aerial mobility and ability to cover large areas quickly, are well suited to tasks such as monitoring, surveying, and spraying; however, their limited endurance and payload capacity make it difficult for them to complete complex, precise ground operations, such as pesticide spraying in hilly and mountainous orchards. Conversely, UGVs, although possessing strong payload and endurance capabilities and being capable of high-precision ground operations (such as spraying, fertilizing, and weeding), are constrained by their limited field of view and terrain adaptability, which often results in lower efficiency for large-scale farmland monitoring, tree canopy operations, or navigation in complex terrain [508]. UAV-UGV collaboration addresses this issue through the complementary advantages of air and ground platforms. Specifically, UAVs equipped with multispectral sensors, RGB cameras, and LiDAR can rapidly acquire agricultural conditions (e.g., crop health and soil moisture) and generate a preliminary three-dimensional map of the farmland. Through wireless communication technologies (e.g., 5G, LoRa), UAVs and UGVs can achieve efficient data transmission, information sharing, and task synchronization. After receiving data from the UAVs, UGVs can carry out spraying, fertilizing, and harvesting operations in designated areas through automatic navigation and precise control. This integrated “air reconnaissance–ground execution” operating mode not only enhances the intelligence and precision of operations but also reduces resource waste and improves operational efficiency [218].
Currently, UAV-UGV collaborative systems have achieved preliminary results in multiple agricultural scenarios. For example, for agricultural environments such as vineyards and orchards characterized by narrow working spaces and complex navigation requirements, Edelmann et al. developed a UAV-UGV collaborative system with a “master–slave” control architecture. The UAV serves as the master unit: it uses its GNSS positioning and image processing capabilities to monitor the operation area globally and transmits environmental information and task instructions to multiple UGVs, directing them to perform precise ground operations. This structure not only reduces the UGVs' dependence on expensive navigation sensors but also improves the system's robustness when GPS signals are limited [509]. Similarly, Silva et al. developed a collaborative positioning method for intercropped fields, in which two drones equipped with ultra-wideband (UWB) ranging sensors precisely position UGVs under trees, and an adaptive drone repositioning strategy resolves signal ambiguity [510]. Potena et al. proposed the AgriColMap system, which further integrates the aerial mapping capability of UAVs with the ground intervention capability of UGVs: the UAV generates a preliminary three-dimensional map of the farmland, and map merging is resolved through multimodal environmental representation and optical flow estimation, providing high-precision operation area information to support the UGVs' variable-rate fertilization and precision weeding tasks [511]. In spraying applications, Zhang et al. developed a UAV-UGV collaborative sprayer system (UCBSS) that combines the high payload capacity of UGVs with the high maneuverability of UAVs, making the spray boom largely insensitive to terrain irregularities and enabling wide-span, stable orchard spraying. The study also designed an adaptive feedforward compensation PD feedback (AFCPF) control method for the collaborative motion control of the UAV-UGV pair; the results show that the average roll angle of the spray boom is only 0.014 rad and that the UAV tracking error is reduced by 60.6% compared with traditional PD control, demonstrating excellent stability and control accuracy [512]. Considering the influence of strong winds on UAV spraying, Farid et al. developed a path planning method based on reinforcement learning with proximal policy optimization, which compensates for wind drift by adjusting the UAV's position in real time according to wind conditions, while UGVs complete operations in narrow areas; this UAV-UGV collaborative approach significantly improved spraying quality and operational efficiency [513]. In agricultural monitoring, Quaglia et al. proposed a solar-powered UGV called “Agri.q”, equipped with a multi-degree-of-freedom positioning mechanism, a mechanical arm, and visual sensors, capable of performing precise operations on irregular terrain. Agri.q also carries a UAV landing platform, supporting on-site monitoring and operational tasks through UAV-UGV collaboration [514].
Although the UAV-UGV collaborative system demonstrates substantial potential in agriculture, its further development still faces several challenges. First, the heterogeneity of UAV and UGV platforms, particularly differences in their communication protocols, data formats, and control architectures, elevates the complexity of system integration and collaboration. Second, the collaborative scheduling and autonomous decision-making capabilities of large-scale UAV-UGV clusters remain immature. Specifically, real-time response and fault recovery mechanisms in dynamic environments require further enhancement. Moreover, current research is limited to small-scale experiments and has not yet formed standardized, scalable collaborative architectures and technical specifications. In the future, with the development of embodied AI technology, the UAV-UGV system is anticipated to achieve higher levels of autonomy, including environmental interaction, online learning, and self-optimization capabilities. Multi-UAV–multi-UGV cluster collaboration, cross-platform data fusion, and robust and fault-tolerant control will become important research directions in precision agriculture development. By promoting the development of UAV-UGV collaborative systems towards intelligence, scale, and practicality, the construction of smart agriculture and the realization of sustainable agricultural development will be further advanced [515,516].
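In the “air reconnaissance–ground execution” mode described above, one recurring step is turning the UAV-derived treatment map into an ordered set of UGV waypoints. The sketch below uses a simple boustrophedon (serpentine) ordering over a grid of flagged cells; the grid, cell size, and coordinate convention are assumptions for illustration, not part of any cited system.

```python
def plan_ugv_route(treatment_grid, cell_size_m=2.0):
    """Order the cells flagged by a UAV survey (True = needs treatment) into a
    boustrophedon (serpentine) visiting sequence for the UGV, and return the
    field coordinates of each flagged cell centre. Grid layout is assumed."""
    route = []
    for row_idx, row in enumerate(treatment_grid):
        cols = range(len(row)) if row_idx % 2 == 0 else range(len(row) - 1, -1, -1)
        for col_idx in cols:
            if row[col_idx]:
                x = (col_idx + 0.5) * cell_size_m
                y = (row_idx + 0.5) * cell_size_m
                route.append((x, y))
    return route

# Example: a 4x5 grid in which the UAV flagged a handful of cells for spot treatment.
grid = [
    [False, True,  False, False, True ],
    [False, False, True,  False, False],
    [True,  False, False, False, False],
    [False, False, False, True,  False],
]
for waypoint in plan_ugv_route(grid):
    print(waypoint)
```

The serpentine ordering keeps consecutive waypoints close together, which reduces unproductive travel compared with visiting flagged cells in arbitrary order.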

5. Existing Challenges and Future Prospects

Although agricultural machinery autonomous driving technology has made significant progress, it continues to confront numerous challenges during the transition from academic prototypes to industrial deployment. These challenges involve not only technological maturity but also economic feasibility, social acceptance, and sustainability [305].

5.1. Existing Challenges

5.1.1. Developmental Challenges in Key Areas

(1) Positioning technology. The core challenge in agricultural machinery positioning technology is the trade-off between cost and robustness. Although high-precision devices, such as RTK-GNSS and LiDAR, can provide centimeter-level positioning, their high costs have hindered large-scale application. Concurrently, a single sensor exhibits limitations in complex agricultural environments, where, for instance, GNSS signals are readily blocked and visual sensors are significantly affected by lighting. The future challenge involves the development of more cost-effective positioning solutions and the design of highly robust multi-sensor fusion algorithms. Such algorithms must effectively manage data anomalies and noise in extreme environments, thereby ensuring stable and precise navigation of agricultural machinery in various scenarios. A minimal GNSS/IMU fusion sketch illustrating this direction is provided at the end of this subsection.
(2) Perception technology. The challenge of perception technology resides in the complexity of data processing and insufficient environmental generalization ability. Hyperspectral and multispectral technologies generate massive volumes of data, which intensifies the computational burden. This necessitates the development of efficient, lightweight algorithms capable of handling redundant information so that they can run on low-power agricultural machinery platforms. Additionally, existing perception models are usually trained for specific crops and specific scenarios; consequently, they exhibit weak generalization ability and cannot meet the recognition requirements of different seasons, weather conditions, and crops. Future research should focus on building general perception models that work across scenarios and seasons while optimizing data fusion algorithms to achieve real-time, accurate environmental perception. A band-reduction sketch illustrating lightweight hyperspectral preprocessing is provided at the end of this subsection.
(3) Motion planning and control technology. The core challenge of agricultural machinery motion planning and control pertains to accuracy and multi-machine collaboration in high-speed, high-curvature scenarios [517]. Existing algorithms are mostly tested in low-speed environments and struggle to maintain precise trajectory tracking and stability in high-speed, high-curvature complex conditions. Concurrently, due to the lack of a unified communication protocol in the industry, different brands and types of agricultural machinery equipment encounter difficulties in achieving seamless collaborative operations. Future research should focus on developing robust control algorithms that can handle high-speed dynamic environments and establishing universal data and control protocols to achieve efficient collaborative operations of multi-robot systems, thereby enhancing overall operational efficiency.
(4) Actuator technology. The challenge of actuator technology resides in its reliability in harsh environments and the fault-tolerant handling of multiple failures. In agricultural environments characterized by dust, moisture, and vibration, actuators are prone to failure. Furthermore, traditional fault-tolerant control technologies can only handle single actuator failures and cannot cope with simultaneous or cascading failures of multiple actuators. Future research should focus on breaking through fault-tolerant control algorithms for simultaneous actuator failures and exploring the combination of event-triggered control and finite-time control with fault-tolerant technology to ensure the agricultural machinery system’s ability to operate quickly and stably even in the event of failures, thereby maximizing operational continuity and safety.
(5) Application scenarios. The application of autonomous driving technology in agricultural machinery continues to face multi-dimensional challenges, with the core issue being the conflict between scenario specificity and technological adaptability. In field environments, complex terrain tests the dynamic performance and navigation systems of equipment, while variations in cropping practices limit the generalizability of operational modules. Furthermore, multi-machine coordination and obstacle avoidance lack sufficient accuracy, and variable lighting conditions and crop occlusion reduce sensor precision during operation. In orchard and greenhouse settings, semi-structured environments and crop diversity pose significant obstacles: fruit-picking robots suffer from positioning inaccuracies and insufficient flexibility of robotic arms; navigation systems, weakened by poor GNSS signals, rely on SLAM technology yet struggle to balance real-time performance with accuracy. The high humidity and carbon dioxide levels in greenhouses accelerate equipment aging, and sensor degradation leads to reduced reliability and poor interoperability. Within livestock farming environments, random animal movement disrupts the path planning of feeding robots, high-density breeding increases the risk of misidentifying sick animals, and egg collection in corner areas proves inefficient. Additionally, dust and ammonia gases in barns shorten the operational lifespan of equipment. Beyond these, UAV-UGV cooperation is hampered by poor communication in remote fields and incompatibility between device protocols. The lack of emergency fault management protocols further restricts large-scale implementation [461,518].
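Relating to item (1) above, the sketch below shows one of the most common low-cost fusion patterns, a loosely coupled GNSS/IMU Kalman filter reduced to one dimension, in which inertial propagation carries the estimate through a simulated GNSS outage. The filter structure is generic; the noise levels, trajectory, and outage window are assumptions chosen purely for illustration.

```python
import numpy as np

# Minimal 1-D GNSS/IMU fusion sketch (constant-velocity Kalman filter).
# It illustrates how a low-cost fusion filter can smooth noisy GNSS fixes and
# bridge short outages using inertial propagation; all noise levels and the
# trajectory are illustrative assumptions.

dt = 0.1
F = np.array([[1, dt], [0, 1]])          # state transition (position, velocity)
B = np.array([[0.5 * dt**2], [dt]])      # control input: IMU acceleration
H = np.array([[1, 0]])                   # GNSS measures position only
Q = np.diag([1e-4, 1e-3])                # process noise (tuning assumption)
R = np.array([[0.25]])                   # GNSS position variance (0.5 m std)

rng = np.random.default_rng(0)
x_true = np.array([0.0, 1.0])            # true position and velocity
x_est = np.array([0.0, 0.0])
P = np.eye(2)

for k in range(200):
    a = 0.2 * np.sin(0.05 * k)           # true acceleration
    x_true = F @ x_true + B.flatten() * a
    a_meas = a + rng.normal(0, 0.05)     # noisy IMU acceleration

    # Predict with the IMU input
    x_est = F @ x_est + B.flatten() * a_meas
    P = F @ P @ F.T + Q

    # Update with GNSS only when available (simulated outage for 80 <= k < 120)
    if not (80 <= k < 120):
        z = x_true[0] + rng.normal(0, 0.5)
        y = z - H @ x_est
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x_est = x_est + (K @ y).flatten()
        P = (np.eye(2) - K @ H) @ P

print("final position error [m]:", round(abs(x_true[0] - x_est[0]), 3))
```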
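Relating to item (2) above, one frequently used way to lighten onboard hyperspectral processing is to project the hundreds of raw bands onto a few principal components before any downstream analysis. The sketch below only demonstrates the resulting data-volume reduction on a synthetic cube; the cube contents and the number of retained components are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of lightweight hyperspectral preprocessing: reducing a
# 200-band cube to a few principal components per pixel before any downstream
# classification. The synthetic cube and component count are assumptions
# chosen only to show the data-volume reduction.

rng = np.random.default_rng(1)
h, w, bands = 64, 64, 200
cube = rng.normal(size=(h, w, bands)).astype(np.float32)   # stand-in for a real acquisition

pixels = cube.reshape(-1, bands)                           # (4096, 200) pixel spectra
pixels -= pixels.mean(axis=0)                              # center each band

# PCA via SVD; keep the top-k components
k = 8
_, _, vt = np.linalg.svd(pixels, full_matrices=False)
reduced = pixels @ vt[:k].T                                # (4096, 8)

print("raw size    :", pixels.nbytes // 1024, "KiB")
print("reduced size:", reduced.astype(np.float32).nbytes // 1024, "KiB")
```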

5.1.2. Technical Transfer Barriers

During the transition of agricultural autonomous driving technology from laboratory research to large-scale commercial application, several key technical transfer barriers are encountered, as outlined below. These barriers directly affect the popularization and industrialization of the technology.
(1) Cost gap and commercialization divide. A significant cost gap exists between academic prototypes and commercial products. High-precision RTK-GNSS, LiDAR, and high-performance computing units, although providing excellent performance in research, incur high costs, which impede their large-scale application in commercial agricultural machinery products. Moreover, system integration and maintenance costs are also high, further widening this gap.
(2) Energy consumption challenge. Energy consumption is another major challenge in the field of agricultural machinery autonomous driving, especially in long-duration operational scenarios. Complex perception and decision-making algorithms require powerful computing capabilities, which can lead to high energy consumption and directly affect the range of electric agricultural machinery. In the vast farmland environment, convenient energy supply infrastructure is often lacking. Concurrently, existing research reveals gaps in energy-saving technologies, including low-power hardware design, efficient algorithm optimization, and the collaborative optimization of path planning and energy management.
(3) Standard deficiency and interconnection bottleneck. The industry lacks unified technical standards, which results in products from different manufacturers being incompatible in terms of sensor accuracy, data protocols, and control performance. This not only creates “data islands” but also hinders interconnection between agricultural machinery, implements, and intelligent control platforms. In academic research, the lack of unified testing standards and evaluation systems makes different research results poorly comparable, thereby impeding technology transfer and industrial collaboration.

5.1.3. Regulatory, Ethical, and Safety Standard Deficiency

In addition to technical and economic challenges, the popularization of agricultural machinery autonomous driving must also address regulatory, ethical, and social safety issues.
(1) Operational safety and risk management. Farmland is a complex and unstructured environment, characterized by steep slopes, ditches, and the unpredictable movement of pedestrians or livestock. If autonomous agricultural machinery makes a faulty decision or execution error, serious collision accidents may result, posing a threat to personal and property safety. Currently, the formulation of rigorous operational safety regulations, the construction of multiple redundant systems, and the establishment of emergency handling mechanisms remain urgent problems to be solved.
(2) Insufficient standard coverage and incomplete regulations. Although some international organizations (such as ISO and ASABE) have issued agricultural machinery safety standards, their adoption rates worldwide remain low. Especially in developing countries, dedicated regulations for emerging technologies such as drones are still absent or lag behind technological development. This regulatory vacuum creates uncertainty for the promotion of the technology and may also give rise to safety and legal risks [519,520].
(3) Ethics and data security. The large-scale application of agricultural machinery autonomous driving will trigger social ethical issues, such as the impact of automation on traditional agricultural labor employment. In addition, the autonomous system will collect a large amount of agricultural data (such as crop growth status and soil data), and the privacy and security of this data are of crucial importance. Ensuring that data is not abused or leaked necessitates the establishment of a complete legal framework and technical protection measures [521].

5.2. Future Prospects

In the future, the development of agricultural machinery autonomous driving technology will focus on addressing the above challenges, thereby facilitating the transition of technology from the laboratory to the field and achieving large-scale, sustainable commercial applications.

5.2.1. Core Research Gaps and Future Directions

(1) Positioning technology. The core research gap in positioning technology pertains to the balance between high precision and low cost, as well as the robustness of integrated positioning in complex environments. Future research directions should focus on developing integrated solutions that combine perception and positioning, aiming to effectively utilize low-cost sensor data and achieve centimeter-level positioning in unstructured environments with GNSS signal limitations or absence. This is intended to fundamentally address the current bottlenecks in the commercialization of technology and to provide a foundation for autonomous operations of agricultural machinery in all weather conditions and scenarios.
(2) Perception technology. The core research gap in perception technology resides in environmental generalization capability and the lightweight processing of multimodal data. Future research directions should focus on building a universal perception model with strong generalization capabilities; enabling it to adapt to complex changes in lighting, weather, and crop growth stages; and reducing reliance on scene-specific data. Concurrently, to cope with the large-scale data generated by hyperspectral sensors, it is imperative to develop efficient, real-time multimodal data fusion algorithms that reduce computational complexity and are applicable to power-constrained agricultural embedded systems [522,523].
(3) Motion planning and control technology. The core research gap in motion planning and control pertains to precise control in high-speed, high-curvature, and dynamic scenarios and to collaboration among heterogeneous agricultural machinery systems. Future research directions should focus on developing advanced control algorithms with high robustness and real-time performance to address uncertainties in complex agricultural environments. Concurrently, there is an urgent need for unified communication and control protocols to break the technical barriers between different manufacturers’ equipment and achieve seamless collaboration among multi-robot systems, thereby building an efficient and collaborative agricultural automation ecosystem. A generic path-tracking sketch illustrating the trade-off between look-ahead distance and tracking accuracy on curved paths is provided at the end of this subsection.
(4) Actuator technology. The core research gap in actuator technology resides in fault tolerance under multiple simultaneous actuator failures, together with self-diagnosis and predictive maintenance in harsh environments. Future research directions should focus on designing fault-tolerant control strategies that can handle simultaneous failures of multiple actuators to ensure the safety and stability of the system under extreme fault conditions. In addition, introducing event-triggered control and finite-time control techniques into the fault-tolerant control field will help to achieve more efficient resource utilization and rapid fault response. Finally, by integrating advanced sensors and algorithms, predictive maintenance of actuators can significantly improve the long-term reliability of agricultural machinery equipment. In terms of manipulation actuators, the use of flexible end effectors (such as pneumatic grippers) and the development of contact force analysis algorithms will reduce damage during fruit picking [524,525]. A control-allocation sketch illustrating command redistribution after an actuator fault is provided at the end of this subsection.
(5) Application scenarios. Primary research gaps are concentrated in four key dimensions: insufficient cross-scenario adaptability, wherein existing technologies fail to simultaneously satisfy the divergent requirements of farmland, orchard, and livestock shed environments; a lack of unified multi-vehicle scheduling mechanisms, with unresolved challenges in UAV-UGV communication coverage and protocol compatibility; inadequate equipment reliability under harsh environmental conditions (e.g., high humidity, dust, corrosive gases), coupled with insufficient endurance of sensors and mechanical components; and the absence of a proactive fault emergency scheduling system, which hinders effective response to unexpected operational failures. Future research should prioritize targeted advancements in the following areas: the development of cross-scenario navigation technologies based on multi-sensor fusion, combining GNSS and SLAM to optimize positioning accuracy; the design of lightweight SLAM algorithms and flexible actuators suitable for semi-structured orchard and greenhouse environments to enhance the operational performance of picking robots; the development of corrosion-resistant, humidity-tolerant materials and high-stability sensors to improve the service life and reliability of equipment deployed in livestock sheds; and the establishment of unified communication protocols and dynamic emergency scheduling mechanisms to address UAV-UGV coordination challenges and facilitate large-scale application [526].
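As a concrete illustration of the accuracy issues raised in item (3), the sketch below runs a generic pure-pursuit tracker, a geometric guidance law widely used for agricultural steering rather than a method from any specific cited work, on a high-curvature circular path with a kinematic bicycle model. It exposes the familiar trade-off that a longer look-ahead distance cuts the corner and enlarges the steady tracking error; all parameters are illustrative assumptions.

```python
import numpy as np

# Minimal pure-pursuit tracker following a circular (high-curvature) reference
# path with a kinematic bicycle model. It illustrates a classic limitation of
# geometric trackers on curved paths: a longer look-ahead cuts the corner and
# increases steady-state error. All parameters are illustrative assumptions.

def steady_tracking_error(lookahead, wheelbase=2.5, radius=10.0,
                          speed=1.5, dt=0.02, t_end=60.0):
    x, y, yaw = 0.0, -radius, 0.0                     # start on the circle, heading tangentially
    errs = []
    for _ in range(int(t_end / dt)):
        # Look-ahead point: advance along the reference circle by 'lookahead' metres
        theta = np.arctan2(y, x) + lookahead / radius
        tx, ty = radius * np.cos(theta), radius * np.sin(theta)
        alpha = np.arctan2(ty - y, tx - x) - yaw      # angle to the target in the body frame
        delta = np.arctan2(2.0 * wheelbase * np.sin(alpha), lookahead)  # pure-pursuit law
        x += speed * np.cos(yaw) * dt
        y += speed * np.sin(yaw) * dt
        yaw += speed * np.tan(delta) / wheelbase * dt
        errs.append(abs(np.hypot(x, y) - radius))     # radial deviation from the path
    return max(errs[len(errs) // 2:])                 # ignore the initial transient

for ld in (1.0, 3.0, 5.0):
    print(f"look-ahead {ld:.0f} m -> steady tracking error ~ {steady_tracking_error(ld):.2f} m")
```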
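For the actuator fault tolerance discussed in item (4), a common building block is control allocation over redundant actuators, in which the desired generalized force and moment are redistributed once a loss of effectiveness is detected. The sketch below applies a pseudo-inverse allocation to a hypothetical two-output, four-actuator configuration; it is a generic illustration under assumed fault information, not a published agricultural fault-tolerant controller.

```python
import numpy as np

# Minimal control-allocation sketch: a desired generalized [force, moment] is
# distributed over redundant actuators via the pseudo-inverse, and the
# allocation is recomputed when an actuator fault (loss of effectiveness) is
# detected. The 2-output / 4-actuator configuration is an illustrative
# assumption, not a specific agricultural machine.

B = np.array([[1.0,  1.0, 1.0,  1.0],     # force contribution of each actuator
              [0.5, -0.5, 1.0, -1.0]])    # moment arm of each actuator

def allocate(v_des, effectiveness):
    """Distribute the desired [force, moment] over the actuators, given a
    per-actuator effectiveness in [0, 1] (1 = healthy, 0 = failed)."""
    B_eff = B * effectiveness              # columns of degraded actuators shrink
    u = np.linalg.pinv(B_eff) @ v_des      # least-squares (minimum-norm) allocation
    return u, B_eff @ u

v_des = np.array([2.0, 0.3])

u, achieved = allocate(v_des, np.ones(4))
print("healthy  u =", np.round(u, 3), " achieved =", np.round(achieved, 3))

# Actuator 3 loses 80 % of its effectiveness; the allocation is redistributed.
u, achieved = allocate(v_des, np.array([1.0, 1.0, 0.2, 1.0]))
print("degraded u =", np.round(u, 3), " achieved =", np.round(achieved, 3))
```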

5.2.2. Cost and Practicality Optimization

In the future, the promotion of low-cost, high-efficiency solutions will continue. This includes developing perception algorithms based on computer vision to reduce reliance on expensive LiDAR; optimizing multi-sensor fusion algorithms to achieve high precision on low-cost hardware [527]; and developing more universal and easy-to-operate modular platforms to lower the usage and maintenance barriers for farmers. Concurrently, energy structure optimization will be a key research direction, including the integration of clean energy, the development of energy management systems, and the design of hybrid power systems. Path planning will also be used to reduce unnecessary movement and energy consumption [528,529,530].
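As a small example of how path planning can reduce non-working travel, and hence energy use, the sketch below compares the as-listed row order with a greedy nearest-neighbour sequencing of row entry points on a synthetic field. The geometry and the greedy heuristic are illustrative assumptions rather than a recommended planner.

```python
import numpy as np

# Tiny illustration of how row sequencing affects non-working travel: a greedy
# nearest-neighbour ordering of row entry points is compared with the order in
# which rows happen to be listed. The field geometry is a synthetic assumption.

rng = np.random.default_rng(3)
n_rows = 12
xs = rng.uniform(0, 60, n_rows)                     # unsorted row positions [m]
rows = [((x, 0.0), (x, 100.0)) for x in xs]         # each row is driven end to end

def dist(p, q):
    return float(np.hypot(p[0] - q[0], p[1] - q[1]))

def travel_length(order, start=(0.0, 0.0)):
    """Working passes plus headland transfers for a given row order."""
    total, pos = 0.0, start
    for i in order:
        a, b = rows[i]
        if dist(pos, b) < dist(pos, a):             # enter at the nearer end
            a, b = b, a
        total += dist(pos, a) + dist(a, b)
        pos = b
    return total

# Greedy nearest-neighbour sequencing over the remaining row entry points
order, pos, remaining = [], (0.0, 0.0), set(range(n_rows))
while remaining:
    i = min(remaining, key=lambda j: min(dist(pos, rows[j][0]), dist(pos, rows[j][1])))
    remaining.remove(i)
    order.append(i)
    a, b = rows[i]
    pos = b if dist(pos, a) <= dist(pos, b) else a  # exit at the far end of the row

print("as-listed order travel        :", round(travel_length(range(n_rows)), 1), "m")
print("nearest-neighbour order travel:", round(travel_length(order), 1), "m")
```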

5.2.3. Standards and Ecosystem Construction

To accelerate technology transfer and industrial collaboration, standardization is crucial. In the future, efforts should be made to formulate unified hardware interfaces, communication protocols, and data format standards and to build an open software and hardware ecosystem. Concurrently, it is necessary to establish test standards and evaluation systems covering diverse agricultural scenarios to provide reliable benchmarks for technological research and commercialization. In addition, active cooperation with government and international organizations is required to jointly promote the improvement of relevant laws and regulations, thereby ensuring the safe and compliant application of the technology.

6. Conclusions

Autonomous driving technology for agricultural machinery has made significant progress over the past few decades, evolving from early mechanical guidance devices to current intelligent systems that integrate artificial intelligence, multi-source sensing, and precise control. This technology has become a crucial force in addressing the increasing demand for food, environmental deterioration, and agricultural labor shortages. The core architecture of agricultural machinery autonomous driving technology is systematically introduced in this study. The classification and working principles of key technologies, such as positioning, perception, motion planning and control, and actuators, are deeply analyzed. Furthermore, the application status of autonomous agricultural machinery in fields, orchards, and livestock breeding is extensively explored. The results indicate that existing autonomous agricultural machinery has initially acquired autonomous navigation and automated operation functions. However, a series of challenges persist. Most systems remain limited to single tasks or specific operating environments, lacking unified testing and evaluation standards across scenarios. Moreover, room for improvement exists in positioning accuracy, perception generalization, multi-machine collaboration, actuator fault tolerance, and energy consumption control. Additionally, multiple constraints such as cost, standardization, safety supervision, and data ethics are present. Looking to the future, to promote this technology from laboratory prototypes to large-scale commercial applications, efforts should focus on building a multi-scenario testing environment and performance evaluation system. The development of low-cost, high-robustness perception and positioning fusion algorithms should be promoted; high-speed dynamic control and multi-machine collaboration capabilities strengthened; key technologies in actuator fault tolerance and predictive maintenance advanced; and energy optimization and standardized interconnection solutions explored. Through multi-technology integration, high-performance AI algorithm empowerment, and system-level innovation, agricultural machinery autonomous driving will continue to evolve towards more efficient, precise, reliable, and intelligent directions, providing the core impetus for the implementation of precision agriculture and global agricultural sustainable development.

Author Contributions

Conceptualization, Q.L.; methodology, R.Y.; software, H.S.; validation, Y.C., L.C. and H.J.; formal analysis, R.Y.; investigation, H.S.; resources, Q.L.; writing—original draft preparation, R.Y.; writing—review and editing, Q.L., R.Y. and H.S.; supervision, Y.C., L.C. and H.J.; project administration, Q.L.; funding acquisition, Q.L., Y.C., L.C. and H.J. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (52372413, 52225212); the National Key R&D Program of China (2023YFB2504400); and the Overseas training plan for outstanding young and middle-aged teachers and principals in colleges and universities in Jiangsu Province and the Young Talent Cultivation Project of Jiangsu University.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

  1. Ghobadpour, A.; Monsalve, G.; Cardenas, A.; Mousazadeh, H. Off-road electric vehicles and autonomous robots in agricultural sector: Trends, challenges, and opportunities. Vehicles 2022, 4, 843–864. [Google Scholar] [CrossRef]
  2. Reid, J.; Moorehead, S.; Foessel, A.; Sanchez, J. Autonomous driving in agriculture leading to autonomous worksite solutions. In The Best of COMVEC 2016 Select Technical Papers from the SAE Commercial Vehicle Engineering Congress; SAE International: Warrendale, PA, USA, 2016; p. 1. [Google Scholar] [CrossRef]
  3. Yaghoubi, S.; Akbarzadeh, N.A.; Bazargani, S.S.; Bazargani, S.S.; Bamizan, M.; Asl, M.I. Autonomous robots for agricultural tasks and farm assignment and future trends in agro robots. Int. J. Mech. Mechatron. Eng. 2013, 13, 1–6. [Google Scholar]
  4. Thomasson, J.A.; Baillie, C.P.; Antille, D.L.; Lobsey, C.R.; McCarthy, C.L. Autonomous technologies in agricultural equipment: A review of the state of the art. In Proceedings of the 2019 Agricultural Equipment Technology Conference, Louisville, KY, USA, 11–13 February 2019; ASABE Technical Library: St. Joseph, MI, USA, 2019; Volume 40, pp. 1–17. [Google Scholar] [CrossRef]
  5. Reid, J.F. Establishing automated vehicle navigation as a reality for production agriculture. IFAC Proc. Vol. 2000, 33, 31–38. [Google Scholar] [CrossRef]
  6. Klerkx, L.; Jakku, E.; Labarthe, P. A review of social science on digital agriculture, smart farming and agriculture 4.0: New contributions and a future research agenda. NJAS-Wagening. J. Life Sci. 2019, 90, 100315. [Google Scholar] [CrossRef]
  7. Grosse, E.H.; Sgarbossa, F.; Berlin, C.; Neumann, W.P. Human-centric production and logistics system design and management: Transitioning from Industry 4.0 to Industry 5.0, 2023. Int. J. Prod. Res. 2023, 61, 7749–7759. [Google Scholar] [CrossRef]
  8. Iqbal, B.; Alabbosh, K.F.; Jalal, A.; Suboktagin, S.; Elboughdiri, N. Sustainable food systems transformation in the face of climate change: Strategies, challenges, and policy implications. Food Sci. Biotechnol. 2025, 34, 871–883. [Google Scholar] [CrossRef] [PubMed]
  9. Darko, R.O.; Yuan, S.Q.; Hong, L.; Liu, J.P.; Yan, H.F. Irrigation, a productive tool for food security—A review. Acta Agric. Scand. Sect. B-Soil Plant Sci. 2016, 66, 191–206. [Google Scholar] [CrossRef]
  10. Duckett, T.; Pearson, S.; Blackmore, S.; Grieve, B.; Chen, W.H.; Cielniak, G.; Cleaversmith, J.; Dai, J.; Davis, S.; Fox, C. Agricultural robotics: The future of robotic agriculture. arXiv 2018, arXiv:1806.06762. [Google Scholar] [CrossRef]
  11. Oliveira, L.F.; Moreira, A.P.; Silva, M.F. Advances in agriculture robotics: A state-of-the-art review and challenges ahead. Robotics 2021, 10, 52. [Google Scholar] [CrossRef]
  12. Jin, Y.; Liu, J.; Xu, Z.; Yuan, S.; Li, P.; Wang, J. Development status and trend of agricultural robot technology. Int. J. Agric. Biol. Eng. 2021, 14, 1–19. [Google Scholar] [CrossRef]
  13. Rosell-Polo, J.R.; Cheein, F.A.; Gregorio, E.; Andújar, D.; Puigdomènech, L.; Masip, J.; Escolà, A. Advances in structured light sensors applications in precision agriculture and livestock farming. Adv. Agron. 2015, 133, 71–112. [Google Scholar] [CrossRef]
  14. Saiz-Rubio, V.; Rovira-Más, F. From smart farming towards agriculture 5.0: A review on crop data management. Agronomy 2020, 10, 207. [Google Scholar] [CrossRef]
  15. Zambon, I.; Cecchini, M.; Egidi, G.; Saporito, M.G.; Colantoni, A. Revolution 4.0: Industry vs. agriculture in a future development for SMEs. Processes 2019, 7, 36. [Google Scholar] [CrossRef]
  16. Zhang, L.Y.; Wang, A.C.; Zhang, H.Y.; Zhu, Q.Z.; Zhang, H.H.; Sun, W.H.; Niu, Y.X. Estimating leaf chlorophyll content of winter wheat from uav multispectral images using machine learning algorithms under different species, growth stages, and nitrogen stress conditions. Agriculture 2024, 14, 1064. [Google Scholar] [CrossRef]
  17. Rejeb, A.; Rejeb, K.; Abdollahi, A.; Hassoun, A. Precision agriculture: A bibliometric analysis and research agenda. Smart Agric. Technol. 2024, 9, 100684. [Google Scholar] [CrossRef]
  18. Sørensen, C.G.; Madsen, N.A.; Jacobsen, B.H. Organic farming scenarios: Operational analysis and costs of implementing innovative technologies. Biosyst. Eng. 2005, 91, 127–137. [Google Scholar] [CrossRef]
  19. Pedersen, S.M.; Fountas, S.; Have, H.; Blackmore, B. Agricultural robots—System analysis and economic feasibility. Precis. Agric. 2006, 7, 295–308. [Google Scholar] [CrossRef]
  20. Hussain, A.; Fatima, H.S.; Zia, S.M.; Hasan, S.; Khurram, M.; Stricker, D.; Afzal, M.Z. Development of cost-effective and easily replicable robust weeding machine—Premiering precision agriculture in Pakistan. Machines 2023, 11, 287. [Google Scholar] [CrossRef]
  21. Al-Amin, A.A.; Lowenberg-DeBoer, J.; Franklin, K.; Dickin, E.; Monaghan, J.M.; Behrendt, K. Autonomous regenerative agriculture: Swarm robotics to change farm economics. Smart Agric. Technol. 2025, 11, 101005. [Google Scholar] [CrossRef]
  22. Robotics, V. Sharp-Shooting Farm Robot. Available online: https://www.techbriefs.com/component/content/article/47381-sharp-shooting-farm-robot (accessed on 10 January 2023).
  23. Robotics, S. SwarmFarm SwarmBot. 2019. Available online: https://www.swarmfarm.com/s (accessed on 10 January 2023).
  24. Naim, M.; Rizzo, D.; Sauvée, L.; Medici, M. Advancing agroecology and sustainability with agricultural robots at field level: A scoping review. Comput. Electron. Agric. 2025, 237, 110650. [Google Scholar] [CrossRef]
  25. Santos, L.; Santos, F.N.; Magalhães, S.; Costa, P.; Reis, R. Path planning approach with the extraction of topological maps from occupancy grid maps in steep slope vineyards. In Proceedings of the 2019 IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC), Porto, Portugal, 24–26 April 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1–7. [Google Scholar] [CrossRef]
  26. Bechar, A.; Vigneault, C. Agricultural robots for field operations: Concepts and components. Biosyst. Eng. 2016, 149, 94–111. [Google Scholar] [CrossRef]
  27. Yang, Z.; Ouyang, L.; Zhang, Z.; Duan, J.; Yu, J.; Wang, H. Visual navigation path extraction of orchard hard pavement based on scanning method and neural network. Comput. Electron. Agric. 2022, 197, 106964. [Google Scholar] [CrossRef]
  28. Sun, H.; Slaughter, D.; Ruiz, M.P.; Gliever, C.; Upadhyaya, S.; Smith, R. RTK GPS mapping of transplanted row crops. Comput. Electron. Agric. 2010, 71, 32–37. [Google Scholar] [CrossRef]
  29. Koo, G.; Kim, K.; Chung, J.Y.; Choi, J.; Kwon, N.Y.; Kang, D.Y.; Sohn, H. Development of a high precision displacement measurement system by fusing a low cost RTK-GPS sensor and a force feedback accelerometer for infrastructure monitoring. Sensors 2017, 17, 2745. [Google Scholar] [CrossRef]
  30. Dabove, P.; Manzino, A.M. Artificial neural network for detecting incorrectly fixed phase ambiguities for L1 mass-market receivers. GPS Solut. 2017, 21, 1213–1219. [Google Scholar] [CrossRef]
  31. Chou, H.Y.; Khorsandi, F.; Vougioukas, S.G.; Fathallah, F.A. Developing and evaluating an autonomous agricultural all-terrain vehicle for field experimental rollover simulations. Comput. Electron. Agric. 2022, 194, 106735. [Google Scholar] [CrossRef]
  32. Bin, X.; Junxiong, Z.; Feng, Q.; Zhiqi, F.; Dashuai, W.; Wei, L. Navigation control system for orchard spraying machine based on Beidou navigation satellite system. Trans. Chin. Soc. Agric. Mach. 2017, 48, 45–50. [Google Scholar] [CrossRef]
  33. Perez-Ruiz, M.; Slaughter, D.C.; Gliever, C.; Upadhyaya, S.K. Tractor-based real-time kinematic-global positioning system (RTK-GPS) guidance system for geospatial mapping of row crop transplant. Biosyst. Eng. 2012, 111, 64–71. [Google Scholar] [CrossRef]
  34. Consoli, A.; Ayadi, J.; Bianchi, G.; Pluchino, S.; Piazza, F.; Baddour, R.; Parés, M.E.; Navarro, J.; Colomina, I.; Gameiro, A. A multi-antenna approach for UAV’s attitude determination. In Proceedings of the 2015 IEEE Metrology for Aerospace (MetroAeroSpace), Benevento, Italy, 4–5 June 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 401–405. [Google Scholar] [CrossRef]
  35. Eling, C.; Klingbeil, L.; Kuhlmann, H. Real-time single-frequency GPS/MEMS-IMU attitude determination of lightweight UAVs. Sensors 2015, 15, 26212–26235. [Google Scholar] [CrossRef]
  36. Galati, R.; Mantriota, G.; Reina, G. RoboNav: An affordable yet highly accurate navigation system for autonomous agricultural robots. Robotics 2022, 11, 99. [Google Scholar] [CrossRef]
  37. Mousazadeh, H. A technical review on navigation systems of agricultural autonomous off-road vehicles. J. Terramech. 2013, 50, 211–232. [Google Scholar] [CrossRef]
  38. Valente, D.S.M.; Momin, A.; Grift, T.; Hansen, A. Accuracy and precision evaluation of two low-cost RTK global navigation satellite systems. Comput. Electron. Agric. 2020, 168, 105142. [Google Scholar] [CrossRef]
  39. Lyle, S.D. Experiment to test RTK GPS with satellite “internet to tractor” for precision agriculture. Int. J. Agric. Environ. Inf. Syst. (IJAEIS) 2013, 4, 1–13. [Google Scholar] [CrossRef]
  40. Li, Q.; Nevalainen, P.; Peña Queralta, J.; Heikkonen, J.; Westerlund, T. Localization in unstructured environments: Towards autonomous robots in forests with delaunay triangulation. Remote Sens. 2020, 12, 1870. [Google Scholar] [CrossRef]
  41. Cong, C.; Guangqiao, C.; Jinlong, Z.; Jianping, H. Dynamic monitoring of harvester working progress based on traveling trajectory and header status. Eng. Agric. 2023, 43, e20220196. [Google Scholar] [CrossRef]
  42. Hao, H.; Fang, P.; Duan, E.; Yang, Z.; Wang, L.; Wang, H. A dead broiler inspection system for large-scale breeding farms based on deep learning. Agriculture 2022, 12, 1176. [Google Scholar] [CrossRef]
  43. Jan, M.S.; Ke, J.Y.; Chang, C.L. Integrated positioning method based on UWB and RTK-GNSS for seamless navigation of poultry robots. In Proceedings of the 2022 International Automatic Control Conference (CACS), Kaohsiung, Taiwan, China, 3–6 November 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1–6. [Google Scholar] [CrossRef]
  44. Feng, Q.; Wang, B.; Zhang, W.; Li, X. Development and test of spraying robot for anti-epidemic and disinfection in animal housing. In Proceedings of the 2021 WRC Symposium on Advanced Robotics and Automation (WRC SARA), Beijing, China, 11 September 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 24–29. [Google Scholar] [CrossRef]
  45. Li, M.H.; Gao, H.Y.; Zhao, M.X.; Mao, H.P. Development and experimentation of a real-time greenhouse positioning system based on IUKF-UWB. Agriculture 2024, 14, 1479. [Google Scholar] [CrossRef]
  46. Radcliffe, J.; Cox, J.; Bulanon, D.M. Machine vision for orchard navigation. Comput. Ind. 2018, 98, 165–171. [Google Scholar] [CrossRef]
  47. Åstrand, B.; Baerveldt, A.J. A vision based row-following system for agricultural field machinery. Mechatronics 2005, 15, 251–269. [Google Scholar] [CrossRef]
  48. Zheng, K.; Zhao, X.G.; Han, C.J.; He, Y.K.; Zhai, C.Y.; Zhao, C.J. Design and experiment of an automatic row-oriented spraying system based on machine vision for early-stage maize corps. Agriculture 2023, 13, 691. [Google Scholar] [CrossRef]
  49. Zhang, S.; Wang, Y.; Zhu, Z.; Li, Z.; Du, Y.; Mao, E. Tractor path tracking control based on binocular vision. Inf. Process. Agric. 2018, 5, 422–432. [Google Scholar] [CrossRef]
  50. Hiremath, S.; Van Evert, F.K.; ter Braak, C.; Stein, A.; van der Heijden, G. Image-based particle filtering for navigation in a semi-structured agricultural environment. Biosyst. Eng. 2014, 121, 85–95. [Google Scholar] [CrossRef]
  51. Chen, S.; Noguchi, N. Remote safety system for a robot tractor using a monocular camera and a YOLO-based method. Comput. Electron. Agric. 2023, 215, 108409. [Google Scholar] [CrossRef]
  52. Ma, Z.; Yang, S.Y.; Li, J.B.; Qi, J.T. Research on SLAM localization algorithm for orchard dynamic vision based on YOLOD-SLAM2. Agriculture 2024, 14, 1622. [Google Scholar] [CrossRef]
  53. Song, Y.; Xu, F.; Yao, Q.; Liu, J.; Yang, S. Navigation algorithm based on semantic segmentation in wheat fields using an RGB-D camera. Inf. Process. Agric. 2023, 10, 475–490. [Google Scholar] [CrossRef]
  54. Peng, Y.; Wang, A.C.; Liu, J.Z.; Faheem, M. A comparative study of semantic segmentation models for identification of grape with different varieties. Agriculture 2021, 11, 997. [Google Scholar] [CrossRef]
  55. Liu, H.; Zeng, X.; Shen, Y.; Xu, J.; Khan, Z. A single-stage navigation path extraction network for agricultural robots in orchards. Comput. Electron. Agric. 2025, 229, 109687. [Google Scholar] [CrossRef]
  56. Upadhyay, A.; Zhang, Y.; Koparan, C.; Rai, N.; Howatt, K.; Bajwa, S.; Sun, X. Advances in ground robotic technologies for site-specific weed management in precision agriculture: A review. Comput. Electron. Agric. 2024, 225, 109363. [Google Scholar] [CrossRef]
  57. Jiang, B.; He, J.; Yang, S.; Fu, H.; Li, T.; Song, H.; He, D. Fusion of machine vision technology and AlexNet-CNNs deep learning network for the detection of postharvest apple pesticide residues. Artif. Intell. Agric. 2019, 1, 1–8. [Google Scholar] [CrossRef]
  58. Abbas, I.; Liu, J.; Faheem, M.; Noor, R.S.; Shaikh, S.A.; Solangi, K.A.; Raza, S.M. Different sensor based intelligent spraying systems in Agriculture. Sens. Actuators A Phys. 2020, 316, 112265. [Google Scholar] [CrossRef]
  59. Liu, Y.; Noguchi, N.; Ishii, K. Attitude angle estimation for agricultural robot navigation based on sensor fusion with a low-cost IMU. IFAC Proc. Vol. 2013, 46, 130–134. [Google Scholar] [CrossRef]
  60. Hoang, M.L.; Pietrosanto, A. Yaw/Heading optimization by drift elimination on MEMS gyroscope. Sens. Actuators A Phys. 2021, 325, 112691. [Google Scholar] [CrossRef]
  61. Leanza, A.; Galati, R.; Ugenti, A.; Cavallo, E.; Reina, G. Where am I heading? A robust approach for orientation estimation of autonomous agricultural robots. Comput. Electron. Agric. 2023, 210, 107888. [Google Scholar] [CrossRef]
  62. Haixia, Q.; Banhazi, T.M.; Zhigang, Z.; Low, T.; Brookshaw, I.J. Preliminary laboratory test on navigation accuracy of an autonomous robot for measuring air quality in livestock buildings. Int. J. Agric. Biol. Eng. 2016, 9, 29–39. [Google Scholar] [CrossRef]
  63. Iqbal, J.; Xu, R.; Sun, S.; Li, C. Simulation of an autonomous mobile robot for LiDAR-based in-field phenotyping and navigation. Robotics 2020, 9, 46. [Google Scholar] [CrossRef]
  64. Grimaldi, V.; Simon, L.S.; Sans, M.; Courtois, G.; Lissek, H. Human head yaw estimation based on two 3-axis accelerometers. IEEE Sensors J. 2022, 22, 16963–16974. [Google Scholar] [CrossRef]
  65. Chen, W.; Li, X.; Zhang, H.; Jia, P.; Zou, F.; Lyu, W.; Sang, S. A heading correction technology based on magnetometer calibration and adaptive anti-interference algorithm. Sens. Actuators A Phys. 2023, 363, 114726. [Google Scholar] [CrossRef]
  66. Jones, M.H.; Bell, J.; Dredge, D.; Seabright, M.; Scarfe, A.; Duke, M.; MacDonald, B. Design and testing of a heavy-duty platform for autonomous navigation in kiwifruit orchards. Biosyst. Eng. 2019, 187, 129–146. [Google Scholar] [CrossRef]
  67. Mueller-Sim, T.; Jenkins, M.; Abel, J.; Kantor, G. The Robotanist: A ground-based agricultural robot for high-throughput crop phenotyping. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May 2017–3 June 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 3634–3639. [Google Scholar] [CrossRef]
  68. Higuti, V.A.; Velasquez, A.E.; Magalhaes, D.V.; Becker, M.; Chowdhary, G. Under canopy light detection and ranging-based autonomous navigation. J. Field Robot. 2019, 36, 547–567. [Google Scholar] [CrossRef]
  69. Bell, J.; MacDonald, B.A.; Ahn, H.S. Row following in pergola structured orchards. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Republic of Korea, 9–14 October 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 640–645. [Google Scholar] [CrossRef]
  70. Gasparino, M.V.; Higuti, V.A.; Velasquez, A.E.; Becker, M. Improved localization in a corn crop row using a rotated laser rangefinder for three-dimensional data acquisition. J. Braz. Soc. Mech. Sci. Eng. 2020, 42, 592. [Google Scholar] [CrossRef]
  71. Cadena, C.; Carlone, L.; Carrillo, H.; Latif, Y.; Scaramuzza, D.; Neira, J.; Reid, I.; Leonard, J.J. Past, present, and future of simultaneous localization and mapping: Toward the robust-perception age. IEEE Trans. Robot. 2016, 32, 1309–1332. [Google Scholar] [CrossRef]
  72. Sukvichai, K.; Thongton, N.; Yajai, K. Implementation of a monocular orb slam for an indoor agricultural drone. In Proceedings of the 2023 Third International Symposium on Instrumentation, Control, Artificial Intelligence, and Robotics (ICA-SYMP), Bangkok, Thailand, 18–20 January 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 45–48. [Google Scholar] [CrossRef]
  73. Barth, R.; Hemming, J.; Van Henten, E.J. Design of an eye-in-hand sensing and servo control framework for harvesting robotics in dense vegetation. Biosyst. Eng. 2016, 146, 71–84. [Google Scholar] [CrossRef]
  74. Kemper, R.J.H.; Gonzalez, C.; Gardini, S.R.P. Autonomous navigation of a four-wheeled robot in a simulated blueberry farm environment. In Proceedings of the 2022 IEEE ANDESCON, Barranquilla, Colombia, 16–19 November 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1–6. [Google Scholar] [CrossRef]
  75. Tan, H.; Zhao, X.; Zhai, C.; Fu, H.; Chen, L.; Yang, M. Design and experiments with a SLAM system for low-density canopy environments in greenhouses based on an improved Cartographer framework. Front. Plant Sci. 2024, 15, 1276799. [Google Scholar] [CrossRef] [PubMed]
  76. Yudanto, R.; Ompusunggu, A.P.; Bey-Temsamani, A. On improving low-cost IMU performance for online trajectory estimation. In Proceedings of the Smart Sensors, Actuators, and MEMS VII; and Cyber Physical Systems, Barcelona, Spain, 21 May 2015; SPIE: Bellingham, WA, USA, 2015; Volume 9517, pp. 639–650. [Google Scholar] [CrossRef]
  77. Fei, K.; Mai, C.; Jiang, R.; Zeng, Y.; Ma, Z.; Cai, J.; Li, J. Research on a Low-Cost High-Precision Positioning System for Orchard Mowers. Agriculture 2024, 14, 813. [Google Scholar] [CrossRef]
  78. Fu, Z.; Shi, Y.; Si, P.; Gao, S.; Yang, Y. Tightly coupled visual-inertial fusion with image enhancement for robust positioning. Meas. Sci. Technol. 2024, 35, 096311. [Google Scholar] [CrossRef]
  79. Qu, J.; Qiu, Z.; Li, L.; Guo, K.; Li, D. Map Construction and Positioning Method for LiDAR SLAM-Based Navigation of an Agricultural Field Inspection Robot. Agronomy 2024, 14, 2365. [Google Scholar] [CrossRef]
  80. Hiremath, S.A.; Van Der Heijden, G.W.; Van Evert, F.K.; Stein, A.; Ter Braak, C.J. Laser range finder model for autonomous navigation of a robot in a maize field using a particle filter. Comput. Electron. Agric. 2014, 100, 41–50. [Google Scholar] [CrossRef]
  81. Carvalho, G.S.; Silva, F.O.; Pacheco, M.V.O.; Campos, G.A. Performance analysis of relative GPS positioning for low-cost receiver-equipped agricultural rovers. Sensors 2023, 23, 8835. [Google Scholar] [CrossRef]
  82. Li, C.Q.; Wu, J.G.; Pan, X.Y.; Dou, H.J.; Zhao, X.G.; Gao, Y.Y.; Yang, S.; Zhai, C.Y. Design and experiment of a breakpoint continuous spraying system for automatic-guidance boom sprayers. Agriculture 2023, 13, 2203. [Google Scholar] [CrossRef]
  83. Rigatos, G.; Tzafestas, S. Extended Kalman filtering for fuzzy modelling and multi-sensor fusion. Math. Comput. Model. Dyn. Syst. 2007, 13, 251–266. [Google Scholar] [CrossRef]
  84. Nubert, J.; Khattak, S.; Hutter, M. Graph-based multi-sensor fusion for consistent localization of autonomous construction robots. In Proceedings of the 2022 International Conference on Robotics and Automation (ICRA), Philadelphia, PA, USA, 23–27 May 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 10048–10054. [Google Scholar] [CrossRef]
  85. Tian, Y.; Mai, Z.; Zeng, Z.; Cai, Y.; Yang, J.; Zhao, B.; Zhu, X.; Qi, L. Design and experiment of an integrated navigation system for a paddy field scouting robot. Comput. Electron. Agric. 2023, 214, 108336. [Google Scholar] [CrossRef]
  86. Gai, J.; Xiang, L.; Tang, L. Using a depth camera for crop row detection and mapping for under-canopy navigation of agricultural robotic vehicle. Comput. Electron. Agric. 2021, 188, 106301. [Google Scholar] [CrossRef]
  87. Yan, Y.; Zhang, B.; Zhou, J.; Zhang, Y.; Liu, X. Real-time localization and mapping utilizing multi-sensor fusion and visual–IMU–wheel odometry for agricultural robots in unstructured, dynamic and GPS-denied greenhouse environments. Agronomy 2022, 12, 1740. [Google Scholar] [CrossRef]
  88. Miao, C.; Chu, H.; Cao, J.; Sun, Z.; Yi, R. Steering angle adaptive estimation system based on GNSS and MEMS gyro. Comput. Electron. Agric. 2018, 153, 196–201. [Google Scholar] [CrossRef]
  89. Li, S.; Zhang, M.; Ji, Y.; Zhang, Z.; Cao, R.; Chen, B.; Li, H.; Yin, Y. Agricultural machinery GNSS/IMU-integrated navigation based on fuzzy adaptive finite impulse response Kalman filtering algorithm. Comput. Electron. Agric. 2021, 191, 106524. [Google Scholar] [CrossRef]
  90. Ban, C.; Wang, L.; Su, T.; Chi, R.; Fu, G. Fusion of monocular camera and 3D LiDAR data for navigation line extraction under corn canopy. Comput. Electron. Agric. 2025, 232, 110124. [Google Scholar] [CrossRef]
  91. Vroegindeweij, B.A.; IJsselmuiden, J.; van Henten, E.J. Probabilistic localisation in repetitive environments: Estimating a robot’s position in an aviary poultry house. Comput. Electron. Agric. 2016, 124, 303–317. [Google Scholar] [CrossRef]
  92. Moura, M.S.; Ruiz, X.; Serrano, D.; Rizzo, C. A multisensor factor-graph SLAM framework for steep slope vineyards. In Proceedings of the Iberian Robotics Conference, Coimbra, Portugal, 22–24 November 2023; Springer: Berlin/Heidelberg, Germany, 2024; pp. 386–397. [Google Scholar] [CrossRef]
  93. Zhang, W.; Gong, L.; Huang, S.; Wu, S.; Liu, C. Factor graph-based high-precision visual positioning for agricultural robots with fiducial markers. Comput. Electron. Agric. 2022, 201, 107295. [Google Scholar] [CrossRef]
  94. Jiang, A.; Ahamed, T. Development of an autonomous navigation system for orchard spraying robots integrating a thermal camera and LiDAR using a deep learning algorithm under low-and no-light conditions. Comput. Electron. Agric. 2025, 235, 110359. [Google Scholar] [CrossRef]
  95. Teng, H.; Wang, Y.; Chatziparaschis, D.; Karydis, K. Adaptive LiDAR odometry and mapping for autonomous agricultural mobile robots in unmanned farms. Comput. Electron. Agric. 2025, 232, 110023. [Google Scholar] [CrossRef]
  96. Yang, D.; Cui, D.; Ying, Y. Development and trends of chicken farming robots in chicken farming tasks: A review. Comput. Electron. Agric. 2024, 221, 108916. [Google Scholar] [CrossRef]
  97. Jiang, S.; Qi, P.; Han, L.; Liu, L.; Li, Y.; Huang, Z.; Liu, Y.; He, X. Navigation system for orchard spraying robot based on 3D LiDAR SLAM with NDT_ICP point cloud registration. Comput. Electron. Agric. 2024, 220, 108870. [Google Scholar] [CrossRef]
  98. Zhu, F.H.; Chen, J.; Guan, Z.H.; Zhu, Y.H.; Shi, H.; Cheng, K. Development of a combined harvester navigation control system based on visual simultaneous localization and mapping-inertial guidance fusion. J. Agric. Eng. 2024, 55. [Google Scholar] [CrossRef]
  99. Zhao, Z.; Zhang, Y.; Long, L.; Lu, Z.; Shi, J. Efficient and adaptive lidar–visual–inertial odometry for agricultural unmanned ground vehicle. Int. J. Adv. Robot. Syst. 2022, 19, 17298806221094925. [Google Scholar] [CrossRef]
  100. Joffe, B.P.; Usher, C.T. Autonomous robotic system for picking up floor eggs in poultry houses. In Proceedings of the 2017 ASABE Annual International Meeting. American Society of Agricultural and Biological Engineers, 2017, Spokane, WA, USA, 16–19 July 2017; p. 1. [Google Scholar] [CrossRef]
  101. Ren, G.; Lin, T.; Ying, Y.; Chowdhary, G.; Ting, K.C. Agricultural robotics research applicable to poultry production: A review. Comput. Electron. Agric. 2020, 169, 105216. [Google Scholar] [CrossRef]
  102. Gronewold, A.M.; Mulford, P.; Ray, E.; Ray, L.E. Tactile sensing & visually-impaired navigation in densely planted row crops, for precision fertilization by small ugvs. Comput. Electron. Agric. 2025, 231, 110003. [Google Scholar] [CrossRef]
  103. Shi, H.; Xu, G.; Lu, W.; Ding, Q.; Chen, X. An electric gripper for picking brown mushrooms with flexible force and in situ measurement. Agriculture 2024, 14, 1181. [Google Scholar] [CrossRef]
  104. Istiak, M.A.; Syeed, M.M.; Hossain, M.S.; Uddin, M.F.; Hasan, M.; Khan, R.H.; Azad, N.S. Adoption of Unmanned Aerial Vehicle (UAV) imagery in agricultural management: A systematic literature review. Ecol. Inform. 2023, 78, 102305. [Google Scholar] [CrossRef]
  105. Hua, W.; Zhang, Z.; Zhang, W.; Liu, X.; Hu, C.; He, Y.; Mhamed, M.; Li, X.; Dong, H.; Saha, C.K. Key technologies in apple harvesting robot for standardized orchards: A comprehensive review of innovations, challenges, and future directions. Comput. Electron. Agric. 2025, 235, 110343. [Google Scholar] [CrossRef]
  106. Liu, Z.; Wu, J.; Fu, L.; Majeed, Y.; Feng, Y.; Li, R.; Cui, Y. Improved kiwifruit detection using pre-trained VGG16 with RGB and NIR information fusion. IEEE Access 2019, 8, 2327–2336. [Google Scholar] [CrossRef]
  107. Yang, L.; Noguchi, T.; Hoshino, Y. Development of a pumpkin fruits pick-and-place robot using an RGB-D camera and a YOLO based object detection AI model. Comput. Electron. Agric. 2024, 227, 109625. [Google Scholar] [CrossRef]
  108. Chen, H.Y.; Sang, I.C.; Norris, W.R.; Soylemezoglu, A.; Nottage, D. Terrain classification method using an NIR or RGB camera with a CNN-based fusion of vision and a reduced-order proprioception model. Comput. Electron. Agric. 2024, 227, 109539. [Google Scholar] [CrossRef]
  109. Nguyen, A.H.; Holt, J.P.; Knauer, M.T.; Abner, V.A.; Lobaton, E.J.; Young, S.N. Towards rapid weight assessment of finishing pigs using a handheld, mobile RGB-D camera. Biosyst. Eng. 2023, 226, 155–168. [Google Scholar] [CrossRef]
  110. Tu, S.; Xue, Y.; Zheng, C.; Qi, Y.; Wan, H.; Mao, L. Detection of passion fruits and maturity classification using Red-Green-Blue Depth images. Biosyst. Eng. 2018, 175, 156–167. [Google Scholar] [CrossRef]
  111. Hu, T.T.; Wang, W.B.; Gu, J.A.; Xia, Z.L.; Zhang, J.; Wang, B. Research on apple object detection and localization method based on improved YOLOX and RGB-D images. Agronomy 2023, 13, 1816. [Google Scholar] [CrossRef]
  112. Arad, B.; Balendonck, J.; Barth, R.; Ben-Shahar, O.; Edan, Y.; Hellström, T.; Hemming, J.; Kurtser, P.; Ringdahl, O.; Tielen, T. Development of a sweet pepper harvesting robot. J. Field Robot. 2020, 37, 1027–1039. [Google Scholar] [CrossRef]
  113. Zhang, Y.; Tian, Y.; Zhao, D.; Gao, P.; Duan, K. Segmentation of apple point clouds based on ROI in RGB images. INMATEH-Agric. Eng. 2019, 58, 209–218. [Google Scholar] [CrossRef]
  114. Shafi, U.; Mumtaz, R.; Iqbal, N.; Zaidi, S.M.H.; Zaidi, S.A.R.; Hussain, I.; Mahmood, Z. A multi-modal approach for crop health mapping using low altitude remote sensing, internet of things (IoT) and machine learning. IEEE Access 2020, 8, 112708–112724. [Google Scholar] [CrossRef]
  115. Guo, Y.; Ren, H. Remote sensing monitoring of maize and paddy rice planting area using GF-6 WFV red edge features. Comput. Electron. Agric. 2023, 207, 107714. [Google Scholar] [CrossRef]
  116. Scutelnic, D.; Muradore, R.; Daffara, C. A multispectral camera in the VIS–NIR equipped with thermal imaging and environmental sensors for non invasive analysis in precision agriculture. HardwareX 2024, 20, e00596. [Google Scholar] [CrossRef]
  117. Zhang, S.C.; Xue, X.Y.; Chen, C.; Sun, Z.; Sun, T. Development of a low-cost quadrotor UAV based on ADRC for agricultural remote sensing. Int. J. Agric. Biol. Eng. 2019, 12, 82–87. [Google Scholar] [CrossRef]
  118. Wang, R.; Zhao, H.; Zhang, C.; Hao, Z.; Chen, A.; Xu, R.; He, J. Development of soil water content retrieving method for irrigation agriculture areas using the red-edge band of Gaofen-6 satellite. Agric. Water Manag. 2024, 303, 109045. [Google Scholar] [CrossRef]
  119. Gozukara, G.; Akça, E.; Dengiz, O.; Kapur, S.; Adak, A. Soil particle size prediction using Vis-NIR and pXRF spectra in a semiarid agricultural ecosystem in Central Anatolia of Türkiye. Catena 2022, 217, 106514. [Google Scholar] [CrossRef]
  120. Sun, Y.; Qin, Q.; Zhang, Y.; Ren, H.; Han, G.; Zhang, Z.; Zhang, T.; Wang, B. A leaf chlorophyll vegetation index with reduced LAI effect based on Sentinel-2 multispectral red-edge information. Comput. Electron. Agric. 2025, 236, 110500. [Google Scholar] [CrossRef]
  121. Wang, H.; Jiang, M.; Yan, L.; Yao, Y.; Fu, Y.; Luo, S.; Lin, Y. Angular effect in proximal sensing of leaf-level chlorophyll content using low-cost DIY visible/near-infrared camera. Comput. Electron. Agric. 2020, 178, 105765. [Google Scholar] [CrossRef]
  122. Andritoiu, D.; Bazavan, L.C.; Besnea, F.L.; Roibu, H.; Bizdoaca, N.G. Agriculture autonomous monitoring and decisional mechatronic system. In Proceedings of the 2018 19th International Carpathian Control Conference (ICCC), Szilvasvarad, Hungary, 28–31 May 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 241–246. [Google Scholar] [CrossRef]
  123. Sun, Q.; Gu, X.; Chen, L.; Xu, X.; Wei, Z.; Pan, Y.; Gao, Y. Monitoring maize canopy chlorophyll density under lodging stress based on UAV hyperspectral imagery. Comput. Electron. Agric. 2022, 193, 106671. [Google Scholar] [CrossRef]
  124. Liang, L.; Zhao, S.H.; Qin, Z.H.; He, K.X.; Chong, C.; Luo, Y.X.; Zhou, X.D. Drought change trend using MODIS TVDI and its relationship with climate factors in China from 2001 to 2010. J. Integr. Agric. 2014, 13, 1501–1508. [Google Scholar] [CrossRef]
  125. Guo, J.; Bai, Q.; Guo, W.; Bu, Z.; Zhang, W. Soil moisture content estimation in winter wheat planting area for multi-source sensing data using CNNR. Comput. Electron. Agric. 2022, 193, 106670. [Google Scholar] [CrossRef]
  126. Liu, Y.; Qian, J.; Yue, H. Comprehensive evaluation of Sentinel-2 red edge and shortwave-infrared bands to estimate soil moisture. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 7448–7465. [Google Scholar] [CrossRef]
  127. Santasup, N.; Theanjumpol, P.; Santasup, C.; Kittiwachana, S.; Mawan, N.; Prantong, L.; Khongdee, N. Development of near-infrared spectroscopy (NIRS) for estimating organic matter, total carbon, and total nitrogen in agricultural soil. MethodsX 2024, 13, 102798. [Google Scholar] [CrossRef]
  128. Munawar, A.A.; Yunus, Y.; Satriyo, P. Calibration models database of near infrared spectroscopy to predict agricultural soil fertility properties. Data Brief 2020, 30, 105469. [Google Scholar] [CrossRef] [PubMed]
  129. Havens, K.J.; Sharp, E.J. Thermal Imaging Techniques to Survey and Monitor Animals in the Wild: A Methodology; Academic Press: Cambridge, MA, USA, 2015. [Google Scholar] [CrossRef]
  130. Das, S.; Christopher, J.; Apan, A.; Choudhury, M.R.; Chapman, S.; Menzies, N.W.; Dang, Y.P. UAV-thermal imaging: A robust technology to evaluate in-field crop water stress and yield variation of wheat genotypes. In Proceedings of the 2020 IEEE India Geoscience and Remote Sensing Symposium (InGARSS), Ahmedabad, India, 1–4 December 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 138–141. [Google Scholar] [CrossRef]
  131. Naik, S.; Patel, B. Thermal imaging with fuzzy classifier for maturity and size based non-destructive mango (Mangifera indica L.) grading. In Proceedings of the 2017 International Conference on Emerging Trends & Innovation in ICT (ICEI), Pune, India, 3–5 February 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 15–20. [Google Scholar] [CrossRef]
  132. Yang, Z.; Sun, W.; Liu, F.; Zhang, Y.; Chen, X.; Wei, Z.; Li, X. Field collaborative recognition method and experiment for thermal infrared imaging of damaged potatoes. Comput. Electron. Agric. 2024, 223, 109096. [Google Scholar] [CrossRef]
  133. Gräf, M.; Immitzer, M.; Hietz, P.; Stangl, R. Water-stressed plants do not cool: Leaf surface temperature of living wall plants under drought stress. Sustainability 2021, 13, 3910. [Google Scholar] [CrossRef]
  134. Zhou, Z.; Majeed, Y.; Naranjo, G.D.; Gambacorta, E.M. Assessment for crop water stress with infrared thermal imagery in precision agriculture: A review and future prospects for deep learning applications. Comput. Electron. Agric. 2021, 182, 106019. [Google Scholar] [CrossRef]
  135. Gómez-Candón, D.; Virlet, N.; Labbé, S.; Jolivot, A.; Regnard, J.L. Field phenotyping of water stress at tree scale by UAV-sensed imagery: New insights for thermal acquisition and calibration. Precis. Agric. 2016, 17, 786–800. [Google Scholar] [CrossRef]
  136. Ribeiro-Gomes, K.; Hernández-López, D.; Ortega, J.F.; Ballesteros, R.; Poblete, T.; Moreno, M.A. Uncooled thermal camera calibration and optimization of the photogrammetry process for UAV applications in agriculture. Sensors 2017, 17, 2173. [Google Scholar] [CrossRef]
  137. Esposito, M.; Crimaldi, M.; Cirillo, V.; Sarghini, F.; Maggio, A. Drone and sensor technology for sustainable weed management: A review. Chem. Biol. Technol. Agric. 2021, 8, 1–11. [Google Scholar] [CrossRef]
  138. Sousa, J.J.; Toscano, P.; Matese, A.; Di Gennaro, S.F.; Berton, A.; Gatti, M.; Poni, S.; Pádua, L.; Hruška, J.; Morais, R. UAV-based hyperspectral monitoring using push-broom and snapshot sensors: A multisite assessment for precision viticulture applications. Sensors 2022, 22, 6574. [Google Scholar] [CrossRef]
  139. Du, B.; Mao, D.; Wang, Z.; Qiu, Z.; Yan, H.; Feng, K.; Zhang, Z. Mapping wetland plant communities using unmanned aerial vehicle hyperspectral imagery by comparing object/pixel-based classifications combining multiple machine-learning algorithms. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 8249–8258. [Google Scholar] [CrossRef]
  140. Bourriz, M.; Hajji, H.; Laamrani, A.; Elbouanani, N.; Abdelali, H.A.; Bourzeix, F.; El-Battay, A.; Amazirh, A.; Chehbouni, A. Integration of hyperspectral imaging and ai techniques for crop type mapping: Present status, trends, and challenges. Remote Sens. 2025, 17, 1574. [Google Scholar] [CrossRef]
  141. Sun, Q.; Chen, L.; Gu, X.; Zhang, S.; Dai, M.; Zhou, J.; Gu, L.; Zhen, W. Estimation of canopy nitrogen nutrient status in lodging maize using unmanned aerial vehicles hyperspectral data. Ecol. Inform. 2023, 78, 102315. [Google Scholar] [CrossRef]
  142. Meiyan, S.; Qizhou, D.; ShuaiPeng, F.; Xiaohong, Y.; Jinyu, Z.; Lei, M.; Baoguo, L.; Yuntao, M. Improved estimation of canopy water status in maize using UAV-based digital and hyperspectral images. Comput. Electron. Agric. 2022, 197, 106982. [Google Scholar] [CrossRef]
  143. Zhang, Y.; Sun, J.; Li, J.; Wu, X.; Dai, C. Quantitative analysis of cadmium content in tomato leaves based on hyperspectral image and feature selection. Appl. Eng. Agric. 2018, 34, 789–798. [Google Scholar] [CrossRef]
  144. Li, Y.; Al-Sarayreh, M.; Irie, K.; Hackell, D.; Bourdot, G.; Reis, M.M.; Ghamkhar, K. Identification of weeds based on hyperspectral imaging and machine learning. Front. Plant Sci. 2021, 11, 611622. [Google Scholar] [CrossRef]
  145. Xiao, Z.T.; Yin, K.; Geng, L.; Wu, J.; Zhang, F.; Liu, Y.B. Pest identification via hyperspectral image and deep learning. Signal Image Video Process. 2022, 16, 873–880. [Google Scholar] [CrossRef]
  146. Wan, L.; Li, H.; Li, C.S.; Wang, A.C.; Yang, Y.H.; Wang, P. Hyperspectral sensing of plant diseases: Principle and methods. Agronomy 2022, 12, 1451. [Google Scholar] [CrossRef]
  147. Henila, M.; Chithra, P. Segmentation using fuzzy cluster-based thresholding method for apple fruit sorting. IET Image Process. 2020, 14, 4178–4187. [Google Scholar] [CrossRef]
  148. He, Z.; Xiong, J.; Chen, S.; Li, Z.; Chen, S.; Zhong, Z.; Yang, Z. A method of green citrus detection based on a deep bounding box regression forest. Biosyst. Eng. 2020, 193, 206–215. [Google Scholar] [CrossRef]
  149. Qu, H.; Du, H.; Tang, X.; Zhai, S. Citrus fruit diameter estimation in the field using monocular camera. Biosyst. Eng. 2025, 252, 47–60. [Google Scholar] [CrossRef]
  150. Niu, Y.X.; Han, W.T.; Zhang, H.H.; Zhang, L.Y.; Chen, H.P. Estimating maize plant height using a crop surface model constructed from UAV RGB images. Biosyst. Eng. 2024, 241, 56–67. [Google Scholar] [CrossRef]
  151. Parrish, E.; Goksel, A. Pictorial pattern recognition applied to fruit harvesting. Trans. ASAE 1977, 20, 0822–0827. [Google Scholar] [CrossRef]
  152. Liu, B.; Gould, S.; Koller, D. Single image depth estimation from predicted semantic labels. In Proceedings of the 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Francisco, CA, USA, 13–18 June 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 1253–1260. [Google Scholar] [CrossRef]
  153. Khan, F.; Salahuddin, S.; Javidnia, H. Deep learning-based monocular depth estimation methods—A state-of-the-art review. Sensors 2020, 20, 2272. [Google Scholar] [CrossRef] [PubMed]
  154. Masoumian, A.; Rashwan, H.A.; Cristiano, J.; Asif, M.S.; Puig, D. Monocular depth estimation using deep learning: A review. Sensors 2022, 22, 5353. [Google Scholar] [CrossRef] [PubMed]
  155. Luo, Y.S.; Wei, L.L.; Xu, L.Z.; Zhang, Q.; Liu, J.Y.; Cai, Q.B.; Zhang, W.B. Stereo-vision-based multi-crop harvesting edge detection for precise automatic steering of combine harvester. Biosyst. Eng. 2022, 215, 115–128. [Google Scholar] [CrossRef]
  156. Ji, W.; Meng, X.; Qian, Z.; Xu, B.; Zhao, D. Branch localization method based on the skeleton feature extraction and stereo matching for apple harvesting robot. Int. J. Adv. Robot. Syst. 2017, 14, 1729881417705276. [Google Scholar] [CrossRef]
  157. Li, T.; Fang, W.; Zhao, G.; Gao, F.; Wu, Z.; Li, R.; Fu, L.; Dhupia, J. An improved binocular localization method for apple based on fruit detection using deep learning. Inf. Process. Agric. 2023, 10, 276–287. [Google Scholar] [CrossRef]
  158. Atif, M.; Lee, S. Adaptive pattern resolution for structured light 3D camera system. In Proceedings of the 2018 IEEE SENSORS, New Delhi, India, 28–31 October 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–4. [Google Scholar] [CrossRef]
  159. Zhang, Y.; Li, N.; Zhang, L.; Lin, J.; Gao, X.; Chen, G. A review on the recent developments in vision-based apple-harvesting robots for recognizing fruit and picking pose. Comput. Electron. Agric. 2025, 231, 109968. [Google Scholar] [CrossRef]
  160. Gao, C.; Jiang, H.; Liu, X.; Li, H.; Wu, Z.; Sun, X.; He, L.; Mao, W.; Majeed, Y.; Li, R. Improved binocular localization of kiwifruit in orchard based on fruit and calyx detection using YOLOv5x for robotic picking. Comput. Electron. Agric. 2024, 217, 108621. [Google Scholar] [CrossRef]
  161. Wang, B.; Chen, Z.; Gao, J.; Fu, L.; Su, B.; Cui, Y. The acquisition of kiwifruit feature point coordinates based on the spatial coordinates of image. In Proceedings of the Computer and Computing Technologies in Agriculture IX, Beijing, China, 27–30 September 2015; Springer: Berlin/Heidelberg, Germany, 2016; pp. 399–411. [Google Scholar] [CrossRef]
  162. Tang, Y.; Zhou, H.; Wang, H.; Zhang, Y. Fruit detection and positioning technology for a Camellia oleifera C. Abel orchard based on improved YOLOv4-tiny model and binocular stereo vision. Expert Syst. Appl. 2023, 211, 118573. [Google Scholar] [CrossRef]
  163. Perez, R.M.; Cheein, F.A.; Rosell-Polo, J.R. Flexible system of multiple RGB-D sensors for measuring and classifying fruits in agri-food Industry. Comput. Electron. Agric. 2017, 139, 231–242. [Google Scholar] [CrossRef]
  164. Mack, J.; Lenz, C.; Teutrine, J.; Steinhage, V. High-precision 3D detection and reconstruction of grapes from laser range data for efficient phenotyping based on supervised learning. Comput. Electron. Agric. 2017, 135, 300–311. [Google Scholar] [CrossRef]
  165. Yi, W.; Xia, S.; Kuzmin, S.; Gerasimov, I.; Cheng, X. RTFVE-YOLOv9: Real-time fruit volume estimation model integrating YOLOv9 and binocular stereo vision. Comput. Electron. Agric. 2025, 236, 110401. [Google Scholar] [CrossRef]
  166. Andujar, D.; Ribeiro, A.; Fernández-Quintanilla, C.; Dorado, J. Using depth cameras to extract structural parameters to assess the growth state and yield of cauliflower crops. Comput. Electron. Agric. 2016, 122, 67–73. [Google Scholar] [CrossRef]
  167. Vázquez-Arellano, M.; Reiser, D.; Paraforos, D.S.; Garrido-Izard, M.; Burce, M.E.C.; Griepentrog, H.W. 3-D reconstruction of maize plants using a time-of-flight camera. Comput. Electron. Agric. 2018, 145, 235–247. [Google Scholar] [CrossRef]
  168. Xiao, L.; Ding, K.; Gao, Y.; Rao, X. Behavior-induced health condition monitoring of caged chickens using binocular vision. Comput. Electron. Agric. 2019, 156, 254–262. [Google Scholar] [CrossRef]
  169. Salau, J.; Haas, J.H.; Junge, W.; Thaller, G. Automated calculation of udder depth and rear leg angle in Holstein-Friesian cows using a multi-Kinect cow scanning system. Biosyst. Eng. 2017, 160, 154–169. [Google Scholar] [CrossRef]
  170. Yin, L.; Cai, G.; Tian, X.; Sun, A.; Shi, S.; Zhong, H.; Liang, S. Three dimensional point cloud reconstruction and body size measurement of pigs based on multi-view depth camera. Trans. Chin. Soc. Agric. Eng. 2019, 35, 201–208. [Google Scholar] [CrossRef]
  171. Zhang, Y.Y.; Zhang, B.; Shen, C.; Liu, H.L.; Huang, J.C.; Tian, K.P.; Tang, Z. Review of the field environmental sensing methods based on multi-sensor information fusion technology. Int. J. Agric. Biol. Eng. 2024, 17, 1–13. [Google Scholar] [CrossRef]
  172. Liu, Q.C.; Yu, R.H.; Cai, Y.F.; Yuan, Q.; Wei, H.L.; Lv, C. Collision risk prediction and takeover requirements assessment based on radar-video integrated sensors data: A system framework based on LLM. Accid. Anal. Prev. 2025, 218, 108041. [Google Scholar] [CrossRef]
  173. Kang, H.; Wang, X.; Chen, C. Accurate fruit localisation using high resolution LiDAR-camera fusion and instance segmentation. Comput. Electron. Agric. 2022, 203, 107450. [Google Scholar] [CrossRef]
  174. Zhang, F.; Hassanzadeh, A.; Letendre, P.; Kikkert, J.; Pethybridge, S.; van Aardt, J. Enhancing snap bean yield prediction through synergistic integration of UAS-Based LiDAR and multispectral imagery. Comput. Electron. Agric. 2025, 230, 109923. [Google Scholar] [CrossRef]
  175. Dash, S.K.; Sembhi, H.; Langsdale, M.; Wooster, M.; Dodd, E.; Ghent, D.; Sinha, R. Assessing the field-scale crop water condition over an intensive agricultural plain using UAV-based thermal and multispectral imagery. J. Hydrol. 2025, 655, 132966. [Google Scholar] [CrossRef]
  176. Javidan, S.M.; Banakar, A.; Vakilian, K.A.; Ampatzidis, Y.; Rahnama, K. Early detection and spectral signature identification of tomato fungal diseases (Alternaria alternata, Alternaria solani, Botrytis cinerea, and Fusarium oxysporum) by RGB and hyperspectral image analysis and machine learning. Heliyon 2024, 10, e38017. [Google Scholar] [CrossRef] [PubMed]
  177. Bhole, A.; Udmale, S.S.; Falzon, O.; Azzopardi, G. CORF3D contour maps with application to Holstein cattle recognition from RGB and thermal images. Expert Syst. Appl. 2022, 192, 116354. [Google Scholar] [CrossRef]
  178. Gutiérrez, S.; Wendel, A.; Underwood, J. Spectral filter design based on in-field hyperspectral imaging and machine learning for mango ripeness estimation. Comput. Electron. Agric. 2019, 164, 104890. [Google Scholar] [CrossRef]
  179. Li, R.; Wang, X.; Cui, Y.; Xu, Y.; Zhou, Y.; Tang, X.; Jiang, C.; Song, Y.; Dong, H.; Yan, S. A Semi-Supervised Diffusion-Based Framework for Weed Detection in Precision Agricultural Scenarios Using a Generative Attention Mechanism. Agriculture 2025, 15, 434. [Google Scholar] [CrossRef]
  180. Li, Y.; Chao, X. Semi-supervised few-shot learning approach for plant diseases recognition. Plant Methods 2021, 17, 68. [Google Scholar] [CrossRef]
  181. Benchallal, F.; Hafiane, A.; Ragot, N.; Canals, R. ConvNeXt based semi-supervised approach with consistency regularization for weeds classification. Expert Syst. Appl. 2024, 239, 122222. [Google Scholar] [CrossRef]
  182. Hamdan, M.K.; Rover, D.T.; Darr, M.J.; Just, J. Generalizable semi-supervised learning method to estimate mass from sparsely annotated images. Comput. Electron. Agric. 2020, 175, 105533. [Google Scholar] [CrossRef]
  183. Li, J.; Zhao, X.; Xu, H.; Zhang, L.; Xie, B.; Yan, J.; Zhang, L.; Fan, D.; Li, L. An interpretable high-accuracy method for rice disease detection based on multisource data and transfer learning. Plants 2023, 12, 3273. [Google Scholar] [CrossRef]
  184. Simhadri, C.G.; Kondaveeti, H.K. Automatic recognition of rice leaf diseases using transfer learning. Agronomy 2023, 13, 961. [Google Scholar] [CrossRef]
  185. Buchke, P.; Mayuri, A. Recognize and classify illnesses on tomato leaves using EfficientNet’s Transfer Learning Approach with different size dataset. Signal Image Video Process. 2024, 18, 731–746. [Google Scholar] [CrossRef]
  186. Emmi, L.; Le Flécher, E.; Cadenat, V.; Devy, M. A hybrid representation of the environment to improve autonomous navigation of mobile robots in agriculture. Precis. Agric. 2021, 22, 524–549. [Google Scholar] [CrossRef]
  187. Wang, Z.; Bian, Y.; Shladover, S.E.; Wu, G.; Li, S.E.; Barth, M.J. A survey on cooperative longitudinal motion control of multiple connected and automated vehicles. IEEE Intell. Transp. Syst. Mag. 2019, 12, 4–24. [Google Scholar] [CrossRef]
  188. Le, W.; Xue, Z.; Chen, J.; Zhang, Z. Coverage path planning based on the optimization strategy of multiple solar powered unmanned aerial vehicles. Drones 2022, 6, 203. [Google Scholar] [CrossRef]
  189. Zeeshan, S.; Aized, T. Performance analysis of path planning algorithms for fruit harvesting robot. J. Biosyst. Eng. 2023, 48, 178–197. [Google Scholar] [CrossRef]
  190. Zhou, M.; Sun, H.; Xu, X.; Yang, J.; Wang, G.; Wei, Z.; Xu, T.; Yin, J. Study on the method and mechanism of seedling picking for pepper (Capsicum annuum L.) plug seedlings. Agriculture 2023, 14, 11. [Google Scholar] [CrossRef]
  191. Kok, E.; Chen, C. Occluded apples orientation estimator based on deep learning model for robotic harvesting. Comput. Electron. Agric. 2024, 219, 108781. [Google Scholar] [CrossRef]
  192. Kang, H.; Zhou, H.; Chen, C. Visual perception and modeling for autonomous apple harvesting. IEEE Access 2020, 8, 62151–62163. [Google Scholar] [CrossRef]
  193. Li, T.; Xie, F.; Zhao, Z.; Zhao, H.; Guo, X.; Feng, Q. A multi-arm robot system for efficient apple harvesting: Perception, task plan and control. Comput. Electron. Agric. 2023, 211, 107979. [Google Scholar] [CrossRef]
  194. Liu, L.; Wang, X.; Yang, X.; Liu, H.; Li, J.; Wang, P. Path planning techniques for mobile robots: Review and prospect. Expert Syst. Appl. 2023, 227, 120254. [Google Scholar] [CrossRef]
  195. Ahmed, S.; Qiu, B.J.; Kong, C.W.; Xin, H.; Ahmad, F.; Lin, J.L. A Data-Driven Dynamic Obstacle Avoidance Method for Liquid-Carrying Plant Protection UAVs. Agronomy 2022, 12, 873. [Google Scholar] [CrossRef]
  196. Quan, L.; Han, L.; Zhou, B.; Shen, S.; Gao, F. Survey of UAV motion planning. IET Cyber-Systems Robot. 2020, 2, 14–21. [Google Scholar] [CrossRef]
  197. Cui, B.B.; Cui, X.Y.; Wei, X.H.; Zhu, Y.Y.; Ma, Z.; Zhao, Y.; Liu, Y.F. Design and testing of a tractor automatic navigation system based on dynamic path search and a fuzzy stanley model. Agriculture 2024, 14, 2136. [Google Scholar] [CrossRef]
  198. Yao, Z.; Zhao, C.; Zhang, T. Agricultural machinery automatic navigation technology. Iscience 2024, 27, 108714. [Google Scholar] [CrossRef] [PubMed]
  199. Zhao, X.; Liu, Z.; Liu, Y.; Zhang, B.; Sui, J.; Jiang, K. Structure design and application of combination track intelligent inspection robot used in substation indoor. Procedia Comput. Sci. 2017, 107, 190–195. [Google Scholar] [CrossRef]
  200. Zhang, Y.; Sun, W.; Yang, J.; Wu, W.; Miao, H.; Zhang, S. An approach for autonomous feeding robot path planning in poultry smart farm. Animals 2022, 12, 3089. [Google Scholar] [CrossRef] [PubMed]
  201. Xie, F.; Guo, Z.W.; Li, T.; Feng, Q.C.; Zhao, C.J. Dynamic task planning for multi-arm harvesting robots under multiple constraints using deep reinforcement learning. Horticulturae 2025, 11, 88. [Google Scholar] [CrossRef]
  202. Li, H. A Visual Recognition and Path Planning Method for Intelligent Fruit-Picking Robots. Sci. Program. 2022, 2022, 1297274. [Google Scholar] [CrossRef]
  203. Li, Y.; Wu, T.; Xiao, Y.; Gong, L.; Liu, C. Path planning in continuous adjacent farmlands and robust path-tracking control of a rice-seeding robot in paddy field. Comput. Electron. Agric. 2023, 210, 107900. [Google Scholar] [CrossRef]
  204. Lei, T.; Li, G.; Luo, C.; Zhang, L.; Liu, L.; Gates, R.S. An informative planning-based multi-layer robot navigation system as applied in a poultry barn. Intell. Robot. 2022, 2, 313–332. [Google Scholar] [CrossRef]
  205. Chen, D.; Wang, Q.; Lin, Y.; Ma, Z.; Sun, L.; Yu, G. Path tracking control of paddy field weeder integrated with satellite and visual methods. Comput. Electron. Agric. 2025, 234, 110257. [Google Scholar] [CrossRef]
  206. Vroegindeweij, B.A.; van Willigenburg, G.L.; Koerkamp, P.W.G.; van Henten, E.J. Path planning for the autonomous collection of eggs on floors. Biosyst. Eng. 2014, 121, 186–199. [Google Scholar] [CrossRef]
  207. Wang, Y.; Ye, Y.; Wu, H.; Tao, K.; Qian, M. In different weed distributions, the dynamic coverage algorithm for mechanical selective weeding robot. Comput. Electron. Agric. 2024, 226, 109486. [Google Scholar] [CrossRef]
  208. Ma, Z.; Qiu, H.; Wang, H.; Yang, L.; Huang, L.; Qiu, R. A* algorithm path planning and minimum snap trajectory generation for mobile robot. In Proceedings of the 2021 4th International Conference on Robotics, Control and Automation Engineering (RCAE), Wuhan, China, 4–6 November 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 284–288. [Google Scholar] [CrossRef]
  209. Webb, D.J.; Van Den Berg, J. Kinodynamic RRT*: Asymptotically optimal motion planning for robots with linear dynamics. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 5054–5061. [Google Scholar] [CrossRef]
210. Petereit, J.; Emter, T.; Frey, C.W.; Kopfstedt, T.; Beutel, A. Application of hybrid A* to an autonomous mobile robot for path planning in unstructured outdoor environments. In Proceedings of the ROBOTIK 2012, 7th German Conference on Robotics, Munich, Germany, 21–22 May 2012; VDE: Berlin, Germany, 2012; pp. 1–6. [Google Scholar]
211. Zucker, M.; Ratliff, N.; Dragan, A.D.; Pivtoraiko, M.; Klingensmith, M.; Dellin, C.M.; Bagnell, J.A.; Srinivasa, S.S. CHOMP: Covariant Hamiltonian optimization for motion planning. Int. J. Robot. Res. 2013, 32, 1164–1193. [Google Scholar] [CrossRef]
  212. Wang, N.; Li, X.; Zhang, K.; Wang, J.; Xie, D. A survey on path planning for autonomous ground vehicles in unstructured environments. Machines 2024, 12, 31. [Google Scholar] [CrossRef]
  213. Chen, C.; Song, Z.; Li, X.; Chen, C.; Yang, F.; Wang, Z. Research status of apple picking robotic arm picking strategy and end-effector. Comput. Electron. Agric. 2025, 235, 110349. [Google Scholar] [CrossRef]
  214. Zhuang, M.; Li, G.; Ding, K. Obstacle avoidance path planning for apple picking robotic arm incorporating artificial potential field and A* algorithm. IEEE Access 2023, 11, 100070–100082. [Google Scholar] [CrossRef]
  215. Wang, B.; Du, X.X.; Wang, Y.N.; Mao, H.P. Multi-machine collaboration realization conditions and precise and efficient production mode of intelligent agricultural machinery. Int. J. Agric. Biol. Eng. 2024, 17, 27–36. [Google Scholar] [CrossRef]
  216. Gao, B.; Liu, Y.J.; Liu, L. Adaptive neural fault-tolerant control of a quadrotor UAV via fast terminal sliding mode. Aerosp. Sci. Technol. 2022, 129, 107818. [Google Scholar] [CrossRef]
  217. Ji, X.; He, X.; Lv, C.; Liu, Y.; Wu, J. Adaptive-neural-network-based robust lateral motion control for autonomous vehicle at driving limits. Control Eng. Pract. 2018, 76, 41–53. [Google Scholar] [CrossRef]
  218. Ren, Z.; Zheng, H.; Chen, J.; Chen, T.; Xie, P.; Xu, Y.; Deng, J.; Wang, H.; Sun, M.; Jiao, W. Integrating UAV, UGV and UAV-UGV collaboration in future industrialized agriculture: Analysis, opportunities and challenges. Comput. Electron. Agric. 2024, 227, 109631. [Google Scholar] [CrossRef]
  219. Wang, S.; Chen, J.; He, X. An adaptive composite disturbance rejection for attitude control of the agricultural quadrotor UAV. ISA Trans. 2022, 129, 564–579. [Google Scholar] [CrossRef]
  220. Kraus, T.; Ferreau, H.J.; Kayacan, E.; Ramon, H.; De Baerdemaeker, J.; Diehl, M.; Saeys, W. Moving horizon estimation and nonlinear model predictive control for autonomous agricultural vehicles. Comput. Electron. Agric. 2013, 98, 25–33. [Google Scholar] [CrossRef]
  221. Tang, L.D.; Wang, W.; Zhang, C.J.; Wang, Z.Y.; Ge, Z.Y.; Yuan, S.Q. Linear active disturbance rejection control system for the travel speed of an electric reel sprinkling irrigation machine. Agriculture 2024, 14, 1544. [Google Scholar] [CrossRef]
  222. Wang, Y.; Gao, J.; Li, K.; Chen, H. Integrated design of control allocation and triple-step control for over-actuated electric ground vehicles with actuator faults. J. Frankl. Inst. 2020, 357, 3150–3167. [Google Scholar] [CrossRef]
  223. Jiang, Y.; Meng, H.; Chen, G.; Yang, C.; Xu, X.; Zhang, L.; Xu, H. Differential-steering based path tracking control and energy-saving torque distribution strategy of 6WID unmanned ground vehicle. Energy 2022, 254, 124209. [Google Scholar] [CrossRef]
  224. Yue, M.; Wu, X.; Guo, L.; Gao, J. Quintic polynomial-based obstacle avoidance trajectory planning and tracking control framework for tractor-trailer system. Int. J. Control. Autom. Syst. 2019, 17, 2634–2646. [Google Scholar] [CrossRef]
  225. Sun, M.; Liu, D. Two-loop control of harvesting mechanical arm base on adaptive input shaping algorithm. In Proceedings of the 2022 International Conference on Virtual Reality, Human-Computer Interaction and Artificial Intelligence (VRHCIAI), Changsha, China, 28–30 October 2022; IEEE: Piscataway, NJ, USA, 2023; pp. 182–188. [Google Scholar] [CrossRef]
  226. Yin, X.; Yang, L.; Yao, D.; Yang, X.; Bian, Y.; Gong, Y. Improved DeepLabV3+ and GR-ConvNet for shiitake mushroom harvest robots flexible grasping of mimicry. Comput. Electron. Agric. 2025, 236, 110449. [Google Scholar] [CrossRef]
  227. Chen, K.W.; Li, T.; Yan, T.J.; Xie, F.; Feng, Q.C.; Zhu, Q.Z.; Zhao, C.J. A soft gripper design for apple harvesting with force feedback and fruit slip detection. Agriculture 2022, 12, 1802. [Google Scholar] [CrossRef]
  228. Zhao, X.; Wang, W.; Wen, L.; Chen, Z.; Wu, S.; Zhou, K.; Sun, M.; Xu, L.; Hu, B.; Wu, C. Digital twins in smart farming: An autoware-based simulator for autonomous agricultural vehicles. Int. J. Agric. Biol. Eng. 2023, 16, 184–189. [Google Scholar] [CrossRef]
  229. Li, J.Y.; Wu, Z.Z.; Li, M.Q.; Shang, Z.J. Dynamic measurement method for steering wheel angle of autonomous agricultural vehicles. Agriculture 2024, 14, 1602. [Google Scholar] [CrossRef]
  230. Koksal, N.; An, H.; Fidan, B. Backstepping-based adaptive control of a quadrotor UAV with guaranteed tracking performance. ISA Trans. 2020, 105, 98–110. [Google Scholar] [CrossRef]
  231. Lu, E.; Ma, Z.; Li, Y.M.; Xu, L.Z.; Tang, Z. Adaptive backstepping control of tracked robot running trajectory based on real-time slip parameter estimation. Int. J. Agric. Biol. Eng. 2020, 13, 178–187. [Google Scholar] [CrossRef]
  232. He, J.; Hu, L.; Wang, P.; Liu, Y.; Man, Z.; Tu, T.; Yang, L.; Li, Y.; Yi, Y.; Li, W. Path tracking control method and performance test based on agricultural machinery pose correction. Comput. Electron. Agric. 2022, 200, 107185. [Google Scholar] [CrossRef]
  233. Liu, H.; Yan, S.C.; Shen, Y.; Li, C.H.; Zhang, Y.F.; Hussain, F. Model predictive control system based on direct yaw moment control for 4WID self-steering agriculture vehicle. Int. J. Agric. Biol. Eng. 2021, 14, 175–181. [Google Scholar] [CrossRef]
  234. Jing, Y.; Li, Q.; Ye, W.; Liu, G. Development of a GNSS/INS-based automatic navigation land levelling system. Comput. Electron. Agric. 2023, 213, 108187. [Google Scholar] [CrossRef]
  235. Wang, W.; Yang, S.; Zhang, X.; Xia, X. Research on the smart broad bean harvesting system and the self-adaptive control method based on CPS technologies. Agronomy 2024, 14, 1405. [Google Scholar] [CrossRef]
  236. An, G.; Zhong, Z.; Yang, S.; Yang, L.; Jin, C.; Du, J.; Yin, X. EASS: An automatic steering system for agricultural wheeled vehicles using fuzzy control. Comput. Electron. Agric. 2024, 217, 108544. [Google Scholar] [CrossRef]
  237. Ding, F.; Zhang, W.; Luo, X.; Hu, L.; Zhang, Z.; Wang, M.; Li, H.; Peng, M.; Wu, X.; Hu, L. Gain self-adjusting single neuron PID control method and experiments for longitudinal relative position of harvester and transport vehicle. Comput. Electron. Agric. 2023, 213, 108215. [Google Scholar] [CrossRef]
  238. Ji, X.; Ding, S.; Wei, X.; Cui, B. Path tracking of unmanned agricultural tractors based on a novel adaptive second-order sliding mode control. J. Frankl. Inst. 2023, 360, 5811–5831. [Google Scholar] [CrossRef]
  239. Sun, J.L.; Wang, Z.; Ding, S.H.; Xia, J.; Xing, G.Y. Adaptive disturbance observer-based fixed time nonsingular terminal sliding mode control for path-tracking of unmanned agricultural tractors. Biosyst. Eng. 2024, 246, 96–109. [Google Scholar] [CrossRef]
  240. Zhang, W.; Gai, J.; Zhang, Z.; Tang, L.; Liao, Q.; Ding, Y. Double-DQN based path smoothing and tracking control method for robotic vehicle navigation. Comput. Electron. Agric. 2019, 166, 104985. [Google Scholar] [CrossRef]
  241. Sierra-García, J.E.; Santos, M. Intelligent control of an UAV with a cable-suspended load using a neural network estimator. Expert Syst. Appl. 2021, 183, 115380. [Google Scholar] [CrossRef]
  242. Etezadi, H.; Eshkabilov, S. A comprehensive overview of control algorithms, sensors, actuators, and communication tools of autonomous all-terrain vehicles in agriculture. Agriculture 2024, 14, 163. [Google Scholar] [CrossRef]
  243. Ly, T.T.K. Neural controller design of unmanned ground vehicle with four-wheel independent drive in agricultural farming. In Proceedings of the 2023 Asia Meeting on Environment and Electrical Engineering, Hanoi, Vietnam, 13–15 November 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 01–06. [Google Scholar] [CrossRef]
244. Wang, H.; Zhang, X.; Meng, X.; Song, W.; Chen, Z. Electronic sheepdog: A novel method with UAV-assisted wearable grazing monitoring. IEEE Internet Things J. 2023, 10, 16036–16047. [Google Scholar] [CrossRef]
  245. Ge, Z.; Man, Z.; Wang, Z.; Bai, X.; Wang, X.; Xiong, F.; Li, D. Robust adaptive sliding mode control for path tracking of unmanned agricultural vehicles. Comput. Electr. Eng. 2023, 108, 108693. [Google Scholar] [CrossRef]
  246. Gao, Y.Y.; Feng, K.Y.; Yang, S.; Han, X.; Wei, X.H.; Zhu, Q.Z.; Chen, L.P. Design and experiment of an unmanned variable-rate fertilization control system with self-calibration of fertilizer discharging shaft speed. Agronomy 2024, 14, 2336. [Google Scholar] [CrossRef]
  247. Tzafestas, S.G. Mobile robot control and navigation: A global overview. J. Intell. Robot. Syst. 2018, 91, 35–58. [Google Scholar] [CrossRef]
  248. Chen, J.; Ning, X.; Li, Y.; Yang, G.; Wu, P.; Chen, S. A fuzzy control strategy for the forward speed of a combine harvester based on KDD. Appl. Eng. Agric. 2017, 33, 15–22. [Google Scholar] [CrossRef]
  249. Liu, Z.; Xia, J.; Liu, G.; Cheng, J.; Wei, Y.; Xie, D. Design and analysis of a pneumatic automatic compensation system for miss-seeding based on speed synchronization. Agriculture 2023, 13, 1232. [Google Scholar] [CrossRef]
  250. Li, Z.; Chen, L.; Zheng, Q.; Dou, X.; Yang, L. Control of a path following caterpillar robot based on a sliding mode variable structure algorithm. Biosyst. Eng. 2019, 186, 293–306. [Google Scholar] [CrossRef]
  251. Nan, Y.; Zhang, H.; Zheng, J.; Yang, K.; Ge, Y. Low-volume precision spray for plant pest control using profile variable rate spraying and ultrasonic detection. Front. Plant Sci. 2023, 13, 1042769. [Google Scholar] [CrossRef] [PubMed]
  252. Xiong, Y.; Ge, Y.; Grimstad, L.; From, P.J. An autonomous strawberry-harvesting robot: Design, development, integration, and field evaluation. J. Field Robot. 2020, 37, 202–224. [Google Scholar] [CrossRef]
  253. Roshanianfard, A.; Noguchi, N.; Kamata, T. Design and performance of a robotic arm for farm use. Int. J. Agric. Biol. Eng. 2019, 12, 146–158. [Google Scholar] [CrossRef]
  254. Mirzakhaninafchi, H.; Singh, M.; Dixit, A.K.; Prakash, A.; Sharda, S.; Kaur, J.; Nafchi, A.M. Performance assessment of a sensor-based variable-rate real-time fertilizer applicator for rice crop. Sustainability 2022, 14, 11209. [Google Scholar] [CrossRef]
  255. Yang, Y.; Zhang, G.; Chen, Z.; Wen, X.; Cheng, S.; Ma, Q.; Qi, J.; Zhou, Y.; Chen, L. An independent steering driving system to realize headland turning of unmanned tractors. Comput. Electron. Agric. 2022, 201, 107278. [Google Scholar] [CrossRef]
  256. Gao, J.; Zhang, F.; Zhang, J.; Yuan, T.; Yin, J.; Guo, H.; Yang, C. Development and evaluation of a pneumatic finger-like end-effector for cherry tomato harvesting robot in greenhouse. Comput. Electron. Agric. 2022, 197, 106879. [Google Scholar] [CrossRef]
  257. Shao, Y.; Han, X.; Xuan, G.; Liu, Y.; Gao, C.; Wang, G.; Hu, Z. Development of a multi-adaptive feeding device for automated plug seedling transplanter. Int. J. Agric. Biol. Eng. 2021, 14, 91–96. [Google Scholar] [CrossRef]
  258. Tu, X.; Gai, J.; Tang, L. Robust navigation control of a 4WD/4WS agricultural robotic vehicle. Comput. Electron. Agric. 2019, 164, 104892. [Google Scholar] [CrossRef]
  259. Zeng, H.; Yang, J.; Yang, N.; Huang, J.; Long, H.; Chen, Y. A review of the research progress of pruning robots. In Proceedings of the 2022 IEEE 2nd International Conference on Data Science and Computer Application (ICDSCA), Dalian, China, 28–30 October 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1069–1073. [Google Scholar] [CrossRef]
  260. Tang, Z.; Li, Y.M.; Cheng, C. Development of multi-functional combine harvester with grain harvesting and straw baling. Span. J. Agric. Res. 2017, 15, e0202. [Google Scholar] [CrossRef]
  261. Zhu, Q.Z.; Zhu, Z.H.; Zhang, H.Y.; Gao, Y.Y.; Chen, L.P. Design of an electronically controlled fertilization system for an air-assisted side-deep fertilization machine. Agriculture 2023, 13, 2210. [Google Scholar] [CrossRef]
  262. Yao, M.; Hu, J.; Liu, W.; Shi, J.; Jin, Y.; Lv, J.; Sun, Z.; Wang, C. Precise Servo-Control System of a Dual-Axis Positioning Tray Conveying Device for Automatic Transplanting Machine. Agriculture 2024, 14, 1431. [Google Scholar] [CrossRef]
  263. Ye, S.; Xue, X.; Si, S.; Xu, Y.; Le, F.; Cui, L.; Jin, Y. Design and testing of an elastic comb reciprocating a soybean plant-to-plant seedling avoidance and weeding device. Agriculture 2023, 13, 2157. [Google Scholar] [CrossRef]
  264. Zhang, L.; Zhu, X.; Huang, J.; Huang, J.; Xie, J.; Xiao, X.; Yin, G.; Wang, X.; Li, M.; Fang, K. BDS/IMU integrated auto-navigation system of orchard spraying robot. Appl. Sci. 2022, 12, 8173. [Google Scholar] [CrossRef]
  265. Ramon Soria, P.; Sukkar, F.; Martens, W.; Arrue, B.C.; Fitch, R. Multi-view probabilistic segmentation of pome fruit with a low-cost RGB-D camera. In Proceedings of the ROBOT 2017: Third Iberian Robotics Conference, Sevilla, Spain, 22–24 November 2017; Springer: Berlin/Heidelberg, Germany, 2017; Volume 2, pp. 320–331. [Google Scholar] [CrossRef]
  266. Ye, L.; Duan, J.; Yang, Z.; Zou, X.; Chen, M.; Zhang, S. Collision-free motion planning for the litchi-picking robot. Comput. Electron. Agric. 2021, 185, 106151. [Google Scholar] [CrossRef]
  267. Shi, Y.; Jin, S.; Zhao, Y.; Huo, Y.; Liu, L.; Cui, Y. Lightweight force-sensing tomato picking robotic arm with a “global-local” visual servo. Comput. Electron. Agric. 2023, 204, 107549. [Google Scholar] [CrossRef]
  268. Liu, J.Z.; Liang, J.; Zhao, S.Y.; Jiang, Y.X.; Wang, J.; Jin, Y.C. Design of a virtual multi-interaction operation system for hand-eye coordination of grape harvesting robots. Agronomy 2023, 13, 829. [Google Scholar] [CrossRef]
  269. Wu, Q.; Gu, J. Design and research of robot visual servo system based on artificial intelligence. Agro Food Ind. Hi-Tech 2017, 28, 125–128. [Google Scholar]
  270. Koivumäki, J.; Zhu, W.H.; Mattila, J. Energy-efficient and high-precision control of hydraulic robots. Control Eng. Pract. 2019, 85, 176–193. [Google Scholar] [CrossRef]
  271. Hu, J.P.; Pan, J.H.; Dai, B.W.; Chai, X.Y.; Sun, Y.X.; Xu, L.Z. Development of an attitude adjustment crawler chassis for combine harvester and experiment of adaptive leveling system. Agronomy 2022, 12, 717. [Google Scholar] [CrossRef]
  272. Xie, D.; Chen, L.; Liu, L.; Chen, L.; Wang, H. Actuators and sensors for application in agricultural robots: A review. Machines 2022, 10, 913. [Google Scholar] [CrossRef]
  273. Jin, T.; Han, X. Robotic arms in precision agriculture: A comprehensive review of the technologies, applications, challenges, and future prospects. Comput. Electron. Agric. 2024, 221, 108938. [Google Scholar] [CrossRef]
  274. Pi, J.; Liu, J.; Zhou, K.H.; Qian, M.Y. An octopus-inspired bionic flexible gripper for apple grasping. Agriculture 2021, 11, 1014. [Google Scholar] [CrossRef]
  275. Gandomzadeh, D.; Abbaspour-Fard, M.H. Numerical study of the effect of core geometry on the performance of a magnetostrictive transducer. J. Magn. Magn. Mater. 2020, 513, 166823. [Google Scholar] [CrossRef]
  276. Yuan, Z.; Li, X.; Xiao, Z.; Zhang, Z.; Zhou, S.; Hong, C.; Chen, X.; Zeng, L.; Wang, Y.; Wu, J. A novel fractional-order framework for creep nonlinearity in piezoelectric actuators. Sens. Actuators A Phys. 2025, 391, 116639. [Google Scholar] [CrossRef]
  277. Liao, W.; Yang, Z. 3D printing programmable liquid crystal elastomer soft pneumatic actuators. Mater. Horizons 2023, 10, 576–584. [Google Scholar] [CrossRef]
  278. Brown, E.; Rodenberg, N.; Amend, J.; Mozeika, A.; Steltz, E.; Zakin, M.R.; Lipson, H.; Jaeger, H.M. Universal robotic gripper based on the jamming of granular material. Proc. Natl. Acad. Sci. USA 2010, 107, 18809–18814. [Google Scholar] [CrossRef]
  279. Zhou, K.H.; Xia, L.R.; Liu, J.; Qian, M.Y.; Pi, J. Design of a flexible end-effector based on characteristics of tomatoes. Int. J. Agric. Biol. Eng. 2022, 15, 13–24. [Google Scholar] [CrossRef]
  280. Kalulu, M.; Mwanza, C.; Hussain, M.; Fu, G. Design and fabrication of anisotropic bilayer hydrogels with gradient structures: Enhanced swelling, conductivity, and programmable shape deformation for smart actuator applications. Sens. Actuators A Phys. 2025, 392, 116710. [Google Scholar] [CrossRef]
  281. Kozuki, H.; Yoshida, K.; Yasuga, H.; Kurashina, Y. Hydrogel-polymer hybrid actuator with soft lattice skeleton for excellent connectivity. Sens. Actuators B Chem. 2025, 430, 137377. [Google Scholar] [CrossRef]
  282. Liu, J.; Ding, L.; Pan, C.; Lai, X.; Wu, J.; Ding, Z.; Wang, L.; Jing, X.; Wang, Y.; Lv, L. A centipede-inspired bonded-type ultrasonic actuator with high thrust force density driven by dual-torsional-vibration-induced flexural traveling waves. Sens. Actuators A Phys. 2024, 377, 115733. [Google Scholar] [CrossRef]
  283. Tian, Z.; Xue, J.; Xiao, X.; Du, C.; Han, Z.; Liu, Y. Untethered multifunctional biomimetic soft actuator with programmable shape deformation capabilities and localized maneuverability. Sens. Actuators B Chem. 2024, 410, 135678. [Google Scholar] [CrossRef]
  284. Zhang, Z.; Jia, X.; Yang, T.; Gu, Y.; Wang, W.; Chen, L. Multi-objective optimization of lubricant volume in an ELSD considering thermal effects. Int. J. Therm. Sci. 2021, 164, 106884. [Google Scholar] [CrossRef]
  285. Li, Y.M.; Liu, Y.B.; Ji, K.Z.; Zhu, R.H. A fault diagnosis method for a differential inverse gearbox of a crawler combine harvester based on order analysis. Agriculture 2022, 12, 1300. [Google Scholar] [CrossRef]
  286. Gao, Y.; Yang, Y.; Fu, S.; Feng, K.; Han, X.; Hu, Y.; Zhu, Q.; Wei, X. Analysis of vibration characteristics of tractor–rotary cultivator combination based on time domain and frequency domain. Agriculture 2024, 14, 1139. [Google Scholar] [CrossRef]
  287. Kisiel, M.; Szpica, D.; Czaban, J.; Köten, H. Pneumatic brake valves used in vehicle trailers–A review. Eng. Fail. Anal. 2024, 158, 107942. [Google Scholar] [CrossRef]
  288. Ahmed, S.; Reza, M.N.; Karim, M.R.; Jin, H.; Kim, H.; Chung, S.O. Abnormal Operation Detection of Automated Orchard Irrigation System Actuators by Power Consumption Level. Sensors 2025, 25, 331. [Google Scholar] [CrossRef]
  289. Conesa-Muñoz, J.; Gonzalez-de Soto, M.; Gonzalez-de Santos, P.; Ribeiro, A. Distributed multi-level supervision to effectively monitor the operations of a fleet of autonomous vehicles in agricultural tasks. Sensors 2015, 15, 5402–5428. [Google Scholar] [CrossRef]
  290. Hussain, M.; He, L.; Schupp, J.; Heinemann, P. Green fruit removal dynamics for development of robotic green fruit thinning end-effector. J. ASABE 2022, 65, 779–788. [Google Scholar] [CrossRef]
291. Roshanianfard, A.R.; Noguchi, N. Kinematics analysis and simulation of a 5DOF articulated robotic arm applied to heavy products harvesting. J. Agric. Sci. 2018, 24, 90–104. [Google Scholar] [CrossRef]
  292. Li, K.; Huo, Y.; Liu, Y.; Shi, Y.; He, Z.; Cui, Y. Design of a lightweight robotic arm for kiwifruit pollination. Comput. Electron. Agric. 2022, 198, 107114. [Google Scholar] [CrossRef]
  293. Yang, Q.; Du, X.; Wang, Z.; Meng, Z.; Ma, Z.; Zhang, Q. A review of core agricultural robot technologies for crop productions. Comput. Electron. Agric. 2023, 206, 107701. [Google Scholar] [CrossRef]
  294. Jianping, H.; Xiaoyue, Y.; Jun, M.; Chunhui, Q.; Kumi, F.; Hanping, M. Dimensional synthesis and trajectory planning of plug seedling transplanting robot based on delta parallel mechanism. Comput. Electron. Agric. 2014, 107, 64–72. [Google Scholar] [CrossRef]
  295. Li, Z.; Luo, Y.; Shi, Z.; Xie, D.; Li, C. Design and research of end-effector for naval orange harvesting. J. Mech. Transm. 2020, 44, 67–73. [Google Scholar] [CrossRef]
  296. Zhao, Y.; Gong, L.; Liu, C.; Huang, Y. Dual-arm robot design and testing for harvesting tomato in greenhouse. IFAC-Pap. 2016, 49, 161–165. [Google Scholar] [CrossRef]
  297. Feng, Q.; Zou, W.; Fan, P.; Zhang, C.; Wang, X. Design and test of robotic harvesting system for cherry tomato. Int. J. Agric. Biol. Eng. 2018, 11, 96–100. [Google Scholar] [CrossRef]
  298. Wang, Y.; Yang, Y.; Yang, C.; Zhao, H.; Chen, G.; Zhang, Z.; Fu, S.; Zhang, M.; Xu, H. End-effector with a bite mode for harvesting citrus fruit in random stalk orientation environment. Comput. Electron. Agric. 2019, 157, 454–470. [Google Scholar] [CrossRef]
  299. Bu, L.; Chen, C.; Hu, G.; Sugirbay, A.; Sun, H.; Chen, J. Design and evaluation of a robotic apple harvester using optimized picking patterns. Comput. Electron. Agric. 2022, 198, 107092. [Google Scholar] [CrossRef]
  300. Li, Z.; Yuan, X.; Wang, C. A review on structural development and recognition–localization methods for end-effector of fruit–vegetable picking robots. Int. J. Adv. Robot. Syst. 2022, 19, 17298806221104906. [Google Scholar] [CrossRef]
  301. Hohimer, C.J.; Wang, H.; Bhusal, S.; Miller, J.; Mo, C.; Karkee, M. Design and field evaluation of a robotic apple harvesting system with a 3D-printed soft-robotic end-effector. Trans. ASABE 2019, 62, 405–414. [Google Scholar] [CrossRef]
  302. Salem, M.E.; Wang, Q.; Wen, R.; Xiang, M. Design and characterization of soft pneumatic actuator for universal robot gripper. In Proceedings of the 2018 International Conference on Control and Robots (ICCR), Hong Kong, China, 15–17 September 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 6–10. [Google Scholar] [CrossRef]
  303. Zou, X.; Ye, M.; Luo, C.; Xiong, J.; Luo, L.; Wang, H.; Chen, Y. Fault-tolerant design of a limited universal fruit-picking end-effector based on vision-positioning error. Appl. Eng. Agric. 2016, 32, 5–18. [Google Scholar] [CrossRef]
  304. Wang, D.; Dong, Y.; Lian, J.; Gu, D. Adaptive end-effector pose control for tomato harvesting robots. J. Field Robot. 2023, 40, 535–551. [Google Scholar] [CrossRef]
  305. Liu, L.; Yang, F.; Liu, X.; Du, Y.; Li, X.; Li, G.; Chen, D.; Zhu, Z.; Song, Z. A review of the current status and common key technologies for agricultural field robots. Comput. Electron. Agric. 2024, 227, 109630. [Google Scholar] [CrossRef]
  306. Hua, W.; Zhang, W.; Zhang, Z.; Liu, X.; Huang, M.; Igathinathane, C.; Vougioukas, S.; Saha, C.K.; Mustafa, N.; Salama, D.S. Vacuum suction end-effector development for robotic harvesters of fresh market apples. Biosyst. Eng. 2025, 249, 28–40. [Google Scholar] [CrossRef]
  307. Zhai, L.Y.; Khoo, L.P.; Fok, S.C. Knowledge acquisition and uncertainty in fault diagnosis: A rough sets perspective. In Data Mining and Knowledge Discovery Approaches Based on Rule Induction Techniques; Springer: Berlin/Heidelberg, Germany, 2006; pp. 359–394. [Google Scholar] [CrossRef]
308. Paul, S.; Chang, J. Consequent pole flux modulated linear actuator under winding change and field oriented control driving conditions for long track and multi-track agricultural robot. Comput. Electron. Agric. 2024, 217, 108582. [Google Scholar] [CrossRef]
  309. Chen, J.; Wang, S.; Li, P.; Tan, Y.; Zheng, Y.; Ren, Z. Fault-tolerant control of a multi-actuator agricultural aircraft against actuator failures. In Proceedings of the 2017 36th Chinese Control Conference (CCC), Dalian, China, 26–28 July 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 7327–7332. [Google Scholar] [CrossRef]
  310. Isermann, R. Fault-Diagnosis Applications: Model-Based Condition Monitoring: Actuators, Drives, Machinery, Plants, Sensors, and Fault-Tolerant Systems; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2011. [Google Scholar] [CrossRef]
  311. Zhang, B.; Bai, T.; Wu, G.; Wang, H.; Zhu, Q.; Zhang, G.; Meng, Z.; Wen, C. Fatigue Analysis of Shovel Body Based on Tractor Subsoiling Operation Measured Data. Agriculture 2024, 14, 1604. [Google Scholar] [CrossRef]
  312. Rout, R.; Bera, T.K.; Chatti, N. Fault tolerant self-reconfigurable waypoint guidance for mobile robots under actuator faults. J. Frankl. Inst. 2025, 362, 107735. [Google Scholar] [CrossRef]
313. Wang, J.; Wang, X.; Wang, Y.; Sun, Y.; Sun, G. Intelligent joint actuator fault diagnosis for heavy-duty industrial robots. IEEE Sens. J. 2024, 24, 15292–15301. [Google Scholar] [CrossRef]
  314. Hao, S.H.; Tang, Z.; Guo, S.B.; Ding, Z.; Su, Z. Model and method of fault signal diagnosis for blockage and slippage of rice threshing drum. Agriculture 2022, 12, 1968. [Google Scholar] [CrossRef]
  315. Zhang, J.; Zhang, K.; An, Y.; Luo, H.; Yin, S. An integrated multitasking intelligent bearing fault diagnosis scheme based on representation learning under imbalanced sample condition. IEEE Trans. Neural Networks Learn. Syst. 2023, 35, 6231–6242. [Google Scholar] [CrossRef] [PubMed]
  316. Oh, H.; Choi, Y.Y.; Kim, M.; Sohn, Y.J.; Kim, S.G.; Lee, W.Y. Experimental validation of passive and active fault-tolerant controls against sensor faults in a proton exchange membrane fuel cell system. J. Process Control 2023, 129, 103064. [Google Scholar] [CrossRef]
  317. Xu, D.; Jiang, B.; Shi, P. Robust NSV fault-tolerant control system design against actuator faults and control surface damage under actuator dynamics. IEEE Trans. Ind. Electron. 2015, 62, 5919–5928. [Google Scholar] [CrossRef]
  318. Liu, K.; Wang, R.; Wang, X.; Wang, X. Anti-saturation adaptive finite-time neural network based fault-tolerant tracking control for a quadrotor UAV with external disturbances. Aerosp. Sci. Technol. 2021, 115, 106790. [Google Scholar] [CrossRef]
  319. Yu, Z.; Zhang, Y.; Jiang, B.; Yu, X.; Fu, J.; Jin, Y.; Chai, T. Distributed adaptive fault-tolerant close formation flight control of multiple trailing fixed-wing UAVs. ISA Trans. 2020, 106, 181–199. [Google Scholar] [CrossRef]
  320. Yu, Z.; Qu, Y.; Zhang, Y. Fault-tolerant containment control of multiple unmanned aerial vehicles based on distributed sliding-mode observer. J. Intell. Robot. Syst. 2019, 93, 163–177. [Google Scholar] [CrossRef]
  321. Liu, Z.; Liu, J.; Zhang, O.; Zhao, Y.; Chen, W.; Gao, Y. Adaptive disturbance observer-based fixed-time tracking control for uncertain robotic systems. IEEE Trans. Ind. Electron. 2024, 71, 14823–14831. [Google Scholar] [CrossRef]
  322. Lien, Y.H.; Peng, C.C.; Chen, Y.H. Adaptive observer-based fault detection and fault-tolerant control of quadrotors under rotor failure conditions. Appl. Sci. 2020, 10, 3503. [Google Scholar] [CrossRef]
  323. Ashraf, M.A.; Ijaz, S.; Javaid, U.; Hussain, S.; Anwaar, H.; Marey, M. A robust sensor and actuator fault tolerant control scheme for nonlinear system. IEEE Access 2021, 10, 626–637. [Google Scholar] [CrossRef]
  324. Ngo, V.T.; Tsai, C.T.; Liu, Y.C. Actuator fault-tolerant control allocation for cooperative transportation of multiple omnidirectional mobile robots. Adv. Robot. 2024, 38, 112–127. [Google Scholar] [CrossRef]
  325. Karras, G.C.; Fourlas, G.K. Model predictive fault tolerant control for omni-directional mobile robots. J. Intell. Robot. Syst. 2020, 97, 635–655. [Google Scholar] [CrossRef]
  326. Lao, L.; Ellis, M.; Christofides, P.D. Proactive fault-tolerant model predictive control: Concept and application. In Proceedings of the 2013 American Control Conference, Washington, DC, USA, 17–19 June 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 5140–5145. [Google Scholar] [CrossRef]
  327. Jia, F.; Cao, F.; Lyu, G.; He, X. A novel framework of cooperative design: Bringing active fault diagnosis into fault-tolerant control. IEEE Trans. Cybern. 2022, 53, 3301–3310. [Google Scholar] [CrossRef] [PubMed]
  328. Du, D.; Li, Z. Research on weakly conservative passive fault-tolerant control method considering the fault distribution. Eng. Sci. Technol. Int. J. 2025, 61, 101948. [Google Scholar] [CrossRef]
  329. Qiao, M.Y.; Chang, X.H. Quantified guaranteed cost fault-tolerant control for continuous-time fuzzy singular systems with sensor and actuator faults. IEEE Trans. Fuzzy Syst. 2023, 32, 660–670. [Google Scholar] [CrossRef]
  330. Wu, J.; Shi, H.; Jiang, X.; Su, C.; Li, P. Stochastic fuzzy predictive fault-tolerant control for time-delay nonlinear system with actuator fault under a certain probability. Optim. Control Appl. Methods 2023, 44, 1798–1827. [Google Scholar] [CrossRef]
  331. Fu, Z.; Wang, Y.; Tao, F.; Wang, N. Fixed-time trajectory tracking for multi-fault nonlinear systems: A passive fault-tolerant control scheme. Commun. Nonlinear Sci. Numer. Simul. 2025, 145, 108709. [Google Scholar] [CrossRef]
  332. Ma, Y.; Jiang, B.; Wang, J.; Gong, J. Adaptive fault-tolerant formation control for heterogeneous UAVs-UGVs systems with multiple actuator faults. IEEE Trans. Aerosp. Electron. Syst. 2023, 59, 6705–6716. [Google Scholar] [CrossRef]
  333. Wang, M.; Zhu, S.; Shen, M.; Liu, X.; Wen, S. Fault-tolerant synchronization for memristive neural networks with multiple actuator failures. IEEE Trans. Cybern. 2024, 54, 5092–5101. [Google Scholar] [CrossRef]
  334. Wang, L.; Li, A.; Lu, H.; Wang, C.; Zabolotnov, Y. Distributed adaptive event-triggered finite-time fault-tolerant containment control for multi-UAVs with input constraints and actuator failures. J. Frankl. Inst. 2024, 361, 107308. [Google Scholar] [CrossRef]
  335. Zhang, Z.; Jiao, T.; Li, Y.; Li, B.; Sun, H. Adaptive prescribed-time fault-tolerant control of robotic manipulators with actuator faults and unknown disturbances. Eur. J. Control. 2025, 84, 101234. [Google Scholar] [CrossRef]
  336. Liu, C.; Gong, L.; Yuan, J.; Li, Y. Current status and development trends of agricultural robots. Trans. Chin. Soc. Agric. Mach. 2022, 53, 1–22. [Google Scholar] [CrossRef]
  337. Zhao, C.; Fan, B.; Li, J.; Feng, Q. Agricultural robots: Technology progress, challenges and trends. Smart Agric. 2023, 5, 1–15. [Google Scholar] [CrossRef]
  338. AgXeed. AgBot. 2024. Available online: https://www.agxeed.com/our-solutions/agbot-2-055w3/ (accessed on 25 July 2025).
  339. Tamaki, K.; Nagasaka, Y.; Nishiwaki, K.; Saito, M.; Kikuchi, Y.; Motobayashi, K. A robot system for paddy field farming in Japan. IFAC Proc. Vol. 2013, 46, 143–147. [Google Scholar] [CrossRef]
340. Matsuo, Y.; Yukumoto, O.; Noguchi, N. Enhanced adaptability of tilling robot (initial report)-outline of a tilling robot and enhanced adaptability of unmanned operation. JARQ-Jpn. Agric. Res. Q. 2012, 46, 295–303. [Google Scholar] [CrossRef]
  341. Zheng, W.Y.; Liang, Z.; Zhou, J. Research on combined noise reduction of GNSS elevation data of paddy field grader with EMD and SG filter. J. South China Agric. Univ. 2024, 45, 80–87. [Google Scholar] [CrossRef]
  342. Ahmad, F.; Adeel, M.; Qiu, B.J.; Ma, J.; Shoaib, M.; Shakoor, A.; Chandio, F.A. Sowing uniformity of bed-type pneumatic maize planter at various seedbed preparation levels and machine travel speeds. Int. J. Agric. Biol. Eng. 2021, 14, 165–171. [Google Scholar] [CrossRef]
  343. Bhimanpallewar, R.N.; Narasingarao, M.R. AgriRobot: Implementation and evaluation of an automatic robot for seeding and fertiliser microdosing in precision agriculture. Int. J. Agric. Resour. Gov. Ecol. 2020, 16, 33–50. [Google Scholar] [CrossRef]
  344. Kumar, P.; Ashok, G. Design and fabrication of smart seed sowing robot. Mater. Today Proc. 2021, 39, 354–358. [Google Scholar] [CrossRef]
  345. Qingzhen, Z.; Guangwei, W.; Zhihao, Z.; Hengyuan, Z.; Yuanyuan, G.; Liping, C. Design and test on winter wheat precision separated layer fertilization and wide-boundary sowing combined machine. Trans. Chin. Soc. Agric. Mach. 2022, 53, 25–35. [Google Scholar] [CrossRef]
  346. Shaikh, T.A.; Mir, W.A.; Rasool, T.; Sofi, S. Machine learning for smart agriculture and precision farming: Towards making the fields talk. Arch. Comput. Methods Eng. 2022, 29, 4557–4597. [Google Scholar] [CrossRef]
347. Griepentrog, H.W.; Dühring Jaeger, C.; Paraforos, D. Robots for field operations with comprehensive multilayer control. KI-Künstliche Intell. 2013, 27, 325–333. [Google Scholar] [CrossRef]
  348. Santhi, P.V.; Kapileswar, N.; Chenchela, V.K.; Prasad, C.V.S. Sensor and vision based autonomous AGRIBOT for sowing seeds. In Proceedings of the 2017 International Conference on Energy, Communication, Data Analytics and Soft Computing (ICECDS), Chennai, India, 1–2 August 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 242–245. [Google Scholar] [CrossRef]
  349. Azmi, H.N.; Hajjaj, S.S.H.; Gsangaya, K.R.; Sultan, M.T.H.; Mail, M.F.; Hua, L.S. Design and fabrication of an agricultural robot for crop seeding. Mater. Today Proc. 2023, 81, 283–289. [Google Scholar] [CrossRef]
  350. Lakhiar, I.A.; Yan, H.F.; Zhang, C.; Wang, G.Q.; He, B.; Hao, B.B.; Han, Y.J.; Wang, B.Y.; Bao, R.X.; Syed, T.N.; et al. A review of precision irrigation water-saving technology under changing climate for enhancing water use efficiency, crop yield, and environmental footprints. Agriculture 2024, 14, 1141. [Google Scholar] [CrossRef]
  351. Huang, C.C.; Chang, C.L. Design and implementation of bio-inspired snake bone-armed robot for agricultural irrigation application. IFAC-Pap. 2019, 52, 98–101. [Google Scholar] [CrossRef]
352. Lamsen, F.C.; Favi, J.C.; Castillo, B.H.F. Indoor gardening with automatic irrigation system using Arduino microcontroller. ASEAN Multidiscip. Res. J. 2022, 10, 131–148. [Google Scholar]
  353. Bodunde, O.; Adie, U.; Ikumapayi, O.; Akinyoola, J.; Aderoba, A. Architectural design and performance evaluation of a ZigBee technology based adaptive sprinkler irrigation robot. Comput. Electron. Agric. 2019, 160, 168–178. [Google Scholar] [CrossRef]
  354. Cruz Ulloa, C.; Krus, A.; Barrientos, A.; Del Cerro, J.; Valero, C. Trend technologies for robotic fertilization process in row crops. Front. Robot. AI 2022, 9, 808484. [Google Scholar] [CrossRef]
355. Mao, J.; Niu, W.; Wang, H.; Zhang, B.; Cao, Z.; Guo, Z.; Zhao, H.; Zhou, C.; Gong, X. An agricultural spraying and fertilization robot based on visual navigation. In Proceedings of the 2020 15th IEEE Conference on Industrial Electronics and Applications (ICIEA), Kristiansand, Norway, 9–13 November 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 586–591. [Google Scholar] [CrossRef]
  356. Zhu, C.T.; Hao, S.H.; Liu, C.L.; Wang, Y.W.; Jia, X.; Xu, J.T.; Guo, S.B.; Huo, J.X.; Wang, W.M. An efficient computer vision-based dual-face target precision variable spraying robotic system for foliar fertilisers. Agronomy 2024, 14, 2770. [Google Scholar] [CrossRef]
357. Small Robot Company. Tom Robot. 2023. Available online: https://smallrobotco.com/about.html (accessed on 25 July 2025).
358. Queensland University of Technology. AgBot II Robotic Site-Specific Crop and Weed Management Tool. 2015. Available online: https://research.qut.edu.au/qcr/Projects/agbot-ii-robotic-site-specific-crop-and-weed-management-tool/ (accessed on 25 July 2025).
  359. Özlüoymak, O. Design and development of a servo-controlled target-oriented robotic micro-dose spraying system in precision weed control. Semin. Cienc. Agrar. 2021, 42, 635–656. [Google Scholar] [CrossRef]
  360. Liu, J.Z.; Abbas, I.; Noor, R.S. Development of deep learning-based variable rate agrochemical spraying system for targeted weeds control in strawberry crop. Agronomy 2021, 11, 1480. [Google Scholar] [CrossRef]
  361. Sonke, S. Ecorobotix Ara spot sprayer: Face recognition for plants. In The Professional Farm Machinery Magazine; Profi: Yalding, Kent, UK, 2023; pp. 66–68. [Google Scholar]
  362. Hylio. AgDrones. 2025. Available online: https://www.hyl.io/ (accessed on 25 July 2025).
  363. Robotics, J. EA-30X. 2021. Available online: https://www.eav.top/EA30X (accessed on 25 July 2025).
  364. Hussain, M.; Farooq, S.; Merfield, C.; Jabran, K. Chapter 8—Mechanical weed control. In Non-Chemical Weed Control; Academic Press: Cambridge, MA, USA, 2018; pp. 133–155. [Google Scholar] [CrossRef]
  365. Gai, J.; Tang, L.; Steward, B.L. Automated crop plant detection based on the fusion of color and depth images for robotic weed control. J. Field Robot. 2020, 37, 35–52. [Google Scholar] [CrossRef]
  366. Tillett, N.; Hague, T.; Grundy, A.; Dedousis, A.P. Mechanical within-row weed control for transplanted crops using computer vision. Biosyst. Eng. 2008, 99, 171–178. [Google Scholar] [CrossRef]
  367. Sori, H.; Inoue, H.; Hatta, H.; Ando, Y. Effect for a paddy weeding robot in wet rice culture. J. Robot. Mechatron. 2018, 30, 198–205. [Google Scholar] [CrossRef]
368. Maxon Group. Autonomous Field Worker. 2025. Available online: https://www.maxongroup.com/en-us/knowledge-and-support/blog/autonomous-field-roboter-41880 (accessed on 25 July 2025).
  369. Rabier, F.; Stas, M.; Manderyck, B.; Huyghebaert, B.; Limbourg, Q. Assessment of the integration of mechanical weeding for weed control in sugar beet-growing. In Proceedings of the 9th International Scientific Symposium on Farm Machinery and Process Management in Sustainable Agriculture, ULS, Lublin, Poland, 22–24 November 2017; pp. 330–335. [Google Scholar] [CrossRef]
  370. Machleb, J.; Peteinatos, G.G.; Kollenda, B.L.; Andújar, D.; Gerhards, R. Sensor-based mechanical weed control: Present state and prospects. Comput. Electron. Agric. 2020, 176, 105638. [Google Scholar] [CrossRef]
  371. Tran, D.; Schouteten, J.J.; Degieter, M.; Krupanek, J.; Jarosz, W.; Areta, A.; Emmi, L.; De Steur, H.; Gellynck, X. European stakeholders’ perspectives on implementation potential of precision weed control: The case of autonomous vehicles with laser treatment. Precis. Agric. 2023, 24, 2200–2222. [Google Scholar] [CrossRef]
  372. Scavo, A.; Mauromicale, G. Integrated weed management in herbaceous field crops. Agronomy 2020, 10, 466. [Google Scholar] [CrossRef]
373. Zhao, P.; Chen, J.; Li, J.; Ning, J.; Chang, Y.; Yang, S. Design and testing of an autonomous laser weeding robot for strawberry fields based on DIN-LW-YOLO. Comput. Electron. Agric. 2025, 229, 109808. [Google Scholar] [CrossRef]
374. Naïo Technologies. Dino-Autonomous Mechanical Weeding Robot. SIA Magazin. Available online: https://siamagazin.com/dino-autonomous-mechanical-weeding-robot/ (accessed on 17 February 2020).
  375. He, B.; Cao, X.; Gu, Z. Kinematics of underactuated robotics for product carbon footprint. J. Clean. Prod. 2020, 257, 120491. [Google Scholar] [CrossRef]
  376. Xiong, Y.; Ge, Y.; Liang, Y.; Blackmore, S. Development of a prototype robot and fast path-planning algorithm for static laser weeding. Comput. Electron. Agric. 2017, 142, 494–503. [Google Scholar] [CrossRef]
  377. Martin, J.; Ansuategi, A.; Maurtua, I.; Gutierrez, A.; Obregón, D.; Casquero, O.; Marcos, M. A generic ROS-based control architecture for pest inspection and treatment in greenhouses using a mobile manipulator. IEEE Access 2021, 9, 94981–94995. [Google Scholar] [CrossRef]
  378. Iost Filho, F.H.; Heldens, W.B.; Kong, Z.; de Lange, E.S. Drones: Innovative technology for use in precision pest management. J. Econ. Entomol. 2020, 113, 1–25. [Google Scholar] [CrossRef]
  379. Mao, W.; Liu, Z.; Liu, H.; Yang, F.; Wang, M. Research progress on synergistic technologies of agricultural multi-robots. Appl. Sci. 2021, 11, 1448. [Google Scholar] [CrossRef]
  380. Qin, W.C.; Qiu, B.J.; Xue, X.Y.; Chen, C.; Xu, Z.F.; Zhou, Q.Q. Droplet deposition and control effect of insecticides sprayed with an unmanned aerial vehicle against plant hoppers. Crop Prot. 2016, 85, 79–88. [Google Scholar] [CrossRef]
  381. Ecorobotix. ARA Field Sprayer. 2025. Available online: https://ecorobotix.com/crop-care/ara-field-sprayer/ (accessed on 25 July 2025).
  382. Strube. BlueBob. 2009. Available online: https://www.strube.net/united-kingdom/business/innovations/bluebobr (accessed on 25 July 2025).
383. Carbon Robotics. Driving The Future Of Farming. 2025. Available online: https://carbonrobotics.com/ (accessed on 25 July 2025).
  384. Miao, Z.; Yu, X.; Li, N.; Zhang, Z.; He, C.; Li, Z.; Deng, C.; Sun, T. Efficient tomato harvesting robot based on image processing and deep learning. Precis. Agric. 2023, 24, 254–287. [Google Scholar] [CrossRef]
  385. Kamata, T.; Roshanianfard, A.; Noguchi, N. Heavy-weight crop harvesting robot-controlling algorithm. IFAC-Pap. 2018, 51, 244–249. [Google Scholar] [CrossRef]
  386. Zahedi, A.; Shafei, A.M.; Shamsi, M. Application of hybrid robotic systems in crop harvesting: Kinematic and dynamic analysis. Comput. Electron. Agric. 2023, 209, 107724. [Google Scholar] [CrossRef]
  387. Kurita, H.; Iida, M.; Suguri, M.; Masuda, R. Application of image processing technology for unloading automation of robotic head-feeding combine harvester. Eng. Agric. Environ. Food 2012, 5, 146–151. [Google Scholar] [CrossRef]
  388. Zhang, C.; Noguchi, N. Development of a multi-robot tractor system for agriculture field work. Comput. Electron. Agric. 2017, 142, 79–90. [Google Scholar] [CrossRef]
  389. Geng, A.; Hu, X.; Liu, J.; Mei, Z.; Zhang, Z.; Yu, W. Development and Testing of Automatic Row Alignment System for Corn Harvesters. Appl. Sci. 2022, 12, 6221. [Google Scholar] [CrossRef]
  390. Da Silveira, F.; Lermen, F.H.; Amaral, F.G. An overview of agriculture 4.0 development: Systematic review of descriptions, technologies, barriers, advantages, and disadvantages. Comput. Electron. Agric. 2021, 189, 106405. [Google Scholar] [CrossRef]
  391. Li, Y.; Iida, M.; Suyama, T.; Suguri, M.; Masuda, R. Implementation of deep-learning algorithm for obstacle detection and collision avoidance for robotic harvester. Comput. Electron. Agric. 2020, 174, 105499. [Google Scholar] [CrossRef]
  392. Pooranam, N.; Vignesh, T. A swarm robot for harvesting a paddy field. In Nature-Inspired Algorithms Applications; Wiley: Hoboken, NJ, USA, 2021; pp. 137–156. [Google Scholar] [CrossRef]
  393. Wang, L.; Liu, M. Path tracking control for autonomous harvesting robots based on improved double arc path planning algorithm. J. Intell. Robot. Syst. 2020, 100, 899–909. [Google Scholar] [CrossRef]
  394. Li, X.; Li, T.; Qiu, Q.; Fan, Z.; Sun, N. Review on autonomous navigation for orchard mobile robots. J. Chin. Agric. Mech. 2022, 43, 156. [Google Scholar] [CrossRef]
  395. Li, H.; Chen, L.; Zhang, Z. A study on the utilization rate and influencing factors of small agricultural machinery: Evidence from 10 hilly and mountainous Provinces in China. Agriculture 2022, 13, 51. [Google Scholar] [CrossRef]
  396. Brown, J.; Paudel, A.; Biehler, D.; Thompson, A.; Karkee, M.; Grimm, C.; Davidson, J.R. Tree detection and in-row localization for autonomous precision orchard management. Comput. Electron. Agric. 2024, 227, 109454. [Google Scholar] [CrossRef]
  397. Raikwar, S.; Fehrmann, J.; Herlitzius, T. Navigation and control development for a four-wheel-steered mobile orchard robot using model-based design. Comput. Electron. Agric. 2022, 202, 107410. [Google Scholar] [CrossRef]
  398. Singh, N.K.; Narang, M.K.; Thakur, S.S.; Singh, M.; Singh, S.K.; Prakash, A. Influence of transplanting techniques and age of wash root type seedlings on planting attributes of paddy rice. Cogent Food Agric. 2023, 9, 2176978. [Google Scholar] [CrossRef]
  399. Liu, J.Z.; Zhao, S.Y.; Li, N.; Faheem, M.; Zhou, T.; Cai, W.J.; Zhao, M.Z.; Zhu, X.Y.; Li, P.P. Development and field test of an autonomous strawberry plug seeding transplanter for use in elevated cultivation. Appl. Eng. Agric. 2019, 35, 1067–1078. [Google Scholar] [CrossRef]
  400. Hu, J.; Zhang, J.; He, J.; Yan, X. Motion analysis and experiment for planting mechanism with planetary gears of transplanting machine. Trans. Chin. Soc. Agric. Mach. 2013, 44, 57–61. [Google Scholar] [CrossRef]
  401. Ni, Y.; Jin, C.; Liu, J. Design and experiment of system for picking up and delivering seedlings in automatic transplanter. Trans. Chin. Soc. Agric. Eng. 2015, 31, 10–19. [Google Scholar] [CrossRef]
  402. Ying, W.; Jianneng, C.; Xiong, Z.; Xincheng, S. Parameter optimization and experiment of planting mechanism driven by planetary non-circular gears. Trans. Chin. Soc. Agric. Mach. 2015, 46, 85–93. [Google Scholar] [CrossRef]
  403. Han, L.; Mao, H.; Hu, J.; Kumi, F. Development of a riding-type fully automatic transplanter for vegetable plug seedlings. Span. J. Agric. Res. 2019, 17, e0205. [Google Scholar] [CrossRef]
  404. Jin, X.; Chen, K.; Zhao, Y.; Ji, J.; Jing, P. Simulation of hydraulic transplanting robot control system based on fuzzy PID controller. Measurement 2020, 164, 108023. [Google Scholar] [CrossRef]
  405. Han, L.; Mao, H.; Kumi, F.; Hu, J. Development of a multi-task robotic transplanting workcell for greenhouse seedlings. Appl. Eng. Agric. 2018, 34, 335–342. [Google Scholar] [CrossRef]
  406. Paradkar, V.; Raheman, H. Development of a metering mechanism with serial robotic arm for handling paper pot seedlings in a vegetable transplanter. Artif. Intell. Agric. 2021, 5, 52–63. [Google Scholar] [CrossRef]
  407. Hu, L.; Wang, B.; Wang, G.; Yu, Z.; You, Z.; Hu, Z.; Wang, B.; Gao, X. Design and experiment of type 2ZGF-2 duplex sweet potato transplanter. Trans. Chin. Soc. Agric. Eng. (Trans. CSAE) 2016, 32, 8–16. [Google Scholar] [CrossRef]
  408. Liu, Z.; Lv, Z.; Zheng, W.; Wang, X. Trajectory control of two-degree-of-freedom sweet potato transplanting robot arm. IEEE Access 2022, 10, 26294–26306. [Google Scholar] [CrossRef]
  409. Yang, Q.z.; Jia, C.p.; Sun, M.t.; Zhao, X.q.; He, M.s.; Mao, H.p.; Hu, J.p.; Addy, M. Trajectory planning and dynamics analysis of greenhouse parallel transplanting robot. Int. Agric. Eng. J. 2020, 29, 64–76. [Google Scholar]
  410. Yue, R.C.; Yao, M.J.; Zhang, T.F.; Shi, J.W.; Zhou, J.H.; Hu, J.P. Design and experiment of dual-row seedling pick-up device for high-speed automatic transplanting machine. Agriculture 2024, 14, 942. [Google Scholar] [CrossRef]
  411. Li, M.; Zhu, X.; Ji, J.; Jin, X.; Li, B.; Chen, K.; Zhang, W. Visual perception enabled agriculture intelligence: A selective seedling picking transplanting robot. Comput. Electron. Agric. 2025, 229, 109821. [Google Scholar] [CrossRef]
  412. Zahid, A.; Mahmud, M.S.; He, L.; Heinemann, P.; Choi, D.; Schupp, J. Technological advancements towards developing a robotic pruner for apple trees: A review. Comput. Electron. Agric. 2021, 189, 106383. [Google Scholar] [CrossRef]
  413. Williams, H.; Smith, D.; Shahabi, J.; Gee, T.; Nejati, M.; McGuinness, B.; Black, K.; Tobias, J.; Jangali, R.; Lim, H. Modelling wine grapevines for autonomous robotic cane pruning. Biosyst. Eng. 2023, 235, 31–49. [Google Scholar] [CrossRef]
  414. Huang, C.; Cai, D.; Wang, W.; Li, J.; Duan, J.; Yang, Z. Development of an automatic control system for a hydraulic pruning robot. Comput. Electron. Agric. 2023, 214, 108329. [Google Scholar] [CrossRef]
  415. Kolmanič, S.; Strnad, D.; Kohek, Š.; Benes, B.; Hirst, P.; Žalik, B. An algorithm for automatic dormant tree pruning. Appl. Soft Comput. 2021, 99, 106931. [Google Scholar] [CrossRef]
  416. Zhang, B.; Liu, Y.; Zhang, H.; Shen, C.; Fu, W. Design and evaluation of a shaping and pruning machine for dwarf and densely planted jujube trees. Appl. Sci. 2022, 12, 2699. [Google Scholar] [CrossRef]
  417. He, L.; Schupp, J. Sensing and automation in pruning of apple trees: A review. Agronomy 2018, 8, 211. [Google Scholar] [CrossRef]
  418. Botterill, T.; Paulin, S.; Green, R.; Williams, S.; Lin, J.; Saxton, V.; Mills, S.; Chen, X.; Corbett-Davies, S. A robot system for pruning grape vines. J. Field Robot. 2017, 34, 1100–1122. [Google Scholar] [CrossRef]
  419. You, A.; Sukkar, F.; Fitch, R.; Karkee, M.; Davidson, J.R. An efficient planning and control framework for pruning fruit trees. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 3930–3936. [Google Scholar] [CrossRef]
  420. Westling, F.; Underwood, J.; Bryson, M. A procedure for automated tree pruning suggestion using LiDAR scans of fruit trees. Comput. Electron. Agric. 2021, 187, 106274. [Google Scholar] [CrossRef]
  421. Litavniece, L.; Kodors, S.; Dekšne, J.; Lācis, G.; Zarembo, I.; Pacejs, A. Risk analysis for apple orchard survey and monitoring using UAV. In Proceedings of the International Scientific and Practical Conference "Environment. Technology. Resources", Rezekne, Latvia, 13 June 2023. [Google Scholar] [CrossRef]
  422. Zhang, C.; Valente, J.; Kooistra, L.; Guo, L.; Wang, W. Orchard management with small unmanned aerial vehicles: A survey of sensing and analysis approaches. Precis. Agric. 2021, 22, 2007–2052. [Google Scholar] [CrossRef]
  423. Gaddikeri, V.; K R, A.; Satpute, A.; Jatav, M.; Rajput, J.; Dimple, D. UAVs in Orchard Management: Need, Challenges, and Applications; Just Agriculture: Jalandhar, India, 2022. [Google Scholar]
  424. Kim, K.; Deb, A.; Cappelleri, D.J. P-AgBot: In-row & under-canopy agricultural robot for monitoring and physical sampling. IEEE Robot. Autom. Lett. 2022, 7, 7942–7949. [Google Scholar] [CrossRef]
  425. Cossio-Montefinale, L.; Quiñinao, C.; Verschae, R. Orchard sweet cherry color distribution estimation from wireless sensor networks and video-based fruit detection. Comput. Electron. Agric. 2025, 235, 110334. [Google Scholar] [CrossRef]
  426. Barbosa, W.S.; Oliveira, A.I.; Barbosa, G.B.; Leite, A.C.; Figueiredo, K.T.; Vellasco, M.M.; Caarls, W. Design and development of an autonomous mobile robot for inspection of soy and cotton crops. In Proceedings of the 2019 12th International Conference on Developments in eSystems Engineering (DeSE), Kazan, Russia, 7–10 October 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 557–562. [Google Scholar] [CrossRef]
  427. Zhang, H.; Wang, L.; Tian, T.; Yin, J. A review of unmanned aerial vehicle low-altitude remote sensing (UAV-LARS) use in agricultural monitoring in China. Remote Sens. 2021, 13, 1221. [Google Scholar] [CrossRef]
  428. Pei, H.T.; Sun, Y.Q.; Huang, H.; Zhang, W.; Sheng, J.J.; Zhang, Z.Y. Weed detection in maize fields by UAV images based on crop row preprocessing and improved YOLOv4. Agriculture 2022, 12, 975. [Google Scholar] [CrossRef]
  429. Beniaich, A.; Silva, M.L.; Guimarães, D.V.; Avalos, F.A.; Terra, F.S.; Menezes, M.D.; Avanzi, J.C.; Cândido, B.M. UAV-based vegetation monitoring for assessing the impact of soil loss in olive orchards in Brazil. Geoderma Reg. 2022, 30, e00543. [Google Scholar] [CrossRef]
  430. Abdulridha, J.; Batuman, O.; Ampatzidis, Y. UAV-based remote sensing technique to detect citrus canker disease utilizing hyperspectral imaging and machine learning. Remote Sens. 2019, 11, 1373. [Google Scholar] [CrossRef]
  431. Stefas, N.; Bayram, H.; Isler, V. Vision-based monitoring of orchards with UAVs. Comput. Electron. Agric. 2019, 163, 104814. [Google Scholar] [CrossRef]
  432. Barrientos, A.; Colorado, J.; Cerro, J.d.; Martinez, A.; Rossi, C.; Sanz, D.; Valente, J. Aerial remote sensing in agriculture: A practical approach to area coverage and path planning for fleets of mini aerial robots. J. Field Robot. 2011, 28, 667–689. [Google Scholar] [CrossRef]
  433. Garcia-Ruiz, F.; Sankaran, S.; Maja, J.M.; Lee, W.S.; Rasmussen, J.; Ehsani, R. Comparison of two aerial imaging platforms for identification of Huanglongbing-infected citrus trees. Comput. Electron. Agric. 2013, 91, 106–115. [Google Scholar] [CrossRef]
  434. Zhang, W.; Wang, J.; Liu, Y.; Chen, K.; Li, H.; Duan, Y.; Wu, W.; Shi, Y.; Guo, W. Deep-learning-based in-field citrus fruit detection and tracking. Hortic. Res. 2022, 9, uhac003. [Google Scholar] [CrossRef]
  435. Santos, T.T.; De Souza, L.L.; dos Santos, A.A.; Avila, S. Grape detection, segmentation, and tracking using deep neural networks and three-dimensional association. Comput. Electron. Agric. 2020, 170, 105247. [Google Scholar] [CrossRef]
  436. Xaud, M.F.; Leite, A.C.; Barbosa, E.S.; Faria, H.D.; Loureiro, G.S.; From, P.J. Robotic tankette for intelligent bioenergy agriculture: Design, development and field tests. arXiv 2019, arXiv:1901.00761. [Google Scholar] [CrossRef]
  437. Shafiekhani, A.; Kadam, S.; Fritschi, F.B.; DeSouza, G.N. Vinobot and vinoculer: Two robotic platforms for high-throughput field phenotyping. Sensors 2017, 17, 214. [Google Scholar] [CrossRef] [PubMed]
  438. Milella, A.; Reina, G.; Nielsen, M. A multi-sensor robotic platform for ground mapping and estimation beyond the visible spectrum. Precis. Agric. 2019, 20, 423–444. [Google Scholar] [CrossRef]
  439. Walter, A.; Khanna, R.; Lottes, P.; Stachniss, C.; Siegwart, R.; Nieto, J.; Liebisch, F. Flourish-a robotic approach for automation in crop management. In Proceedings of the International Conference on Precision Agriculture (ICPA), Montreal, QC, Canada, 24–27 June 2018. [Google Scholar]
  440. Chen, M.; Chen, Z.; Luo, L.; Tang, Y.; Cheng, J.; Wei, H.; Wang, J. Dynamic visual servo control methods for continuous operation of a fruit harvesting robot working throughout an orchard. Comput. Electron. Agric. 2024, 219, 108774. [Google Scholar] [CrossRef]
  441. Li, T.; Qiu, Q.; Zhao, C.; Xie, F. Task planning of multi-arm harvesting robots for high-density dwarf orchards. Trans. CSAE 2021, 37, 1–10. [Google Scholar] [CrossRef]
  442. Yu, Y.; Xie, H.H.; Zhang, K.L.; Wang, Y.J.; Li, Y.T.; Zhou, J.M.; Xu, L.Z. Design, development, integration, and field evaluation of a ridge-planting strawberry harvesting robot. Agriculture 2024, 14, 2126. [Google Scholar] [CrossRef]
  443. Williams, H.A.; Jones, M.H.; Nejati, M.; Seabright, M.J.; Bell, J.; Penhall, N.D.; Barnett, J.J.; Duke, M.D.; Scarfe, A.J.; Ahn, H.S. Robotic kiwifruit harvesting using machine vision, convolutional neural networks, and robotic arms. Biosyst. Eng. 2019, 181, 140–156. [Google Scholar] [CrossRef]
  444. Robotics, R. Fruit-Picking Robot Eve Ready to Harvest Apples Commercially, as Shortage of Workers Persists. Available online: https://www.abc.net.au/news/2023-05-19/fruit-picking-robot-eve-commercial-apple-harvest-worker-shortage/102355690 (accessed on 19 May 2023).
  445. FFRobotics. The Future of Fresh Fruit Harvest. 2020. Available online: https://www.ffrobotics.com/ (accessed on 25 July 2025).
  446. Uppalapati, N.K.; Walt, B.; Havens, A.J.; Mahdian, A.; Chowdhary, G.; Krishnan, G. A berry picking robot with a hybrid soft-rigid arm: Design and task space control. In Proceedings of the Robotics: Science and Systems. Robotics: Science and Systems Foundation, Corvallis, OR, USA, 12–16 July 2020; p. 95. [Google Scholar] [CrossRef]
  447. Barnett, K.S. A high-tech way to put fruit in the basket. Ind. Syst. Eng. Work 2021, 53, 1. [Google Scholar]
  448. Su, J.; Zhu, X.; Li, S.; Chen, W.H. AI meets UAVs: A survey on AI empowered UAV perception systems for precision agriculture. Neurocomputing 2023, 518, 242–270. [Google Scholar] [CrossRef]
  449. Li, J.; Zhou, H.; Mai, Y.; Jia, Y.; Zhou, Z.; Wu, K.; Chen, H.; Lin, H.; Luo, M.; Shi, L. An autonomous obstacle avoidance and path planning method for fruit-picking UAV in orchard environments. Smart Agric. Technol. 2025, 10, 100752. [Google Scholar] [CrossRef]
  450. Khan, H.A.; Farooq, U.; Saleem, S.R.; Rehman, U.u.; Tahir, M.N.; Iqbal, T.; Cheema, M.J.M.; Aslam, M.A.; Hussain, S. Design and development of machine vision robotic arm for vegetable crops in hydroponics. Smart Agric. Technol. 2024, 9, 100628. [Google Scholar] [CrossRef]
  451. Wang, X.; Kang, H.; Zhou, H.; Au, W.; Wang, M.Y.; Chen, C. Development and evaluation of a robust soft robotic gripper for apple harvesting. Comput. Electron. Agric. 2023, 204, 107552. [Google Scholar] [CrossRef]
  452. Zhang, Z.; Zhou, J.; Yi, B.; Zhang, B.; Wang, K. A flexible swallowing gripper for harvesting apples and its grasping force sensing model. Comput. Electron. Agric. 2023, 204, 107489. [Google Scholar] [CrossRef]
  453. Zhou, H.; Kang, H.; Wang, X.; Au, W.; Wang, M.Y.; Chen, C. Branch interference sensing and handling by tactile enabled robotic apple harvesting. Agronomy 2023, 13, 503. [Google Scholar] [CrossRef]
  454. Xiong, Y.; Peng, C.; Grimstad, L.; From, P.J.; Isler, V. Development and field evaluation of a strawberry harvesting robot with a cable-driven gripper. Comput. Electron. Agric. 2019, 157, 392–402. [Google Scholar] [CrossRef]
  455. Thorne, J. Apple-Picking Robots Gear Up for US Debut in Washington State. Available online: https://www.geekwire.com/2019/apple-picking-robots-gear-u-s-debut-washington-state/ (accessed on 13 May 2019).
  456. Ji, W.; He, G.; Xu, B.; Zhang, H.; Yu, X. A new picking pattern of a flexible three-fingered end-effector for apple harvesting robot. Agriculture 2024, 14, 102. [Google Scholar] [CrossRef]
  457. Zhang, F.; Chen, Z.J.; Wang, Y.F.; Bao, R.F.; Chen, X.G.; Fu, S.L.; Tian, M.M.; Zhang, Y.K. Research on Flexible End-Effectors with Humanoid Grasp Function for Small Spherical Fruit Picking. Agriculture 2023, 13, 123. [Google Scholar] [CrossRef]
  458. Wang, M.; Zhou, Z.; Wang, Y.; Xu, J.; Cui, Y. Design and experiment of facility elevated planting strawberry continuous picking manipulator. Comput. Electron. Agric. 2025, 228, 109703. [Google Scholar] [CrossRef]
  459. Hou, G.; Chen, H.; Niu, R.; Li, T.; Ma, Y.; Zhang, Y. Research on multi-layer model attitude recognition and picking strategy of small tomato picking robot. Comput. Electron. Agric. 2025, 232, 110125. [Google Scholar] [CrossRef]
  460. Zhang, K.; Lammers, K.; Chu, P.; Li, Z.; Lu, R. An automated apple harvesting robot—From system design to field evaluation. J. Field Robot. 2024, 41, 2384–2400. [Google Scholar] [CrossRef]
  461. Fue, K.G.; Porter, W.M.; Barnes, E.M.; Rains, G.C. An extensive review of mobile agricultural robotics for field operations: Focus on cotton harvesting. AgriEngineering 2020, 2, 150–174. [Google Scholar] [CrossRef]
  462. Bergeron, S.; Pouliot, E.; Doyon, M. Commercial poultry production stocking density influence on bird health and performance indicators. Animals 2020, 10, 1253. [Google Scholar] [CrossRef]
  463. Park, M.; Britton, D.; Daley, W.; McMurray, G.; Navaei, M.; Samoylov, A.; Usher, C.; Xu, J. Artificial intelligence, sensors, robots, and transportation systems drive an innovative future for poultry broiler and breeder management. Anim. Front. 2022, 12, 40–48. [Google Scholar] [CrossRef]
  464. Jiang, W.; Hao, H.; Fan, J.; Wang, L.; Wang, H. Rabbit feeding robot: Autonomous navigation and precision feeding. Biosyst. Eng. 2024, 239, 68–80. [Google Scholar] [CrossRef]
  465. Chen, C.; Liu, X.Q.; Liu, C.J.; Pan, Q. Development of the precision feeding system for sows via a rule-based expert system. Int. J. Agric. Biol. Eng. 2023, 16, 187–198. [Google Scholar] [CrossRef]
  466. Tian, F.; Wang, X.; Yu, S.; Wang, R.; Song, Z.; Yan, Y.; Li, F.; Wang, Z.; Yu, Z. Research on navigation path extraction and obstacle avoidance strategy for pusher robot in dairy farm. Agriculture 2022, 12, 1008. [Google Scholar] [CrossRef]
  467. Lely. Lely Juno. 2025. Available online: https://www.lely.com/cn/solutions/feeding/juno/ (accessed on 25 July 2025).
  468. Rovibec. ROVER Feeding Robot. 2025. Available online: https://www.agriexpo.online/prod/rovibec-agrisolusions/product-172425-9278.html (accessed on 25 July 2025).
  469. Rovibec. RANGER Feed Pusher. 2025. Available online: https://www.agriexpo.cn/prod/rovibec-agrisolusions/product-172425-129658.html (accessed on 25 July 2025).
  470. Trioliet. Triomatic HP Suspended Feeding Robot. 2020. Available online: https://www.trioliet.com/products/automatic-feeding-systems/feeding-robot/suspended-feeding-robot (accessed on 25 July 2025).
  471. Karn, P.; Sitikhu, P.; Somai, N. Automatic cattle feeding system. In Proceedings of the 2nd International Conference on Engineering and Technology, KEC Conference, Dhapakhel, Lalitpur, Nepal, 26 September 2019; Springer: Berlin/Heidelberg, Germany, 2019. [Google Scholar]
  472. Abbas, G.; Jaffery, S.; Hashmi, A.; Tanveer, A.; Arshad, M.; Amin, Q.; Saeed, M.; Saleem, M.; Qureshi, R.; Khan, A. Prospects and challenges of adopting and implementing smart technologies in poultry production. Pak. J. Sci. 2022, 74, 108. [Google Scholar]
  473. Oliveira, J.L.; Xin, H.; Chai, L.; Millman, S.T. Effects of litter floor access and inclusion of experienced hens in aviary housing on floor eggs, litter condition, air quality, and hen welfare. Poult. Sci. 2019, 98, 1664–1677. [Google Scholar] [CrossRef]
  474. Yang, X.; Huo, X.; Li, G.; Purswell, J.L.; Tabler, G.T.; Chesser, G.D.; Magee, C.L.; Zhao, Y. Effects of elevated platform and robotic vehicle on broiler production, welfare, and housing environment. Trans. ASABE 2020, 63, 1981–1990. [Google Scholar] [CrossRef]
  475. Quan, Q.; Palaoag, T.D.; Sun, H. Research and design of intelligent inspection robot for large-scale chicken farms. In Proceedings of the 2024 5th International Conference on Machine Learning and Human-Computer Interaction (MLHMI), Kawasaki, Japan, 14–16 March 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 1–6. [Google Scholar] [CrossRef]
  476. Selle, M.; Spieß, F.; Visscher, C.; Rautenschlein, S.; Jung, A.; Auerbach, M.; Hartung, J.; Sürie, C.; Distl, O. Real-time monitoring of animals and environment in broiler precision farming—How robust is the data quality? Sustainability 2023, 15, 15527. [Google Scholar] [CrossRef]
  477. Pan, Y.Z.; Zhang, Y.Z.; Wang, X.P.; Gao, X.X.; Hou, Z.Y. Low-cost livestock sorting information management system based on deep learning. Artif. Intell. Agric. 2023, 9, 110–126. [Google Scholar] [CrossRef]
  478. Yu, Z.J.; Guo, Y.Y.; Zhang, L.Y.; Ding, Y.; Zhang, G.; Zhang, D.Y. Improved lightweight zero-reference deep curve estimation low-light enhancement algorithm for night-time cow detection. Agriculture 2024, 14, 1003. [Google Scholar] [CrossRef]
  479. Planquette, L. Mobile Robot, in Particular an Education Robot, for Poultry Farm and Rearing Facility Implementing One or More Robots. U.S. Patent 20180360002, 2018. Available online: https://www.freepatentsonline.com/y2018/0360002.html (accessed on 25 July 2025).
  480. Olejnik, K.; Popiela, E.; Opaliński, S. Emerging precision management methods in poultry sector. Agriculture 2022, 12, 718. [Google Scholar] [CrossRef]
  481. Octopus Biosafety. XO Robot for Precision Poultry Farming. 2022. Available online: https://www.octopusbiosafety.com/en/xo/ (accessed on 25 July 2025).
  482. Zhao, Y.; Xiao, Q.; Li, J.; Tian, K.; Yang, L.; Shan, P.; Lv, X.; Li, L.; Zhan, Z. Review on image-based animals weight weighing. Comput. Electron. Agric. 2023, 215, 108456. [Google Scholar] [CrossRef]
  483. Poultry Patrol. Autonomous Robotics for the Poultry Shed. Available online: https://agtecher.com/product/poultry-patrol/ (accessed on 7 August 2020).
  484. Group, C.P. Chicken Nannies Are All the Rage in China. Available online: https://www.farmprogress.com/farm-life/chicken-nannies-are-all-the-rage-in-china (accessed on 6 May 2017).
  485. Sensyn Robotics, T.I.C. Sensyn Robotics and Taiho Industrial Corporation Jointly Develop an Autonomous Cage Monitoring System. Available online: https://www.sensyn-robotics.com/en/news/taiho (accessed on 14 October 2020).
  486. Ma, W.; Wang, X.; Yang, S.X.; Xue, X.; Li, M.; Wang, R.; Yu, L.; Song, L.; Li, Q. Autonomous inspection robot for dead laying hens in caged layer house. Comput. Electron. Agric. 2024, 227, 109595. [Google Scholar] [CrossRef]
  487. Althoefer, K. Antagonistic actuation and stiffness control in soft inflatable robots. Nat. Rev. Mater. 2018, 3, 76–77. [Google Scholar] [CrossRef]
  488. Kirby, E.; Zenha, R.; Jamone, L. Comparing single touch to dynamic exploratory procedures for robotic tactile object recognition. IEEE Robot. Autom. Lett. 2022, 7, 4252–4258. [Google Scholar] [CrossRef]
  489. Mandil, W.; Rajendran, V.; Nazari, K.; Ghalamzan-Esfahani, A. Tactile-sensing technologies: Trends, challenges and outlook in agri-food manipulation. Sensors 2023, 23, 7362. [Google Scholar] [CrossRef]
  490. Feng, Q.C.; Wang, X. Design of disinfection robot for livestock breeding. Procedia Comput. Sci. 2020, 166, 310–314. [Google Scholar] [CrossRef]
  491. Wu, D.; Cui, D.; Zhou, M.; Ying, Y. Information perception in modern poultry farming: A review. Comput. Electron. Agric. 2022, 199, 107131. [Google Scholar] [CrossRef]
  492. Heitmann, S.; Stracke, J.; Adler, C.; Ahmed, M.F.; Schulz, J.; Büscher, W.; Kemper, N.; Spindler, B. Effects of a slatted floor on bacteria and physical parameters in litter in broiler houses. Vet. Anim. Sci. 2020, 9, 100115. [Google Scholar] [CrossRef]
  493. Doerfler, R.L.; Petzl, W.; Rieger, A.; Bernhardt, H. Impact of robot scrapers on clinical mastitis and somatic cell count in lactating cows. J. Appl. Anim. Res. 2018, 46, 467–470. [Google Scholar] [CrossRef]
  494. Ebertz, P.; Krommweh, M.S.; Büscher, W. Feasibility study: Improving floor cleanliness by using a robot scraper in group-housed pregnant sows and their reactions on the new device. Animals 2019, 9, 185. [Google Scholar] [CrossRef] [PubMed]
  495. Birds Eye Robotics. We Make Growing Chickens Easier. 2019. Available online: https://www.birdseyerobotics.com/ (accessed on 25 July 2025).
  496. Inateco. Sentinel Automatic Bedding Robot—An Innovative Solution. 2018. Available online: https://www.inateco.eu/sentinel-automatic-bedding-robot/?lang=en (accessed on 25 July 2025).
  497. House, H.; Eng, P. Manure handling options for robotic milking barns. In Dairy Housing; Ontario.ca: Toronto, ON, Canada, 2016; pp. 1–8. [Google Scholar]
  498. Rabaud. Poultry Houses Cleaning Machine: Lavicole. 2025. Available online: https://gesproequipement.com/en/poultry-houses-cleaning-machine-lavicole-2/ (accessed on 25 July 2025).
  499. Liu, H.W.; Chen, C.H.; Tsai, Y.C.; Hsieh, K.W.; Lin, H.T. Identifying images of dead chickens with a chicken removal system integrated with a deep learning algorithm. Sensors 2021, 21, 3579. [Google Scholar] [CrossRef] [PubMed]
  500. Li, G.; Chesser, G.D.; Purswell, J.L.; Magee, C.; Gates, R.S.; Xiong, Y. Design and development of a broiler mortality removal robot. Appl. Eng. Agric. 2022, 38, 853–863. [Google Scholar] [CrossRef]
  501. Hu, Z.; Jiang, L.; Wang, H.; Wang, W.; Tang, J.; Zhang, T.; Huo, X. Design and simulation analysis of dead chicken picking end-effector based on under actuated principle. J. Northeast Agric. Univ. 2021, 52, 78–86. [Google Scholar] [CrossRef]
  502. Li, G.; Hui, X.; Zhao, Y.; Zhai, W.; Purswell, J.L.; Porter, Z.; Poudel, S.; Jia, L.; Zhang, B.; Chesser, G.D. Effects of ground robot manipulation on hen floor egg reduction, production performance, stress response, bone quality, and behavior. PLoS ONE 2022, 17, e0267568. [Google Scholar] [CrossRef]
  503. Chang, C.L.; Xie, B.X.; Wang, C.H. Visual guidance and egg collection scheme for a smart poultry robot for free-range farms. Sensors 2020, 20, 6624. [Google Scholar] [CrossRef]
  504. Vroegindeweij, B.A.; Blaauw, S.K.; IJsselmuiden, J.M.; van Henten, E.J. Evaluation of the performance of PoultryBot, an autonomous mobile robotic platform for poultry houses. Biosyst. Eng. 2018, 174, 295–315. [Google Scholar] [CrossRef]
  505. Wu, Z.; Zhang, H.; Fang, C. Research on machine vision online monitoring system for egg production and quality in cage environment. Poult. Sci. 2025, 104, 104552. [Google Scholar] [CrossRef]
  506. Trevelyan, J.P. Sensing and control for sheep shearing robots. IEEE Trans. Robot. Autom. 1989, 5, 716–727. [Google Scholar] [CrossRef]
  507. Gaworski, M.; Kic, P. Improvement of mobile milking parlours in small dairy farms including technical and functional aspects. In Proceedings of the 16th International Scientific Conference Engineering for Rural Development, Jelgava, Latvia, 24–26 May 2017; Volume 16, pp. 24–26. [Google Scholar] [CrossRef]
  508. Wang, S.; Han, Y.; Chen, J.; He, X.; Zhang, Z.; Liu, X.; Zhang, K. Weed density extraction based on few-shot learning through UAV remote sensing RGB and multispectral images in ecological irrigation area. Front. Plant Sci. 2022, 12, 735230. [Google Scholar] [CrossRef]
  509. Edlerman, E.; Linker, R. Autonomous multi-robot system for use in vineyards and orchards. In Proceedings of the 2019 27th Mediterranean Conference on Control and Automation (MED), Akko, Israel, 1–4 July 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 274–279. [Google Scholar] [CrossRef]
  510. De Silva, A.; Katupitiya, J.; Savkin, A.V. UAV-UGV Collaborative Localisation with Ambiguity Aversion by UAV Re-Positioning. IFAC-Pap. 2022, 55, 101–106. [Google Scholar] [CrossRef]
  511. Potena, C.; Khanna, R.; Nieto, J.; Siegwart, R.; Nardi, D.; Pretto, A. AgriColMap: Aerial-ground collaborative 3D mapping for precision farming. IEEE Robot. Autom. Lett. 2019, 4, 1085–1092. [Google Scholar] [CrossRef]
  512. Chen, Y.; Liu, Z.; Xu, Z.; Lin, J.; Guan, X.; Zhou, Z.; Zheng, D.; Hewitt, A. UAVs-UGV cooperative boom sprayer system based on swarm control. Comput. Electron. Agric. 2025, 235, 110339. [Google Scholar] [CrossRef]
  513. Farid, A.M.; Roshanian, J.; Mouhoub, M. Multiple aerial/ground vehicles coordinated spraying using reinforcement learning. Eng. Appl. Artif. Intell. 2025, 151, 110686. [Google Scholar] [CrossRef]
  514. Quaglia, G.; Visconte, C.; Scimmi, L.S.; Melchiorre, M.; Cavallone, P.; Pastorelli, S. Design of a UGV powered by solar energy for precision agriculture. Robotics 2020, 9, 13. [Google Scholar] [CrossRef]
  515. Ding, Y.; Xin, B.; Chen, J. A review of recent advances in coordination between unmanned aerial and ground vehicles. Unmanned Syst. 2021, 9, 97–117. [Google Scholar] [CrossRef]
  516. Guo, Z.; Wei, C.; Shen, Y.; Yuan, W. Event-triggered consensus control method with communication faults for multi-UAV. Intell. Robot. 2023, 3, 596–613. [Google Scholar] [CrossRef]
  517. Liu, W.; Zhou, J.H.; Zhang, T.F.; Zhang, P.C.; Yao, M.J.; Li, J.H.; Sun, Z.T.; Ma, G.X.; Chen, X.X.; Hu, J.P. Key technologies in intelligent seeding machinery for cereals: Recent advances and future perspectives. Agriculture 2025, 15, 8. [Google Scholar] [CrossRef]
  518. Ahmed, S.; Qiu, B.J.; Ahmad, F.; Kong, C.W.; Xin, H. A State-of-the-Art Analysis of Obstacle Avoidance Methods from the Perspective of an Agricultural Sprayer UAV’s Operation Scenario. Agronomy 2021, 11, 1069. [Google Scholar] [CrossRef]
  519. Rodriguez, R. Perspective: Agricultural aerial application with unmanned aircraft systems: Current regulatory framework and analysis of operators in the United States. Trans. ASABE 2021, 64, 1475–1481. [Google Scholar] [CrossRef]
  520. Singh, V.; Bagavathiannan, M.; Chauhan, B.S.; Singh, S. Evaluation of current policies on the use of unmanned aerial vehicles in Indian agriculture. Curr. Sci. 2019, 117, 25–29. [Google Scholar] [CrossRef]
  521. Sparrow, R.; Howard, M. Robots in agriculture: Prospects, impacts, ethics, and policy. Precis. Agric. 2021, 22, 818–833. [Google Scholar] [CrossRef]
  522. Zhong, W.; Yang, W.T.; Zhu, J.H.; Jia, W.D.; Dong, X.; Ou, M.X. An improved UNet-Based path recognition method in low-light environments. Agriculture 2024, 14, 1987. [Google Scholar] [CrossRef]
  523. Zhang, T.F.; Zhou, J.H.; Liu, W.; Yue, R.C.; Shi, J.W.; Zhou, C.J.; Hu, J.P. SN-CNN: A Lightweight and Accurate Line Extraction Algorithm for Seedling Navigation in Ridge-Planted Vegetables. Agriculture 2024, 14, 1446. [Google Scholar] [CrossRef]
  524. Huang, M.; Jiang, X.; He, L.; Choi, D.; Pecchia, J.; Li, Y. Development of a robotic harvesting mechanism for button mushrooms. Trans. ASABE 2021, 64, 565–575. [Google Scholar] [CrossRef]
  525. Zhang, H.W.; Ji, W.; Xu, B.; Yu, X.W. Optimizing Contact Force on an Apple Picking Robot End-Effector. Agriculture 2024, 14, 996. [Google Scholar] [CrossRef]
  526. Zhu, S.J.; Wang, B.; Pan, S.Q.; Ye, Y.T.; Wang, E.G.; Mao, H.P. Task allocation of multi-machine collaborative operation for agricultural machinery based on the improved fireworks algorithm. Agronomy 2024, 14, 710. [Google Scholar] [CrossRef]
  527. Lu, Y.Z.; Xu, W.X.; Leng, J.Y.; Liu, X.Y.; Xu, H.Y.; Ding, H.N.; Zhou, J.F.; Cui, L.F. Review and research prospects on additive manufacturing technology for agricultural manufacturing. Agriculture 2024, 14, 1207. [Google Scholar] [CrossRef]
  528. Zhu, Z.H.; Chai, X.Y.; Xu, L.Z.; Quan, L.; Yuan, C.C.; Tian, S.C. Design and performance of a distributed electric drive system for a series hybrid electric combine harvester. Biosyst. Eng. 2023, 236, 160–174. [Google Scholar] [CrossRef]
  529. Zhu, Z.; Zeng, L.X.; Chen, L.; Zou, R.; Cai, Y.F. Fuzzy adaptive energy management strategy for a hybrid agricultural tractor equipped with HMCVT. Agriculture 2022, 12, 1986. [Google Scholar] [CrossRef]
  530. Liu, Q.; Wang, R.; Cai, Y.; Wang, H.; Chen, L.; Lv, C. Real-time energy consumption prediction of connected automated electric vehicles based on temporal fusion transformers. IEEE Trans. Transp. Electrif. 2025, 11, 10297–10309. [Google Scholar] [CrossRef]
Figure 1. Four development stages of agricultural production.
Figure 2. Number of relevant publications per year from 1 January 2011 to 31 May 2025.
Figure 3. Co-occurrence map of author countries.
Figure 4. Keyword co-occurrence network.
Figure 5. Components of an autonomous agricultural machine.
Figure 6. Commonly used perception techniques: (a) monocular camera, (b) LiDAR, (c) imaging spectrometer, (d) binocular stereo vision camera, (e) structured light camera, (f) ToF camera.
Figure 7. Automatic driving of farmland agricultural machinery: (a–c) automatic driving for tillage and land preparation, (d–f) automatic driving for seeding, (g–i) automatic driving for field management, (j–l) automatic driving for crop harvesting.
Figure 8. Autonomous farm machinery in orchards: (a–c) a tree planting robot (TreeRover), (d–f) pruning with autonomous driving, (g–i) monitoring with autonomous driving, (j–l) picking with autonomous driving.
Figure 9. Automatic driving for livestock and poultry breeding: (a–c) automatic feeding machine, (d,e) automatic driving monitoring machine, (f–h) automatic driving cleaning machine, (i) automated livestock collection machine.
Table 1. An overview of the review paper outline.
| Review Methodology | Key Technologies | Application Status | Challenges and Future Trends |
| (a) Select database | (a) Positioning technology | (a) Farmland scenarios | (a) Existing challenges |
| - ScienceDirect | - Absolute positioning | - Land preparation | - Key technical challenges |
| - IEEE Xplore | - Relative positioning | - Sowing | - Technical transfer barriers |
| - Web of Science | - Fusion localization | - Field management | - Regulatory and ethical |
| (b) Determine keywords | (b) Perception technology | - Crop harvesting | (b) Future prospects |
| - Agricultural machinery | - Visible-spectrum imaging | (b) Orchard scenarios | - Core research gaps |
| - Automatic driving, etc. | - Multispectral imaging | - Transplanting | - Future directions |
| (c) Apply inclusion criteria | - Hyperspectral imaging | - Pruning | - Cost and practicality |
| - Limit time horizon | - 2D cameras | - Monitoring | - Standards and ecosystem |
| - Select relevant papers | - 3D cameras | - Picking | |
| - Filter out duplicates | - Multi-sensor fusion | (c) Livestock houses | |
| (d) Bibliometric analysis | (c) Motion control technology | - Animal feeding | |
| - Keyword co-occurrence | - Motion planning | - Monitoring | |
| - Publication time | - Motion control | - Environment cleaning | |
| - Contribution map | (d) Actuator technology | - Animal product collection | |
| | - Drive actuator | (d) UAV-UGV Cooperative | |
| | - Manipulate actuator | | |
| | - Fault detection | | |
| | - Fault-tolerant control | | |
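Table 1 lists keyword co-occurrence analysis as part of the bibliometric step. The short Python sketch below illustrates how such a co-occurrence count can be built from per-paper keyword lists; the toy data and all names are illustrative assumptions, not the reviewed corpus.

```python
from itertools import combinations
from collections import Counter

# Toy keyword lists per paper (illustrative only, not the reviewed corpus).
papers = [
    ["agricultural machinery", "automatic driving", "path planning"],
    ["automatic driving", "gnss", "path planning"],
    ["agricultural machinery", "gnss", "actuator"],
]

cooccurrence = Counter()
for keywords in papers:
    # Each unordered keyword pair in a paper counts once toward a network edge.
    for pair in combinations(sorted(set(keywords)), 2):
        cooccurrence[pair] += 1

for (kw1, kw2), count in cooccurrence.most_common(3):
    print(f"{kw1} -- {kw2}: {count}")
```

The resulting pair counts are exactly the edge weights a keyword co-occurrence network (such as Figure 4) is drawn from.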
Table 2. Literature classification statistics.
| Classification Dimension | Category | Number of Documents | Proportion |
| Core technology field | Positioning and navigation | 128 | 20.3% |
| | Perception and vision | 223 | 35.3% |
| | Motion planning and control | 127 | 20.1% |
| | Actuator and fault-tolerant control | 154 | 24.4% |
| Application scenario | Farmland | 267 | 50.4% |
| | Orchards and greenhouses | 163 | 30.7% |
| | Livestock and poultry farming | 76 | 14.4% |
| | Other | 24 | 4.6% |
| Research maturity | Theoretical/Conceptual Research | 194 | 36.5% |
| | Prototype/Experiment | 295 | 55.7% |
| | Commercialization/Deployed | 41 | 7.7% |
Table 3. Advantages of different positioning sensors.
| Sensor Type | Advantages | Ref. |
| Visual sensor | ① Relatively low cost; ② Information-rich signals; ③ Wide detection range | [57] |
| LiDAR | ① High resolution; ② Large field of view; ③ Strong environmental robustness | [58] |
| Accelerometer | ① Directly measures acceleration; ② Low cost; ③ Strong anti-interference ability | [59] |
| Gyroscope | ① Low cost; ② High-frequency dynamic response | [60] |
| Magnetometer | ① Absolute heading reference; ② Excellent long-term stability | [61] |
| Wheel odometer | ① Low cost; ② Simple structure | [62] |
| IMU | ① High-frequency dynamic response; ② Low cost | [59] |
Table 4. Disadvantages of different positioning sensors.
| Sensor Type | Disadvantages | Ref. |
| Visual sensor | ① Greatly influenced by environmental factors such as light intensity, jitter effect, and weather conditions; ② Accuracy is limited by the distance of the object | [57] |
| LiDAR | ① High cost; ② High data processing complexity; ③ Susceptible to vibration | [63] |
| Accelerometer | ① Relies heavily on gravity; ② Mixed with strong noise; ③ Unable to sense rotational motion about the Z-axis; ④ Lack of environmental perception ability | [64] |
| Gyroscope | ① Sensor drift leads to cumulative errors; ② Vibration sensitivity; ③ Lack of environmental perception ability | [60] |
| Magnetometer | ① Significantly affected by magnetic interference; ② Restricted by the manufacturing materials; ③ Difficult to calibrate; ④ Lack of environmental perception ability | [65] |
| Wheel odometer | ① Greatly affected by road conditions; ② Estimation error accumulates with the driving distance; ③ Lack of environmental perception ability | [62] |
| IMU | ① Sensor drift leads to cumulative errors; ② Lack of absolute positioning ability; ③ Vibration sensitivity | [59] |
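Tables 3 and 4 note that wheel-odometer error accumulates with driving distance. A minimal dead-reckoning sketch makes the reason visible: each pose update integrates incremental wheel measurements, so any per-step error compounds. The Python snippet below is illustrative only; the wheel radius, track width, encoder resolution, and names are assumptions, not parameters from any cited machine.

```python
import math

# Minimal differential-drive dead-reckoning sketch (illustrative values only).
WHEEL_RADIUS = 0.3      # m, assumed
TRACK_WIDTH = 1.2       # m, assumed distance between left and right wheels
TICKS_PER_REV = 2048    # assumed encoder resolution

def ticks_to_distance(ticks: int) -> float:
    """Convert encoder ticks to the travelled distance of one wheel (m)."""
    return 2.0 * math.pi * WHEEL_RADIUS * ticks / TICKS_PER_REV

def update_pose(x, y, yaw, left_ticks, right_ticks):
    """Integrate one odometry step; errors in the ticks accumulate over distance."""
    d_left = ticks_to_distance(left_ticks)
    d_right = ticks_to_distance(right_ticks)
    d_center = 0.5 * (d_left + d_right)          # forward motion of the body
    d_yaw = (d_right - d_left) / TRACK_WIDTH     # heading change (rad)
    x += d_center * math.cos(yaw + 0.5 * d_yaw)  # midpoint integration
    y += d_center * math.sin(yaw + 0.5 * d_yaw)
    yaw += d_yaw
    return x, y, yaw

# Example: drive "straight" for 100 steps with a small systematic tick error.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = update_pose(*pose, left_ticks=100, right_ticks=101)  # 1-tick bias
print(pose)  # heading and position drift grow with travelled distance
```

Because the drift is unbounded, wheel odometry is normally fused with an absolute reference such as GNSS, as the examples in Table 6 show.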
Table 5. Performance of SLAM algorithms in structured and unstructured agricultural environments.
| Algorithm Name | Environment Type | Accuracy | Computational Cost | Robustness |
| ORB-SLAM | Structured | High | Moderate to High | Robust in well-textured environments |
| | Unstructured | Moderate | High | Sensitive to dynamic objects and fast motion |
| LSD-SLAM | Structured | High | Moderate | Good for low-texture environments, but sensitive to lighting changes and motion blur |
| | Unstructured | Moderate | Moderate | Struggles in fast-moving or highly dynamic environments |
| LeGO-LOAM | Structured | High | Moderate to High | High accuracy, capable of handling large-scale environments |
| | Unstructured | Moderate | Moderate to High | Weak in dynamic environments, relies on LiDAR |
| Cartographer | Structured | High | High | Good adaptability to environmental changes, strong loop closure |
| | Unstructured | Moderate | High | Relies on feature matching, may produce errors in dynamic environments |
Table 6. Examples of integrated positioning for autonomous driving of agricultural machinery.
| Sensor Type | Fusion Algorithm | Positioning Performance | Example Reference |
| Gyroscope, Wheel odometer | Kalman Filter | Heading deviation: <5° (hard ground), 10–15° (soft ground); circular path RMSE (carpet): <80 mm; Z-shaped path cumulative error (concrete): 120 mm | [62] |
| GNSS, IMU | Kalman Filter | Fusion positioning: lateral offset reduced by 38.3% (avg. 5.93 cm); azimuth changed by 26.7% | [85] |
| RTK-GPS, IMU, Depth camera | Extended Kalman Filter | Lateral positioning average absolute errors: 5.0 cm (corn), 4.2 cm (sorghum) | [86] |
| VO, IMU, Wheel odometer | Extended Kalman Filter | Closed path: max APE = 0.25; crossroads: RMSE = 0.023, max error = 0.09; soilless greenhouse: max error = 0.214; corridor frame: RMSE = 0.109 | [87] |
| GNSS, Gyroscope | Adaptive Kalman Filter | Linear experiment: average error = 0.06, error variance = 0.215; curve experiment: average error = 0.746, error variance = 0.908 | [88] |
| GNSS, IMU | Finite Impulse Response Kalman Filter | GNSS differential state: filtering avg. error = 1.074 cm, RMSE = 1.396 cm; non-filtering avg. error = 1.17 cm, RMSE = 1.551 cm. GNSS non-differential state: filtering avg. error = 2.097 cm, RMSE = 2.72 cm; non-filtering avg. error = 3.663 cm, RMSE = 4.633 cm | [89] |
| Monocular camera, 3D LiDAR | Adaptive Radius Filter | Heading angle absolute error: average 1.53°, standard deviation 1.46° | [90] |
| LiDAR, Camera, IMU, Wheel odometer | Particle Filter | Accuracy: 0.37 m (95% of the time); average error: 0.2 m | [91] |
| Wheel odometer, IMU, LiDAR, GNSS | Factor Graph | Cave des Onze Communes vineyard: A01 dataset RMSE = 0.123, max = 0.253; A02 dataset RMSE = 0.116, max = 0.273 | [92] |
| Camera, LiDAR, IMU | Factor Graph | Positioning: average errors of 0.056 m, 0.065 m, and 0.081 m; all error standard deviations <0.05 m | [93] |
| Thermal camera, LiDAR | Pose Graph | Average positional error in the real orchard: 0.20 m | [94] |
| LiDAR, RTK-GPS | Generalized Iterative Closest Point | Long-distance error ≤0.71; short-distance error ≤0.14 | [95] |
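Several entries in Table 6 fuse a slow, absolute sensor (GNSS) with a fast, drift-prone one (IMU or wheel odometry) through a Kalman filter. The sketch below shows the generic predict/update cycle on a 1-D position-velocity state; all matrices, noise values, and names are illustrative assumptions rather than parameters of the cited systems.

```python
import numpy as np

# Illustrative 1-D Kalman filter: state = [position, velocity].
dt = 0.1                      # s, assumed update period
F = np.array([[1.0, dt],      # constant-velocity motion model
              [0.0, 1.0]])
H = np.array([[1.0, 0.0]])    # GNSS measures position only
Q = np.diag([0.01, 0.1])      # assumed process noise (odometry/IMU drift)
R = np.array([[0.25]])        # assumed GNSS measurement noise (m^2)

def predict(x, P):
    """Propagate the state with the motion model (the odometry/IMU role)."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    """Correct the prediction with an absolute GNSS position fix z (m)."""
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Example: a vehicle moving at about 1 m/s with noisy GNSS fixes every step.
x, P = np.zeros((2, 1)), np.eye(2)
rng = np.random.default_rng(0)
for k in range(50):
    x, P = predict(x, P)
    z = np.array([[k * dt * 1.0 + rng.normal(0.0, 0.5)]])
    x, P = update(x, P, z)
print(x.ravel())  # fused position/velocity estimate
```

The extended and adaptive variants listed in Table 7 keep this same two-step structure but linearize the models or re-estimate the noise covariances online.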
Table 7. Performance of filtering algorithms in structured and unstructured environments.
| Algorithm | Environment Type | Accuracy Performance | Computational Cost | Robustness |
| Kalman Filter | Structured | High | Low | Robust in static environments, sensitive to noise |
| | Unstructured | Moderate | Low | Sensitive to noise, struggles in highly dynamic environments |
| Extended Kalman Filter | Structured | High | Moderate | Robust in mild non-linear systems |
| | Unstructured | Moderate | Moderate | Sensitive to noise, struggles in dynamic environments |
| Adaptive Kalman Filter | Structured | High | Moderate | Very robust in changing noise environments |
| | Unstructured | High | Moderate | Performs better in dynamic, noisy environments |
| Finite Impulse Response Kalman Filter | Structured | High | Low | Robust against specific noise patterns |
| | Unstructured | Moderate | Low | Less robust in dynamic environments |
| Adaptive Radius Filter | Structured | High | Moderate to High | Robust in environments with slowly changing conditions |
| | Unstructured | High | Moderate to High | Very effective in environments with dynamic obstacles |
| Particle Filter | Structured | High | High | Robust in highly non-linear environments |
| | Unstructured | High | High | Very effective in dynamic environments with non-linearities |
Table 8. Performance of optimizing algorithms in structured and unstructured environments.
| Algorithm | Environment Type | Accuracy Performance | Computational Cost | Robustness |
| Factor Graph | Structured | High | High | Robust in large-scale environments, good for graph-based optimization |
| | Unstructured | Moderate | High | Robust with proper sensor fusion, less effective in dynamic settings |
| Pose Graph | Structured | High | Moderate to High | Robust in static environments, good for optimization |
| | Unstructured | Moderate | Moderate to High | Performs poorly in highly dynamic or rapidly changing environments |
| Generalized Iterative Closest Point | Structured | High | High | Robust in structured environments |
| | Unstructured | Moderate | High | Less effective in dynamic environments with sparse data |
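The scan-matching entry in Table 8 (generalized ICP) aligns a LiDAR scan to a reference map by alternating nearest-neighbour association with a closed-form rigid-transform estimate. The Python sketch below shows the basic point-to-point ICP loop in 2-D; it is a simplified illustration (brute-force matching, no outlier rejection), not the generalized variant used in the cited work.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Closed-form 2-D rotation/translation aligning src to dst (Kabsch/SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp(scan, reference, iterations=20):
    """Basic point-to-point ICP: associate nearest points, then re-estimate."""
    R_total, t_total = np.eye(2), np.zeros(2)
    current = scan.copy()
    for _ in range(iterations):
        # Brute-force nearest-neighbour association (fine for small clouds).
        d = np.linalg.norm(current[:, None, :] - reference[None, :, :], axis=2)
        matches = reference[d.argmin(axis=1)]
        R, t = best_rigid_transform(current, matches)
        current = current @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# Example: recover a small known rotation applied about the cloud centroid.
rng = np.random.default_rng(1)
reference = rng.uniform(0, 10, size=(100, 2))
theta = np.radians(5.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
center = reference.mean(axis=0)
scan = (reference - center) @ R_true.T + center + np.array([0.3, -0.2])
R_est, t_est = icp(scan, reference)
print(np.degrees(np.arctan2(R_est[1, 0], R_est[0, 0])))  # should be close to -5
```

Factor-graph and pose-graph back ends then stitch many such pairwise alignments (plus GNSS, IMU, and odometry constraints) into one globally consistent trajectory.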
Table 9. Characteristics of different imaging techniques.
| Classification Criteria | Imaging Technology Type | Description |
| Spectral resolution (band bandwidth) or the number of bands | Visible-spectrum imaging | Captures only the spectrum visible to the human eye (400–700 nm), imaging through three broad bands: red (R), green (G), and blue (B), each with a relatively wide bandwidth (e.g., 50–100 nm). |
| | Multispectral imaging | Simultaneously captures multiple (3 to 10) specific bands, including visible and non-visible light (such as near-infrared, red edge, and infrared), with band bandwidths ranging from tens to hundreds of nanometers. |
| | Hyperspectral imaging | Captures dozens to hundreds of narrow bands (bandwidth 1–20 nm), covering a relatively wide spectral range (e.g., 400–2500 nm). |
| Imaging dimension | 2D camera | Obtains a two-dimensional planar image of the object. |
| | 3D camera | Obtains three-dimensional spatial information about objects, such as depth values or point cloud data; includes binocular stereo cameras, structured light cameras, and ToF cameras. |
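Multispectral bands such as red and near-infrared are commonly combined into vegetation indices; the snippet below computes the widely used NDVI, (NIR − Red)/(NIR + Red), from two co-registered band images. It is a generic illustration: the synthetic reflectance values and array-based representation are assumptions, not tied to any specific sensor reviewed here.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Normalized Difference Vegetation Index from co-registered NIR and red bands."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)   # eps avoids division by zero

# Example with synthetic 4x4 reflectance patches (0-1 range):
nir_band = np.full((4, 4), 0.60)   # healthy vegetation reflects strongly in NIR
red_band = np.full((4, 4), 0.08)   # and absorbs most red light
print(ndvi(nir_band, red_band).mean())  # about 0.76, typical of dense canopy
```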
Table 10. Advantages and disadvantages of different imaging techniques.
| Imaging Technology Type | Advantages | Disadvantages |
| Visible-spectrum imaging | ① Intuitive and easy to understand; ② Low cost; ③ Strong real-time performance | ① Only covers visible light; ② Low spectral resolution; ③ Discontinuous wavelength bands |
| Multispectral imaging | ① Capable of acquiring partial non-visible light; ② Scene adaptability | ① Relatively low spectral resolution; ② Discontinuous wavelength bands |
| Hyperspectral imaging | ① Extremely high spectral resolution, capable of accurately identifying the composition of substances; ② Strong band continuity | ① Complex equipment; ② High cost; ③ Poor real-time performance; ④ High requirements for sensor calibration and the environment |
| 2D camera | ① Low cost; ② Strong real-time performance; ③ Intuitive data | ① Unable to perceive spatial information; ② Low precision |
| 3D camera | ① Supports 3D modeling; ② Good real-time performance; ③ Low cost | ① Limited accuracy; ② Short working distance; ③ Greatly affected by light and material |
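A 3D camera's depth image becomes a point cloud through the pinhole back-projection X = (u − cx)·Z/fx, Y = (v − cy)·Z/fy. The sketch below illustrates this conversion; the intrinsic values and image size are placeholders, not calibration data from any system discussed here.

```python
import numpy as np

def depth_to_points(depth: np.ndarray, fx: float, fy: float,
                    cx: float, cy: float) -> np.ndarray:
    """Back-project a depth image (metres) into an N x 3 camera-frame point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]                   # drop pixels with no depth return

# Example: a flat surface 1.5 m in front of a 640x480 camera (placeholder intrinsics).
depth_image = np.full((480, 640), 1.5)
cloud = depth_to_points(depth_image, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(cloud.shape, cloud[0])
```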
Table 11. Definition of each type of path-finding algorithm.
| Category | Definition | Common Algorithms | Applicability of Global Planning | Applicability of Local Planning |
| Graph Search-Based Algorithms | Abstract the road or work environment into a graph structure to determine the shortest path between a starting point and an ending point. | Dijkstra | Strong | Weak |
| | | A* | Strong | Weak |
| | | Hybrid A* | Relatively strong | Medium |
| | | JPS | Strong | Weak |
| | | D* | Medium | Relatively strong |
| Sampling-Based Algorithms | Generate a set of candidate paths and filter the optimal motion path by combining multiple constraint conditions. | PRM | Strong | Weak |
| | | RRT | Medium | Relatively strong |
| | | Informed RRT | Strong | Medium |
| | | Kinodynamic RRT* | Medium | Strong |
| Optimization-Based Algorithms | Transform the path planning problem into a mathematical optimization problem and search for the optimal solution through iteration. | CHOMP | Medium | Weak |
| | | STOMP | Medium | Medium |
| | | TEB | Weak | Strong |
| | | DT-TEB | Relatively strong | Strong |
| | | Evolutionary algorithm | Strong | Weak |
| | | Swarm intelligence | Strong | Medium |
| | | Artificial potential field (APF) | Weak | Relatively strong |
| | | Dynamic window algorithm (DWA) | Weak | Strong |
| Learning-Based Algorithms | Use data to train a model, enabling it to possess learning capabilities, thereby formulating path planning strategies. | Deep learning | Relatively strong | Weak |
| | | Reinforcement learning | Medium | Strong |
| | | Deep reinforcement learning | Relatively strong | Strong |
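As an example of the graph-search family in Table 11, the sketch below implements a minimal A* search on a 2-D occupancy grid with 4-connected moves and a Manhattan-distance heuristic. The grid, start, and goal are invented for illustration; practical planners for agricultural machinery add kinematic constraints on top of this idea (e.g., Hybrid A*).

```python
import heapq

def a_star(grid, start, goal):
    """Minimal A* on a 2-D occupancy grid (0 = free, 1 = obstacle)."""
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    open_set = [(heuristic(start), 0, start)]        # entries are (f, g, cell)
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, current = heapq.heappop(open_set)
        if current == goal:                           # reconstruct the path
            path = [current]
            while current in came_from:
                current = came_from[current]
                path.append(current)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):   # 4-connected moves
            nxt = (current[0] + dr, current[1] + dc)
            if not (0 <= nxt[0] < rows and 0 <= nxt[1] < cols):
                continue
            if grid[nxt[0]][nxt[1]] == 1:
                continue
            new_g = g + 1
            if new_g < g_cost.get(nxt, float("inf")):
                g_cost[nxt] = new_g
                came_from[nxt] = current
                heapq.heappush(open_set, (new_g + heuristic(nxt), new_g, nxt))
    return None                                        # no path found

# Example: plan around an obstacle row with a single gap.
field = [[0, 0, 0, 0],
         [1, 1, 1, 0],
         [0, 0, 0, 0]]
print(a_star(field, start=(0, 0), goal=(2, 0)))
```

The same search structure underlies Dijkstra (zero heuristic) and JPS (pruned neighbour expansion), which is why their global/local applicability in Table 11 is so similar.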
Table 12. Characteristics of each type of path-finding algorithm.
| Category | Advantages | Disadvantages |
| Graph Search-Based Algorithms | ① Capable of finding the global optimal path; ② Mature algorithmic theory, suitable for structured environments; ③ Strong interpretability | ① High computational complexity; ② Poor adaptability to dynamic environments; ③ Insufficient consideration of kinematic constraints |
| Sampling-Based Algorithms | ① Does not require explicit construction of the entire configuration space, suitable for unstructured environments; ② Adaptability to dynamic environments; ③ Integration of kinematic constraints | ① Sampling bias issue; ② Unstable path quality; ③ Difficulty in handling multiple constraints; ④ High dependency on environmental modeling |
| Optimization-Based Algorithms | ① Capable of handling multiple constraints; ② Strong path smoothness; ③ Adaptability to dynamic environments | ① High computational complexity; ② Risk of falling into local optima; ③ Weak global optimization capability; ④ Susceptible to environmental noise |
| Learning-Based Algorithms | ① Strong autonomous learning ability, suitable for complex scenarios; ② Adaptability to dynamic scenarios; ③ Possesses generalization potential | ① High data dependency; ② Heavy computational resource requirements; ③ Risk of overfitting; ④ Poor interpretability |
Table 15. Motor, hydraulic, and pneumatic actuators.
| Actuator Type | Characteristics | Applicable Scenarios | Ref. |
| Brushless DC Motors | ① Strong vibration resistance; ② Capable of supporting medium-to-high-speed operations | Farmland seeding | [249] |
| Brushless DC Motors | ① High adaptability; ② Fast dynamic response | Travel drive | [250] |
| Brushless AC Motors | ① Strong load capacity; ② High energy utilization efficiency | Steering and travel drive | [66] |
| Stepper Drive Motors | ① High control precision; ② Strong adaptability; ③ Poor stability in long-term control | Pesticide spraying | [251] |
| Stepper Drive Motors | ① Low design cost; ② High control precision; ③ Limited dynamic response; ④ Poor environmental adaptability | Fruit picking | [252] |
| Servo Motors | ① High control precision; ② Strong control flexibility; ③ High compatibility; ④ Complex system design | Crop harvesting | [253] |
| Hydraulic Drives | ① Short response time; ② High control precision; ③ Strong flexibility | Crop fertilization | [254] |
| Hydraulic Drives | ① High torque and load capacity; ② Fast response speed; ③ Limited control precision; ④ High system complexity | Steering drive | [255] |
| Pneumatic Drives | ① High picking success rate; ② Fast response speed; ③ High operational stability; ④ Limited precision | Fruit picking | [256] |
| Pneumatic Drives | ① Strong adaptability; ② Simple structure; ③ Fast response speed; ④ Insufficient stability and precision | Seedling transplanting | [257] |
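The drive and steering actuators in Table 15 are normally closed around a feedback controller, and a discrete PID loop is the most common starting point. The sketch below shows a generic velocity-tracking PID; the gains, the first-order plant model, and all names are illustrative assumptions rather than values from any cited machine.

```python
class PID:
    """Discrete PID controller with a simple anti-windup clamp on the integral."""
    def __init__(self, kp, ki, kd, dt, i_limit=10.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.i_limit = i_limit
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral = max(-self.i_limit,
                            min(self.i_limit, self.integral + error * self.dt))
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: regulate the wheel speed of a crude first-order drive model to 2.0 m/s.
dt = 0.02                         # 50 Hz control loop (assumed)
controller = PID(kp=1.2, ki=0.8, kd=0.02, dt=dt)
speed = 0.0
for _ in range(500):
    command = controller.step(setpoint=2.0, measurement=speed)
    # Assumed plant: speed approaches the commanded value with a 0.5 s time constant.
    speed += (command - speed) * dt / 0.5
print(round(speed, 3))            # settles near the 2.0 m/s setpoint
```

Fault-tolerant actuator control builds on such loops by monitoring the residual between the commanded and measured response and reconfiguring the controller when it grows abnormally.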
Table 17. Performance and benefits of research and commercial products in farmland.
| Type | Name | Performance and Benefits | Ref. |
| Research | Three-Device Rice Planting System | 0.3-hectare rice field transplanted in 56 min | [339] |
| | Intelligent seeding equipment | Seeding efficiency is 8–36 s per 2 feet | [344] |
| | Low-cost automated seeding system | Efficiency improved by 35%, with full terrain adaptability | [349] |
| | Bionic snake bone-arm robot | Spraying robotic arm can bend up to 115.7 degrees; spraying distance: 60 cm | [351] |
| | Independent variable fertilization robot | Leaf area extraction: >97% accuracy; height information extraction: >96% accuracy; navigation system errors: distance 5.598 cm, angle 0.2245°; spraying volume precision: avg. difference 0.46 mL | [356] |
| | Adaptive sprinkler irrigation robot | With a 5 L water tank, a complete watering cycle lasts approximately 2 min 30 s | [353] |
| | AgBot II | Detects and classifies weeds and sprays herbicides with over 90% success | [358] |
| | Self-operated laser weeding equipment | mAP reaches 88.5%; weed removal rate: 92.6%; seedling damage rate: 1.2% | [373] |
| | Self-operated laser weeding platform | Average error: 1.97 mm; hit rate: 97% | [376] |
| | Autonomous pest control agricultural robot | Greenhouse simulation leaf image capture success rate: 100%; real-world leaf image capture success rate: 53.6% | [377] |
| | Autonomous corn harvesting robot | Average cutting deviation: 0.063 m; grain loss rate: 0.76% | [389] |
| Commercial products | AgXeed | Support width: 1.8 to 3.0 m; integrated with GPS, sensors, and optical obstacle recognition | [338] |
| | SprayBox | Processes 20 times per second, with millimeter-level precision, reducing herbicide use by 95% | [22] |
| | EcoRobotix | Reduces herbicide use by 95% | [381] |
| | LaserWeeder | Reduces annual weeding costs by 80%; weeding rate: 99% | [375] |
| | Hyliq AG-130 | High-precision spraying system suitable for large-scale farmland | [362] |
| | EA 30X-Pro | Enables precise pesticide application within a 0.5 m radius, improving operational efficiency by over 30% | [363] |
Table 18. Types of autonomous picking equipment.
| Classification Criteria | Type | Example Reference |
| Number of mechanical arms | Single arm | [450] |
| | Multi-arm | [443,445] |
| Effector materials | Flexible effector | [451,452] |
| | Rigid effector | [290] |
| | Rigid–flexible coupling effector | [453] |
| Installation architecture | Serial type | [299,454] |
| | Parallel type | [455] |
| Picking method | Clamping and rotating | [456,457] |
| | Clamping and cutting | [458,459] |
| | Vacuum suction | [444,460] |
Table 19. Characteristics of autonomous picking equipment.
| Type | Advantages | Disadvantages |
| Single arm | ① Simple modeling; ② Good dexterity; ③ Low cost | ① Small workspace; ② Limited picking speed; ③ Low picking efficiency |
| Multi-arm | ① High efficiency; ② Large workspace; ③ High success rate of picking; ④ Strong environmental adaptability | ① Degree-of-freedom-controlling contradictions; ② Inefficient coordination; ③ Inflexible operational zone division; ④ Real-time planning latency |
| Flexible effector | ① High degree of freedom; ② High flexibility; ③ High security; ④ Strong fit | ① Inadequate actuation; ② Complex modeling; ③ Challenging control hinders precision and adaptability; ④ Low durability |
| Rigid effector | ① Accurate grasping; ② Reliable load; ③ High durability; ④ Strong stability | ① Poor flexibility; ② Poor compliance; ③ Limited degrees of freedom; ④ High fruit damage rate |
| Rigid–flexible coupling effector | ① High flexibility; ② Good buffering; ③ High strength; ④ High security; ⑤ Strong fit | ① Complex modeling; ② Vibration damping challenges |
| Serial type | ① Wide spatial coverage; ② Simple modeling; ③ Flexible movement path | ① Low load ratio; ② Large inertia |
| Parallel type | ① High rigidity; ② Low inertia; ③ Large load-bearing capacity; ④ High precision | ① Complex modeling; ② Low flexibility; ③ Poor adaptability |
| Clamping and rotating | ① High success rate of fruit separation; ② High intact rate of fruit stem; ③ Short picking cycle | ① Susceptible to environmental factors; ② Poor adaptability to fruit shape and size; ③ High equipment damage rate |
| Clamping and cutting | ① Low fruit damage rate; ② Unaffected by fruit posture | ① Low fault tolerance; ② High requirements for the positioning algorithm |
| Vacuum suction | ① Effective for picking smooth and light fruits; ② No requirements for high-precision fruit positioning; ③ High picking speed; ④ Unaffected by fruit growth direction | ① Heavy picking machine; ② Difficulty in parameter design |
Table 20. Performance and benefits of research and commercial products in orchards.
| Type | Name | Performance and Benefits | Ref. |
| Research | Multi-task robot transplanting unit | Overall success rate: 90% | [405] |
| | Three-degree-of-freedom parallel transplanting robot | Transplanting success rate of 95.3% at an acceleration of 30 m/s² | [409] |
| | Multi-sensor-detection transplanting system | Average success rate of 97.3% at high-speed planting frequency | [410] |
| | Machine vision transplanting system | Reduces missed seeding rate by 9.91% and increases seedling robustness score by 18.92% | [411] |
| | Computer vision-based pruning system | Total time to prune one vine is 2 min, similar to human pruning, and can be further reduced with a faster arm | [418] |
| | Planning and control framework for fruit tree pruning | Average cutting duration: 13 s; success rate: 75% | [419] |
| | Autonomous grape pruning system | The system accurately captures 85% of the grapevine cane structures | [413] |
| | Fruit tree pruning recommendation framework | Improves light distribution by 25.15% over conventional pruning and 15% over commercial pruning | [461] |
| | Drone monitoring system | Effectively monitors olive grove soil erosion, achieving 93% vegetation identification accuracy and 91% bare soil accuracy | [429] |
| | Drone-based hyperspectral monitoring | Classification accuracy for citrus canker: 94–100% | [430] |
| | Low-cost ground monitoring robot | Plant height (vs. manual): RMS < 0.5 cm, MSE 2.36 cm; temperature (vs. Bradford Weather Station): avg. RMS error < 5 °C (all sensors) | [437] |
| | Autonomous kiwi fruit harvesting robot | Kiwi fruit detection rate: 89.6%; harvesting success rate: 51.2%; average efficiency: 5.5 s per fruit | [443] |
| | Autonomous strawberry harvesting robot | Harvesting success rate: 49.30% (thinned natural environment), 30.23% (unpruned natural environment); average harvesting speed: 7 s/fruit (single arm), 4 s/fruit (dual arm) | [442] |
| Commercial products | EVE robot | Harvests hard fruits such as apples and peaches using negative pressure adsorption | [444] |
| | FFRobotics | Twelve collaborative robotic arms achieve a picking efficiency of 1.8 s per fruit, boosting productivity 10-fold | [445] |
| | FAR orchard harvesting drone | One robot can cover one hectare of land and operate uninterrupted | [447] |
Table 21. Performance and benefits of research and commercial products in livestock and poultry breeding.
| Type | Name | Performance and Benefits | Ref. |
| Research | Autonomous rabbit feeding robot | Horizontal navigation deviation: 5.3 mm; vertical deviation: 7.6 mm; feed quantification error: 4.3% | [464] |
| | Autonomous cattle feeding vehicle | Reduces labor time by 25% compared to traditional manual feeding | [471] |
| | Automatic manure cleaning robot | Reduces clinical mastitis incidence in herds by 2.42% | [493] |
| | Universal robotic scraper | Cleaning area: 420 m²/h; manure removal: 1.4 kg/m²; cleanliness maintained for 6 h; saves 30 L of water per cleaning cycle | [494] |
| | Autonomous disinfection vehicle | At a flow rate of 1200 mL/min, average droplet diameter: 231.09 μm and deposition density: 186 drops/cm²; fully meets disinfection requirements | [490] |
| | Dead chicken cleaning robot | Dead chicken recognition accuracy: 92.54% | [499] |
| | Intelligent mobile egg collection robot | Egg recognition rate: 94.7–97.6%; egg collection time (automated vs. manual): corners 6.61–8.62 min (vs. 0.53 min), central 3.62–4.6 min (vs. 0.25 min), scattered 8.53–9.49 min (vs. 0.81 min) | [503] |
| | PoultryBot | Maximum navigation distance: 3000 m; autonomously avoids moving hens; egg collection success rate: 46% | [504] |
| | Faster R-CNN-based egg collection system | Egg collection success rate: 91.6% | [100] |
| Commercial products | Lely Juno | Integrated UWB and Bluetooth communication; pushes feed 5–6 times daily | [467] |
| | ROVER rail feeding robot | Precisely follows preset tracks; automatically returns to charging station | [468] |
| | XO multi-task autonomous robot | Collects environmental data (temperature, humidity, CO2, light, etc.); performs cleaning and litter management functions | [481] |
| | INATECO autonomous litter robot | Accurately identifies wet litter areas via infrared thermography and multi-sensor technology | [496] |
| | Poultry Patrol autonomous inspection robot | Monitors and identifies sick or dead chickens using various cameras | [483] |
| | Sensyn Robotics inspection robot | Dead chicken detection rate: 93%; false-positive rate: 0.3% | [485] |
| | Rabaud high-pressure cleaning robot | Cleaning height up to 5 m; comprehensive cleaning with no blind spots | [498] |
| | Birds Eye Robotics dead bird disposal robot | Perceives and monitors the environment; removes dead chickens using a rotating shovel structure | [495] |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
