Review

A Review of Key Technological Developments in Autonomous Unmanned Operation Systems for Agriculture in China

1 National Engineering Research Center for Information Technology in Agriculture, Beijing 100097, China
2 School of Information Science and Engineering, Shandong Agricultural University, Taian 271018, China
3 Research Center of Information Technology, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
4 Key Laboratory of Digital Village Technology, Ministry of Agriculture and Rural Affairs, Beijing 100097, China
5 College of Water Resources and Civil Engineering, China Agricultural University, Beijing 100083, China
* Authors to whom correspondence should be addressed.
These authors contributed equally to this work.
AgriEngineering 2025, 7(3), 71; https://doi.org/10.3390/agriengineering7030071
Submission received: 22 January 2025 / Revised: 28 February 2025 / Accepted: 4 March 2025 / Published: 6 March 2025

Abstract:
Smart agricultural machinery is built upon traditional agricultural equipment, further integrating modern information technologies to achieve automation, precision, and intelligence in agricultural production. Currently, significant progress has been made in the autonomous operation and monitoring technologies of smart agricultural machinery in China. However, challenges remain, including poor adaptability to complex environments, high equipment costs, and issues with system implementation and standardization integration. To help industry professionals quickly understand the current state and promote the rapid development of smart agricultural machinery, this paper provides an overview of the key technologies related to autonomous operation and monitoring in China’s smart agricultural equipment. These technologies include environmental perception, positioning and navigation, autonomous operation and path planning, agricultural machinery status monitoring and fault diagnosis, and field operation monitoring. Each of these key technologies is discussed in depth with examples and analyses. On this basis, the paper analyzes the main challenges faced by the development of autonomous operation and monitoring technologies in China’s smart agricultural machinery sector. Furthermore, it explores the future directions for the development of autonomous operation and monitoring technologies in smart agricultural machinery. This research is of great importance for promoting the transition of China’s agricultural production towards automation and intelligence, improving agricultural production efficiency, and reducing reliance on human labor.

1. Introduction

Agricultural mechanization is the main pathway and fundamental direction for the development of modern agriculture, while the intelligence of agricultural machinery is a key task and advanced goal in improving the current level of agricultural mechanization. With the continuous advancement of agricultural mechanization and informatization, China has now entered a critical stage of accelerating the transition from traditional to modern agriculture. Driven by intelligent technologies, the automation, precision, and efficiency of agricultural machinery have significantly enhanced the efficiency and quality of agricultural production. The efficiency of smart agricultural machinery is 50% to 60% higher than that of conventional machinery, and during the crop planting phase, the use of smart agricultural machinery can increase per-acre yield by at least 5% to 10%. Although the total number of agricultural machines in China has reached 200 million units, with a total power output of 1.1 billion kilowatts and an overall mechanization rate of over 74%, high-end intelligent agricultural machinery accounts for less than 10% of the total fleet and remains primarily reliant on imports. At present, China faces issues such as an aging rural labor force, labor shortages in rural areas, and an imbalance between supply and demand in the agricultural labor market. To achieve modernization, labor reduction, and automation in agricultural production, the development of agricultural machinery intelligence is crucial [1].
The “14th Five-Year Plan” points out that the development of smart agriculture is a key direction for China’s continuous high-quality agricultural development. As a key component of smart agriculture, intelligent agricultural machinery plays a role in improving the quality and efficiency of agricultural production, which helps enhance the precision and intelligence of agricultural practices [2]. By fully integrating the new generation of information technology into agricultural mechanization and utilizing advanced information technologies and sensors to upgrade traditional agricultural machinery, it promotes a transformation in agricultural machinery operations, significantly improving production efficiency and reducing reliance on manual labor [3]. Autonomous operation and monitoring are the core features of intelligent agricultural machinery. These systems autonomously plan paths for agricultural operations while intelligently perceiving and precisely monitoring their own status and operational processes. The level of development in these technologies determines the degree of intelligence in agricultural machinery. Currently, China continues to increase investment in the research and development of autonomous operation and monitoring technologies for intelligent agricultural machinery. Significant progress has been made in this field, with initial results in precise operations and intelligent management across various stages of agricultural production. However, there remains a gap compared to advanced international levels, and further breakthroughs are urgently needed.
Some countries began the process of agricultural modernization early and have developed a higher level of intelligent agricultural machinery, with well-established research systems for autonomous operation and monitoring technologies. Internationally, the primary focus for achieving the autonomous operation of agricultural machinery is research on key technologies such as obstacle avoidance and path planning [4]. In Italy, Reina et al. [5] developed a multi-sensor perception system to enhance the environmental awareness of agricultural vehicles in farmland. By integrating various onboard sensor technologies, including stereo vision, LiDAR, radar, and thermal imaging, the system automatically detects obstacles and distinguishes between passable and impassable areas. In Germany, Höffmann et al. [6] divided complex agricultural areas into simpler units, each equipped with a guide rail, forming a fixed track system. The subsequent stages of route planning and smooth path planning calculate a path that adheres to path constraints, optimally navigating through the units while aligning with the track system. This laid a solid foundation for accurate and efficient agricultural coverage path planning. Multi-sensor perception provides real-time environmental data, allowing the obstacle avoidance system to take obstacles into account during path planning and optimize the route. Path planning technology dynamically adjusts the working route based on real-time environmental data, ensuring that agricultural machinery not only avoids obstacles in complex farmland environments but also completes tasks autonomously and efficiently.
In the process of operation monitoring, the focus is on integrating artificial intelligence technology with agricultural machinery, applied to various agricultural processes such as tilling, planting, and harvesting [7]. Intelligent operation monitoring aims to improve the accuracy and efficiency of field operations, thereby increasing crop yields and promoting the advancement of agricultural machinery equipment [8,9]. In Iran, Karimi et al. [10] designed and built a new seeder monitoring system based on an infrared seed sensor developed earlier. Hall-effect sensors measure ground speed, and location information provided by a GPS module is used to record seeding data. Ultrasonic sensors continuously measure the seed and fertilizer levels in the hopper. In South Korea, Kim [11] employed various image processing techniques, including Hough transformation, hue-saturation-value (HSV) color space conversion, image morphological techniques, and Gaussian blur, to accurately track seeding rates and locate missed seeding areas in mechanical seeders with planting trays. The integration of multi-sensor and image processing technologies enhances monitoring capabilities during the operation process, ensuring the accuracy of fieldwork under various environmental conditions. This provides more reliable data support for agricultural production.
This paper reviews the current development status of key technologies in the autonomous operation and monitoring of intelligent agricultural machinery in China. It provides a comprehensive analysis of the main research achievements across the entire process, from environmental perception technology, positioning and navigation technology, autonomous operation and path planning technology to agricultural machinery status monitoring and fault diagnosis technology, as well as the monitoring technologies for tilling, planting, crop protection, and harvesting operations in the field. The system architecture diagram is shown in Figure 1. By examining these key technologies, the paper discusses the issues that still need further research and solutions. It also looks ahead to the development trends of autonomous operation and monitoring technologies, offering suggestions for future directions toward developing intelligent agricultural machinery in China.

2. Environmental Perception Technology

With the rapid development of agricultural mechanization and automation, the demand for agricultural machinery to achieve autonomous operation in complex and variable field environments is increasing. Environmental perception technology, as a critical component of unmanned agricultural machinery in achieving autonomous operation, directly impacts operational accuracy. Environmental perception mainly involves obstacle detection and recognition, as well as crop row detection.

2.1. Obstacle Detection and Recognition

With the development of intelligent agricultural machinery, automatic navigation technology has been widely applied in production. To ensure the safe operation of agricultural machinery, it is essential to continuously detect and identify obstacles, ensuring the reliability and safety of autonomous operation. Obstacle detection in agricultural machinery primarily relies on technologies such as Light Detection and Ranging (LiDAR), millimeter-wave radar, ultrasonic radar, and visual sensors [12]. LiDAR provides a large amount of distance information through relatively high-frequency signals and can operate day and night. Millimeter-wave radar, with its longer wavelength, has stronger penetration capabilities, especially through smoke, and can measure not only the precise distance to a target but also its relative speed. Ultrasonic radar is suitable for low-speed, short-range sensing. Visual sensors automatically detect obstacles from images and videos, using the corresponding algorithms to determine the location of the target obstacle [13]. In practical applications, a single sensor cannot guarantee reliable results in every scenario, so most researchers combine multiple sensors and perform obstacle detection through sensor fusion. The specific process is shown in Figure 2.
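As a minimal illustration of the fusion idea described above (not the implementation from any cited work), the sketch below associates LiDAR range clusters with camera detections by bearing, so each obstacle ends up with both a distance and a class label. The function name, input formats, and the 5° matching threshold are assumptions chosen for clarity.

```python
def fuse_detections(lidar_clusters, camera_detections, max_bearing_diff_deg=5.0):
    """Associate LiDAR range clusters with camera class labels by bearing.

    lidar_clusters: list of (bearing_deg, range_m) from LiDAR clustering.
    camera_detections: list of (bearing_deg, class_label) from a vision model.
    Returns fused obstacles as (bearing_deg, range_m, class_label); clusters
    with no visual match within the threshold are labelled "unknown".
    """
    fused = []
    for lidar_bearing, rng in lidar_clusters:
        best_label, best_diff = "unknown", max_bearing_diff_deg
        for cam_bearing, label in camera_detections:
            diff = abs(lidar_bearing - cam_bearing)
            if diff <= best_diff:           # keep the closest bearing match
                best_label, best_diff = label, diff
        fused.append((lidar_bearing, rng, best_label))
    return fused
```

For example, a person detected by the camera at a bearing of 1° would be attached to the LiDAR cluster at 0°, while a cluster at 30° with no visual match keeps the label "unknown".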
Li Xinghao [14] proposed an obstacle detection and recognition system that integrates LiDAR and machine vision. The LiDAR sensor determines information such as the distance, location, and size of obstacles, while the visual camera identifies the number and category of obstacles. The complementary functions of these two sensors provide a solution for obstacle detection in agricultural fields. Lv Youhao et al. [15] proposed a multi-modal information fusion Transformer architecture, with the model structure shown in Figure 3. This combines the proprioceptive states from robot state estimation, inertial measurement unit readings, and motor encoders with inputs from visual cameras and LiDAR. This approach enables the complementary and collaborative reasoning of three types of modalities. By utilizing end-to-end reinforcement learning, the robot is trained to autonomously avoid obstacles and traverse dynamically unknown complex terrains.

2.2. Crop Row Detection

Crop row detection is crucial for the precise operation of agricultural machinery, with its core being crop row identification and row control. Unmanned agricultural machinery achieves tasks such as precise inter-row weeding, spraying, and harvesting by detecting the position and direction of crop rows and continuously adjusting lateral offsets. The main sensors used for this purpose are visual sensors and LiDAR, with common methods primarily relying on vision-based image processing techniques and multi-sensor fusion for crop row detection.
Vision-based image processing techniques primarily identify crop rows by analyzing features such as the color and texture in the images. With the development of machine learning and deep learning technologies, learning crop row features through pre-training using large datasets has become an important method for crop row detection. Li Hongbo et al. [16] proposed a crop row detection method based on YOLOv8-G. The YOLOv8-G object detection algorithm is used to extract the central position of maize seedlings, followed by clustering analysis using an affinity propagation clustering algorithm. Finally, the crop rows are fitted using the least squares method. This detection method is capable of quickly and accurately identifying crops in the field and simulating the target crop rows, even under complex lighting conditions and interference from weeds.
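The detect–cluster–fit pipeline described above can be sketched as follows. This is an illustrative simplification rather than the cited YOLOv8-G system: detection is assumed to have already produced seedling centre coordinates, and a simple x-gap grouping stands in for the affinity propagation clustering step; only the final least-squares line fit matches the cited method directly.

```python
import numpy as np

def fit_crop_rows(centers, row_gap=0.4):
    """Group seedling centre points into rows and fit a line to each.

    centers: (N, 2) array of (x, y) seedling positions, x across rows.
    row_gap: assumed minimum x spacing separating adjacent rows (a stand-in
    for the clustering step in the cited work).
    Returns a list of (slope, intercept) pairs for x = slope * y + intercept.
    """
    centers = np.asarray(centers, dtype=float)
    order = np.argsort(centers[:, 0])          # sort points across the rows
    rows, current = [], [order[0]]
    for i, j in zip(order[:-1], order[1:]):
        if centers[j, 0] - centers[i, 0] > row_gap:
            rows.append(current)               # large x gap starts a new row
            current = []
        current.append(j)
    rows.append(current)
    # Least-squares fit x(y) per row; rows are assumed to run roughly along y.
    return [tuple(np.polyfit(centers[idx, 1], centers[idx, 0], 1))
            for idx in rows if len(idx) >= 2]
```

Fitting x as a function of y (rather than y of x) keeps the fit stable for near-vertical rows, which is the typical geometry when the camera looks along the driving direction.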
The fusion of LiDAR and vision can effectively capture both the three-dimensional and color information of target objects, addressing the problem of poor adaptability associated with single sensors. Jiang Qing et al. [17] proposed a corn crop row detection method based on the fusion of solid-state LiDAR and an RGB camera. The method involves preprocessing the acquired corn crop row images and point cloud data. The preprocessed image and point cloud data are then fused to achieve “point cloud coloring”. Using the “colored” point cloud, clustering is performed, and the crop planting agronomic standards (such as row spacing) are incorporated to validate the usability of the point cloud information and color information. The optimal data are selected to complete the clustering of the crop row region of interest. Finally, the clustering of the target point cloud’s feature points is determined by dividing the point cloud into horizontal strips, from which the crop row feature points are extracted. The crop row detection line is then fitted using the least squares method.
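The "point cloud coloring" step above can be illustrated with a standard pinhole projection: each LiDAR point is projected into the camera image and picks up the RGB value at that pixel. This is a generic sketch, not the cited implementation, and it assumes the extrinsic calibration has already transformed the points into the camera frame.

```python
import numpy as np

def colorize_point_cloud(points, image, K):
    """Project LiDAR points into a camera image and attach RGB values
    ("point cloud coloring"). Assumes points are already expressed in the
    camera frame; K is the 3x3 camera intrinsic matrix.

    Returns an (M, 6) array [x, y, z, r, g, b] for points that land inside
    the image with positive depth.
    """
    pts = np.asarray(points, dtype=float)
    valid = pts[:, 2] > 0                      # keep points in front of camera
    uvw = (K @ pts[valid].T).T                 # pinhole projection
    u = (uvw[:, 0] / uvw[:, 2]).astype(int)    # pixel column
    v = (uvw[:, 1] / uvw[:, 2]).astype(int)    # pixel row
    h, w = image.shape[:2]
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    rgb = image[v[inside], u[inside]]
    return np.hstack([pts[valid][inside], rgb])
```

The colored points can then be clustered jointly on geometry and color, as in the row-spacing validation step described above.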

3. Positioning and Navigation Technology

Precise positioning and stable navigation are key to enabling the autonomous operation of agricultural machinery. They mainly involve position tracking, path planning, and motion control. To improve positioning and navigation accuracy, technologies such as the Global Navigation Satellite System (GNSS), Inertial Navigation Systems (INSs), differential positioning, laser navigation, visual navigation, and multi-sensor information fusion are commonly used. At the same time, with the continuous development and integration of these technologies, the positioning accuracy and navigation capability of agricultural machinery have been steadily improving. Agricultural machinery positioning and navigation technology has become a research focus in efforts to achieve automation and intelligence in agricultural production.

3.1. Application of Global Navigation Satellite System

The GNSS is a technology system that provides precise positioning, navigation, and time synchronization services through multiple satellite platforms. The GNSS offers all-weather, all-day absolute position and heading information, and it is commonly applied in agriculture for outdoor large-scale navigation path searching, fine land leveling, positioning and speed measurement, and seeding depth control. However, GNSS signal loss due to extreme weather conditions or obstructions limits its application in complex farmland environments. In practical operations, decimeter- or even centimeter-level positioning accuracy is often required. To meet these demands, Real-Time Kinematic (RTK) technology, based on carrier phase observations, is commonly used to enable fine operations in the autonomous driving of agricultural machinery.
Liu Zhaopeng et al. [18] developed an autonomous navigation system based on the GNSS, designing a straight path tracking control algorithm with position and heading deviation as the state variables. A curved path tracking control algorithm was also developed based on the pure pursuit model, enabling automatic spraying operations with minimal human intervention. The system exhibits excellent stability and control accuracy, meeting the spraying requirements for both paddy and dryland environments. Zhou Jun et al. [19] designed a GNSS-based intelligent rice field rotary tilling and leveling machine. GNSS antennas were installed at both ends of the equipment to obtain elevation positioning and pitch angle information. Field trials were conducted in rice fields and under various soil conditions. The leveling accuracy after soil preparation reached approximately 3 cm, with the system showing good stability and adaptability, fully meeting the agronomic requirements for rice planting. Chen Yun et al. [20] calculated the desired steering angle of the steering wheel using GNSS attitude information and vehicle motion models. They used the steering motor speed and the transfer model of the fully hydraulic steering valve to estimate changes in the steering angle. Through Kalman filtering fusion, they obtained the real-time steering angle. During autonomous driving, the navigation error for straight-line operations was less than 2.5 cm, and for curved path operations, the navigation error was less than 9.0 cm, meeting the requirements for applications such as automatic headland turns in agricultural machinery. Xu Qimeng et al. [21] applied GNSS positioning and speed measurement technology, along with seeding depth control technology, to achieve automation in the seeding process. They implemented the walking and steering of the mobile platform through the CAN bus, multi-motor synchronous drive technology, and a combination of differential and electric push-rod steering systems. 
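The pure pursuit model used for curved path tracking in the work above has a standard closed form, sketched below. This is the textbook formulation (rear-axle reference point, look-ahead point on the reference path), not code from the cited system.

```python
import math

def pure_pursuit_steering(rear_axle_xy, heading, goal_xy, wheelbase):
    """Pure pursuit steering law: compute the front-wheel angle that drives
    the rear axle onto a circular arc through the look-ahead goal point.

    rear_axle_xy: (x, y) of the rear axle; heading: vehicle yaw in radians;
    goal_xy: look-ahead point on the reference path; wheelbase: L in metres.
    Returns the steering angle delta in radians (positive = left).
    """
    dx = goal_xy[0] - rear_axle_xy[0]
    dy = goal_xy[1] - rear_axle_xy[1]
    alpha = math.atan2(dy, dx) - heading   # angle to the look-ahead point
    ld = math.hypot(dx, dy)                # look-ahead distance
    # Classic pure pursuit law: delta = atan(2 L sin(alpha) / l_d).
    return math.atan2(2.0 * wheelbase * math.sin(alpha), ld)
```

A goal point straight ahead yields a zero steering angle, while a goal to the left or right of the heading produces a correspondingly signed command; the look-ahead distance trades off tracking accuracy against smoothness.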
Gui Zhenliang et al. [22] addressed the “delay” issue of low-cost single-point GNSS receivers during acceleration and deceleration speed measurements. Without relying on multi-sensor fusion for speed measurement, they used a probabilistic statistical time method to compensate for speed during the initial acceleration phase and determine the deceleration and stopping phase, thereby reducing seeding lag distance during the initial acceleration of the wheat planter and preventing over-seeding during deceleration and stopping.

3.2. Integration of Inertial Navigation Systems

The GNSS, as an absolute positioning technology, has no cumulative errors and maintains high long-term accuracy. However, during certain semi-open field operations, GNSS signals may be temporarily lost due to obstructions from trees or crops, making it difficult to accurately obtain heading information. This can reduce the accuracy of agricultural machinery’s positioning and navigation, potentially causing harm to crops and workers. An INS is a navigation system that uses an Inertial Measurement Unit (IMU) to determine the position, velocity, and orientation of an object. With its relative positioning characteristics, the INS offers advantages such as high short-term accuracy, high output frequency, minimal susceptibility to external interference, and strong real-time capabilities, and it has been widely applied in agricultural machinery. However, its errors accumulate over time, so the INS is often integrated with the GNSS to form a GNSS/INS hybrid navigation system. This integration combines the absolute positioning of the GNSS with the relative positioning of the INS, thereby improving the robustness of the system.
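A minimal one-dimensional sketch of such a loosely coupled GNSS/INS filter is shown below: the INS propagates the state by integrating acceleration (prediction), and each GNSS position fix corrects the accumulated drift (Kalman update), with the filter coasting on the INS during GNSS dropouts. The noise parameters and time step are illustrative assumptions, not values from any cited system.

```python
import numpy as np

def gnss_ins_loosely_coupled(ins_accel, gnss_pos, dt=0.1, r=1.0, q=0.01):
    """Minimal 1-D loosely coupled GNSS/INS fusion sketch.

    ins_accel and gnss_pos are equal-length sequences; a GNSS entry of None
    models a signal dropout. State x = [position, velocity].
    Returns the filtered position track.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity model
    B = np.array([0.5 * dt**2, dt])         # acceleration input mapping
    H = np.array([[1.0, 0.0]])              # GNSS observes position only
    x = np.zeros(2)
    P = np.eye(2)
    track = []
    for a, z in zip(ins_accel, gnss_pos):
        x = F @ x + B * a                   # INS mechanisation (prediction)
        P = F @ P @ F.T + q * np.eye(2)
        if z is not None:                   # GNSS correction when available
            S = H @ P @ H.T + r
            K = (P @ H.T) / S               # Kalman gain
            x = x + (K * (z - H @ x)).ravel()
            P = (np.eye(2) - K @ H) @ P
        track.append(x[0])
    return track
```

During a dropout the estimate drifts with the double-integrated INS errors, which is exactly the behaviour the GNSS corrections are there to bound.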
In remote mountainous areas, orchards, and other environments where satellite signals are easily interfered with or obstructed, it is often impossible to reliably receive satellite signals. Zhang Jing et al. [23] proposed an INS/GNSS heading information fusion strategy based on GNSS signal quality and changes in heading angle, aiming to enhance the accuracy of heading angle determination in agricultural machinery’s automatic navigation system. This strategy improves the system’s ability to withstand environmental disturbances during actual field operations. Qiu Quan et al. [24] introduced a GNSS/INS integrated positioning Kalman filtering algorithm with an adaptive coefficient. When the GNSS is unable to perform differential positioning or when positioning data undergoes a sudden jump, adaptive Kalman filtering allows the system to switch to INS positioning, thereby achieving optimal estimation of the agricultural machine’s position and posture. Zhong Yin [25] utilized the three-axis attitude angular velocity and acceleration data from the INS, along with the three-axis position and velocity information from the GNSS, to adopt a loosely coupled approach and correct INS errors in real-time through Kalman filtering. This method enables the precise calculation of the agricultural machine’s position, speed, and attitude. Feng Shuang et al. [26] proposed a real-time angle computation system based on GNSS/INS fusion technology, addressing the issues of insufficient accuracy and complex installation and calibration commonly found in traditional angle sensors used for measuring steering wheel deflection in tractors. Tests conducted in agricultural fields on the tractor’s steering and straight-line performance confirmed that the system meets the required accuracy for automatic navigation.

3.3. Laser Navigation Technology

Laser navigation technology is characterized by high positioning accuracy, immunity to electromagnetic interference, and the ability to operate continuously at any time of day. Simultaneous Localization and Mapping (SLAM) with laser sensors (laser SLAM) is widely used because it is minimally affected by environmental conditions. Map construction methods can be divided into those based on 2D LiDAR and those based on 3D LiDAR. In the absence of prior environmental information, laser navigation technology uses LiDAR sensors to collect environmental data. Through SLAM techniques, it constructs an environmental map and plans driving routes and operational paths within the map, enabling the real-time positioning of agricultural machinery [27]. The specific method involves using LiDAR to scan the environment and collect three-dimensional point cloud data. Sensor pose estimation is achieved by registering each new point cloud against a map of point clouds. Point cloud registration first associates the current point cloud with the map point cloud, and then iteratively optimizes the pose by minimizing the overall distance from the points to the geometric primitives. The data association is based on the similarity of local geometric features in the point cloud, and the distance measurement is weighted through uncertainty analysis. Each time new point cloud data are obtained, the SLAM system updates the map [28].
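The associate-then-optimize registration loop described above can be illustrated with a toy 2-D ICP-style sketch: nearest-neighbour data association followed by a closed-form (Kabsch/SVD) rigid alignment, iterated until the pose settles. Real laser SLAM systems add feature-based association, uncertainty weighting, and map management; this sketch only shows the core geometry.

```python
import numpy as np

def register_scan(scan, map_pts, iters=10):
    """Toy 2-D scan-to-map registration in the spirit of ICP.

    scan, map_pts: (N, 2) and (M, 2) point arrays.
    Returns (R, t) such that scan @ R.T + t aligns with the map.
    """
    scan = np.asarray(scan, dtype=float)
    map_pts = np.asarray(map_pts, dtype=float)
    R, t = np.eye(2), np.zeros(2)
    for _ in range(iters):
        moved = scan @ R.T + t
        # Data association: nearest map point for each scan point.
        d2 = ((moved[:, None, :] - map_pts[None, :, :]) ** 2).sum(-1)
        matched = map_pts[d2.argmin(axis=1)]
        # Closed-form rigid alignment (Kabsch/SVD) of the matched pairs.
        mu_s, mu_m = moved.mean(0), matched.mean(0)
        U, _, Vt = np.linalg.svd((moved - mu_s).T @ (matched - mu_m))
        dR = Vt.T @ U.T
        if np.linalg.det(dR) < 0:           # guard against reflections
            Vt[-1] *= -1
            dR = Vt.T @ U.T
        dt = mu_m - dR @ mu_s
        R, t = dR @ R, dR @ t + dt          # accumulate the incremental pose
    return R, t
```

With a good initial pose (small displacement between scan and map), the nearest-neighbour associations are correct and the loop recovers the rigid transform in one or two iterations; poor initialization is where feature-based association becomes necessary.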
Wang Yanli et al. [29] applied a laser scanner in greenhouses to collect plant position data for precise localization and real-time monitoring, designing a navigation path. They implemented a PID controller to manage the robot’s movement along the fitted path, achieving autonomous navigation. Ni Jiangnan [30] used a laser rangefinder to acquire information about paths, crops, and obstacles within the field of view. Based on the features of the ridges and crop areas, the path edges were detected to generate a driving route, and an incremental PID controller was employed to adjust the direction of the agricultural machinery. Hu Lian et al. [31] used a 2D LiDAR to emit scanning lasers to obtain point cloud data for the receiver on the agricultural robot, while the laser receiver sensed the scanning laser. By integrating the time difference of the scanned lasers and the features of the laser receiver point clouds, they achieved the accurate localization of the agricultural robot. Liu Yang et al. [32] proposed a positioning method for agricultural robots that integrates 3D LiDAR and IMU data. They processed the point cloud data from the LiDAR using an angle-based clustering technique and combined it with a 3D Normal Distributions Transform (3D-NDT) localization algorithm, enabling real-time localization with LiDAR.

3.4. Visual Navigation Technology

With the continuous development of machine vision and computer vision, significant progress has been made in visual navigation technology in China. Visual navigation technology utilizes visual sensors to acquire images and other environmental data of agricultural machinery, helping to determine the optimal path and actions by analyzing and recognizing landmarks, crops, and obstacles in farmland. It enables precise positioning and autonomous navigation in complex environments, reducing the reliance on manual control and realizing autonomous and intelligent operations of agricultural machinery. Visual navigation technology becomes particularly important in environments where GNSS signals are weak or unavailable. However, due to factors such as crop growth conditions, complex farmland environments, and response speed, it often requires integration with satellite positioning and real-time sensory information from other sensors. Visual SLAM, primarily using image data collected by cameras, has developed rapidly thanks to the early maturity of camera technology. In the early research of visual SLAM, academia focused on geometry-oriented algorithms based on filtering and optimization techniques. Extended Kalman Filtering (EKF) and graph optimization methods were once central to research in this field. These visual SLAM techniques rely on carefully designed visual feature extraction and complex geometric models, which are often suitable for textured and structurally clear environments. With the rapid advancement of deep learning technology, visual SLAM methods combining neural networks have gradually become the forefront of research [33]. To highlight the advantages and disadvantages of laser navigation and visual navigation technologies, a comparative analysis of the two technologies is shown in Table 1.
Satellite navigation, in conjunction with the mobile terminals on agricultural machinery and fixed ground base stations, achieves precise differential positioning. Visual navigation technology is then used to correct the driving direction and avoid obstacles in the field. This combination integrates the speed of satellite navigation with the maneuverability of visual navigation, resulting in efficient and accurate navigation for agricultural machinery. Li Yunwu et al. [34] used RTK-GNSS for road network information collection, real-time positioning, and path planning. They applied machine vision to identify field roads and extract navigation lines. In terms of image processing, they enhanced road shadow handling and information fusion, achieving accurate segmentation of roads and background. Zhou Yanqiu et al. [35] analyzed and designed the relevant processes for controlling agricultural machinery movement through the visual system. They also developed a navigation function design that integrates visual technology with satellite systems, as well as the design and optimization of electro-hydraulic control technology and functions under visual navigation. Li Junxia et al. [36], using a grape harvesting robot as a model, proposed a specific method for agricultural product harvesting based on visual SLAM and visual navigation information processing technology, building upon the agricultural robot controller software architecture. Experimental results showed that, compared to traditional laser navigation algorithms, visual navigation technology provides higher positioning and navigation accuracy. Wang Dong et al. [37] designed a mountainous orchard drone equipped with a flight control system, GNSS mobile station, RGB camera, wireless video module, and electronic compass. The ground station includes a GNSS base station, flight control module, portable computer, and video reception equipment. 
By integrating the OpenCV library, they developed an orchard row recognition algorithm, using linear combination and least squares methods to extract the orchard row trend lines and calculate the yaw angle, enabling precise flight path control for the drone. Mao Wenju et al. [38], in order to address the limitations of a single navigation mode and the inability to start or stop at will in post-harvest apple transport equipment, designed a small orchard transport robot with dual navigation modes. The robot can select either pedestrian-guided navigation or fixed-point navigation based on the requirements. The pedestrian-guided navigation uses a target tracking control method based on OpenPose human posture recognition, while the fixed-point navigation employs a distance-direction control method based on RTK-GNSS. Yang Shengyu et al. [39], based on GNSS/INS combined positioning, proposed a multi-sensor combined positioning method for a rice transplanter, supplemented by a visual navigation system. The overall scheme is shown in Figure 4. This method effectively reduces positioning errors and tracking deviations in the navigation system.

3.5. Multi-Sensor Fusion Positioning

To improve navigation accuracy and reliability, and to overcome the limitations of single sensors in complex climate and terrain conditions in agricultural environments, multi-sensor fusion is commonly employed. Common methods include information fusion algorithms such as Kalman filtering and particle filtering. The visual method provides a large amount of information, including not only 3D shape and distance data but also detailed information such as target texture and color. LiDAR can offer precise scene distance and target shape information, IMU provides kinematic and dynamic data, while the satellite navigation module can provide stable global positioning information for agricultural machinery [40].
The fusion of visual sensors and LiDAR for perception can reduce the model errors in mapping based solely on visual sensor data, improving the accuracy and stability of mapping in complex farmland environments. This approach can generally meet the data requirements for 3D scene modeling, including color, texture, and geometric information, providing multiple redundant data for the mapping system. Furthermore, by integrating multi-source information from visual sensors, LiDAR, IMU, and GNSS, and leveraging the advantageous features of each sensor, the data complement each other, reducing the impact of GNSS signal loss due to occlusion and the drift caused by INS over time. This can enhance the robustness and accuracy of the system.
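At the simplest level, the redundant estimates described above can be combined by inverse-variance weighting, which automatically down-weights a degraded sensor (for example, an occluded GNSS fix with inflated variance). This toy sketch illustrates the principle behind such fusion rather than any cited filter design.

```python
def fuse_estimates(estimates):
    """Inverse-variance weighted fusion of redundant scalar estimates.

    estimates: list of (value, variance) pairs, e.g. the same coordinate as
    reported by GNSS, LiDAR/visual SLAM, and the INS. Sensors with smaller
    variance (higher confidence) dominate the result.
    Returns (fused_value, fused_variance).
    """
    inv_vars = [1.0 / v for _, v in estimates]
    total = sum(inv_vars)
    value = sum(x * w for (x, _), w in zip(estimates, inv_vars)) / total
    return value, 1.0 / total
```

Note that the fused variance is always smaller than any individual variance, which is the formal sense in which redundant sensors improve accuracy; Kalman and particle filters generalize this idea to dynamic, multi-dimensional states.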
He Jie et al. [41] used GNSS antennas to measure heading and speed, while a MEMS gyroscope measured the combined rotational speed of the vehicle body and wheels. They applied adaptive Kalman filtering for fusion and correction to obtain the steering angle of the wheels. The technical implementation details are shown in Figure 5. Field test results showed that this method performed similarly to the linkage-type wheel angle sensors and can replace wheel angle sensors for low-speed agricultural machinery navigation.
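The adaptive Kalman filtering idea in [41] can be illustrated with a minimal scalar filter (an illustrative sketch, not the authors' implementation): the gyro rate drives the prediction step, the GNSS-derived heading drives the correction step, and the measurement noise is adapted from the innovation magnitude. The function name, noise values, and adaptation rule below are assumptions for illustration only.

```python
import numpy as np

def fuse_heading(gyro_rates, gnss_headings, dt=0.1, q=1e-4, r0=0.05):
    """Scalar Kalman filter: predict heading by integrating the gyro rate,
    correct with the GNSS heading; R is adapted from the innovation (sketch)."""
    x, p, r = gnss_headings[0], 1.0, r0
    out = []
    for w, z in zip(gyro_rates, gnss_headings):
        # predict: integrate the gyro angular rate
        x, p = x + w * dt, p + q
        # simple innovation-based adaptation of the measurement noise
        innov = z - x
        r = 0.9 * r + 0.1 * innov**2
        # update with the GNSS heading measurement
        k = p / (p + r)
        x, p = x + k * innov, (1 - k) * p
        out.append(x)
    return np.array(out)
```

In a full implementation the fused heading and the measured speed would then be converted into an equivalent wheel steering angle via the vehicle's kinematic model.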
Chen Shoubin et al. [42] proposed a LiDAR and visual SLAM backend system that utilizes LiDAR geometric features and visual features to perform loop closure detection. They constructed a Bag-of-Words (BoW) model to describe visual similarity, which assists in loop closure detection. Loop closures were then verified through point cloud re-matching and graph optimization. The results indicate that incorporating visual features effectively aids loop closure detection and enhances the performance of LiDAR-based SLAM. Xiao Zhengbang et al. [43], to overcome the limitations of single sensors in target detection and mapping for agricultural robots, proposed a target detection algorithm combining a 16-line LiDAR and camera, and designed a 3D mapping method based on elevation mapping. By employing multi-sensor fusion to provide redundant data, the approach reduced the possibility of erroneous measurements and effectively addressed the challenges of locomotion and positioning detection in complex environments for agricultural robots. Liu Cheng et al. [44] developed a high-precision multi-sensor fusion positioning method suitable for agricultural and forestry environments. Based on the Unscented Kalman Filter (UKF), this method combines real-time motion dynamics models with data from Differential GNSS, Ultra Wide Band (UWB), IMU, and wheel encoders, effectively supporting precise positioning needs in agriculture, forestry, and other fields. Jie Kai Ting et al. [45] investigated collaborative positioning for agricultural vehicles in both indoor and outdoor environments based on UWB frequency-modulation technology, proposing a Weighted Least Squares High Double-Sided Two-Way Ranging (WLS-HDS-TWR) collaborative positioning algorithm for agricultural machinery. This algorithm integrates distance information between the master and slave vehicles, combining GNSS positioning, the master vehicle's coordinates, and ranging data to achieve collaborative positioning between the master and slave vehicles.
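The weighted least squares step of such range-based positioning can be sketched generically: linearise the range equations against a reference anchor and solve for the position, weighting each equation by its ranging confidence. This is a textbook WLS trilateration sketch under an assumed anchor geometry, not the WLS-HDS-TWR algorithm of [45].

```python
import numpy as np

def wls_position(anchors, ranges, weights=None):
    """Estimate a 2D position from ranges to known anchors by linearising
    |p - a_i|^2 = r_i^2 against the first anchor and solving weighted LS."""
    a0, r0 = anchors[0], ranges[0]
    # 2 (a_i - a_0) . p = |a_i|^2 - |a_0|^2 - r_i^2 + r_0^2
    A = 2.0 * (anchors[1:] - a0)
    b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(a0 ** 2)
         - ranges[1:] ** 2 + r0 ** 2)
    W = np.diag(weights if weights is not None else np.ones(len(b)))
    sol, *_ = np.linalg.lstsq(np.sqrt(W) @ A, np.sqrt(W) @ b, rcond=None)
    return sol
```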

4. Autonomous Operation and Path Planning Technology

Autonomous operation and path planning technology refers to the ability of robots to autonomously navigate and plan paths in an unknown environment by using their own sensors to gather environmental information, enabling them to complete specific tasks. With the rapid development of agricultural machinery autonomous navigation technologies, intelligent agricultural equipment is playing an increasingly important role in agricultural production, and farms are entering the era of automation [46].
Autonomous operation allows agricultural machinery to complete specific tasks without human intervention, reducing the reliance on manual labor. It enables uninterrupted work, reduces labor costs, and significantly increases agricultural production efficiency. The autonomous operation system can also dynamically adjust its operational strategy based on real-time data and environmental changes, enhancing the flexibility and adaptability of the work. Path planning algorithms are designed to generate the optimal path that covers the entire work area for agricultural machinery. Path planning technology helps find the best route from the starting point to the destination, saving time and resources. It also helps avoid obstacles and find feasible paths. Multi-machine collaborative operation refers to the efficient and collaborative completion of agricultural tasks in the same working scenario by multiple agricultural machines, achieved through intelligent technology, communication technologies, and collaborative control algorithms.
In modern agriculture, achieving autonomous operation and efficient path planning for agricultural machinery is of great significance for improving production efficiency and reducing costs. For instance, the full-coverage path planning method for multi-machine collaborative operations greatly enhances the overall work efficiency of agricultural machinery [47]; the autonomous tracking and obstacle avoidance functions of agricultural machinery provide new solutions for operations in complex environments [48]; and the integration of autonomous operation technology, path planning algorithms, and multi-machine collaborative operation works synergistically to improve agricultural work efficiency, reduce labor costs, and optimize resource utilization. These are the core components of the intelligent development of agricultural machinery in modern agriculture.

4.1. Autonomous Operation Technology

With the continuous development of China’s economy, the decreasing number of agricultural laborers, and the rising labor costs, the traditional agricultural production model can no longer meet the current demands for automation, low cost, and high efficiency. In recent years, research related to autonomous operation of agricultural machinery has become an important focus in the field of agricultural mechanization. This research encompasses several key technologies, including obstacle avoidance technology, road defect detection technology, and path planning technology.
Agricultural machinery obstacle avoidance technology refers to how agricultural machines precisely identify and effectively avoid obstacles during autonomous operation, ensuring the safety of both the equipment and personnel. Field road defect detection technology aims to promptly identify issues such as mud, water accumulation, and other problems on farm roads, preventing the machine from becoming stuck in depressions or holes. Agricultural machinery path planning technology is designed to ensure that machines travel along the optimal path during operations, thereby improving work efficiency, reducing energy consumption, and protecting crops [49].
Ma Wenqiang [50] conducted research on the autonomous operation control model for transplanting robots. Based on a kinematic model, the study utilized a pure pursuit method to decompose the motion of a four-wheel steering chassis. To address the drawback of the pure pursuit method, which cannot dynamically adjust the lookahead distance in real-time, a particle swarm optimization-based pure pursuit method was proposed, enabling dynamic adjustment of the lookahead distance during the tracking process. Chen Zeyu [49] discussed and researched the implementation of obstacle avoidance technology in autonomous agricultural operations and how edge deployment can be achieved. For issues such as pedestrian detection in fields and the detection of road defects, lightweight models were optimized and designed using machine learning techniques. The edge deployment of obstacle avoidance technology for agricultural machinery was implemented through the Jetbot edge intelligence carrier on Jetson Nano. Meng Liwen et al. [51] elaborated on methods for the precise positioning and alignment of robots during the pruning and shaping process of single plants such as spheres, columns, and cones under complex environmental conditions. They analyzed the robot’s self-adaptive alignment method based on image recognition, as well as the features and advantages of a mapping method that integrates multi-line LiDAR and monocular vision. They also pointed out potential optimization improvements. Additionally, they researched and described a pruning robot for autonomous operations, including methods for seedling recognition, mapping, and alignment, significantly improving the automation and intelligence of pruning equipment. Li Mingchun [52] designed a binocular vision-based perception system that enables bulldozers to autonomously identify obstacles and provide real-time feedback on their depth information while performing leveling operations along a planned path. 
The system can also estimate earthwork information based on point clouds, facilitating construction process planning. Cai Daoqing [53] achieved the fusion of vision and millimeter-wave radar in both time and space. Using valid targets selected by the millimeter-wave radar as seed points, the system completed the task of detecting obstacle dimensions in the visual depth map. Bai Xiaoping [54] proposed a comprehensive distributed intelligent control solution for the autonomous harvesting operations of agricultural machinery. He developed core devices with independent intellectual property rights for intelligent harvesting control systems, including a header profiling device, an online grain breakage rate detection device, a foreign material loss rate detection device, a sieve opening adjustment device, and a concave clearance adjustment device.
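The pure pursuit law used in [50] computes the steering command from the geometry between the vehicle pose and a lookahead point on the path; a minimal sketch is shown below, with a simple speed-proportional lookahead standing in for the particle swarm-optimized adjustment of the cited work (gains and bounds are illustrative assumptions).

```python
import math

def pure_pursuit_steer(pose, goal, wheelbase, lookahead):
    """Pure pursuit: alpha is the angle from the vehicle heading to the
    lookahead point; steering delta = atan(2 * L * sin(alpha) / Ld)."""
    x, y, yaw = pose
    alpha = math.atan2(goal[1] - y, goal[0] - x) - yaw
    return math.atan2(2.0 * wheelbase * math.sin(alpha), lookahead)

def adaptive_lookahead(speed, k=0.8, ld_min=1.0, ld_max=5.0):
    # speed-proportional lookahead, clamped; [50] tunes this with PSO instead
    return min(max(k * speed, ld_min), ld_max)
```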

4.2. Path Planning Algorithm

Path planning algorithms aim to generate the optimal path that covers the entire working area for agricultural machinery. In agriculture, path planning is often integrated into agricultural robots. Agricultural robot path planning refers to the use of autonomous navigation technologies and intelligent path planning algorithms to enable agricultural robots to effectively avoid obstacles during operations, thereby autonomously planning efficient, safe, and low-energy work paths [55,56]. Figure 6 shows a scenario where the intelligent agricultural machine performs autonomous turning and U-turns using a path optimization algorithm.
Common algorithms used in agricultural machinery path planning include graph-based search algorithms (such as the A* algorithm and Dijkstra’s algorithm), sampling-based algorithms (such as the RRT algorithm [57]), and swarm optimization algorithms (such as Particle Swarm Optimization (PSO)). These algorithms take into account the shape of the farmland, the location of the obstacles, and the motion characteristics of the machinery to generate efficient operational paths. They are widely used to address issues such as pathfinding and obstacle avoidance in agricultural machinery operations [58].
The A* algorithm is frequently used for optimal path planning in static and complex environments. By considering both the path length and heuristic function values, it can efficiently find the optimal path [59]. Dijkstra’s algorithm is used to find the shortest path in weighted graphs, and in agricultural robot path planning, it has been improved and applied according to practical scenarios. The basic idea of the PSO algorithm is to transform the optimization problem into a search problem in a multidimensional space, where particles simulate movement within this space to search for the optimal solution. The RRT algorithm is a random search algorithm used for path planning, and it has been widely applied in the field of agricultural machinery path planning. Each of these algorithms has its advantages and disadvantages. A comparative analysis of the current mainstream path planning algorithms for agricultural machinery is shown in Table 2.
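As a concrete illustration of the graph-search family, a compact A* over a 4-connected occupancy grid with a Manhattan-distance heuristic might look like the following (an illustrative sketch; practical farmland planners operate on geo-referenced maps and account for vehicle kinematics).

```python
import heapq
import itertools

def astar(grid, start, goal):
    """A* on a 4-connected grid (0 = free, 1 = obstacle) with a
    Manhattan-distance heuristic; returns the cell path or None."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    tie = itertools.count()                    # breaks heap ties safely
    frontier = [(h(start), 0, next(tie), start, None)]
    came, best_g = {}, {start: 0}
    while frontier:
        _, g, _, cur, parent = heapq.heappop(frontier)
        if cur in came:                        # already expanded
            continue
        came[cur] = parent
        if cur == goal:                        # reconstruct the path
            path = []
            while cur is not None:
                path.append(cur)
                cur = came[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 0
                    and g + 1 < best_g.get(nxt, float("inf"))):
                best_g[nxt] = g + 1
                heapq.heappush(frontier,
                               (g + 1 + h(nxt), g + 1, next(tie), nxt, cur))
    return None
```

Because the Manhattan heuristic is admissible and consistent on this grid, the first time the goal is expanded the returned path is optimal.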
With the advancement of technology, the integration of multi-sensor and multi-path planning technologies will help improve the efficiency and precision of agricultural production, and the application prospects of path planning technology in the agricultural sector are becoming increasingly broad [60]. Jiang Xinbo et al. [61] proposed a path planning algorithm that combines an improved A* algorithm with the Artificial Potential Field (APF) algorithm. By optimizing the heuristic function of the A* algorithm, introducing intermediate nodes, and employing non-uniform cubic spline interpolation, global path planning is achieved. Additionally, by modifying the gravitational potential field function and integrating the simulated annealing algorithm, the obstacle avoidance capability of the APF method is enhanced. The key points of the global path are extracted as sub-goal points, and the APF algorithm is applied for secondary planning. Xin Peng et al. [62], addressing the issues of high randomness and slow convergence speed in RRT, introduced the APF method into RRT. They proposed a method that guides the random tree growth using obstacles and the target, which reduces the randomness of tree growth and accelerates the convergence speed. Wang Yu et al. [63], considering the problem of repeated searching in RRT, proposed a coverage search method to avoid redundant sampling, significantly reducing the number of nodes maintained by the random tree, which facilitates the algorithm’s rapid convergence. Zou Qijie et al. [64] incorporated reinforcement learning into RRT, using it to optimize the direction of random tree growth and process the initial path, resulting in a more optimal path. Deng Yizhao et al. [65] proposed an improved RRT algorithm, which accelerates the convergence speed by modifying the random expansion approach of the traditional RRT algorithm, thereby avoiding local optima. 
They also introduced a one-step memory mechanism mathematically and improved the sampling method of traditional RRT. A random rotation handling mechanism for collision nodes was introduced to overcome the local optimum trap, and a bidirectional tree growth mechanism was incorporated to accelerate the algorithm’s convergence. Li Zhaoying et al. [66] integrated Deep Q-Networks (DQN) into RRT, and they proposed the concept of a variable step size. By using DQN to learn the optimal growth direction and step size, they improved the planning efficiency of the algorithm. Luo Ronghao et al. [67] proposed a static full-coverage path planning method and path optimization scheme. Based on information about rectangular fields, obstacles, the minimum turning radius of agricultural machines, and tool widths, the method automatically selects parameters such as work direction, work spacing, headland turning area, and turning mode. Furthermore, the algorithm was optimized to meet the needs for a high coverage rate and low repetition rate in field operations, based on the shape of obstacles and the requirement for a high coverage-to-work path ratio.
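The baseline RRT that these works improve upon can be sketched in a few lines: sample a point (occasionally the goal itself, which is the "goal bias"), extend the nearest tree node one step towards it, and stop once the goal is within one step. The circular-obstacle model and all parameters here are illustrative assumptions.

```python
import math
import random

def rrt(start, goal, obstacles, bounds=(10.0, 10.0), step=0.5,
        goal_bias=0.1, max_iter=5000, seed=1):
    """Basic RRT in 2D; obstacles are (centre, radius) circles."""
    rng = random.Random(seed)
    def collides(p):
        return any(math.dist(p, c) <= r for c, r in obstacles)
    nodes, parent = [start], {0: None}
    for _ in range(max_iter):
        # goal-biased random sampling
        target = goal if rng.random() < goal_bias else \
            (rng.uniform(0, bounds[0]), rng.uniform(0, bounds[1]))
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], target))
        d = math.dist(nodes[i], target)
        if d == 0:
            continue
        # extend the nearest node one step towards the sample
        new = (nodes[i][0] + step * (target[0] - nodes[i][0]) / d,
               nodes[i][1] + step * (target[1] - nodes[i][1]) / d)
        if collides(new):
            continue
        parent[len(nodes)] = i
        nodes.append(new)
        if math.dist(new, goal) < step:        # close enough: reconstruct
            path, j = [goal], len(nodes) - 1
            while j is not None:
                path.append(nodes[j])
                j = parent[j]
            return path[::-1]
    return None
```

The improvements surveyed above all attack visible weaknesses of this baseline: the uniform sampling (APF guidance, learned growth directions), the fixed step (variable step via DQN), and the redundant exploration (coverage search, memory mechanisms).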
PSO, A*, and other classical algorithms perform well in simple environments but have limitations in complex and large-scale scenarios. With the development of artificial intelligence and the rise of machine learning, deep learning and reinforcement learning have been applied in path planning, enabling them to handle complex environments and large-scale data. This has significantly improved the adaptability and intelligence of path planning.
Deep learning-based path planning is a technology that uses deep neural networks to model and optimize environmental perception, path decision making, and trajectory planning. By learning the inherent patterns in path planning samples, it enables agricultural machinery to autonomously learn and plan feasible movement paths. With its strong environmental perception ability, adaptability, and high efficiency and precision, deep learning-based path planning is being widely applied in agricultural machinery. The Transformer, an important architecture in deep learning, is designed to handle complex sequential data. With its self-attention mechanism, it demonstrates powerful feature extraction and global modeling capabilities across various domains. Not only has it revolutionized traditional Natural Language Processing (NLP) methods, but it has also been successfully applied in fields such as computer vision and audio processing. By incorporating Transformer networks into path planning, the self-attention mechanism captures temporal dependencies in long sequences without relying on traditional sequential information processing. This allows more accurate inference of the current state and selection of the optimal path [68].
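The self-attention operation at the core of the Transformer can be written in a few lines of linear algebra. The following is a minimal single-head sketch; in practice the projection matrices Wq, Wk, Wv are learned and the computation is batched over many heads.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention: every position attends to every
    other position, so long-range dependencies are captured in one step."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # numerically stable row-wise softmax over the attention scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights
```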
Dai Shengtang et al. [69] addressed the collaborative path planning problem in multi-UAV systems by utilizing deep reinforcement learning methods. They designed an efficient path planning framework and developed kinematic models for differential drive unmanned vehicles and mathematical models for collaborative obstacle avoidance scenarios. Based on this, they further analyzed the challenges of deep reinforcement learning in handling complex dynamic environments with high-dimensional state spaces and continuous action spaces, such as slow training speeds, low sampling efficiency, and poor adaptability. This work provides a theoretical foundation for multi-UAV collaborative path planning research. Yang Bu et al. [70] proposed an improved deep deterministic policy gradient algorithm for the path planning of the mechanical arm of intelligent weeding robots, addressing the lack of active weed avoidance path planning in the field of robotic weeding. Zhuang Jinwei et al. [71] proposed an optimal global coverage path planning method aimed at minimizing vehicle losses along the route. Based on the DQN algorithm, they created a reward strategy based on the vehicle’s real trajectory during operation, optimizing losses such as reducing the number of turns, U-turns, and overlapping work areas. They also designed the RLP-DQN algorithm for this purpose. Luo Xiangwen et al. [72] designed a real-time obstacle target recognition system based on the Swin Transformer network. Combining this with the RRT* algorithm and Bézier curve-based path fitting algorithm, they proposed a Swin Transformer-based autonomous driving path planning algorithm. Using video frames as the data source, they constructed an obstacle image dataset through data augmentation. After obstacle recognition, path smoothing optimization was applied to complete path planning. Li Juan et al. [73] proposed a path planning method for crop detection robots based on a lightweight Transformer. 
They replaced the softmax function with a cosine function to overcome the non-differentiable calculation issue of softmax, while retaining the key features of attention computation and further reducing time complexity. Experiments conducted in farmland at different scales showed that the robot's path length was shortened by 5.91%, inference time was reduced by 50% compared to the Transformer model, and training time was reduced by 75%. Xiong Chunyu et al. [74] proposed a path planning method for citrus harvesting robotic arms in unstructured environments that combines deep reinforcement learning (DRL) with artificial potential fields, overcoming the low efficiency and poor success rate of DRL-only picking path planning in such environments.

4.3. Multi-Machine Collaborative Operation

Multi-machine collaborative operation refers to the use of intelligent technologies, communication technologies, and collaborative control algorithms to enable multiple agricultural machines to efficiently and cooperatively complete agricultural production tasks in the same operational scenario. This approach involves the following two core aspects: task allocation and path planning [75]. Multi-machine collaborative operations can enhance work efficiency, save operational time, and reduce food waste caused by untimely harvesting of crops [76].
With the development of smart agriculture, multi-machine collaborative operation in agricultural machinery is increasingly applied in large-scale field production processes, contributing to enhanced production efficiency. Agricultural machinery path planning is a critical foundational technology in the implementation of smart agriculture. Depending on the number of operating machines, path planning can be divided into single-machine operation path planning and multi-machine collaborative operation path planning. Currently, in large-scale farmland operations, multi-machine collaborative operation is gradually replacing single-machine operation, becoming the main method of operation in field agriculture [77]. Figure 7 shows the collaborative operation scene of the unmanned radish harvester and transport vehicle. By integrating multi-machine collaboration with path planning, it becomes possible to effectively promote efficient agricultural production under complex and dynamic environmental conditions.
Wu Jian et al. [78], addressing the issue of poor endurance of single small inter-row weeding robots that are unable to independently complete large-scale weeding tasks in rice fields, proposed a multi-machine collaborative path planning method based on a multi-chromosome optimized genetic algorithm (MGA). They established a multi-machine collaborative path planning model. Xie Jinyan et al. [79], in order to improve the work efficiency of multiple unmanned mowing machines during collaborative operation in a new apple orchard, proposed an improved genetic algorithm (IGA) to assign and optimize operation paths for each mower. Based on the actual operation of unmanned mowers, they constructed a multi-machine operation path optimization model with total turning time and operation duration as comprehensive optimization objectives. Ma Haojie [80] aimed to minimize the movement distance of the most labor-intensive spiral propulsion vehicle and addressed the multi-machine collaborative weeding operation problem by establishing a multi-machine collaborative task allocation model. Liu Xiaoming [81] proposed a 5G-based multi-machine collaborative system for plant protection drones, where the cloud serves as the core for multi-machine networking. Through 5G and onboard nodes, the cloud can directly control the plant protection drones. A data transmission protocol between the plant protection drones and the cloud control terminal was designed to enable the cloud to acquire field information and monitor/control the drones during the multi-machine collaboration process. Additionally, a genetic algorithm-based task allocation method for multi-machine plant protection operations was proposed, allowing the cloud to rationally allocate and schedule tasks across multiple machines, thus improving overall operational efficiency. Gao Wenjie [77] proposed a U-shaped and bow-shaped turning path selection strategy for a rule-based farmland operation scenario. 
He designed a multi-machine collaborative operation path planning method for both regional and full-region operations and conducted field validation tests. Tang Can et al. [82] addressed the multi-drone path planning problem under the condition of farmland blocks with obstacles and proposed a complete multi-drone collaborative operation path optimization algorithm solution. Li Han et al. [83] designed and developed a WebGIS-based multi-machine collaborative navigation service platform to provide map and navigation service support for multi-machine agricultural machinery collaborative operations.
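The task allocation side of multi-machine collaboration can be illustrated with a toy genetic-style search that assigns field rows to machines so that the longest single-machine workload (the makespan) is minimised. This is a simplified sketch of the idea behind the genetic algorithm methods above, not a reproduction of any cited model; all parameters are illustrative.

```python
import random

def allocate_rows(row_lengths, n_machines, generations=300, pop=30, seed=0):
    """Assign rows to machines, minimising the maximum machine workload."""
    rng = random.Random(seed)
    def makespan(assign):
        loads = [0.0] * n_machines
        for row, m in zip(row_lengths, assign):
            loads[m] += row
        return max(loads)
    # chromosome = machine index per row
    popu = [[rng.randrange(n_machines) for _ in row_lengths]
            for _ in range(pop)]
    for _ in range(generations):
        popu.sort(key=makespan)
        popu = popu[:pop // 2]                 # selection: keep the better half
        children = []
        while len(popu) + len(children) < pop:
            a, b = rng.sample(popu, 2)
            cut = rng.randrange(len(row_lengths))
            child = a[:cut] + b[cut:]          # one-point crossover
            child[rng.randrange(len(child))] = rng.randrange(n_machines)  # mutation
            children.append(child)
        popu += children
    best = min(popu, key=makespan)
    return best, makespan(best)
```

Real systems add constraints this sketch omits, such as turning costs, machine heterogeneity, and refill trips.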

5. Farm Machinery Status Monitoring and Fault Diagnosis

In the process of modern agricultural mechanization, real-time monitoring of agricultural machinery’s operational status and fault diagnosis are crucial for improving operational efficiency and ensuring safe equipment operation. By leveraging advanced sensor technologies, data analysis methods, and intelligent diagnostic systems, it is possible to monitor the status of key components of agricultural machinery and provide early fault warnings. Fault diagnosis can predict and prevent potential mechanical, electrical, and hydraulic failures by monitoring the status of critical parts, determining the fault location, and providing technical support for troubleshooting. The application of operational status monitoring and fault diagnosis technologies in agricultural machinery helps achieve intelligent equipment management, improves operational efficiency, and reduces maintenance costs.

5.1. Key Components Status Monitoring

In agricultural production, due to the impact of working conditions and the intensity of tasks, agricultural machinery is prone to wear and tear, which can lead to malfunctions. Monitoring the operational status and key components of agricultural machinery can serve as an early warning system, transforming passive responses into proactive measures, thus effectively reducing the impact of faults. Monitoring the status of critical components involves using sensors to collect information on the condition of these parts, providing real-time monitoring to assess the overall operational state and performance of the machinery. Fault monitoring can track the current state of the machinery, perform diagnostic analysis, and enable remote data transmission through a data transmission module, providing essential data support for health management and fault diagnosis.
Wen Xin et al. [84] developed a remote monitoring system for a corn fertilizing machine, using the STM32103 main controller for data processing and conversion. The BC20 wireless communication module was responsible for data transmission. Through the OneNet platform, real-time remote monitoring of key parameters such as the fertilizing machine’s speed, coordinates, and fertilizer distribution shaft status was achieved on both PC and mobile platforms. Xiao Fengming [85] installed various sensors on critical parts of agricultural machinery to collect data on their operation, performance, and status. These data were then uploaded to a cloud platform via industrial control computers and communication networks. Additionally, a deep learning-based fault prediction model was developed to automatically analyze state changes and assess the likelihood of faults, enabling automatic alerts for typical mechanical failures.

5.2. Fault Diagnosis Technology

During sowing and harvesting operations, agricultural machinery typically operates under high intensity and full-load conditions, making it highly susceptible to mechanical failures. By monitoring key components of agricultural machinery and collecting information on the operating status of critical parts, real-time monitoring and fault diagnosis of the machine’s operation can be achieved through various detection modules and intelligent monitoring terminals. Currently, common fault diagnosis technologies include vibration signal feature extraction technology, intelligent fault diagnosis technology, and multi-parameter and multi-fault diagnosis techniques.
Vibration signal feature extraction is a crucial method in mechanical fault diagnosis. It measures the machine's vibration and noise signals and processes the fault signals using techniques such as time-domain analysis, frequency-domain analysis, and time-frequency analysis. With the rapid development of sensor, monitoring, and diagnostic techniques, intelligent fault diagnosis has attracted widespread attention and encouraged research in this field. Intelligent fault diagnosis typically utilizes fault association models based on simulation dynamics, big data, and neural network-driven fault analysis and health assessment, combined with fuzzy theory, regression prediction, and time series analysis techniques, to establish a comprehensive diagnostic system [86]. The multi-fault diagnosis method integrates various types of information, such as vibration signals, electromagnetic signals, radiation, and stress, to provide comprehensive monitoring and fault diagnosis of agricultural machinery [87].
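Typical time- and frequency-domain indicators from such a pipeline can be computed with a few lines of numerical code (an illustrative sketch; production diagnosis systems extract many more features and combine them with model-based analysis).

```python
import numpy as np

def vibration_features(signal, fs):
    """Common fault indicators: RMS energy, kurtosis (raised by bearing
    impacts), and the dominant spectral frequency of the vibration signal."""
    x = signal - signal.mean()                 # remove the DC offset
    rms = np.sqrt(np.mean(x ** 2))
    kurt = np.mean(x ** 4) / np.mean(x ** 2) ** 2
    spectrum = np.abs(np.fft.rfft(x))          # frequency-domain analysis
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return {"rms": rms, "kurtosis": kurt,
            "dominant_hz": freqs[spectrum.argmax()]}
```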
Zhang Yaping et al. [88] proposed an adaptive selection method for the Laplace wavelet parameters and damping parameters of the wavelet model to address the fault diagnosis of rolling bearings in agricultural machinery. They established a Laplace wavelet model that is the most suitable for the impact characteristics of the signal. Qi Meng et al. [89] proposed a bearing fault diagnosis method based on the short-time Fourier transform (STFT) time-frequency spectrogram and Vision Transformer (ViT). By applying the short-time Fourier transform, the original vibration signal is converted into a two-dimensional time-frequency image. This time-frequency image is then used as a feature map input into the ViT network for training, with the goal of constructing the optimal model structure to achieve fault diagnosis. Zhang Weipeng [90] analyzed common fault phenomena of combine harvesters during field operations and verified the fault diagnosis classification effect through simulation of field operations. By combining the sparrow search algorithm (SSA) with the BP neural network, a training set and test set were constructed, and the SSA-BP neural network algorithm was trained and validated, achieving an average recognition rate of 98.56%. The superiority of the SSA-BP algorithm was verified through its comparison with algorithms such as SVM and BP-SVM. Song Enzhe et al. [91] addressed the issue of reduced fault diagnosis accuracy caused by noise interference during motor operation. By extracting low-frequency information from noise signals using Mel Frequency Cepstral Coefficient (MFCC) dynamic feature extraction and combining the adaptive adjustment capability of the convolutional attention module with a multi-feature fusion strategy, they further reduced the impact of noise on fault diagnosis.

6. Field Operation Monitoring Technology

Agricultural machinery field operation monitoring technology utilizes multiple sensors to collect real-time operational data from the field. Through data processing and analysis, the working status of the machinery during operations is obtained for monitoring and analysis. By monitoring and analyzing the operational process, the accuracy and efficiency of field work are improved, promoting the sustainable development of agricultural production. This chapter provides a review focusing on the monitoring of key field operation states, including tillage depth, surface leveling, soil fragmentation rate, sowing, transplanting, crop protection, and harvesting. The monitoring technologies for various field operations are shown in Table 3.

6.1. Tillage and Land Preparation Operation Monitoring Technology

Soil tillage is a fundamental process in agricultural production, and the quality of its operation has a significant impact on subsequent stages. To ensure the quality of tillage, modern agriculture employs various monitoring technologies to track and control key parameters during the tillage process. These include monitoring the plowing depth, surface leveling, and soil fragmentation rate. Through data fusion and intelligent algorithms, the machine’s operating parameters are adjusted in real time to accommodate different soil conditions and operational requirements, ensuring optimal tillage results.

6.1.1. Tillage Depth Monitoring

Tillage depth refers to the depth at which a tillage machine operates in the soil, directly affecting the soil’s loosening degree and the growth space for plant roots. Traditional tillage depth measurement relies on manual measurement and recording, which is inefficient and limited in accuracy. Modern tillage depth monitoring technology mainly uses sensors and information technology to achieve real-time and precise monitoring. Sensors are installed on the beam of tillage machinery or the tractor’s three-point hitch mechanism. By fitting the sensor data to the relationship model of tillage depth, the actual tillage depth is calculated.
Wang Hongwei et al. [92] used the CAN bus and sensors to read the position of the tractor’s three-point hitch and then convert the data into soil penetration depth. They fitted the data through experiments to achieve real-time tillage depth monitoring. Du Xinwu et al. [93] analyzed the suspension posture of the rotary tiller and determined the mathematical relationship between the tillage depth and suspension posture. They incorporated factors such as the deformation of the rotary tiller and the lower limit of the tires to establish a three-parameter nonlinear tillage depth measurement model, which enables real-time monitoring of the tillage depth based on suspension posture. The overall scheme of the tilling depth monitoring system is shown in Figure 8. Jiang Xiaohu et al. [94] installed ultrasonic and infrared sensors on the machine frame. The ultrasonic sensor measures the tillage depth using the time-of-flight method, while the infrared sensor employs the triangulation method to measure the depth. The data from both sensors are filtered and fused in real time using the Kalman filtering method to monitor the tillage depth continuously.
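The dual-sensor fusion step described in [94] can be illustrated with a scalar Kalman filter that corrects the depth estimate with both sensor streams on every cycle, weighting each by an assumed sensor variance (the variances and process noise below are illustrative assumptions, not values from the cited work).

```python
import numpy as np

def fuse_depth(ultrasonic, infrared, var_u=4.0, var_i=1.0, q=0.05):
    """Fuse two tillage-depth sensor streams with a scalar Kalman filter:
    each cycle applies two measurement updates, weighted by sensor variance."""
    x, p = infrared[0], var_i
    out = []
    for zu, zi in zip(ultrasonic, infrared):
        p += q                                 # process noise: depth may drift
        for z, r in ((zu, var_u), (zi, var_i)):
            k = p / (p + r)
            x, p = x + k * (z - x), (1 - k) * p
        out.append(x)
    return np.array(out)
```

Because the two sensors measure the same quantity with independent noise, the fused estimate has lower variance than either stream alone.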

6.1.2. Surface Leveling Monitoring

The surface leveling of the field affects the uniformity of planting and the efficiency of irrigation. Traditional assessments of surface leveling mainly rely on manual measurements, which are subjective and inefficient. Modern surface leveling monitoring technologies use sensors and data processing to quantify the surface leveling, with the main techniques being laser leveling and ultrasonic monitoring. Laser leveling technology utilizes laser transmitters and receivers to measure surface height differences, guiding leveling machinery to make adjustments to ensure the surface is even. Ultrasonic sensors, installed at the front of the machinery, can measure surface height changes in real time, feeding the data back to the control system, which automatically adjusts the machinery’s operational parameters to maintain a leveled surface.
Zhou Hao et al. [95] designed a laser-controlled paddy field leveling machine that adjusts the depth of the leveling blade based on the signal received by the laser receiver; the signal drives an electromagnetic valve that actuates the elevation cylinder, automatically regulating the blade's working depth. Wang Ling [96] mounted several sensors on an experimental vehicle, including attitude sensors, magnetostrictive displacement sensors, ultrasonic sensors, and GPS sensors. The displacement sensor is connected to a sliding rod via a connecting rod, with rollers installed on the sliding rod; the rollers move vertically with the surface contours, providing measurement values. The ultrasonic sensor calculates the distance from the transmitter to the soil surface from the time difference between the emitted and received signals. The attitude sensor, mounted on a vertical plane, measures the vehicle's attitude angle, and the GPS module, mounted on the vehicle, provides real-time location data. With the length of the connecting rod and its initial position measured, the angle between the sliding rod and the connecting rod is used to correct the measurements for accuracy.
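The ultrasonic time-of-flight principle used for surface measurement reduces to a short calculation: distance is half the round-trip echo time multiplied by the speed of sound, and levelness is the deviation from a reference distance. The sketch below assumes a nominal speed of sound and a fixed reference distance, both illustrative values.

```python
SPEED_OF_SOUND = 343.0  # m/s at ~20 degrees C; varies with air temperature

def ultrasonic_distance(echo_time_s, speed=SPEED_OF_SOUND):
    """Distance (m) to the soil surface from the round-trip echo time."""
    return speed * echo_time_s / 2.0

def surface_deviation(echo_times_s, reference_m):
    """Signed height deviation of the surface from a reference distance.
    Positive means the surface is higher (closer to the sensor) than
    the reference plane."""
    return [reference_m - ultrasonic_distance(t) for t in echo_times_s]
```

A control system like the ones described above would feed these deviations back to adjust the leveling implement; here only the measurement step is shown.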

6.1.3. Soil Fragmentation Rate Monitoring

The soil fragmentation rate refers to the degree to which the soil is broken up after cultivation, which affects seed germination rates and transplanting quality. Traditional measurement of the soil fragmentation rate requires collecting soil samples and manually conducting sieving and analysis, a process that is cumbersome. Modern technology, through optimized mechanical design and real-time monitoring, enables the control and evaluation of the soil fragmentation rate. The main methods used include image processing techniques and online fragmentation rate detection systems. The image processing method involves capturing post-cultivation soil surface images using camera equipment and applying image processing algorithms to analyze the size distribution of soil particles, thus calculating the fragmentation rate. However, this method can only measure the surface layer of soil and does not represent the fragmentation rate of the entire soil layer, leading to larger errors in the results. The online fragmentation rate detection system simulates manual measurement, automatically performing the steps of soil sampling, weighing, sieving, re-weighing, and calculation, enabling the automated measurement of the soil fragmentation rate.
Xia Qicheng [97] mounted a camera on a small tillage machine to capture soil samples after cultivation. The camera measures the area of soil clods and calculates the fragmentation rate by determining the ratio of the area of clods with a diameter greater than 5 cm to the sample area. Yang Xulong [98] developed an online soil fragmentation rate monitoring system, which consists mainly of a soil sampling unit, a sieving unit, a weighing unit, and the corresponding software and hardware. The soil sampling unit collects soil and delivers it to the sieving unit for classification. The soil is then transferred into different bins of the weighing unit. When the total mass of soil in the weighing unit reaches a preset threshold, the system interrupts the soil transport using an active overbridge. It then calculates the fragmentation rate of the sampling point and unloads the weighed soil.
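The area-ratio calculation used in the camera-based method above can be sketched as follows. The published method works on segmented images; here the segmentation step is omitted and the function starts from already-measured clod areas. The sample values and the equivalent-circular-diameter threshold handling are illustrative assumptions.

```python
import math

def large_clod_ratio(clod_areas_cm2, sample_area_cm2, max_diameter_cm=5.0):
    """Fraction of the sampled area occupied by clods whose equivalent
    circular diameter exceeds max_diameter_cm, following the ratio
    described for the image-based method (clods > 5 cm diameter)."""
    # Area of a circle with the threshold diameter.
    threshold_area = math.pi * (max_diameter_cm / 2.0) ** 2
    large = sum(a for a in clod_areas_cm2 if a > threshold_area)
    return large / sample_area_cm2
```

A lower ratio indicates finer soil fragmentation; the online sieving-and-weighing systems described above compute an analogous mass-based ratio instead of an area-based one.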

6.2. Planting Operation Monitoring Technology

In modern agriculture, the quality of planting operations directly impacts crop yield and quality. To achieve precision agriculture, monitoring technologies for sowing and transplanting operations have been widely applied. Through the real-time adjustment of the machine’s operational parameters, these technologies adapt to varying soil conditions and operational requirements, ensuring optimal planting results. The main areas of focus include sowing operation monitoring and transplanting operation monitoring.

6.2.1. Sowing Operation Monitoring Technology

Sowing is a fundamental step in agricultural production, and the quality of sowing directly affects crop growth and development. Monitoring the sowing process can effectively address issues such as blockages, missed seeds, and seed shortages, allowing for timely corrective actions to minimize losses. Commonly used methods include piezoelectric monitoring, photoelectric monitoring, capacitance monitoring, and machine vision methods [99,100,101].
The piezoelectric monitoring method uses various piezoelectric film materials. When seeds strike the film, it deforms and generates an electrical signal, allowing the seeds to be counted. This is a contact-based technique with relatively high accuracy; however, because it requires seeds to collide with the sensing material, it is susceptible to vibration interference, and since the film is typically installed in the seed tube, it can also disturb the trajectory of the seeds. The photoelectric monitoring method detects the interruption of a light beam as seeds fall past a photoelectric sensor, generating a voltage signal that is processed into a pulse signal recognizable by the controller. This method has a simple structure and is easy to apply; it achieves good accuracy for larger seeds such as corn, but its accuracy drops for smaller seeds. The capacitive monitoring method uses a capacitive sensor: seeds passing between the sensor's two plates change the capacitance, producing a signal that is processed into a recognizable digital signal. Like the photoelectric method, it is more sensitive to larger seeds and less effective for smaller ones. The machine vision method uses high-speed photography of falling seeds, followed by image processing to obtain information such as seed position and quantity, enabling real-time monitoring. It offers higher monitoring accuracy, but its cost is higher than that of the other methods.
Ding Youchun et al. [102] designed a substrate-based piezoelectric film sensing structure, which reduced the collision signal attenuation time from 9 ms to 1 ms, thus improving the time resolution for high-frequency seed flow detection. This design also effectively mitigates interference from mechanical vibrations. Additionally, Ding Youchun and colleagues developed a medium- and small-sized seed flow monitoring device based on a thin-layer laser emitter module with a light thickness of approximately 1 mm and the photovoltaic effect of silicon photodiodes. This device enables the non-collisional detection of medium- and small-sized seed flows. Experimental results indicated that the monitoring accuracy for rapeseed was no less than 98.6%, and for wheat seeds, it was no less than 95.8%, with light conditions and equipment vibrations having no impact on monitoring accuracy. Xu Luochuan et al. [103] developed a finger-type capacitive cotton seed hole-planting monitoring system. Using the Pcap02 micro-capacitive acquisition module, it collects the capacitive output values and processes the data to accurately determine normal single-seed planting, replanting, and missed planting situations. Zhao Zhengbin et al. [104] applied machine vision technology to detect the planting performance of a precision seed-planting machine for seed trays. A photoelectric sensor detects the position of the seed tray, triggering a camera, and dual cameras scan and capture the images of the seed tray row by row. Visual algorithms were then used to analyze and process the images, identifying the planting conditions based on the image data.
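The pulse-interval logic common to photoelectric and capacitive seed monitoring can be sketched as below: each gap between successive sensor pulses is compared with the expected seeding interval to flag misses and doubles. The 1.5x and 0.5x thresholds are illustrative heuristics, not values from the cited systems.

```python
def classify_intervals(seed_times_s, expected_interval_s, tol=0.5):
    """Classify each gap between successive seed-detection pulses as
    'normal', 'miss' (gap too long: > (1+tol)x expected) or
    'double' (gap too short: < (1-tol)x expected).
    tol=0.5 gives illustrative 1.5x / 0.5x thresholds."""
    results = []
    for prev, cur in zip(seed_times_s, seed_times_s[1:]):
        gap = cur - prev
        if gap > (1 + tol) * expected_interval_s:
            results.append("miss")
        elif gap < (1 - tol) * expected_interval_s:
            results.append("double")
        else:
            results.append("normal")
    return results
```

In a real monitor the expected interval would be derived from ground speed and target seed spacing rather than fixed in advance.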

6.2.2. Transplanting Operation Monitoring Technology

Transplanting is the process of transferring seedlings to the field, and the quality of this operation directly affects the survival rate and growth status of the crops. The monitoring of transplanting quality mainly focuses on parameters such as the mis-planting rate. The quality monitoring system for transplanting operations typically involves installing industrial cameras and other sensors on the transplanter. These sensors capture real-time images of the seedlings being transplanted in the field. The images are then processed and analyzed to calculate the precise planting distance by measuring the geometric center of the seedlings’ leaves. By comparing the actual planting distance with the theoretical distance, the system can identify quality issues such as missing seedlings, lodged seedlings, and exposed seedlings.
Han Hongfei et al. [105] designed a monitoring system for transplanting machine operation quality and working trajectory information. The system uses industrial cameras combined with absolute encoders to capture image data of the transplanted seedlings. The images are then analyzed to identify whether seedlings are missing at any planting points, and the positions of these points are stored. A GIS system was created to display the transplanting machine’s operating trajectory and missing seedling information on the map. Jiang Zhan et al. [106] designed a real-time monitoring system for transplanting missing seedlings based on video image stitching, as shown in Figure 9. The system collects real-time videos of rapeseed blanket seedlings during field transplanting operations. Using parameters such as video-to-frame rate, the number of image stitches, and corner detection algorithms, and evaluating based on image quality scores and image stitching efficiency, the optimal image stitching parameter combination was obtained through orthogonal experimental design and optimization methods. The system then uses image segmentation and centroid fitting algorithms, along with criteria based on the distance between adjacent projection points and column slopes, to calculate the number of missing seedlings.
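The distance-comparison rule for flagging missing seedlings described above can be sketched as follows; the tolerance and the gap-to-count conversion are illustrative assumptions rather than parameters of the cited systems, which work on leaf-centroid positions extracted from images.

```python
def find_missing_points(centroids_cm, nominal_spacing_cm, tol=0.5):
    """Given seedling positions along a row (sorted, in cm), flag gaps
    where the actual spacing exceeds the theoretical spacing by more
    than tol (50% by default). Returns the estimated number of missing
    seedlings in each oversized gap."""
    missing = []
    for a, b in zip(centroids_cm, centroids_cm[1:]):
        gap = b - a
        if gap > (1 + tol) * nominal_spacing_cm:
            # Each extra nominal spacing in the gap ~= one missing plant.
            missing.append(round(gap / nominal_spacing_cm) - 1)
    return missing
```

For example, with a 30 cm nominal spacing, a 60 cm gap between detected seedlings is counted as one missing plant.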

6.3. Crop Protection Operation Monitoring Technology

Crop protection operation monitoring technology involves the real-time monitoring of crop information in the field, transmitting the data to a control unit for analysis and decision making on variable control. This allows for management based on the actual conditions of crops and soil, achieving precise pesticide application, improving operational efficiency and quality, and reducing pesticide waste. As shown in Figure 10, the smart agricultural machinery is performing crop protection monitoring operations. Currently, precision spraying technology primarily focuses on targeted pesticide application. Sensors are used to detect features of the target canopy, including the presence of the target, its appearance and outline, leaf area index, and pest or disease information, providing data support for the pesticide application decision model. The main sensors used include ultrasonic sensors, infrared sensors, LiDAR sensors, optical sensors, and machine vision. Ultrasonic sensors detect distances by measuring the echo time; infrared sensors calculate distance by measuring the time between emission and return of infrared light; LiDAR sensors detect distance either by measuring the time of flight between the emitted and reflected signal or by analyzing the phase difference between incident and reflected laser beams; optical sensors use image processing techniques to analyze images captured from different light positions for detection; machine vision detects by capturing and analyzing image feature information [107].
Jiang Honghua et al. [108] proposed a fast method for field weed identification based on deep convolutional networks and binary hash codes. Using a trained model to extract the fully connected layer features and hash features of input images, they compared these features with those in the database, calculating the Hamming distance and Euclidean distance, respectively. The K most similar images were identified and their labels counted, with the most frequent label assigned to the image, achieving classification and recognition. The experimental results showed that this method achieved an accuracy of 98.6% for field weed identification. Liu Liming et al. [109] integrated laser sensors and ultrasonic sensors for canopy information collection; experiments demonstrated that this approach provided higher accuracy than a single-sensor array. Gu Chenchen et al. [110] developed a mobile experimental platform equipped with LiDAR for canopy leaf area detection, establishing a detection model using partial least squares regression and BP neural network algorithms. The experiments showed that the model had high accuracy for dense canopies but lower accuracy for sparse ones. Song Ling et al. [111] proposed a cassava leaf disease detection model based on an improved YOLOX network, employing data augmentation, multi-scale feature extraction modules, channel attention mechanisms, and a quality focal loss function as the classification loss to aid network convergence. Experimental results showed an average precision of 93.53%, 6.02 percentage points higher than the baseline model, with comprehensive detection capability outperforming several mainstream models.
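The coarse retrieval step of the hash-based weed identification method [108] can be sketched with Hamming-distance ranking and majority voting. The hash codes and labels below are toy values, and the fine-grained Euclidean re-ranking on CNN features used in the published method is omitted.

```python
from collections import Counter

def hamming(a: int, b: int) -> int:
    """Hamming distance between two binary hash codes stored as ints."""
    return bin(a ^ b).count("1")

def classify_by_hash(query_hash, database, k=3):
    """database: list of (hash_code, label) pairs. Rank entries by
    Hamming distance to the query, take the k nearest, and return the
    majority label -- the coarse retrieval stage of the hash method."""
    ranked = sorted(database, key=lambda e: hamming(query_hash, e[0]))
    labels = [label for _, label in ranked[:k]]
    return Counter(labels).most_common(1)[0][0]
```

With a toy database such as `[(0b1100, "weedA"), (0b1101, "weedA"), (0b0011, "weedB"), (0b0111, "weedB")]`, a query of `0b1100` returns `"weedA"`, since two of its three nearest codes carry that label.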

6.4. Harvest Operation Monitoring Technology

Harvest operation monitoring technology enhances harvesting efficiency, ensures operational quality, and reduces losses through the real-time monitoring of the working status and operational parameters of the harvesting machinery.
Guo Hui et al. [112] designed a cleaning loss monitoring device suitable for sunflower combine harvesters, as shown in Figure 11. The device mainly consists of a PVDF piezoelectric film, a signal acquisition and processing circuit board, and an installation adjustment device, enabling real-time monitoring of sunflower cleaning loss rates. Du Yuefeng et al. [113] developed a cleaning loss monitoring sensor based on the minimal energy criterion EMD denoising method, which separates vibration, operational noise, and other unwanted signals from the collected data. Chen Man et al. [114] proposed an online monitoring method for soybean mechanized harvest quality based on machine vision: by segmenting soybean images and selecting RGB and HSV color space feature values, they constructed a quantitative evaluation model. Experimental results showed that the method's results were consistent with manual detection.

7. Problems Faced and Suggested Measures

7.1. Problems Faced

(1)
Impact of Complex Environments on Multi-Sensor Perception Technology
Currently, environmental perception technologies have made some progress in terms of recognition accuracy and response speed. The fusion of multiple sensors (such as LiDAR, millimeter-wave radar, and visual sensors) has been proven to enhance recognition accuracy. However, there is still significant room for improvement in the construction of precise maps for farmland. For instance, issues such as fuzzy farmland boundaries may cause deviations between the navigation recognition and the operation path. Changes in lighting conditions and adverse weather conditions can also affect the accuracy of environmental perception [115]. The irregular terrain and crop planting in farmland further complicate crop row monitoring, leading to a decline in recognition rates of image processing and multi-sensor fusion detection methods. Uneven farmland terrain and agricultural machinery vibrations can cause changes in the scanning direction of sensors, creating blind spots. Additionally, agricultural work environments tend to generate large amounts of dust, which can cover the sensor surfaces and result in false signals.
(2)
Accuracy and Real-Time Issues of Navigation and Positioning Technology
In scenarios with signal obstructions, such as mountainous areas and orchards, GNSS accuracy significantly decreases, leading to path deviations [116]. The drift error accumulation problem of INS has not been fully resolved, and even when fused with the GNSS, it struggles to provide sustained high-precision positioning in complex environments. Visual navigation is influenced by crop growth stages, lighting conditions, and obstacle occlusions, making its accuracy unstable. While LiDAR navigation equipment offers high precision, its high cost limits large-scale application on agricultural machinery. The real-time availability and synergy of multi-source data obtained from multiple sensors are insufficient, and current data fusion algorithms lack both real-time computational capacity and accuracy in complex farmland environments. The spatial complexity of narrow, cluttered, and large-scale scenarios increases the technical difficulty, and the combination of spatial complexity and temporal uncertainty has become a key technological bottleneck that urgently needs to be overcome in agricultural machinery navigation technology.
(3)
Technical Bottlenecks in Autonomous Operation and Multi-Machine Collaboration
Farmland environments typically feature an undulating terrain and irregularly distributed obstacles, which complicate path planning and navigation. In dynamic environments with numerous obstacles, obstacle avoidance technologies face issues such as delays and misjudgments, making it difficult to adjust autonomous operation strategies in real time. Traditional path planning algorithms (such as A* and RRT) perform poorly when dealing with dynamic environments and multi-objective tasks. While deep learning-based path planning algorithms offer adaptability, they require pre-training, which is time-consuming and has limited generalization capabilities for large-scale scenarios. The high real-time demands for environmental perception and path planning in multi-robot coordination algorithms increase system complexity. Additionally, the lack of efficient coordination mechanisms for communication and task allocation in multi-robot collaboration leads to problems such as information transmission delays and resource waste.
(4)
Challenges in Monitoring and Diagnostics of Key Components
Most agricultural field environments are harsh, and agricultural machinery often encounters malfunctions during operation; without real-time tracking and monitoring, these faults lead to abnormal machine operation and, in some cases, equipment damage. The arrangement of key-component monitoring sensors on agricultural machinery is complex. Currently, there are few self-developed agricultural sensors, their stability is poor, and they lack sufficient anti-interference capability against the vibration, dust, and high temperatures of field operations. Some domestic manufacturers have developed monitoring systems based on mature foreign products, but these systems suffer from low monitoring accuracy, complex operating systems, and unintuitive interfaces; they are also expensive, which hinders their popularization. Most domestic monitoring systems are still at the experimental stage and cannot yet be deployed at scale [117]. Monitoring data lack unified standards, and interoperability between devices is poor, which affects the accuracy of data analysis. Traditional vibration signal analysis methods rely on manual experience and cannot adapt to the complexity of diverse fault modes. Intelligent diagnostic algorithms, such as deep learning-based methods, depend heavily on labeled data, making it difficult to cover all fault scenarios that may occur in actual operation.
(5)
Inadequate Adaptability of Operation Monitoring Technology
In some domestic agricultural fields, the area is relatively small and the environment is complex, with soft and uneven soil. Domestic tillage and leveling machinery often uses single sensors to monitor tillage depth and surface leveling; these systems are easily affected by surface undulations and soil quality, leading to poor monitoring results [118]. Monitoring technologies for parameters such as leveling accuracy and soil pulverization rate struggle to balance dynamic real-time performance with precision, and multi-parameter collaborative optimization algorithms are still immature, so monitoring results are insufficiently aligned with actual operational requirements. There is a lack of domestically produced high-performance sensors, creating a technological gap relative to international standards: for example, seed-level-resolution sowing monitoring sensors and devices for monitoring small-seed sowing (e.g., photoelectric and capacitive sensors) still have relatively low accuracy. Additionally, under harsh environmental conditions, transplanting operation monitoring for missed planting and row-spacing control is not yet sufficiently stable.
(6)
Standardization and Efficiency Issues in Technology Integration
Currently, there is a lack of unified interface standards between the various technical systems, making it difficult to achieve efficient integration of hardware and software. The absence of a standardized platform makes systems developed by different manufacturers and research teams hard to make compatible; integrating different technologies and algorithms then becomes complex, limiting system flexibility and interoperability and increasing development and maintenance costs [119]. Incompatible cross-platform data formats and transmission protocols, along with the difficulty of unifying field communication protocols, hinder the seamless fusion and processing of heterogeneous data. Research in this area is still insufficient, limiting the deeper integration of multiple technologies. Additionally, algorithms such as deep learning and SLAM require significant computational power, but the embedded hardware in agricultural machinery often struggles to support the real-time execution of complex algorithms. Some data acquisition devices also face bandwidth limitations when transmitting high-frequency sampling data, which affects real-time processing. Given these limits on real-time performance and hardware capability, one effective response to the spatial-complexity challenge is to minimize the number of samples taken while retaining as much spatial information as possible. There is also a lack of low-cost, high-efficiency solutions to promote large-scale application.

7.2. Suggested Measures

(1)
Impact of Complex Environments on Multi-Sensor Perception Technology
To address the impact of complex environments on multi-sensor perception technology, dynamic sensor calibration and adaptive compensation techniques can be employed to enhance the system’s ability to adapt to variations in light, weather, and terrain. This can be combined with deep learning-based multi-modal data fusion technologies to integrate data from LiDAR, millimeter-wave radar, and visual sensors at the feature level, improving the perception accuracy and stability in complex agricultural environments. Additionally, to minimize dust interference, a gas jet cleaning system and hydrophobic coatings can be designed. These solutions, together with a gyroscope correction module, can dynamically adjust the sensor orientation, preventing blind spot issues caused by vibrations.
(2)
Accuracy and Real-Time Issues of Navigation and Positioning Technology
To improve the accuracy and real-time performance of navigation and positioning technologies, the fusion algorithm of GNSS and INS should be enhanced by adopting factor graph optimization and unscented Kalman filtering techniques to correct navigation errors. In areas with signal blockage, positioning accuracy can be compensated by combining low-cost GNSS RTK modules with SBAS augmentation systems. Additionally, lightweight deep learning-based semantic segmentation algorithms can optimize visual navigation, while low-cost 2D LiDAR can address positioning challenges in low-light or obstructed environments. Furthermore, a distributed real-time data processing framework should be implemented to enhance the efficiency and real-time performance of multi-sensor collaborative fusion, ensuring reliable navigation in complex scenarios.
(3)
Technical Bottlenecks in Autonomous Operation and Multi-Machine Collaboration
In autonomous operations and multi-machine collaboration, reinforcement learning-based path planning algorithms can be used to dynamically address the path adjustment needs in areas with dense obstacles, combined with local optimization methods to enhance short-term obstacle avoidance capabilities. By leveraging the multi-modal data processing capabilities of the Transformer model, data from different sensors such as vision, LiDAR, and others are integrated to enhance the comprehensiveness and accuracy of environmental perception, thereby optimizing path planning and obstacle avoidance. A distributed task scheduling protocol based on blockchain technology can be developed to ensure efficient task allocation and state synchronization between machines. By sharing LiDAR and GNSS data, a global 3D agricultural field model can be constructed to enhance the overall efficiency of collaborative operations and improve path planning accuracy, while simultaneously reducing resource waste caused by redundant modeling.
(4)
Challenges in Monitoring and Diagnostics of Key Components
To address the challenges in monitoring and diagnosing critical components, high-temperature-resistant, vibration-proof, and dust-resistant MEMS sensors can be deployed at key nodes of agricultural machinery. These sensors can be integrated into a multi-physical field collaborative monitoring system to comprehensively analyze vibration and temperature data. By leveraging a Transformer-based few-shot learning framework and self-supervised algorithms, the reliance on labeled data can be reduced, enhancing the diagnostic capability for complex fault patterns. Additionally, high-precision signal analysis algorithms can be used to accurately locate faults and predict trends in key components, improving the operational stability of the equipment.
(5)
Inadequate Adaptability of Operation Monitoring Technology
To address the issue of insufficient adaptability in operation monitoring technology, a multi-parameter dynamic optimization model based on genetic algorithms can be developed to adjust key parameters such as planting depth and soil leveling in real time, adapting to various terrains and soil conditions. Additionally, a seed monitoring device combining photoelectric and capacitive sensing technologies can be designed to enhance the accuracy of planting single seeds and small grains. In the operation data processing stage, an edge computing architecture can be deployed to run dynamic prediction models, enabling fast analysis and precise feedback of monitoring data, thus meeting the high adaptability requirements of field operations.
(6)
Standardization and Efficiency Issues in Technology Integration
To address the issues of standardization and efficiency in technology integration, a unified communication protocol based on OPC UA should be developed to optimize data compatibility between multiple technological systems. Additionally, by incorporating lightweight deep learning algorithms and utilizing GPU and FPGA co-processors, the performance of embedded hardware in complex algorithm processing can be enhanced. Moreover, by applying compressed sensing and adaptive sampling technologies, the bandwidth usage for data transmission can be reduced, improving the efficiency of transmitting and processing high-frequency sampling data. These efforts will promote the rapid development of agricultural engineering technologies in terms of standardization and large-scale application.
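As a toy illustration of the GNSS/INS fusion idea in measure (2) above, the sketch below uses a one-dimensional complementary filter: smooth INS dead reckoning is trusted short-term, while intermittent GNSS fixes correct the accumulated drift. This is a stand-in for the factor-graph and Kalman schemes mentioned, not an implementation of them; the filter constant and the drift model are assumptions.

```python
def fuse_position(ins_positions, gnss_positions, alpha=0.9):
    """Toy 1-D complementary filter. ins_positions: dead-reckoned
    positions that may drift. gnss_positions: absolute fixes, with
    None where the signal is blocked (e.g., under tree canopy).
    A running low-pass estimate of the INS drift is maintained and
    applied even while GNSS is unavailable."""
    est = []
    offset = 0.0  # running estimate of INS drift (GNSS - INS)
    for ins, gnss in zip(ins_positions, gnss_positions):
        if gnss is not None:
            # Low-pass the observed drift so single noisy fixes
            # do not jerk the estimate.
            offset = alpha * offset + (1 - alpha) * (gnss - ins)
        est.append(ins + offset)
    return est
```

During GNSS outages the last drift estimate keeps correcting the INS track, which is the qualitative behavior the fused schemes aim for; real systems additionally model velocity, attitude, and sensor biases.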

8. Conclusions and Development Prospects

8.1. Conclusions

This paper explores in depth the key technologies behind the development of intelligent agricultural machinery in China, covering environmental perception, positioning and navigation, autonomous operation and path planning, state monitoring and fault diagnosis, and field operation monitoring. The integration of global satellite navigation, inertial navigation, and laser navigation technologies has driven the precise positioning and efficient navigation of agricultural machinery. Path planning technology uses intelligent algorithms to optimize the operation paths of agricultural machines, improving operational efficiency and resource utilization, while multi-machine collaborative operation has demonstrated significant potential in large-scale farmland applications. State monitoring and fault diagnosis technologies support the intelligent management of equipment, with real-time monitoring and fault prediction helping to reduce equipment failures and improve production efficiency. In field operation monitoring, real-time monitoring technologies for tasks such as tillage, sowing, and harvesting have enhanced operational accuracy and quality, further advancing the application of precision agriculture.
The key technologies are closely interconnected, forming a highly integrated intelligent system. Environmental perception technology, through the fusion of multiple sensors, provides precise environmental data, supporting positioning and navigation technologies to ensure that agricultural machinery can accurately locate itself and autonomously navigate in complex farmland environments. Path planning technology generates the optimal path based on real-time environmental information, effectively avoiding obstacles and improving operational efficiency. State monitoring and fault diagnosis technologies allow for the monitoring of agricultural machinery’s health during operations, enabling the real-time tracking of the machine’s status and fault prediction, ensuring the stability of the system. Finally, field operation monitoring technology precisely tracks the quality of operations in tasks such as tillage, sowing, and harvesting, thereby improving operational accuracy and crop yield. The synergy of these technologies enables agricultural autonomous unmanned systems to achieve efficient, intelligent, and precise field operations, driving agriculture toward a more modern and intelligent direction.

8.2. Development Prospects

(1)
Autonomous Agricultural Machinery Operations and Sensor Development in Complex Environments
With the successful deployment of China's BeiDou Satellite Navigation System, positioning and navigation technology has found widespread application in agricultural production. To mitigate the impact of complex environments, key technologies for the autonomous operation of agricultural machinery, particularly automated navigation and environmental sensing, require breakthroughs. Given the diversity and uncontrollability of the agricultural environment, it is essential to further optimize environmental sensing and positioning/navigation technologies, develop sensors that are robust to interference and widely adaptable, and use information fusion algorithms to integrate multi-source data for improved precision. At the same time, it is essential to explore how to dynamically plan paths in the ever-changing agricultural environment, responding in real time to factors such as crop growth stages, field obstacles, and weather changes to chart safe and efficient routes. The development and application of high-precision sensing devices (such as LiDAR) and intelligent algorithms come with high costs, making them difficult for farmers to adopt. Therefore, developing low-cost, high-efficiency domestic sensors with independent intellectual property rights is crucial for promoting large-scale application and ensuring the widespread adoption of autonomous operation technologies in agricultural machinery.
(2)
Multi-Parameter Monitoring and Comprehensive Fault Diagnosis of Smart Agricultural Machinery
The development of agricultural machinery state monitoring and fault diagnosis, together with field operation monitoring technologies, will further advance intelligent and precise agricultural machinery. Comprehensive monitoring of key components can combine vibration, temperature, acoustic, and other sensors, while sensor anti-interference technology is optimized to enhance equipment adaptability in harsh farmland environments. For fault diagnosis, a multi-fault diagnosis platform can be built on big data and artificial intelligence, and unsupervised learning and transfer learning algorithms can reduce the dependence of deep learning methods on labeled data.
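The vibration-based component monitoring described above can be sketched as simple time-domain feature extraction with a threshold rule. The feature set, thresholds, and simulated signals below are illustrative assumptions, not a diagnosis method from the cited literature; production systems would add frequency-domain features and learned models.

```python
import numpy as np

def vibration_features(signal):
    """Time-domain features commonly used for rotating-part monitoring."""
    rms = np.sqrt(np.mean(signal ** 2))
    centered = signal - signal.mean()
    std = centered.std()
    kurtosis = np.mean(centered ** 4) / (std ** 4 + 1e-12)
    return rms, kurtosis

def is_anomalous(signal, baseline_rms, rms_factor=2.0, kurt_limit=4.0):
    """Flag a vibration window when its energy greatly exceeds the
    healthy baseline, or when impulsive content (high kurtosis)
    appears, as with an incipient bearing fault."""
    rms, kurt = vibration_features(signal)
    return rms > rms_factor * baseline_rms or kurt > kurt_limit

# Simulated data: healthy low-amplitude noise vs. periodic impacts
rng = np.random.default_rng(0)
healthy = 0.1 * rng.standard_normal(2048)
faulty = healthy.copy()
faulty[::128] += 1.5  # impacts such as a damaged bearing raceway produces
base_rms, _ = vibration_features(healthy)
```

Here the healthy window stays below both thresholds (Gaussian noise has kurtosis near 3), while the impulsive window is flagged by its high kurtosis even though its RMS rise is modest.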
For different operational stages, monitoring technology needs to shift from single-parameter monitoring to multi-parameter collaborative monitoring. Integrating machine vision and edge computing enables real-time monitoring and optimization of tillage depth, surface leveling, and soil fragmentation with higher accuracy and efficiency. For seeding and transplanting operations, high-precision monitoring systems based on deep learning and multi-sensor fusion can track parameters such as small-seed sowing rates and missing-seed rates for complex crops. In crop protection and harvesting, integrating LiDAR, ultrasonic sensors, and machine vision can optimize key processes such as crop information sensing, spraying control, and harvest quality monitoring, while research on target recognition and variable-rate spraying models in dynamic environments can support the precise management of agricultural production.
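As one concrete example of seeding-operation monitoring, a missing-seed rate can be estimated from the timestamps reported by a seed-drop sensor. The function below is an illustrative sketch assuming a constant nominal seed interval at constant ground speed (a gap near twice the target interval implies one missed drop); it is not a published monitoring algorithm.

```python
def miss_rate(seed_times, target_interval):
    """Estimate the missed-seed rate from seed-drop timestamps.

    Assumes seeds should arrive every `target_interval` seconds;
    a gap close to k x the target interval hides (k - 1) misses.
    """
    observed_intervals = len(seed_times) - 1
    misses = 0
    for prev, curr in zip(seed_times, seed_times[1:]):
        gap = curr - prev
        extra = round(gap / target_interval) - 1  # whole hidden intervals
        if extra > 0:
            misses += extra
    total_expected = observed_intervals + misses
    return misses / total_expected if total_expected else 0.0

# 0.2 s nominal spacing; the 0.4 s gap hides exactly one missed seed
times = [0.0, 0.2, 0.4, 0.8, 1.0]
rate = miss_rate(times, target_interval=0.2)
```

With four observed intervals and one inferred miss, the expected seed count is five, giving a missed-seed rate of 0.2.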
(3)
Efficient Multi-Sensor Application and Technology Integration
In multi-sensor fusion, data processing and fusion complexity grows as the number of sensors increases; efficient solutions are needed that reduce the computational burden while preserving fusion quality and meeting real-time requirements. Technology development and integration are the core drivers of full autonomy and intelligence in smart agricultural machinery. Facing issues such as insufficient real-time system performance and low standardization of software and hardware, future work should promote the deep integration of artificial intelligence with smart agricultural machinery and optimize data fusion frameworks, particularly through intelligent algorithms combined with reinforcement learning and transfer learning. Modular hardware and software design and standardized interface protocols are crucial for unifying the standards and specifications of perception data in unmanned agricultural machinery operations; this reduces the complexity of system development and deployment and enables data interoperability and fusion between different systems. By combining the Internet of Things (IoT) with edge computing, local rapid processing can collaborate efficiently with cloud-based data analysis, providing intelligent, real-time technical support for agricultural operations.
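The edge-cloud collaboration described above can be sketched as on-device aggregation of raw sensor streams into compact summaries before cloud upload, so only a fraction of the raw data crosses the network. The field names, window size, and payload format below are illustrative assumptions, not a standardized protocol.

```python
import json
import statistics

def edge_summarize(samples, window=10):
    """Aggregate raw sensor readings on the edge device so that only
    compact window summaries travel to the cloud (illustrative sketch)."""
    summaries = []
    for i in range(0, len(samples), window):
        chunk = samples[i:i + window]
        summaries.append({
            "t_start": chunk[0]["t"],                       # window start time
            "mean": round(statistics.fmean(s["v"] for s in chunk), 3),
            "max": max(s["v"] for s in chunk),
            "n": len(chunk),
        })
    return json.dumps(summaries)  # payload for cloud upload

# 100 raw soil-moisture samples shrink to 10 summary records
raw = [{"t": i, "v": 20 + (i % 7)} for i in range(100)]
payload = edge_summarize(raw)
```

The cloud side then runs heavier analytics (trend models, cross-field comparison) on the summaries, while time-critical decisions stay local on the edge device.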

Author Contributions

Conceptualization, H.Z., H.W. and W.S.; validation, Y.M., W.G. and G.J.; investigation, W.L., J.G. and B.C.; resources, H.W. and W.S.; data curation, W.L., J.G. and H.Z.; writing—original draft preparation, W.L., J.G., B.C. and J.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work was funded by the National Key Research and Development Program of China (Grant 2023YFD2001205) and the China Agriculture Research System of MOF and MARA (grant numbers CARS-23-D07 and CARS-23-D02).

Data Availability Statement

This study did not generate any new data.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Luo, X.W.; Liao, J.; Hu, L.; Zhou, Z.Y.; Zhang, Z.G.; Zang, Y.; Wang, P.; He, J. Research progress of intelligent agricultural machinery and practice of unmanned farm in China. J. South China Agric. Univ. 2021, 42, 8–17. [Google Scholar]
  2. Zhao, C.J. Current situations and prospects of smart agriculture. J. South China Agric. Univ. 2021, 42, 1–7. [Google Scholar]
  3. Zhai, Z.Y.; Yang, S.; Wang, X.; Zhang, C.F.; Song, J. Status and Prospect of Intelligent Measurement and Control Technology for Agricultural Equipment. Trans. Chin. Soc. Agric. Mach. 2022, 53, 1–20. [Google Scholar]
  4. Alshammrei, S.; Boubaker, S.; Kolsi, L. Improved Dijkstra Algorithm for Mobile Robot Path Planning and Obstacle Avoidance. Comput. Mater. Contin. 2022, 72, 5939–5954. [Google Scholar] [CrossRef]
  5. Reina, G.; Milella, A.; Rouveure, R.; Nielsen, M.; Worst, R.; Blas, M.R. Ambient awareness for agricultural robotic vehicles. Biosyst. Eng. 2016, 146, 114–132. [Google Scholar] [CrossRef]
  6. Höffmann, M.; Patel, S.; Büskens, C. Optimal Coverage Path Planning for Agricultural Vehicles with Curvature Constraints. Agriculture 2023, 13, 2112. [Google Scholar] [CrossRef]
  7. Wakchaure, M.; Patle, B.K.; Mahindrakar, A.K. Application of AI techniques and robotics in agriculture: A review. Artif. Intell. Life Sci. 2023, 3, 100057. [Google Scholar] [CrossRef]
  8. Javaid, M.; Haleem, A.; Singh, R.P.; Suman, R. Enhancing smart farming through the applications of Agriculture 4.0 technologies. Int. J. Intell. Netw. 2022, 3, 150–164. [Google Scholar] [CrossRef]
  9. Holzinger, A.; Fister, I.; Fister, I.; Kaul, H.; Asseng, S. Human-Centered AI in Smart Farming: Toward Agriculture 5.0. IEEE Access 2024, 12, 62199–62214. [Google Scholar] [CrossRef]
  10. Karimi, H.; Navid, H.; Besharati, B.; Eskandari, I. Assessing an infrared-based seed drill monitoring system under field operating conditions. Comput. Electron. Agric. 2019, 162, 543–551. [Google Scholar] [CrossRef]
  11. Kim, S.; Lee, H.; Hwang, S.; Kim, J.; Jang, M.; Nam, J. Development of Seeding Rate Monitoring System Applicable to a Mechanical Pot-Seeding Machine. Agriculture 2023, 13, 2000. [Google Scholar] [CrossRef]
  12. Wang, Z.R.; Chen, K.P.; Xiao, G.R.; Zhao, R.L. Construction of Teaching Case Database of Principle and Application of Machine Vision for Master of Mechanical Engineering. Educ. Teach. Forum 2022, 3, 53–56. [Google Scholar]
  13. Wan, H.; Ou, Y.Z.; Guan, X.L.; Jiang, R.; Zhou, Z.Y.; Luo, X.W. Review of the perception technologies for unmanned agricultural machinery operating environment. Trans. Chin. Soc. Agric. Eng. 2024, 40, 1–18. [Google Scholar]
  14. Li, X.H. Research on Farmland Obstacle Detection and Recognition System. Master’s Thesis, Ningxia University, Yinchuan, China, 2022. [Google Scholar]
  15. Lu, Y.H.; Jia, Y.J.; Zhuang, Y.; Dong, Q. Obstacle avoidance approach for quadruped robot based on multi-modal information fusion. Chin. J. Eng. 2024, 46, 1426–1433. [Google Scholar]
  16. Li, H.B.; Tian, X.; Ruan, Z.W.; Liu, S.W.; Ren, W.Q.; Su, Z.B.; Gao, R.; Kong, Q.M. Seedling Stage Corn Line Detection Method Based on YOLOv8-G. Smart Agric. 2024, 6, 1–13. [Google Scholar]
  17. Jiang, Q.; An, D.; Han, H.Y.; Liu, J.H.; Guo, Y.C.; Chen, L.Q.; Yang, Y. Maize Crop Row Detection Algorithm Based on Fusion of LiDAR and RGB Camera. Trans. Chin. Soc. Agric. Mach. 2024, 55, 263–274. [Google Scholar]
  18. Liu, Z.P.; Zhang, Z.G.; Luo, X.W.; Wang, H.; Huang, P.K.; Zhang, J. Design of automatic navigation operation system for Lovol ZP9500 high clearance boom sprayer based on GNSS. Trans. Chin. Soc. Agric. Eng. 2018, 34, 15–21. [Google Scholar]
  19. Zhou, J.; Xu, J.K.; Wang, Y.X.; Liang, Y.B. Development of Paddy Field Rotary-leveling Machine Based on GNSS. Trans. Chin. Soc. Agric. Mach. 2020, 51, 38–43. [Google Scholar]
  20. Chen, Y.; He, Y. Development of agricultural machinery steering wheel angle measuring system based on GNSS attitude and motor encoder. Trans. Chin. Soc. Agric. Eng. 2021, 37, 10–17. [Google Scholar]
  21. Xu, Q.M.; Li, H.W.; He, J.; Wang, Q.J.; Lu, C.Y.; Wang, C.L. Design and experiment of the self-propelled agricultural mobile platform for wheat seeding. Trans. Chin. Soc. Agric. Eng. 2021, 37, 1–11. [Google Scholar]
  22. Gui, Z.L.; Gu, Y.Q.; Xu, L.X.; He, X.; Zhu, Y.H.; Wang, B.S.; Wang, W.Z. Design and experiment of electronic control system of wheat planter based on GNSS velocity measurement. J. Henan Agric. Univ. 2024, 1–14. [Google Scholar] [CrossRef]
  23. Zhang, J.; Chen, D.; Wang, S.M.; Yu, Z.J.; Wei, L.G.; Jia, Q. Research of INS / GNSS Heading Information Fusion Method for Agricultural Machinery Automatic Navigation System. Trans. Chin. Soc. Agric. Mach. 2015, 46 (Suppl. S1), 1–7. [Google Scholar]
  24. Qiu, Q.; Hu, Q.H.; Fan, Z.Q.; Sun, N.; Zhang, X.H. Adaptive-coefficient Kalman Filter Based Combined Positioning Algorithm for Agricultural Mobile Robots. Trans. Chin. Soc. Agric. Mach. 2022, 53 (Suppl. S1), 36–43. [Google Scholar]
  25. Zhong, Y.; Xue, M.Q.; Yuan, H.L. Intelligent agricultural machinery GNSS/INS integrated navigation system design. Trans. Chin. Soc. Agric. Eng. 2021, 37, 40–46. [Google Scholar]
  26. Feng, S.; Zhang, Z.G.; Sun, L.Z.; Wang, F.; Jie, K.T. Research on deflection angle measurement system of tractor guide wheel based on GNSS/INS. J. Intell. Agric. Mech. 2024, 5, 33–41. [Google Scholar]
  27. Tian, J.L.; Chen, X.B.; Zhang, W.C.; Wang, H.L.; Wu, D.Y.; Chang, D.S.; Liu, C.D. Research Progress on Navigation Technology for Autonomous Mobile Robot. Tech. Autom. Appl. 2024, 1–6. Available online: http://kns.cnki.net/kcms/detail/23.1474.TP.20241230.0908.016.html (accessed on 3 March 2025).
  28. Huang, K.; Zhao, J.J.; Feng, T.T. Exploration of Local Geometric Information Representation and Uncertainty Analysis in Simultaneous Localization and Mapping with LiDAR. Chin. J. Lasers 2024, 1–30+32–35. Available online: http://kns.cnki.net/kcms/detail/31.1339.TN.20241206.1746.010.html (accessed on 3 March 2025).
  29. Wang, Y.L.; Cao, R.Y.; Geng, Z.X. Research on automatic control system of laser navigation facility management robot. Jiangsu Agric. Sci. 2018, 46, 236–238. [Google Scholar]
  30. Ni, J.N. Research on Automatic Control System of Harvester Based on Laser Navigation Facility. J. Agric. Mech. Res. 2021, 43, 217–220. [Google Scholar]
  31. Hu, L.; Wang, Z.M.; Wang, P.; He, J.; Jiao, J.K.; Wang, C.Y.; Li, M.J. Agricultural robot positioning system based on laser sensing. Trans. Chin. Soc. Agric. Eng. 2023, 39, 1–7. [Google Scholar]
  32. Liu, Y.; Ji, J.; Pan, D.; Zhao, L.J.; Li, M.S. Localization Method for Agricultural Robots Based on Fusion of LiDAR and IMU. Smart Agric. 2024, 6, 94–106. [Google Scholar]
  33. Wang, H.L.; Chen, Y.Z.; Liu, Z.C.; Ma, X.L. Survey of visual simultaneous localization and mapping algorithms. Appl. Res. Comput. 2025, 42, 321–333. [Google Scholar]
  34. Li, Y.W.; Xu, J.J.; Wang, M.F.; Liu, D.X.; Sun, H.W.; Wang, X.J. Development of autonomous driving transfer trolley on field roads and its visual navigation system for hilly areas. Trans. Chin. Soc. Agric. Eng. 2019, 35, 52–61. [Google Scholar]
  35. Zhou, Y.Q.; Gao, H.W.; Bao, Y.H.; Cui, L.W.; Fu, J.F.; Wang, D. Design of Agricultural Machinery Navigation and Electro Hydraulic Control Based on Computer Vision Correction. J. Agric. Mech. Res. 2022, 44, 196–200. [Google Scholar]
  36. Li, J.X.; Kong, D.Z. Research on visual navigation information processing technology of autonomous walking grape picking robot. J. Chin. Agric. Mech. 2020, 41, 157–162. [Google Scholar] [CrossRef]
  37. Wang, D.; Fan, Y.M.; Xue, J.R.; Yuan, D.; Shen, K.C.; Zhang, H.H. Flight Path Control of UAV in Mountain Orchards Based on Fusion of GNSS and Machine Vision. Trans. Chin. Soc. Agric. Mach. 2019, 50, 20–28. [Google Scholar]
  38. Mao, W.J.; Liu, H.; Wang, X.L.; Yang, F.Z.; Liu, Z.J.; Wang, Z.Y. Design and Experiment of Dual Navigation Mode Orchard Transport Robot. Trans. Chin. Soc. Agric. Mach. 2022, 53, 27–39, 49. [Google Scholar]
  39. Yang, S.Y.; Song, Y.; Xue, J.L.; Wang, P.X. Multi-sensor integrated positioning of rice transplanter based on visual supplementation. J. Huazhong Agric. Univ. 2024, 43, 234–246. [Google Scholar]
  40. Chen, M.Y.; Luo, L.F.; Liu, W.; Wei, H.L.; Wang, J.H.; Lu, Q.H.; Luo, S.M. Orchard-Wide Visual Perception and Autonomous Operation of Fruit Picking Robots: A Review. Smart Agric. 2024, 6, 20–39. [Google Scholar]
  41. He, J.; Gao, W.W.; Wang, H.; Yue, B.B.; Zhang, F.; Zhang, Z.G. Wheel steering angle measurement method of agricultural machinery based on GNSS heading differential and MEMS gyroscope. J. South China Agric. Univ. 2020, 41, 91–98. [Google Scholar]
  42. Chen, S.; Zhou, B.; Jiang, C.; Xue, W.; Li, Q.; Pan, D. A LiDAR/Visual SLAM Backend with Loop Closure Detection and Graph Optimization. Remote Sens. 2021, 13, 2720. [Google Scholar] [CrossRef]
  43. Xiao, Z.B. Agricultural Robot Navigation Based on Sensor Fusion. South Agric. Mach. 2022, 53, 78–80. [Google Scholar]
  44. Liu, C.; Li, J.Y.; Jia, N.; Hua, J. RTK-UWB multi-sensor fusion positioning method in agroforestry environment. J. For. Eng. 2024, 9, 142–151. [Google Scholar]
  45. Jie, K.T.; Zhang, Z.G.; Wang, F.; Zhu, S.L.; Xu, X.C.; Zhang, G.H. Cooperative Localization Algorithm for Full Center Mass of WLS-HDS-TWR Driverless Agricultural Machines. Trans. Chin. Soc. Agric. Mach. 2024, 55, 27–36, 110. [Google Scholar]
  46. Zhang, S.L.; Sun, Y.Q.; Zhang, J.Q.; Yu, P.F.; Huang, H. Research on Dynamic Path Planning Method for Agricultural Machinery Based on Autonomous Operation. Instrum. Technol. 2024, 49–54. [Google Scholar]
  47. Zhai, Z.Q.; Wang, X.Q.; Wang, L.; Zhu, Z.X.; Du, Y.F.; Mao, E.R. Collaborative Path Planning for Autonomous Agricultural Machinery of Master-Slave Cooperation. Trans. Chin. Soc. Agric. Mach. 2021, 52 (Suppl. S1), 542–547. [Google Scholar]
  48. Huang, D.Y.; Li, H.; Gao, Y.K. Design of a Trolley Car with Solar Automatic Tracking & Obstacle Avoidance. Instrum. Technol. 2019, 7, 34–36. [Google Scholar]
  49. Chen, Z.Y. Research on Obstacle Avoidance Technology in Autonomous Operation of Agricultural Machinery Based on Edge Intelligence. Master’s Thesis, Harbin Polytechnic Institute, Harbin, China, 2023. [Google Scholar]
  50. Ma, W.Q. Research on the Design of Software and Hardware Systems and Autonomous Operation Control Technology of Greenhouse Vegetable Transplanting Robot Based on Distributed Structure. Master’s Thesis, Northwest A&F University, Yangling, China, 2024. [Google Scholar]
  51. Meng, L.W.; Chen, S.F.; Chen, Q.C.; Zhai, X.L.; Han, B.; Xiong, S.K.; Li, Z.Q.; Wei, J. A review of target recognition and autonomous job positioning techniques for small multi-functional robots. Equip. Manuf. Technol. 2022, 12, 49–51. [Google Scholar]
  52. Li, M.C. Research on Scene Perception Technology of Bulldozer Autonomous Operation Based on Binocular Vision. Master’s Thesis, Jinan University, Jinan, China, 2022. [Google Scholar]
  53. Cai, D.Q. Research on Autonomous Job Sensing Technology in Unstructured Farmland Environment. Master’s Thesis, Shanghai Jiao Tong University, Shanghai, China, 2020. [Google Scholar]
  54. Bai, X.P. Autonomous operation technology research and system development. High-Technol. Ind. 2018, 5, 50–53. [Google Scholar]
  55. Chakraborty, S.; Elangovan, D.; Govindarajan, P.L.; ELnaggar, M.F.; Alrashed, M.M.; Kamel, S. A comprehensive review of path planning for agricultural ground robots. Sustainability 2022, 14, 9156. [Google Scholar] [CrossRef]
  56. Tang, Y.; Qi, S.; Zhu, L.; Zhuo, X.; Zhang, Y.; Meng, F. Obstacle avoidance motion in mobile robotics. J. Syst. Simul. 2024, 36, 1–26. [Google Scholar]
  57. Karaman, S.; Frazzoli, E. Sampling-based algorithms for optimal motion planning. Int. J. Robot. Res. 2011, 30, 846–894. [Google Scholar] [CrossRef]
  58. Wang, H.; Lou, S.; Jing, J.; Wang, Y.; Liu, W.; Liu, T. The EBS-A* algorithm: An improved A* algorithm for path planning. PLoS ONE 2022, 17, e0263841. [Google Scholar] [CrossRef]
  59. Guo, J.; Huo, X.; Guo, S.; Xu, J. In A path planning method for the spherical amphibious robot based on improved a-star algorithm. In Proceedings of the 2021 IEEE International Conference on Mechatronics and Automation (ICMA), Takamatsu, Japan, 8–11 August 2021; pp. 1274–1279. [Google Scholar]
  60. Zhang, B.K.; Zhu, T.X.; Liu, C.Q.; Wang, Y.Z.; Ma, Z.K.; Zhang, G.Q. Review on Path Planning of Agricultural Robot. Tract. Farm Transp. 2024, 51, 11–14. [Google Scholar]
  61. Jiang, X.B.; Wang, M.W.; Yang, C.M.; Jiang, B.W. Path planning for orchard spraying robot based on improved A* and APF algorithms. Transducer Microsyst. Technol. 2024, 43, 145–149. [Google Scholar]
  62. Xin, P.; Wang, Y.H.; Liu, X.L.; Ma, X.Q.; Xu, D. Path planning algorithm based on optimize and improve RRT and artificial potential field. Comput. Integr. Manuf. Syst. 2023, 29, 2899–2907. [Google Scholar]
  63. Wang, Y.; Liu, Y.J.; Jia, H.; Xue, G. Path planning of mechanical arm based on intensified RRT algorithm. J. Shandong Univ. (Eng. Sci.) 2022, 52, 123–130+138. [Google Scholar]
  64. Zou, Q.J.; Liu, S.H.; Zhang, Y.; Hou, Y.L. Rapidly-exploring random tree algorithm for path re-planning based on reinforcement learning under the peculiar environment. Control Theory Appl. 2020, 37, 1737–1748. [Google Scholar]
  65. Deng, Y.Z.; Tu, H.Y.; Song, M.J. Robot Path Planning Algorithm Based on Improved RRT. Modul. Mach. Tool Autom. Manuf. Tech. 2024, 6, 6–11. [Google Scholar]
  66. Li, Z.Y.; Ou, Y.M.; Shi, R.L. Improved RRT Path Planning Algorithm Based on Deep Q-network. AirSpace Def. 2021, 4, 17–23. [Google Scholar]
  67. Luo, R.H.; Gan, X.; Yang, G.Y. Complete Coverage Path Planning Method for Automatic Operation of Agricultural Machinery with Obstacle Avoidance Function. J. Agric. Mech. Res. 2025, 47, 36–43. [Google Scholar]
  68. Ma, H.J.; Xue, A.H. Research on robot path planning based on deep attention Q-networks. Transducer Microsyst. Technol. 2024, 43, 66–70, 75. [Google Scholar]
  69. Dai, S.T.; Wang, Y.; Shang, C.C. Multi-unmanned vehicle collaborative path planning method based on deep reinforcement learning. J. Beijing Univ. Aeronaut. Astronaut. 2024, 1–12. [Google Scholar] [CrossRef]
  70. Yang, B.; Wu, X.; Zhang, M.L.; Feng, S.K. Path Planning of Weeding Robot Arm Based on Deep Reinforcement Learning. J. Agric. Mech. Res. 2024, 1–7. [Google Scholar] [CrossRef]
  71. Zhuang, J.W.; Zhang, X.F.; Yin, Q.D.; Chen, K. Route Planning of Agricultural Unmanned Vehicle Based on DQN Algorithm. J. ShenYang Ligong Univ. 2024, 43, 32–37. [Google Scholar]
  72. Luo, X.W.; Liu, Y.; Xiang, G.L. Swin Transformer-Based Unpiloted Path Planning Algorithm. J. Wuhan Univ. (Nat. Sci. Ed.) 2024, 70, 697–703. [Google Scholar]
  73. Li, J.; Jin, Z.X. Path planning of crop inspection robot based on lightweight Transformer. J. Chin. Agric. Mech. 2024, 45, 227–233. [Google Scholar]
  74. Xiong, C.Y.; Xiong, J.T.; Yang, Z.G.; Hu, W.X. Path planning method for citrus picking manipulator based on deep reinforcement learning. J. South China Agric. Univ. 2023, 44, 473–483. [Google Scholar]
  75. Zhang, Z.G.; Mao, J.X.; Tan, H.R.; Wang, Y.N.; Zhang, X.B.; Jiang, Y.M. A Review of Task Allocation and Motion Planning for Multi-robot in Major Equipment Manufacturing. Acta Autom. Sin. 2024, 50, 21–41. [Google Scholar]
  76. Jin, B.L.; Xia, Z.G.; Han, J.Y. Full Coverage Path Planning for Multi Machine Collaborative Work. J. Agric. Mech. Res. 2024, 46, 28–33. [Google Scholar]
  77. Gao, W.J. Research on Path Planning Method of Collaborative Operation for Automatic Driving of Agricultural Machinery. Master’s Thesis, Yangzhou University, Yangzhou, China, 2022. [Google Scholar]
  78. Wu, J.; Ma, H.J.; Zhang, T.F. Research on multi-machine path planning of weeding robot based on genetic algorithm. J. Zhejiang Univ. Sci. Technol. 2024, 36, 357–368. [Google Scholar]
  79. Xie, J.Y.; Liu, L.X.; Yang, X.; Wang, X.S.; Wang, X.; Liu, S.T. A path optimization algorithm for cooperative operation of multiple unmanned mowers in apple orchard. J. South China Agric. Univ. 2024, 45, 578–587. [Google Scholar]
  80. Ma, H.J. Research on Cooperative Path Planning Method of Paddy Field Screw Propulsion Vehicle. Master’s Thesis, Zhejiang University of Science and Technology, Hangzhou, China, 2024. [Google Scholar]
  81. Liu, X.M. Design and Implementation of Multi-Machine Collaborative System for Plant Protection UAV Based on 5G. Master’s Thesis, Chinese Academy of Agricultural Sciences, Beijing, China, 2022. [Google Scholar]
  82. Tang, C.; Zong, W.Y.; Huang, X.M.; Luo, C.M.; Li, W.C.; Wang, S.S. Path planning algorithm for cooperative operation of multiple agricultural UAVs in multiple fields. J. Huazhong Agric. Univ. 2021, 40, 187–194. [Google Scholar]
  83. Li, H.; Zhong, T.; Zhang, K.Y.; Wang, Y.; Zhang, M. Design of Agricultural Machinery Multi-machine Cooperative Navigation Service Platform Based on WebGIS. Trans. Chin. Soc. Agric. Mach. 2022, 53 (Suppl. S1), 28–35. [Google Scholar]
  84. Wen, X.; Wang, X. Research on OneNet remote monitoring system based on CAN bus of agricultural machinery. J. Chin. Agric. Mech. 2022, 43, 116–121. [Google Scholar]
  85. Xiao, F.M. Design and implementation of agricultural machinery operation condition monitoring and early warning system. South Agric. Mach. 2024, 55, 58–60. [Google Scholar]
  86. Zhao, B.; Zhang, W.P.; Yuan, Y.W.; Wang, F.Z.; Zhou, L.M.; Niu, K. Research Progress in Information Technology for Agricultural Equipment Maintenance and Operation Service Management. Trans. Chin. Soc. Agric. Mach. 2023, 54, 1–26. [Google Scholar]
  87. Xiao, M.H.; Zhang, H.T.; Zhou, S.; Wang, K.X.; Ling, Z.B. Research progress and trend of agricultural machinery fault diagnosis technology. J. Nanjing Agric. Univ. 2020, 43, 979–987. [Google Scholar]
  88. Zhang, Y.P.; Yu, A.D. Research on Agricultural Machinery Bearing Fault Diagnosis Based on Improved Compressive Sensing Method. Mach. Des. Res. 2024, 40, 216–222. [Google Scholar]
  89. Qi, M.; Wang, G.Q.; Shi, N.F.; Li, C.F.; He, Y.X. Intelligent Fault Diagnosis Method for Rolling Bearings Based on Time-Frequency Diagram and Vision Transformer. Bearing 2024, 10, 115–123. [Google Scholar]
  90. Zhang, W.P. Research on Key Technologies of Fault Diagnosis and Maintenance Service Decision of Combine Harvester. Ph.D. Thesis, Chinese Academy of Agricultural Mechanization Sciences, Beijing, China, 2023. [Google Scholar]
  91. Song, E.Z.; Zhu, R.J.; Jing, H.G.; Yao, C.; Ke, Y. Motor fault diagnosis based on MFCC-MAFCNN under strong noise background. J. Harbin Eng. Univ. 2024, 1–9. Available online: http://kns.cnki.net/kcms/detail/23.1390.u.20241227.1317.023.html (accessed on 3 March 2025).
  92. Wang, H.W.; Wen, C.K.; Liu, M.N.; Meng, Z.J.; Liu, Z.Y.; Luo, Z.H. Tractor Operating Condition Parameter Testing System. Trans. Chin. Soc. Agric. Mach. 2023, 54, 409–416. [Google Scholar]
  93. Du, X.W.; Yang, X.L.; Pang, J.; Ji, J.T.; Jin, X.; Chen, L. Design and Test of Tillage Depth Monitoring System for Suspended Rotary Tiller. Trans. Chin. Soc. Agric. Mach. 2019, 50, 43–51. [Google Scholar]
  94. Jiang, X.H.; Tong, J.; Ma, Y.H.; Li, J.G.; Wu, B.G.; Sun, J.Y. Study of Tillage Depth Detecting Device Based on Kalman Filter and Fusion Algorithm. Trans. Chin. Soc. Agric. Mach. 2020, 51, 53–60. [Google Scholar]
  95. Zhou, H.; Hu, L.; Luo, X.W.; Tang, L.M.; Du, P.; Zhao, R.M. Design and experiment of the beating-leveler controlled by laser for paddy field. J. South China Agric. Univ. 2019, 40, 23–27. [Google Scholar]
  96. Wang, L. Research on the Measurement Scheme of Paddy Rotary Tillage Depth and Flatness. Master’s Thesis, Hubei University of Technology, Wuhan, China, 2021. [Google Scholar]
  97. Xia, Q.C. Study on image processing measurement method of soil breaking rate of micro-cultivator. Agric. Dev. Equip. 2016, 7, 112–113. [Google Scholar]
  98. Yang, X.L. Research on Soil Fragmentation Rate On-line Detection Method and System of Rotary Tiller Unit. Master’s Thesis, Henan University of Science and Technology, Luoyang, China, 2020. [Google Scholar]
  99. Zhang, L.L.; Niu, Q.P.; Han, F.L. Research status and future development direction of sowing monitoring technology at home and abroad. South China Agric. 2024, 18, 196–200. [Google Scholar]
  100. Yang, C.M.; Zhao, B.T.; Cheng, F.P.; Zhang, W.; Wang, Y.P.; Liu, L. Research status and prospect of sowing monitoring technology. J. Chin. Agric. Mech. 2024, 45, 345–352. [Google Scholar]
  101. Ding, Y.C.; Wang, K.Y.; Liu, X.D.; Liu, W.P.; Chen, L.Y.; Liu, W.B.; Du, C.Q. Research progress of seeding detection technology for medium and small-size seeds. Trans. Chin. Soc. Agric. Eng. 2021, 37, 30–41. [Google Scholar]
  102. Ding, Y.C.; Yang, J.Q.; Zhu, K.; Zhang, L.L.; Zhou, Y.W.; Liao, Q.X. Design and experiment on seed flow sensing device for rapeseed precision metering device. Trans. Chin. Soc. Agric. Eng. 2017, 33, 29–36. [Google Scholar]
  103. Xu, L.C.; Hu, B.; Luo, X.; Ren, L.; Guo, M.Y.; Mao, Z.B.; Cai, Y.Q.; Wang, J. Development of a seeding state monitoring system using interdigital capacitor for cotton seeds. Trans. Chin. Soc. Agric. Eng. 2022, 38, 50–60. [Google Scholar]
  104. Zhao, Z.B.; Liu, Y.C.; Liu, Z.J.; Gao, B. Performance Detection System of Tray Precision Seeder Based on Machine Vision. Trans. Chin. Soc. Agric. Mach. 2014, 45 (Suppl. S1), 24–28. [Google Scholar]
  105. Han, H.F.; Guo, Y.K.; Han, Z.J.; Yang, W.Q.; Xu, Y.; Du, X.H.; Rui, X. Research and Experiment on Operation Quality Monitoring System of Automatic Transplanter. J. Agric. Mech. Res. 2023, 45, 105–109. [Google Scholar]
  106. Jiang, Z.; Zhang, M.; Wu, J.; Jiang, L.; Li, Q. Real-time Monitoring Method for Rape Blanket Seedling Transplanting and Omission Based on Video Image Splicing. J. Agric. Mech. Res. 2022, 44, 189–195. [Google Scholar]
  107. Liu, L.M.; He, X.K.; Liu, Y.J.; Ceng, A.J.; Song, J.L. Target Pesticide Application Technology Equipment and Future Developments in the Control of Plant Pests, Diseases and Weeds. Plant Health Med. 2023, 2, 1–16. [Google Scholar]
  108. Jiang, H.H.; Wang, P.F.; Zhang, Z.; Mao, W.H.; Zhao, B.; Qi, P. Fast Identification of Field Weeds Based on Deep Convolutional Network and Binary Hash Code. Trans. Chin. Soc. Agric. Mach. 2018, 49, 30–38. [Google Scholar]
  109. Liu, L.M.; Wang, J.Y.; Mao, W.H.; Shi, G.Z.; Zhang, X.H.; Jiang, H.H. Canopy Information Acquisition Method of Fruit Trees Based on Fused Sensor Array. Trans. Chin. Soc. Agric. Mach. 2018, 49 (Suppl. S1), 347–353+359. [Google Scholar]
  110. Gu, C.C.; Zhai, Z.Y.; Chen, L.P.; Li, Q.; Hu, L.N.; Yang, F.Z. Detection Model of Tree Canopy Leaf Area Based on LiDAR Technology. Trans. Chin. Soc. Agric. Mach. 2021, 52, 278–286. [Google Scholar]
  111. Song, L.; Cao, M.; Hu, X.C.; Jia, P.Y.; Chen, Y.; Chen, N.J. Detection of Cassava Leaf Diseases under Complicated Background Based on YOLOX. Trans. Chin. Soc. Agric. Mach. 2023, 54, 301–307. [Google Scholar]
  112. Guo, H.; Han, J.X.; Lu, Z.S.; Chou, S.L.; Dong, Y.D.; Guo, L.H. Design and test of cleaning loss monitoring device for oil sunflower combine harvester. J. Jilin Univ. (Eng. Technol. Ed.) 2024, 1–11. [Google Scholar] [CrossRef]
  113. Du, Y.F.; Zhang, L.R.; Mao, E.R.; Li, X.Y.; Wang, H.J. Design and Experiment of Corn Combine Harvester Grain Loss Monitoring Sensor Based on EMD. Trans. Chin. Soc. Agric. Mach. 2022, 53 (Suppl. S1), 158–165. [Google Scholar]
  114. Chen, M.; Ni, Y.L.; Jin, C.Q.; Xu, J.S.; Zhang, G.Y. Online Monitoring Method of Mechanized Soybean Harvest Quality Based on Machine Vision. Trans. Chin. Soc. Agric. Mach. 2021, 52, 91–98. [Google Scholar]
  115. Wang, H.Y. Application of Agricultural Machinery Navigation Technology in Precision Agriculture. Agric. Mach. Using Maint. 2025, 27, 115–117. [Google Scholar]
  116. Li, X.M.; Feng, Q.C. Research Progress of Autonomous Navigation System for Orchard Mobile Robot Based on Multi-source Information Fusion. J. Anhui Agric. Sci. 2024, 52, 17–21. [Google Scholar]
  117. Dong, Z.S. Research on Monitoring System of Key Components of Drum Film Recovery Machine. Master’s Thesis, Xinjiang Agricultural University, Urumqi, China, 2023. [Google Scholar]
  118. Hou, Y.T. Investigation and Advancement of Smart Monitoring Technology for Agricultural Machinery Field Operation. Mod. Agric. Equip. 2024, 45, 6–11. [Google Scholar]
  119. He, Y.; Huang, Z.Y.; Yang, N.Y.; Li, X.Y.; Wang, Y.W.; Feng, X.P. Research Progress and Prospects of Key Navigation Technologies for Facility Agricultural Robots. Smart Agric. 2024, 6, 1–19. [Google Scholar]
Figure 1. Agricultural autonomous unmanned operation system architecture diagram.
Figure 2. Obstacle detection method using multi-sensor fusion.
Figure 3. Multi-modal information fusion neural network model.
Figure 4. Fusion of multi-sensor visual navigation system.
Figure 5. Wheel angle measurement method based on GNSS heading and MEMS.
Figure 6. Autonomous U-turn operation using the path optimization algorithm.
Figure 7. Collaborative operation of unmanned radish harvester and transport vehicle.
Figure 8. Overall scheme of the tilling depth monitoring system.
Figure 9. Online monitoring system for rapeseed carpet-like seedlings.
Figure 10. Crop protection operation monitoring.
Figure 11. Schematic diagram of the monitoring device.
Table 1. Comparative analysis of laser navigation and visual navigation technologies.
| Technology | Principle | Classification | Advantages | Limitations |
| --- | --- | --- | --- | --- |
| Laser navigation | Positions the robot in the environment using LiDAR | By number of laser emitters: single-line LiDAR (2D LiDAR) and multi-line LiDAR (3D LiDAR) | High accuracy, low computational load, easy to implement real-time SLAM | Expensive; sensor size and power consumption make it difficult to meet the needs of mobile intelligent devices |
| Visual navigation | Uses cameras to capture images of agricultural environments and applies image processing algorithms to identify farm roads, crop rows, etc. | By sensor type: pure visual SLAM, RGB-D SLAM, and visual-inertial SLAM | Low cost, small size, rich texture information | Complex visual data processing and relatively high computational demands |
Table 2. Comparative analysis of mainstream path planning algorithms for agricultural machinery.
AlgorithmPrincipleAdvantagesDisadvantages
A*A heuristic algorithm that uses heuristic information to find the optimal pathReacts quickly to the environment; direct path searchPoor real-time performance, high computation for each node, long computation time, and efficiency decreases as the number of nodes increases
DijkstraFinds the shortest path by incrementally selecting the closest node to the start pointSimple computation; can find globally optimal pathGenerates unnecessary detours, leading to resource waste in practical applications
PSOUses cooperation and information exchange between individuals in a swarm to explore and approach the optimal solutionSimple rules; easy to implementProne to becoming stuck in local optima
RRTA sampling-based path planning algorithmQuickly explores reachable space and finds feasible pathsThe resulting path is not optimal, and it does not converge to an asymptotically optimal solution
Deep LearningLearns complex patterns in the environment to provide flexible and efficient path planningCan generate optimal path planning strategies after trainingRequires a large amount of data and computational resources for training
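As a minimal illustration of the heuristic search summarized in the A* row above, the sketch below runs A* with a Manhattan-distance heuristic on a small 4-connected obstacle grid standing in for a field map; the grid and names are illustrative, not from any cited system:

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected grid; 0 = free cell, 1 = obstacle.

    Uses the Manhattan distance as the heuristic, which is admissible for
    unit step costs, so the returned path is shortest.
    """
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_heap = [(h(start), 0, start, [start])]  # (f = g + h, g, node, path)
    seen = set()
    while open_heap:
        f, g, node, path = heapq.heappop(open_heap)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nxt = (nr, nc)
                if nxt not in seen:
                    heapq.heappush(open_heap,
                                   (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None  # goal unreachable

# Toy field map: 1-cells could represent obstacles such as trees or ditches.
field = [[0, 0, 0, 0],
         [1, 1, 0, 1],
         [0, 0, 0, 0],
         [0, 1, 1, 0]]
path = astar(field, (0, 0), (3, 3))
```

The priority queue orders nodes by f = g + h, which is what gives A* its directed, "direct path search" behavior compared with Dijkstra's uniform expansion.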
Table 3. Field operation monitoring technologies.
| Monitoring Content | Perception Methods | Advantages | Disadvantages |
|---|---|---|---|
| Tillage depth | Radar sensors, magnetic sensors, and potentiometers; arm-type tillage depth detection sensors based on ultrasonic and displacement sensors; dual tilt angle sensors | Overcomes the influence of crop residues on tillage depth detection; compensates for errors caused by field coverage and equipment vibrations | Complex system design, low sensitivity; wheel sinking in loose soil reduces tillage depth accuracy |
| Surface leveling | Tilt angle sensors, ultrasonic sensors, MEMS inertial sensors, etc. (fusion of one or more sensor types) | Effectively improves the micro-topography of farmland, enhances water and fertilizer utilization, and increases crop yield | Requires high-precision sensors and processing algorithms, leading to higher costs |
| Soil fragmentation rate | Visual sensors, laser scanning, vibration sensors, soil particle analysis equipment | Directly reflects soil tillage quality and ensures a suitable soil structure; avoids over- or under-fragmentation of soil, which is unsuitable for crop growth | Detection results may be affected by environmental interference; requires high sensor precision and data processing |
| Sowing | Photoelectric sensing technology; image processing technology | Increases seed sensing precision by densifying the infrared probe layout; enhances high-speed seeding monitoring accuracy by expanding the light field area; high accuracy in sensing actual seed spacing | Prone to interference from dust and non-seed particles, reducing sowing monitoring precision; large size, susceptible to vibrations |
| Transplanting | Visual sensors, infrared sensors, displacement sensors | Ensures accurate control of seedling position and depth during transplanting; can cooperate with smart systems for real-time equipment adjustments to ensure optimal planting conditions | Seedling shape and size cause recognition difficulties for sensors; tight sensor-device integration increases system complexity |
| Crop protection | Spectral sensors, infrared imaging, visual sensors, LiDAR | Improves plant protection effectiveness, reduces pesticide waste, and protects the farmland ecosystem | Requires complex algorithms to identify pest and disease types and is greatly influenced by environmental factors; high cost when multiple sensors work together in large-scale field operations |
| Harvesting | Optical sensors, visual sensors, infrared sensors, laser scanners | Precisely judges crop maturity and increases harvesting efficiency; helps avoid early or late harvesting, improving crop yield and quality | Sensitive to environmental changes, potentially affecting accuracy; imposes high requirements on crop variety and growth characteristics |
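To make the tilt-angle approach in the tillage-depth row concrete, the sketch below estimates depth from a swing-arm tilt sensor under a simplified geometry: an arm of known length pivots on the implement frame, its tip rides on the furrow bottom, and a second tilt sensor on the machine body compensates for uneven ground. The formula and the pitch compensation are illustrative assumptions for this simplified model, not the exact method of the cited systems:

```python
import math

def tillage_depth(theta_work_deg, theta_ref_deg, arm_length_m, body_pitch_deg=0.0):
    """Estimate tillage depth (m) from a swing-arm tilt sensor.

    theta_work_deg: arm angle measured while tilling.
    theta_ref_deg:  arm angle calibrated at zero depth (tip on the surface).
    body_pitch_deg: machine-body pitch from a second tilt sensor, subtracted
                    so that slope or vibration-induced pitch does not bias
                    the depth estimate (illustrative compensation).
    Depth is the vertical drop of the arm tip relative to the calibration pose.
    """
    work = math.radians(theta_work_deg - body_pitch_deg)
    ref = math.radians(theta_ref_deg)
    return arm_length_m * (math.sin(work) - math.sin(ref))

# A 0.5 m arm swung to 30 degrees from a level zero-depth calibration.
depth = tillage_depth(30.0, 0.0, 0.5)
```

In a real system, the reference angle would be recalibrated per field, and the raw angles low-pass filtered before applying the geometry.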