Review

Research Progress and Applications of Artificial Intelligence in Agricultural Equipment

1
National Research Center of Pumps, Jiangsu University, Zhenjiang 212013, China
2
Institute of Advanced Manufacturing and Modern Equipment Technology, School of Mechanical Engineering, Jiangsu University, Zhenjiang 212013, China
*
Author to whom correspondence should be addressed.
Agriculture 2025, 15(15), 1703; https://doi.org/10.3390/agriculture15151703
Submission received: 10 July 2025 / Revised: 29 July 2025 / Accepted: 4 August 2025 / Published: 7 August 2025
(This article belongs to the Section Agricultural Technology)

Abstract

With the growth of the global population and the increasing scarcity of arable land, traditional agricultural production is confronted with multiple challenges, such as efficiency improvement, precision operation, and sustainable development. The progressive advancement of artificial intelligence (AI) technology has created a transformative opportunity for the intelligent upgrade of agricultural equipment. This article systematically presents recent progress in computer vision, machine learning (ML), and intelligent sensing. The key innovations are highlighted in areas such as object detection and recognition (e.g., a K-nearest neighbor (KNN) achieved 98% accuracy in distinguishing vibration signals across operation stages); autonomous navigation and path planning (e.g., a deep reinforcement learning (DRL)-optimized task planner for multi-arm harvesting robots reduced execution time by 10.7%); state perception (e.g., a multilayer perceptron (MLP) yielded 96.9% accuracy in plug seedling health classification); and precision control (e.g., an intelligent multi-module coordinated control system achieved a transplanting efficiency of 5000 plants/h). The findings reveal a deep integration of AI models with multimodal perception technologies, significantly improving the operational efficiency, resource utilization, and environmental adaptability of agricultural equipment. This integration is catalyzing the transition toward intelligent, automated, and sustainable agricultural systems. Nevertheless, intelligent agricultural equipment still faces technical challenges regarding data sample acquisition, adaptation to complex field environments, and the coordination between algorithms and hardware. Looking ahead, the convergence of digital twin (DT) technology, edge computing, and big data-driven collaborative optimization is expected to become the core of next-generation intelligent agricultural systems. 
These technologies have the potential to overcome current limitations in perception and decision-making, ultimately enabling intelligent management and autonomous decision-making across the entire agricultural production chain. This article aims to provide a comprehensive foundation for advancing agricultural modernization and supporting green, sustainable development.

1. Introduction

With the continuous growth of the global population and increasing pressure on global food supply, traditional agricultural production methods are struggling to meet rising demands for efficiency, precision, and sustainable development [1]. Meanwhile, as arable land and agricultural resources decrease and the agricultural labor force ages, there is an urgent need to leverage modern technologies to unlock the production potential of agriculture [2]. Among these efforts, the rapid development of intelligent agricultural equipment has emerged as a central driver of agricultural modernization and productivity enhancement.
The technological evolution of agricultural equipment has facilitated the transition from the “Agriculture 1.0” era to the “Agriculture 4.0” paradigm. The shift from manual and mechanized operations to intelligent, sustainable, and labor-efficient agriculture has redefined the agricultural landscape [3]. Traditional agricultural equipment primarily relies on manual operation or rule-based automation systems, which exhibit significant limitations in operational perception, environmental adaptation, and precision control, making it difficult to cope with complex, dynamic field environments and the high heterogeneity of crop growth [4]. In contrast, AI technology offers prominent advantages in big data processing, visual recognition, intelligent control, and autonomous decision-making, and has become a critical pathway for breaking through the bottlenecks in agricultural equipment intelligence and driving technological transformation [5]. In recent years, AI-powered agricultural equipment, including agricultural robots, intelligent spraying systems, autonomous machinery, and crop monitoring platforms, has emerged as a leading force in the field. Precision operations have been successfully implemented in key agricultural stages, including tillage, seeding, irrigation, weeding, fertilization, and harvesting, significantly enhancing agricultural productivity and resource utilization [6,7].
The digital transformation of agriculture continues to be reshaped by key technologies such as computer vision, machine learning (ML), and the Internet of Things (IoT) [8]. Advances such as automatic crop recognition and sorting via computer vision [9], operational path optimization using reinforcement learning (RL) [10], and high-precision weed detection enabled by semantic segmentation algorithms [11] are propelling agricultural equipment toward higher levels of autonomy, precision, and operational efficiency. At the same time, collaborative innovation across fields such as sensing technology, materials science, and computer science is progressing, and the intelligent and automated decision-making capabilities of agricultural equipment and key agricultural processes have been further enhanced [12]. To improve the harvesting efficiency of economic crops, researchers employed a convolutional neural network (CNN) combined with novel soft actuators for multi-arm collaborative operations [13]. In addition, by integrating environmental monitoring sensors with agricultural equipment, precise identification and intelligent control of field weeds were achieved [14]. This technology not only reduces the environmental impact of chemical weeding but also minimizes crop damage caused by mechanical weeding. However, there are still practical bottlenecks in the widespread application of intelligent agricultural equipment, such as a lack of data samples, complex operating environments, and high performance requirements for algorithms and hardware [15]. For example, semi-supervised learning has been employed to forecast spare part demands, mitigating data scarcity and improving prediction efficiency. However, further work is needed to enhance classification accuracy and real-world applicability in spare parts recognition [16].
Therefore, the deep integration of AI models, especially ML methods, with agricultural equipment is accelerating the realization of intelligent functions. These functions include real-time crop health monitoring, automated harvesting, and precise prediction of optimal operation timing.
The remainder of this article is organized as follows: Section 2 presents advancements in sensing and detection technologies. Section 3 systematically reviews the latest research advances and typical applications of AI-powered agricultural equipment. It covers key processes such as land preparation, sowing, fertilization, harvesting, and field management, with a comprehensive overview of AI technology and innovations. Section 4 analyzes the development trends and major challenges. It proposes future directions for research on smart agricultural equipment. Section 5 summarizes the work of the full article.

2. Advancements in Sensing and Detection Technologies for Agricultural Equipment

With the rapid advancement of sensing technologies and artificial intelligence, the intelligence level of agricultural equipment has been significantly elevated. Sensing technologies encompass a wide range of methods and principles designed to capture critical environmental and crop-related data. At the hardware level, high-precision sensors, including red, green, blue plus depth (RGB-D) cameras, light detection and ranging (LiDAR), multispectral and hyperspectral imaging, and ultrasonic sensors, have been integrated to enable advanced perception [17]. These sensors, as key research focuses in sensing technologies, capture essential visual information such as color, morphology, and structural characteristics of crops, while also measuring critical parameters like soil moisture, nutrient content, and other environmental factors [18]. Table 1 presents a comparative summary of these sensing technologies. Concurrently, advancements in crop detection and recognition technologies have also made it possible to extract valuable information from the massive amounts of sensor data collected. Non-destructive, information-driven detection methods based on chemical and physical parameters have been rapidly developed [19]. Presently, crop detection and recognition technologies primarily include spectral analysis, traditional image processing, and deep learning (DL) methods. Table 2 provides a comparative summary of these detection and recognition technologies.

3. Artificial Intelligence-Powered Agricultural Equipment

3.1. Artificial Intelligence Empowerment of Tillage Equipment

3.1.1. Intelligent Optimization of Vibration Characteristics

Tillage equipment plays a critical role in modern agricultural production, encompassing a variety of machinery such as cultivators, plows, rotavators, subsoilers, and land levelers. These implements perform essential soil management tasks including plowing, loosening, stubble removal, crushing, and compaction [25]. However, under field conditions with complex topography, heavy crop residues, and highly variable tillage resistance, the equipment is frequently subjected to multi-source excitations. Significant mechanical vibrations are induced. These vibrations not only significantly reduce the operational reliability of the equipment, but also directly affect the quality of operation and energy consumption levels. Moreover, these vibrations are transmitted to the operator through the floor, seat, and steering wheel, resulting in operational fatigue, reduced ride comfort, and potential long-term health risks [26]. As shown in Table 3, various advanced signal processing and optimization techniques have been employed to enhance the vibration analysis, signal extraction, and parameter optimization for tillage equipment.
In 2022, based on the CEEMDAN method and the POT model, Dai et al. [27] proposed a method for compiling and extrapolating tractor ground vibration load spectra. By collecting vibration signals under various complex operating conditions and integrating the CEEMDAN-Wavelet threshold method with the generalized Pareto distribution function, full-life load spectra of six typical ground types were accurately constructed. In 2022, integrating the vibration resonance of a second-order Duffing bistable system with VMD, Wang et al. [28] presented an adaptive weak-damping signal extraction method. The kurtosis parameter was optimized by QPSO, enhancing the extraction of weak feature signals of agricultural equipment operations under strong-noise environments. In 2024, Gao et al. [29] analyzed the effects of different wheat stubble heights, forward speeds of machinery, and PTO shaft speeds on the vibration characteristics and operational performance of tillage equipment. An orthogonal experiment and response surface optimization were used to identify key factors. The results revealed that the power take-off speed and stubble height are the main factors affecting machine vibration.
Regarding the influence of tillage equipment vibration on operators, in 2023, Singh et al. [32] employed IoT and big data acquisition technologies. Utilizing multi-sensor real-time monitoring and spectral analysis, the dynamic vibration characteristics of the tractor were assessed under field operating conditions. To predict driving comfort, they applied several supervised learning methods, including DTR, SVR, and ANN. The combination of ANN and Bayesian optimization proved highly effective in improving predictive accuracy. In 2025, based on the ThingSpeak IoT platform and embedded MATLAB R2023a intelligent analysis, Singh et al. [33] developed a vibration monitoring and early-warning system for agricultural tractor operations. The data transmission and alarm generation processes are illustrated in Figure 1. Whole-body and seat vibration data were transmitted in real time to the IoT cloud monitoring platform, and online amplitude-threshold discrimination was performed within the embedded MATLAB environment. The operational safety and comfort of the operator were significantly improved.
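The amplitude-threshold discrimination at the core of such early-warning systems can be sketched in a few lines. The RMS computation below is standard; the `warn` and `alarm` thresholds are purely illustrative placeholders (not values taken from [33] or from any comfort standard):

```python
def rms(samples):
    """Root-mean-square amplitude of a vibration window."""
    return (sum(x * x for x in samples) / len(samples)) ** 0.5

def vibration_alert(samples, warn=0.5, alarm=1.0):
    """Classify a window of acceleration samples (m/s^2) against
    illustrative comfort thresholds and return an alert level."""
    level = rms(samples)
    if level >= alarm:
        return "ALARM"
    if level >= warn:
        return "WARN"
    return "OK"
```

In a deployed system this check would run on each window streamed to the cloud platform, with "ALARM" triggering the operator notification path shown in Figure 1.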

3.1.2. Intelligent Regulation of Compaction Operations

Soil compaction is a key agronomic parameter. It affects seed germination, root development, and crop yield improvement. An optimal compaction state helps optimize soil structure and enhances seed-soil contact. It is essential for healthy early-stage crop growth. However, traditional compaction machinery lacks real-time monitoring and adaptive control capabilities. As a result, it cannot effectively respond to heterogeneous field soil conditions, and the quality of operation and crop yield are adversely affected [35].
AI-driven intelligent perception and decision-making methods have demonstrated significant advantages in optimizing compaction operations. In 2020, Zhao et al. [36] developed an intelligent monitoring and adaptive control system for the seedbed compaction process. The system was based on discrete element simulation and an elastic ellipsoid contact model. It integrated fuzzy logic algorithms and statistical analysis, achieving dynamic perception and adaptive control of operational parameters. In 2020, Ben et al. [37] applied the Van Genuchten equation to normalize soil moisture conditions and employed a discrete Bayesian network to predict soil penetration resistance from different combinations of soil physicochemical factors. It was confirmed that using features primarily based on sand and clay content significantly improves model prediction accuracy.
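The Van Genuchten water retention equation used in [37] to normalize soil moisture has a standard closed form, θ(h) = θr + (θs − θr)/(1 + (α|h|)^n)^m with m = 1 − 1/n, where h is the matric potential head. A direct Python transcription is below; the loam-like parameter values used to exercise it are illustrative, not those of the cited study:

```python
def van_genuchten(h, theta_r, theta_s, alpha, n):
    """Volumetric water content theta(h) from the Van Genuchten model.

    h       : matric potential head (negative when unsaturated), in the
              same length units as 1/alpha
    theta_r : residual water content
    theta_s : saturated water content
    alpha   : inverse of air-entry suction
    n       : pore-size distribution parameter (> 1); m = 1 - 1/n
    """
    if h >= 0:  # at or above saturation
        return theta_s
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * abs(h)) ** n) ** m
```

The curve interpolates smoothly between θs at saturation and θr in the very dry limit, which is what makes it useful for bringing penetration-resistance measurements taken at different moisture states onto a common footing.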
Regarding multi-source sensor data fusion, in 2024, Wang et al. [38] used a multi-degree-of-freedom sensor to collect real-time vibration acceleration signals during compactor operation. They constructed an intelligent online diagnostic framework for soil gradation and compaction quality assessment. The framework utilized a DeepLabv3+ semantic segmentation network based on MobileNetV2 and the XGBoost algorithm. As shown in Figure 2, the framework was developed to enable high-precision and automated detection of field compaction quality.
In addition to sensor-based and DL methods, geophysical non-destructive testing methods have been applied to the assessment of field soil compaction. In 2024, Carrera et al. [39] employed seismic refraction tomography and multi-channel surface wave analysis techniques. They evaluated the capability of this non-destructive testing method to identify the spatial distribution and variability of farmland soil under different compaction levels. In 2025, Meehan et al. [40] systematically analyzed the accuracy and applicability of various methods, including magnetic pulse induction scanning, real-time kinematic global positioning system (RTK-GPS), and unmanned aerial vehicle (UAV) image acquisition, for monitoring the layer thickness of soil compaction construction. Intelligent data processing methods such as spatial interpolation confirmed the significant potential of this sensor integration technology, which demonstrated great promise in continuously and accurately monitoring soil stratification and assisting in compaction quality control.
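Turning scattered point measurements of compaction or layer thickness into a continuous field map is typically done with an interpolation scheme such as inverse-distance weighting (IDW). The sketch below is one common choice, offered as an illustration of the technique rather than the specific method of the cited studies:

```python
def idw(points, query, power=2.0):
    """Inverse-distance-weighted estimate at `query`.

    points : iterable of (x, y, value) measurement tuples
    query  : (x, y) location to estimate
    power  : distance exponent; 2 is the conventional default
    """
    num = den = 0.0
    for x, y, v in points:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return v  # query coincides with a measurement
        w = d2 ** (-power / 2.0)  # equals distance ** -power
        num += w * v
        den += w
    return num / den
```

Evaluating this on a grid of query points yields the kind of compaction map used to flag under-compacted zones for re-rolling.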
In conclusion, as presented in Table 4, various advanced modeling and sensing techniques have been utilized to enhance the analysis and optimization of soil compaction and layer thickness in agricultural operations.

3.1.3. Intelligent Control of Subsoiling Operations

Subsoiling is a key agricultural technique for improving arable land quality, enhancing soil structure, and preventing soil compaction, and it is of significant importance for protecting the soil ecological environment. Soil moisture permeability, gas exchange capacity, and root growth space are all improved by subsoiling. In practical operations, however, the interaction between the subsoiler and the soil is highly complex owing to varying subsoiling depths, operating speeds, and soil resistance conditions. Scientifically setting and dynamically optimizing subsoiling parameters is therefore critical to enhancing loosening effectiveness, extending equipment lifespan, and ensuring operational reliability [41].
The introduction of AI and advanced sensing technologies has benefited the intelligent perception, monitoring, and optimization of the subsoiling operation process. In 2017, to reveal the impact of different operational parameters on soil throwing behavior, Gao et al. [42] utilized high-speed imaging and data regression methods during the inversion tillage subsoiling process. They developed a model of the corresponding soil-throwing behavior. This model provides a theoretical basis for operation simulation, parameter optimization, and intelligent design of reverse-rotary subsoiling equipment. In 2020, combining linear displacement sensors and inclinometers, Kim et al. [43] developed a real-time subsoiling depth measurement system to achieve high-precision and continuous monitoring of subsoiling depth. The effectiveness and reliability of the system in analyzing the impact of subsoiling depth on draft force were validated through field experiments. In 2022, Yin et al. [44] proposed an edge computing monitoring system based on an IoT architecture. The overall architecture of the subsoiling monitoring system is displayed in Figure 3. Multi-source sensors, including a GPS antenna, dual-channel attitude sensors, and cameras, were mounted on the subsoiler to enable real-time sensing of operating depth, implement orientation, and worksite imagery. This system enabled real-time, high-precision acquisition, statistical analysis, and intelligent management of key subsoiling parameters such as depth, area, and quality. It demonstrated excellent concurrent performance in large-scale field operations.
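Depth measurement systems of this kind infer tool depth from linkage geometry rather than measuring it directly. The sketch below assumes a deliberately simplified geometry, a tool tip hanging a fixed link length below a pivot at a known height above the soil surface, and is illustrative only, not the sensor model of [43]:

```python
import math

def tillage_depth(link_length_m, angle_deg, pivot_height_m):
    """Estimated working depth (m) below the soil surface.

    Assumed (hypothetical) geometry: the tool tip sits link_length_m from
    a pivot located pivot_height_m above the surface, and the inclinometer
    reports the link's rotation angle_deg below horizontal. Depth is the
    tip's drop minus the pivot height, clamped at zero when the tip is
    still above ground.
    """
    tip_drop = link_length_m * math.sin(math.radians(angle_deg))
    return max(0.0, tip_drop - pivot_height_m)
```

A real implementation would calibrate this against the tractor's actual three-point-hitch kinematics, but the principle, converting an angle and a displacement into a continuous depth signal, is the same.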
The introduction of simulation and DT technologies has significantly enhanced the modeling and prediction capabilities of subsoiling operation states. In 2022, integrating discrete element method and multi-body dynamics modeling, Kim et al. [45] developed a full-scale DT simulation model of an agricultural tractor–plow–soil system. By collecting soil stratification data and performing multi-source calibration, they achieved high-precision intelligent prediction of key indicators such as draft force. The performance of the model significantly surpasses that of the traditional ASABE empirical model. In 2024, based on field measurement data, Zhang et al. [46] conducted a comprehensive analysis using the integrated simulation platform of ANSYS Workbench 2021 R2 and nCode DesignLife. The effects of varying soil penetration resistance, working depth, and operating speed on the structural fatigue damage of the tractor subsoiler shovel were investigated. A DT simulation model of typical operating load spectra was established, and accurate fatigue life assessment was achieved.
Based on the above, as shown in Table 5, various advanced techniques and modeling approaches have been employed to optimize subsoiling parameters, enhance measurement accuracy, and improve the management and monitoring of subsoiling depth and draft resistance.

3.2. Artificial Intelligence Empowerment of Transplanting and Seeding Equipment

3.2.1. Automatic Navigation of Transplanting Equipment

The transplanting of crops such as fruit trees and vegetable seedlings usually relies on ridge transplanters for operation. The transplanters are often required to precisely follow predefined ridges to prevent seedlings from deviating from the intended planting path, ensuring the accuracy and uniformity of planting row spacing [47]. However, due to the complex and variable field environment, traditional satellite navigation systems face significant limitations in identifying surface ridge features. As a result, it is difficult to meet the requirements of high-precision autonomous navigation.
The integrated application of machine vision and multi-source intelligent perception technologies has become a key approach to enhancing the autonomous navigation performance of transplanting equipment. In 2021, based on central axis extraction, Opiyo et al. [48] designed a machine vision navigation method. By combining a color index of vegetation extraction, Gabor texture features, principal component analysis (PCA), and a K-means clustering algorithm, high-precision extraction of navigation paths and fuzzy logic-based trajectory tracking control were achieved. Through field experiments, the proposed method was shown to outperform conventional approaches in both navigation accuracy and robustness. To address the practical requirements of transplanting machinery operating along crop-free ridges, in 2024, Liu et al. [49] developed an ultrasonic ridge tracking system. The development sequence of the proposed system is illustrated in Figure 4. By integrating intelligent perception with fuzzy inference algorithms, this system realized high-precision autonomous tracking of field ridges. It significantly enhanced the stability of automatic navigation.
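Fuzzy inference controllers of the kind used for ridge tracking map a lateral offset to a steering command through membership functions and weighted-average defuzzification. The following minimal sketch uses three triangular membership functions; the breakpoints (in cm) and output angles (in degrees) are illustrative assumptions, not the rule base of [49]:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_steer(offset_cm):
    """Map lateral offset from the ridge centre (positive = drifted right)
    to a steering angle via three rules and centroid-style defuzzification:
      drifted LEFT   -> steer right (+10 deg)
      roughly CENTRED-> go straight (0 deg)
      drifted RIGHT  -> steer left  (-10 deg)
    """
    rules = [
        (tri(offset_cm, -30.0, -15.0, 0.0), +10.0),
        (tri(offset_cm, -15.0, 0.0, 15.0), 0.0),
        (tri(offset_cm, 0.0, 15.0, 30.0), -10.0),
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules) or 1.0
    return num / den
```

Because memberships overlap, the output varies smoothly between the rule outputs instead of switching abruptly, which is the practical appeal of fuzzy control on uneven field surfaces.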
At the level of trajectory tracking algorithms, in 2024, Shet et al. [50] introduced a novel method for autonomous vehicle trajectory tracking. This method integrated fuzzy logic, adaptive fractional-order sliding mode control (SMC), and swarm intelligence optimization. High-precision and robust trajectory tracking was obtained under parametric uncertainties and external disturbances. The superiority of the proposed method over traditional sliding mode and adaptive backstepping controllers was validated through simulation and hardware experiments. In 2024, based on machine vision, Liu et al. [51] developed a precise recognition method for navigation lines of ridges without crops. Through intelligent image processing techniques such as grayscale reconstruction, threshold segmentation, and contour detection, they achieved the extraction of stable navigation lines for farmland ridges under different lighting and soil conditions. The reliability and environmental adaptability of automatic navigation for agricultural equipment were enhanced by the proposed method.
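After thresholding, the core of vision-based navigation-line extraction is fitting a line through the per-row centroids of ridge pixels. A dependency-free sketch of that step (a simplification of the full pipeline in [51], which adds grayscale reconstruction and contour filtering) is:

```python
def fit_navigation_line(mask):
    """Fit col = a*row + b through per-row centroids of a binary ridge mask.

    mask : 2-D list of 0/1 values (1 = ridge pixel). Returns (slope,
    intercept) of the least-squares navigation line in image coordinates.
    """
    rows, cols = [], []
    for r, line in enumerate(mask):
        idx = [c for c, v in enumerate(line) if v]
        if idx:  # skip rows with no ridge pixels
            rows.append(r)
            cols.append(sum(idx) / len(idx))
    if not rows:
        return 0.0, 0.0  # no ridge found
    n = len(rows)
    rbar = sum(rows) / n
    cbar = sum(cols) / n
    num = sum((r - rbar) * (c - cbar) for r, c in zip(rows, cols))
    den = sum((r - rbar) ** 2 for r in rows) or 1.0
    a = num / den
    return a, cbar - a * rbar
```

The fitted line's lateral position and heading relative to the image centre then feed the steering controller.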
In conclusion, as shown in Table 6, various advanced machine vision, sensing, and control techniques have been utilized to enhance the autonomous navigation and trajectory tracking performance of transplanting equipment.

3.2.2. Classification and Detection of Seedling Equipment

With the wide application of transplanting equipment in the cultivation of fruits, vegetables, and other crops, operational efficiency has been significantly improved. However, quality issues such as seedling damage and missed seedlings have emerged. These issues have become major bottlenecks restricting subsequent crop growth and the fulfillment of agronomic requirements [52]. To address the challenges of classification during transplanting operations, intelligent detection technology is gradually becoming a key approach to enhancing operational quality and stability.
The current mainstream on-site detection methods include ultrasonic sensors, three-dimensional optical detection, and machine vision. Ultrasonic and LiDAR sensing technologies provide reliable capabilities for the initial identification of seedling presence. However, they still exhibit significant limitations in accurately distinguishing indicators of planting quality such as burial depth and exposure. In contrast, machine vision detection systems are characterized by high information content, strong robustness, and a high level of intelligence. They have become the forefront of research and application in seedling field quality inspection [53]. In 2022, combining the attention mechanism and transfer learning optimization, Zhang et al. [54] designed a squeeze-and-excitation (SE) network combined with You Only Look Once v5 (YOLOv5x). The transfer learning method for crop localization is presented in Figure 5. This model facilitated efficient recognition and precise localization of lettuce and weeds. It enhanced the practicality and generalizability of seedling detection under real-world agricultural conditions. In 2023, based on the improved YOLOv5s and ByteTrack, Cui et al. [55] proposed a real-time rice seedling detection and tracking method. They successfully addressed the automatic detection and counting of missing seedlings in paddy field blocks. In 2024, Zhang et al. [56] proposed a Seedling-YOLO algorithm. This algorithm is integrated with the ELAN_P partial convolution, CARAFE feature reassembly, and coordinate attention mechanism. It achieved efficient and accurate automatic recognition of broccoli transplant seedling quality in the field. In 2024, Li et al. [57] proposed a method for identifying healthy plug seedlings for greenhouse transplanting robots. This method integrated image preprocessing, image segmentation based on MLP, and centroid matching of connected components.
The seedling health status was classified by setting regional thresholds. Under multiple cultivar and illumination conditions, the proposed method was validated, demonstrating superior accuracy in seedling health identification and higher transplanting success rates. In 2024, to address the problem of missed seeding in potato sowing and the poor anti-interference performance of existing missed seeding detection methods, Li et al. [58] proposed a lightweight YOLOv5s missed sowing detection algorithm. A convolutional attention mechanism and improved non-maximum suppression were integrated into the proposed algorithm. In 2025, to achieve real-time identification and sorting of robust and inferior seedlings, Li et al. [59] presented a selective intelligent seedling picking framework based on DL. In addition, a robotic system with selective transplanting capability was designed to support automated and precise seedling management.
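The regional-threshold idea in [57], partitioning the tray image into a grid of cells and classifying each cell by its leaf coverage, can be sketched as follows. The grid layout and the `min_cover` threshold here are illustrative assumptions, not the calibrated values of the cited work:

```python
def classify_tray(leaf_mask, rows, cols, min_cover=0.1):
    """Classify each cell of a plug tray as 'healthy' or 'weak'.

    leaf_mask : 2-D list of 0/1 (1 = segmented leaf pixel)
    rows, cols: tray grid dimensions
    min_cover : minimum fraction of leaf pixels for a 'healthy' cell
    Returns a rows x cols grid of labels.
    """
    H, W = len(leaf_mask), len(leaf_mask[0])
    labels = []
    for i in range(rows):
        row_labels = []
        for j in range(cols):
            r0, r1 = i * H // rows, (i + 1) * H // rows
            c0, c1 = j * W // cols, (j + 1) * W // cols
            area = (r1 - r0) * (c1 - c0)
            cover = sum(leaf_mask[r][c]
                        for r in range(r0, r1)
                        for c in range(c0, c1)) / area
            row_labels.append("healthy" if cover >= min_cover else "weak")
        labels.append(row_labels)
    return labels
```

A transplanting robot would then target only the "healthy" cells for picking and mark "weak" cells for replanting.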
In conclusion, as shown in Table 7, a variety of advanced models have been applied to improve the accuracy and efficiency of seedling classification and detection, leading to significant enhancements in precision, speed, and model optimization.

3.2.3. Precision Control of Transplanting Equipment

Traditional transplanting equipment relies on ground wheels and chains for power transmission and parameter adjustment, and has achieved basic quantitative dispensing of seeds. However, it is difficult for such equipment to adjust planting density and automation efficiency according to dynamic changes in field conditions. This limitation directly affects the precision of transplanting operations and the uniformity of crop growth, and cannot meet the high-standard operational requirements of modern agriculture [60].
With the continuous advancement of various intelligent sensing and control technologies, transplanting equipment now possesses real-time environmental perception, adaptive regulation, and precision operation capabilities. By collecting multi-source data such as soil fertility, operation trajectory, and downforce, and applying data fusion algorithms for dynamic parameter optimization, the uniformity and accuracy of sowing operations have been significantly improved [61]. In 2018, Zhao et al. [62] applied the discrete element method to simulate and model seed movement in a rectangular vibrating tray. Using a gray model and a backpropagation ANN (BP-ANN), the seed distribution status was predicted in real time. This method improved the distribution uniformity and automatic monitoring capability of precision seedling trays with vibrating plates. In 2019, Chen et al. [63] investigated efficient, low-damage transplanting technology for plug seedling robots in greenhouses. By designing an image recognition system, optimizing the end effector, and utilizing trajectory planning based on a hybrid optimization algorithm combining artificial fish swarm (AFS) and particle swarm optimization (PSO), efficient and low-damage automated transplanting was realized. In 2022, Zhao et al. [64] developed a seedling transplanting system. The system integrated RGB-D visual sensing for dynamic detection, intelligent multi-module coordinated control, and an integrated mechanical structure design. The entire process of seedling tray sorting, transplanting, and replanting was highly automated and intelligently operated. In 2024, based on multi-sensor fusion and programmable logic controller (PLC) control, Yue et al. [65] developed a double-row precise seedling taking and throwing system for seedling holes. Through intelligent parameter setting, adaptive motion control, and dynamic process monitoring, data-driven optimization and precision assurance of operations were effectively achieved.
To meet the requirements of the dual-axis positioning seedling tray conveying system in fully automatic transplanters, in 2024, Yao et al. [66] proposed a dual-sensor intelligent positioning algorithm, a displacement control method, and a three-closed-loop control system. This system achieved high-precision automatic conveying and position recognition of multi-specification seedling trays. Operational efficiency and flexible adaptability were effectively improved.
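Closed-loop positioning of seedling tray conveying is commonly built from nested control loops (position, velocity, current). A single-loop discrete PID sketch is shown below; the gains, time step, and integrator plant used to exercise it are illustrative, not parameters from [66]:

```python
class PID:
    """Discrete PID controller. In a three-closed-loop scheme, one
    instance like this would serve each of the position, velocity,
    and current loops."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def step(self, setpoint, measured):
        """Return the control output for one sample period."""
        error = setpoint - measured
        self.integral += error * self.dt
        if self.prev_error is None:
            derivative = 0.0
        else:
            derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

Driving a simple integrator plant (position += velocity·dt) with this controller converges the tray to the commanded position, which is the behavior the multi-specification positioning system depends on.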
In summary, as presented in Table 8, a variety of advanced methods and intelligent control systems have been implemented to improve seed distribution accuracy and seedling picking precision, leading to significant enhancements in operational performance and precision.

3.3. Artificial Intelligence Empowerment of Harvesting Equipment

3.3.1. Intelligent Recognition of Harvesting Equipment

Fruit and vegetable harvesting is a labor-intensive process in traditional agriculture. There is a heavy reliance on seasonal labor, leading to increased production costs and limited operational efficiency. With the increasing severity of labor shortages, mechanized and intelligent autonomous harvesting has gradually become a key solution to overcoming production bottlenecks, improving operational efficiency, and reducing labor costs. In this context, AI-driven intelligent harvesting equipment has performed well in target recognition, maturity assessment, and autonomous localization [67].
Image processing and target recognition technologies are widely applied in fruit and vegetable harvesting equipment. Information analysis is performed using traditional methods based on multi-dimensional features including RGB or HSV color, surface texture, contours, shapes, and spatial relationships. Basic recognition of fruit and vegetable targets is achieved using techniques such as Otsu thresholding, SIFT feature extraction, HOG descriptors, K-means clustering, Canny edge detection, the Hough transform, and support vector machines (SVMs) [68,69,70]. In 2018, to improve the nighttime operational efficiency of apple-picking robots, Jia et al. [71] proposed an image preprocessing method based on color analysis and differential image processing. This method performed noise detection and classification on night vision images under various auxiliary light sources. The adaptability of apple-picking robots for nighttime operations was enhanced. In 2021, Sun et al. [72] proposed a clustering adaptive density estimation method based on the constructed LiDAR measurement system and double-threshold segmentation. By applying SOR filtering and the Otsu method to both elevation and reflectance intensity features, accurate grain segmentation was achieved. It was effective in improving the segmentation accuracy of dense lodging rice point clouds and the precision of intelligent crop density estimation.
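Otsu thresholding, a staple of these traditional segmentation pipelines, selects the gray level that maximizes the between-class variance of the resulting foreground/background split. A dependency-free Python version is:

```python
def otsu_threshold(gray):
    """Return the Otsu threshold for a flat list of 0..255 intensities.

    Sweeps every candidate threshold t, maintaining running class sums so
    the whole search is O(pixels + 256), and keeps the t that maximizes
    the between-class variance w0*w1*(m0 - m1)^2.
    """
    hist = [0] * 256
    for g in gray:
        hist[g] += 1
    total = len(gray)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w0 = sum0 = 0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w0 += hist[t]          # pixels at or below t
        if w0 == 0:
            continue
        w1 = total - w0        # pixels above t
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0
        m1 = (sum_all - sum0) / w1
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

On a fruit image converted to a suitable channel (e.g., the R−G difference for red apples), pixels above the returned threshold form the candidate fruit mask.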
However, due to illumination variations, occlusion, and noise interference in complex field environments, the robustness and generalization ability of traditional machine vision algorithms remain limited. Faced with these challenges, research is gradually shifting from traditional digital image processing to DL and end-to-end visual recognition frameworks. By automatically extracting features with neural networks, the autonomous adaptability of equipment is effectively enhanced under diverse conditions such as complex illumination, occlusion, and noise [73]. The real-time performance and accuracy of harvesting operations are greatly improved by techniques such as lightweight network architectures, attention mechanisms, and multi-scale feature recombination. These algorithms provide a robust foundation for subsequent precise detection, recognition, and grasping path planning. In 2023, Zhang et al. [74] proposed a lightweight tea tree top-bud detection model based on YOLOv4. By integrating a lightweight neural network, deformable convolution layers, and a coordinate attention module, efficient and accurate detection under various lighting conditions was achieved. In 2022, based on ShuffleNetV2-YOLOX, Ji et al. [75] developed a real-time target detection method for apples. By integrating a lightweight backbone network, an attention mechanism, and an adaptive spatial feature fusion module, the computational overhead in complex natural environments was effectively reduced while detection accuracy was maintained. In 2024, based on multimodal RGB-D sensing and an improved YOLOv7-tiny-CTD neural network, Cai et al. [76] designed a tomato-picking detection scheme. By integrating color, depth information, and point cloud normal vectors, the detection model structure and non-maximum suppression method were optimized in a multi-dimensional manner. The real-time accuracy of cherry tomato recognition and localization was significantly improved under complex greenhouse environments.
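Detectors in the YOLO family produce many overlapping candidate boxes, which are pruned by IoU-based non-maximum suppression, the step that Cai et al. optimized. The sketch below shows the generic greedy variant, not their specific modification; box coordinates and scores are invented.

```python
import numpy as np

def iou(box_a, box_b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    x1 = max(box_a[0], box_b[0]); y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2]); y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, iou_thresh=0.5):
    """Greedy non-maximum suppression; returns indices of kept boxes."""
    order = np.argsort(scores)[::-1]  # highest score first
    keep = []
    while len(order) > 0:
        i = order[0]
        keep.append(int(i))
        # Drop remaining boxes that overlap the kept one too strongly.
        order = np.array([j for j in order[1:] if iou(boxes[i], boxes[j]) < iou_thresh])
    return keep

boxes = np.array([[10, 10, 50, 50], [12, 12, 52, 52], [100, 100, 140, 140]], float)
scores = np.array([0.9, 0.8, 0.7])
kept = nms(boxes, scores)  # the near-duplicate of the first box is suppressed
```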
The vision-based 3D positioning system, a key technology for automation in agricultural harvesting robots, has made significant progress in the precise recognition of crops across application scenarios. In 2023, Hu et al. [77] proposed an improved YOLOX network combined with RGB-D image fusion to detect the target region. By employing AI techniques such as structural re-parameterization, multi-branch feature extraction, and adaptive dynamic loss weighting, they achieved high-precision spatial localization of apple targets in complex orchard environments. The flow of apple target detection and localization is shown in Figure 6. In 2024, to solve the problem of target identification and localization in complex environments, Guan et al. [78] proposed a lightweight YOLO-GS based on multi-feature information fusion. By integrating Ghost convolution, the C3-GS cross-stage attention module, an improved detection head, and the Focal EIoU loss function, the capability of accurately recognizing and localizing aquatic vegetables in 3D on low-power platforms was improved. In 2024, based on an improved YOLOv8 algorithm, Zou et al. [79] developed a lightweight, high-precision cauliflower target detection method. A Slim-neck architecture, a three-branch attention mechanism, and a weighted bidirectional feature pyramid network were integrated, reducing the number of parameters and the computational complexity while simultaneously improving both detection accuracy and real-time performance.
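Regardless of the detector used, RGB-D localization ultimately back-projects a detected pixel and its depth reading through the pinhole camera model to obtain a 3D grasp point. The sketch below assumes hypothetical camera intrinsics; the values are illustrative, not those of any cited system.

```python
import numpy as np

def pixel_to_camera_xyz(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with metric depth into camera coordinates."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Hypothetical intrinsics for a 640x480 depth camera.
fx = fy = 600.0
cx, cy = 320.0, 240.0

# Detected apple centre at pixel (470, 240) with a depth reading of 1.2 m.
p = pixel_to_camera_xyz(470, 240, 1.2, fx, fy, cx, cy)
# p is the apple's position in metres in the camera frame.
```

A hand-eye calibration transform would then map this camera-frame point into the robot arm's base frame before motion planning.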
Based on the above, as illustrated in Table 9, various advanced detection and localization methods have been employed to enhance the accuracy, efficiency, and robustness of crop and object recognition across different agricultural environments, leading to significant improvements in performance metrics.

3.3.2. Autonomous Operation of Harvesting Equipment

The autonomous path planning and operation strategies of harvesting equipment are central to improving operational intelligence and efficiency. They have drawn increasing attention, particularly in complex agricultural environments such as fruit and vegetable harvesting scenarios [80]. To address technical challenges such as multi-arm coordination, multi-target task scheduling, and dynamic environment adaptation, advanced intelligent methods including DRL, dynamic task allocation, and multi-agent collaboration are extensively utilized in current research. The perception, decision-making, and operational adaptability of autonomous harvesting systems are continuously being advanced [81].
In terms of path planning, effective navigation and decision-making technologies can reduce operation time and optimize machinery routes. They also contribute to minimizing equipment wear and lowering energy consumption during operation [82]. However, in complex field environments with factors such as changing light, shading disturbance, irregular crop lines, and weeds, it is challenging to quickly and accurately identify crop rows or harvesting boundaries [83]. Therefore, high-precision perception of unstructured farmland environments, including obstacle detection and the automatic recognition of crop rows and field boundaries, is essential and is regarded as the primary task for achieving autonomous navigation and operation of harvesting equipment. In 2021, based on an improved grayscale factor and a prediction-point Hough transform, Chen et al. [84] designed a visual navigation path extraction method for a greenhouse cucumber-picking robot. Through innovative grayscale segmentation and predictive point-guided curve fitting, path extraction accuracy and real-time performance were improved; compared with the traditional Hough transform and least-squares methods, the proposed method offered higher robustness and efficiency. In 2023, Wang et al. [85] developed a semantic segmentation model based on the lightweight MV3DLabV3+DL network. This model was constructed by integrating the MobileNetV3_Large backbone, the LeakyReLU activation function, depthwise separable atrous convolution, and B-spline curve fitting, enabling the accurate extraction of wheat combine harvesting boundaries.
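A minimal stand-in for Hough-style navigation-line extraction is a least-squares line fit through the centroids of a detected crop row: once a segmentation step has located the row pixels, the fitted line gives the steering reference. The pixel coordinates below are illustrative only.

```python
import numpy as np

def navigation_line(row_points):
    """Fit u = a*v + b through (v, u) row centroids with least squares."""
    pts = np.asarray(row_points, float)
    v, u = pts[:, 0], pts[:, 1]
    A = np.stack([v, np.ones_like(v)], axis=1)
    (a, b), *_ = np.linalg.lstsq(A, u, rcond=None)
    return a, b

# Centroids of one crop row detected in successive image rows (v, u).
points = [(0, 100), (50, 110), (100, 120), (150, 130)]
a, b = navigation_line(points)

# Lateral position of the row at the bottom of a 200-row image,
# i.e. closest to the vehicle:
u_bottom = a * 200 + b
```

Parameterizing the line as u(v) rather than v(u) keeps the fit well-conditioned for near-vertical rows, which is the common case in forward-facing field imagery.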
In collaborative harvesting with robotic arms, the uncertainty and diversity of tasks in orchard environments place increased demands on task allocation mechanisms. Proper task decomposition, allocation, and synchronized operation are crucial for improving overall system performance, particularly in field scenarios involving rigid collision risks and high environmental uncertainty. In 2024, Yang et al. [86] developed a lightweight mask region-based convolutional neural network (Mask R-CNN) visual perception system, along with a 6-DOF dual-arm collaborative structure. In complex ridge-planted strawberry environments, this system successfully accomplished intelligent 3D recognition of picking points, optimized path planning, and non-destructive fruit harvesting. In 2025, Xie et al. [87] introduced a dynamic task planning approach based on DRL, in which Markov game theory, self-attention mechanisms, and a centralized training framework were integrated to enhance multi-agent coordination and decision-making, improving the efficiency of autonomous planning and decision-making in multi-arm collaboration.
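The DRL schedulers cited above are well beyond a short snippet, but the underlying allocation problem can be sketched with the simplest baseline they improve on: greedily assigning each fruit to its nearest arm. All positions below are invented for illustration.

```python
import math

def greedy_allocate(arm_positions, fruit_positions):
    """Assign each fruit to the nearest arm (greedy baseline scheduler).

    Returns a dict mapping arm index -> list of fruit indices.
    """
    plan = {i: [] for i in range(len(arm_positions))}
    for f_idx, fruit in enumerate(fruit_positions):
        nearest = min(
            range(len(arm_positions)),
            key=lambda a: math.dist(arm_positions[a], fruit),
        )
        plan[nearest].append(f_idx)
    return plan

arms = [(0.0, 0.0), (1.0, 0.0)]                    # two arm base positions (m)
fruits = [(0.1, 0.2), (0.9, 0.1), (0.2, 0.0)]      # detected fruit positions (m)
plan = greedy_allocate(arms, fruits)
```

A learned planner additionally reasons about execution order, arm workspace overlap, and collision risk, which is where the reported 10.7% reduction in execution time comes from.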
In conclusion, as shown in Table 10, various advanced methods and intelligent control systems have been implemented to optimize agricultural harvesting tasks, leading to significant improvements in accuracy, efficiency, and real-time performance across different environments and conditions.

3.3.3. Maturity Evaluation of Harvesting Equipment

The accurate determination of crop maturity is crucial for optimizing harvest timing, controlling post-harvest quality, and enhancing industry value. Although traditional expert sensory evaluation is convenient, it is difficult to achieve consistency and large-scale standardization due to the influence of the evaluators' physiological state and subjective factors [88]. The integration of AI and multimodal sensing technologies has greatly improved the objectivity, efficiency, and accuracy of crop maturity assessment [89] and has steadily advanced the online, non-destructive detection capabilities of harvesting equipment.
In terms of spectral analysis and ML, in 2020, Qin et al. [90] used hyperspectral microscopy imaging to extract the spatial and spectral characteristics of tea samples. Competitive adaptive reweighted sampling and ANN modeling were combined to achieve rapid and objective prediction of sensory attributes, including the appearance, liquor color, aroma, taste, and overall quality of matcha. In 2021, using surface-enhanced Raman scattering based on unlabeled gold nanorods and combining multiple algorithms such as KNN, SVM, and BP-ANN, Guo et al. [91] achieved rapid, efficient, and non-destructive identification of the dominant spoilage bacteria during apple storage. In 2023, using visible/near-infrared spectra and ML algorithms such as partial least squares discriminant analysis (PLS-DA) and ANN, Qiu et al. [92] established a fast detection model for pineapple fruit maturity and soluble solids content with high accuracy and efficiency. In 2023, Xu et al. [93] utilized visible/near-infrared hyperspectral imaging (HSI) combined with a stacked autoencoder-LSSVM. Fruit size and other compensation factors were incorporated, and high-accuracy, non-destructive, and rapid prediction of total soluble solids and titratable acidity in Kyoho grapes was achieved. In the field of traditional image processing, in 2020, Chen et al. [94] proposed a real-time monitoring device for the grain crushing rate of rice combine harvesters based on a machine vision system. Real-time performance and automation levels were improved by multi-source sampling and image analysis. In addition, significant progress was achieved through the integration of novel sensing technologies and DL methods. In 2025, Wang et al. [95] constructed a colorimetric sensor array (CSA) based on a ZIF-8-mediated nanocomposite material, as shown in Figure 7. Combined with an ANN, highly sensitive detection and intelligent classification of volatile organic compounds at the camellia extract level were realized.
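The classifiers cited above (KNN, PLS-DA, ANN) all operate on the same principle: maturity classes form clusters in a spectral feature space. A toy nearest-centroid classifier over synthetic band ratios makes this concrete; it is a much-simplified stand-in for the published models, and the reflectance values are invented.

```python
import numpy as np

def fit_centroids(X, y):
    """Per-class mean spectra; X is (n_samples, n_bands)."""
    return {label: X[y == label].mean(axis=0) for label in np.unique(y)}

def predict(centroids, x):
    """Assign the class whose centroid is closest in Euclidean distance."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# Toy reflectance features (e.g., two band ratios) for unripe vs ripe fruit.
X = np.array([[0.20, 0.80],
              [0.25, 0.75],
              [0.70, 0.30],
              [0.75, 0.25]])
y = np.array(["unripe", "unripe", "ripe", "ripe"])

centroids = fit_centroids(X, y)
label = predict(centroids, np.array([0.72, 0.28]))
```

Real spectral pipelines precede this step with band selection (e.g., competitive adaptive reweighted sampling) because raw hyperspectral vectors have hundreds of strongly correlated bands.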
In summary, as presented in Table 11, various advanced methods and intelligent modeling techniques have been employed to improve the prediction accuracy, detection efficiency, and performance of agricultural quality assessments, leading to notable enhancements in both calibration and prediction metrics.

3.4. Artificial Intelligence Empowerment of Field Management Equipment

3.4.1. Intelligent Management of Irrigation Equipment

Water resources are a fundamental element of agricultural production, and their efficient and scientific utilization is crucial for the sustainable development of modern agriculture. Conventional moisture monitoring methods rely on manual sampling and laboratory analysis and are hampered by high costs, low efficiency, and an inability to provide timely feedback on field-level moisture dynamics. Consequently, these methods are increasingly inadequate for meeting the demands of real-time, high-efficiency data acquisition in precision agriculture [96].
By integrating multi-source heterogeneous data, including remote sensing imagery, meteorological data, and soil sensors, and leveraging AI models, real-time field information processing and intelligent optimization of dynamic irrigation strategies are enabled. Water use efficiency has been significantly enhanced, and the development of precision irrigation has been advanced [97]. In 2023, based on meteorological observations from five climatic zones in Pakistan, Raza et al. [98] optimized modeling approaches for reference evapotranspiration (ETo) to more accurately assess crop water requirements and to support irrigation scheduling in data-scarce regions. The M5P tree, sequential minimal optimization (SMO), radial basis function neural regression (RBFNreg), and multiple linear regression (MLR) were employed as ML techniques and systematically compared to improve ETo modeling. In 2024, based on field sampling from alluvial aquifers and multi-parameter water quality assessment, Boufekane et al. [99] evaluated and predicted the suitability of groundwater for irrigation as well as its future spatiotemporal variations. Geographic Information System (GIS) spatial analysis was integrated with a Long Short-Term Memory (LSTM) model to enhance prediction accuracy.
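Among the techniques compared by Raza et al., MLR is simple enough to sketch: regress ETo on meteorological predictors by ordinary least squares. The records below are synthetic and not taken from [98]; they merely illustrate the fitting step.

```python
import numpy as np

def fit_mlr(X, y):
    """Ordinary least squares with an intercept column; returns [b0, b1, ...]."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Toy daily records: [mean temperature (deg C), solar radiation (MJ/m2/day)].
X = np.array([[20, 15],
              [25, 20],
              [30, 25],
              [22, 18]], float)
ETo = np.array([3.5, 4.8, 6.1, 4.2])  # mm/day (synthetic)

coef = fit_mlr(X, ETo)
pred = coef[0] + X @ coef[1:]
```

The M5P tree and RBF network variants replace this single global linear model with piecewise-linear or kernel-based fits, which is why they tend to win in climatically heterogeneous regions.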
In 2024, Preite et al. [100] presented an intelligent irrigation management system that integrated soil and environmental sensor data with three-day weather forecasts and incorporated AI algorithms including MLP, SVM, and KNN. Meanwhile, to address irrigation speed fluctuations caused by complex field conditions such as rocks, pits, and pipeline structural variations, Tang et al. [101] developed a linear active disturbance rejection control strategy based on improved particle swarm optimization (IPSO), as illustrated in Figure 8. The method exhibited better speed tracking performance and disturbance rejection than traditional proportional–integral–derivative (PID) and linear active disturbance rejection control (LADRC) strategies under variable speed and load conditions, effectively ensuring operational stability and irrigation uniformity in complex environments. In 2024, Wang et al. [102] proposed an intelligent farmland irrigation warning system based on an enhanced genetic algorithm–backpropagation neural network (EGA-BPNN). By using the GA to enhance the global search and generalization capability of the BP neural network, high-precision prediction of irrigation flow was achieved under complex field conditions.
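As a reference point for the comparison above, a minimal discrete PID loop regulating the travel speed of a first-order plant is sketched below. The plant model and all gains are illustrative assumptions, not parameters from [101]; the IPSO-LADRC strategy replaces this structure with an extended state observer plus swarm-tuned gains.

```python
class PID:
    """Discrete PID controller (parallel form)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Assumed first-order sprinkler-cart speed model: v' = (u - v) / tau.
tau, dt = 0.5, 0.01
pid = PID(kp=2.0, ki=1.0, kd=0.0, dt=dt)
v, target = 0.0, 1.0  # current and desired speed (m/s)
for _ in range(2000):  # simulate 20 s with forward Euler
    u = pid.step(target, v)
    v += (u - v) / tau * dt
```

The integral term drives the steady-state speed error to zero, which is what "irrigation uniformity" requires; disturbance rejection under rocks and pits is where observer-based LADRC outperforms this baseline.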
In the field of intelligent irrigation decision-making, in 2025, Chen et al. [103] integrated the DSSAT crop model with distributional RL to develop a multi-source information-driven precision intelligent irrigation regulation system. The proposed system incorporated meteorological, soil, and crop physiological parameters. It demonstrated superior policy optimization performance under multiple scenarios and extreme climate conditions in Xinjiang cotton fields. It was more effective in improving water use efficiency and cotton yield than traditional empirical methods and the general Deep Q-Network.
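The DSSAT-coupled distributional RL system of [103] is far beyond a snippet, but the core RL loop can be shown with tabular Q-learning on a toy soil-moisture process. The states, transitions, and rewards below are invented purely for illustration and do not reflect the cited system.

```python
import random

random.seed(0)
STATES = ["dry", "ok", "wet"]
ACTIONS = ["irrigate", "wait"]

def transition(state, action):
    """Toy soil-moisture dynamics with an assumed reward scheme."""
    if action == "irrigate":
        next_state = {"dry": "ok", "ok": "wet", "wet": "wet"}[state]
        reward = {"dry": 1.0, "ok": 0.0, "wet": -1.0}[state]   # over-irrigation penalised
    else:
        next_state = {"dry": "dry", "ok": "dry", "wet": "ok"}[state]
        reward = {"dry": -1.0, "ok": 0.5, "wet": 0.5}[state]   # crop stress penalised
    return next_state, reward

Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
alpha, gamma, eps = 0.1, 0.9, 0.2
state = "ok"
for _ in range(5000):
    # Epsilon-greedy action selection.
    if random.random() < eps:
        action = random.choice(ACTIONS)
    else:
        action = max(ACTIONS, key=lambda a: Q[(state, a)])
    nxt, r = transition(state, action)
    # Standard Q-learning update.
    Q[(state, action)] += alpha * (
        r + gamma * max(Q[(nxt, a)] for a in ACTIONS) - Q[(state, action)]
    )
    state = nxt

policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in STATES}
```

A deep or distributional agent replaces the Q table with a network over continuous weather, soil, and crop-physiology inputs, but the update rule is the same in spirit.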
In summary, as illustrated in Table 12, various advanced modeling and optimization methods have been applied to enhance the accuracy and efficiency of irrigation predictions, water quality estimations, and farm flow forecasting, resulting in significant improvements in performance metrics such as error reduction, prediction accuracy, and processing speed.

3.4.2. Intelligent Management of Weeding Equipment

By competing with crops for light, nutrients, and growing space, weeds present a persistent and significant threat to crop yield and quality. Conventional field weeding methods primarily rely on chemical spraying, mechanical removal, and manual labor. Although these approaches suppress weed growth effectively, precise identification and operational safety are still difficult to achieve. Balancing crop safety with efficient weed management therefore remains a critical unresolved challenge in current agricultural practices [104].
In intelligent crop-weed discrimination, DL and multimodal perception technologies have accelerated the development of smart weeding systems. In 2021, Liu et al. [105] developed an intelligent weed recognition and variable-rate spraying system for farmland based on a CNN. The superior performance of VGG-16 in real-time weed detection and precision spraying was validated through comparison with the AlexNet and GoogLeNet deep network models. In 2024, Jia et al. [106] designed an intra-row obstacle-avoiding shovel-type weeder for orchards. By integrating theoretical modeling, ADAMS virtual simulation, and response surface optimization analysis, the optimal structural and operational parameters were determined to enable automatic obstacle avoidance and efficient mechanical weeding. In 2024, utilizing color feature analysis and an adaptive Otsu threshold segmentation algorithm, Chen et al. [107] proposed a cotton field weed identification method and obtained the optimal combination of key structural and operational parameters. In 2024, Zheng et al. [108] proposed a weed identification method for cotton fields based on YOLOv5s and the Otsu adaptive threshold segmentation algorithm. It was adaptable to multiple growth stages and enabled efficient discrimination between cotton seedlings and major weeds. In 2025, by employing morphological feature extraction and classification algorithms, Memon et al. [109] achieved precise discrimination between crops and weeds based on machine vision and image processing. The workflow of the weed recognition algorithm is displayed in Figure 9.
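Color-feature discrimination of the kind cited above commonly starts from an excess-green index (ExG), which separates vegetation from soil before any Otsu-style thresholding or morphological classification. The sketch below uses invented pixel values for a toy patch.

```python
import numpy as np

def excess_green(rgb):
    """ExG = 2g - r - b on chromaticity-normalised channels."""
    rgb = rgb.astype(float)
    total = rgb.sum(axis=-1, keepdims=True)
    total[total == 0] = 1.0  # avoid division by zero on black pixels
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    return 2 * g - r - b

# Toy 2x2 patch: three soil pixels (brownish) and one green plant pixel.
patch = np.array([[[120, 90, 60], [115, 85, 55]],
                  [[40, 160, 40], [125, 95, 60]]], dtype=np.uint8)
exg = excess_green(patch)
veg_mask = exg > 0.1  # vegetation (crop or weed) pixels
```

ExG only finds vegetation; telling crop from weed within the mask is the job of the morphological or DL classifiers described in the cited studies.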
In conclusion, as presented in Table 13, a variety of advanced image classification and detection methods have been implemented to improve weed identification and differentiation in agricultural environments, leading to significant enhancements in accuracy, processing time, and overall system performance.

3.4.3. Intelligent Management of Fertilization Equipment

Under the background of intensive development in modern agriculture, excessive application of chemical fertilizers has posed severe challenges to agricultural ecosystems. There are challenges such as decreased soil microbial activity, deterioration of soil fertility, and declines in crop yield and quality [110]. The scientifically precise fertilizer application not only enhances crop nutrient uptake efficiency but also mitigates nitrogen leaching, promotes microbial transformation, and reduces agricultural environmental footprints [111].
In the domain of crop growth monitoring and fertilization management, non-contact sensing techniques such as remote sensing and UAV platforms have gained extensive application due to their high spatiotemporal resolution and broad coverage. Studies have investigated the integration of UAV-based multispectral and HSI technologies with ML approaches, which offers an effective means of rapid, non-destructive, and spatially resolved quantitative monitoring of crop nutrients and biochemical traits. In 2023, by combining UAV-based digital images, multispectral imagery, and a Gaussian process regression (GPR) algorithm with minimum redundancy maximum relevance (mRMR) feature selection, Xu et al. [112] achieved high-precision remote estimation of rice leaf nitrogen content. In 2023, Elsayed et al. [113] developed an integrated gradient boosting regression model combined with novel two-dimensional and three-dimensional spectral indices. This model accurately estimated the chlorophyll content, yield, and quality of sugar beets under different nitrogen regimes. In 2024, Zhang et al. [114] applied ML algorithms such as SVM and random forest (RF), combining them with UAV-derived vegetation indices to achieve high-accuracy remote sensing estimation of leaf chlorophyll content (LCC) in winter wheat. The study covered multiple cultivars, growth stages, and nitrogen stress conditions. As shown in Figure 10, the results demonstrated the superior performance of SVM regression under complex conditions.
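The UAV studies above all regress a biochemical trait on vegetation indices computed from band reflectances. A minimal version of that workflow, NDVI plus a univariate least-squares calibration, is sketched below on synthetic values; the reflectances and SPAD readings are invented and the cited works use richer indices and nonlinear models.

```python
import numpy as np

def ndvi(nir, red):
    """Normalised difference vegetation index from band reflectances."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

# Synthetic calibration plots: band reflectances and measured chlorophyll (SPAD).
nir = np.array([0.45, 0.50, 0.55, 0.60])
red = np.array([0.10, 0.08, 0.06, 0.05])
spad = np.array([35.0, 40.0, 45.0, 48.0])

x = ndvi(nir, red)
slope, intercept = np.polyfit(x, spad, 1)
predicted = slope * x + intercept
```

Once calibrated against ground-truth plots, the fitted relation is applied pixel-wise to a UAV orthomosaic to produce a nitrogen or chlorophyll prescription map.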
Significant progress has also been made in the intelligent control of fertilization equipment. To address the limited control precision of traditional fertilization devices, Zhu et al. [115] proposed a fertilizer discharge system based on PID control with PSO parameter optimization in 2023. High-precision, low-error fertilization operations were achieved by the system through modeling, simulation, and experimental validation. In 2024, based on biogas slurry conductivity, Jiang et al. [116] developed a slurry mixing decision algorithm with a feedback control mechanism. Application accuracy was enhanced, fertilizer consumption was significantly reduced, and fertilization management was optimized by this device. In 2024, Shi et al. [117] constructed an intelligent fertigation system integrating a combined Mariotte siphon structure with a fuzzy PID controller. The system enabled precise irrigation and nutrient solution preparation. Under greenhouse hydroponic conditions for tomatoes, it achieved stable irrigation management with minimal fluctuation. In 2024, Zhu et al. [118] developed an improved fireworks algorithm for task allocation among multiple fertilization machines operating cooperatively in different fields. The algorithm integrated discrete decoding, chaotic mapping-based initialization, and Cauchy mutation. It was more efficient and stable than traditional optimization algorithms in practical scenarios.
Based on the above, as shown in Table 14, various advanced methods and optimization techniques have been employed to enhance fertilization processes. These approaches have led to significant improvements in accuracy, efficiency, and overall system performance, particularly in areas such as nitrogen content estimation, nutrient control, and task allocation for cooperative operations.

4. Challenges and Prospects of Intelligent Agricultural Equipment

4.1. Typical Artificial Intelligence Models and Applications in Agricultural Equipment

Agricultural equipment faces challenges posed by labor shortages, limited arable land, and the growing food production demands driven by global population growth. As a result, AI technology is widely recognized as a key driving force for advancing the mechanization and intelligent upgrading of agriculture. In recent years, research has focused on critical processes such as land preparation, transplanting, sowing, harvesting, and field management. AI has deeply penetrated and transformed the operational models of agricultural equipment, significantly enhancing operational efficiency and the level of operational intelligence. Table 15 summarizes the commonly used AI methods and their main tasks in various agricultural equipment operation scenarios.
In modern agriculture, traditional ML algorithms (such as SVM and K-means clustering) and advanced DL models (such as YOLO, Mask R-CNN, and MobileNet) are widely applied. Multiple tasks have been effectively addressed, including automatic crop recognition, dynamic optimization of operation parameters, target detection and localization, and intelligent assessment of crop quality. Computer vision and data processing systems have been developed based on these AI methods, enabling intelligent perception, autonomous navigation, and environmental adaptation in complex, unstructured field environments. The deep integration of multi-source sensing technologies (such as RGB-D cameras and drone-based multispectral imaging) with ML and computer vision has significantly enhanced the capabilities for intelligent data collection and operation evaluation, providing a solid information foundation for precision agriculture. On this basis, the integration of deep neural networks with remote sensing, spectral, and other multimodal perception data effectively supports key decision-making processes such as weed control, fertilization management, and intelligent irrigation. Ultimately, the innovative integration of AI models, multi-source intelligent sensing, and control technologies enhances modern agricultural equipment in terms of precision operation, efficient resource utilization, and system robustness. These advancements are accelerating the transformation of agricultural production toward intelligent and high-efficiency systems.

4.2. Key Artificial Intelligence Technology in Agricultural Equipment

4.2.1. Intelligent Autonomous Operation Technology in Unstructured Environments

The agricultural production environment is highly dynamic and unstructured. There are significant challenges for autonomous navigation and path planning due to complex terrain, uneven crop distribution, and frequent occlusions. To achieve accurate environmental perception and target recognition across multiple scenarios, multi-sensor fusion and multi-source integrated navigation (machine vision, LiDAR, inertial navigation, and ultrasonic sensors) have been extensively studied. Meanwhile, AI algorithms are introduced to enhance obstacle detection, avoidance, and autonomous path generation. Thus, autonomous operation and safe performance of the equipment in complex terrain and dynamic environments are improved. It provides a solid foundation for intelligent perception and high-precision tasks.

4.2.2. Multimodal Intelligent Detection and Recognition Technologies in Complex Scenarios

Under complex conditions such as vegetation overlap, intense lighting, shadows, and soil occlusion, it is challenging to accurately detect crops, weeds, and foreign objects. Reliable identification and segmentation are complicated by these conditions. To overcome the limitations of single-feature and single-source perception, DL-driven multimodal information fusion methods have been investigated. Through intelligent feature integration and enhancement, target detection accuracy is improved. System robustness is significantly enhanced. In addition, compared to traditional image processing and spectral analysis methods, DL not only entails lower costs but also offers greater robustness in complex scenarios. It has increasingly become a focal point of detection research. Based on multimodal perception, the AI detection system has been developed to achieve efficient recognition and robust sensing of multiple target types in complex scenarios. As a result, there is a solid foundation for the operational optimization, precise decision-making, and closed-loop control of agricultural machinery and equipment.

4.2.3. Precision Control and Decision Optimization Technologies in Dynamic Environments

Due to complex operating conditions and dynamic environmental changes, there are higher demands for control accuracy and stability of agricultural equipment. In recent years, adaptive control, model predictive control, and fuzzy control methods have been widely applied. The key equipment parameters are sensed in real time through multi-sensor systems such as inertial navigation, vision, and radar. The control system dynamically adjusts to cope with environmental disturbances, including soil moisture fluctuations and crop growth stage variations. These data-driven and model-based control technologies significantly improve operational accuracy in tasks such as precision fertilization and tillage depth regulation. They also enhance system robustness. The evolution of intelligent equipment control strategies toward dynamic adaptation is driven by this advancement.

4.3. Pathways to Achieving Key Technological Breakthroughs

4.3.1. Fusion of Virtual and Physical Realms in Digital Twin Models

A virtual model highly synchronized with the physical entity is constructed through DT, achieving real-time data interaction and dynamic mapping between the physical world and digital space. In intelligent agricultural equipment, DT models encompass equipment kinematics, operational status, and energy consumption monitoring. They also enable high-precision reconstruction of work scenarios and resource distribution. Based on virtual-environment strategy simulation and parameter optimization, task outcome prediction, fault warning, and intelligent feedback are realized. Consequently, physical equipment and virtual systems can collaboratively optimize and continuously evolve. DT models provide robust theoretical foundations and platform support for intelligent lifecycle management, including equipment design, remote diagnostics, and field operation management. However, to fully realize "physical–digital" collaborative optimization and end-to-end intelligent lifecycle management, ongoing breakthroughs in heterogeneous data acquisition and integration are essential, as are advancements in high-bandwidth/low-latency communications and edge- and cloud-computing resources.

4.3.2. Collaborative Optimization of Edge Computing and Big Data

The scale and complexity of field data in agriculture are continuously increasing, and the single-cloud computing model is limited in meeting the real-time control requirements of intelligent agricultural equipment. Edge computing performs real-time analysis and preliminary decision-making at the data source to reduce latency and optimize local task responsiveness. Meanwhile, big data platforms excel at deep mining of historical information and modeling of global knowledge. A closed-loop mechanism of "on-site sensing–edge processing–cloud optimization–intelligent feedback" is formed through edge–cloud collaboration. Edge nodes focus on real-time data and anomaly detection, while multi-source data are integrated in the cloud to support modeling and prediction, with mutual reinforcement between the two. This architecture significantly enhances the adaptability of production scheduling and the efficiency of resource allocation. In addition, it strengthens the intelligence and real-time performance of complex decision-making processes, including multi-agent collaboration, task allocation, and agricultural logistics optimization. However, it demands high-performance edge devices, upgraded communication infrastructure, and the establishment of big-data platforms. Large-scale adoption is difficult for small- to mid-sized farms and resource-constrained regions due to high initial investment and operational costs. Therefore, attention has been given to solutions such as lightweight edge computing and industrial ecosystem collaboration.

4.3.3. Self-Evolution of Intelligent Control Strategies Driven by Deep Reinforcement Learning

Due to continuous interaction and adaptive learning capabilities of RL, the processes of perception, decision-making, and execution are continuously optimized in response to environmental feedback. Moreover, as deep neural networks are capable of handling multivariable and high-dimensional complex control problems, DRL enables autonomous operations and precise regulation under complex agricultural conditions. Multi-agent RL supports multi-robot collaboration and intelligent resource allocation. It enhances operational efficiency. This approach meets the demands of large-scale, multi-region autonomous operations. Based on DRL, the transition of equipment from traditional rule-based to data-driven operation is promoted by intelligent control strategies. Meanwhile, the core support for highly adaptive and intelligent agricultural production systems is established.

5. Conclusions

This article systematically summarizes the research progress and representative applications of AI in agricultural machinery, covering key stages such as tillage, seeding, transplanting, harvesting, and field management. The deep integration and innovative development of technologies such as computer vision, machine learning, and intelligent sensing of agricultural equipment are analyzed.
(1)
In the tillage stage, by collecting and modeling multi-source vibration signals, tillage equipment can perceive soil resistance and terrain variations in real time. The fusion of deep learning with fuzzy logic control autonomously optimizes tillage settings, enhancing both stability and soil-turning quality.
(2)
In the seeding and transplanting stage, based on visual perception and deep neural networks, path planning and seedling recognition systems have been developed. Seeding and transplanting equipment is capable of automatically identifying row spacing, obstacles, and planting positions, enabling precise control of seeding density.
(3)
In the harvesting stage, by integrating RGB-D imaging, infrared thermography, and multimodal sensing technologies, harvesting equipment can accurately identify the location, size, and ripeness of target fruits. The processing efficiency has been improved by lightweight networks and image segmentation algorithms.
(4)
In the field management stage, intelligent irrigation, fertilization, and weeding equipment has been integrated with soil moisture sensing, weather forecasting, and crop modeling. Operation strategies are dynamically optimized using reinforcement learning algorithms, facilitating the implementation of precision agriculture. Weeding equipment achieves accurate discrimination between crops and weeds through target recognition and classification algorithms.
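The crop–weed discrimination step summarized in (4) can be reduced to its essentials with a tiny nearest-neighbor classifier. The feature values and names below (excess-green index, leaf-area ratio) are illustrative toy data, not taken from any cited study; real systems extract such features from segmented field images before classification.

```python
import math

# Toy labeled samples: (excess-green index, leaf-area ratio) -> class.
train = [
    ((0.62, 0.80), "crop"), ((0.58, 0.75), "crop"), ((0.65, 0.85), "crop"),
    ((0.40, 0.30), "weed"), ((0.35, 0.25), "weed"), ((0.45, 0.35), "weed"),
]

def knn_predict(x, k=3):
    """Classify feature vector x by majority vote of its k nearest training samples."""
    nearest = sorted(train, key=lambda s: math.dist(x, s[0]))[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)

print(knn_predict((0.60, 0.78)))  # prints "crop": the query sits in the crop cluster
```

Deployed weeding systems use far richer features (deep embeddings, multispectral bands) and larger models, but the decision structure — map a detected plant to a feature vector, then to a class, then to an actuation command — is the same.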
To advance this field, future research should prioritize the following directions: developing DT models to enable full-lifecycle modeling and performance prediction; strengthening the synergy between edge computing and big data; and introducing DRL-driven adaptive optimization of control strategies.

Author Contributions

Conceptualization, investigation, funding acquisition, Y.Z.; methodology, writing—original draft preparation, S.Z.; validation, writing—review and editing, S.T.; formal analysis, validation, Q.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Chen, T.; Lv, L.; Wang, D.; Zhang, J.; Yang, Y.; Zhao, Z.; Chen, W.; Guo, X.; Chen, H.; Wang, Q.; et al. Empowering Agrifood System with Artificial Intelligence: A Survey of the Progress, Challenges and Opportunities. ACM Comput. Surv. 2023, 57, 1–37. [Google Scholar] [CrossRef]
  2. El Jarroudi, M.; Kouadio, L.; Delfosse, P.; Bock, C.H.; Mahlein, A.K.; Fettweis, X.; Mercatoris, B.; Adams, F.; Lenn, J.; Hamdioui, S. Leveraging Edge Artificial Intelligence for Sustainable Agriculture. Nat. Sustain. 2024, 7, 846–854. [Google Scholar] [CrossRef]
  3. Chang, H.; Yang, J.; Wang, Z.; Peng, G.; Lin, R.; Lou, Y.; Shi, W.; Zhou, L. Efficiency Optimization of Energy Storage Centrifugal Pump by Using Energy Balance Equation and Non-Dominated Sorting Genetic Algorithms-II. J. Energy Storage 2025, 114, 115817. [Google Scholar] [CrossRef]
  4. Wang, Z.; Chen, Y.; Rakibuzzaman, M.; Agarwal, R.; Zhou, L. Numerical and Experimental Investigations of a Double-Suction Pump with a Middle Spacer and a Staggered Impeller. Irrig. Drain. 2025, 10, 944–956. [Google Scholar] [CrossRef]
  5. Li, Y.; Xu, L.; Lv, L.; Shi, Y.; Yu, X. Study on Modeling Method of a Multi-Parameter Control System for Threshing and Cleaning Devices in the Grain Combine Harvester. Agriculture 2022, 12, 1483. [Google Scholar] [CrossRef]
  6. He, W.; Liu, Y.; Sun, H.; Taghizadeh-Hesary, F. How Does Climate Change Affect Rice Yield in China? Agriculture 2020, 10, 441. [Google Scholar] [CrossRef]
  7. Ali, A.B.; Elshaikh, N.A.; Hong, L.; Adam, A.B.; Haofang, Y. Conservation Tillage as an Approach to Enhance Crops Water Use Efficiency. Acta Agric. Scand. Sect. B—Soil Plant Sci. 2017, 67, 252–262. [Google Scholar] [CrossRef]
  8. El-Emam, M.A.; Zhou, L.; Omara, A.I. Predicting the Performance of Aero-Type Cyclone Separators with Different Spiral Inlets Under Macroscopic Bio-Granular Flow Using CFD-DEM Modelling. Biosyst. Eng. 2017, 233, 125–150. [Google Scholar] [CrossRef]
  9. Peng, Y.; Wang, A.; Liu, J.; Faheem, M. A Comparative Study of Semantic Segmentation Models for Identification of Grape with Different Varieties. Agriculture 2021, 11, 997. [Google Scholar] [CrossRef]
  10. Wang, N.; Jin, Z.; Wang, T.; Xiao, J.; Zhang, Z.; Wang, H.; Zhang, M.; Li, H. Hybrid Path Planning Methods for Complete Coverage in Harvesting Operation Scenarios. Comput. Electron. Agric. 2025, 231, 109946. [Google Scholar] [CrossRef]
  11. Sodjinou, S.G.; Mahama, A.T.S.; Gouton, P. Automatic Segmentation of Plants and Weeds in Wide-Band Multispectral Imaging (WMI). J. Imaging 2025, 11, 85. [Google Scholar] [CrossRef] [PubMed]
  12. Rayhana, R.; Xiao, G.; Liu, Z. RFID Sensing Technologies for Smart Agriculture. IEEE Instrum. Meas. Mag. 2021, 24, 50–60. [Google Scholar] [CrossRef]
  13. Williams, H.A.; Jones, M.H.; Nejati, M.; Seabright, M.J.; Bell, J.; Penhall, N.D.; Barnett, J.J.; Duke, M.D.; Scarfe, A.J.; Ahn, H.S.; et al. Robotic Kiwifruit Harvesting Using Machine Vision, Convolutional Neural Networks, and Robotic Arms. Biosyst. Eng. 2019, 181, 140–156. [Google Scholar] [CrossRef]
  14. Nasiri, A.; Omid, M.; Taheri-Garavand, A.; Jafari, A. Deep Learning-Based Precision Agriculture Through Weed Recognition in Sugar Beet Fields. Sustain. Comput. Inform. Syst. 2022, 35, 100759. [Google Scholar] [CrossRef]
  15. El Akrouchi, M.; Mhada, M.; Gracia, D.R.; Hawkesford, M.J.; Gérard, B. Optimizing Mask R-CNN for Enhanced Quinoa Panicle Detection and Segmentation in Precision Agriculture. Front. Plant Sci. 2025, 16, 1472688. [Google Scholar] [CrossRef]
  16. Qiu, C.; Zhao, B.; Liu, S.; Zhang, W.; Zhou, L.; Li, Y.; Guo, R. Data Classification and Demand Prediction Methods Based on Semi-Supervised Agricultural Machinery Spare Parts Data. Agriculture 2022, 13, 49. [Google Scholar] [CrossRef]
  17. Kim, H.; Sim, S.H.; Yoon, J.; Lee, J. Full-Scale Structural Displacement Measurement with Camera Ego-Motion Compensation Using RGB and LiDAR Cameras. Measurement 2024, 237, 115194. [Google Scholar] [CrossRef]
  18. Iqbal, B.; Alabbosh, K.F.; Jalal, A.; Suboktagin, S.; Elboughdiri, N. Sustainable Food Systems Transformation in the Face of Climate Change: Strategies, Challenges, and Policy Implications. Food Sci. Biotechnol. 2025, 34, 871–883. [Google Scholar] [CrossRef]
  19. Yin, L.; Jayan, H.; Cai, J.; El-Seedi, H.R.; Guo, Z.; Zou, X. Spoilage Monitoring and Early Warning for Apples in Storage Using Gas Sensors and Chemometrics. Foods 2023, 12, 2968. [Google Scholar] [CrossRef]
  20. You, H.; Xu, F.; Ye, Y.; Xia, P.; Du, J. Adaptive LiDAR Scanning Based on RGB Information. Automat. Constr. 2024, 160, 105337. [Google Scholar] [CrossRef]
  21. Shoaib, M.; Li, H.; Khan, I.M.; Hassan, M.M.; Zareef, M.; Niazi, S.; Chen, Q. Emerging MXenes-Based Aptasensors: A Paradigm Shift in Food Safety Detection. Trends Food Sci. Tech. 2024, 151, 104635. [Google Scholar] [CrossRef]
  22. Zhang, Z.; Zhang, Y.; Jayan, H.; Gao, S.; Zhou, R.; Yosri, N.; Zou, X.; Guo, Z. Recent and Emerging Trends of Metal-Organic Frameworks (MOFs)-Based Sensors for Detecting Food Contaminants: A Critical and Comprehensive Review. Food Chem. 2024, 448, 139051. [Google Scholar] [CrossRef]
  23. Ma, J.; Li, M.; Fan, W.; Liu, J. State-of-the-Art Techniques for Fruit Maturity Detection. Agronomy 2024, 14, 2783. [Google Scholar] [CrossRef]
  24. Wang, C.; Wang, H.; Han, Q.; Wu, Z.; Li, C.; Zhang, Z. Litchi Bunch Detection and Ripeness Assessment Using Deep Learning and Clustering with Image Processing Techniques. Biosyst. Eng. 2025, 255, 104173. [Google Scholar] [CrossRef]
  25. Wang, H.; Gu, J.; Wang, M. A Review on the Application of Computer Vision and Machine Learning in the Tea Industry. Front. Sustain. Food Syst. 2023, 7, 1172543. [Google Scholar] [CrossRef]
  26. Cutini, M.; Brambilla, M.; Bisaglia, C. Whole-Body Vibration in Farming: Background Document for Creating a Simplified Procedure to Determine Agricultural Tractor Vibration Comfort. Agriculture 2017, 7, 84. [Google Scholar] [CrossRef]
  27. Dai, D.; Chen, D.; Wang, S.; Li, S.; Mao, X.; Zhang, B.; Wang, Z.; Ma, Z. Compilation and Extrapolation of Load Spectrum of Tractor Ground Vibration Load Based on CEEMDAN-POT Model. Agriculture 2023, 13, 125. [Google Scholar] [CrossRef]
  28. Wang, S.; Lu, B. Detecting the Weak Damped Oscillation Signal in the Agricultural Machinery Working Environment by Vibrational Resonance in the Duffing System. J. Mech. Sci. Technol. 2022, 36, 5925–5937. [Google Scholar] [CrossRef]
  29. Gao, Y.; Hu, Y.; Yang, Y.; Feng, K.; Han, X.; Li, P.; Zhu, Y.; Song, Q. Optimization of Operating Parameters for Straw Returning Machine Based on Vibration Characteristic Analysis. Agronomy 2024, 14, 2388. [Google Scholar] [CrossRef]
  30. Aiello, G.; Catania, P.; Vallone, M.; Venticinque, M. Worker Safety in Agriculture 4.0: A New Approach for Mapping Operator’s Vibration Risk Through Machine Learning Activity Recognition. Comput. Electron. Agric. 2022, 193, 106637. [Google Scholar] [CrossRef]
  31. Gao, Y.; Yang, Y.; Fu, S.; Feng, K.; Han, X.; Hu, Y.; Zhu, Q.; Wei, X. Analysis of Vibration Characteristics of Tractor-Rotary Cultivator Combination Based on Time Domain and Frequency Domain. Agriculture 2024, 14, 1139. [Google Scholar] [CrossRef]
  32. Singh, A.; Nawayseh, N.; Singh, H.; Dhabi, Y.K.; Samuel, S. Internet of Agriculture: Analyzing and Predicting Tractor Ride Comfort Through Supervised Machine Learning. Eng. Appl. Artif. Intel. 2023, 125, 106720. [Google Scholar] [CrossRef]
  33. Singh, A.; Nawayseh, N.; Dhabi, Y.K.; Samuel, S.; Singh, H. Transforming Farming with Intelligence: Smart Vibration Monitoring and Alert System. J. Eng. Res. 2024, 12, 190–199. [Google Scholar] [CrossRef]
  34. Jin, X.; Chen, K.; Ji, J.; Zhao, K.; Du, X.; Ma, H. Intelligent Vibration Detection and Control System of Agricultural Machinery Engine. Measurement 2019, 145, 503–510. [Google Scholar] [CrossRef]
  35. Wang, X.; Zheng, Z.; Jia, W.; Tai, K.; Xu, Y.; He, Y. Response Mechanism and Evolution Trend of Carbon Effect in the Farmland Ecosystem of the Middle and Lower Reaches of the Yangtze River. Agronomy 2024, 14, 2354. [Google Scholar] [CrossRef]
  36. Zhao, Z.; Li, H.; Liu, J.; Yang, S.X. Control Method of Seedbed Compactness Based on Fragment Soil Compaction Dynamic Characteristics. Soil Till. Res. 2020, 198, 104551. [Google Scholar] [CrossRef]
  37. Ben Hassen, H.; Elaoud, A.; Masmoudi, K. Modeling of Agricultural Soil Compaction Using Discrete Bayesian Networks. Int. J. Environ. Sci. Technol. 2020, 17, 2571–2582. [Google Scholar] [CrossRef]
  38. Wang, X.; Wang, T.; Zhang, J.; Ma, G. Autonomous Soil Vision Scanning System for Intelligent Subgrade Compaction. Automat. Constr. 2024, 158, 105242. [Google Scholar] [CrossRef]
  39. Carrera, A.; Barone, I.; Pavoni, M.; Boaga, J.; Dal Ferro, N.; Cassiani, G.; Morari, F. Assessment of Different Agricultural Soil Compaction Levels Using Shallow Seismic Geophysical Methods. Geoderma 2024, 447, 116914. [Google Scholar] [CrossRef]
  40. Meehan, C.L.; Baker, W.J., III. Scanners, Satellites, Smart Compactors, and Drones: Emerging Technologies for Assessing Compacted Soil Lift Thickness. Transp. Geotech. 2025, 52, 101574. [Google Scholar] [CrossRef]
  41. Lakhiar, I.A.; Yan, H.; Zhang, C.; Wang, G.; He, B.; Hao, B.; Han, Y.; Wang, B.; Bao, R.; Syed, T.N.; et al. A Review of Precision Irrigation Water-Saving Technology Under Changing Climate for Enhancing Water Use Efficiency, Crop Yield, and Environmental Footprints. Agriculture 2024, 14, 1141. [Google Scholar] [CrossRef]
  42. Gao, J.; Qi, H. Soil Throwing Experiments for Reverse Rotary Tillage at Various Depths, Travel Speeds, and Rotational Speeds. Trans. ASABE 2017, 60, 1113–1121. [Google Scholar] [CrossRef]
  43. Kim, Y.S.; Kim, T.J.; Kim, Y.J.; Lee, S.D.; Park, S.U.; Kim, W.S. Development of a Real-Time Tillage Depth Measurement System for Agricultural Tractors: Application to the Effect Analysis of Tillage Depth on Draft Force During Plow Tillage. Sensors 2020, 20, 912. [Google Scholar] [CrossRef] [PubMed]
  44. Yin, Y.; Zhao, C.; Zhang, Y.; Chen, J.; Luo, C.; Wang, P.; Chen, L.; Meng, Z. Development and Application of Subsoiling Monitoring System Based on Edge Computing Using IoT Architecture. Comput. Electron. Agric. 2022, 198, 106976. [Google Scholar] [CrossRef]
  45. Kim, Y.S.; Lee, S.D.; Baek, S.M.; Baek, S.Y.; Jeon, H.H.; Lee, J.H.; Siddque, M.A.; Kim, Y.J.; Kim, W.S.; Sim, T.; et al. Development of DEM-MBD Coupling Model for Draft Force Prediction of Agricultural Tractor with Plowing Depth. Comput. Electron. Agric. 2022, 202, 107405. [Google Scholar] [CrossRef]
  46. Zhang, B.; Bai, T.; Wu, G.; Wang, H.; Zhu, Q.; Zhang, G.; Meng, Z.; Wen, C. Fatigue Analysis of Shovel Body Based on Tractor Subsoiling Operation Measured Data. Agriculture 2024, 14, 1604. [Google Scholar] [CrossRef]
  47. Zhao, S.; Adade, S.Y.S.S.; Wang, Z.; Jiao, T.; Ouyang, Q.; Li, H.; Chen, Q. Deep Learning and Feature Reconstruction Assisted Vis-NIR Calibration Method for On-Line Monitoring of Key Growth Indicators During Kombucha Production. Food Chem. 2025, 463, 141411. [Google Scholar] [CrossRef]
  48. Opiyo, S.; Okinda, C.; Zhou, J.; Mwangi, E.; Makange, N. Medial Axis-Based Machine-Vision System for Orchard Robot Navigation. Comput. Electron. Agric. 2021, 185, 106153. [Google Scholar] [CrossRef]
  49. Liu, W.; Zhou, J.; Liu, Y.; Zhang, T.; Meng, Y.; Chen, J.; Zhou, C.; Hu, J.; Chen, X. An Ultrasonic Ridge-Tracking Method Based on Limiter Sliding Window Filter and Fuzzy Pure Pursuit Control for Ridge Transplanter. Agriculture 2024, 14, 1713. [Google Scholar] [CrossRef]
  50. Shet, R.M.; Lakhekar, G.V.; Iyer, N.C. Intelligent Fractional-Order Sliding Mode Control Based Maneuvering of an Autonomous Vehicle. J. Ambient Intell. Humaniz. Comput. 2024, 15, 2807–2826. [Google Scholar] [CrossRef]
  51. Liu, W.; Hu, J.; Liu, J.; Yue, R.; Zhang, T.; Yao, M.; Li, J. Method for the Navigation Line Recognition of the Ridge without Crops Via Machine Vision. Int. J. Agric. Biol. Eng. 2024, 17, 230–239. [Google Scholar] [CrossRef]
  52. Aytem, H.; Karayel, D.; Šarauskis, E. Influence of Tillage Methods on Transplanter Performance with Different Transplanting Mechanisms. Sci. Rep. 2025, 15, 13081. [Google Scholar] [CrossRef]
  53. Jin, X.; Li, R.; Tang, Q.; Wu, J.; Jiang, L.; Wu, C. Low-Damage Transplanting Method for Leafy Vegetable Seedlings Based on Machine Vision. Biosyst. Eng. 2022, 220, 159–171. [Google Scholar] [CrossRef]
  54. Zhang, J.L.; Su, W.H.; Zhang, H.Y.; Peng, Y. SE-YOLOv5x: An Optimized Model Based on Transfer Learning and Visual Attention Mechanism for Identifying and Localizing Weeds and Vegetables. Agronomy 2022, 12, 2061. [Google Scholar] [CrossRef]
  55. Cui, J.; Zheng, H.; Zeng, Z.; Yang, Y.; Ma, R.; Tian, Y.; Tan, J.; Feng, X.; Qi, L. Real-Time Missing Seedling Counting in Paddy Fields Based on Lightweight Network and Tracking-by-Detection Algorithm. Comput. Electron. Agric. 2023, 212, 108045. [Google Scholar] [CrossRef]
  56. Zhang, T.; Zhou, J.; Liu, W.; Yue, R.; Yao, M.; Shi, J.; Hu, J. Seedling-YOLO: High-Efficiency Target Detection Algorithm for Field Broccoli Seedling Transplanting Quality Based on YOLOv7-Tiny. Agronomy 2024, 14, 931. [Google Scholar] [CrossRef]
  57. Li, Y.; Wei, H.; Tong, J.; Qiu, Z.; Wu, C. Evaluation of Health Identification Method for Plug Seedling Transplantation Robots in Greenhouse Environment. Biosyst. Eng. 2024, 240, 33–45. [Google Scholar] [CrossRef]
  58. Li, H.; Liu, X.; Zhang, H.; Li, H.; Jia, S.; Sun, W.; Wang, G.; Feng, Q.; Yang, S.; Xing, W. Research and Experiment on Miss-Seeding Detection of Potato Planter Based on Improved YOLOv5s. Agriculture 2024, 14, 1905. [Google Scholar] [CrossRef]
  59. Li, M.; Zhu, X.; Ji, J.; Jin, X.; Li, B.; Chen, K.; Zhang, W. Visual Perception Enabled Agriculture Intelligence: A Selective Seedling Picking Transplanting Robot. Comput. Electron. Agric. 2025, 229, 109821. [Google Scholar] [CrossRef]
  60. You, J.; Li, D.; Wang, Z.; Chen, Q.; Ouyang, Q. Prediction and Visualization of Moisture Content in Tencha Drying Processes by Computer Vision and Deep Learning. J. Sci. Food Agric. 2024, 104, 5486–5494. [Google Scholar] [CrossRef]
  61. Sun, J.; Nirere, A.; Dusabe, K.D.; Yuhao, Z.; Adrien, G. Rapid and Nondestructive Watermelon (Citrullus lanatus) Seed Viability Detection Based on Visible Near-Infrared Hyperspectral Imaging Technology and Machine Learning Algorithms. J. Food Sci. 2024, 89, 4403–4418. [Google Scholar] [CrossRef]
  62. Zhao, Z.; Jin, M.; Tian, C.; Yang, S.X. Prediction of Seed Distribution in Rectangular Vibrating Tray Using Grey Model and Artificial Neural Network. Biosyst. Eng. 2018, 175, 194–205. [Google Scholar] [CrossRef]
  63. Chen, J.; Yu, C.; Xia, X.; Zhao, X.; Cai, S.; Wang, J. Design and Experimental Study of a Power Matching Control System for a Working Device of a Tree Transplanting Machine. Proc. Inst. Mech. Eng. Part I J. Syst. Control Eng. 2019, 233, 689–701. [Google Scholar] [CrossRef]
  64. Zhao, S.; Liu, J.; Jin, Y.; Bai, Z.; Liu, J.; Zhou, X. Design and Testing of an Intelligent Multi-Functional Seedling Transplanting System. Agronomy 2022, 12, 2683. [Google Scholar] [CrossRef]
  65. Yue, R.; Yao, M.; Zhang, T.; Shi, J.; Zhou, J.; Hu, J. Design and Experiment of Dual-Row Seedling Pick-Up Device for High-Speed Automatic Transplanting Machine. Agriculture 2024, 14, 942. [Google Scholar] [CrossRef]
  66. Yao, M.; Hu, J.; Liu, W.; Shi, J.; Jin, Y.; Lv, J.; Sun, Z.; Wang, C. Precise Servo-Control System of a Dual-Axis Positioning Tray Conveying Device for Automatic Transplanting Machine. Agriculture 2024, 14, 1431. [Google Scholar] [CrossRef]
  67. Xiao, X.; Wang, Y.; Jiang, Y. Review of Research Advances in Fruit and Vegetable Harvesting Robots. J. Electr. Eng. Technol. 2024, 19, 773–789. [Google Scholar] [CrossRef]
  68. Li, A.; Wang, C.; Ji, T.; Wang, Q.; Zhang, T. D3-YOLOv10: Improved YOLOv10-Based Lightweight Tomato Detection Algorithm Under Facility Scenario. Agriculture 2024, 14, 2268. [Google Scholar] [CrossRef]
  69. Chen, J.; Ma, W.; Liao, H.; Lu, J.; Yang, Y.; Qian, J.; Xu, L. Balancing Accuracy and Efficiency: The Status and Challenges of Agricultural Multi-Arm Harvesting Robot Research. Agronomy 2024, 14, 2209. [Google Scholar] [CrossRef]
  70. Huang, H.; Wang, R.; Huang, F.; Chen, J. Analysis and Realization of a Self-Adaptive Grasper Grasping for Non-Destructive Picking of Fruits and Vegetables. Comput. Electron. Agric. 2025, 232, 110119. [Google Scholar] [CrossRef]
  71. Jia, W.; Zheng, Y.; Zhao, D.A.; Yin, X.; Liu, X.; Du, R. Preprocessing Method of Night Vision Image Application in Apple Harvesting Robot. Int. J. Agric. Biol. Eng. 2018, 11, 158–163. [Google Scholar] [CrossRef]
  72. Sun, Y.; Luo, Y.; Chai, X.; Zhang, P.; Zhang, Q.; Xu, L.; Wei, L. Double-Threshold Segmentation of Panicle and Clustering Adaptive Density Estimation for Mature Rice Plants Based on 3D Point Cloud. Electronics 2021, 10, 872. [Google Scholar] [CrossRef]
  73. Xue, Z.; Fu, J.; Fu, Q.; Li, X.; Chen, Z. Modeling and Optimizing the Performance of Green Forage Maize Harvester Header Using a Combined Response Surface Methodology-Artificial Neural Network Approach. Agriculture 2023, 13, 1890. [Google Scholar] [CrossRef]
  74. Zhang, Z.; Lu, Y.; Zhao, Y.; Pan, Q.; Jin, K.; Xu, G.; Hu, Y. Ts-yolo: An All-Day and Lightweight Tea Canopy Shoots Detection Model. Agronomy 2023, 13, 1411. [Google Scholar] [CrossRef]
  75. Ji, W.; Pan, Y.; Xu, B.; Wang, J. A Real-Time Apple Targets Detection Method for Picking Robot Based on ShufflenetV2-YOLOX. Agriculture 2022, 12, 856. [Google Scholar] [CrossRef]
  76. Cai, Y.; Cui, B.; Deng, H.; Zeng, Z.; Wang, Q.; Lu, D.; Cui, Y.; Tian, Y. Cherry Tomato Detection for Harvesting Using Multimodal Perception and an Improved YOLOv7-Tiny Neural Network. Agronomy 2024, 14, 2320. [Google Scholar] [CrossRef]
  77. Hu, T.; Wang, W.; Gu, J.; Xia, Z.; Zhang, J.; Wang, B. Research on Apple Object Detection and Localization Method Based on Improved YOLOX and RGB-D Images. Agronomy 2023, 13, 1816. [Google Scholar] [CrossRef]
  78. Guan, X.; Shi, L.; Yang, W.; Ge, H.; Wei, X.; Ding, Y. Multi-Feature Fusion Recognition and Localization Method for Unmanned Harvesting of Aquatic Vegetables. Agriculture 2024, 14, 971. [Google Scholar] [CrossRef]
  79. Zuo, Z.; Gao, S.; Peng, H.; Xue, Y.; Han, L.; Ma, G.; Mao, H. Lightweight Detection of Broccoli Heads in Complex Field Environments Based on LBDC-YOLO. Agronomy 2024, 14, 2359. [Google Scholar] [CrossRef]
  80. Ntakolia, C.; Moustakidis, S.; Siouras, A. Autonomous Path Planning with Obstacle Avoidance for Smart Assistive Systems. Expert. Syst. Appl. 2023, 213, 119049. [Google Scholar] [CrossRef]
  81. Zhang, F.; Chen, Z.; Wang, Y.; Bao, R.; Chen, X.; Fu, S.; Tian, M.; Zhang, Y. Research on Flexible End-Effectors with Humanoid Grasp Function for Small Spherical Fruit Picking. Agriculture 2023, 13, 123. [Google Scholar] [CrossRef]
  82. Liu, H.; Yan, S.; Shen, Y.; Li, C.; Zhang, Y.; Hussain, F. Model Predictive Control System Based on Direct Yaw Moment Control for 4WID Self-Steering Agriculture Vehicle. Int. J. Agric. Biol. Eng. 2021, 14, 175–181. [Google Scholar] [CrossRef]
  83. Kumar, S.; Kumari, S.; Rana, S.S.; Rana, R.S.; Anwar, T.; Qureshi, H.; Saleh, M.; Alamer, K.; Atta, H.; Ercisli, S.; et al. Weed Management Challenges in Modern Agriculture: The Role of Environmental Factors and Fertilization Strategies. Crop Prot. 2024, 185, 106903. [Google Scholar] [CrossRef]
  84. Chen, J.; Qiang, H.; Wu, J.; Xu, G.; Wang, Z. Navigation Path Extraction for Greenhouse Cucumber-Picking Robots Using the Prediction-Point Hough Transform. Comput. Electron. Agric. 2021, 180, 105911. [Google Scholar] [CrossRef]
  85. Wang, Q.; Qin, W.; Liu, M.; Zhao, J.; Zhu, Q.; Yin, Y. Semantic Segmentation Model-Based Boundary Line Recognition Method for Wheat Harvesting. Agriculture 2024, 14, 1846. [Google Scholar] [CrossRef]
  86. Yang, Y.; Xie, H.; Zhang, K.; Wang, Y.; Li, Y.; Zhou, J.; Xu, L. Design, Development, Integration, and Field Evaluation of a Ridge-Planting Strawberry Harvesting Robot. Agriculture 2024, 14, 2126. [Google Scholar] [CrossRef]
  87. Xie, F.; Guo, Z.; Li, T.; Feng, Q.; Zhao, C. Dynamic Task Planning for Multi-Arm Harvesting Robots Under Multiple Constraints Using Deep Reinforcement Learning. Horticulturae 2025, 11, 88. [Google Scholar] [CrossRef]
  88. Zhang, X.; Wang, P.; Mao, H.; Gao, H.; Li, Q. Detection of the Nutritional Status of Phosphorus in Lettuce Using the Time-Domain Spectroscopy. Eng. Agríc. 2021, 41, 599–608. [Google Scholar] [CrossRef]
  89. Gu, Q.; Li, T.; Hu, Z.; Zhu, Y.; Shi, J.; Zhang, L.; Zhang, X. Quantitative Analysis of Watermelon Fruit Skin Phenotypic Traits Via Image Processing and Their Potential in Maturity and Quality Detection. Comput. Electron. Agric. 2025, 230, 109960. [Google Scholar] [CrossRef]
  90. Qin, O.; Wang, L.; Park, B.; Kang, R.; Wang, Z.; Chen, Q.; Guo, Z. Assessment of Matcha Sensory Quality Using Hyperspectral Microscope Imaging Technology. LWT 2020, 125, 109254. [Google Scholar] [CrossRef]
  91. Guo, Z.; Wang, M.; Barimah, A.O.; Chen, Q.; Li, H.; Shi, J.; El-Seedi, H.R.; Zou, X. Label-Free Surface Enhanced Raman Scattering Spectroscopy for Discrimination and Detection of Dominant Apple Spoilage Fungus. Int. J. Food Microbiol. 2021, 338, 108990. [Google Scholar] [CrossRef]
  92. Qiu, G.; Lu, H.; Wang, X.; Wang, C.; Xu, S.; Liang, X.; Fan, C. Nondestructive Detecting Maturity of Pineapples Based on Visible and Near-Infrared Transmittance Spectroscopy Coupled with Machine Learning Methodologies. Horticulturae 2023, 9, 889. [Google Scholar] [CrossRef]
  93. Xu, M.; Sun, J.; Cheng, J.; Yao, K.; Wu, X.; Zhou, X. Non-Destructive Prediction of Total Soluble Solids and Titratable Acidity in Kyoho Grape Using Hyperspectral Imaging and Deep Learning Algorithm. Int. J. Food Sci. Tech. 2023, 58, 9–21. [Google Scholar] [CrossRef]
  94. Chen, J.; Lian, Y.; Zou, R.; Zhang, S.; Ning, X.; Han, M. Real-Time Grain Breakage Sensing for Rice Combine Harvesters Using Machine Vision Technology. Int. J. Agric. Biol. Eng. 2020, 13, 194–199. [Google Scholar] [CrossRef]
  95. Wang, Y.; Shoaib, M.; Wang, J.; Lin, H.; Chen, Q.; Ouyang, Q. A novel ZIF-8 Mediated Nanocomposite Colorimetric Sensor Array for Rapid Identification of Matcha Grades, Validated by Density Functional Theory. J. Food Comp. Anal. 2025, 137, 106864. [Google Scholar] [CrossRef]
  96. Pang, Y.; Tang, P.; Li, H.; Marinello, F.; Chen, C. Optimization of Sprinkler Irrigation Scheduling Scenarios for Reducing Irrigation Energy Consumption. Irrig. Drain. 2024, 73, 1329–1343. [Google Scholar] [CrossRef]
  97. Tunio, M.H.; Gao, J.; Qureshi, W.A.; Sheikh, S.A.; Chen, J.; Chandio, F.A.; Lakhiar, I.A.; Solangi, K.A. Effects of Droplet Size and Spray Interval on Root-to-Shoot Ratio, Photosynthesis Efficiency, and Nutritional Quality of Aeroponically Grown Butterhead Lettuce. Int. J. Agric. Biol. Eng. 2022, 15, 79–88. [Google Scholar]
  98. Raza, A.; Saber, K.; Hu, Y.L.; Ray, R.; Ziya Kaya, Y.; Dehghanisanij, H.; Elbeltagi, A. Modelling Reference Evapotranspiration Using Principal Component Analysis and Machine Learning Methods Under Different Climatic Environments. Irrig. Drain. 2023, 72, 945–970. [Google Scholar] [CrossRef]
  99. Boufekane, A.; Meddi, M.; Maizi, D.; Busico, G. Performance of Artificial Intelligence Model (LSTM Model) for Estimating and Predicting Water Quality Index for Irrigation Purposes in Order to Improve Agricultural Production. Environ. Monit. Assess. 2024, 196, 1049. [Google Scholar] [CrossRef]
  100. Preite, L.; Vignali, G. Artificial Intelligence to Optimize Water Consumption in Agriculture: A Predictive Algorithm-Based Irrigation Management System. Comput. Electron. Agric. 2024, 223, 109126. [Google Scholar] [CrossRef]
  101. Tang, L.; Wang, W.; Zhang, C.; Wang, Z.; Ge, Z.; Yuan, S. Linear Active Disturbance Rejection Control System for the Travel Speed of an Electric Reel Sprinkling Irrigation Machine. Agriculture 2024, 14, 1544. [Google Scholar] [CrossRef]
  102. Wang, X. The Artificial Intelligence-Based Agricultural Field Irrigation Warning System Using GA-BP Neural Network Under Smart Agriculture. PLoS ONE 2025, 20, e0317277. [Google Scholar] [CrossRef] [PubMed]
  103. Chen, Y.; Lin, M.; Yu, Z.; Sun, W.; Fu, W.; He, L. Enhancing Cotton Irrigation with Distributional Actor-Critic Reinforcement Learning. Agric. Water Manag. 2025, 307, 109194. [Google Scholar] [CrossRef]
  104. Ju, J.; Chen, G.; Lv, Z.; Zhao, M.; Sun, L.; Wang, Z.; Wang, J. Design and Experiment of an Adaptive Cruise Weeding Robot for Paddy Fields Based on Improved YOLOv5. Comput. Electron. Agric. 2024, 219, 108824. [Google Scholar] [CrossRef]
  105. Liu, J.; Abbas, I.; Noor, R.S. Development of Deep Learning-Based Variable Rate Agrochemical Spraying System for Targeted Weeds Control in Strawberry Crop. Agronomy 2021, 11, 1480. [Google Scholar] [CrossRef]
  106. Jia, W.; Tai, K.; Wang, X.; Dong, X.; Ou, M. Design and Simulation of Intra-Row Obstacle Avoidance Shovel-Type Weeding Machine in Orchard. Agriculture 2024, 14, 1124. [Google Scholar] [CrossRef]
  107. Chen, S.; Memon, M.S.; Shen, B.; Guo, J.; Du, Z.; Tang, Z.; Guo, X.; Memon, H. Identification of Weeds in Cotton Fields at Various Growth Stages Using Color Feature Techniques. Ital. J. Agron. 2024, 19, 100021. [Google Scholar] [CrossRef]
  108. Zheng, S.; Zhao, X.; Fu, H.; Tan, H.; Zhai, C.; Chen, L. Design and Experimental Evaluation of a Smart Intra-Row Weed Control System for Open-Field Cabbage. Agronomy 2025, 15, 112. [Google Scholar] [CrossRef]
  109. Memon, M.S.; Chen, S.; Shen, B.; Liang, R.; Tang, Z.; Wang, S.; Zhuo, W.; Memon, N. Automatic Visual Recognition, Detection and Classification of Weeds in Cotton Fields Based on Machine Vision. Crop Prot. 2025, 187, 106966. [Google Scholar] [CrossRef]
  110. Raheem, A.; Bankole, O.O.; Danso, F.; Musa, M.O.; Adegbite, T.A.; Simpson, V.B. Physical Management Strategies for Enhancing Soil Resilience to Climate Change: Insights From Africa. Eur. J. Soil Sci. 2025, 76, e70030. [Google Scholar] [CrossRef]
  111. Lakhiar, I.A.; Yan, H.; Zhang, J.; Wang, G.; Deng, S.; Bao, R.; Zhang, C.; Syed, T.N.; Wang, B.; Zhou, R.; et al. Plastic Pollution in Agriculture as a Threat to Food Security, the Ecosystem, and the Environment: An Overview. Agronomy 2024, 14, 548. [Google Scholar] [CrossRef]
  112. Xu, S.; Xu, X.; Zhu, Q.; Meng, Y.; Yang, G.; Feng, H.; Yang, M.; Zhu, Q.; Xue, H.; Wang, B. Monitoring Leaf Nitrogen Content in Rice Based on Information Fusion of Multi-Sensor Imagery From UAV. Precis. Agric. 2023, 24, 2327–2349. [Google Scholar] [CrossRef]
  113. Elsayed, S.; El-Hendawy, S.; Elsherbiny, O.; Okasha, A.M.; Elmetwalli, A.H.; Elwakeel, A.E.; Memon, M.S.; Ibrahim, M.E.M.; Ibrahim, H.H. Estimating Chlorophyll Content, Production, and Quality of Sugar Beet Under Various Nitrogen Levels Using Machine Learning Models and Novel Spectral Indices. Agronomy 2023, 13, 2743. [Google Scholar] [CrossRef]
  114. Zhang, L.; Wang, A.; Zhang, H.; Zhu, Q.; Zhang, H.; Sun, W.; Niu, Y. Estimating Leaf Chlorophyll Content of Winter Wheat from UAV Multispectral Images Using Machine Learning Algorithms under Different Species, Growth Stages, and Nitrogen Stress Conditions. Agriculture 2024, 14, 1064. [Google Scholar] [CrossRef]
  115. Zhu, Q.; Zhu, Z.; Zhang, H.; Gao, Y.; Chen, L. Design of an Electronically Controlled Fertilization System for an Air-Assisted Side-Deep Fertilization Machine. Agriculture 2023, 13, 2210. [Google Scholar] [CrossRef]
  116. Jiang, Y.; Zhang, Y.; Li, H.; Li, H.; Yan, H.; Xing, S. Research on the Control System for the Use of Biogas Slurry as Fertilizer. Agronomy 2024, 14, 1439. [Google Scholar] [CrossRef]
  117. Shi, W.; Xue, X.; Feng, F.; Zheng, W.; Chen, L. Fertigation Control System Based on the Mariotte Siphon. Sci. Rep. 2024, 14, 23573. [Google Scholar] [CrossRef]
  118. Zhu, S.; Wang, B.; Pan, S.; Ye, Y.; Wang, E.; Mao, H. Task Allocation of Multi-Machine Collaborative Operation for Agricultural Machinery Based on the Improved Fireworks Algorithm. Agronomy 2024, 14, 710. [Google Scholar] [CrossRef]
Figure 1. Process flow for data transmission and alert generation [33].
Figure 2. Online intelligent diagnostic analysis framework [38].
Figure 3. The overall architecture of the subsoiling monitoring system (SMA stands for system management agent; UART stands for universal asynchronous receiver/transmitter; RPC stands for remote procedure call; REST stands for representational state transfer) [44].
Figure 4. The development sequence of the ultrasonic ridge tracking system [49].
Figure 5. Transfer learning methods for crop location [54].
Figure 6. The overall framework of apple object detection and localization [77].
Figure 7. Identification of different grades of matcha based on colorimetric sensor array [95].
Figure 8. Design concept of the IPSO-LADRC controller [101].
Figure 9. Flow diagram of the weed identification algorithm [109].
Figure 10. Flowsheet for mapping LCC of winter wheat based on UAV-derived vegetation indices [114].
Table 1. Comparison of sensing technologies [20,21,22].

| Technology | Theory | Advantage | Disadvantage |
| --- | --- | --- | --- |
| RGB-D camera | Combines red, green, and blue images with depth information to generate 3D images of crops using computer vision technology | Provides high-precision visual perception, capable of 3D modeling and object recognition | Sensitive to lighting changes and relatively high in cost; recognition performance may degrade in complex environments |
| LiDAR | Measures distances by emitting laser pulses and calculating the reflection time, generating 3D point cloud data of the surrounding environment | Provides high-precision 3D environmental modeling, especially useful for operations in complex terrains | Expensive, sensitive to environmental conditions, and requires extensive data processing |
| Multispectral/hyperspectral sensors | Analyzes plant and soil health by capturing spectral information at different wavelengths | Non-contact, non-destructive, suitable for large-scale monitoring; provides information on crop growth, soil moisture, and pest control | Data processing and analysis are complex and require high-precision calibration; hyperspectral sensors are costly |
| Ultrasonic sensors | Measures the distance between objects and the sensor by emitting and receiving ultrasonic waves, widely used for obstacle detection and distance measurement | Low-cost, simple and efficient, suitable for precise distance and depth detection | Highly susceptible to environmental noise, sensitive to the material’s density and surface properties; limited detection range |
Table 2. Comparison of detection and recognition technologies [23,24].

| Technology | Theory | Advantage | Disadvantage |
| --- | --- | --- | --- |
| Spectral analysis | Exploits the reflection, absorption, and transmission characteristics of materials under electromagnetic waves at different wavelengths; multispectral information, including visible and near-infrared bands, is collected for quantitative analysis | Lossless, fast, suitable for batch inspection; integrates easily with ML to improve prediction accuracy | Large data volume, complex preprocessing, high hardware cost |
| Traditional image processing | Visual methods such as segmentation, feature extraction, and thresholding realize target identification and quantitative analysis | Intuitive, highly interpretable, demands few hardware resources | Poor adaptability and generalization in complex scenes |
| DL | Automatic feature learning and classification/regression using data-driven neural network models | Strong robustness, good generalization ability, high prediction accuracy, suitable for large-scale application | Requires large amounts of labeled data; weak interpretability |
Table 3. Comparison of vibration signal processing and optimization models for tillage equipment.

| Methods | Key Tasks | Pre-Improvement Metrics | Post-Improvement Metrics |
| --- | --- | --- | --- |
| Complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN)-wavelet threshold, peak over threshold (POT) [27] | Denoising of vibration signals | Signal–noise ratio: 15.97; root mean square error (RMSE): 0.2372 | Signal–noise ratio: 20.00; RMSE: 0.1554 |
| Variational mode decomposition (VMD), quantum-behaved particle swarm optimization (QPSO) [28] | Extraction of weak feature signals | Impulse signal not visible | Recognition of 20 instances of grain impact |
| Response surface optimization method [29] | Optimization of vibration parameters | Vibration: 82.6 m/s² | Vibration: 27.4 m/s² |
| K-nearest neighbor (KNN), fast Fourier transform (FFT) [30] | Differentiation of vibration signals in different operation stages | — | Accuracy: 98%; sensitivity: 100% |
| Power spectral density (PSD), FFT, wavelet transform [31] | Quantitative analysis of vibration characteristics of components | — | Average root mean square (ARMS) of the tiller: 24.294 m/s²; ARMS of the three-point hitch: 19.042 m/s²; ARMS of the cabin: 1.299 m/s² |
| Decision tree regression (DTR) [32] | Prediction of tractor ride comfort | RMSE: 0.03142; R²: 0.83 | RMSE: 0.03142; R²: 0.83 |
| Support vector regression (SVR) [32] | Prediction of tractor ride comfort | RMSE: 0.022895; R²: 0.87 | RMSE: 0.019883; R²: 0.89 |
| Artificial neural network (ANN) [32] | Prediction of tractor ride comfort | RMSE: 0.019636; R²: 0.89 | RMSE: 0.019636; R²: 0.90 |
| IoT cloud monitoring platform [33] | Real-time monitoring of seat amplitude during tractor operation | — | Real-time alarm and intervention; alarm triggered when seat effective amplitude transmissibility ≥ 100 |
| Distributed intelligent node monitoring, IoT, PSD [34] | Analysis of vibration characteristics and verification of vibration isolation effect | High peak power spectral density | Peak power spectral density shifted forward and reduced |
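As an illustration of the KNN-plus-FFT pattern cited in Table 3 [30], the sketch below classifies synthetic vibration records by operation stage using coarse FFT magnitude features and a hand-rolled nearest-neighbor vote. All signals, frequencies, and parameters here are invented for illustration and do not reproduce the cited study.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_signal(freq_hz, n=256, fs=1000.0):
    """Synthetic vibration record: one dominant tone plus noise (illustrative only)."""
    t = np.arange(n) / fs
    return np.sin(2 * np.pi * freq_hz * t) + 0.3 * rng.standard_normal(n)

def fft_features(signal, n_bands=8):
    """Average FFT magnitude over a few frequency bands as a compact feature vector."""
    mag = np.abs(np.fft.rfft(signal))
    return np.array([band.mean() for band in np.array_split(mag, n_bands)])

# Two hypothetical operation stages, distinguished by their dominant frequency
stage_freq = {0: 50.0, 1: 200.0}
X_train, y_train = [], []
for label, freq in stage_freq.items():
    for _ in range(20):
        X_train.append(fft_features(make_signal(freq)))
        y_train.append(label)
X_train, y_train = np.array(X_train), np.array(y_train)

def knn_predict(x, X, y, k=5):
    """Plain KNN: majority vote among the k nearest training feature vectors."""
    nearest = np.argsort(np.linalg.norm(X - x, axis=1))[:k]
    return int(np.bincount(y[nearest]).argmax())

preds = [knn_predict(fft_features(make_signal(stage_freq[s])), X_train, y_train)
         for s in (0, 1)]
print(preds)  # → [0, 1] on this cleanly separable synthetic data
```

In practice the feature vector would come from measured accelerometer records, and k and the band count would be tuned on held-out data.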
Table 4. Comparison of soil compaction analysis and optimization models for compaction equipment.

| Methods | Key Tasks | Pre-Improvement Metrics | Post-Improvement Metrics |
| --- | --- | --- | --- |
| Discrete element method, elastic ellipsoid contact model, fuzzy control [36] | Analyze the force–density relationship under different operation parameters and adjust compaction depth in real time | Traditional depth-controlled operation showed large fluctuation, with an average compaction of 485 kPa and a deviation of 86.9 kPa at 0.75 m/s | Mean compaction reduced to 435 kPa, with a deviation of 28.5 kPa |
| Discrete Bayesian network, sensitivity analysis tool [37] | Simulate operation scenarios and output the compaction probability distribution | Compaction probability under traditional depth-controlled operation: 26.7% | By controlling humidity, wheel pressure, and pass frequency, compaction probability dropped to 0.3% |
| DeepLabv3+, XGBoost algorithm, MobileNetV2 [38] | Real-time identification of coarse-particle soil and evaluation of compaction quality | Significant deviation with large particles | Relative error between predicted and actual values: 4.93% |
| Seismic refraction tomography, multi-channel surface wave model [39] | Evaluate compaction quality using P-wave and S-wave velocities | Traditional measurements had insufficient spatial coverage and could not distinguish soil moisture distribution | Monitored soil mechanical and hydraulic properties |
| Magnetic pulse induction scanning, real-time kinematic global positioning system (RTK-GPS), UAV [40] | Non-destructive detection of compacted soil layer thickness and large-scale soil layer mapping | Traditional measurements had insufficient spatial coverage | Average absolute deviation between predicted and actual values: 0.37 cm, with a detection depth of 45 cm |
Table 5. Comparison of process optimization and monitoring models for subsoiling equipment.

| Methods | Key Tasks | Pre-Improvement Metrics | Post-Improvement Metrics |
| --- | --- | --- | --- |
| High-speed imaging, data regression method, soil throwing model [42] | Optimize the parameters of the reverse rotary tiller to improve soil throw ratio | Rotor shaft depth 0 cm: soil throw ratio 11.68% (upper layer), 7.81% (middle layer) | Rotor shaft depth 10 cm: soil throw ratio 72.11% (upper layer), 63.01% (middle layer) |
| Sensor fusion type A, subsoiling depth–draft force correlation modeling [43] | Develop a real-time subsoiling depth measurement system | Traditional methods used discrete measurement points, with deviations of 0.55–1.9 cm from the actual subsoiling depth | Measurement deviation from actual subsoiling depth: 0.0011 m |
| Edge computing, IoT [44] | Implement real-time monitoring and management of subsoiling depth and working area | Traditional manual measurements resulted in unstable tillage depth control and significant area calculation errors | Subsoiling depth detection error < 1.2 cm; working area detection error < 1% |
| DT [45] | Predict the draft resistance of the subsoiling tool at different depths | Draft force accuracy: 67.4–70.6% | Draft force accuracy: 86.4–99.3% |
| DT [46] | Quantify structural fatigue damage of the tractor subsoiler shovel across working conditions | — | Severity of operational conditions quantified |
Table 6. Comparison of path detection and navigation models for transplanting equipment.

| Methods | Key Tasks | Pre-Improvement Metrics | Post-Improvement Metrics |
| --- | --- | --- | --- |
| Machine vision, Gabor texture features, PCA, K-means clustering algorithm, fuzzy logic controller [48] | Path detection and navigation | Maximum lateral deviation of navigation error: 14.6 mm; standard deviation: 6.8 mm | Reduced trajectory tracking error; RMSE: 45.3 mm |
| Ultrasonic ridge-tracking method, fuzzy look-ahead distance decision [49] | Track ridges accurately for ridge transplanters | Mean absolute lateral deviation: 11.67 mm | Mean absolute lateral deviation: 7.39 mm |
| Edge computing, IoT [50] | Standard path tracking with robustness against disturbances | Requires 8–10 s to settle, with noticeable oscillations | Reduced settling time |
| Grayscale reconstruction, threshold segmentation, contour detection [51] | Recognize the navigation line for crop-free ridges in agriculture | — | Success rate ≥ 97% (up to 100% for certain ridge types); running time < 0.3 s |
Table 7. Comparison of seedling detection and classification models for seedling equipment.

| Methods | Key Tasks | Pre-Improvement Metrics | Post-Improvement Metrics |
| --- | --- | --- | --- |
| SE-YOLOv5x [54] | Weed–crop classification and localization | Precision: 96.7%; recall: 95.0%; F1-score: 95.8% | Precision: 97.6%; recall: 95.6%; F1-score: 97.3% |
| Paddy-YOLOv5s, ByteTrack [55] | Detect and count missing rice seedlings in paddy fields | Precision: 71.8%; frames per second (FPS): 56.1; model size: 14.4 MB | Precision: 77.0%; FPS: 60.4; model size: 5.0 MB |
| Seedling-YOLO [56] | Detect broccoli seedling quality | Precision: 71.8%; FPS: 56.1; model size: 14.4 MB | Precision: 94.3%; FPS: 29.7; model size: 4.98 MB |
| MLP [57] | Classify plug seedlings based on health | Varies with light intensity; no prior segmentation results | Health identification accuracy > 96.90%; transplant success rate: 95.86%; seedlings transplanted per hour: 2117.65 |
| Improved YOLOv5s [58] | Detect and reduce potato miss-seeding in planters | Precision: 96.02%; recall: 96.31% | Precision: 96.90%; recall: 96.50% |
| Selective intelligent seedling-picking framework [59] | Detect robust and inferior seedlings for selective picking | Precision: 85.1%; inference time: 36.6 ms; model size: 14.3 MB | Precision: 86.4%; inference time: 28.9 ms; model size: 12.4 MB |
Table 8. Comparison of methods for seed distribution and positioning of transplanting equipment.

| Methods | Key Tasks | Pre-Improvement Metrics | Post-Improvement Metrics |
| --- | --- | --- | --- |
| Gray model, BP-ANN [62] | Predict seed distribution in a vibrating tray | — | Prediction angle error: ap < 7.5°; gp < 0.25° |
| PSO-AFS optimization algorithm [63] | Optimal time-trajectory planning for transplanting | — | Single operation time reduced to < 1.36 s |
| RGB-D visual sensing, intelligent multi-module coordinated control [64] | Automated transplanting, sorting, and replanting of seedlings | — | Transplanting efficiency: 5000 plants/h; replanting success rate: 99.33% |
| Dual-row seedling pick-up system, PLC [65] | Coordination of servo motors and pneumatic actuators | Seedling picking efficiency: 90 plants/row/min | Efficiency improved to 180 plants/min, with positioning error < 1 mm |
| Dual-axis positioning tray conveying device [66] | Enhance precision in seedling tray positioning and conveyance | Initial X-axis deviation: up to 1.34 mm; Y-axis: up to 0.99 mm | X-axis deviation reduced to 0.85 mm; Y-axis deviation: 0.98 mm |
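Several rows above and in Table 3 rely on BP-ANN regression, i.e., a small feedforward network trained by backpropagation. The sketch below is a minimal one-hidden-layer implementation on invented toy data; the inputs, target function, layer width, and learning rate are all illustrative stand-ins, not the configuration of [62].

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy inputs standing in for vibration parameters; toy target standing in
# for the measured seed-distribution response (all values invented)
X = rng.uniform(-1, 1, size=(200, 2))
y = (0.5 * X[:, 0] + 0.3 * X[:, 1] ** 2).reshape(-1, 1)

# One hidden layer with tanh activation, trained by plain gradient descent
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)            # forward pass
    pred = h @ W2 + b2
    err = pred - y                      # gradient of 0.5 * squared error
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)    # backpropagate through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(round(mse, 4))  # small residual error on the toy data
```

Production BP-ANN work would add train/validation splits, input normalization, and early stopping; the point here is only the forward/backward structure.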
Table 9. Comparison of methods for crop and object detection for harvesting equipment.

| Methods | Key Tasks | Pre-Improvement Metrics | Post-Improvement Metrics |
| --- | --- | --- | --- |
| LiDAR [72] | Estimation of rice crop density using 3D point cloud data | — | RMSE: 9.968; mean absolute percent error: 5.67% |
| YOLOv4 [74] | Efficient detection of tea canopy shoots under varying light conditions | Recall: 78.08%; precision: 87.69%; FPS: 37.18; model size: 64.36 MB | Recall: 78.42%; precision: 85.35%; FPS: 48.86; model size: 11.78 MB |
| ShufflenetV2-YOLOX [75] | Improve apple detection speed and accuracy | Recall: 74.22%; precision: 94.06%; FPS: 55; model size: 5.03 MB | Recall: 93.75%; precision: 95.62%; FPS: 65; model size: 5.40 MB |
| YOLOv7-tiny-CTD [76] | Enhance detection robustness | Average precision: 92.8%; recall: 91.6%; accuracy: 91.8% | Average precision: 94.9%; recall: 96.1%; accuracy: 95.7% |
| Improved YOLOX [77] | Detect and localize apples | mAP: 92.91%; FPS: 118.38; model size: 8.97 M | mAP: 94.09%; FPS: 167.43; model size: 11.71 M |
| YOLO-GS [78] | Target recognition and localization | mAP: 92.9%; precision: 89.9%; recall: 87.5%; FPS: 24.9; model size: 7.01 M | mAP: 95.7%; precision: 89.1%; recall: 89.5%; FPS: 28.7; model size: 3.75 M |
| LBDC-YOLO [79] | Lightweight detection of broccoli heads in complex field environments | mAP: 93.97%; model size: 3.006 M | mAP: 97.65%; model size: 1.928 M |
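The YOLO-family detectors compared above all share a common post-processing step: non-maximum suppression (NMS), which merges overlapping candidate boxes before the metrics in the table are computed. A framework-free sketch of that step, with made-up boxes and scores:

```python
import numpy as np

def iou(a, b):
    """Intersection over union of two [x1, y1, x2, y2] boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter)

def nms(boxes, scores, thresh=0.5):
    """Greedy NMS: keep the highest-scoring box, drop boxes overlapping it."""
    order = np.argsort(scores)[::-1]
    keep = []
    while len(order):
        best = order[0]
        keep.append(int(best))
        order = np.array([i for i in order[1:] if iou(boxes[best], boxes[i]) < thresh])
    return keep

# Three hypothetical apple detections: two overlapping, one separate
boxes = np.array([[10, 10, 50, 50], [12, 12, 52, 52], [100, 100, 140, 140]], float)
scores = np.array([0.9, 0.8, 0.7])
print(nms(boxes, scores))  # → [0, 2]
```

Real detectors run this per class and often use vectorized or soft-NMS variants for speed and robustness.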
Table 10. Comparison of methods for agricultural task optimization in harvesting equipment.

| Methods | Key Tasks | Pre-Improvement Metrics | Post-Improvement Metrics |
| --- | --- | --- | --- |
| Prediction-point Hough transform, machine vision [84] | Navigation path fitting using machine vision and an improved Hough transform | Traditional Hough transform incurred high computation time and error | Navigation path fitting error < 0.5°; time consumption reduced by 35.20 ms |
| MV3_DeepLabV3+, LeakyReLU activation function [85] | Wheat harvesting boundary line recognition in complex environments | Recognition accuracy and real-time performance limited by traditional methods | Crop intersection over union: 95.20%; crop pixel accuracy: 98.04%; FPS: 7.5; pixel error: 7.3 pixels |
| Mask R-CNN, 6-DOF manipulator [86] | Strawberry fruit detection and precise control of the harvesting end-effector in ridge planting | Lower accuracy and speed with overlapping fruit | Accuracy: 95.78%; recall: 95.41%; FPS: 12 |
| DRL [87] | Optimize task planning for multi-arm harvesting robots | Longer harvesting time, higher computational cost in complex environments | Execution time reduced by 10.7% and 3.1% for 25 and 50 targets, respectively |
Table 11. Comparison of methods for quality prediction and detection for harvesting equipment.

| Methods | Key Tasks | Pre-Improvement Metrics | Post-Improvement Metrics |
| --- | --- | --- | --- |
| Competitive adaptive reweighted sampling, ANN [90] | Predict sensory quality of matcha powder using hyperspectral imaging | Appearance R²: 0.74; taste R²: 0.68; aroma R²: 0.57; overall quality R²: 0.73 | Appearance R²: 0.79; taste R²: 0.78; aroma R²: 0.67; overall quality R²: 0.84 |
| KNN [91] | Discriminate dominant apple spoilage fungi using SERS data | Calibration accuracy: 92.13%; prediction accuracy: 89.83% | Calibration accuracy: 92.26%; prediction accuracy: 98.30% |
| SVM [91] | Discriminate dominant apple spoilage fungi using SERS data | Calibration accuracy: 94.38%; prediction accuracy: 96.61% | Calibration accuracy: 94.92%; prediction accuracy: 94.38% |
| BP-ANN [91] | Discriminate dominant apple spoilage fungi using SERS data | Calibration accuracy: 97.49%; prediction accuracy: 94.91% | Calibration accuracy: 100%; prediction accuracy: 98.23% |
| PLS-DA, ANN [92] | Predict soluble solids content in pineapples | R²: 0.7455; root mean square error of prediction (RMSEP): 0.8120 | R²: 0.7596; RMSEP: 0.7879 |
| Stacked autoencoder-LSSVM [93] | Predict total soluble solids in grapes using hyperspectral imaging | R²: 0.9237; RMSEP: 0.5041; residual predictive deviation: 3.25 | R²: 0.9216; RMSEP: 0.1091; residual predictive deviation: 3.21 |
| Machine vision system [94] | Real-time detection and monitoring of grain breakage in a rice combine harvester | Recognition accuracy: 96%; breakage rate monitoring accuracy: N/A | Recognition accuracy: 97%; breakage rate monitoring accuracy: 96% |
| ZIF-8-mediated CSA [95] | Enhance colorimetric sensor array (CSA) performance for matcha grading with ZIF-8 integration | Training recognition rate: 91.7%; test recognition rate: 87.5% | Training recognition rate: 100%; test recognition rate: 95% |
Table 12. Comparison of methods for irrigation prediction and optimization.

| Methods | Key Tasks | Pre-Improvement Metrics | Post-Improvement Metrics |
| --- | --- | --- | --- |
| M5P tree [98] | Estimate ETo with minimal climatic inputs using machine learning | Correlation coefficient of prediction (CCP): 0.964; mean absolute error (MAE): 0.443; RMSE: 0.603 (training) | CCP: 0.982; MAE: 0.406; RMSE: 0.556 (testing) |
| SMO [98] | Estimate ETo with SVR | CCP: 0.964; MAE: 0.433; RMSE: 0.610 (training) | CCP: 0.986; MAE: 0.345; RMSE: 0.492 (testing) |
| RBFNreg [98] | Estimate ETo by capturing non-linear relationships with minimal data | CCP: 0.970; MAE: 0.391; RMSE: 0.553 (training) | CCP: 0.984; MAE: 0.390; RMSE: 0.549 (testing) |
| MLR [98] | Estimate ETo from limited data using multilinear regression | CCP: 0.964; MAE: 0.443; RMSE: 0.603 (training) | CCP: 0.982; MAE: 0.406; RMSE: 0.556 (testing) |
| LSTM [99] | Predict and estimate the modified water quality index for irrigation | R²: 0.992; RMSE: 0.061 (training) | R²: 0.987; RMSE: 0.084 (testing) |
| KNN [100] | Multi-class prediction of irrigation status | — | Accuracy: 99.46%; precision: 0.9946; recall: 0.9939 |
| SVM [100] | Multi-class prediction of irrigation status | — | Accuracy: 99.20%; precision: 0.9911; recall: 0.9644 |
| MLP [100] | Multi-class prediction of irrigation status | — | Accuracy: 99.61%; precision: 0.9967; recall: 0.9951 |
| IPSO, LADRC [101] | Enhance global search and convergence accuracy | Settling time: 0.161 s; overshoot: 7.6%; steady-state error: 0.0034% | Settling time: 0.061 s; overshoot: 0%; steady-state error: 0.0001% |
| EGA-BPNN [102] | Predict farm water level flow | Single water level: mean squared error (MSE): 6.64 × 10⁻⁴; average relative error (ARE): 3.42%. Dual water level: MSE: 4.43 × 10⁻⁴; ARE: 1.09% | Single water level: MSE: 4.53 × 10⁻⁴; ARE: 1.87%. Dual water level: MSE: 2.38 × 10⁻⁴; ARE: 0.41% |
| Distributional RL [103] | Optimize cotton irrigation decisions | Convergence speed: 3702 steps; cumulative reward: 63.03 | Convergence speed: 1256 steps; cumulative reward: 95.08 |
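The MLR baseline in Table 12 [98] is ordinary least squares on a handful of climate variables. The sketch below shows the technique on synthetic data; the input variables, the linear coefficients, and the noise level are hypothetical and chosen only so the recovered fit is easy to check.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic climate inputs: mean temperature (°C) and relative humidity (%)
n = 300
temp = rng.uniform(5, 35, n)
rh = rng.uniform(20, 90, n)
# Hypothetical linear ground truth for daily ETo (mm/day) plus noise
eto = 0.15 * temp - 0.02 * rh + 1.0 + rng.normal(0, 0.1, n)

# Design matrix with an intercept column; ordinary least squares fit
A = np.column_stack([np.ones(n), temp, rh])
coef, *_ = np.linalg.lstsq(A, eto, rcond=None)
rmse = float(np.sqrt(np.mean((A @ coef - eto) ** 2)))
print(np.round(coef, 3), round(rmse, 3))  # coefficients near (1.0, 0.15, -0.02)
```

With real weather-station data, the same design-matrix pattern extends to more predictors (radiation, wind speed) and to the train/test split reported in the table.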
Table 13. Comparison of methods for weed detection and classification.

| Methods | Key Tasks | Pre-Improvement Metrics | Post-Improvement Metrics |
| --- | --- | --- | --- |
| VGG-16 [105] | Classify weeds in strawberry fields | — | Precision: 0.98; recall: 0.97; F1-score: 0.97; accuracy: 0.97 |
| GoogleNet [105] | Classify weeds in strawberry fields | — | Precision: 0.96; recall: 0.95; F1-score: 0.96; accuracy: 0.96 |
| AlexNet [105] | Classify weeds in strawberry fields | — | Precision: 0.95; recall: 0.96; F1-score: 0.95; accuracy: 0.95 |
| Whole-plant method [107] | Weed and cotton plant differentiation using overall color characteristics | Recognition rate: 71.4% for cotton; 92.9% for weeds | Overall recognition rate: 82.1% |
| YOLOv5s [108] | Detect cabbage plants in intra-row weeding systems | — | Accuracy: 96.1%; processing time: 51 ms |
| Machine vision weed detection [109] | Detect and classify inter-row and intra-row weeds in cotton fields | — | Inter-row weed recognition rate: 89.4%; intra-row recognition rate: 84.6%; overall recognition rate: 85%; processing time: 437 ms |
Table 14. Comparison of methods for optimization of fertilization processes.

| Methods | Key Tasks | Pre-Improvement Metrics | Post-Improvement Metrics |
| --- | --- | --- | --- |
| GPR, mRMR [112] | Optimize feature selection and leaf nitrogen content estimation | R²: 0.59; RMSE: 12.93% | R²: 0.68; RMSE: 11.45% |
| Gradient boosting regression [113] | Estimate various sugar beet parameters based on spectral reflectance indices | R²: 0.65; RMSE: 0.354 (testing) | R²: 0.99; RMSE: 0.073 (training) |
| SVM [114] | Estimate LCC for winter wheat using UAV multispectral images | R²: 0.932; RMSE: 3.96 (training) | R²: 0.60; RMSE: 3.86 (validation) |
| RF [114] | Estimate LCC for winter wheat | R²: 0.932; RMSE: 4.37 (training) | R²: 0.49; RMSE: 3.65 (validation) |
| Feedback regulation mechanism [116] | Enhance fertilizer application accuracy | — | Fertilizer outlet flow: 3.2 m³/h; rated ratio quantity: 3.0 m³/h |
| Fuzzy PID algorithm [117] | Control the nutrient solution (EC and pH) | EC overshoot: 17.04%; pH overshoot: 8% | EC overshoot: 6.36%; pH overshoot: 6.67% |
| Fireworks algorithm [118] | Task allocation for multi-machine cooperative operation of fertilizer applicators | Convergence speed: medium; fitness value: 85.52; variance: 0.173 | Convergence speed improved; fitness value: 87.79; variance: 0.280 |
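The overshoot figures for nutrient-solution control in Table 14 [117] come from PID-type feedback loops. The sketch below implements a plain fixed-gain discrete PID on a toy first-order mixing-tank model; the fuzzy gain scheduling of the cited work is deliberately omitted, and the plant model, gains, and units are illustrative assumptions.

```python
class PID:
    """Discrete positional PID controller with fixed gains (the cited work
    additionally tunes Kp/Ki/Kd online with fuzzy rules)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Toy first-order mixing-tank model: EC rises with dosing input u
ec, setpoint, dt = 1.0, 2.0, 0.1           # mS/cm and seconds, hypothetical
ctrl = PID(kp=2.0, ki=0.5, kd=0.05, dt=dt)
for _ in range(300):
    u = ctrl.step(setpoint, ec)
    ec += dt * (-0.5 * ec + 0.5 * u)       # simple plant dynamics

print(round(ec, 3))  # settles near the 2.0 mS/cm setpoint
```

The integral term is what drives the steady-state error to zero here; fuzzy scheduling mainly reshapes the transient (overshoot and settling time), which is exactly what the table's before/after columns report.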
Table 15. AI methods and main tasks in different agricultural equipment operation scenarios.

| Operation Scenarios | AI Models | Key Goals |
| --- | --- | --- |
| Tillage | CEEMDAN-wavelet threshold, QPSO, SVM, DTR, XGBoost, ANN, and VMD | Vibration signal monitoring, mechanical vibration modeling, ride comfort prediction, intelligent soil compaction detection, parameter optimization, compaction/subsoiling simulation, and fatigue assessment |
| Transplanting/Sowing | K-means clustering, CNN, YOLO series, fuzzy logic, SMC, and BP-ANN | Seedling detection, seedling condition recognition, autonomous navigation, precise positioning, and efficient transplanting/sowing operations |
| Harvesting | YOLO series, Mask R-CNN, attention mechanism, PLS-DA, ANN, KNN, SVM, and DRL | Detection, segmentation, and localization of fruits and vegetables (e.g., apples, tomatoes), harvesting path planning, and ripeness identification |
| Irrigation | GA-BPNN, LSTM, PSO-LADRC, MLP, SVM, and RL | Intelligent irrigation scheduling, soil moisture status prediction, irrigation warning, and irrigation parameter optimization |
| Weeding | CNN, AlexNet, GoogleNet, and YOLO series | Crop and weed differentiation, intelligent machine obstacle avoidance, and precision identification and weeding |
| Fertilization | SVM, RF, GBR, PLS-DA, ANN, PID, and mRMR | Crop nutrient status analysis, variable rate fertilization decision-making, fertilizer distribution monitoring, and fertilizer control system optimization |

Zhu, Y.; Zhang, S.; Tang, S.; Gao, Q. Research Progress and Applications of Artificial Intelligence in Agricultural Equipment. Agriculture 2025, 15, 1703. https://doi.org/10.3390/agriculture15151703