Review

A Comprehensive Review of Sensing, Control, and Networking in Agricultural Robots: From Perception to Coordination

by Chijioke Leonard Nkwocha 1,2,*, Adeayo Adewumi 3, Samuel Oluwadare Folorunsho 4, Chrisantus Eze 5, Pius Jjagwe 2, James Kemeshi 6 and Ning Wang 1,*

1 Department of Biosystems and Agricultural Engineering, Oklahoma State University, Stillwater, OK 74078, USA
2 Department of Biological Systems Engineering, Virginia Tech, Blacksburg, VA 24061, USA
3 Department of Agricultural and Biological Engineering, Purdue University, West Lafayette, IN 47907, USA
4 Department of Industrial & Enterprise Systems Engineering, University of Illinois Urbana-Champaign, Champaign, IL 61801, USA
5 Department of Computer Science, Oklahoma State University, Stillwater, OK 74078, USA
6 Department of Agricultural and Biosystems Engineering, South Dakota State University, Brookings, SD 57007, USA
* Authors to whom correspondence should be addressed.
Robotics 2025, 14(11), 159; https://doi.org/10.3390/robotics14110159
Submission received: 22 September 2025 / Revised: 21 October 2025 / Accepted: 27 October 2025 / Published: 29 October 2025
(This article belongs to the Special Issue Smart Agriculture with AI and Robotics)

Abstract

This review critically examines advancements in sensing, control, and networking technologies for agricultural robots (AgRobots) and their impact on modern farming. AgRobots—including Unmanned Aerial Vehicles (UAVs), Unmanned Ground Vehicles (UGVs), Unmanned Surface Vehicles (USVs), and robotic arms—are increasingly adopted to address labour shortages, sustainability challenges, and rising food demand. This paper reviews sensing technologies such as cameras, LiDAR, and multispectral sensors for navigation, object detection, and environmental perception. Control approaches, from classical PID (Proportional-Integral-Derivative) to advanced nonlinear and learning-based methods, are analysed to ensure precision, adaptability, and stability in dynamic agricultural settings. Networking solutions, including ZigBee, LoRaWAN, 5G, and emerging 6G, are evaluated for enabling real-time communication, multi-robot coordination, and data management. Swarm robotics and hybrid decentralized architectures are highlighted for efficient collective operations. This review is based on the literature published between 2015 and 2025 to identify key trends, challenges, and future directions in AgRobots. While AgRobots promise enhanced productivity, reduced environmental impact, and sustainable practices, barriers such as high costs, complex field conditions, and regulatory limitations remain. This review is expected to provide a foundation for guiding research and development toward innovative, integrated solutions for global food security and sustainable agriculture.

Graphical Abstract

1. Introduction

The main objective of agriculture is to ensure the production of adequate, high-quality food to support and improve human life. A significant challenge currently confronting the agricultural sector in many countries is the growing shortage of labour for physically demanding fieldwork—a trend that is anticipated to persist and potentially worsen over time. One promising approach to mitigate this issue is the implementation of robotic technologies in field crop production. Robots have become integral to addressing challenges across various sectors, notably in agriculture. The adoption of automation and robotic technologies in farming has increased, helping to overcome complexities in tasks ranging from planting to harvesting, thereby enhancing operational efficiency and effectiveness. Agricultural robotics focuses on developing autonomous systems capable of operating alongside humans, with an emphasis on adaptability and intelligent behaviour in dynamic, unstructured environments. Research and development of robots tailored for agricultural purposes have been ongoing since the 1980s [1]. The overarching objective of incorporating robotics into agriculture is to boost the sustainability and reliability of farming practices, ultimately resulting in improved product quality, cost reduction, and a diminished environmental footprint.
Autonomous capabilities in agricultural robots have the potential to support continuous field operations while improving both productivity and efficiency. To achieve effective performance in agricultural environments, research must prioritize the integration of diverse and complementary sensors to ensure reliable localization and monitoring. Additionally, there is a need for the development of simple yet functional manipulators to carry out specific agricultural tasks, as well as the design of robust path planning, navigation, and guidance algorithms suitable for complex environments beyond open fields. Seamless interaction between robots and human workers is also crucial in these dynamic and multifaceted settings [2]. Overall, the core technologies essential for agricultural robots can be grouped into three main categories: (i) sensing technologies, (ii) control technologies, and (iii) networking (communication) technologies, as illustrated in Figure 1, alongside their deployment platforms seen in Figure 2. Advanced sensing technologies enable agricultural robots to carry out functions such as target detection, yield prediction, decision-making, and fault diagnosis. These intelligent sensing systems serve as the “eyes” of the robot, providing critical information about the surrounding environment, the objects it interacts with, and their status during operation. By capturing and integrating these multi-dimensional data, these technologies facilitate precise and efficient environmental perception, crop recognition, and information gathering [3].
To achieve optimal operational performance, agricultural robots require high-performance control technologies, including reliable actuators and end-effectors. The control process typically involves planning a movement path based on a defined trajectory, guiding the robot’s actuators to avoid obstacles and reach the target location, and then using the end-effectors—assisted by sensors—to perform the necessary tasks with flexibility. Given the complex and ever-changing conditions in agricultural environments, accurate navigation and motion control have become critically important [4]. Common control strategies employed in agricultural robotics include classical control methods, intelligent control approaches such as fuzzy logic, and nonlinear control techniques. Robotic modules typically communicate either through physical links or wireless signals, allowing for effective coordination and collaboration among them. Communication, both between individual modules and across multiple robots, is crucial for achieving coordinated actions and scalable solutions in agricultural operations. Various communication architectures are designed to enhance teamwork among robotic systems, supporting the efficient completion of field tasks [5]. These communication and control technologies play a vital role in advancing multi-robot systems, broadening their use in agricultural automation and enhancing crop management strategies.
Agricultural robots are utilized in diverse applications, including orchards, horticulture, and various field crop operations such as weeding, planting and transplanting, harvesting, cultivation, spraying, and pruning. With the integration of machine vision technologies, these robots can mitigate risks, recognize crops, and assess optimal harvest times. They are also employed for tasks such as soil analysis, irrigation management, fertilization, predictive planning using artificial intelligence, growth monitoring, as well as sorting, grading, and packaging. The combined use of robotics, automation, and big data analytics has significantly accelerated advancements in agriculture. Moreover, autonomous vehicles equipped with advanced sensors, communication systems, the Global Positioning System (GPS), and Geographic Information System (GIS) have been developed to optimize the management of high-value crops [1]. Several autonomous prototypes have been designed for orchards and horticultural crops, such as tomatoes [6]. For instance, Jiang et al. [7] introduced an autonomous navigation system for greenhouse mobile robots that integrates 3D Light Detection and Ranging (LiDAR) with 2D LiDAR-based Simultaneous Localisation and Mapping (SLAM). Similarly, to improve autonomous navigation for agricultural robots in orchard environments, Rovira-Más et al. [8] developed a system that combines 3D vision, LiDAR, and ultrasonic sensors to enhance perception and navigation capabilities. These innovations simplify agricultural operations, minimize labour requirements, optimize resource use, and promote sustainability, ultimately leading to greater productivity and improved crop quality.
The aim of this review is to provide a comprehensive synthesis of the latest advancements in sensing and control technologies that empower agricultural robots, encompassing unmanned ground vehicles (UGVs), unmanned aerial vehicles (UAVs), unmanned surface vehicles (USVs), and robotic arms with specialized end-effectors. Additionally, it explores networking strategies for multi-robot systems and examines how these technologies interact with complex agricultural environments to enable intelligent, adaptive, and coordinated operations. By analysing research published between 2015 and 2025, this review highlights key technological breakthroughs, emerging trends, and the trajectory of innovations that are shaping the future of agricultural automation and precision farming.

2. Review Methodology and Literature Selection

2.1. Literature Retrieval Criteria

To maintain a structured and high-quality review, specific inclusion and exclusion criteria were defined, focusing on factors such as relevance, recency, and methodological rigor. In examining sensing, control, and networking strategies of agricultural robots within precision agriculture, these criteria were applied to encompass a broad range of studies while giving preference to works that present robust experimental evidence or practical applications. The inclusion criteria specified which articles were selected and analysed, while the exclusion criteria simply removed studies that failed to meet them; to avoid repetition, the exclusion criteria are therefore not discussed separately.

Inclusion Criteria

The following outlines the criteria applied to determine which studies were included in this review.
  • Relevance to subject of review: An essential requirement for inclusion was the study’s relevance to the core focus on agricultural robot technologies and their application domains. Eligible works needed to address at least one relevant aspect of this review, including classifications of agricultural robots, sensing technologies, control approaches, or networking solutions. Relevance was determined by examining the title, abstract, objectives, and methodology of each study to ensure alignment with the review’s scope.
  • Publication timeframe: While the review primarily focused on recent advancements and emerging technologies in the field, earlier works were also included to capture the evolution of developments over time. Consequently, the literature surveyed spans the last decade, covering publications from 2015 to 2025.
  • Article type and subject areas: The literature search for this review mainly focused on review and research articles published within the domains of agricultural and biological sciences, computer science, robotics, and engineering. The selected sources included journal articles, conference proceedings, as well as theses and dissertations.
  • Language: To ensure consistency and wide accessibility, the review was limited to publications written in English. This restriction helped maintain clarity and uniformity in analysing the selected body of literature.

2.2. Literature Selection Process

2.2.1. Database Search

Comprehensive searches were carried out using the ScienceDirect and Google Scholar databases to ensure wide and thorough coverage of relevant literature.

2.2.2. Keywords Search

The set of keywords used in the literature search is a combination of the major keywords and their synonyms. Example keyword combinations used are: (i) Robots OR Robotics AND Applications AND Agriculture, (ii) Sensing technologies AND Agricultural robots, (iii) Control technologies AND Agricultural robots, (iv) Networking technologies AND Agricultural robots, (v) Swarm robotics AND Agriculture. The keywords used in this study were selected to retrieve research studies related to the advancement of sensing, control, and networking technologies of agricultural robots.

2.2.3. Initial Screening

The search across both databases yielded a total of 1056 articles. An initial screening based on the relevance of titles and abstracts reduced this number to 450 studies. At this stage, studies that did not satisfy the inclusion criteria were excluded. This preliminary screening allowed the literature to be narrowed down to a more manageable set for detailed analysis.

2.2.4. Full-Text Evaluation and Final Selection

The studies that passed the initial screening underwent full-text evaluation to determine their relevance and contribution to the field. During this stage, comprehensive inclusion and exclusion criteria were applied, taking into account each study’s objectives, methodology, findings, and overall quality [9,10]. Ultimately, based on their pertinence to agricultural robot technologies, 259 articles were selected for detailed analysis in this review.

2.3. Keyword Analysis

Bibliometric networks for the selected studies were created and visualized using the VOSviewer (version 1.6.20) software (https://www.vosviewer.com/, accessed on 15 January 2025), which provides network, overlay, and density visualizations [10,11,12]. In this analysis (Figure 3), twelve distinct clusters were identified, represented by colours including green, light green, purple, red, pink, soft pink, blue, light blue, orange, yellow, brown, and sky blue. The central node, “agriculture,” emerged as the most prominent and interconnected node in the network, closely linked to terms such as “agricultural robotics,” “machine learning,” “deep learning,” “field robot,” and “uavs.” “Precision agriculture” emerged as the next most prominent node, followed by “navigation” and “robot sensing systems,” and was closely linked to terms such as “ugv,” “uav,” “path tracking,” and “motion control.” This interconnectedness highlights the role of agricultural robotics and its associated technologies in agriculture, particularly in precision agriculture-related applications such as smart farming and crop monitoring. A total of 153 keywords were identified in the analysis, each appearing at least twice. Applying the association strength method, the analysis illustrates the interconnected structure of research themes and emphasizes the prominent role of sensing, control, and networking technologies in agricultural robotics research.
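As a brief illustration of the association strength normalization mentioned above, the sketch below computes a VOSviewer-style similarity between two keywords as their co-occurrence count divided by the product of their individual occurrence counts; all keyword names and counts are hypothetical, not values from this analysis.

```python
from itertools import combinations

# Hypothetical occurrence counts: number of papers containing each keyword
occurrences = {"agriculture": 120, "machine learning": 45, "uav": 60}
# Hypothetical co-occurrence counts: number of papers containing both keywords
co_occurrences = {
    ("agriculture", "machine learning"): 30,
    ("agriculture", "uav"): 40,
    ("machine learning", "uav"): 10,
}

def association_strength(a, b):
    """Co-occurrence normalized by how common each keyword is individually."""
    c_ab = co_occurrences.get((a, b)) or co_occurrences.get((b, a)) or 0
    return c_ab / (occurrences[a] * occurrences[b])

for a, b in combinations(occurrences, 2):
    print(f"{a} -- {b}: {association_strength(a, b):.5f}")
```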
The cluster of techniques used in the selected papers (Figure 4) illustrates the interconnections among the methodologies applied in agricultural robotics research. The VOS network reveals eight distinct clusters, highlighting the breadth of approaches and their interrelatedness within the reviewed literature. At the core of the network lies navigation, a central theme that connects strongly to other methodological domains such as path planning, obstacle avoidance, collision avoidance, and robot sensing systems. Closely associated with navigation, the path planning cluster emphasizes algorithms, heuristic methods, and planning strategies essential for enabling robots to operate effectively in dynamic agricultural environments.
Another major cluster is machine learning, which is tightly linked to deep learning, robot vision systems, and sensor technologies, showcasing its role in processing sensor data and supporting decision-making for tasks such as perception, localization, and control. Similarly, artificial intelligence connects to control systems, trajectory generation, and optimization algorithms, reflecting its importance in high-level decision-making and adaptive control strategies for autonomous robots.
The robot sensing systems and autonomous navigation clusters bring together enabling technologies such as laser radar, 3D vision, GPS, and sensor networks, which are vital for real-time perception and environment mapping. The wireless sensor networks cluster, though smaller, highlights the integration of distributed sensing technologies that support communication and data sharing in field robotics applications. Finally, the control systems cluster, incorporating PI and PD control methods, underscores the role of classical and modern control strategies in ensuring stability, precision, and efficiency of robotic operations. This network structure emphasizes how agricultural robotics relies on an interconnected landscape of methodologies, where navigation, AI, and sensing technologies converge to support autonomous decision-making, robust perception, and reliable field operations.

3. Classifications of Agricultural Robots

Agricultural robots are designed to perform a wide range of tasks across diverse farming environments, and as such, they can be categorized using several classification criteria. Among the most widely adopted approaches is classification based on the robot’s spatial domain or operational medium. This perspective groups robots according to where and how they interact with the agricultural environment. Under this framework, agricultural robots generally fall into four major categories:
  • Airborne systems, primarily represented by Unmanned Aerial Vehicles (UAVs), which are extensively used for crop monitoring, spraying, and mapping tasks due to their ability to cover large areas efficiently.
  • Earthbound systems, commonly referred to as Unmanned Ground Vehicles (UGVs), which operate at the field level to carry out activities such as planting, weeding, harvesting, and crop and soil analyses.
  • Aquatic or water-surface systems, known as Unmanned Surface Vehicles (USVs), which are particularly relevant in water-intensive farming systems, aquaculture, and irrigation management.
  • Robotic arms and end-effectors, which serve as precision tools for delicate and highly specific operations such as harvesting fruits, pruning, or performing tasks in controlled environments like greenhouses.
This classification not only reflects the physical domains in which these robotic systems operate—air, land, and water—but also highlights their role in supporting different stages of agricultural production. Each category has unique design requirements, sensing technologies, control systems, and networking capabilities tailored to its operational environment. The following subsections provide a detailed discussion of these categories, their applications, technological advancements, and contributions to modern agriculture.

3.1. Unmanned Aerial Vehicles (UAVs)

As the global population continues to rise, the demand for increased agricultural productivity and crop quality becomes more pressing. Unmanned Aerial Vehicles (UAVs), commonly referred to as drones, represent a promising technological advancement to address these challenges by enhancing efficiency in modern farming practices [13,14]. UAVs belong to the category of airborne robotics and are widely deployed in agriculture for tasks such as crop monitoring, spraying, and field mapping. For effective operation, UAV systems must be designed with adequate payload capacity to accommodate mission-specific components [14,15] while ensuring sufficient flight endurance to cover the target area [16]. Typically, the payload consists of various sensors and instruments essential for data acquisition and analysis, including (i) multispectral and hyperspectral cameras, (ii) infrared and RGB imaging systems, (iii) LiDAR sensors for detailed topographic mapping [17], and (iv) a Global Navigation Satellite System (GNSS) for precise georeferencing.
The integration of UAVs into farming practices has significantly improved efficiency, reduced costs, and enhanced profitability for farmers. UAVs, when combined with advanced computing systems and on-board sensors, play a critical role in crop management tasks such as field mapping, crop health monitoring, irrigation planning, plant diagnostics, disaster mitigation, early warning systems, and even wildlife and forestry conservation [18,19]. Furthermore, their cost-effectiveness, operational versatility, and ability to capture high-resolution data make UAVs ideal for applications such as crop growth assessment, water stress detection, precision irrigation, yield forecasting, and targeted management of weeds, pests, and diseases [20,21]. Spraying pesticides and fertilizers is one of the most straightforward applications for UAVs [22,23]. By utilizing UAV-based spraying systems, farmers can minimize direct human exposure to hazardous chemicals such as pesticides and fertilizers, thereby improving safety and reducing health risks [24]. Beyond spraying, UAVs enable more accurate, frequent, and cost-effective crop monitoring, delivering high-quality, up-to-date data that supports informed decision-making, identifies inefficiencies, and promotes improved crop management. Additionally, UAVs equipped with multispectral imaging sensors can capture detailed crop images, which are then analysed to monitor variations in plant structure, detect stress, and assess growth stages [25]. These capabilities make UAVs a valuable tool for both producers and agronomists, enhancing precision agriculture practices. Figure 5 illustrates the diverse applications of UAVs in modern agriculture and the required sensors. Several studies have explored the capabilities and functionalities of UAVs in carrying out agricultural tasks.
Guo et al. [26] studied how rotor downwash airflow and crosswinds affect droplet deposition during quad-rotor UAV spraying. Using a three-dimensional computational fluid dynamics (CFD) model validated by wind-tunnel tests, they found that higher flight speed and altitude increase droplet drift, while moderate crosswinds expand coverage but lower accuracy. The model showed about 30% agreement with experiments, confirming reliability. The study recommends optimal UAV spraying conditions of 2 m/s flight speed, 2.5–3 m altitude, and crosswinds below 1.5 m/s to enhance pesticide deposition efficiency and minimize drift.
Xiao et al. [27] developed a real-time matching and pose reconstruction method for low-overlap agricultural UAV imagery with repetitive textures. The study aimed to overcome challenges in UAV photogrammetry caused by weak textures and limited image overlap. Using an adaptive map initialization algorithm, global texture-aware feature extraction, and a multi-model adaptive tracking system, the method improved feature reliability and tracking robustness. A hybrid local-global optimization strategy enhanced both pose accuracy and global consistency. Tests on agricultural datasets showed a reconstruction completeness rate of 100% and a processing speed of ~3.4 frames per second, outperforming existing SLAM (Simultaneous Localization and Mapping) and Structure-from-Motion (SfM) systems. The approach enables accurate, real-time UAV photogrammetry critical for precision agriculture applications.
Using UAV-acquired multispectral images and machine learning (ML) algorithms, Demir et al. [28] developed yield prediction models for organic Isparta oil rose farming. In this study, thirteen parameters, including soil moisture, pH, electrical conductivity, soil temperature, and nine vegetation indices, were analyzed using ML methods. The developed models demonstrated high accuracy (R² = 0.931) for early yield prediction (Day 69). The findings highlight the potential of integrating UAV and machine learning technologies for precise yield prediction, supporting Agriculture 4.0 and sustainable farming practices.
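The sketch below illustrates, in a generic way, the kind of ML regression pipeline such studies describe: per-plot vegetation indices and soil variables derived from UAV imagery serve as features for predicting yield. It is not the authors' code; the file name, column names, and model choice are assumptions for the example.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Hypothetical per-plot table of UAV-derived indices and soil measurements
df = pd.read_csv("plot_level_features.csv")
features = ["ndvi", "gndvi", "savi", "soil_moisture", "soil_ph", "soil_temp"]
X, y = df[features], df["yield_kg_per_plot"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("R2 on held-out plots:", r2_score(y_te, model.predict(X_te)))
```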
Singh and Sharma [29] proposed an intelligent Wireless Sensor Networks (WSNs) and UAV-based Internet of Things (IoT) framework for precision agriculture to address challenges like climate change, limited land, and freshwater requirements. The methodology integrates WSNs and UAVs for real-time data collection, trajectory planning, and hierarchical data processing across ground, edge, and cloud intelligence levels. The system utilizes clustering, consensus algorithms, and optimized UAV trajectories for efficient data collection and analysis. Results show high precision (98.77%), sensitivity (99.51%), and accuracy (98.79%), with reduced error rates and communication costs. The findings highlight the framework’s potential to enhance crop monitoring, spraying, and decision-making, offering a cost-effective and scalable solution for sustainable agriculture.
UAVs have revolutionized precision agriculture by enabling rapid, high-resolution monitoring, efficient spraying, and data-driven decision-making, as demonstrated by advances in real-time photogrammetry, optimized spraying, and machine learning-based yield prediction, among numerous other applications. However, their effectiveness is limited in areas such as proximity and below-canopy sensing due to design constraints. To overcome these limitations, integrating UAVs with Unmanned Ground Vehicles (UGVs) offers a promising solution—UGVs can perform detailed below-canopy sensing and continuous ground-level monitoring, while UAVs provide large-scale aerial coverage. A coordinated UAV-UGV system can enhance robustness, ensure complete multi-layer crop observation, and enable more reliable spatiotemporal data fusion for intelligent, adaptive farm management.

3.1.1. Types of UAV Platforms

UAV systems vary widely in terms of size, weight, payload capacity, power source, and flight endurance. For agricultural remote sensing, UAVs commonly fall within the “small” (≤15 kg) or “light” (≤7 kg) categories, typically weighing under 116 kg. These UAVs generally operate at altitudes below 1000 m, which places them in either the low-altitude range (100–1000 m) or the ultra-low-altitude range (1–100 m) [16]. Among the various UAV configurations, the most widely adopted for remote sensing are multi-rotor platforms, fixed-wing UAVs, hybrid or vertical take-off and landing (VTOL) systems, and unmanned helicopters, each offering unique advantages in terms of manoeuvrability, endurance, and operational flexibility (Table 1).
UAVs can be classified based on their flight mechanisms into civilian and military categories [30]. In the context of agricultural remote sensing, civilian UAVs are predominantly used, with fixed-wing and multi-rotor systems being the most common configurations [16]. Among these, multi-rotor UAVs have emerged as the most widely adopted option due to their numerous operational advantages. Available in configurations such as quadcopters (four rotors), hexacopters (six rotors), and octocopters (eight rotors), these UAVs provide exceptional stability and safety during take-off and landing without the need for runways or large open spaces. Furthermore, they offer superior flight control, enabling easy adjustments in altitude and speed, which is crucial for precision agricultural tasks [31]. Figure 6 illustrates four representative UAV platforms commonly employed in agricultural applications.
Multi-Rotor UAVs
Multi-rotor UAVs are among the most dependable wing-based UAVs, recognized for their speed, agility, and ability to execute complex manoeuvres with precision. Over the past decade, they have emerged as one of the most significant advancements in UAV technology [22]. These systems rely on multiple rotors driven by brushless motors to generate lift, with each rotor capable of independently adjusting its rotation speed. This configuration provides superior stability, allowing the UAV to rapidly correct its altitude and orientation during disturbances. However, multi-rotor UAVs exhibit relatively low power efficiency, which limits their endurance and flight time. Despite these drawbacks, their versatility and ease of operation have made them the most widely adopted platforms for agricultural remote sensing and data acquisition.
Fixed-Wing UAVs
Fixed-wing UAVs are similar to traditional aircraft, operating with horizontal take-off and landing (HTOL) capabilities and requiring ample space for runway operations. Compared to multi-rotor platforms, they offer superior energy efficiency and can sustain longer flight durations, even gliding in the event of engine or controller failure [32]. Their extended endurance and high battery capacity make them ideal for large-scale coverage and long-distance path planning tasks [33]. The simplicity of their design and control system enhances reliability while keeping costs relatively low compared to other UAV types. Due to these advantages, fixed-wing UAVs have been widely adopted for applications such as large-area field mapping and livestock monitoring [34,35,36].
Hybrid UAVs (VTOL)
Another significant category of wing-based UAVs is the Hybrid Vertical Take-Off and Landing (VTOL) system, also referred to as a hybrid UAV or fixed-wing VTOL aircraft. These platforms are designed to operate effectively in diverse conditions, integrating features of both multi-rotor and fixed-wing UAVs. Typically, a hybrid UAV incorporates a fixed-wing structure combined with multiple rotors—usually four or more—allowing it to take off and land vertically like a multi-rotor while transitioning to efficient fixed-wing flight during cruising. This dual functionality offers the advantage of precise and controlled take-off and landing, coupled with energy-efficient long-range flight, making VTOL UAVs highly versatile for agricultural operations [31].
Unmanned Helicopters
Unmanned helicopters are single-rotor UAVs that share structural similarities with traditional helicopters. They utilize a large main rotor to generate lift and a tail rotor for directional stability and control. These platforms are known for their superior power efficiency and extended flight duration, enabling them to hover precisely in one location—an essential feature for applications such as aerial imaging and precision spraying. Despite these advantages, their complex mechanical blade system introduces significant vibration issues and increases overall cost, which can limit widespread adoption.

3.1.2. UAV-Mounted Sensors

UAV platforms can be equipped with various sensors tailored to the specific needs of agricultural monitoring tasks. Depending on the payload capacity of the UAV, these sensors may include high-resolution RGB cameras, multispectral and hyperspectral cameras, thermal imaging systems, and LiDAR sensors, among others (Figure 7). In agricultural applications, maintaining a lightweight payload is crucial for improving flight efficiency, precision, and overall performance. Broadly, these sensors can be classified into two main categories: imagery sensors, which capture visual and spectral data, and three-dimensional information sensors, which provide detailed spatial and structural measurements.
Imagery sensors, including RGB cameras, operate within the visible light spectrum and are widely utilized for applications such as vegetation mapping, land-use classification, and environmental assessment. In contrast, spectral sensors—such as multispectral and hyperspectral cameras—capture data across both the visible and near-infrared regions of the electromagnetic spectrum, extending beyond what the human eye can perceive. This capability allows for the detection and analysis of specific characteristics, including plant health and species differentiation, water quality assessment, and even mineral composition mapping. Thermal cameras capture electromagnetic radiation within the infrared region of the spectrum, enabling the detection of temperature variations and heat anomalies in crops. This capability is particularly valuable for agricultural applications such as assessing water stress in plants. On the other hand, LiDAR sensors employ laser pulses to measure distances between the UAV and target surfaces, facilitating the generation of highly accurate three-dimensional maps. In agriculture, LiDAR is commonly utilized to estimate parameters such as crop height, biomass, and leaf area index (LAI). Table 2 provides a detailed comparison of the features and functionalities of various UAV-mounted sensors.
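As a concrete example of how the multispectral data described above are typically turned into a crop-health indicator, the sketch below computes the widely used NDVI, (NIR − Red) / (NIR + Red), from two single-band rasters. The file names and the use of the rasterio library are assumptions for illustration.

```python
import numpy as np
import rasterio  # common choice for reading GeoTIFF band rasters

# Hypothetical single-band GeoTIFFs exported from a multispectral camera
with rasterio.open("red_band.tif") as red_src, rasterio.open("nir_band.tif") as nir_src:
    red = red_src.read(1).astype("float32")
    nir = nir_src.read(1).astype("float32")

# NDVI = (NIR - Red) / (NIR + Red), with the denominator clipped to avoid division by zero
ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)
print("mean NDVI over the field:", float(np.nanmean(ndvi)))
```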
Table 3 provides a curated summary of key studies on UAVs over the past decade (between 2015 and 2025), illustrating the diversity of UAV applications in agriculture, while also highlighting notable technologies, sensors, algorithms, and outcomes that reflect broader trends across the field. This summary captures the evolution of UAV development and deployment across different tasks and environments.

3.2. Unmanned Ground Vehicles (UGVs)

Unmanned ground vehicles (UGVs) are essential for the successful adoption of precision agriculture, serving as advanced platforms designed to support crop management and agricultural production. These vehicles integrate cutting-edge technologies such as sensors, artificial intelligence, big data analytics, edge and cloud computing, IoT, and automated control systems [10]. This integration enables capabilities in perception, decision-making, actuation, and execution for multi-degree-of-freedom autonomous operations [3]. The development of UGVs is rapidly advancing, with functionalities becoming increasingly sophisticated and diverse. They are equipped to perform comprehensive field coverage, recognize crop rows, and navigate autonomously while avoiding obstacles. They also facilitate seamless communication for production data and decision-making between humans and machines. Additionally, UGVs leverage advanced sensors and computer vision technologies to execute functions such as object detection, task planning, and system diagnostics. With rapid advancements in technology, these vehicles are becoming indispensable in modern agriculture, playing a key role in promoting sustainable farming practices [3,47]. Their operations typically take place in semi-structured or unstructured environments, where the primary targets include crops, weeds, vegetables, and livestock [48]. To function effectively in such settings, UGVs must simultaneously perceive their surroundings, make decisions, and carry out actions in real time.
UGVs are widely utilized across diverse agricultural operations such as soil preparation, crop phenotyping, yield estimation, crop monitoring, plant treatment, automated seeding, precision fertilization, weed management, and harvesting. They revolutionize soil preparation through automated ploughing, accurate seedbed formation, and comprehensive soil analysis. Integrated with sensors and AI technologies, these systems manage weed control and monitor soil health, ensuring optimal conditions for planting. Such advancements foster sustainability by reducing resource use, maintaining soil integrity, and enhancing productivity through data-driven and targeted interventions based on real-time soil information [49]. Seeding UGVs enable precise placement of seeds, significantly reducing time and operational costs for farmers. Leveraging remote and proximal sensing technologies for phenotyping, these robots can identify pests and diseases, perform 3D reconstruction for yield estimation, map flower and fruit distribution, and calculate canopy volume. Harvesting, being a repetitive and labour-intensive task requiring high precision, has seen extensive integration of UGVs equipped with robotic arms, vision systems, and actuators to automate the process, thereby easing farmers’ workload. Collectively, the deployment of UGVs in these operations enhances agricultural efficiency, boosts productivity, and saves farmers’ time for other non-field activities. Figure 8 illustrates the diverse application areas of agricultural UGVs. Notable studies in the field have developed and applied UGVs for various agricultural operations. These studies are discussed in the subsequent paragraphs.
Teng et al. [50] proposed AG-LOAM (Adaptive LiDAR Odometry and Mapping) to improve localization and mapping accuracy of autonomous agricultural robots in unstructured farm environments. The system adapts map updates based on motion stability and point cloud consistency to handle terrain irregularities and vegetation dynamics. Tested on a Clearpath Jackal robot with a 3D LiDAR sensor, AG-LOAM achieved centimeter-level accuracy and real-time performance, outperforming existing methods such as LeGO-LOAM (Lightweight and Ground-Optimized LiDAR Odometry and Mapping) and CT-ICP (Continuous-Time Iterative Closest Point). The approach offers a robust, infrastructure-free localization solution for reliable navigation in autonomous farming.
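To make the LiDAR odometry idea more tangible, the sketch below aligns two consecutive point clouds with a generic point-to-point ICP step using the Open3D library, recovering the incremental sensor motion. This is a simplified stand-in, not AG-LOAM's adaptive pipeline; the file names and ICP parameters are assumptions.

```python
import numpy as np
import open3d as o3d

# Hypothetical consecutive LiDAR scans saved as point-cloud files
prev_scan = o3d.io.read_point_cloud("scan_t0.pcd")
curr_scan = o3d.io.read_point_cloud("scan_t1.pcd")

# Point-to-point ICP: estimate the rigid transform that maps the current scan
# onto the previous one (i.e., the robot's incremental motion between scans)
result = o3d.pipelines.registration.registration_icp(
    curr_scan, prev_scan, 0.5, np.eye(4),
    o3d.pipelines.registration.TransformationEstimationPointToPoint())

print("estimated relative motion (4x4 SE(3) matrix):")
print(result.transformation)
```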
Linford and Haghshenas-Jaryani [51] developed a ground robotic system for crop and soil monitoring aimed at supporting precision agriculture through autonomous field data collection. The system integrates soil and environmental sensors, a Global Positioning System (GPS), and an Inertial Measurement Unit (IMU) for localization and mapping, alongside a Raspberry Pi–based control platform for real-time data processing. A custom four-wheel drive mobile robot was designed to navigate uneven terrain while collecting soil moisture, temperature, and pH data. Field experiments demonstrated accurate path tracking and reliable sensor data acquisition across varying soil conditions. Results showed strong correlation between robotic and manual measurements, confirming sensing reliability. The study highlights the potential of low-cost UGVs to enhance data-driven decision-making and sustainability in modern farming systems.
Wang et al. [52] designed and evaluated an automated latex harvesting robot intended to address labor shortages and inefficiencies in manual rubber tapping. The system incorporates a flexible actuator, a rotating collection cup with damping mechanisms, and a sensor suite comprising LiDAR, Radio Frequency Identification (RFID), and a precision ranging sensor for accurate spatial positioning. Structural optimization through dynamic modeling and simulation was employed to mitigate latex splashing, followed by comprehensive field trials. Experimental results demonstrated robust performance: at a forward velocity of 0.5 m/s, the robot achieved an average harvesting efficiency of 98.18%, with negligible latex waste (approximately 3.6 mL) and positioning errors constrained to 5.14 mm laterally and 0.56 mm vertically. Latex oscillation averaged 3.58 mm without external spillage, confirming operational stability. This innovation significantly enhances productivity and reduces labor dependency in rubber plantations, with future research directed toward autonomous navigation to eliminate reliance on rail-based movement.
Hemanth et al. [53] developed a low-cost UGV for pesticide spraying in chilli fields to reduce labor and health risks associated with manual methods. The UGV integrates an ESP32-CAM microcontroller for Wi-Fi-based remote control via the Blynk application, a 12 V DC geared motor for propulsion, and a mini submersible pump (120 L/hr) with mist nozzles for uniform spraying. Design calculations addressed torque (0.7687 Nm), rolling resistance, and gradient resistance, enabling an operating speed of 2.26 km/hr. Field-based specifications ensure suitability for small farms (1–3 acres), offering a cost-effective, safe, and efficient alternative to conventional spraying practices.
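The design calculations cited above (drive torque, rolling resistance, gradient resistance) follow a standard sizing procedure, sketched below with illustrative numbers rather than the values from the study: the required wheel torque is the sum of the rolling and gradient resistance forces multiplied by the wheel radius.

```python
import math

# Assumed, illustrative UGV parameters (not the cited study's values)
mass = 25.0          # kg, loaded vehicle mass
g = 9.81             # m/s^2
wheel_radius = 0.08  # m
c_rr = 0.08          # rolling-resistance coefficient on soft field soil
slope_deg = 10.0     # design gradient the UGV must climb
driven_wheels = 4

rolling_force = c_rr * mass * g * math.cos(math.radians(slope_deg))   # N
gradient_force = mass * g * math.sin(math.radians(slope_deg))         # N
torque_per_wheel = (rolling_force + gradient_force) * wheel_radius / driven_wheels

print(f"required torque per wheel: {torque_per_wheel:.3f} N*m")
```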
Although UGVs play an important role in precision agriculture by enabling autonomous navigation, crop monitoring, and targeted interventions that significantly reduce labor and improve productivity, they face limitations, including restricted mobility in muddy or uneven terrain, slower coverage rates for large fields, and dependence on ground-based navigation, which can be hindered by obstacles or crop density. These constraints can be effectively addressed by complementing UGVs with Unmanned Aerial Vehicles (UAVs). UAVs provide rapid aerial coverage, real-time imaging, and variable-rate spraying over expansive areas, enabling efficient monitoring and treatment where UGVs struggle. A combined UGV-UAV approach leverages the precision and payload capacity of ground systems with the speed and flexibility of aerial platforms, resulting in improved operational efficiency, reduced resource use, and enhanced adaptability for diverse agricultural environments.

3.2.1. UGV Configurations

Agricultural UGVs can be designed in multiple configurations depending on their intended application. Based on the drive mechanism, they are generally categorized into three types: wheeled UGVs, tracked UGVs, and legged robots. Wheeled UGVs rely on wheels for locomotion, while tracked UGVs utilize continuous tracks. These two configurations are the most widely adopted in agriculture. Legged robots, which employ articulated legs for movement, are less common in this domain [54]. Figure 9 illustrates the various UGV configurations used in agriculture. Each configuration offers specific advantages and limitations regarding stability, manoeuvrability, and control (Table 4). Typically, wheeled designs are favoured for their energy efficiency on moderate terrains, whereas tracked systems or high-traction wheels are preferred for challenging conditions such as deep mud or steep slopes [55].
Wheeled UGVs
Wheeled UGVs represent the most prevalent type of agricultural robots. They can be distinguished by both the number and type of wheels employed, with the four primary wheel types being standard, caster, Swedish, and ball/spherical wheels [56]. Based on mobility, wheeled UGVs can be broadly divided into two categories: (i) systems with locally restricted mobility, such as skid-steer and differential-drive UGVs, and (ii) systems with full mobility, like omnidirectional UGVs [54]. Skid-steer and differential-drive configurations dominate agricultural applications due to their simple mechanical design and ease of motion control. These robots manoeuvre by adjusting wheel speeds for forward, backward, and turning movements. In contrast, omnidirectional UGVs offer unrestricted movement in any direction, providing superior manoeuvrability and terrain adaptability but at the expense of increased mechanical complexity and cost. Additionally, some wheeled UGVs employ a configuration with two driving wheels and two steering wheels (2WD2WS), similar to Ackermann steering, allowing precise control over both speed and turning angles.
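For readers unfamiliar with the skid-steer and differential-drive motion control mentioned above, the short sketch below shows the standard kinematic mapping from left and right wheel speeds to the body's forward speed and yaw rate; the numeric values are illustrative.

```python
def diff_drive_twist(v_left, v_right, track_width):
    """Differential-drive kinematics: wheel ground speeds (m/s) -> body twist.

    Returns (forward speed in m/s, yaw rate in rad/s, CCW positive).
    track_width is the lateral distance between the wheel contact points (m).
    """
    v = (v_right + v_left) / 2.0
    omega = (v_right - v_left) / track_width
    return v, omega

# Right wheel slightly faster than left -> robot moves forward while turning left
print(diff_drive_twist(v_left=0.8, v_right=1.0, track_width=0.5))  # (0.9, 0.4)
```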
Tracked UGVs
Tracked UGVs feature continuous treads driven by two or more wheels, significantly increasing ground contact area and evenly distributing the vehicle’s weight. This design enhances traction, making them well-suited for challenging terrain conditions. Similar to how large agricultural tractors use tracks, these systems excel in environments where wheeled UGVs struggle, such as muddy or uneven fields, due to their ability to exert lower ground pressure. Their superior terrain adaptability allows them to operate effectively on rough and soft soils. From a control perspective, the kinematics of tracked UGVs closely resemble those of differential-drive wheeled robots, using similar motion control strategies.
Legged Robots
Legged UGVs employ articulated legs for locomotion, allowing them to traverse uneven terrain, navigate obstacles, and move across soft soils where wheeled or tracked robots face limitations. This configuration provides superior terrain adaptability and minimizes soil compaction, which is beneficial for preserving soil health. However, these advantages come at the cost of increased mechanical complexity, sophisticated control requirements, and significantly higher energy consumption compared to wheeled or tracked systems.

3.2.2. UGV-Mounted Sensors

Sensors are critical components of UGVs, providing essential data to the control unit for decision-making and navigation. These sensors can be categorized into four main types: (i) position sensors for determining the robot’s location, (ii) attitude sensors for measuring orientation, (iii) vision-based sensors for detecting obstacles and target objects, and (iv) safety sensors for handling emergency situations. Among positioning systems, GNSS, GPS, DGPS, and RTK-GPS are widely utilized, all operating on similar fundamental principles. For instance, GPS determines latitude and longitude using signals from three satellites and altitude from a fourth, achieving an accuracy of approximately 3 m [57]. Unlike machine vision guidance systems, GPS is not significantly impacted by weed density, shadows, missing plants, or other environmental conditions.
Attitude sensors are essential for autonomous navigation as they provide information on the vehicle’s orientation. When GPS signals are obstructed by tree canopies or crop cover, sensor fusion techniques can be employed to enhance positioning accuracy. A common example of an attitude sensor is the Inertial Measurement Unit (IMU), which accurately measures the UGV’s orientation. Machine vision sensors enable the vehicle to perceive its surroundings and determine its relative position and heading. Frequently used vision sensors in agricultural UGVs include depth cameras, tracking cameras, and RGB cameras. For safety applications, sensors such as ultrasonic sensors, laser sensors, radar, and LiDAR are widely adopted. LiDAR serves both as a safety and a primary navigation sensor, utilizing emitted light to calculate the distance to target objects. Modern LiDAR systems, equipped with multiple lasers or channels, can generate up to 2.2 million data points per second [58]. Figure 10 illustrates the most commonly utilized sensors in agricultural UGV development.
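A minimal sketch of the sensor-fusion idea mentioned above is given below: when the GNSS fix is degraded under canopy, short-term IMU gyro integration can be blended with the absolute GPS course whenever it is available. The blending gain and update logic are illustrative assumptions, not a specific published design.

```python
def fuse_heading(prev_heading, gyro_rate, dt, gps_heading=None, alpha=0.98):
    """Complementary filter for heading (radians).

    gyro_rate  : yaw rate from the IMU (rad/s), trusted over short horizons
    gps_heading: absolute course from GNSS, or None when the fix is blocked
    alpha      : weight on the dead-reckoned estimate (assumed value)
    """
    predicted = prev_heading + gyro_rate * dt      # short-term IMU prediction
    if gps_heading is None:                        # e.g., canopy blocks the signal
        return predicted
    return alpha * predicted + (1.0 - alpha) * gps_heading

heading = 0.0
heading = fuse_heading(heading, gyro_rate=0.05, dt=0.1, gps_heading=0.02)
print(f"fused heading estimate: {heading:.4f} rad")
```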
Table 5 provides a detailed summary of key studies on UGVs over the past decade (2015–2025), showcasing their diverse applications in agriculture and highlighting notable technologies, sensors, algorithms, and performance outcomes. This summary reflects the evolution of UGV design and deployment across tasks such as spraying, harvesting, soil monitoring, and navigation, offering insights into emerging trends and innovations in ground-based automation for precision farming.

3.3. Robotic Arms and End-Effectors

Robotic arms, also known as manipulators, are arm-type electro-mechanical devices that move within a confined workspace and terminate in a tool or end-effector. Manipulators are classified according to their degrees of freedom (DOF), type of joint, link length, and offset length. A robotic arm’s DOF is a crucial factor that determines its flexibility and operational capacity. For example, a 6-DOF arm allows the manipulator to move in three dimensions and control pitch, yaw, and roll, which makes it appropriate for reaching occluded fruits and negotiating intricate canopies [75]. Even more mobility is offered by 7-DOF arms, which is especially useful in high-density plantings where avoiding obstacles is essential. The efficiency of higher DOF for accurate and quick harvesting in congested situations was demonstrated by Davidson and Mo [76], who used a 7-DOF robotic arm for apple harvesting and achieved an 84.6% picking success rate with an average picking time of 7.6 s. However, simpler or more structured settings tend to use lower-DOF systems, such as 4-DOF or 5-DOF arms [77,78]. The manipulator's main task is to move its end-effector to a position where it can interact with a task object, and/or to orient the end-effector to perform pre-defined missions. Each manipulator and end-effector is usually designed for a specific task and environment; however, a special-purpose manipulator can perform various other tasks by fitting different end-effectors.
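As a toy illustration of the joint-angle-to-pose relationship underlying the DOF discussion above, the sketch below computes forward kinematics for a 2-DOF planar arm; the 6- and 7-DOF manipulators cited follow the same principle extended to full 3-D position and orientation. Link lengths and joint angles are arbitrary example values.

```python
import math

def planar_2dof_fk(theta1, theta2, l1=0.4, l2=0.3):
    """End-effector (x, y) in metres for two joint angles (radians).

    l1, l2 are assumed link lengths; real harvesting arms add more joints
    to control orientation (pitch, yaw, roll) as well as position.
    """
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

print(planar_2dof_fk(math.radians(30), math.radians(45)))
```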
The end-effector is the movable component of the picking robot that actually picks the fruit, and it influences both the robot’s picking efficiency and performance [78]. The design and functionality of the end-effector during harvesting influence the overall success rate of robotic systems for fruits and vegetables, which is often lower than the visual detection success rate. End-effectors are developed for specific crops to accommodate various crop types and physical characteristics. Based on the gripping method and the target crop’s properties, the end-effectors of picking robots can be categorized into scissor-type, suction (adsorption)-type, finger-clamp-type, and other types [78]. Scissor (shear) end-effectors cut the crop’s stalk directly with a shear-type device, and a hose positioned directly beneath retrieves the fruit; the aim is to prevent damage caused by the end-effector touching the fruit directly. However, this approach offers only a limited means of separation, and the fruit can be damaged by clamping during recovery after the pedicel is cut [78,79]. Suction end-effectors use suction cups to hold the fruit in place, and the stalk is separated either by pulling it off directly or by slicing it with a blade [75,78,80]. Lastly, the finger-clamp end-effector is equipped with several pliable fingers that grasp the fruit and twist or pull it away from the stalk [78]. In summary, the scissor-type end-effector has a straightforward mechanism that prevents clamping damage, but it offers a limited manner of separation and the fruit is vulnerable to damage during recovery after the stalk is cut. The suction end-effector has a complicated structure, poor suction reliability, and unstable positioning, despite the addition of suction cups. Fruit can be easily grasped by the finger-clamp end-effector, but when the stems are separated, the fruit is vulnerable to damage [78]. The agricultural environment is unstructured; hence, it is paramount to develop end-effectors and robotic arms that can adapt to this complex operating environment. Figure 11 shows the major types of end-effectors for harvesting robots.
Toward automated and intelligent fruit and vegetable harvesting, research on harvesting robots equipped with robotic arms and end-effectors has received considerable attention in recent years; some notable studies are summarised as follows. Yin et al. [75] developed and field-tested a fully autonomous citrus harvesting robot that can function in intricate orchard settings. For citrus harvesting, the system combined a new end-effector with a collaborative robotic arm with six degrees of freedom. To meet supermarket standards for stem length (≤1 cm), the end-effector gently pulled the citrus into the sleeve and precisely cut the stem using a sleeve structure, fork-shaped blades, and enclosed rubber rollers. The mechanical design minimized fruit damage, and a linear radar sensor kept track of the fruit’s feeding depth to guarantee the best cutting position. The results showed an average harvesting time of 10.9 s per fruit and an overall picking success rate of 87.2%.
Gao et al. [81] analyzed cherry tomato picking patterns to guide robotic end-effector design, identifying twisting and pulling as the most suitable methods for minimal plant disturbance and simpler mechanics. Two pneumatically controlled end-effectors were developed: a vacuum type (based on pulling) and a rotating type (based on twisting). Greenhouse tests showed the vacuum end-effector achieved 66.3% success with no fruit damage, while the rotating type reached 70.1% success but caused 5.2% fruit damage. Failures were mainly due to detachment (vacuum) and localization errors or collisions (rotating). Both designs demonstrate promise for efficient, compact, and low-disturbance robotic harvesting.
Yeshmukhametov et al. [82] introduced TakoBot, a cherry tomato harvesting robot featuring a cable-driven continuum arm with a passive pretension mechanism for smooth, sensor-free motion control. Its semi-spherical gripper with integrated cutting blades enables gentle grasping and detachment, reducing fruit damage and simplifying operation in humid greenhouse conditions. TakoBot achieved 96% tomato recognition accuracy and an average harvesting cycle of 56 s per fruit, demonstrating precise manipulation and reduced system complexity. Chen et al. [83] designed a pneumatic three-finger flexible spherical end-effector for fruit and vegetable picking, integrating vision recognition and multi-sensor fusion to optimize clamping force and minimize damage. Using the YOLOv3 algorithm for object detection and adaptive pressure control via pressure and torque sensors, the system achieved rapid, non-destructive gripping across multiple fruit types. Experiments confirmed stable clamping within preset safety intervals and efficient operation without external air sources. This approach enhances versatility and precision in robotic harvesting, supporting the development of intelligent, damage-free picking solutions for facility agriculture.
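The adaptive pressure and force regulation described for the pneumatic gripper above can be pictured with the simple proportional control loop sketched below, which raises the actuation pressure until the measured grip force enters a preset safety interval. The sensor readings, gains, and thresholds are invented for illustration and do not reproduce the cited system.

```python
def regulate_grip(read_force, set_pressure, target_n=2.0, tol_n=0.3,
                  gain=0.05, max_steps=50):
    """Proportional grip-force regulation for a pneumatic end-effector (sketch).

    read_force()      -> current clamping force in newtons (from a force/torque sensor)
    set_pressure(bar) -> command the pneumatic actuation pressure
    """
    pressure = 0.0
    for _ in range(max_steps):
        error = target_n - read_force()
        if abs(error) <= tol_n:          # force is inside the safe clamping interval
            return pressure
        pressure = max(0.0, pressure + gain * error)
        set_pressure(pressure)
    return pressure

# Example run with stand-in sensor/actuator functions (purely illustrative)
state = {"pressure": 0.0}
regulate_grip(read_force=lambda: 1.5 * state["pressure"],
              set_pressure=lambda p: state.update(pressure=p))
```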
Existing robotic arms and end-effectors have demonstrated commendable performance in tasks like citrus and cherry tomato harvesting, achieving reasonable success rates and reducing fruit damage compared to manual methods. However, they often suffer from task-specific designs, limited adaptability to diverse crop types, and inefficiencies in unstructured environments. Common issues include fruit damage from excessive clamping force, unreliable suction, complex mechanisms prone to failure, and slow cycle times, especially in dense canopies. To improve performance and efficiency, future designs should integrate modular end-effectors with adaptive gripping strategies, real-time force feedback, and AI-driven vision systems for dynamic adjustment. Combining lightweight continuum arms for flexibility with interchangeable multi-functional end-effectors can enhance versatility, reduce damage, and accelerate harvesting in complex agricultural settings.
Table 6 provides a detailed summary of key studies on robotic arms and end-effectors over the past decade (2015–2025), highlighting their applications in fruit and vegetable harvesting and the technologies, gripping mechanisms, sensors, and control strategies employed. This summary reflects the evolution of manipulator and end-effector designs across diverse crops and environments, offering insights into current challenges and innovations aimed at improving adaptability, precision, and efficiency in automated harvesting systems.

3.4. Unmanned Surface Vehicles (USVs)

Unmanned Surface Vehicles (USVs), often implemented as airboats, are usually adopted for above-water operations such as in paddy fields. An airboat is a flat-bottomed vessel propelled by an aerodynamic propeller powered by an electric motor or an automotive engine. Airboats have a low environmental impact and a shallow draft, which makes them well suited to environmental monitoring and agricultural production operations [95]. In agriculture, airboats are usually applied in paddy fields for operations such as cultivation of paddy seedlings, field tillage, paddy transplanting, spraying, weeding, fertilizing, and harvesting, since such operations cannot be performed with a wheel-type or crawler-type tractor without easily damaging the soil and paddy crops. Because a surface vehicle floats on the water, paddy seedlings are not crushed, making this a lower-cost, safer, and faster approach than using a ground vehicle such as a tractor or relying on manual work. Application scenarios of agricultural USVs are shown in Figure 12.
Unlike UAVs and UGVs, USVs have received comparatively less research attention in agricultural applications, primarily due to their limited operational scope and applicability. However, recent studies have begun exploring their potential for waterborne agricultural tasks, including paddy field weeding, paddy seedling cultivation, and paddy growth monitoring, amongst others. These emerging efforts highlight the growing recognition of USVs as valuable tools for aquatic environment sensing and precision water resource management, offering complementary capabilities to aerial and ground-based robotic platforms. Some of these notable studies are summarised below.
Moro et al. [96] developed an autonomous boat-type weeding robot for paddy fields, equipped with trailing chains for mud agitation and weed suppression. Using a mathematical model and Proportional-Integral-Derivative (PID) control algorithms, the system achieved precise path and speed control. Results showed a 71% reduction in path deviation and 10% longer operation time compared to non-PID systems. The robot maintained stable performance at target speeds of 0.5–1.0 m/s (straight) and 0.35–0.4 m/s (turns), even under 3 m/s wind disturbances, demonstrating its effectiveness for autonomous paddy field weeding.
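To make the PID path and speed control mentioned above concrete, the sketch below shows a generic discrete PID update of the kind such controllers use; the gains, setpoint, and measurement are illustrative placeholders rather than values from the cited study.

```python
class PID:
    """Textbook discrete PID controller (illustrative, not the cited implementation)."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: regulate boat speed toward a 0.75 m/s setpoint (assumed gains)
speed_pid = PID(kp=1.2, ki=0.1, kd=0.05)
command = speed_pid.update(error=0.75 - 0.60, dt=0.1)
print(f"propeller command adjustment: {command:.3f}")
```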
Murugaraj et al. [97] proposed an in-row mini mobile robot designed to operate autonomously in slurry paddy fields using a novel wheel and wheel-angle adjustment assembly. Experiments were conducted to optimise the wheel position and determine the optimum turning angle, which was found to be approximately 35° with an error of ±2°. The authors also reported that variation in the wheel angle influenced the distance required for the turning operation.
Kaizu et al. [95] developed a small autonomous electric robot boat to control excessive lotus growth in lakes, offering a low-cost, labor-saving alternative to manual management. The system used paddle propulsion to prevent plant entanglement and Real-Time Kinematic Global Navigation Satellite System (RTK-GNSS) for precise guidance. Operating at an average speed of 0.41 m/s with 9.4 cm lateral deviation, the robot maintained stability even in dense vegetation. Power consumption averaged 518 W, with a theoretical work rate of 0.133 ha/h. UAV imagery confirmed effective lotus suppression, particularly during late growth stages, demonstrating the robot’s efficiency and adaptability in aquatic vegetation control.
Liu et al. [98] developed a UAV-based vision positioning system for autonomous navigation of an agricultural airboat in paddy fields. Using field markers detected via the Fitzgibbon ellipse fitting method, the UAV tracked the airboat and sent real-time position updates via Bluetooth. Field tests achieved 0.10–0.17 m RMS lateral error, surpassing Differential GPS (DGPS) accuracy, demonstrating a low-cost and precise navigation alternative for paddy field operations.
Liu and Noguchi [99] developed an agricultural USV platform using a low-cost GPS compass for autonomous navigation in a paddy field. Based on the system architecture, the USV platform can work in two modes: manual mode by an original radio controller and automatic mode by an autonomous navigation system. Building upon their earlier work, Liu et al. [100] adapted the developed platform for paddy field weeding and crop monitoring, modeled using the Nomoto maneuvering model to achieve precise navigation control. Equipped with an onboard computer, GPS compass, and Inertial Measurement Unit (IMU), the system’s maneuverability indices (K and T) were derived from zigzag maneuverability tests. MATLAB (Version R2013a) simulations and field trials confirmed stable course control at 1.2 m/s, achieving 0.25 m tracking error at a 15° rudder angle, demonstrating reliable and accurate path-following for autonomous paddy operations.
Although USVs have shown great potential for paddy field and aquatic agricultural operations, their current designs face several limitations. Most existing systems are restricted by limited perception and navigation accuracy under dynamic water conditions, low adaptability to varying water depths and crop densities, and restricted autonomy due to reliance on external localization sources such as GPS or UAV-assisted vision. Additionally, energy inefficiency, payload constraints, and poor scalability hinder long-duration or large-scale deployment. To improve performance, future USVs should integrate multi-sensor fusion systems (combining LiDAR, sonar, and vision sensors) for robust perception, AI-based adaptive control for real-time path adjustment, and energy-efficient power systems such as solar-assisted propulsion. Furthermore, continued UAV–USV collaboration can enhance real-time navigation, task coordination, and monitoring by providing aerial situational awareness and dynamic path updates. UAVs can also assist USVs through aerial mapping and target tracking, improving precision and efficiency in aquatic farming operations.

3.5. Research Focus Trends in Agricultural Robotics (2015–2025)

Between 2015 and 2025, research on agricultural robots shows clear disparities in focus across different platforms (Figure 13). UGVs dominate with 35% of the reviewed studies, largely because researchers often design UGVs from scratch, incorporating custom hardware, sensors, and algorithms tailored for tasks like spraying, harvesting, and crop monitoring. In contrast, UAVs account for 25%, as most studies leverage commercial drones (e.g., DJI drones) for general tasks such as above-canopy crop monitoring rather than developing new platforms, limiting innovation-driven research. Robotic arms represent 30%, reflecting growing interest in precision harvesting and manipulation in complex environments, driven by advancements in vision systems and adaptive gripping technologies. USVs have the least attention at 10%, primarily due to their narrow applicability in paddy-field operations and challenges in scaling beyond niche use cases. These trends highlight that research intensity correlates with development complexity, task diversity, and potential for customization, with UGVs leading as the most versatile and innovation-rich domain.

4. Sensing Technologies for Agricultural Robots

4.1. Navigation Techniques

Autonomous navigation is an important part of agricultural robot autonomy, yet it remains a challenging problem in unstructured outdoor farm environments [101,102]. Agricultural fields present highly variable conditions including uneven terrain, changing weather (rain, dust, fog), dynamic obstacles, and dense vegetation, all of which demand robust sensing and control systems for reliable navigation. Unlike controlled indoor settings or well-paved urban roads, farm environments can vary from open plains of row crops to cluttered orchards with overhanging canopies. Navigation systems must therefore be tailored to cope with these mostly unstructured conditions, leveraging multiple sensors and algorithms to ensure the robot knows where it is (localization) and can move safely to its target (guidance and path planning). Figure 14 shows the most common agricultural navigation modalities.

4.1.1. Localization Methods

Modern agricultural robots use a combination of global and relative localization techniques. Global Navigation Satellite Systems (GNSS), especially RTK-GPS, are widely adopted on large farm machinery for absolute positioning with centimetre-level accuracy [103]. For example, RTK-GNSS-based guidance on tractors and sprayers can achieve lateral path-tracking errors on the order of 5–6 cm in straight rows. This is sufficient for many row-crop operations and has enabled automatic steering and headland turning in commercial auto-guidance systems. However, GNSS alone has well-documented limitations. Its signals suffer from multipath errors and can be completely blocked under tree canopies or inside greenhouses. In orchards and under-canopy scenarios, an agricultural robot cannot rely on RTK-GPS for consistent positioning. Inertial navigation systems (INS) (accelerometers/gyros) are often combined with GNSS to bridge short signal outages, but pure INS drifts over time if not corrected. Wheel odometry is similarly prone to cumulative error on soft, uneven ground. As a result, sensor fusion is critical: GNSS is frequently integrated with INS, wheel encoders, and other sensors via Kalman filtering to improve accuracy and robustness. Fusing RTK-GPS with an IMU, for instance, yields smooth state estimates and compensates for momentary satellite dropouts.
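To make the fusion idea concrete, the following minimal Python sketch fuses IMU-predicted motion with intermittent GNSS fixes using a linear Kalman filter in a single along-track dimension. It illustrates the principle only; the matrices, noise values, and one-dimensional state are assumptions, and practical systems estimate full 3D pose, sensor biases, and wheel odometry with extended or error-state filters.

```python
# Minimal sketch of GNSS/IMU fusion with a linear Kalman filter (1D along-track
# position/velocity). Illustrative only; real systems fuse full 3D pose, biases,
# and wheel odometry, typically with an extended or error-state Kalman filter.
import numpy as np

dt = 0.1                                   # filter period [s]
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity motion model
B = np.array([[0.5 * dt**2], [dt]])        # how an IMU acceleration enters the state
H = np.array([[1.0, 0.0]])                 # GNSS measures position only
Q = np.diag([0.02, 0.05])                  # process noise (model/IMU uncertainty)
R = np.array([[0.03**2]])                  # RTK-GNSS noise (~3 cm standard deviation)

x = np.zeros((2, 1))                       # state: [position, velocity]
P = np.eye(2)

def predict(accel):
    """IMU-driven prediction step."""
    global x, P
    x = F @ x + B * accel
    P = F @ P @ F.T + Q

def update(gnss_pos):
    """GNSS correction step; skipped automatically during signal dropouts."""
    global x, P
    if gnss_pos is None:                   # e.g., under canopy: keep dead-reckoning
        return
    y = np.array([[gnss_pos]]) - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
```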
When external references are unreliable, robots turn to relative localization based on onboard perception. Vision-based navigation is a compelling approach in agriculture due to the low cost and richness of camera data. Machine vision systems can identify crop row lines, tree trunks, or artificial landmarks to guide robots within the field. Cameras provide abundant information (colour, texture, shape) that can be used to recognize field features and even estimate distances via stereo vision or structure-from-motion. Vision is especially useful for obtaining the relative pose of the robot with respect to crop rows or targets, which GNSS alone cannot provide. For instance, Sun et al. [104] note that while GNSS gives global coordinates, it does not indicate the robot’s position relative to crop rows, so relying solely on predefined GPS waypoints could result in machinery drifting off the actual plant lines leading to crop damage and other unintended consequences. By contrast, onboard cameras or Time-of-Flight (ToF) sensors can detect the row structure directly. Gai et al. [69] demonstrated an under-canopy ToF camera system on a PhenoBot field robot that detected corn rows and achieved a mapping accuracy within ~3.5 cm mean error. Vision-based localization can thus enable precise in-row navigation even without GPS, but it introduces its own challenges: lighting changes (day/night, shadows) and seasonality can alter the appearance of crops dramatically. Under a corn canopy, for example, lighting is uneven, and foliage occlusion is high, making pure RGB vision less reliable. Overhanging leaves and narrow camera field-of-view may also limit what the robot “sees,” potentially missing upcoming turns or obstacles. These issues have been addressed through robust computer vision algorithms (e.g., adaptive thresholding, high dynamic range (HDR) imaging, or learning-based segmentation) and by complementing cameras with other sensors.
LiDAR-based navigation provides another avenue, leveraging active laser scanning to measure distances independently of lighting conditions. LiDAR can produce high-resolution 3D maps of the surroundings, which is valuable for both localization and obstacle detection. In open farm fields, a scanning LiDAR on a rover can perform SLAM—concurrently mapping the field terrain or crop rows and localizing the robot within that map. In orchards, LiDAR is effective for detecting structures like tree trunks or trellises to guide navigation down lanes. A drawback, however, is that LiDAR returns contain no colour/texture information: a tree trunk and a person both appear as clusters of points, so additional processing or sensor fusion is needed to classify objects. Moreover, the performance of LiDAR can degrade in heavy dust (such as during harvesting or tillage) which scatters the laser beams. The cost and power consumption of high-end 3D LiDAR units remain non-trivial, though solid-state and lower-cost LiDAR sensors are emerging. Recent studies have pursued cost-effective LiDAR setups; for example, Kim et al. [105] used a 2D LiDAR with a clever row-detection algorithm to keep a robot centred between crop rows, demonstrating that even a single-plane laser can guide in-row travel if the algorithm balances distances to the flanking crop lines. Building on this, Kim’s group developed P-AgNav, a navigation framework using a 3D LiDAR’s “range image” (a panoramic depth view) to enable a robot to autonomously traverse multiple corn rows and even switch rows without GNSS or predefined waypoints. Their approach was motivated by the observation that conventional path planners would treat harmless crop foliage as obstacles; overhanging leaves not present in an early-season map could later trigger false re-routes. Using LiDAR imaging and adaptive algorithms, the robot could distinguish true obstacles from mere crop presence and navigate under dense canopies where classical GPS-based or vision-based methods struggled.
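A simplified version of such single-plane row following can be written in a few lines: average the ranges returned to the left and right of the robot's heading and steer to balance them. The angular windows, range cut-off, and gain below are illustrative assumptions rather than values from the cited systems.

```python
# Illustrative row-centring rule for a single-plane (2D) LiDAR, in the spirit of
# keeping the robot equidistant from the flanking crop rows. Beam geometry,
# gains, and thresholds are assumptions, not values from the cited work.
import numpy as np

def steering_from_scan(ranges, angles, max_range=3.0, k_p=1.2):
    """Return a steering command that balances left/right clearance to the rows."""
    ranges = np.asarray(ranges)
    angles = np.asarray(angles)
    valid = ranges < max_range                      # ignore returns beyond the rows
    left = ranges[valid & (angles > 0.2)]           # beams pointing left of centre
    right = ranges[valid & (angles < -0.2)]         # beams pointing right of centre
    if len(left) == 0 or len(right) == 0:
        return 0.0                                  # a row is missing: go straight
    error = np.mean(left) - np.mean(right)          # >0 means closer to the right row
    return float(np.clip(k_p * error, -0.5, 0.5))   # proportional steering [rad]
```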

4.1.2. Pre-Mapping vs. On-Line Planning

A key design decision in navigation is whether to rely on prior maps. Deterministic route planning (global path pre-computation) can find an optimal path if an accurate field map is available. For instance, a drone (UAV) can survey a field and produce a map for coverage path planning before the ground robot starts. Mansur et al. [106] showed that UAV-derived maps of row crops can facilitate route optimization and obstacle identification in advance, which is especially beneficial for small robots with limited onboard computing capacity. The limitation is that any changes in the environment (new obstacles, moving animals, crop growth) after mapping will not be reflected unless the robot also senses in real time. On the other hand, sensor-based planning generates paths on the fly using live sensor data. This reactive approach ensures the robot can deal with unforeseen obstacles, but processing rich sensor data (camera feeds, LiDAR clouds) in real time can strain the robot’s computational resources. A hybrid strategy is often ideal: use a coarse global plan (e.g., waypoints at row ends or sectors of the farm) and combine it with local reactive adjustments as sensors detect immediate hazards.
Research in agricultural robot navigation has been very active, with trends mirroring advances in computer vision and robotics at large. Deep learning has been introduced to navigation perception, for example, Chen et al. [107] developed a semantic line detection framework to identify navigation lines (e.g., crop row centrelines) in camera images. Through a novel geometry-aware loss function, their network significantly improved the accuracy of detecting these guidance lines for crop rows. Similarly, Yu et al. [108] evaluated several deep semantic segmentation models (UNet, ENet, DeepLabV3+, BiseNet) for field furrow detection on edge computing devices. They found that DeepLabV3+ offered the best accuracy, demonstrating that modern CNNs can run in real-time at the edge and provide reliable row segmentation. The push toward lightweight models is important for agriculture, since robots often must process data onboard with limited hardware. Another innovation is robust navigation in GNSS-denied environments such as orchards. Raikwar et al. [66] implemented a model-based navigation controller for an orchard robot that assumes no GPS availability. Their system used a prior map of the orchard and a three-tier architecture (path planning, motion planning, vehicle control) to guide the robot through orchard rows using minimal sensors. Notably, the robot was able to follow straight and turning paths reliably using only onboard localization relative to the provided map. This suggests that in structured farming environments (like orchards with consistent row geometry), a high-fidelity map plus odometry/IMU can substitute for GPS, as long as the control system is robust to slight localization errors. Other researchers have improved vision-based path extraction in structured settings like greenhouses. For instance, Chen et al. [109] designed a modified Hough Transform algorithm to fit a cucumber-picking robot’s path in a greenhouse, achieving a maximum deviation under 0.5° from the true path and real-time performance (~17 ms processing). Such accuracy is crucial for guiding robots in narrow confines without hitting crops. Table 7 provides a comparative summary of the main navigation techniques and their trade-offs in agricultural environments.

4.1.3. Suitability in Different Farm Environments

In practice, navigation systems are often customized to the crop environment. The following explains the different crop environments where navigation systems are applied.
Row crops (e.g., maize, soybeans): Straight parallel rows allow simpler guidance (following a straight line). Here, RTK-GNSS is very effective for maintaining rows, as evidenced by widespread auto-steer systems on tractors. However, reliance on GNSS means the vehicle follows predetermined tracks and might crush crops if actual planting rows are shifted from mapped lines. To address this, modern systems increasingly incorporate computer vision to adjust to the true crop rows in real time. Deep-learning-based row detection [65] has recently been applied to handle more complex planting patterns (e.g., a soybean–corn interplanting scenario). Sun et al. [104] developed a lightweight YOLO-based segmentation model to detect dual crop rows at 61 FPS, achieving ~95% line-fitting accuracy and thus balancing precision and speed for real-world field use. Nkwocha and Wang [110] developed a deep learning-based navigation system for agricultural robots, achieving high segmentation accuracy (mIoU = 96.50%) and reducing navigation line errors to 1.1–1.6°, ensuring precise navigation under varying field conditions. Figure 15 shows a pipeline of deep learning-based path segmentation and navigation line extraction for agricultural robots; a minimal sketch of the line-fitting step in such a pipeline is given after this list.
Orchards and vineyards: These involve traversing lanes between trees or vines, often under partial canopy. As noted, GNSS coverage is patchy; accordingly, robots in orchards lean on LiDAR or vision to follow the corridor of tree trunks. For instance, automated orchard sprayers have used LiDAR to sense the gap between tree rows for navigation. At the same time, the presence of regular structure (rows of trees) can be exploited; some systems use path playback (teaching the robot a route by driving it once) or rely on prior orchard maps. The orchard environment also exemplifies the need for tight path tracking. Raikwar et al. [66] emphasized inertial stabilization and continuous path re-planning to keep an orchard robot precisely on course in turns.
Unstructured terrain: Pastures, forests, or irregular fields pose the toughest challenge, as there are no man-made row patterns to guide the robot. Here, SLAM becomes crucial; the robot must build and use its own map. Researchers have demonstrated mapping of rough fields using multi-sensor SLAM (e.g., combining visual and LiDAR SLAM), but ensuring long-term accuracy over large areas and repeat visits (season to season) is still an open research problem. In general, despite significant progress, achieving the level of navigation reliability seen in self-driving cars is harder in agriculture due to the lack of road-like consistent features and the harsher environmental variability. Nevertheless, ongoing innovations, from advanced sensor fusion to deep learning for perception, continue to narrow this gap, moving agricultural robots closer to true autonomy.
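Returning to the row-crop case referenced above, the sketch below illustrates the generic line-fitting step that typically follows a segmentation network in pipelines like that of Figure 15: collect the lateral centre of the segmented path in each image row and fit a straight navigation line through those centres. The segmentation model itself is assumed to exist and is not reproduced here.

```python
# Generic sketch of extracting a navigation line from a binary path/row mask
# produced by a segmentation network (the network and preprocessing are placeholders).
import numpy as np

def navigation_line(mask):
    """Fit a straight guidance line x = m*y + c through the per-row centres of the mask."""
    ys, xs = np.nonzero(mask)                    # pixel coordinates of the segmented path
    rows = np.unique(ys)
    if len(rows) < 2:
        return None                              # not enough evidence for a line
    centres_x = np.array([xs[ys == r].mean() for r in rows])
    m, c = np.polyfit(rows, centres_x, deg=1)    # slope encodes the heading offset
    return m, c
```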

4.2. Object Detection

Object detection in agricultural robotics typically refers to perceiving target objects of interest (such as fruits, vegetables, or weeds) in the environment so that the robot can act on them (e.g., harvesting or precision spraying). This perception capability is critical because nearly all subsequent robotic actions including grasping a fruit, cutting a vegetable stem, or spraying a weed, depend on first correctly detecting and localizing the target. Agricultural settings make detection challenging: the targets (crops) are often naturally camouflaged by foliage, subject to variable lighting (bright sun vs. shadows under leaves), and can appear in cluttered, dynamic backgrounds (swaying branches, dirt, other plants). Furthermore, a farm robot may encounter different crop varieties, growth stages, and environmental conditions, so its object detection algorithms must be robust to high variability.
The object detection pipeline for agricultural robots generally involves acquiring sensor data (images or point clouds), extracting features that differentiate the target from its surroundings, and then recognizing/classifying those features as a particular object. Early approaches relied heavily on identifying simple morphological features of crops. According to Zhao et al. [111], the most widely used features for fruit/produce detection were basic visual attributes like colour, shape, and texture, based on the premise that ripe produce exhibits distinct colour or shape characteristics that set it apart from leaves or stems. For example, a ripe apple might be red and roughly spherical against green leaves. Such features can be extracted from standard 2D RGB images. Indeed, many classical vision methods showed that with careful thresholding and filtering, one can detect fruits under certain conditions. Lili et al. [112] achieved a 99.3% success rate in ripe tomato recognition and an 86% picking success rate with a load-bearing capacity of 1.5 kg. Similarly, algorithms focusing on geometric shape have been used to find fruits that approximate circles or ellipses in the image. Gongal et al. [113] employed geometric feature extraction to reduce the influence of illumination changes when detecting produce that had a similar colour to the background. Their approach assumed the fruit had a known simple shape (e.g., roughly circular in 2D, spherical in 3D) and searched for those shapes in the visual data. Such model-based detectors work well for relatively regular fruits (apples, oranges) under stable lighting but can struggle with complex shapes or when fruits are heavily occluded by leaves.
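The classical colour-plus-shape recipe can be illustrated with a short OpenCV sketch that thresholds "ripe red" pixels in HSV space and keeps only sufficiently large, roughly circular blobs. The HSV ranges and the area and circularity thresholds are illustrative and would need tuning per crop and lighting condition; this is not the specific method of any of the studies cited above.

```python
# Classical colour-plus-shape detector in the spirit of early fruit-picking vision:
# threshold "ripe red" pixels in HSV, then keep only roughly circular blobs.
# The HSV ranges and size/circularity thresholds below are illustrative.
import cv2
import numpy as np

def detect_red_fruit(bgr_image):
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # Red wraps around hue 0, so combine two hue bands.
    mask = cv2.inRange(hsv, (0, 80, 60), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 80, 60), (180, 255, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    fruit = []
    for c in contours:
        area = cv2.contourArea(c)
        perim = cv2.arcLength(c, True)
        if area < 300 or perim == 0:
            continue
        circularity = 4 * np.pi * area / perim**2   # 1.0 for a perfect circle
        if circularity > 0.6:
            fruit.append(cv2.boundingRect(c))       # (x, y, w, h) of a candidate fruit
    return fruit
```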
A major limitation of simple colour/shape features is their sensitivity to environmental conditions. Lighting changes (time of day, shadows, highlights on shiny fruit skin) can skew colour values, and complex backgrounds can confuse shape detectors. To tackle this, researchers explored features less susceptible to lighting, for instance, spectral reflectance features outside the visible spectrum and thermal response [114]. Multispectral and hyperspectral cameras capture reflectance in infrared or other bands where the contrast between fruit and foliage might be higher. Thermal cameras can detect temperature differences; for example, some fruits heat up or cool down differently than leaves under sunlight. These modalities have shown promise (e.g., distinguishing fruits by their spectral signature or detecting ripe fruits via thermal emission at night), but they require specialized (often expensive) sensors. A simple monocular RGB camera cannot capture this information, so additional spectral cameras or thermal imagers must be deployed. This increases system complexity and cost, which is why RGB cameras remain the default sensor in many practical systems despite their limitations.
The past decade has seen the rise of 3D sensing for crop detection, thanks to affordable depth cameras (RGB-D sensors) and LiDARs. With 3D point clouds, one can compute features like surface normals and curvature, which describe the 3D shape of objects. These 3D features are invariant to colour and lighting, making them attractive for fruit detection under occlusion. For instance, Wu et al. [115] used a 3D Euclidean clustering on point clouds to group points belonging to individual peaches, improving a support vector machine (SVM) classifier’s ability to distinguish peach clusters from leaves. Sa et al. [116] calculated surface normals to find the symmetry axis of sweet peppers, which helped locate the peppers even when their stems (peduncles) were small or hidden. Depth-based segmentation can also separate objects by distance, useful when colour alone would merge a fruit and a leaf that overlap in the image but sit at different depths. As another example, shape matching algorithms in 3D have been developed: Lin et al. [117] detected green peppers and eggplants by matching point cloud data to 3D shape templates of the produce. Because 3D data provides the true structure, these methods can recognize produce of a known shape even when colour contrast is low (e.g., green pepper in green foliage). The downside is that depth sensors can be affected by their own noise (outdoor sunlight can interfere with some infrared-based depth cameras, and LiDAR has the dust issue noted earlier) and the data processing is heavier.
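As a minimal stand-in for the Euclidean clustering described above, the following sketch groups 3D points into candidate fruit clusters with DBSCAN and filters them by physical size. The eps, min_samples, and diameter bounds are assumptions that depend on sensor resolution and fruit dimensions.

```python
# Sketch of separating fruit candidates in a point-cloud view by spatial clustering
# (DBSCAN here as a stand-in for Euclidean clustering); parameters are illustrative.
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_fruit_points(points_xyz, eps=0.03, min_samples=40):
    """Group 3D points (N x 3, metres) into candidate fruit clusters."""
    points_xyz = np.asarray(points_xyz)
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points_xyz)
    clusters = []
    for label in set(labels) - {-1}:                 # -1 marks noise points
        cluster = points_xyz[labels == label]
        centroid = cluster.mean(axis=0)
        diameter = np.linalg.norm(cluster.max(axis=0) - cluster.min(axis=0))
        if 0.04 < diameter < 0.15:                   # keep fruit-sized clusters only
            clusters.append((centroid, cluster))
    return clusters
```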
Despite the success of well-crafted feature extraction, modern trends clearly favour machine learning (ML) approaches, particularly deep learning, for object detection in agriculture. ML algorithms can learn to combine multiple features (colour, texture, shape, etc.) in a weighted manner, often outperforming single-feature heuristics especially under varied conditions. Classical ML methods like SVM and k-Nearest Neighbours (KNN) were initially popular. For example, Luo et al. [118] used K-means clustering to segment vine canopy images by colour and isolate grape clusters, then applied edge detection to refine the result. In another study, Méndez et al. [119] combined a 3D LiDAR and K-means clustering to detect oranges by their roughly spherical shape. KNN classifiers have been used to classify image pixels as “fruit” or “background” based on colour features, yielding better detection of plums compared to simple neural networks in one case. SVMs have also shown good accuracy for fruit pixel classification while being computationally efficient (only storing support vectors rather than large models). These conventional ML methods require manual feature selection (e.g., choosing colour indices or shape descriptors to feed into the classifier), which can limit their adaptability. They work well in the conditions they were tuned for but may need re-tuning for different crops or lighting.
Deep learning, especially Convolutional Neural Networks (CNNs), has revolutionized object detection globally and has made strong inroads into agricultural vision in recent years. CNN-based detectors automatically learn features from training data, which is a huge advantage given the complexity of plant environments. Researchers have applied state-of-the-art detection models to various crops: Fu et al. [120] utilized a Faster R-CNN as a kiwifruit detector towards robotic harvesting of the crop. This approach achieved a high accuracy (up to 92.3% recognition rate) of kiwi fruit detection on the test images. Faster R-CNN and its variants can detect objects with high accuracy, though computational load can be significant. To address speed, some have turned to one-stage detectors like YOLO (You Only Look Once) [121] (Figure 16) and SSD (Single Shot Detector), which are known for real-time performance. Onishi et al. [122] used SSD to accurately detect apples in orchard images, enabling the robot to locate fruit in near real-time using stereo camera inputs. Likewise, YOLOv3 was employed by Birrell et al. [123] to detect iceberg lettuce in the field, chosen for YOLO’s fast inference speed which is crucial when the robot is moving and needs instant feedback. Multiple studies [124,125] echoed the success of YOLO in field conditions. For more complex scenarios, Mask R-CNN (which provides pixel-level segmentation of objects) has been used; for example, Yu et al. [126] used Mask R-CNN with a ResNet-50 and Feature Pyramid Network (FPN) backbone to accurately detect and segment strawberries. Through this method, they reported a 96% precision and recall, outperforming traditional methods even with overlapping fruits and variable lighting. Similarly, Yang et al. [127] applied Mask R-CNN in a citrus orchard to simultaneously identify fruits and branching structure; the latter information helped reduce collisions by the robotic arm during picking. These deep learning models have achieved higher detection accuracy and robustness than earlier methods, especially under challenging conditions like varying lighting or complex foliage backgrounds. Moreover, they can detect multiple objects in one frame and handle different viewpoints of the fruit (thanks to large training datasets encompassing many variations).
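For reference, running a one-stage detector of this kind is now only a few lines of code. The sketch below uses the Ultralytics YOLO API and assumes a hypothetical weights file ("fruit_yolo.pt") fine-tuned on fruit imagery, rather than any of the specific models in the cited studies.

```python
# Minimal inference sketch with a one-stage detector (Ultralytics YOLO API),
# assuming a model already fine-tuned on fruit imagery; the weights file name
# "fruit_yolo.pt" is a placeholder, not a published model.
from ultralytics import YOLO

model = YOLO("fruit_yolo.pt")                 # hypothetical fine-tuned weights
results = model("orchard_frame.jpg", conf=0.5)

for box in results[0].boxes:
    x1, y1, x2, y2 = box.xyxy[0].tolist()     # pixel bounding box of a detected fruit
    cls_name = model.names[int(box.cls[0])]
    print(f"{cls_name}: ({x1:.0f}, {y1:.0f})-({x2:.0f}, {y2:.0f}), conf={float(box.conf[0]):.2f}")
```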
Despite these advances, challenges remain and constitute active research gaps. One issue is occlusion: fruits often grow under leaves or in clusters, and even the best vision algorithms can miss targets that are partially hidden. Approaches like 3D completion have been explored; Wang et al. [128] used symmetry assumptions to reconstruct partially visible fruits in point clouds, but more work is needed for reliably detecting heavily occluded produce. Another challenge is the variability in appearance across orchards and seasons. A model trained on one orchard or crop variety might not generalize to another due to differences in colour, fruit size, or canopy structure. This is pushing research towards domain adaptation techniques and large, diverse training datasets for agricultural objects. The need for large, labelled datasets is itself a hurdle: unlike common objects (cats, chairs, etc.), agricultural datasets are not as abundant, although efforts like public fruit image datasets are growing. To alleviate data requirements, researchers use data augmentation (as in [129], where simulated crop images were augmented with rotations, brightness changes, etc., to train a YOLO-based detector) and transfer learning from generic vision models. Another frontier is multi-modal detection: combining colour cameras with thermal or multispectral data in a deep network to leverage complementary information. For example, a multispectral CNN could potentially detect ripe fruits by combining visible and near-infrared bands (where fruit might reflect differently than leaves), improving robustness under challenging light. Some preliminary works in orchards have used colour-depth fusion (merging RGB and depth images) to improve segmentation of fruits in dense foliage, hinting at the potential of sensor fusion in detection.
It is worth noting that object detection in agriculture is increasingly tied to the action it enables. In harvesting robots, detection is followed by localization (exact 3D position of the fruit) and then manipulation (grasping). Systems like that of Zhao et al. [129] integrate detection networks (YOLO-based) with a secondary network to determine grasp points for a robotic gripper. Similarly, in weeding or spraying, detection of a plant might be paired with a tracking system so that the robot knows which plants have been treated. For instance, Hu et al. [130] developed “LettuceTrack”, which uses YOLOv5 to detect lettuce heads and then a tracking algorithm to assign IDs to each lettuce, ensuring the robot sprays each plant exactly once. Such integration of detection with tracking and decision-making is vital for efficient operation. In summary, object detection for agricultural robots has evolved from simple thresholding to sophisticated multi-sensor, AI-powered vision. Deep learning has significantly improved detection accuracy and speed, but ongoing research is addressing how to make these systems generalize across environments and run on resource-constrained robotic platforms. The ultimate goal is a detection system as reliable as a human’s eye, one that can spot the target produce under any realistic farm condition and do so fast enough for real-time robotic action.

4.3. Obstacle Avoidance Strategies

Safe navigation in agricultural fields not only requires detecting crops of interest but also avoiding various obstacles that could impede the robot or cause damage. Obstacle avoidance in this context means perceiving whether the space ahead of the robot is clear or blocked and then maneuvering the robot to prevent collisions. The types of obstacles an agricultural robot might face are diverse. Reina et al. [131] categorized four typical obstacle types in farms: positive obstacles (physical objects above ground like rocks, poles, equipment), negative obstacles (gaps such as ditches or holes in the terrain), terrain hazards (challenging ground like water or steep slopes), and moving obstacles (animals, people, or other machines). Current Ag robots have only a limited understanding of such obstacles, and the variation in obstacle types across different farming domains (row crops vs. orchards, etc.) makes universal solutions difficult.
Sensors for Obstacle Detection: Just as in navigation, a variety of sensors are employed to detect obstacles, each with its strengths and weaknesses. Vision sensors are a cost-effective choice that provide rich information. Early work used single cameras (monocular vision) and stereo camera rigs to perceive obstacles. A stereo camera can produce a depth map via disparity matching, effectively giving 3D structure of the scene. This allows detection of obstacles by their protrusion above ground (for positive obstacles) or depressions (for negatives) while also providing visual context (colour/texture) to identify what the object is. For example, Yan and Liu [132] implemented a stereo-vision obstacle detector using a combination of fuzzy logic and a neural network classifier. Their system could successfully detect common obstacles in farms (like rocks or tree stumps), leveraging depth data, but it struggled with moving obstacles since those require temporal tracking logic beyond a single frame. Another study by Qiu et al. [133] took advantage of modern vision algorithms by using the YOLOv3 CNN to detect objects in the scene and the DeepSORT tracking algorithm to follow their motion. This approach enabled detection and tracking of moving obstacles (e.g., people) in a paddy field in real time, but it required a large, labelled dataset to train the detector, a non-trivial task for agricultural environments. Xu et al. [134] tackled moving obstacle detection through a different route: they used optical flow (via the Lucas-Kanade method) to first identify regions of motion in the panoramic view of a moving farm vehicle, then applied K-means clustering to segment these moving objects. This unsupervised approach could flag dynamic obstacles without prior training but might not classify what the obstacle is.
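The optical-flow-then-cluster idea can be sketched as follows: track sparse features between consecutive frames with pyramidal Lucas–Kanade, keep the points whose displacement exceeds a threshold, and cluster them into candidate moving obstacles. The thresholds and cluster count are illustrative, and the sketch omits the ego-motion compensation that a moving vehicle would require.

```python
# Sketch of flagging moving obstacles from two consecutive greyscale frames: track
# sparse features with pyramidal Lucas-Kanade, then cluster the strongly moving points.
# Thresholds and cluster count are illustrative; ego-motion compensation is omitted.
import cv2
import numpy as np
from sklearn.cluster import KMeans

def moving_obstacle_regions(prev_gray, curr_gray, motion_thresh=2.0, n_clusters=2):
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400, qualityLevel=0.01, minDistance=7)
    if p0 is None:
        return []
    p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, p0, None)
    good_new = p1[status.ravel() == 1]
    good_old = p0[status.ravel() == 1]
    motion = np.linalg.norm(good_new - good_old, axis=2).ravel()
    moving = good_new.reshape(-1, 2)[motion > motion_thresh]   # points that moved noticeably
    if len(moving) < n_clusters:
        return []
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(moving)
    return [moving[labels == k].mean(axis=0) for k in range(n_clusters)]  # cluster centres
```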
Apart from cameras, LiDAR is extensively used for obstacle detection on field robots. A forward-facing 2D LiDAR can scan a plane and detect any object protruding into that plane (e.g., a wall, an animal) by measuring the absence of expected free space. For instance, Bayar et al. [135] outfitted an autonomous orchard vehicle with a 2D LiDAR (laser scanner) so that it could localize itself between tree rows in a fruit orchard and control its steering to follow the rows. They built a model-based controller along with row-detection from LiDAR scans, enabling the vehicle to turn between rows and maintain path accuracy. Field tests showed the system could reliably follow the tree rows using LiDAR-based geometry, though the authors noted challenges in computation, sensing in complex orchard layouts, and reliance on geometry alone. However, one challenge with this method is the cost of mapping with LiDAR in complex outdoor environments. Denser scans or 3D mapping can be computationally heavy and energy-consuming. Moreover, LiDAR alone provides shape but not appearance; a vision system can better recognize an object (say, distinguish a person from a post), whereas a LiDAR would need to infer it from size/shape. On the other hand, a stereo camera provides both depth and colour; Ball et al. [136] exploited this by using stereo vision to detect obstacles and then expanding their detected size in software (adding a safety buffer) to define an “obstacle area” on the map. They could then plan a new path around this expanded obstacle region, demonstrating a complete vision-based detection and avoidance loop in an unstructured farm setting.
For path planning around obstacles, various strategies exist. Some systems attempt to go around the obstacle by computing a local avoidance manoeuvre, while others (especially in structured tasks like row following) might prefer to stop and wait or adjust speed if deviating from the path is undesirable. An example of the latter is transplanting or harvesting operations where the machine should stay on a fixed row; in such cases, deviating around an obstacle might damage crops in adjacent rows. Xue et al. [137] developed a speed controller that adjusts the robot’s velocity based on the motion state and risk factor of detected obstacles, effectively slowing or pausing to avoid collisions without leaving the planned path. Their controller ran at 5 Hz and could react to moving obstacles, though its prediction accuracy for obstacle motion could be improved. In contrast, when path diversion is allowed, geometric path planning algorithms can be applied: Liu et al. [138] proposed an obstacle avoidance path planner that respects the vehicle’s minimum turning radius and uses a three-segment arc to smoothly steer around an obstacle and then back to the original path. This was essentially a curvilinear detour computed in real-time. Similarly, Liu et al. [139] implemented an avoidance scheme on a tractor by formulating it as a nonlinear optimization problem (minimize deviation and turning effort subject to vehicle dynamics and obstacle constraints) and solved it within ROS; however, that system was tested mainly in simulation. Chen et al. [140] developed a Genetic Algorithm (GA)-Bézier obstacle-avoidance path planner for autonomous tractors that explicitly optimizes land utilization alongside navigation error. A third-order Bézier curve models each avoidance manoeuvre; feasible control-point ranges are derived from the global path and obstacle pose, and a genetic algorithm selects the control points under multiple constraints (collision avoidance, minimum turning radius, maximum turning angle). The resulting paths both minimize tracking error and maximize plantable area, showing generally superior performance to conventional planners in comparative tests.
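The geometric core of such curvilinear detours is simple to state: given four control points, a third-order Bézier segment that leaves the row path, clears the obstacle, and rejoins the path can be evaluated as below. Choosing the control points subject to turning-radius and collision constraints (e.g., with the genetic algorithm of [140]) is the harder, separate step; the control points shown here are purely illustrative.

```python
# Evaluation of a third-order Bezier avoidance segment from four control points.
# Selecting the control points (e.g., via a genetic algorithm subject to turning-
# radius and collision constraints, as in the cited work) is a separate step.
import numpy as np

def cubic_bezier(p0, p1, p2, p3, n=50):
    """Return n points along the Bezier curve defined by control points p0..p3."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    return ((1 - t)**3 * p0 + 3 * (1 - t)**2 * t * p1
            + 3 * (1 - t) * t**2 * p2 + t**3 * p3)

# Example: leave the row path at (0, 0), detour around an obstacle near y ~ 1 m,
# and rejoin the original path at (6, 0). Control points are illustrative only.
detour = cubic_bezier(np.array([0.0, 0.0]), np.array([2.0, 1.2]),
                      np.array([4.0, 1.2]), np.array([6.0, 0.0]))
```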
More advanced methods combine classical planning with modern techniques to handle complex scenarios. Cui et al. [141] presented a planner for farmland that layered an artificial potential field (APF) (for basic obstacle repulsion) with a particle swarm optimization algorithm (to fine-tune waypoints) and finally a model predictive control (MPC) to ensure the tractor follows the avoidance trajectory smoothly. The resultant paths were feasible and accurate, but the approach was computationally intensive, beyond the real-time capability of most onboard computers at the time. This shows a general trade-off: combining multiple sophisticated algorithms may yield better avoidance paths, but efficiency is crucial for real-time response. On the simpler end, Santos et al. [142] developed an open-source avoidance system called AgRobPP-CA which specifically addressed rough terrain in vineyards. Their method successfully kept robots clear of obstacles and dangerous slopes while being lightweight in computation, illustrating that sometimes a tailored solution for a specific domain (vineyards in this case) can outperform heavier generic algorithms.
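For context, a bare artificial potential field update is only a few lines: an attractive pull toward the goal plus repulsive pushes from nearby obstacles. The gains and influence radius below are illustrative; used alone, such a field can stall in local minima, which is one reason the cited work layers PSO refinement and MPC tracking on top of it.

```python
# Minimal artificial-potential-field step: attraction to the goal plus repulsion
# from nearby obstacles. Gains and influence radius are illustrative assumptions.
import numpy as np

def apf_step(pos, goal, obstacles, k_att=1.0, k_rep=0.6, influence=2.0, step=0.1):
    force = k_att * (goal - pos)                            # attractive component
    for obs in obstacles:
        diff = pos - obs
        d = np.linalg.norm(diff)
        if 1e-6 < d < influence:                            # repel only nearby obstacles
            force += k_rep * (1.0 / d - 1.0 / influence) / d**2 * (diff / d)
    return pos + step * force / (np.linalg.norm(force) + 1e-6)
```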
A pattern in the aforementioned research is that many rely on pre-defined models or rules, whether it’s a geometric pattern for path planning or a predefined obstacle class for detection. These classical or model-based methods are effective within their design assumptions, but performance can degrade if reality deviates from those assumptions. For example, a planner assuming all obstacles are static and convex might fail if confronted with a moving person or a long irregular shape. To enhance adaptability, recent works are exploring learning-based and AI-driven obstacle avoidance. Wang et al. [143] introduced a machine-learning strategy where different avoidance behaviours (sub-tasks like “steer left”, “slow down”, etc.) are assigned weights depending on sensor inputs. Essentially, the robot learns a rule base: given the current environment perception, compute a weighted combination of possible actions. This is a form of behaviour arbitration learned from data. Reinforcement learning (RL) is another promising avenue: an autonomous tractor can be trained in simulation to negotiate obstacles by trial-and-error, learning an optimal policy for navigation. In fact, deep reinforcement learning is identified as a beneficial approach for navigation and obstacle avoidance tasks in agriculture, as it allows the system to learn from interactions rather than require explicit programming of all scenarios. Some studies (e.g., a 2022 work by Yang et al. [144]) apply RL combined with classical path planners to handle dynamic environments with UAVs or UGVs in farms. However, a challenge for learning-based methods is the need for extensive and representative training data. As noted, many approaches so far need prior information, whether a large, labelled dataset for a vision model or many training episodes for an RL agent, which can be costly to acquire in agricultural domains. Moreover, even with data, models trained on known obstacle classes might not recognize novel obstacles that were absent in the training set (e.g., the first time a robot encounters a scarecrow or an irrigation hose on the ground). To address this, researchers have proposed methods like zero-shot object detection that leverage auxiliary information (object descriptions or attributes) to detect objects from unseen classes. Bansal et al. [145] and Zhu et al. [146] applied zero-shot detection techniques which could, in principle, allow a robot to flag unknown obstacles (never seen in training) by generalizing from other knowledge. In an agricultural context, this could mean the robot recognizing “obstacleness” of an object even if it’s not one of the predefined categories, a key step toward truly autonomous systems that encounter the unexpected.
Considering the sensors and strategies in obstacle avoidance more broadly, we can compare their characteristics as in Table 8. Different sensors can be complementary: for example, a camera might fail in heavy fog or dust, but a radar or ultrasonic sensor can still detect obstacles in those conditions. Conversely, radar might not distinguish a person from a tree, but a camera can. Many advanced robots use multi-sensor setups (sensor fusion) to combine these benefits. On the algorithm side, reactive methods (like following a potential field away from obstacles) are quick and work in unknown environments, but they can get stuck in local minima (such as oscillating in a U-shaped obstacle). Deliberative methods (like computing a new path around an obstacle cluster) ensure a solution if one exists but require a map and time to compute. In practice, agricultural robots often implement a hierarchy: a global planner plans an initial route (covering all crop rows, for instance), then a local avoidance module makes minor adjustments or stops when obstacles appear. Safety is paramount; if an unknown obstacle is detected and the system is unsure of avoidance, the robot is usually designed to stop and alert a human (fail-safe behaviour).
In field robotics, sensor fusion of the above is common to achieve a more reliable obstacle perception. For instance, some autonomous tractors use a suite of cameras and LiDARs: the cameras classify an object as a person or debris, while the LiDAR pinpoints its distance and size. The fused data yields a richer understanding than either alone. This, combined with increasingly intelligent algorithms, is pushing obstacle avoidance toward higher levels of autonomy. As mentioned, a future direction is enabling robots to handle unknown obstacles gracefully, for example, using anomaly detection to recognize when something doesn’t match any known object and still treating it as a hazard to avoid. Field trials like those by Santos et al. [142] in vineyards show that relatively simple, robust systems can handle real-world conditions (uneven ground, slopes, random obstacles) by focusing on reliability over complexity. Meanwhile, cutting-edge research on AI and deep learning is laying the groundwork for more adaptive avoidance. The convergence of these efforts will be crucial for the next generation of agricultural robots.

4.4. Sensing for Agricultural Robots in the Age of Artificial Intelligence (AI)

Current field robots remain brittle under distribution shifts, sensor noise, and sparse labels, shortcomings that limit real-world reliability. Two AI trends point to a path forward. First, generative models (e.g., diffusion) can produce diverse, label-rich crop, weed, and disease imagery to “fill in” rare conditions and seasonal variability, improving robustness of downstream perception without costly data campaigns [148]. Figure 17 shows a 3D annotated image, generated by AI, depicting a common obstacle found on the farm. Second, large language models (LLMs) can operate above perception as reasoning engines that fuse cameras, weather, and soil telemetry into task-level guidance [149]. Recent robotics work shows how LLM-centred agents query tools, reconcile conflicting sensor cues, and replan when observations change, behaviours directly applicable to farms with shifting illumination and occlusions [150,151,152]. Vision-language frameworks emerging in agriculture similarly coordinate specialized models for decision support [153]. These advances suggest a transition from passive pipelines to AI-driven sensing.

5. Control Technologies for Agricultural Robots

5.1. Control Methods for UAVs

UAV control technology enhances flight capabilities, including navigation, obstacle avoidance, and the completion of agricultural operations in complex and dynamic weather conditions and agricultural environments. Flight control helps to precisely achieve flight stability and the desired trajectory in the presence of environmental disturbances [154]. The choice of UAV control technology depends on the agricultural application, the UAV platform, environmental conditions, and the desired level of autonomy and precision [155]. Agricultural environments are typically unstructured, and UAVs are operated at low altitudes and low speeds. Still, in situations such as UAVs for grazing, there is a need for increased speed and manoeuvrability, which increases the complexity of control [156]. UAV control technology can be broadly categorized into linear control, nonlinear control, and intelligent or learning-based control [157,158]. Furthermore, swarm control is increasingly relevant in advanced UAV applications [157].
Linear control utilizes simplified, easy-to-implement linear models, hence its wide usage [154]. Examples of linear control include classical Proportional-Integral-Derivative (PID) control, the Linear Quadratic Regulator (LQR), and gain scheduling. In PID, the control action is determined according to the deviation between the desired value and the actual value. It is simple to use but requires tuning for different flight missions and environments, and it handles multiple variables with difficulty [159]. LQR is a model-based optimal control approach that minimizes a cost function penalizing both state and control effort deviations, providing robust and precise steady-state tracking. LQR performance can be affected by environmental disturbances such as windy conditions, but this can be mitigated by adding Kalman and particle filters [157]. Gain scheduling divides the operating range into smaller regions based on changes in the operational condition of the system and approximates each region using a linear model. Using this approach, different flight regimes and environmental conditions can be tuned separately.
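A textbook discrete PID loop of the kind used in UAV attitude or position control is sketched below; the gains, saturation limit, and the example roll-angle usage are illustrative assumptions rather than values from any cited platform.

```python
# Textbook discrete PID loop of the kind used for UAV attitude or position control.
# Gains and limits are illustrative and would be tuned per platform and mission.
class PID:
    def __init__(self, kp, ki, kd, out_limit=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        u = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(-self.out_limit, min(self.out_limit, u))   # saturate the command

# Example roll-angle loop (illustrative gains):
#   roll_loop = PID(kp=0.9, ki=0.05, kd=0.12)
#   cmd = roll_loop.update(roll_setpoint, roll_measured, dt=0.01)
```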
The lateral and longitudinal dynamics control of a fixed-wing agricultural UAV (Ultra-stick 25e) used for crop monitoring and spraying was analysed using two PID tuning methods, MATLAB tuning and the Ziegler–Nichols (ZN) method, to achieve autonomous and stable flight [160]. The study concluded that although the ZN method required more time to fine-tune the P, I, and D parameters, suitable parameters could be obtained; MATLAB tuning, by contrast, does not always produce workable parameters for some transfer functions, but when it succeeds it selects them automatically and quickly. A PID control was used for position loop control in [161]. The study reports that when the payload is kept relatively small, the UAV’s flight trajectory can maintain a certain level of stability, thereby ensuring effective performance during agricultural operations. Motivated by the increasing use of quadcopters in applications such as crop monitoring, Surur et al. [162] proposed a gain-scheduling fault-tolerant PID control for quadcopter UAVs. In this study, an Artificial Neural Network (ANN) served as an online gain scheduler, receiving real-time information about the detected UAV rotor fault and tracking errors and inferring optimal PID gains to be applied to the controller. It was observed that gain scheduling allowed the PID controllers to adapt their gains effectively, leading to robust and high-performance fault-tolerant control. While linear controllers are easy to design and configure, they cannot capture the time-varying dynamics that nonlinear controllers accommodate [163].
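For completeness, the classical closed-loop Ziegler–Nichols rules referenced above derive the PID gains from the ultimate gain Ku (at which the proportional-only loop oscillates steadily) and the oscillation period Tu; the example numbers below are arbitrary.

```python
# Classical closed-loop Ziegler-Nichols tuning: from the ultimate gain Ku (where
# the loop oscillates steadily under pure proportional control) and the oscillation
# period Tu, the textbook PID rules give the gains below (parallel form).
def ziegler_nichols_pid(ku, tu):
    kp = 0.6 * ku
    ti = 0.5 * tu              # integral time
    td = 0.125 * tu            # derivative time
    return kp, kp / ti, kp * td    # (Kp, Ki, Kd)

print(ziegler_nichols_pid(ku=4.0, tu=1.8))   # example ultimate gain and period
```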
A variety of nonlinear controllers have been developed and applied to UAVs to overcome some of the shortcomings of linear controllers. Additionally, nonlinear systems help to handle significant external disturbances such as wind storms where linear controls might fail [157]. Among these, feedback linearisation, backstepping, sliding mode control (SMC), and adaptive control have received the most attention [158]. Feedback linearisation transforms a nonlinear dynamic system into a linear one by redefining the system’s state variables and redesigning the control inputs [164,165]. Backstepping is another nonlinear control approach that breaks a nonlinear system down into a series of interconnected lower-order subsystems and designs a virtual control input for each subsystem to stabilize it [166,167]. This virtual control input becomes the desired state for the preceding subsystem, and a new control input is designed to drive the actual state towards this desired state. Compared to backstepping controllers, SMC controllers can be a valid alternative when robustness against uncertainty and disturbance is sought, since they are characterised by low sensitivity to external disturbances, good tracking ability, and rapid response. Adaptive controllers automatically compensate for parameter changes in the system dynamics by adjusting the controller’s characteristics, so that overall system performance is maintained at or near an optimal level.
Recently, nonlinear control strategies have become integral to agricultural UAV applications, either as standalone solutions or in combination with other advanced techniques. For instance, Bhowmick et al. [168] proposed a robust feedback linearization method within a two-loop control scheme for tri-rotor UAV swarm formation tracking. This approach demonstrated resilience to model uncertainties and aerodynamic disturbances, proving effective for multitarget surveillance in precision agriculture. Similarly, Sierra-García and Santos [169] integrated feedback linearization with a neural network to create a hybrid intelligent control system for stabilizing suspended load trajectories. Their design, comprising a position controller, attitude controller, and a neuro-estimator, enhanced robustness, minimized tracking error, and mitigated load influence on UAV dynamics. To address the vulnerability of quadrotor UAVs to external disturbances like wind and payload variation—common in aerial agricultural photography—Sun et al. [170] developed a backstepping controller with a high-order sliding mode-assisted disturbance observer. This method accurately estimated disturbances with low energy consumption and no observable chattering. Further advancements include augmented backstepping with metaheuristic optimization for trajectory planning, improving load convergence to desired paths [171]. In another study, Ijaz et al. [172] introduced a hybrid controller combining higher-order integral and fast terminal sliding modes with an adaptive law to ensure UAV stability during spraying tasks. The octocopter system used in the study and validated through HIL simulations using a Pixhawk Orange Cube, demonstrated robust performance under varying payloads and disturbances, highlighting its real-world applicability. A summary of other control technologies in agro-UAV is presented in Table 9.
From the literature, it is observed that most research in agricultural UAV control targets multirotor drones and spraying operations because multirotor platforms account for an overwhelming majority of drones used in agricultural research. One recent survey found that 93.1% of agricultural UAVs are multirotor [173]. The preference is due to their ability to take off and land vertically, hover for precision work, and manoeuvre across irregular fields; capabilities essential for spraying, which demands highly controlled delivery due to the variable payload and strict requirements for even, targeted application. Additionally, drone-based spraying is seen as particularly challenging because changes in liquid payload during operation alter flight dynamics, requiring robust control algorithms and stability systems; this has made spraying both a practical necessity and a technical benchmark in the literature.
Table 9. Control methods used in different agricultural UAVs.
Control | UAV Type | Application | Citation
Hybrid (PID + PWM) | Quadrotor | Spraying | [174]
LQR | Quadrotor | Spraying | [175]
Feedback linearization | Quadrotor | Predefined trajectory following | [176]
Feedback linearization | Quadrotor | Swarm UAV formation | [177]
Feedback linearization | Quadrotor | Swarm UAV formation | [178]
Backstepping | Quadrotor | Visual servoing | [179]
Hybrid (Adaptive backstepping + Sliding mode) | Multirotor | Spraying | [180]

5.2. Control Methods for UGVs

Control technologies in agricultural UGVs are fundamental to the autonomous navigation and operation of robots in diverse and often unstructured agricultural environments. Agricultural UGV control ranges from classic control to more advanced methods such as adaptive control, robust control, and intelligent control [181]. PID controllers fall under the category of classic control and are widely used due to their simplicity and robustness [182,183]. They adjust control outputs based on proportional, integral, and derivative terms of the error. While PID controllers are simple and have broad applicability, they present a challenge during parameter tuning, especially when dealing with external disturbances and varying working conditions [184]. Studies have developed multi-loop adaptive PID control systems to enhance stability on different road surfaces [182]. Fuzzy Logic Control (FLC) diverges from traditional approaches such as PID by not relying on precise mathematical models, instead using human expertise to formulate control rules. This enables better handling of imprecise information and uncertain environments where mathematical descriptions are difficult. FLC has been combined with PID controllers, and optimization algorithms like Genetic Algorithms (GA) or Particle Swarm Optimization (PSO) are used to tune FLC parameters, enhancing performance and adaptability to varying speeds and road conditions [185].
Model Predictive Control (MPC) is a class of optimal and predictive control methods that predicts future system states and solves an optimization problem over a finite time horizon to generate an optimal control sequence. A key advantage of MPC is its ability to handle constraints on both control inputs and system states. However, its practical application is often challenged by the risk of solutions falling into local minima and its high computational complexity, which can hinder real-time performance. To address these challenges, several studies have explored the use of nonlinear MPC in combination with advanced optimization techniques. For example, Utstumo et al. [186] implemented a nonlinear MPC using a direct multiple shooting method and a Gauss-Newton quadratic objective function for a drop-on-demand herbicide spraying robot. The objective of control was to keep the vision system aligned with seed rows while constraining the wheel motion to predefined tracks. Their results showed improved performance compared to a conventional PD controller, and they suggested potential for real-time implementation, though this was not rigorously evaluated. In a more recent study, Soitinaho and Oksanen [187] applied nonlinear MPC for path tracking and obstacle avoidance in autonomous tractors. Through simulation and field experiments, their approach demonstrated effective real-time path following and obstacle avoidance capabilities.
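A toy receding-horizon example conveys the idea: roll a kinematic bicycle model forward over a short horizon, score the predicted trajectory against the reference, and optimize the steering sequence subject to bounds, applying only the first command. The model, cost weights, and the use of a generic optimizer here are illustrative simplifications of the nonlinear MPC formulations in the cited studies.

```python
# Toy receding-horizon steering optimization for a kinematic bicycle model tracking
# the line y = 0. Weights, horizon, and vehicle parameters are illustrative.
import numpy as np
from scipy.optimize import minimize

DT, L, V, H = 0.1, 1.5, 1.0, 10        # time step [s], wheelbase [m], speed [m/s], horizon

def rollout(state, steer_seq):
    """Propagate the kinematic bicycle model over the horizon."""
    x, y, yaw = state
    traj = []
    for delta in steer_seq:
        x += V * np.cos(yaw) * DT
        y += V * np.sin(yaw) * DT
        yaw += V / L * np.tan(delta) * DT
        traj.append((x, y, yaw))
    return np.array(traj)

def cost(steer_seq, state):
    traj = rollout(state, steer_seq)
    # Penalise lateral error, heading error, and steering effort.
    return (np.sum(traj[:, 1] ** 2) + 0.1 * np.sum(traj[:, 2] ** 2)
            + 0.01 * np.sum(np.square(steer_seq)))

state0 = np.array([0.0, 0.5, 0.2])     # 0.5 m off the row with a 0.2 rad heading error
res = minimize(cost, np.zeros(H), args=(state0,), bounds=[(-0.5, 0.5)] * H)
print("first optimal steering command [rad]:", res.x[0])   # only this is applied
```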
Geometric path tracking algorithms, such as the Pure Pursuit Algorithm (PPA) and the Stanley controller, constitute a class of control methods for autonomous navigation in agricultural robotics. PPA guides a robot along a planned trajectory by tracking a preview point ahead on the path, simplifying the geometric relationship between the robot’s current position and the target point on the path [188]. While it offers low computational complexity, PPA suffers from limited robustness in preview point selection and difficulty in adaptively determining the look-ahead distance, particularly under varying speed conditions. To overcome these limitations, several improvements have been proposed in the literature. An enhanced pure pursuit model for agricultural machinery was developed to dynamically adjust speed and look-ahead distance based on real-time tracking deviation [189]. The model was optimized using an Improved Sparrow Search Algorithm (ISSA), which improves convergence speed and mitigates the risk of local minima through chaotic initialization, adaptive discoverer ratios, and dynamic inertia weights. Experimental results showed that the ISSA-PP algorithm outperformed conventional methods in both tracking accuracy and stability. However, its performance under complex field conditions and with larger agricultural machinery remains untested, indicating the need for further validation and the integration of multi-sensor systems. In a related study, a path tracking method combining the pure pursuit model with an improved ant colony optimization strategy was proposed to balance the influence of the path-following behaviour and the heading error rate [189]. Experimental results demonstrated that the method significantly reduced tracking error and enhanced accuracy and control stability under various conditions.
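The core pure-pursuit steering law for an Ackermann (bicycle-model) robot is shown below; selecting the look-ahead point and adapting the look-ahead distance, which is where the cited improvements concentrate, is omitted, and the wheelbase value is illustrative.

```python
# Standard pure-pursuit steering law for an Ackermann/bicycle-model robot. The
# look-ahead point selection and adaptive look-ahead scheduling of the cited
# works are omitted; the wheelbase value is illustrative.
import math

def pure_pursuit_steer(pose, target, wheelbase=1.2):
    """pose = (x, y, yaw); target = look-ahead point (x, y) on the planned path."""
    x, y, yaw = pose
    dx, dy = target[0] - x, target[1] - y
    alpha = math.atan2(dy, dx) - yaw                 # angle to the look-ahead point
    ld = math.hypot(dx, dy)                          # look-ahead distance
    return math.atan2(2.0 * wheelbase * math.sin(alpha), ld)   # steering angle [rad]
```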
The Stanley model (SM), another geometric path tracking approach, computes steering angle commands directly from the robot’s lateral deviation from the desired path [190]. Unlike PPA, it does not require selecting an optimal look-ahead distance, which makes it easier to implement [191]. To adapt to varying robot speeds and road conditions, an improved fuzzy Stanley model incorporating PSO was proposed to adaptively adjust the control gain based on tracking error, vehicle velocity, and steering actuator saturation [192]. When deployed on a wheeled combine harvester, the proposed method outperformed both the conventional Stanley model and the fuzzy Stanley model, achieving an acceptable maximum lateral tracking error of 0.63 m; however, its applicability to other autonomous platforms was not evaluated. To address this, a dynamic path search algorithm was integrated with the fuzzy Stanley model on a self-driving tractor performing autonomous tillage at field scale [193], reducing tracking error by 20.6% compared with the Stanley and fuzzy Stanley approaches over whole-field tracking.
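For comparison, a basic Stanley steering law combines the heading error with a cross-track term scaled by vehicle speed; the gain k and saturation limit are fixed here for illustration, whereas the fuzzy Stanley variants cited above adapt the gain online.

import math

def stanley_steering(heading_error, cross_track_error, speed, k=0.8, eps=1e-3, delta_max=0.5):
    """delta = heading error + arctan(k * e / (v + eps)), saturated to an assumed actuator limit."""
    delta = heading_error + math.atan2(k * cross_track_error, speed + eps)
    return max(-delta_max, min(delta_max, delta))

delta = stanley_steering(heading_error=0.05, cross_track_error=0.2, speed=1.2)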
Learning-based control methods, such as adaptive and reinforcement learning, are increasingly used in agricultural robotics due to their ability to manage environmental variability, nonlinear system behaviour, and external disturbances without relying on precise mathematical models or manual tuning. These approaches enable real-time learning of optimal control policies and adaptive response to dynamic field conditions, with demonstrated success in path tracking and obstacle avoidance tasks.
An advanced learning-based control framework was proposed using Adaptive Dynamic Programming (ADP) combined with a critic neural network to address challenges such as non-holonomic constraints and external disturbances like wheel slippage, which often cause path deviation [184]. The critic neural network approximates the Hamilton–Jacobi–Isaacs (HJI) equation, and its weights are updated online using an adaptive law, allowing the controller to continuously learn and adapt to changing dynamics. Simulation results confirmed the framework’s ability to maintain stable path tracking even under wheel slip, although its computational complexity limits real-time applicability. A Double Deep Q-Network (Double DQN) algorithm was applied to the navigation and path-tracking control of an orchard traction spraying robot, using a virtual radar model to estimate the robot’s position relative to the planned path. Simulation and field tests along a U-shaped path demonstrated an average lateral deviation of less than 0.08 m, representing at least a 5% improvement compared to the baseline radar model [155]. To address the limitations of the conventional Dynamic Window Approach (DWA) in dynamic environments, a Twin Delayed Deep Deterministic Policy Gradient (TD3) algorithm was introduced to optimize the weight determination process for obstacle avoidance in Ackermann-steering agricultural robots [194]. The proposed method achieved a 90% obstacle avoidance success rate, demonstrating enhanced performance in complex and changing field conditions. Other control methods used in agricultural UGVs are presented in Table 10.
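As a brief, hedged illustration of the Double DQN update mentioned above, the snippet below computes the bootstrapped target by selecting the next action with the online network and evaluating it with the target network; the array shapes and discount factor are assumptions, and the full agent (networks, replay buffer, virtual radar state) is omitted.

import numpy as np

def double_dqn_targets(rewards, q_next_online, q_next_target, dones, gamma=0.99):
    """Double DQN target: r + gamma * Q_target(s', argmax_a Q_online(s', a)) for non-terminal steps."""
    best_actions = np.argmax(q_next_online, axis=1)                   # action selection (online net)
    q_eval = q_next_target[np.arange(len(rewards)), best_actions]     # action evaluation (target net)
    return rewards + gamma * q_eval * (1.0 - dones)

targets = double_dqn_targets(
    rewards=np.array([0.1, -0.2]),
    q_next_online=np.array([[0.3, 0.5], [0.2, 0.1]]),
    q_next_target=np.array([[0.25, 0.45], [0.15, 0.05]]),
    dones=np.array([0.0, 1.0]),
)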
Unmanned ground vehicles (UGVs) in agriculture face significant challenges due to the complex, dynamic, and unstructured nature of agricultural environments. These include variations in terrain, weather, vegetation, and soil conditions, which can cause issues such as inaccurate sensing, wheel slippage, and unstable control [188]. Achieving high precision, robustness, and real-time performance is difficult, particularly for advanced control algorithms, owing to their computational complexity [184,187]. Future trends focus on enhancing adaptability, intelligence, and efficiency through advanced technologies, including the integration of artificial intelligence (AI), deep learning (DL), and reinforcement learning (RL) for tasks such as dynamic control parameter tuning, precise object detection, and autonomous navigation in complex conditions.

5.3. Control Methods for USVs

Control technology for USVs helps to achieve automation via automatic navigation, path planning, and path following functions [210]. The automatic navigation system typically integrates navigation sensors to gather environmental and vehicle status information, a path planning system to devise optimal routes, and a path following system to execute these routes by controlling the USV’s direction and speed. USV control presents unique challenges compared to Unmanned Ground Vehicles (UGVs) or Unmanned Aerial Vehicles (UAVs) due to the complex water environment, including disturbances from wind, waves, and currents, as well as the vehicle’s inherent inertia and response times [210].
Path following control mechanisms are important for maintaining the USV’s planned trajectory. While the vehicle’s motion can be analysed as a 6-degree-of-freedom (6-DOF) rigid body motion, simplified models often focus on horizontal movements. Traditional PID control is used for heading and line tracking but faces challenges with tuning and external disturbances, leading to enhancements such as cascade control and the Elitism Estimation of Distribution Algorithm (EEDA) for parameter self-tuning [211]. Feedforward control helps compensate for environmental disturbances, such as wind, which significantly impact lighter USVs. The Line of Sight (LOS) control algorithm is a mature and robust method for path following, effective even with uncertain model parameters [212]. Intelligent control methods, including neural networks (e.g., Radial Basis Function networks) and reinforcement learning techniques (e.g., Q-learning, Deep Deterministic Policy Gradient), are increasingly applied for their self-learning capabilities in path tracking [213,214]. Furthermore, fuzzy logic controllers (FLCs) are utilized for their ability to manage complex systems and uncertainties without requiring precise mathematical models [215]. Other control methods are summarized in Table 11. Key challenges and future trends in USV control technology include improving environmental awareness in dynamic water conditions (fog, intense light, rain) through advanced fusion algorithms; enhancing stability and anti-interference capabilities; developing robust sensors and end-effectors for specialized applications such as agriculture; ensuring stable and efficient wireless communication; and fostering multi-USV cooperative operations for greater efficiency and broader system-level applications.
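To illustrate the LOS guidance principle, a minimal sketch for straight-line path following is given below; the look-ahead distance is an assumed tuning parameter, and practical USV controllers add integral action and disturbance compensation for wind, waves, and currents.

import math

def los_desired_heading(pos, wp_from, wp_to, lookahead=5.0):
    """Desired heading that steers the USV back onto the segment from wp_from to wp_to."""
    path_angle = math.atan2(wp_to[1] - wp_from[1], wp_to[0] - wp_from[0])
    dx, dy = pos[0] - wp_from[0], pos[1] - wp_from[1]
    cross_track = -dx * math.sin(path_angle) + dy * math.cos(path_angle)   # signed lateral error
    return path_angle + math.atan2(-cross_track, lookahead)

psi_desired = los_desired_heading(pos=(3.0, 1.5), wp_from=(0.0, 0.0), wp_to=(20.0, 0.0))

The resulting desired heading is then passed to a lower-level heading controller (PID, fuzzy, or learning-based, as discussed above).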

5.4. Control Methods for Robotic Arms and End-Effectors

The control methods for agricultural robotic arms and end-effectors prioritize safety, precision, and adaptability due to the unstructured nature of farm environments and the delicate nature of crops [221,222]. High-level control is often managed by sophisticated frameworks like Hierarchical Quadratic Programming (HQP), which handles kinematically redundant systems by structuring tasks into priority levels [223]. The highest priority is consistently assigned to safety tasks, which are formulated using Control Barrier Functions (CBFs) to enforce strict constraints, such as joint position limits, velocity limits, and self-collision avoidance [223]. This architecture also facilitates crucial interaction paradigms: by embedding admittance control within the HQP framework and coordinating movement via a Finite-State Machine (FSM), the system can operate in a semi-autonomous “hand-guiding” mode. This allows human operators to physically guide the end-effector when perception uncertainty is high, ensuring that essential safety constraints remain active during physical human–robot interaction.
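As a simplified, hedged illustration of how a control barrier function can enforce a joint position limit, the sketch below clamps a desired joint velocity so that the barriers h(q) ≥ 0 remain nonnegative; this shows only the constraint form, not the full hierarchical QP formulation used in the cited framework.

def cbf_safe_joint_velocity(q, q_dot_desired, q_min, q_max, alpha=2.0):
    """Clamp a desired joint velocity so the joint-limit barriers stay nonnegative.

    Upper-limit barrier h_u = q_max - q requires q_dot <=  alpha * h_u;
    lower-limit barrier h_l = q - q_min requires q_dot >= -alpha * h_l.
    """
    upper = alpha * (q_max - q)
    lower = -alpha * (q - q_min)
    return max(lower, min(upper, q_dot_desired))

# Hypothetical usage: a joint approaching its +90 deg limit has its commanded velocity reduced
safe_qdot = cbf_safe_joint_velocity(q=1.4, q_dot_desired=0.8, q_min=-1.57, q_max=1.57)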
Achieving precise manipulation relies on integrating advanced planning and execution algorithms [224,225]. In Choi et al. [222], the perception system first estimates the 6D pose of the target fruit, which is fed into Inverse Kinematics (IK) calculations to determine the required joint angles. For collision-free and efficient movement in cluttered environments, advanced path planning algorithms are utilized, such as the Dynamic Temperature Simplified Transition Adaptive Cost Bi-Directional Transition-Based Rapidly Exploring Random Tree (DSA-BiTRRT), which enhances performance by optimizing paths and avoiding obstacles [226]. During execution, systems require robust controllers: Sliding Mode Control (SMC) is proven effective for end-effector position control due to its robustness against nonlinearities, disturbances, and unmodeled dynamics, providing faster convergence and improved tracking performance [227]. For tasks like sweet potato transplanting, Linear Model Predictive Control (LMPC) is used to solve trajectory control problems by compensating for system delay through repeated optimization and feedback, often implementing segmented control of the reference trajectory to improve real-time operation [228]. Furthermore, advanced learning techniques like Model-Based Reinforcement Learning (MBRL), particularly utilizing Gaussian Process Regression (GPR) for environment modeling, have achieved high success rates (at least 95% in numerical studies) in optimizing movement control for tasks with targets in random positions [229].
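As a toy example of the inverse-kinematics step that converts a target pose into joint angles, the closed-form solution for a planar two-link arm is sketched below; actual harvesting manipulators have six or more joints and typically rely on numerical IK, so this serves only to illustrate the mapping.

import math

def two_link_ik(x, y, l1=0.4, l2=0.3):
    """Closed-form IK for a planar 2-link arm reaching (x, y); returns one of the two elbow solutions."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(c2) > 1.0:
        raise ValueError("target out of reach")
    q2 = math.acos(c2)                                                              # elbow angle
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2), l1 + l2 * math.cos(q2))   # shoulder angle
    return q1, q2

q_shoulder, q_elbow = two_link_ik(0.5, 0.2)   # hypothetical target location in the arm plane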
End-effector control systems are designed primarily to ensure damage-free grasping and efficient detaching mechanisms [230,231]. Damage prevention relies heavily on precise force control, often achieved through real-time sensory feedback. This includes integrating specialized sensors, such as a super-hydrophobic tactile sensor capable of detecting both pressure and slip, coupled with microcontroller-based systems to precisely regulate grip strength [231]. Conversely, methods also exist to reduce complexity while maintaining non-destructive handling: a simulated “soft” grasper utilizes a motion control scheme in which grasping stops on a simple binary feedback signal triggered upon object contact, eliminating the need for force or angle sensors entirely [221]. End-effectors exhibit high adaptability through specific mechanism designs, such as rope-driven bionic fingers that conform to different fruit sizes and shapes while utilizing self-developed posture systems for real-time monitoring. Specialized detaching mechanisms, such as those used in citrus harvesting, employ an integrated control system to command an adaptive cutting part that includes a 2-DOF passive joint and a Y-shaped guide to align the cut closely to the fruit surface, minimizing the remaining peduncle length [222]. Other specialized solutions include robotic arms utilizing compliant materials like Straight-Fiber-Type Pneumatic Artificial Muscle (SF-PAM) combined with noncircular pulleys to mechanically compensate for inherent output limitations and increase the joint range of motion. A summary of control methods across the various control domains of robotic arms and end-effectors can be found in Table 12.
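The binary contact-feedback grasping scheme described above can be sketched as a simple loop that closes the gripper in small increments until a contact flag is raised, with no force or angle sensing; the driver callbacks, increment timing, and step limit below are hypothetical.

import time

def grasp_until_contact(read_contact_flag, close_one_step, max_steps=200, dt=0.02):
    """Close the gripper step by step until a binary contact signal stops the motion.

    read_contact_flag: hypothetical driver callback returning True once the fruit is touched.
    close_one_step: hypothetical driver callback commanding one small closing increment.
    """
    for _ in range(max_steps):
        if read_contact_flag():      # binary feedback: stop immediately on contact
            return True
        close_one_step()
        time.sleep(dt)
    return False                     # no contact detected within the available stroke

# Simulated usage: contact is detected on the sixth query
flags = iter([False] * 5 + [True])
grasped = grasp_until_contact(lambda: next(flags), lambda: None)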
The unstructured farm environment and the resulting high perception uncertainty, together with the fragility of crops and network latency, hinder real-time precision control of robotic arms and end-effectors. Future research could apply reinforcement learning (RL) to enhance system autonomy, particularly by actively determining operational thresholds such as grasping force. In addition, dual-arm cooperative control algorithms should be considered for efficiently handling clustered fruits and coordinating complex tasks. Lastly, there is a strong need to incorporate more advanced motion planners and multimodal data fusion techniques to estimate and actively avoid external obstacles in real-time environments.

6. Networking Technologies for Agricultural Robots

The smart agriculture market, driven by IoT, sensors, location tracking, robotics, and AI, is projected to grow from $14.40 billion in 2024 to $23.38 billion by 2029, highlighting the rapid pace of technological integration [237]. Central to this evolution is the need for a robust networking infrastructure to support the effective operation of agricultural robots. Data transmission requirements vary with the robot’s functions and technological complexity [238]. For example, Real-Time Kinematic (RTK) correction, which is essential for precise positioning, benefits significantly from mobile networks, as they overcome range limitations and frequency interference challenges [238]. Similarly, tasks such as video streaming and remote control require low-latency, high-reliability connectivity.
A key trend is the shift toward externalizing data processing to centralized data centres. This offers advantages like enhanced service sophistication, GPU scaling efficiency, and reduced robot costs by offloading computationally demanding components [238]. However, this shift further underscores the need for high-throughput, low-latency networking to support real-time, data-intensive operations. Often overlooked is the foundational infrastructure required for large-scale robot deployment, including its energy demands and environmental impact [238]. While the benefits of agricultural robotics are substantial, they must be weighed against these underlying requirements. Without adequate infrastructure planning, systems may face performance bottlenecks and long-term sustainability issues. Networking is thus not just a support system, but a critical enabler of modern robotic functionality, powering real-time AI-driven tasks, decision-making, and durable, cost-effective designs through computational offloading. In the next section, we analyse communication protocols from short-range wireless to wide-area cellular and emerging 6G technologies and explore innovations shaping swarm robotics.

6.1. Communication Protocols for Agricultural Robotics

The operational efficacy of agricultural robots is inherently tied to the underlying communication protocols that facilitate data exchange between robots, sensors, base stations, and centralized control systems. This section categorizes and details the diverse range of communication technologies vital for these applications, outlining their technical specifications, advantages, limitations, and representative deployments within the agricultural sector.

6.1.1. Short-to-Medium-Range Wireless Technologies

These protocols are typically employed for localized communication, often within a single field, greenhouse, or between closely operating robotic units.
ZigBee (IEEE 802.15.4)
ZigBee operates on the IEEE 802.15.4 standard [239], distinguishing itself through its exceptionally low power consumption, robust mesh networking capabilities, and support for a substantial number of nodes, potentially up to 65,000 devices. It offers data throughputs ranging from 20 to 250 kbps and maintains low latency, typically in the tens of milliseconds [240,241]. The effective communication range for ZigBee networks is generally between 10 and 100 m [240,241].
The primary advantages of ZigBee include its high energy efficiency, which translates to extended battery life for sensor nodes, and its inherent ability to form self-healing mesh networks. This mesh capability allows data to hop between nodes, extending the network’s reach and enhancing its reliability in environments with obstacles. Its support for numerous devices makes it particularly well-suited for dense sensor deployments [240,241]. However, its main limitations are its relatively low throughput and restricted range, which can be particularly challenging in dynamic topologies where the network structure changes frequently [240,241].
In agricultural robotics, ZigBee has found widespread application in greenhouse and field sensor networks. It is commonly used for monitoring and controlling critical environmental parameters such as soil moisture, temperature, and humidity, as well as for automating irrigation systems [240,241]. Its mesh capabilities and low power consumption also make it a frequent choice for communication within small teams of robots [240,241].
Bluetooth/BLE (Bluetooth Low Energy)
Bluetooth, particularly its Low Energy (BLE) [242] variant, is characterized by its shorter communication range, typically between 10 and 100 m, and moderate data throughput of approximately 1 Mb/s. The main benefit of BLE is its energy-efficient design, making it suitable for direct connections to sensors or wearable devices near robots. Its compact size and low power requirements also lend themselves to integration into small, mobile components. However, its significant limitation is its short range and moderate throughput, which restricts its utility in large-scale field robotics applications beyond simple interfacing with smart devices [240]. While its prior use in agricultural robotics has been somewhat limited to interfacing with smart devices, it holds potential for applications such as connecting on-robot sensors or tracking livestock over short distances.
Wi-Fi (IEEE 802.11 a/b/g/n/ac/ax), Especially Wi-Fi 6
Wi-Fi, especially modern versions like Wi-Fi 6 (IEEE 802.11ax) [243], offers high throughput (often exceeding 100 Mb/s) and low latency (in the millisecond range) over distances of tens to hundreds of meters [241]. Designed for high-density environments, Wi-Fi 6 supports numerous devices efficiently, making it ideal for agricultural settings such as greenhouses or confined field operations [241]. Applications include Wi-Fi-enabled moisture sensors for precision irrigation [244] and serving as a backhaul option for LoRaWAN [245] gateways using existing farm internet infrastructure [246].
Short-to-medium-range wireless technologies involve inherent trade-offs. ZigBee favours low power and mesh networking but has low throughput; Bluetooth is energy-efficient but limited to very short ranges; Wi-Fi offers high throughput and low latency but at higher power costs and shorter range. These differences mean no single protocol suits all tasks. System designers must match communication technologies to specific needs—ranging from low-data sensor networks to high-bandwidth video streaming for robot navigation—often requiring hybrid solutions.
Wi-Fi is also evolving beyond its traditional local network role. With performance increasingly comparable to 5G in certain deployments, Wi-Fi 6 supports real-time control and mesh networking for field robotics. Its integration with LoRaWAN gateways and sensor systems underscores its expanding role as a flexible, high-capacity local connectivity layer in smart agriculture.

6.1.2. Low-Power Wide-Area Networks (LPWAN)

Low-Power Wide-Area Networks (LPWAN) are a class of wireless communication technologies specifically engineered for long-range, very low-power data transmission. This design makes them particularly well-suited for connecting scattered sensor nodes across the expansive and often remote landscapes characteristic of agricultural fields.
LoRa/LoRaWAN
LoRa (Long Range) [240] is a proprietary spread spectrum modulation technique, and LoRaWAN is the open standard network protocol built on top of it. Practical deployments have demonstrated their effectiveness in remote soil moisture monitoring, automated irrigation control, and livestock tracking [245]. Case studies, such as those conducted in commercial apple orchards and the Purdue ACRE Farm, illustrate its deployment using both Wi-Fi and cellular-connected gateways to ensure comprehensive coverage [246]. LoRaWAN-based IoT networks are also instrumental in smart greenhouses, facilitating efficient transmission of environmental data, including temperature, humidity, light intensity, soil moisture, and carbon dioxide levels [245].
NB-IoT, Sigfox, LTE-M, RPMA, WavIoT
These LPWAN technologies offer diverse features suited for different agricultural applications, each balancing trade-offs in range, data rate, power, and cost.
  • NB-IoT (Narrowband IoT) [247]: Operating on licensed cellular spectrum, NB-IoT offers long-range (up to 10 km rural), ultra-low-power communication with downlink speeds up to 200 kbps and uplink around 10 kbps [240]. It supports high scalability (over 100,000 devices per cell) and provides reliable QoS (Quality of Service). However, it has lower interference immunity than LoRaWAN or Sigfox, relies on existing LTE infrastructure (limiting rural deployment), and entails higher deployment and device costs. NB-IoT is used for livestock tracking and remote sensing in areas with cellular coverage.
  • Sigfox [248]: Using unlicensed sub-GHz ISM bands and Ultra Narrow Band (UNB) modulation, Sigfox achieves very long range (up to 20 km) and ultra-low power consumption [240]. Its limitations include very low throughput (10–50 kbps), strict message limits (140 uplink, 4 downlink/day), and small payload sizes. While unsuitable for high-data tasks, it excels in low-power, infrequent sensing applications such as soil monitoring.
  • LTE-M [249]: Built on existing 4G/LTE infrastructure, LTE-M provides better latency (10 ms) and higher throughput than other LPWANs. However, it consumes more energy per message and is less effective over long distances or through obstacles [240]. It is better suited for urban or peri-urban agricultural use, but insufficient for high-throughput robotics or remote deployments.
  • RPMA and WavIoT: RPMA (2.4 GHz) [250] and WavIoT (868 MHz) [251] suffer from higher path loss, at least 9 dB more than Sigfox and LoRaWAN, making them less suitable for rural or obstructed environments [240]. RPMA is less energy-efficient, while WavIoT offers battery life comparable to LoRaWAN and Sigfox. These are generally less favoured for wide-area agricultural deployments.
LPWANs present a trade-off among range, data rate, and power. For instance, LoRaWAN and Sigfox prioritize range and energy efficiency at the cost of throughput, while NB-IoT delivers higher data rates and QoS with greater power and cost demands. As such, the choice of LPWAN depends heavily on application needs. Soil monitoring may require low data and long battery life, while mobile robots may demand moderate data rates and lower latency, necessitating a hybrid, multi-protocol approach.

6.1.3. Cellular Networks (4G, 5G, and Emerging 6G)

Cellular networks offer wide-area coverage and higher data rates compared to many LPWAN solutions, making them suitable for more demanding agricultural robotics applications that require consistent connectivity over large areas.
4G/LTE-Advanced
4G/LTE-Advanced [252] provides wide coverage, moderate throughput (around 100 Mb/s), and latency typically in the range of 10 ms. It has served as a foundational technology for initial teleoperation and data offload in agricultural robotics. While 4G/LTE-Advanced offers broad coverage, its moderate throughput and latency are often insufficient for the most demanding, high-throughput, and ultra-low-latency tasks required by advanced agricultural robotics, such as real-time control of autonomous vehicles or high-definition video streaming for precision tasks [240,241]. It remains a baseline for general connectivity but is increasingly being superseded by newer technologies for critical robotic functions.
5G (Including URLLC, eMBB, mMTC)
5G technology, encompassing its key capabilities such as Ultra-Reliable Low-Latency Communication (URLLC) [253], enhanced Mobile Broadband (eMBB) [254], and massive Machine-Type Communications (mMTC) [255], represents a significant leap forward for agricultural robotics. It offers high throughput (1+ Gbps) and very low latency (<10 ms) [240,241].
5G has been rigorously tested in field robotics, demonstrating substantial performance improvements over 4G and achieving latency comparable to Wi-Fi 6, with only an approximately 18 ms difference. This makes 5G highly reliable for real-time control applications and efficient video offloading from agricultural robots. Private 5G networks, in particular, offer enhanced security and greater control over network equipment and data. Projects such as the collaboration between Freshwave and the National Robotarium in Edinburgh are deploying portable 5G private networks to enhance agritech innovation, providing reliable high-speed internet in rural and remote areas where traditional broadband options are limited [256]. Such networks allow real-time data collection and analysis, enabling faster decision-making in precision agriculture, such as crop monitoring and equipment adjustments [256].
The critical role of 5G as the backbone for advanced agricultural robotics is becoming increasingly evident. Its capabilities in providing real-time control, efficient video offload, and seamless multi-robot coordination, particularly through private network deployments, are transforming the operational landscape of smart farming. This level of connectivity enables complex tasks, such as precise fertilizer application based on drone imagery and autonomous mechanical weed control, where robots are in constant communication with local servers [257]. The ability to centralize data processing in data centres, offloading power-hungry GPUs from robots, further underscores 5G’s enabling role in creating more cost-effective and robust robotic systems [238]. Figure 18 shows a diagram of a deployed 5G private network.
6G/Beyond-5G
The need for rural connectivity is a significant challenge for cellular networks in agriculture, and the development and adoption of private 5G networks become crucial in these contexts, providing dedicated, high-performance connectivity where public cellular infrastructure may be inadequate [257]. Furthermore, the integration of satellite-supported networks and emerging 6G technologies is vital for achieving comprehensive global coverage, ensuring that even the most remote farms can benefit from advanced robotic systems [258]. These advances are envisioned to support highly remote and flexible operations in Agriculture 4.0/5.0, facilitating even more sophisticated and widespread deployment of agricultural robots. This highlights the importance of a multifaceted approach to connectivity, combining terrestrial and non-terrestrial solutions, in bridging the digital divide in agriculture.

6.1.4. Comparative Analysis of Communication Protocols

The selection of communication protocols for agricultural robots involves a complex interplay of technical specifications, operational requirements, and environmental considerations. As illustrated in Table 13, each technology offers a unique balance of range, throughput, latency, and power consumption, making it suitable for specific agricultural tasks.
Short-range technologies like ZigBee and BLE are well-suited for dense sensor networks and localized interactions, thanks to their low power consumption and mesh networking capabilities. However, their limited range and throughput make them less effective for high-data or dynamic applications. In contrast, Wi-Fi 6 offers much higher throughput and lower latency, making it ideal for local robot control and edge video processing, especially in settings like greenhouses. Its advancing capabilities position it as a viable alternative to cellular networks in localized environments.
For broader coverage, LPWAN technologies such as LoRaWAN, NB-IoT, and Sigfox enable long-range, low-power communication, perfect for connecting dispersed sensors across large fields. LoRaWAN excels in scalability and energy efficiency but has limited data rates and higher latency. NB-IoT offers better quality of service and scalability, while Sigfox provides ultra-low power use at the cost of strict message limits. The choice among them depends on specific requirements for data rate, latency, cost, and spectrum access.
Cellular networks, particularly 5G, are becoming essential for real-time control, video streaming, and coordinated multi-robot systems across wide areas. Private 5G networks further enhance connectivity in rural agricultural settings by offering dedicated, secure, and reliable communication. While 4G provides baseline coverage, it often falls short for advanced robotics, and emerging 6G technologies promise even greater capabilities, including global coverage and AI-enabled infrastructure.
Given the diverse requirements of agricultural robotics, no single communication technology is sufficient; instead, a layered or hybrid networking approach is often most effective, leveraging LPWANs for low-bandwidth sensor data, Wi-Fi for local robot interactions, and 5G for high-level coordination and cloud integration. This combination ensures robust, flexible, and scalable communication across various agricultural scenarios.
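To illustrate the layered selection logic described above, the sketch below assigns a technology to each communication link from its required range, throughput, and latency; the nominal figures are simplified assumptions distilled from the comparison in this section, not definitive specifications.

# Nominal (range_km, max_throughput_mbps, best_latency_ms) each technology can roughly serve
CANDIDATES = [
    ("ZigBee",   0.1,    0.25,   50),
    ("BLE",      0.1,    1.0,    50),
    ("Wi-Fi 6",  0.3,  100.0,     5),
    ("LoRaWAN", 10.0,    0.05, 1000),
    ("NB-IoT",  10.0,    0.2,  1000),
    ("5G",      10.0, 1000.0,    10),
]

def pick_protocol(range_km, throughput_mbps, latency_ms):
    """Return the first technology whose nominal capabilities cover the link requirements."""
    for name, max_range, max_rate, best_latency in CANDIDATES:
        if range_km <= max_range and throughput_mbps <= max_rate and latency_ms >= best_latency:
            return name
    return "no single protocol fits; consider a hybrid design"

print(pick_protocol(range_km=5.0, throughput_mbps=0.01, latency_ms=2000))   # dispersed soil sensor -> LoRaWAN
print(pick_protocol(range_km=5.0, throughput_mbps=50.0, latency_ms=20))     # robot video backhaul -> 5G

In practice, as noted above, a real deployment combines several of these layers rather than relying on a single technology.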

6.2. Swarm Robotics Networking

Swarm robotics in agriculture involves the coordinated operation of multiple, often simple, robotic units to achieve complex tasks collectively. This paradigm necessitates ad hoc and decentralized networking architectures to facilitate distributed coordination, efficient sensing data sharing, and collective control among the robots.

6.2.1. Ad-Hoc/MANET/FANET Approaches

Ad-hoc networking models are fundamental to swarm robotics, as they enable communication without reliance on a fixed infrastructure, which is highly advantageous in dynamic and unpredictable agricultural environments.
Mobile Ad Hoc Networks (MANET) and Flying Ad Hoc Networks (FANET)
Mobile Ad Hoc Networks (MANETs) and Flying Ad Hoc Networks (FANETs), as shown in Figure 19, provide infrastructure-less, peer-to-peer communication, making them exceptionally well-suited for multi-robot deployments [240,241]. MANETs are applicable for ground-based agricultural robots, allowing them to communicate directly with each other as they move across fields. FANETs, on the other hand, are designed for aerial Unmanned Aerial Vehicles (UAVs) operating in swarms, enabling them to form dynamic airborne communication networks. These approaches are crucial for maintaining connectivity in environments where fixed infrastructure is impractical or absent.
IEEE 802.15.4/ZigBee in Swarms
IEEE 802.15.4, often implemented through ZigBee, is frequently employed in small robot teams due to its inherent mesh capability and low power consumption. Its ability to form a mesh network allows robots to relay messages to each other, extending the effective communication range within the swarm. However, ZigBee’s utility in larger or more dynamic swarm contexts is limited by its relatively low throughput and restricted range, which can become significant bottlenecks as the number of robots increases or their topology changes rapidly [240].
LoRaWAN in Swarm Contexts
LoRaWAN is sometimes considered as an infrastructure-based LPWAN alternative for swarm communication because of its long-range capability. While this long range is appealing for covering vast agricultural areas, LoRaWAN’s inherent limitations in scalability and latency pose significant challenges for real-time swarm control applications [240]. Its high latency, in particular, can hinder the rapid, synchronized decision-making required for effective collective behaviour in dynamic swarm operations.
The autonomy-communication nexus in swarms is a critical design consideration. The decentralized control and self-organization inherent in swarm robotics necessitate robust, infrastructure-less communication, because individual robots must share information, coordinate actions, and adapt to environmental changes without a central entity. Maintaining stable communication links among numerous mobile robots in dynamic topologies, especially in challenging agricultural terrains with obstacles and varying line-of-sight conditions, presents a significant engineering challenge [259]. The effectiveness of the swarm depends directly on the reliability and efficiency of its internal communication, as communication failures can lead to uncoordinated actions, redundant efforts, or even mission failure.

6.2.2. Hybrid and Multi-Layer Networking Designs for Swarms

Given the diverse and often conflicting communication requirements within agricultural robot swarms, hybrid and multi-layer networking designs are becoming the standard practice. These architectures combine different communication technologies to leverage their respective strengths across various operational layers.
Many contemporary agricultural robot systems employ a multi-layer networking approach. This typically involves using low-power LPWAN technologies, such as LoRa or NB-IoT, for connecting sparse, static sensor nodes across vast fields, where long range and minimal power consumption are paramount [240]. For local robot-to-robot links and immediate coordination within a smaller operational area, Wi-Fi mesh networks are often utilized due to their higher throughput and lower latency, enabling real-time data exchange and control. Finally, for backhaul communication to edge computing facilities or cloud servers, 5G cellular networks are increasingly adopted. This high-bandwidth, low-latency connection supports high-level planning, remote telemetry, and the offloading of computationally intensive tasks from individual robots.
In the context of swarm robotics, combining ad-hoc Wi-Fi mesh for local coordination with 5G/edge connectivity for high-level planning and remote telemetry is emerging as a standard practice. This hybrid approach allows swarms to maintain robust local communication for emergent behaviours while simultaneously benefiting from centralized intelligence and vast data processing capabilities in the cloud or at the edge [240,241,258].
The necessity of layered communication for scalable swarms is a clear realization in modern agricultural robotics. A single communication protocol is inherently insufficient to meet the complex and varied demands of sophisticated swarm operations. Different layers of communication are required to balance conflicting needs for range, bandwidth, latency, and power consumption. For instance, sensor data often requires long-range, low-power transmission, local robot-to-robot coordination demands low-latency and moderate bandwidth, while backhauling video or processed data to a cloud server necessitates high bandwidth and reliable wide-area connectivity. A hybrid, multi-layer approach allows designers to optimize each communication link for its specific purpose, ensuring that the swarm can operate efficiently and reliably across diverse scales and tasks. This architectural flexibility is crucial for achieving the full potential of large-scale, autonomous agricultural robot swarms.
In summary, future networking for agricultural robots will harness emerging technologies like 6G/THz communications, non-terrestrial networks, and Reconfigurable Intelligent Surfaces to deliver ultra-low latency, high throughput, and global coverage, paving the way for advanced Agriculture 5.0 operations. Research should focus on optimizing hybrid network architectures, adaptive protocols for dynamic swarms, and AI-driven autonomous network management. Additionally, addressing the energy and environmental impact of network infrastructure will be crucial for sustainable, large-scale deployments. A balanced integration of innovation, sustainability, and security will be essential to unlock the full potential of agricultural robotics.

7. Conclusions: Limitations and Future Prospects of Agricultural Robots

The agricultural sector has witnessed significant advancements in productivity and efficiency through the adoption of automation and robotics. These technologies are pivotal in addressing the pressing challenges of food production, particularly in the context of a rapidly growing global population. As the demand for food escalates, the integration of agricultural robots (AgRobots) offers a promising solution to enhance crop yields, improve the quality of both fresh and processed food, and mitigate environmental impacts associated with traditional farming practices. Farmers are adopting technology to address issues such as the global shortage of food and labour. Artificial intelligence, field sensors, and data analytics are some of the advanced systems used in this quest, with robotics being the area in which these technologies converge. Agricultural robots have proven effective in performing tasks that are slow, repetitive, and dull for farmers, allowing them to focus more on improving overall yield. For instance, robotic farming has helped prevent substantial losses due to herbicide-resistant weeds. The application of robotics, combined with new sensor and geo-mapping technologies, provides farmers with advanced data about their crops, collected autonomously by drones and ground robots. This data helps scientists understand the optimal environment to nurture the best crops, thereby progressively improving farming practices.
Despite their transformative potential, AgRobots face several technical, economic, and operational challenges that limit large-scale adoption. High initial investment costs and uncertain return on investment remain key barriers, particularly for small-scale farmers. The complexity of agricultural environments—characterized by variable soil types, crop structures, and weather conditions—further complicates robot performance and reliability. Many AgRobots struggle with terrain adaptability, limited endurance, and energy inefficiency, while sensing systems such as RGB, LiDAR, and tactile sensors remain vulnerable to dust, rain, low light, and canopy occlusion. Likewise, conventional control algorithms, including PID and geometric path tracking, require extensive tuning and often fail under dynamic field conditions. Communication constraints, including limited network coverage in rural areas and the high cost of 5G infrastructure, also restrict scalability and real-time coordination. Additionally, task-specific mechanical designs—especially in robotic arms and end-effectors—reduce adaptability across diverse crops, and the lack of technical expertise among farmers poses an operational barrier. Addressing these challenges will require the development of cost-effective, modular robotic platforms; robust multi-sensor fusion for reliable perception; intelligent control systems using adaptive and learning-based algorithms; and hybrid communication networks combining Wi-Fi, LPWAN, and private 5G for improved connectivity. Equally important are policies and training programs that build local capacity and establish clear regulatory and safety frameworks, ensuring that AgRobot technologies are accessible, reliable, and sustainable for all scales of farming.
The future of agricultural robotics is highly promising, with rapid advancements in artificial intelligence, machine learning, sensing, control, and communication technologies poised to revolutionize modern farming. Beyond enhancing productivity, AgRobots hold immense potential to mitigate environmental challenges by replacing heavy machinery with lighter, autonomous systems that reduce soil compaction, optimize fertilizer and pesticide use, and minimize water wastage. Emerging trends such as swarm robotics—where multiple robots collaborate autonomously—could further improve field efficiency, resource allocation, and task precision. The integration of 6G and satellite-assisted communication systems will enable seamless, low-latency coordination even in remote areas, while AI-driven adaptive control frameworks will strengthen reliability under dynamic field conditions. Moreover, cost-effective designs, local manufacturing, and open-source innovation can lower barriers to adoption, particularly for small-scale farmers. Complementary capacity-building programs and well-defined regulatory frameworks will be essential to ensure safe, equitable, and sustainable deployment of AgRobots, ultimately promoting a resilient and environmentally responsible agricultural future.
In conclusion, while the path toward widespread adoption of agricultural robots remains challenged by technical, economic, and regulatory constraints, ongoing research and innovation continue to bridge these gaps. Advances in intelligent sensing, hybrid robotic architectures, and adaptive control systems are paving the way for more reliable, efficient, and sustainable robotic solutions in agriculture. By addressing existing limitations and fostering collaboration among researchers, policymakers, and industry stakeholders, the agricultural sector can fully harness the transformative potential of AgRobots to enhance productivity, reduce environmental impacts, and ensure food security for future generations.

Author Contributions

Conceptualization, C.L.N. and N.W.; methodology, C.L.N., A.A., S.O.F., C.E., and P.J.; validation, C.L.N., A.A., S.O.F., C.E., and P.J.; formal analysis, C.L.N., A.A., S.O.F., C.E., and P.J.; investigation, C.L.N., A.A., S.O.F., C.E., and P.J.; writing—original draft preparation, C.L.N., A.A., S.O.F., C.E., and P.J.; writing—review and editing, J.K. and N.W.; visualization, C.L.N., A.A., S.O.F., C.E., and P.J.; supervision, N.W.; project administration, N.W.; funding acquisition, N.W. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the U.S. Department of Energy (DOE), project number 23-0179.

Data Availability Statement

No new data were created or analyzed in this study.

Acknowledgments

During the preparation of this manuscript, the authors used [ChatGPT, GPT-5] and [Gemini, 2.5 Flash] for the purpose of generating some graphics used in this manuscript. The views, findings, conclusions, or recommendations presented in this publication are solely those of the authors and do not necessarily reflect the official policies or positions of the U.S. DOE or the U.S. Government.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Amin, A.; Wang, X.; Zhang, Y.; Tianhua, L.; Chen, Y.; Zheng, J.; Shi, Y.; Abdelhamid, M.A. A Comprehensive Review of Applications of Robotics and Artificial Intelligence in Agricultural Operations. Stud. Inform. Control 2023, 32, 59–70. [Google Scholar] [CrossRef]
  2. Botta, A.; Cavallone, P.; Baglieri, L.; Colucci, G.; Tagliavini, L.; Quaglia, G. A Review of Robots, Perception, and Tasks in Precision Agriculture. Appl. Mech. 2022, 3, 830–854. [Google Scholar] [CrossRef]
  3. Liu, L.; Yang, F.; Liu, X.; Du, Y.; Li, X.; Li, G.; Chen, D.; Zhu, Z.; Song, Z. A Review of the Current Status and Common Key Technologies for Agricultural Field Robots. Comput. Electron. Agric. 2024, 227, 109630. [Google Scholar] [CrossRef]
  4. Xie, D.; Chen, L.; Liu, L.; Chen, L.; Wang, H. Actuators and Sensors for Application in Agricultural Robots: A Review. Machines 2022, 10, 913. [Google Scholar] [CrossRef]
  5. Hernández, H.A.; Mondragón, I.F.; González, S.R.; Pedraza, L.F. Reconfigurable Agricultural Robotics: Control Strategies, Communication, and Applications. Comput. Electron. Agric. 2025, 234, 110161. [Google Scholar] [CrossRef]
  6. Peng, Y.; Liu, J.; Xie, B.; Shan, H.; He, M.; Hou, G.; Jin, Y. Research Progress of Urban Dual-Arm Humanoid Grape Harvesting Robot. In Proceedings of the 2021 IEEE 11th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER), Jiaxing, China, 27–31 July 2021; pp. 879–885. [Google Scholar]
  7. Jiang, S.; Wang, S.; Yi, Z.; Zhang, M.; Lv, X. Autonomous Navigation System of Greenhouse Mobile Robot Based on 3D Lidar and 2D Lidar SLAM. Front. Plant Sci. 2022, 13, 815218. [Google Scholar] [CrossRef]
  8. Rovira-Más, F.; Saiz-Rubio, V.; Cuenca-Cuenca, A. Augmented Perception for Agricultural Robots Navigation. IEEE Sens. J. 2021, 21, 11712–11727. [Google Scholar] [CrossRef]
  9. Upadhyay, A.; Chandel, N.S.; Singh, K.P.; Chakraborty, S.K.; Nandede, B.M.; Kumar, M.; Subeesh, A.; Upendar, K.; Salem, A.; Elbeltagi, A. Deep Learning and Computer Vision in Plant Disease Detection: A Comprehensive Review of Techniques, Models, and Trends in Precision Agriculture. Artif. Intell. Rev. 2025, 58, 92. [Google Scholar] [CrossRef]
  10. Nkwocha, C.L.; Chandel, A.K. Towards an End-to-End Digital Framework for Precision Crop Disease Diagnosis and Management Based on Emerging Sensing and Computing Technologies: State over Past Decade and Prospects. Computers 2025, 14, 443. [Google Scholar] [CrossRef]
  11. Ashwini, C.; Sellam, V. EOS-3D-DCNN: Ebola Optimization Search-Based 3D-Dense Convolutional Neural Network for Corn Leaf Disease Prediction. Neural Comput. Appl. 2023, 35, 11125–11139. [Google Scholar] [CrossRef] [PubMed]
  12. Singla, A.; Nehra, A.; Joshi, K.; Kumar, A.; Tuteja, N.; Varshney, R.K.; Gill, S.S.; Gill, R. Exploration of Machine Learning Approaches for Automated Crop Disease Detection. Curr. Plant Biol. 2024, 40, 100382. [Google Scholar] [CrossRef]
  13. Das, S.; Chapman, S.; Christopher, J.; Choudhury, M.R.; Menzies, N.W.; Apan, A.; Dang, Y.P. UAV-Thermal Imaging: A Technological Breakthrough for Monitoring and Quantifying Crop Abiotic Stress to Help Sustain Productivity on Sodic Soils—A Case Review on Wheat. Remote Sens. Appl. Soc. Environ. 2021, 23, 100583. [Google Scholar] [CrossRef]
  14. Singh, A.P.; Yerudkar, A.; Mariani, V.; Iannelli, L.; Glielmo, L. A Bibliometric Review of the Use of Unmanned Aerial Vehicles in Precision Agriculture and Precision Viticulture for Sensing Applications. Remote Sens. 2022, 14, 1604. [Google Scholar] [CrossRef]
  15. Madroñal, D.; Palumbo, F.; Capotondi, A.; Marongiu, A. Unmanned Vehicles in Smart Farming: A Survey and a Glance at Future Horizons. In Proceedings of the 2021 Drone Systems Engineering and Rapid Simulation and Performance Evaluation: Methods and Tools Proceedings, Budapest, Hungary, 18–20 February 2021; Association for Computing Machinery: New York, NY, USA, 2021; pp. 1–8. [Google Scholar]
  16. Zhang, H.; Wang, L.; Tian, T.; Yin, J. A Review of Unmanned Aerial Vehicle Low-Altitude Remote Sensing (UAV-LARS) Use in Agricultural Monitoring in China. Remote Sens. 2021, 13, 1221. [Google Scholar] [CrossRef]
  17. Daponte, P.; De Vito, L.; Glielmo, L.; Iannelli, L.; Liuzza, D.; Picariello, F.; Silano, G. A Review on the Use of Drones for Precision Agriculture. IOP Conf. Ser. Earth Environ. Sci. 2019, 275, 012022. [Google Scholar] [CrossRef]
  18. Huang, H.; Yang, A.; Tang, Y.; Zhuang, J.; Hou, C.; Tan, Z.; Dananjayan, S.; He, Y.; Guo, Q.; Luo, S. Deep Color Calibration for UAV Imagery in Crop Monitoring Using Semantic Style Transfer with Local to Global Attention. Int. J. Appl. Earth Obs. Geoinf. 2021, 104, 102590. [Google Scholar] [CrossRef]
  19. Negash, L.; Kim, H.-Y.; Choi, H.-L. Emerging UAV Applications in Agriculture. In Proceedings of the 2019 7th International Conference on Robot Intelligence Technology and Applications (RiTA), Daejeon, Republic of Korea, 1–3 November 2019; IEEE: Piscataway, NJ, USA; pp. 254–257. [Google Scholar]
  20. Inoue, Y. Satellite- and Drone-Based Remote Sensing of Crops and Soils for Smart Farming—A Review. Soil Sci. Plant Nutr. 2020, 66, 798–810. [Google Scholar] [CrossRef]
  21. Panday, U.S.; Pratihast, A.K.; Aryal, J.; Kayastha, R.B. A Review on Drone-Based Data Solutions for Cereal Crops. Drones 2020, 4, 41. [Google Scholar] [CrossRef]
  22. Rahman, M.F.F.; Fan, S.; Zhang, Y.; Chen, L. A Comparative Study on Application of Unmanned Aerial Vehicle Systems in Agriculture. Agriculture 2021, 11, 22. [Google Scholar] [CrossRef]
  23. Spoorthi, S.; Shadaksharappa, B.; Suraj, S.; Manasa, V.K. Freyr Drone: Pesticide/Fertilizers Spraying Drone—An Agricultural Approach. In Proceedings of the 2017 2nd International Conference on Computing and Communications Technologies (ICCCT), Chennai, India, 23–24 February 2017; pp. 252–255. [Google Scholar]
  24. Shilin, W.; Jianli, S.; Xiongkui, H.; Le, S.; Xiaonan, W.; Changling, W.; Zhichong, W.; Yun, L. Performances Evaluation of Four Typical Unmanned Aerial Vehicles Used for Pesticide Application in China. Int. J. Agric. Biol. Eng. 2017, 10, 22–31. [Google Scholar] [CrossRef]
  25. Fernández-Guisuraga, J.M.; Sanz-Ablanedo, E.; Suárez-Seoane, S.; Calvo, L. Using Unmanned Aerial Vehicles in Postfire Vegetation Survey Campaigns through Large and Heterogeneous Areas: Opportunities and Challenges. Sensors 2018, 18, 586. [Google Scholar] [CrossRef] [PubMed]
  26. Guo, Q.; Zhu, Y.; Tang, Y.; Hou, C.; Fang, M.; Chen, X. Numerical Simulation of the Effects of Downwash Airflow and Crosswinds on the Spray Performance of Quad-Rotor Agricultural UAVs. Smart Agric. Technol. 2025, 11, 100940. [Google Scholar] [CrossRef]
  27. Xiao, X.; Qu, W.; Xia, G.-S.; Xu, M.; Shao, Z.; Gong, J.; Li, D. A Novel Real-Time Matching and Pose Reconstruction Method for Low-Overlap Agricultural UAV Images with Repetitive Textures. ISPRS J. Photogramm. Remote Sens. 2025, 226, 54–75. [Google Scholar] [CrossRef]
  28. Demir, S.; Dedeoğlu, M.; Başayiğit, L. Yield Prediction Models of Organic Oil Rose Farming with Agricultural Unmanned Aerial Vehicles (UAVs) Images and Machine Learning Algorithms. Remote Sens. Appl. Soc. Environ. 2024, 33, 101131. [Google Scholar] [CrossRef]
  29. Singh, P.K.; Sharma, A. An Intelligent WSN-UAV-Based IoT Framework for Precision Agriculture Application. Comput. Electr. Eng. 2022, 100, 107912. [Google Scholar] [CrossRef]
  30. Park, M.; Lee, S.; Lee, S. Dynamic Topology Reconstruction Protocol for UAV Swarm Networking. Symmetry 2020, 12, 1111. [Google Scholar] [CrossRef]
  31. Zhang, Z.; Zhu, L. A Review on Unmanned Aerial Vehicle Remote Sensing: Platforms, Sensors, Data Processing Methods, and Applications. Drones 2023, 7, 398. [Google Scholar] [CrossRef]
  32. Ahmed, F.; Mohanta, J.C.; Keshari, A.; Yadav, P.S. Recent Advances in Unmanned Aerial Vehicles: A Review. Arab. J. Sci. Eng. 2022, 47, 7963–7984. [Google Scholar] [CrossRef]
  33. Shekh, M.; Rani, S.; Datta, R. Review on Design, Development, and Implementation of an Unmanned Aerial Vehicle for Various Applications. Int. J. Intell. Robot. Appl. 2025, 9, 299–318. [Google Scholar] [CrossRef]
  34. Guo, X.; Shao, Q.; Li, Y.; Wang, Y.; Wang, D.; Liu, J.; Fan, J.; Yang, F. Application of UAV Remote Sensing for a Population Census of Large Wild Herbivores—Taking the Headwater Region of the Yellow River as an Example. Remote Sens. 2018, 10, 1041. [Google Scholar] [CrossRef]
  35. Mammarella, M.; Capello, E.; Dabbene, F.; Guglieri, G. Sample-Based SMPC for Tracking Control of Fixed-Wing UAV. IEEE Control Syst. Lett. 2018, 2, 611–616. [Google Scholar] [CrossRef]
  36. Pfeifer, C.; Barbosa, A.; Mustafa, O.; Peter, H.-U.; Rümmler, M.-C.; Brenning, A. Using Fixed-Wing UAV for Detecting and Mapping the Distribution and Abundance of Penguins on the South Shetlands Islands, Antarctica. Drones 2019, 3, 39. [Google Scholar] [CrossRef]
  37. Divazi, A.; Askari, R.; Roohi, E. Experimental and Numerical Investigation on the Spraying Performance of an Agricultural Unmanned Aerial Vehicle. Aerosp. Sci. Technol. 2025, 160, 110083. [Google Scholar] [CrossRef]
  38. Kovalev, I.V.; Kovalev, D.I.; Astanakulov, K.D.; Podoplelova, V.A.; Borovinsky, D.V.; Shaporova, Z.E. Productivity Analysis of Agricultural UAVs by Field Crop Spraying. IOP Conf. Ser. Earth Environ. Sci. 2023, 1284, 012026. [Google Scholar] [CrossRef]
  39. Ukaegbu, U.F.; Tartibu, L.K.; Okwu, M.O.; Olayode, I.O. Development of a Light-Weight Unmanned Aerial Vehicle for Precision Agriculture. Sensors 2021, 21, 4417. [Google Scholar] [CrossRef]
  40. Abramov, N.V.; Semizorov, S.A.; Sherstobitov, S.V.; Gunger, M.V.; Petukhov, D.A. Digitization of Agricultural Land Using an Unmanned Aerial Vehicle. IOP Conf. Ser. Earth Environ. Sci. 2020, 548, 032002. [Google Scholar] [CrossRef]
  41. Chen, P.-C.; Chiang, Y.-C.; Weng, P.-Y. Imaging Using Unmanned Aerial Vehicles for Agriculture Land Use Classification. Agriculture 2020, 10, 416. [Google Scholar] [CrossRef]
  42. Yang, M.-D.; Boubin, J.G.; Tsai, H.P.; Tseng, H.-H.; Hsu, Y.-C.; Stewart, C.C. Adaptive Autonomous UAV Scouting for Rice Lodging Assessment Using Edge Computing with Deep Learning EDANet. Comput. Electron. Agric. 2020, 179, 105817. [Google Scholar] [CrossRef]
  43. Ju, C.; Son, H.I. Multiple UAV Systems for Agricultural Applications: Control, Implementation, and Evaluation. Electronics 2018, 7, 162. [Google Scholar] [CrossRef]
  44. Psirofonia, P.; Samaritakis, V.; Eliopoulos, P.; Potamitis, I. Use of Unmanned Aerial Vehicles for Agricultural Applications with Emphasis on Crop Protection: Three Novel Case-Studies. Int. J. Agric. Sci. Technol. 2017, 5, 30–39. [Google Scholar] [CrossRef]
  45. Torres-Sánchez, J.; López-Granados, F.; Serrano, N.; Arquero, O.; Peña, J.M. High-Throughput 3-D Monitoring of Agricultural-Tree Plantations with Unmanned Aerial Vehicle (UAV) Technology. PLoS ONE 2015, 10, e0130479. [Google Scholar] [CrossRef] [PubMed]
  46. Vega, F.A.; Ramírez, F.C.; Saiz, M.P.; Rosúa, F.O. Multi-Temporal Imaging Using an Unmanned Aerial Vehicle for Monitoring a Sunflower Crop. Biosyst. Eng. 2015, 132, 19–27. [Google Scholar] [CrossRef]
  47. Khan, N.; Ray, R.L.; Sargani, G.R.; Ihtisham, M.; Khayyam, M.; Ismail, S. Current Progress and Future Prospects of Agriculture Technology: Gateway to Sustainable Agriculture. Sustainability 2021, 13, 4883. [Google Scholar] [CrossRef]
  48. Ferreira, F.; Faria, J.; Azevedo, A.; Marques, A.L. Product Lifecycle Management in Knowledge Intensive Collaborative Environments: An Application to Automotive Industry. Int. J. Inf. Manag. 2017, 37, 1474–1487. [Google Scholar] [CrossRef]
  49. Singh, S.; Vaishnav, R.; Gautam, S.; Banerjee, S. Agricultural Robotics: A Comprehensive Review of Applications, Challenges and Future Prospects. In Proceedings of the 2024 2nd International Conference on Artificial Intelligence and Machine Learning Applications Theme: Healthcare and Internet of Things (AIMLA), Namakkal, India, 15–16 March 2024; pp. 1–8. [Google Scholar]
  50. Teng, H.; Wang, Y.; Chatziparaschis, D.; Karydis, K. Adaptive LiDAR Odometry and Mapping for Autonomous Agricultural Mobile Robots in Unmanned Farms. Comput. Electron. Agric. 2025, 232, 110023. [Google Scholar] [CrossRef]
  51. Linford, J.; Haghshenas-Jaryani, M. A Ground Robotic System for Crops and Soil Monitoring and Data Collection in New Mexico Chile Pepper Farms. Discov. Agric. 2024, 2, 101. [Google Scholar] [CrossRef]
  52. Wang, S.; Zhou, H.; Zhang, C.; Ge, L.; Li, W.; Yuan, T.; Zhang, W.; Zhang, J. Design, Development and Evaluation of Latex Harvesting Robot Based on Flexible Toggle. Robot. Auton. Syst. 2022, 147, 103906. [Google Scholar] [CrossRef]
  53. Hemanth Kumar, N.; Suresh, R.; Balappa, B.U. Development of an Unmanned Ground Vehicle for Pesticide Spraying in Chilli Crop. In Proceedings of the 2023 IEEE Technology & Engineering Management Conference-Asia Pacific (TEMSCON-ASPAC), Bengaluru, India, 14–16 December 2023; pp. 1–5. [Google Scholar]
  54. Xu, R.; Li, C. A Review of High-Throughput Field Phenotyping Systems: Focusing on Ground Robots. Plant Phenomics 2022, 2022, 9760269. [Google Scholar] [CrossRef]
  55. Fernandes, H.R.; Polania, E.C.M.; Garcia, A.P.; Mendonza, O.B.; Albiero, D. Agricultural Unmanned Ground Vehicles: A Review from the Stability Point of View. Rev. Ciênc. Agronômica 2021, 51, e20207761. [Google Scholar] [CrossRef]
  56. Bonadies, S.; Lefcourt, A.; Gadsden, S.A. A Survey of Unmanned Ground Vehicles with Applications to Agricultural and Environmental Sensing. In Proceedings of the SPIE Proceedings, Baltimore, MD, USA, 17–21 April 2016; Valasek, J., Thomasson, J.A., Eds.; SPIE: Bellingham, WA, USA, 2016; Volume 9866, p. 98660Q. [Google Scholar]
  57. Roshanianfard, A.; Noguchi, N.; Okamoto, H.; Ishii, K. A Review of Autonomous Agricultural Vehicles (The Experience of Hokkaido University). J. Terramechanics 2020, 91, 155–183. [Google Scholar] [CrossRef]
  58. Etezadi, H.; Eshkabilov, S. A Comprehensive Overview of Control Algorithms, Sensors, Actuators, and Communication Tools of Autonomous All-Terrain Vehicles in Agriculture. Agriculture 2024, 14, 163. [Google Scholar] [CrossRef]
  59. Pham, V.; Malladi, B.; Moreno, F.; Gonzalez, C.; Bhandari, S.; Raheja, A. Collaboration between Aerial and Ground Robots for Weed Detection and Removal. In Precision Agriculture’ 25; Wageningen Academic: Wageningen, The Netherlands, 2025. [Google Scholar]
  60. Pour Arab, D.; Spisser, M.; Essert, C. 3D Hybrid Path Planning for Optimized Coverage of Agricultural Fields: A Novel Approach for Wheeled Robots. J. Field Robot. 2025, 42, 455–473. [Google Scholar] [CrossRef]
  61. Zhang, Y.; Shen, Y.; Liu, H.; He, S.; Khan, Z. A Composite Sliding Mode Controller with Extended Disturbance Observer for 4WSS Agricultural Robots in Unstructured Farmlands. Comput. Electron. Agric. 2025, 232, 110069. [Google Scholar] [CrossRef]
  62. Zhang, Z.; Li, Z.; Yang, M.; Cui, J.; Shao, Y.; Ding, Y.; Yang, W.; Qiao, W.; Song, P. An Autonomous Navigation Method for Field Phenotyping Robot Based on Ground-Air Collaboration. Artif. Intell. Agric. 2025, 15, 610–621. [Google Scholar] [CrossRef]
  63. Banić, M.; Stojanović, L.; Perić, M.; Rangelov, D.; Pavlović, V.; Miltenović, A.; Simonović, M. AgAR: A Multipurpose Robotic Platform for the Digital Transformation of Agriculture. In Proceedings of the 11th International Scientific Conference IRMES 2025, Vrnjačka Banja, Serbia, 19–21 June 2025; Faculty of Mechanical Engineering, University of Niš: Niš, Serbia, 2025; pp. XXIII–XXXI. [Google Scholar]
  64. Dokic, K.; Kukina, H.; Mikolcevic, H. A Low-Cost Agriculture Robot for Dataset Creation-Software and Hardware Solutions. In Proceedings of the 2024 1st International Conference on Innovative Engineering Sciences and Technological Research (ICIESTR), Muscat, Oman, 14–15 May 2024; pp. 1–6. [Google Scholar]
  65. de Silva, R.; Cielniak, G.; Wang, G.; Gao, J. Deep Learning-Based Crop Row Detection for Infield Navigation of Agri-Robots. J. Field Robot. 2024, 41, 2299–2321. [Google Scholar] [CrossRef]
  66. Raikwar, S.; Fehrmann, J.; Herlitzius, T. Navigation and Control Development for a Four-Wheel-Steered Mobile Orchard Robot Using Model-Based Design. Comput. Electron. Agric. 2022, 202, 107410. [Google Scholar] [CrossRef]
  67. Shojaei, K. Intelligent Coordinated Control of an Autonomous Tractor-Trailer and a Combine Harvester. Eur. J. Control 2021, 59, 82–98. [Google Scholar] [CrossRef]
  68. Skoczeń, M.; Ochman, M.; Spyra, K.; Nikodem, M.; Krata, D.; Panek, M.; Pawłowski, A. Obstacle Detection System for Agricultural Mobile Robot Application Using RGB-D Cameras. Sensors 2021, 21, 5292. [Google Scholar] [CrossRef] [PubMed]
  69. Gai, J.; Xiang, L.; Tang, L. Using a Depth Camera for Crop Row Detection and Mapping for Under-Canopy Navigation of Agricultural Robotic Vehicle. Comput. Electron. Agric. 2021, 188, 106301. [Google Scholar] [CrossRef]
  70. Mammarella, M.; Comba, L.; Biglia, A.; Dabbene, F.; Gay, P. Cooperative Agricultural Operations of Aerial and Ground Unmanned Vehicles. In Proceedings of the 2020 IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Trento, Italy, 4–6 November 2020; pp. 224–229. [Google Scholar]
  71. Utstumo, T.; Urdal, F.; Brevik, A.; Dørum, J.; Netland, J.; Overskeid, Ø.; Berge, T.W.; Gravdahl, J.T. Robotic In-Row Weed Control in Vegetables. Comput. Electron. Agric. 2018, 154, 36–45. [Google Scholar] [CrossRef]
  72. Mueller-Sim, T.; Jenkins, M.; Abel, J.; Kantor, G. The Robotanist: A Ground-Based Agricultural Robot for High-Throughput Crop Phenotyping. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 3634–3639. [Google Scholar]
  73. Gonzalez-de-Santos, P.; Ribeiro, A.; Fernandez-Quintanilla, C.; Lopez-Granados, F.; Brandstoetter, M.; Tomic, S.; Pedrazzi, S.; Peruzzi, A.; Pajares, G.; Kaplanis, G.; et al. Fleets of Robots for Environmentally-Safe Pest Control in Agriculture. Precis. Agric. 2017, 18, 574–614. [Google Scholar] [CrossRef]
  74. Kayacan, E.; Ramon, H.; Saeys, W. Robust Trajectory Tracking Error Model-Based Predictive Control for Unmanned Ground Vehicles. IEEE/ASME Trans. Mechatron. 2016, 21, 806–814. [Google Scholar] [CrossRef]
  75. Yin, H.; Sun, Q.; Ren, X.; Guo, J.; Yang, Y.; Wei, Y.; Huang, B.; Chai, X.; Zhong, M. Development, Integration, and Field Evaluation of an Autonomous Citrus-harvesting Robot. J. Field Robot. 2023, 40, 1363–1387. [Google Scholar] [CrossRef]
  76. Davidson, J.R.; Mo, C. Mechanical Design and Initial Performance Testing of an Apple-Picking End-Effector. In Proceedings of the ASME 2015 International Mechanical Engineering Congress and Exposition, Houston, TX, USA, 13–19 November 2015; Volume 4A: Dynamics, Vibration, and Control. American Society of Mechanical Engineers: New York, NY, USA, 2015. [Google Scholar]
  77. Kaleem, A.; Hussain, S.; Aqib, M.; Cheema, M.J.M.; Saleem, S.R.; Farooq, U. Development Challenges of Fruit-Harvesting Robotic Arms: A Critical Review. AgriEngineering 2023, 5, 2216–2237. [Google Scholar] [CrossRef]
  78. Xiao, X.; Wang, Y.; Jiang, Y. Review of Research Advances in Fruit and Vegetable Harvesting Robots. J. Electr. Eng. Technol. 2024, 19, 773–789. [Google Scholar] [CrossRef]
  79. Wang, Z.; Xun, Y.; Wang, Y.; Yang, Q. Review of Smart Robots for Fruit and Vegetable Picking in Agriculture. Int. J. Agric. Biol. Eng. 2022, 15, 33–54. [Google Scholar] [CrossRef]
  80. Gao, J.; Zhang, F.; Zhang, J.; Yuan, T.; Yin, J.; Guo, H.; Yang, C. Development and Evaluation of a Pneumatic Finger-like End-Effector for Cherry Tomato Harvesting Robot in Greenhouse. Comput. Electron. Agric. 2022, 197, 106879. [Google Scholar] [CrossRef]
  81. Gao, J.; Zhang, F.; Zhang, J.; Guo, H.; Gao, J. Picking Patterns Evaluation for Cherry Tomato Robotic Harvesting End-Effector Design. Biosyst. Eng. 2024, 239, 1–12. [Google Scholar] [CrossRef]
  82. Yeshmukhametov, A.; Koganezawa, K.; Yamamoto, Y.; Buribayev, Z.; Mukhtar, Z.; Amirgaliyev, Y. Development of Continuum Robot Arm and Gripper for Harvesting Cherry Tomatoes. Appl. Sci. 2022, 12, 6922. [Google Scholar] [CrossRef]
  83. Chen, M.; Chen, F.; Zhou, W.; Zuo, R. Design of Flexible Spherical Fruit and Vegetable Picking End-Effector Based on Vision Recognition. J. Phys. Conf. Ser. 2022, 2246, 012060. [Google Scholar] [CrossRef]
  84. Arikapudi, R.; Vougioukas, S.G. Robotic Tree-Fruit Harvesting with Arrays of Cartesian Arms: A Study of Fruit Pick Cycle Times. Comput. Electron. Agric. 2023, 211, 108023. [Google Scholar] [CrossRef]
  85. Sun, Q.; Zhong, M.; Chai, X.; Zeng, Z.; Yin, H.; Zhou, G.; Sun, T. Citrus Pose Estimation from an RGB Image for Automated Harvesting. Comput. Electron. Agric. 2023, 211, 108022. [Google Scholar] [CrossRef]
  86. Ji, W.; Tang, C.; Xu, B.; He, G. Contact Force Modeling and Variable Damping Impedance Control of Apple Harvesting Robot. Comput. Electron. Agric. 2022, 198, 107026. [Google Scholar] [CrossRef]
  87. Xiao, X.; Wang, Y.; Jiang, Y. End-Effectors Developed for Citrus and Other Spherical Crops. Appl. Sci. 2022, 12, 7945. [Google Scholar] [CrossRef]
  88. Fan, P.; Yan, B.; Wang, M.; Lei, X.; Liu, Z.; Yang, F. Three-Finger Grasp Planning and Experimental Analysis of Picking Patterns for Robotic Apple Harvesting. Comput. Electron. Agric. 2021, 188, 106353. [Google Scholar] [CrossRef]
  89. Roshanianfard, A. Development of a Harvesting Robot for Heavy-Weight Crop. Doctoral Dissertation, Hokkaido University, Sapporo, Japan, 2018. [Google Scholar] [CrossRef]
  90. Rahul, K.; Raheman, H.; Paradkar, V. Design of a 4 DOF Parallel Robot Arm and the Firmware Implementation on Embedded System to Transplant Pot Seedlings. Artif. Intell. Agric. 2020, 4, 172–183. [Google Scholar] [CrossRef]
  91. Xiong, Y.; Peng, C.; Grimstad, L.; From, P.J.; Isler, V. Development and Field Evaluation of a Strawberry Harvesting Robot with a Cable-Driven Gripper. Comput. Electron. Agric. 2019, 157, 392–402. [Google Scholar] [CrossRef]
  92. Huang, M.; He, L.; Choi, D.; Pecchia, J.; Li, Y. Picking Dynamic Analysis for Robotic Harvesting of Agaricus Bisporus Mushrooms. Comput. Electron. Agric. 2021, 185, 106145. [Google Scholar] [CrossRef]
  93. De Preter, A.; Anthonis, J.; De Baerdemaeker, J. Development of a Robot for Harvesting Strawberries. IFAC-Pap. 2018, 51, 14–19. [Google Scholar] [CrossRef]
  94. Roshanianfard, A.; Noguchi, N. Development of a 5DOF Robotic Arm (RAVebots-1) Applied to Heavy Products Harvesting. IFAC-Pap. 2016, 49, 155–160. [Google Scholar] [CrossRef]
  95. Kaizu, Y.; Shimada, T.; Takahashi, Y.; Igarashi, S.; Yamada, H.; Furuhashi, K.; Imou, K. Development of a Small Electric Robot Boat for Mowing Aquatic Weeds. Trans. ASABE 2021, 64, 1073–1082. [Google Scholar] [CrossRef]
  96. Moro, S.; Uchida, H.; Kato, K.; Nomura, K.; Seikine, S.; Yamano, T. Development of an Automatic Operation Control System for a Weeding Robot in Paddy Fields to Track a Target Path and Speed. Eng. Agric. Environ. Food 2023, 16, 101–112. [Google Scholar] [CrossRef] [PubMed]
  97. Murugaraj, G.; Selva Kumar, S.; Pillai, A.S.; Bharatiraja, C. Implementation of In-Row Weeding Robot with Novel Wheel, Assembly and Wheel Angle Adjustment for Slurry Paddy Field. Mater. Today Proc. 2022, 65, 215–220. [Google Scholar] [CrossRef]
  98. Liu, Y.; Noguchi, N.; Liang, L. Development of a Positioning System Using UAV-Based Computer Vision for an Airboat Navigation in Paddy Field. Comput. Electron. Agric. 2019, 162, 126–133. [Google Scholar] [CrossRef]
  99. Liu, Y.; Noguchi, N. Development of an Unmanned Surface Vehicle for Autonomous Navigation in a Paddy Field. Eng. Agric. Environ. Food 2016, 9, 21–26. [Google Scholar] [CrossRef]
  100. Liu, Y.; Noguchi, N.; Ali, R.F. Simulation and Test of an Agricultural Unmanned Airboat Maneuverability Model. Int. J. Agric. Biol. Eng. 2017, 10, 88–96. [Google Scholar]
  101. Bechar, A.; Vigneault, C. Agricultural Robots for Field Operations: Concepts and Components. Biosyst. Eng. 2016, 149, 94–111. [Google Scholar] [CrossRef]
  102. Qu, J.; Zhang, Z.; Qin, Z.; Guo, K.; Li, D. Applications of Autonomous Navigation Technologies for Unmanned Agricultural Tractors: A Review. Machines 2024, 12, 218. [Google Scholar] [CrossRef]
  103. Huang, Y.; Fu, J.; Xu, S.; Han, T.; Liu, Y. Research on Integrated Navigation System of Agricultural Machinery Based on RTK-BDS/INS. Agriculture 2022, 12, 1169. [Google Scholar] [CrossRef]
  104. Sun, T.; Le, F.; Cai, C.; Jin, Y.; Xue, X.; Cui, L. Soybean–Corn Seedling Crop Row Detection for Agricultural Autonomous Navigation Based on GD-YOLOv10n-Seg. Agriculture 2025, 15, 796. [Google Scholar] [CrossRef]
  105. Kim, K.; Deb, A.; Cappelleri, D.J. P-AgNav: Range View-Based Autonomous Navigation System for Cornfields. IEEE Robot. Autom. Lett. 2025, 10, 3366–3373. [Google Scholar] [CrossRef]
  106. Mansur, H.; Gadhwal, M.; Abon, J.E.; Flippo, D. Mapping for Autonomous Navigation of Agricultural Robots Through Crop Rows Using UAV. Agriculture 2025, 15, 882. [Google Scholar] [CrossRef]
  107. Chen, J.; Li, X.; Zhang, X. SLDF: A Semantic Line Detection Framework for Robot Guidance. Signal Process. Image Commun. 2023, 115, 116970. [Google Scholar] [CrossRef]
  108. Yu, J.; Zhang, J.; Shu, A.; Chen, Y.; Chen, J.; Yang, Y.; Tang, W.; Zhang, Y. Study of Convolutional Neural Network-Based Semantic Segmentation Methods on Edge Intelligence Devices for Field Agricultural Robot Navigation Line Extraction. Comput. Electron. Agric. 2023, 209, 107811. [Google Scholar] [CrossRef]
  109. Chen, J.; Qiang, H.; Wu, J.; Xu, G.; Wang, Z. Navigation Path Extraction for Greenhouse Cucumber-Picking Robots Using the Prediction-Point Hough Transform. Comput. Electron. Agric. 2021, 180, 105911. [Google Scholar] [CrossRef]
  110. Nkwocha, C.L.; Wang, N. Deep Learning-Based Semantic Segmentation with Novel Navigation Line Extraction for Autonomous Agricultural Robots. Discov. Artif. Intell. 2025, 5, 73. [Google Scholar] [CrossRef]
  111. Zhao, Y.; Gong, L.; Huang, Y.; Liu, C. A Review of Key Techniques of Vision-Based Control for Harvesting Robot. Comput. Electron. Agric. 2016, 127, 311–323. [Google Scholar] [CrossRef]
  112. Lili, W.; Bo, Z.; Jinwei, F.; Xiaoan, H.; Shu, W.; Yashuo, L.; Qiangbing, Z.; Chongfeng, W. Development of a Tomato Harvesting Robot Used in Greenhouse. Int. J. Agric. Biol. Eng. 2017, 10, 140–149. [Google Scholar] [CrossRef]
  113. Gongal, A.; Amatya, S.; Karkee, M.; Zhang, Q.; Lewis, K. Sensors and Systems for Fruit Detection and Localization: A Review. Comput. Electron. Agric. 2015, 116, 8–19. [Google Scholar] [CrossRef]
  114. Patel, K.K.; Pathare, P.B. Principle and Applications of Near-Infrared Imaging for Fruit Quality Assessment—An Overview. Int. J. Food Sci. Technol. 2024, 59, 3436–3450. [Google Scholar] [CrossRef]
  115. Wu, G.; Li, B.; Zhu, Q.; Huang, M.; Guo, Y. Using Color and 3D Geometry Features to Segment Fruit Point Cloud and Improve Fruit Recognition Accuracy. Comput. Electron. Agric. 2020, 174, 105475. [Google Scholar] [CrossRef]
  116. Sa, I.; Lehnert, C.; English, A.; McCool, C.; Dayoub, F.; Upcroft, B.; Perez, T. Peduncle Detection of Sweet Pepper for Autonomous Crop Harvesting—Combined Color and 3-D Information. IEEE Robot. Autom. Lett. 2017, 2, 765–772. [Google Scholar] [CrossRef]
  117. Lin, G.; Tang, Y.; Zou, X.; Xiong, J.; Fang, Y. Color-, Depth-, and Shape-Based 3D Fruit Detection. Precis. Agric. 2020, 21, 1–17. [Google Scholar] [CrossRef]
  118. Luo, L.; Tang, Y.; Lu, Q.; Chen, X.; Zhang, P.; Zou, X. A Vision Methodology for Harvesting Robot to Detect Cutting Points on Peduncles of Double Overlapping Grape Clusters in a Vineyard. Comput. Ind. 2018, 99, 130–139. [Google Scholar] [CrossRef]
  119. Méndez, V.; Velasco, J.; Rodríguez, F.; Berenguel, M.; Martínez, A.; Guzmán, J.L. In-Field Estimation of Orange Number and Size by 3D Laser Scanning. Agronomy 2019, 9, 885. [Google Scholar] [CrossRef]
  120. Fu, L.; Feng, Y.; Majeed, Y.; Zhang, X.; Zhang, J.; Karkee, M.; Zhang, Q. Kiwifruit Detection in Field Images Using Faster R-CNN with ZFNet. IFAC-Pap. 2018, 51, 45–50. [Google Scholar] [CrossRef]
  121. Badgujar, C.M.; Poulose, A.; Gan, H. Agricultural Object Detection with You Only Look Once (YOLO) Algorithm: A Bibliometric and Systematic Literature Review. Comput. Electron. Agric. 2024, 223, 109090. [Google Scholar] [CrossRef]
  122. Onishi, Y.; Yoshida, T.; Kurita, H.; Fukao, T.; Arihara, H.; Iwai, A. An Automated Fruit Harvesting Robot by Using Deep Learning. ROBOMECH J. 2019, 6, 13. [Google Scholar] [CrossRef]
  123. Birrell, S.; Hughes, J.; Cai, J.Y.; Iida, F. A Field-Tested Robotic Harvesting System for Iceberg Lettuce. J. Field Robot. 2020, 37, 225–245. [Google Scholar] [CrossRef] [PubMed]
  124. Koirala, A.; Walsh, K.B.; Wang, Z.; McCarthy, C. Deep Learning–Method Overview and Review of Use for Fruit Detection and Yield Estimation. Comput. Electron. Agric. 2019, 162, 219–234. [Google Scholar] [CrossRef]
  125. Lin, G.; Tang, Y.; Zou, X.; Cheng, J.; Xiong, J. Fruit Detection in Natural Environment Using Partial Shape Matching and Probabilistic Hough Transform. Precis. Agric. 2020, 21, 160–177. [Google Scholar] [CrossRef]
  126. Yu, Y.; Zhang, K.; Yang, L.; Zhang, D. Fruit Detection for Strawberry Harvesting Robot in Non-Structural Environment Based on Mask-RCNN. Comput. Electron. Agric. 2019, 163, 104846. [Google Scholar] [CrossRef]
  127. Yang, C.H.; Xiong, L.Y.; Wang, Z.; Wang, Y.; Shi, G.; Kuremot, T.; Zhao, W.H.; Yang, Y. Integrated Detection of Citrus Fruits and Branches Using a Convolutional Neural Network. Comput. Electron. Agric. 2020, 174, 105469. [Google Scholar] [CrossRef]
  128. Wang, W.; Lin, C.; Shui, H.; Zhang, K.; Zhai, R. Adaptive Symmetry Self-Matching for 3D Point Cloud Completion of Occluded Tomato Fruits in Complex Canopy Environments. Plants 2025, 14, 2080. [Google Scholar] [CrossRef] [PubMed]
  129. Zhao, H.; Tang, Z.; Li, Z.; Dong, Y.; Si, Y.; Lu, M.; Panoutsos, G. Real-Time Object Detection and Robotic Manipulation for Agriculture Using a YOLO-Based Learning Approach. In Proceedings of the 2024 IEEE International Conference on Industrial Technology (ICIT), Bristol, UK, 25–27 March 2024; pp. 1–6. [Google Scholar]
  130. Hu, N.; Su, D.; Wang, S.; Nyamsuren, P.; Qiao, Y.; Jiang, Y.; Cai, Y. LettuceTrack: Detection and Tracking of Lettuce for Robotic Precision Spray in Agriculture. Front. Plant Sci. 2022, 13, 1003243. [Google Scholar] [CrossRef] [PubMed]
  131. Reina, G.; Milella, A.; Rouveure, R.; Nielsen, M.; Worst, R.; Blas, M.R. Ambient Awareness for Agricultural Robotic Vehicles. Biosyst. Eng. 2016, 146, 114–132. [Google Scholar] [CrossRef]
  132. Yan, J.; Liu, Y. A Stereo Visual Obstacle Detection Approach Using Fuzzy Logic and Neural Network in Agriculture. In Proceedings of the 2018 IEEE International Conference on Robotics and Biomimetics (ROBIO), Kuala Lumpur, Malaysia, 12–15 December 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1539–1544. [Google Scholar]
  133. Qiu, Z.; Zhao, N.; Zhou, L.; Wang, M.; Yang, L.; Fang, H.; He, Y.; Liu, Y. Vision-Based Moving Obstacle Detection and Tracking in Paddy Field Using Improved Yolov3 and Deep SORT. Sensors 2020, 20, 4082. [Google Scholar] [CrossRef]
  134. Xu, H.; Li, S.; Ji, Y.; Cao, R.; Zhang, M. Dynamic Obstacle Detection Based on Panoramic Vision in the Moving State of Agricultural Machineries. Comput. Electron. Agric. 2021, 184, 106104. [Google Scholar] [CrossRef]
  135. Bayar, G.; Bergerman, M.; Koku, A.B.; Konukseven, E.İ. Localization and Control of an Autonomous Orchard Vehicle. Comput. Electron. Agric. 2015, 115, 118–128. [Google Scholar] [CrossRef]
  136. Ball, D.; Upcroft, B.; Wyeth, G.; Corke, P.; English, A.; Ross, P.; Patten, T.; Fitch, R.; Sukkarieh, S.; Bate, A. Vision-based Obstacle Detection and Navigation for an Agricultural Robot. J. Field Robot. 2016, 33, 1107–1130. [Google Scholar] [CrossRef]
  137. Xue, J.; Xia, C.; Zou, J. A Velocity Control Strategy for Collision Avoidance of Autonomous Agricultural Vehicles. Auton. Robots 2020, 44, 1047–1063. [Google Scholar] [CrossRef]
  138. Liu, C.; Zhao, X.; Du, Y.; Cao, C.; Zhu, Z.; Mao, E. Research on Static Path Planning Method of Small Obstacles for Automatic Navigation of Agricultural Machinery. IFAC-Pap. 2018, 51, 673–677. [Google Scholar] [CrossRef]
  139. Liu, Z.; Lü, Z.; Zheng, W.; Zhang, W.; Cheng, X. Design of Obstacle Avoidance Controller for Agricultural Tractor Based on ROS. Int. J. Agric. Biol. Eng. 2019, 12, 8. [Google Scholar] [CrossRef]
  140. Chen, H.; Xie, H.; Sun, L.; Shang, T. Research on Tractor Optimal Obstacle Avoidance Path Planning for Improving Navigation Accuracy and Avoiding Land Waste. Agriculture 2023, 13, 934. [Google Scholar] [CrossRef]
  141. Cui, J.; Zhang, X.; Fan, X.; Feng, W.; Li, P.; Wu, Y. Path Planning of Autonomous Agricultural Machineries in Complex Rural Road. J. Eng. 2020, 2020, 239–245. [Google Scholar] [CrossRef]
  142. Santos, L.C.; Santos, F.N.; Valente, A.; Sobreira, H.; Sarmento, J.; Petry, M. Collision Avoidance Considering Iterative Bézier Based Approach for Steep Slope Terrains. IEEE Access 2022, 10, 25005–25015. [Google Scholar] [CrossRef]
  143. Wang, Y.J.; Pan, G.T.; Xue, C.L.; Yang, F.Z. Research on Model of Laser Navigation System and Obstacle Avoidance for Orchard Unmanned Vehicle. In Proceedings of the 2019 2nd International Conference on Informatics, Control and Automation, Hangzhou, China, 26–27 May 2019. [Google Scholar]
  144. Yang, J.; Ni, J.; Li, Y.; Wen, J.; Chen, D. The Intelligent Path Planning System of Agricultural Robot via Reinforcement Learning. Sensors 2022, 22, 4316. [Google Scholar] [CrossRef] [PubMed]
  145. Bansal, A.; Sikka, K.; Sharma, G.; Chellappa, R.; Divakaran, A. Zero-Shot Object Detection. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; Springer: Berlin/Heidelberg, Germany, 2018; pp. 384–400. [Google Scholar]
  146. Zhu, P.; Wang, H.; Saligrama, V. Zero Shot Detection. IEEE Trans. Circuits Syst. Video Technol. 2020, 30, 998–1010. [Google Scholar] [CrossRef]
  147. Hossain, M.S.; Rahman, M.; Rahman, A.; Mohsin Kabir, M.; Mridha, M.F.; Huang, J.; Shin, J. Automatic Navigation and Self-Driving Technology in Agricultural Machinery: A State-of-the-Art Systematic Review. IEEE Access 2025, 13, 94370–94401. [Google Scholar] [CrossRef]
  148. Chen, D.; Qi, X.; Zheng, Y.; Lu, Y.; Huang, Y.; Li, Z. Synthetic Data Augmentation by Diffusion Probabilistic Models to Enhance Weed Recognition. Comput. Electron. Agric. 2024, 216, 108517. [Google Scholar] [CrossRef]
  149. De Clercq, D.; Nehring, E.; Mayne, H.; Mahdi, A. Large Language Models Can Help Boost Food Production, but Be Mindful of Their Risks. Front. Artif. Intell. 2024, 7, 1326153. [Google Scholar] [CrossRef]
  150. Sun, L.; Jha, D.K.; Hori, C.; Jain, S.; Corcodel, R.; Zhu, X.; Tomizuka, M.; Romeres, D. Interactive Planning Using Large Language Models for Partially Observable Robotic Tasks. In Proceedings of the 2024 IEEE International Conference on Robotics and Automation (ICRA), Yokohama, Japan, 13–17 May 2024. [Google Scholar]
  151. Hori, C.; Kambara, M.; Sugiura, K.; Ota, K.; Khurana, S.; Jain, S.; Corcodel, R.; Jha, D.K.; Romeres, D.; Le Roux, J. Interactive Robot Action Replanning Using Multimodal LLM Trained from Human Demonstration Videos. In Proceedings of the ICASSP 2025—2025 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Hyderabad, India, 6–11 April 2025. [Google Scholar]
  152. Li, P.; An, Z.; Abrar, S.; Zhou, L. Large Language Models for Multi-Robot Systems: A Survey. arXiv 2025, arXiv:2502.03814. [Google Scholar] [CrossRef]
  153. Zhu, H.; Qin, S.; Su, M.; Lin, C.; Li, A.; Gao, J. Harnessing Large Vision and Language Models in Agriculture: A Review. Front. Plant Sci. 2025, 16, 1579355. [Google Scholar] [CrossRef] [PubMed]
  154. How, J.P.; Frazzoli, E.; Chowdhary, G.V. Linear Flight Control Techniques for Unmanned Aerial Vehicles. In Handbook of Unmanned Aerial Vehicles; Valavanis, K.P., Vachtsevanos, G.J., Eds.; Springer: Dordrecht, The Netherlands, 2015; pp. 529–576. ISBN 978-90-481-9707-1. [Google Scholar]
  155. Ren, Z.; Zheng, H.; Chen, J.; Chen, T.; Xie, P.; Xu, Y.; Deng, J.; Wang, H.; Sun, M.; Jiao, W. Integrating UAV, UGV and UAV-UGV Collaboration in Future Industrialized Agriculture: Analysis, Opportunities and Challenges. Comput. Electron. Agric. 2024, 227, 109631. [Google Scholar] [CrossRef]
  156. Bretas, I.L.; Dubeux, J.C.B., Jr.; Cruz, P.J.R.; Oduor, K.T.; Queiroz, L.D.; Valente, D.S.M.; Chizzotti, F.H.M. Precision Livestock Farming Applied to Grazingland Monitoring and Management—A Review. Agron. J. 2024, 116, 1164–1186. [Google Scholar] [CrossRef]
  157. Kim, J.; Kim, S.; Ju, C.; Son, H.I. Unmanned Aerial Vehicles in Agriculture: A Review of Perspective of Platform, Control, and Applications. IEEE Access 2019, 7, 105100–105115. [Google Scholar] [CrossRef]
  158. Yu, S.; Zhu, J.; Zhou, J.; Cheng, J.; Bian, X.; Shen, J.; Wang, P. Key Technology Progress of Plant-Protection UAVs Applied to Mountain Orchards: A Review. Agronomy 2022, 12, 2828. [Google Scholar] [CrossRef]
  159. Li, P.; Liu, D.; Baldi, S. Plug-and-Play Adaptation in Autopilot Architectures for Unmanned Aerial Vehicles. In Proceedings of the IECON 2021—47th Annual Conference of the IEEE Industrial Electronics Society, Toronto, ON, Canada, 13–16 October 2021; pp. 1–6. [Google Scholar]
  160. Ulus, S.; Ikbal, E. Lateral and Longitudinal Dynamics Control of a Fixed Wing UAV by Using PID Controller. In Proceedings of the 4th International Conference on Engineering and Natural Sciences, Kiev, Ukraine, 2–6 May 2018. [Google Scholar]
  161. Wei, X.; XianYu, W.; Jiazhen, L.; Yasheng, Y. Design of Anti-Load Perturbation Flight Trajectory Stability Controller for Agricultural UAV. Front. Plant Sci. 2023, 14, 1030203. [Google Scholar] [CrossRef] [PubMed]
  162. Surur, K.; Kabir, I.; Ahmad, G.; Abido, M.A. Optimal Gain Scheduling for Fault-Tolerant Control of Quadrotor UAV Using Genetic Algorithm-Based Neural Network. Arab. J. Sci. Eng. 2025. [Google Scholar] [CrossRef]
  163. Wu, H.; Liu, D.; Zhao, Y.; Liu, Z.; Liang, Y.; Liu, Z.; Huang, T.; Liang, K.; Xie, S.; Li, J. Establishment and Verification of the UAV Coupled Rotor Airflow Backward Tilt Angle Controller. Drones 2024, 8, 146. [Google Scholar] [CrossRef]
  164. Lotufo, M.A.; Colangelo, L.; Perez-Montenegro, C.; Novara, C.; Canuto, E. Embedded Model Control for UAV Quadrotor via Feedback Linearization. IFAC-Pap. 2016, 49, 266–271. [Google Scholar] [CrossRef]
  165. Shen, Z.; Tsuchiya, T. Singular Zone in Quadrotor Yaw–Position Feedback Linearization. Drones 2022, 6, 84. [Google Scholar] [CrossRef]
  166. Lee, D.; Ha, C.; Zuo, Z. Backstepping Control of Quadrotor-Type UAVs and Its Application to Teleoperation over the Internet. In Intelligent Autonomous Systems 12: Volume 2 Proceedings of the 12th International Conference IAS-12, Jeju Island, Republic of Korea, 26–29 June 2012; Lee, S., Cho, H., Yoon, K.-J., Lee, J., Eds.; Springer: Berlin/Heidelberg, Germany, 2013; pp. 217–225. [Google Scholar]
  167. Saibi, A.; Boushaki, R.; Belaidi, H. Backstepping Control of Drone. Eng. Proc. 2022, 14, 4. [Google Scholar] [CrossRef]
  168. Bhowmick, P.; Bhadra, S.; Panda, A. A Two-Loop Group Formation Tracking Control Scheme for Networked Tri-Rotor UAVs Using an ARE-Based Approach. Asian J. Control 2022, 24, 2834–2849. [Google Scholar] [CrossRef]
  169. Sierra-García, J.E.; Santos, M. Intelligent Control of an UAV with a Cable-Suspended Load Using a Neural Network Estimator. Expert Syst. Appl. 2021, 183, 115380. [Google Scholar] [CrossRef]
  170. Sun, Z.; Xiao, M.; Li, D.; Chu, J. Tracking Controller Design for Quadrotor UAVs under External Disturbances Using a High-Order Sliding Mode-Assisted Disturbance Observer. Meas. Control 2025, 58, 155–167. [Google Scholar] [CrossRef]
  171. Maaruf, M.; Ahmad, S.S.; Hamanah, W.M.; Baraean, A.M.; Shafiul Alam, M.; Abido, M.A.; Shafiullah, M. Advanced Optimization Methods for Nonlinear Backstepping Controllers for Quadrotor-Slung Load Systems. IEEE Access 2025, 13, 66607–66621. [Google Scholar] [CrossRef]
  172. Ijaz, S.; Shi, Y.; Khan, Y.A.; Khodaverdian, M.; Javaid, U. Robust Adaptive Control Law Design for Enhanced Stability of Agriculture UAV Used for Pesticide Spraying. Aerosp. Sci. Technol. 2024, 155, 109676. [Google Scholar] [CrossRef]
  173. Lachowiec, J.; Feldman, M.J.; Matias, F.I.; LeBauer, D.; Gregory, A. Adoption of Unoccupied Aerial Systems in Agricultural Research. Plant Phenome J. 2024, 7, e20098. [Google Scholar] [CrossRef]
  174. Wen, S.; Zhang, Q.; Deng, J.; Lan, Y.; Yin, X.; Shan, J. Design and Experiment of a Variable Spray System for Unmanned Aerial Vehicles Based on PID and PWM Control. Appl. Sci. 2018, 8, 2482. [Google Scholar] [CrossRef]
  175. Yadava, R.; Aslam, A. Farming System: Quadcopter Fabrication and Development. In Advances in Engineering Design; Sharma, R., Kannojiya, R., Garg, N., Gautam, S.S., Eds.; Springer Nature: Singapore, 2023; pp. 285–293. [Google Scholar]
  176. Martins, L.; Cardeira, C.; Oliveira, P. Feedback Linearization with Zero Dynamics Stabilization for Quadrotor Control. J. Intell. Robot. Syst. 2020, 101, 7. [Google Scholar] [CrossRef]
  177. Villa, D.K.D.; Brandão, A.S.; Sarcinelli-Filho, M. Path-Following and Attitude Control of a Payload Using Multiple Quadrotors. In Proceedings of the 2019 19th International Conference on Advanced Robotics (ICAR), Belo Horizonte, Brazil, 2–6 December 2019; pp. 535–540. [Google Scholar]
  178. Xu, X.; Watanabe, K.; Nagai, I. Feedback Linearization Control for a Tandem Rotor UAV Robot Equipped with Two 2-DOF Tiltable Coaxial-Rotors. Artif. Life Robot. 2021, 26, 259–268. [Google Scholar] [CrossRef]
  179. Li, J.; Xie, H.; Low, K.H.; Yong, J.; Li, B. Image-Based Visual Servoing of Rotorcrafts to Planar Visual Targets of Arbitrary Orientation. IEEE Robot. Autom. Lett. 2021, 6, 7861–7868. [Google Scholar] [CrossRef]
  180. Shi, Y.; Ijaz, S.; He, Z.; Xu, Z.; Javaid, U.; Xia, Y. Adaptive Backstepping Integral Sliding Mode Control of Multirotor UAV System Used for Smart Agriculture. In Proceedings of the 2024 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Kuching, Malaysia, 6–10 October 2024; pp. 303–308. [Google Scholar]
  181. Mohamed, A.; El-Gindy, M.; Ren, J. Advanced Control Techniques for Unmanned Ground Vehicle: Literature Survey. Int. J. Veh. Perform. 2018, 4, 46–73. [Google Scholar] [CrossRef]
  182. Ao, X.; Wang, L.-M.; Hou, J.-X.; Xue, Y.-Q.; Rao, S.-J.; Zhou, Z.-Y.; Jia, F.-X.; Zhang, Z.-Y.; Li, L.-M. Road Recognition and Stability Control for Unmanned Ground Vehicles on Complex Terrain. IEEE Access 2023, 11, 77689–77702. [Google Scholar] [CrossRef]
  183. Wang, Q.; He, J.; Lu, C.; Wang, C.; Lin, H.; Yang, H.; Li, H.; Wu, Z. Modelling and Control Methods in Path Tracking Control for Autonomous Agricultural Vehicles: A Review of State of the Art and Challenges. Appl. Sci. 2023, 13, 7155. [Google Scholar] [CrossRef]
  184. Azimi, A.; Shamshiri, R.R.; Ghasemzadeh, A. Adaptive Dynamic Programming for Robust Path Tracking in an Agricultural Robot Using Critic Neural Networks. Agric. Eng. 2025, 80, 1–15. [Google Scholar] [CrossRef]
  185. Liu, L.; Wang, X.; Yang, X.; Liu, H.; Li, J.; Wang, P. Path Planning Techniques for Mobile Robots: Review and Prospect. Expert Syst. Appl. 2023, 227, 120254. [Google Scholar] [CrossRef]
  186. Utstumo, T.; Berge, T.W.; Gravdahl, J.T. Non-Linear Model Predictive Control for Constrained Robot Navigation in Row Crops. In Proceedings of the 2015 IEEE International Conference on Industrial Technology (ICIT), Seville, Spain, 17–19 March 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 357–362. [Google Scholar]
  187. Soitinaho, R.; Oksanen, T. Local Navigation and Obstacle Avoidance for an Agricultural Tractor With Nonlinear Model Predictive Control. IEEE Trans. Control Syst. Technol. 2023, 31, 2043–2054. [Google Scholar] [CrossRef]
  188. Song, Y.; Xue, J.; Zhang, T.; Sun, X.; Sun, H.; Gao, W.; Chen, Q. Path Tracking Control of Crawler Tractor Based on Adaptive Adjustment of Lookahead Distance Using Sparrow Search Algorithm. Comput. Electron. Agric. 2025, 234, 110219. [Google Scholar] [CrossRef]
  189. Wen, J.; Yao, L.; Zhou, J.; Yang, Z.; Xu, L.; Yao, L. Path Tracking Control of Agricultural Automatic Navigation Vehicles Based on an Improved Sparrow Search-Pure Pursuit Algorithm. Agriculture 2025, 15, 1215. [Google Scholar] [CrossRef]
  190. Hoffmann, G.M.; Tomlin, C.J.; Montemerlo, M.; Thrun, S. Autonomous Automobile Trajectory Tracking for Off-Road Driving: Controller Design, Experimental Validation and Racing. In Proceedings of the 2007 American Control Conference, New York, NY, USA, 9–13 July 2007; pp. 2296–2301. [Google Scholar]
  191. Wang, L.; Zhai, Z.; Zhu, Z.; Mao, E. Path Tracking Control of an Autonomous Tractor Using Improved Stanley Controller Optimized with Multiple-Population Genetic Algorithm. Actuators 2022, 11, 22. [Google Scholar] [CrossRef]
  192. Sun, Y.; Cui, B.; Ji, F.; Wei, X.; Zhu, Y. The Full-Field Path Tracking of Agricultural Machinery Based on PSO-Enhanced Fuzzy Stanley Model. Appl. Sci. 2022, 12, 7683. [Google Scholar] [CrossRef]
  193. Cui, B.; Cui, X.; Wei, X.; Zhu, Y.; Ma, Z.; Zhao, Y.; Liu, Y. Design and Testing of a Tractor Automatic Navigation System Based on Dynamic Path Search and a Fuzzy Stanley Model. Agriculture 2024, 14, 2136. [Google Scholar] [CrossRef]
  194. Wang, J.; Yang, L.; Cen, H.; He, Y.; Liu, Y. Dynamic Obstacle Avoidance Control Based on a Novel Dynamic Window Approach for Agricultural Robots. Comput. Ind. 2025, 167, 104272. [Google Scholar] [CrossRef]
  195. Qun, R. Intelligent Control Technology of Agricultural Greenhouse Operation Robot Based on Fuzzy PID Path Tracking Algorithm. INMATEH Agric. Eng. 2020, 62, 181–190. [Google Scholar] [CrossRef]
  196. Jiao, J.; Chen, J.; Qiao, Y.; Wang, W.; Wang, C.; Gu, L. Single Neuron PID Control of Agricultural Robot Steering System Based on Online Identification. In Proceedings of the 2018 IEEE Fourth International Conference on Big Data Computing Service and Applications (BigDataService), Bamberg, Germany, 26–29 March 2018; pp. 193–199. [Google Scholar]
  197. Gökçe, B.; Koca, Y.B.; Aslan, Y.; Gökçe, C.O. Particle Swarm Optimization-Based Optimal PID Control of an Agricultural Mobile Robot; “Prof. Marin Drinov” Publishing House of Bulgarian Academy of Sciences: Sofia, Bulgaria, 2021. [Google Scholar]
  198. Huang, P.; Zhang, Z.; Luo, X. Feedforward-plus-Proportional–Integral–Derivative Controller for Agricultural Robot Turning in Headland. Int. J. Adv. Robot. Syst. 2020, 17, 1729881419897678. [Google Scholar] [CrossRef]
  199. Liu, J.; Wu, X.; Quan, L.; Xu, H.; Hua, Y. Fuzzy Adaptive PID Control for Path Tracking of Field Intelligent Weeding Machine. AIP Adv. 2024, 14, 035045. [Google Scholar] [CrossRef]
  200. Mekonen, E.A.; Kassahun, E.; Tigabu, K.; Bekele, M.; Yehule, A. Model Predictive Controller Design for Precision Agricultural Robot. In Proceedings of the 2024 International Conference on Information and Communication Technology for Development for Africa (ICT4DA), Bahir Dar, Ethiopia, 18–20 November 2024; IEEE: Piscataway, NJ, USA; pp. 49–54. [Google Scholar]
  201. Mehndiratta, M.; Kayacan, E.; Patel, S.; Kayacan, E.; Chowdhary, G. Learning-Based Fast Nonlinear Model Predictive Control for Custom-Made 3D Printed Ground and Aerial Robots. In Handbook of Model Predictive Control; Raković, S.V., Levine, W.S., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 581–605. ISBN 978-3-319-77489-3. [Google Scholar]
  202. Zhang, Z.; Kayacan, E.; Thompson, B.; Chowdhary, G. High Precision Control and Deep Learning-Based Corn Stand Counting Algorithms for Agricultural Robot. Auton. Robots 2020, 44, 1289–1302. [Google Scholar] [CrossRef]
  203. Mitsuhashi, T.; Chida, Y.; Tanemura, M. Autonomous Travel of Lettuce Harvester Using Model Predictive Control. IFAC-Pap. 2019, 52, 155–160. [Google Scholar] [CrossRef]
  204. Wang, L.; Liu, M. Path Tracking Control for Autonomous Harvesting Robots Based on Improved Double Arc Path Planning Algorithm. J. Intell. Robot. Syst. 2020, 100, 899–909. [Google Scholar] [CrossRef]
  205. Kulathunga, G.; Yilmaz, A.; Huang, Z.; Hroob, I.; Singh, J.; Guevara, L.; Cielniak, G.; Hanheide, M. Navigating Narrow Spaces: A Comprehensive Framework for Agricultural Robots. IEEE Robot. Autom. Lett. 2025, 10, 9296–9303. [Google Scholar] [CrossRef]
  206. Li, Z.; Wang, W.; Zhang, C.; Zheng, Q.; Liu, L. Fault-Tolerant Control Based on Fractional Sliding Mode: Crawler Plant Protection Robot. Comput. Electr. Eng. 2023, 105, 108527. [Google Scholar] [CrossRef]
  207. Jiao, J.; Wang, W.; He, Y.; Wu, Y.; Zhang, F.; Gu, L. Adaptive Fuzzy Sliding Mode-Based Steering Control of Agricultural Tracked Robot. In Fuzzy Systems and Data Mining V; IOS Press: Amsterdam, The Netherlands, 2019; pp. 243–254. [Google Scholar]
  208. Din, A.; Ismail, M.Y.; Shah, B.; Babar, M.; Ali, F.; Baig, S.U. A Deep Reinforcement Learning-Based Multi-Agent Area Coverage Control for Smart Agriculture. Comput. Electr. Eng. 2022, 101, 108089. [Google Scholar] [CrossRef]
  209. Gökçe, C.O. Single-Layer Neural-Network Based Control of Agricultural Mobile Robot. Meas. Control 2023, 56, 1446–1454. [Google Scholar] [CrossRef]
  210. Liu, Y.; Wang, J.; Shi, Y.; He, Z.; Liu, F.; Kong, W.; He, Y. Unmanned Airboat Technology and Applications in Environment and Agriculture. Comput. Electron. Agric. 2022, 197, 106920. [Google Scholar] [CrossRef]
  211. Xu, Q. USV Course Controller Optimization Based on Elitism Estimation of Distribution Algorithm. In Proceedings of the 2014 IEEE Chinese Guidance, Navigation and Control Conference, Yantai, China, 8–10 August 2014; pp. 958–961. [Google Scholar]
  212. Liu, T.; Dong, Z.; Du, H.; Song, L.; Mao, Y. Path Following Control of the Underactuated USV Based On the Improved Line-of-Sight Guidance Algorithm. Pol. Marit. Res. 2017, 24, 3–11. [Google Scholar] [CrossRef]
  213. Li, L.; Wu, D.; Huang, Y.; Yuan, Z.-M. A Path Planning Strategy Unified with a COLREGS Collision Avoidance Function Based on Deep Reinforcement Learning and Artificial Potential Field. Appl. Ocean Res. 2021, 113, 102759. [Google Scholar] [CrossRef]
  214. Zhu, Z.; Hu, C.; Zhu, C.; Zhu, Y.; Sheng, Y. An Improved Dueling Deep Double-Q Network Based on Prioritized Experience Replay for Path Planning of Unmanned Surface Vehicles. J. Mar. Sci. Eng. 2021, 9, 1267. [Google Scholar] [CrossRef]
  215. Nugroho, H.; Xan, C.J.; Yee, T.J.; Yong, L.K.; Quan, L.Z.; Rusydi, M.I. Control System Development of Unmanned Surface Vehicles (USVs) with Fuzzy Logic Controller. In Proceedings of the 13th National Technical Seminar on Unmanned System Technology 2023—Volume 2; Md. Zain, Z., Ismail, Z.H., Li, H., Xiang, X., Karri, R.R., Eds.; Springer Nature: Singapore, 2024; pp. 83–94. [Google Scholar]
  216. Liu, Y.; Noguchi, N.; Yusa, T. Development of an Unmanned Surface Vehicle Platform for Autonomous Navigation in Paddy Field. IFAC Proc. Vol. 2014, 47, 11553–11558. [Google Scholar] [CrossRef]
  217. Temilolorun, A.; Singh, Y. Towards Design and Development of a Low-Cost Unmanned Surface Vehicle for Aquaculture Water Quality Monitoring in Shallow Water Environments. arXiv 2024, arXiv:2410.09513. [Google Scholar] [CrossRef]
  218. Griffiths, N.A.; Levi, P.S.; Riggs, J.S.; DeRolph, C.R.; Fortner, A.M.; Richards, J.K. Sensor-Equipped Unmanned Surface Vehicle for High-Resolution Mapping of Water Quality in Low- to Mid-Order Streams. ACS EST Water 2022, 2, 425–435. [Google Scholar] [CrossRef]
  219. Yanes Luis, S.; Peralta, F.; Tapia Córdoba, A.; Rodríguez Del Nozal, Á.; Toral Marín, S.; Gutiérrez Reina, D. An Evolutionary Multi-Objective Path Planning of a Fleet of ASVs for Patrolling Water Resources. Eng. Appl. Artif. Intell. 2022, 112, 104852. [Google Scholar] [CrossRef]
  220. Nguyen, A.; Ore, J.-P.; Castro-Bolinaga, C.; Hall, S.G.; Young, S. Towards Autonomous, Optimal Water Sampling with Aerial and Surface Vehicles for Rapid Water Quality Assessment. J. ASABE 2024, 67, 91–98. [Google Scholar] [CrossRef]
  221. Huang, H.; Wang, R.; Huang, F.; Chen, J. Analysis and Realization of a Self-Adaptive Grasper Grasping for Non-Destructive Picking of Fruits and Vegetables. Comput. Electron. Agric. 2025, 232, 110119. [Google Scholar] [CrossRef]
  222. Woon Choi, D.; Hyeon Park, J.; Yoo, J.-H.; Ko, K. AI-Driven Adaptive Grasping and Precise Detaching Robot for Efficient Citrus Harvesting. Comput. Electron. Agric. 2025, 232, 110131. [Google Scholar] [CrossRef]
  223. Palmieri, J.; Di Lillo, P.; Chiaverini, S.; Marino, A. A Comprehensive Control Architecture for Semi-Autonomous Dual-Arm Robots in Agriculture Settings. Control Eng. Pract. 2025, 163, 106394. [Google Scholar] [CrossRef]
  224. Jin, T.; Han, X. Robotic Arms in Precision Agriculture: A Comprehensive Review of the Technologies, Applications, Challenges, and Future Prospects. Comput. Electron. Agric. 2024, 221, 108938. [Google Scholar] [CrossRef]
  225. Kolhalkar, N.R.; Pandit, A.A.; Kedar, S.A.; Yedukondalu, G. Artificial Intelligence Algorithms for Robotic Harvesting of Agricultural Produce. In Proceedings of the 2025 1st International Conference on AIML-Applications for Engineering & Technology (ICAET), Pune, India, 16–17 January 2025; IEEE: Piscataway, NJ, USA; pp. 1–6. [Google Scholar]
  226. Jin, T.; Han, X.; Wang, P.; Lyu, Y.; Chang, E.; Jeong, H.; Xiang, L. Performance Evaluation of Robotic Harvester with Integrated Real-Time Perception and Path Planning for Dwarf Hedge-Planted Apple Orchard. Agriculture 2025, 15, 1593. [Google Scholar] [CrossRef]
  227. Ali Hassan, M.; Cao, Z.; Man, Z. End Effector Position Control of Pantograph Type Robot Using Sliding Mode Controller. In Proceedings of the 2022 Australian & New Zealand Control Conference (ANZCC), Gold Coast, Australia, 24–25 November 2022; pp. 156–160. [Google Scholar]
  228. Liu, Z.; Lv, Z.; Zheng, W.; Wang, X. Trajectory Control of Two-Degree-of-Freedom Sweet Potato Transplanting Robot Arm. IEEE Access 2022, 10, 26294–26306. [Google Scholar] [CrossRef]
  229. Mueangprasert, M.; Chermprayong, P.; Boonlong, K. Robot Arm Movement Control by Model-Based Reinforcement Learning Using Machine Learning Regression Techniques and Particle Swarm Optimization. In Proceedings of the 2023 Third International Symposium on Instrumentation, Control, Artificial Intelligence, and Robotics (ICA-SYMP), Bangkok, Thailand, 18–20 January 2023; pp. 83–86. [Google Scholar]
  230. Li, M.; Liu, P. A Bionic Adaptive End-Effector with Rope-Driven Fingers for Pear Fruit Harvesting. Comput. Electron. Agric. 2023, 211, 107952. [Google Scholar] [CrossRef]
  231. Wei, Z.; Zhao, C.; Huang, Y.; Fu, X.; Li, J.; Li, G. A Super-Hydrophobic Tactile Sensor for Damage-Free Fruit Grasping. Comput. Electron. Agric. 2025, 239, 111043. [Google Scholar] [CrossRef]
  232. Kumar, S.; Mohan, S.; Skitova, V. Designing and Implementing a Versatile Agricultural Robot: A Vehicle Manipulator System for Efficient Multitasking in Farming Operations. Machines 2023, 11, 776. [Google Scholar] [CrossRef]
  233. Sriram, A.; R, A.R.; Krishnan, R.; Jagadeesh, S.; Gnanasekaran, K. IoT-Enabled 6DOF Robotic Arm with Inverse Kinematic Control: Design and Implementation. In Proceedings of the 2023 IEEE World Conference on Applied Intelligence and Computing (AIC), Virtual, 29–30 July 2023; pp. 795–800. [Google Scholar]
  234. Yoshida, T.; Onishi, Y.; Kawahara, T.; Fukao, T. Automated Harvesting by a Dual-Arm Fruit Harvesting Robot. ROBOMECH J. 2022, 9, 19. [Google Scholar] [CrossRef]
  235. Mapes, J.; Dai, A.; Xu, Y.; Agehara, S. Harvesting End-Effector Design and Picking Control. In Proceedings of the 2021 IEEE Symposium Series on Computational Intelligence (SSCI), Orlando, FL, USA, 5–7 December 2021; pp. 1–6. [Google Scholar]
  236. Seno, K.; Abe, T.; Tomori, H. Development of 2-DOF Manipulator Using Straight-Fiber-Type Pneumatic Artificial Muscle for Agriculture. J. Robot. Mechatron. 2025, 37, 64–75. [Google Scholar] [CrossRef]
  237. MarketsandMarkets Smart Agriculture Market Size, Share and Trends, 2025. Available online: https://www.marketsandmarkets.com/Market-Reports/smart-agriculture-market-239736790.html (accessed on 11 July 2025).
  238. La Rocca, P.; Guennebaud, G.; Bugeau, A. To What Extent Can Current French Mobile Network Support Agricultural Robots? arXiv 2025, arXiv:2505.10044. [Google Scholar] [CrossRef]
  239. IEEE 802.15.4-2020; IEEE Standards Association. Available online: https://standards.ieee.org/ieee/802.15.4/7029/ (accessed on 26 October 2025).
  240. Aldhaheri, L.; Alshehhi, N.; Manzil, I.I.J.; Khalil, R.A.; Javaid, S.; Saeed, N.; Alouini, M.-S. LoRa Communication for Agriculture 4.0: Opportunities, Challenges, and Future Directions. IEEE Internet Things J. 2024. [Google Scholar] [CrossRef]
  241. Zhivkov, T.; Sklar, E.I. 5G on the Farm: Evaluating Wireless Network Capabilities for Agricultural Robotics. arXiv 2022, arXiv:2301.01600. [Google Scholar] [CrossRef]
  242. Bluetooth/BLE Core Specification. Bluetooth® Technol. Website 2024. Available online: https://www.bluetooth.com/specifications/specs/core-specification-6-0/ (accessed on 26 October 2025).
  243. IEEE 802.11ax-2021; IEEE Standards Association. Available online: https://standards.ieee.org/ieee/802.11ax/7180/ (accessed on 26 October 2025).
  244. Ahmad, S.J.; Yasmin, S.; Khandoker, R.; Chowdhury, F.; Rahman, S.; Khatun, A.; Rajvor, P. A Case Study: A Review On Agriculture Robot. J. Emerg. Technol. Innov. Res. 2024, 11, b254–b263. [Google Scholar]
  245. Bicamumakuba, E.; Habineza, E.; Samsuzzaman, S.; Reza, M.N.; Chung, S.-O. IoT-Enabled LoRaWAN Gateway for Monitoring and Predicting Spatial Environmental Parameters in Smart Greenhouses: A Review. Precis. Agric. Sci. Technol. 2025, 7, 28–46. [Google Scholar] [CrossRef]
  246. Bailey, J.K. IoT and Generative AI for Enhanced Data-Driven Agriculture. Ph.D. Thesis, Purdue University Graduate School, West Lafayette, IN, USA, 2025. [Google Scholar]
  247. Nair, K.K.; Abu-Mahfouz, A.M.; Lefophane, S. Analysis of the Narrow Band Internet of Things (NB-IoT) Technology. In Proceedings of the 2019 Conference on Information Communications Technology and Society (ICTAS), Durban, South Africa, 6–8 March 2019; IEEE: Durban, South Africa, 2019; pp. 1–6. [Google Scholar] [CrossRef]
  248. Lauridsen, M.; Vejlgaard, B.; Kovacs, I.Z.; Nguyen, H.; Mogensen, P. Interference Measurements in the European 868 MHz ISM Band with Focus on LoRa and SigFox. In Proceedings of the 2017 IEEE Wireless Communications and Networking Conference (WCNC), San Francisco, CA, USA, 19–22 March 2017; IEEE: San Francisco, CA, USA, 2017; pp. 1–6. [Google Scholar] [CrossRef]
  249. LTE-M Global LTE-M Connectivity | Emnify. Available online: https://www.emnify.com/iot-supernetwork/global-iot-coverage/lte-m (accessed on 26 October 2025).
  250. RPMA—The World’s Premier IoT Solutions Provider. Available online: https://rpmanetworks.com/ (accessed on 26 October 2025).
  251. WavIoT WAVIoT—LPWAN Solutions for IoT and M2M. Available online: https://waviot.com/ (accessed on 26 October 2025).
  252. 4G/LTE-Advanced LTE vs LTE Advanced: Is 4G LTE Different from LTE Advanced?—Commsbrief. Available online: https://commsbrief.com/lte-vs-lte-advanced-is-4g-lte-different-from-lte-advanced/ (accessed on 26 October 2025).
  253. Akhila, S.; Hemavathi. 5G Ultra-Reliable Low-Latency Communication: Use Cases, Concepts and Challenges. In Proceedings of the 2023 10th International Conference on Computing for Sustainable Global Development (INDIACom), New Delhi, India, 15–17 March 2023; pp. 53–58. [Google Scholar]
  254. Kim, H. Enhanced Mobile Broadband Communication Systems. In Design and Optimization for 5G Wireless Communications; IEEE: New York, NY, USA, 2020; pp. 239–302. ISBN 978-1-119-49444-7. [Google Scholar] [CrossRef]
  255. Dutkiewicz, E.; Costa-Perez, X.; Kovacs, I.Z.; Mueck, M. Massive Machine-Type Communications. IEEE Netw. 2017, 31, 6–7. [Google Scholar] [CrossRef]
  256. Informed, T. Agriculture Gets Boost from Bots and Portable 5G Network. Available online: https://techinformed.com/agriculture-gets-boost-from-bots-and-portable-5g-network/ (accessed on 11 July 2025).
  257. Dresden, T.U. Digitalization for a Sustainable Future in Agriculture: Successful Completion of the LANDNETZ Collaborative Project. Available online: https://tu-dresden.de/ing/maschinenwesen/die-fakultaet/news/digitalisierung-fuer-eine-nachhaltigere-landwirtschaft-der-zukunft-erfolgreicher-abschluss-des-verbundprojektes-landnetz?set_language=en (accessed on 11 July 2025).
  258. Lindenschmitt, D.; Fischer, C.; Haussmann, S.; Kalter, M.; Kallfass, I.; Schotten, H. Agricultural On-Demand Networks for 6G Enabled by THz Communication. arXiv 2024, arXiv:2408.15665. [Google Scholar] [CrossRef]
  259. Cheraghi, A.R.; Shahzad, S.; Graffi, K. Past, Present, and Future of Swarm Robotics. In Proceedings of the SAI Intelligent Systems Conference, Virtual, 2–3 September 2021; Springer: Cham, Switzerland, 2021; pp. 190–233. [Google Scholar]
Figure 1. Summary of core technologies and sensors for agricultural robots.
Figure 2. Various deployment platforms for agricultural robots.
Figure 3. VOS diagram illustrating the relationships and clusters of keywords derived from the selected papers, providing insights into thematic connections.
Figure 4. VOS diagram illustrating the relationships and groupings of techniques used in the selected papers, providing a visual overview of the methodological landscape within the reviewed literature.
Figure 5. Applications of UAVs in agriculture.
Figure 6. Agricultural UAV platforms: (a) Multi-rotor UAV, (b) Fixed-wing UAV, (c) Hybrid UAV (VTOL), (d) Unmanned Helicopter. The images in this figure are AI-generated.
Figure 7. UAV-mounted sensors: (a) RGB camera (DJI Zenmuse P1, DJI, Shenzhen, China), (b) Thermal camera (Teledyne FLIR VUE TV128 Payload, Teledyne FLIR, Wilsonville, OR, USA), (c) Multispectral camera (MicaSense RedEdge-P, MicaSense, Seattle, WA, USA), (d) Hyperspectral camera (Specim AFX10, Specim, Spectral Imaging Ltd., Oulu, Finland), (e) LiDAR (RESEPI Velodyne VLP-16, Inertial Labs, Paeonian Springs, VA, USA).
Figure 8. Applications of agricultural UGVs: (a) soil preparation, (b) crop monitoring and phenotyping, (c) robotic weeding, (d) harvesting, (e) plant treatment, (f) precision fertilizer application, (g) automatic seeding. The images in this figure are AI-generated.
Figure 9. Different configurations of mobile UGV platforms: (a) wheel-type UGV, (b) track-type UGV, (c) legged robot. The images in this figure are AI-generated.
Figure 10. UGV-mounted sensors: (a) LiDAR (LEISHEN LS 32 Channel, LeiShen Intelligent System Co., Ltd., Shenzhen, China), (b) Depth camera (Intel RealSense D435i, Intel Corporation, Santa Clara, CA, USA), (c) Tracking camera (Intel RealSense T265, Intel Corporation, Santa Clara, CA, USA), (d) RGB camera (ASUS C3 1080p HD USB Webcam, ASUSTeK Computer Inc., Taipei, Taiwan), (e) Ultrasonic sensor (HC-SR04), (f) RTK-GPS receiver (Emlid Reach RS3, Emlid Tech Kft., Budapest, Hungary), (g) Wheel encoder (Pololu, Las Vegas, NV, USA).
Figure 11. End-effectors for harvesting robots: (a) Scissor-type end-effector, (b) Adsorption-type end-effector, (c) Finger-clamp-type end-effector. The images in this figure are AI-generated.
Figure 12. Application scenarios of unmanned surface vehicles in agriculture: (a) Weeding robot operating in a paddy field. (b) Robotic USV cultivating paddy seedlings. The images in this figure are AI-generated.
Figure 13. Research focus distribution of agricultural robots (2015–2025) based on reviewed studies. The chart illustrates the relative attention given to different robot types—UGVs, UAVs, USVs, and robotic arms—in agricultural research during the past decade.
Figure 14. Agricultural navigation strategies overview [69]. (Copyright permission has been duly requested and granted to reuse the sensor fusion figure presented here. All credits for this figure go to the original authors as cited here.)
Figure 15. Deep learning-based path segmentation and navigation line extraction of a corn field [110].
Figure 16. Object detection pipeline using YOLO [121]. (Copyright permission has been duly requested and granted to reuse the figure presented here. All credits for this figure go to the original authors as cited here.)
Figure 17. Annotated obstacle on the farm. The blue cube is the 3D bounding box annotation. The image in this figure is AI-generated.
Figure 18. A block diagram illustrating a typical 5G private network deployment in an agricultural setting, showing connections between robots, drones, local edge servers, and, potentially, cloud backhaul. The image in this figure is AI-generated.
Figure 19. A block diagram illustrating a MANET/FANET setup for an agricultural robot swarm, showing peer-to-peer communication links and dynamic topology. The image in this figure is AI-generated.
Table 1. Comparison of UAV types used in agriculture.
Parameters | Multi-Rotor UAVs | Fixed-Wing UAVs | Hybrid (VTOL) UAVs | Unmanned Helicopters
Weight | Light to medium (≤15 kg) | Light to medium (≤25 kg) | Medium to heavy (≤30 kg) | Medium to heavy (≤50 kg)
Payload capacity | Low to moderate (1–5 kg) | Moderate to high (5–20 kg) | Moderate (2–10 kg) | High (up to 30 kg)
Endurance | Short (15–45 min) | Long (1–3 h) | Moderate to long (45 min–2 h) | Long (1–3 h)
Manoeuvrability | Excellent (precise hover, agile) | Moderate (requires forward motion) | High (hover + forward flight) | High (can hover and perform complex movements)
Operational flexibility | High (can operate in small fields, vertical take-off and landing) | Medium (needs runway or catapult for launch) | Very high (vertical take-off and landing + long range) | High (suitable for spraying and imaging)
Required expertise | Low to moderate | Moderate to high | High | High
Adoption rate | Very high (most common) | Moderate | Low to moderate | Low (specialised use)
Table 2. Comparison of the functionalities and features of various UAV-mounted sensors.
Table 2. Comparison of the functionalities and features of various UAV-mounted sensors.
Sensor | Primary Function | Key Features | Applications in Agriculture | Spectral Range
Digital (RGB) camera | Captures images in the visible light spectrum (red, green, blue) | Cost-effective, lightweight, provides true-colour imagery | Crop and soil mapping, plant counting, canopy cover estimation, land use classification | 400–700 nm
Thermal camera | Detects infrared radiation to measure temperature differences | Sensitive to temperature variation, useful for stress mapping | Water stress detection, irrigation planning, plant health assessment, pest/disease hotspot identification | 8–14 μm
Multispectral camera | Captures data in a few discrete spectral bands, including visible and near-infrared | Limited bands (e.g., 4–10), good for vegetation indices | Vegetation health monitoring (NDVI), crop stress detection, precision farming, yield estimation | 400–1000 nm
Hyperspectral camera | Captures data across hundreds of narrow spectral bands from visible to shortwave IR | High spectral resolution, detailed material characterization | Crop disease detection, nutrient analysis, soil property mapping, species identification | 400–2500 nm
LiDAR | Uses laser pulses to measure distance and create 3D models | Provides accurate 3D data, works in low-light conditions | Crop height measurement, biomass estimation, terrain modelling, canopy structure analysis | 905 nm and 1550 nm
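Since several of the sensors above feed vegetation indices such as NDVI, a minimal sketch of the per-pixel computation may help make the index concrete. The array names and reflectance values below are purely illustrative and are not tied to any specific camera discussed in this review.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Per-pixel Normalised Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    with np.errstate(divide="ignore", invalid="ignore"):
        # Pixels with zero total reflectance (e.g., deep shadow) are set to 0.
        return np.where(denom > 0, (nir - red) / denom, 0.0)

# Synthetic 2x2 reflectance patches (illustrative values only).
nir_band = np.array([[0.60, 0.55], [0.20, 0.65]])
red_band = np.array([[0.10, 0.12], [0.18, 0.08]])
print(ndvi(nir_band, red_band))  # healthy vegetation approaches +1
```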
Table 3. Summary of studies on agricultural Unmanned Aerial Vehicles (UAVs).
Reference | Configuration | Application | Sensors | Algorithms | Key Outcomes
[37] | Multi-rotor UAV | Precision spraying to protect plants from pests and diseases | N/A | Eulerian–Lagrangian modelling approach, Multi-Reference Frame (MRF) method, Discrete Phase Modelling (DPM), and turbulence models such as SST k-ω for numerical simulations | Propeller–atomizer distance affects spraying efficiency
[38] | Unmanned helicopter, rotary-wing UAV, and multi-rotor UAV | Spraying field crops in precision agriculture | RGB and multispectral cameras | Efficient matching algorithms for UAV operations, including route planning and decision-making for swarm deployment | Developed a mathematical model to optimize UAV performance for precision crop spraying
[39] | Multi-rotor UAV (quadcopter) | Weed detection and selective herbicide spraying | GPS, flight controller sensors, and Raspberry Pi camera | Deep learning algorithms to detect and classify weeds | Built a quadcopter to detect and selectively spray herbicides; the developed system demonstrated improved weed management
[40] | Fixed-wing UAV | Digitization of agricultural land | RGB camera | Quantum GIS (QGIS), Agisoft Metashape Professional, and Sputnik Agro | UAVs provide accurate, efficient field digitization and mapping, outperforming traditional and satellite methods
[41] | Multi-rotor and fixed-wing UAVs | High-resolution aerial and multispectral imaging for agricultural land use classification | RGB and multispectral cameras | Maximum Likelihood Method and Single Feature Probability for image classification | Achieved nearly 90% accuracy in agricultural land use classification
[42] | Fixed-wing and rotary-wing UAVs | Rice lodging assessment | RGB cameras | EDANet deep learning model for semantic segmentation and machine learning algorithms for autonomous UAV scouting | Achieved 99.25% accuracy in rice lodging prediction; scouting time reduced by 35% compared to conventional methods
[43] | Multi-rotor UAV | Remote sensing in agriculture | RGB cameras | Distributed swarm control algorithm for multi-UAV systems | Multi-UAV system significantly improves efficiency, reduces working time, and solves battery shortage issues compared to single-UAV systems
[44] | Multi-rotor UAV | Detecting pest infestation symptoms on olive and palm trees, mapping plantations, and cooperating with e-traps for targeted pesticide spraying | Fully stabilized 3-axis 1080p full-HD video camera | Image stitching (MapsMadeEasy), mission planning (Pix4Dcapture), and UAV simulation/control (DroneKit-Python, DroneKit-SITL, MAVProxy) | Demonstrated UAVs’ effectiveness in detecting crop infestations, mapping affected areas, and enabling targeted pesticide spraying
[23] | Multi-rotor UAV (quadcopter) | Spraying pesticides and fertilizers in agricultural fields | Accelerometer, gyroscope, magnetometer, and GPS | N/A | Successfully developed the FREYR Drone, a GPS-enabled, Android-controlled quadcopter for pesticide application
[45] | Multi-rotor UAV (quadcopter) | 3D monitoring of agricultural tree plantations | Visible-light and multispectral cameras | Object-Based Image Analysis (OBIA) algorithms for image segmentation, classification, and geometric feature extraction | UAV and OBIA technology achieved accurate 3D monitoring of agricultural trees, enabling efficient crop management
[46] | Multi-rotor UAV | Multi-temporal imaging to monitor a sunflower crop and estimate NDVI | Tetracam ADC Lite digital camera with multispectral sensors | Maximum Likelihood Classification (MLC) for image classification and linear regression models | NDVI from UAV-acquired multispectral images can reliably predict sunflower crop yield, aerial biomass, and nitrogen content
Table 4. Comparison of different UGV configurations in agriculture.
UGV Configuration | Advantages | Disadvantages | Use-Case Scenario
Wheel-type UGV | Simple design and control; energy efficient on flat or moderate terrain; high speed compared to other configurations; cost-effective | Limited terrain adaptability (struggles in mud, uneven ground); reduced traction on slippery surfaces | Row-crop farming in structured fields; precision tasks such as spraying, seeding, and phenotyping in orchards
Track-type UGV | Superior traction on rough, muddy, or soft soils; low ground pressure reduces soil compaction; enhanced stability on slopes | Higher energy consumption than wheeled systems; slower speed; increased maintenance (due to track wear) | Field preparation in wet/muddy areas; operations in orchards with uneven terrain
Legged robot | Excellent terrain adaptability; capable of navigating obstacles and soft soils; minimal soil compaction | Complex design and control systems; high energy consumption; expensive to build and maintain | Specialty farming in mountainous or uneven landscapes; research and inspection in challenging terrain
Table 5. Summary of studies on agricultural Unmanned Ground Vehicles (UGVs).
Reference | Configuration | Application | Sensors | Algorithms | Key Outcomes
[59] | Wheeled UGV with differential steering mechanism | Collaboration with UAV for automatic weed detection and removal | RGB cameras, LiDAR, and GNSS | SSD MobileNetV1 and YOLOv8 machine-learning algorithms | Successfully demonstrated real-time collaboration between UAVs and UGVs for automated weed detection and removal
[60] | Wheeled UGV | Complete coverage path planning (CCPP) in agricultural fields | N/A | H-CCPP algorithm, combining features from O-CCPP and Fields2Cover, along with Dubins and Reeds–Shepp methods for turn generation | Introduced H-CCPP, offering faster computation and better slope optimization than O-CCPP
[61] | Four-wheel self-steering (with differential steering) UGV | Path tracking of agricultural robots in unstructured farmlands | RTK-GNSS, angle sensor | Extended disturbance observer-based sliding mode controller (EDO-SMC) | The designed EDO-SMC method showed sufficient robustness in controlling the UGV, with small offsets indicating good performance
[62] | Wheeled UGV with four-wheel independent steering drive capabilities, including Ackermann, Crab, and Rotate modes | Autonomous navigation for high-throughput phenotypic data collection | Visible-light camera and RTK-GPS module | SegFormer-B0 semantic segmentation model, Douglas–Peucker algorithm for path simplification, and the Pure Pursuit algorithm for path tracking | High-precision autonomous navigation for phenotyping robots with lateral errors mostly within 2 cm in field environments
[63] | Wheeled UGV with skid-steering mechanism | Agricultural applications, including ploughing, seeding, mowing, spraying, crop monitoring, and robotic harvesting | Stereo cameras, LiDAR, and IMU sensors | Control algorithms integrated with ROS2 | Developed a modular robot compatible with standard implements for versatile and stable operation on complex terrains
[64] | Wheeled UGV with differential steering mechanism | Agricultural dataset creation by capturing images of plants | RGB camera | ArUco marker detection algorithms for navigation and positioning | Developed a low-cost autonomous robot to efficiently create agricultural image datasets
[65] | Wheeled UGV with differential steering mechanism | Autonomous navigation for detecting and following crop rows in sugar beet fields | Depth camera, tracking camera, and RTK-GPS | U-Net deep learning-based segmentation algorithm for crop row detection, Triangle Scan Method (TSM), and proportional controller for visual servoing | Robust crop row detection with an average angular error of 1.65° and displacement error of 11.99 pixels, outperforming the baseline
[66] | Wheeled UGV with four-wheel steering (4WS) mechanism | Autonomous navigation in GNSS-denied environments: orchards and vineyards | Wheel and steering encoders | Pure Pursuit for path tracking and Vector Field Histogram (VFH) for obstacle avoidance | Developed a navigation algorithm enabling autonomous orchard robot operation in GNSS-denied environments
[8] | Wheeled UGV with Ackermann steering mechanism | Autonomous navigation in vineyards for field monitoring tasks | 3D stereoscopic cameras, multi-beam LiDAR, and ultrasonic sensors | Perception-based navigation algorithms: Augmented Perception Obstacle Map (APOM), 3D density mapping, and occupancy matrix calculations | Augmented perception combining 3D vision, LiDAR, and ultrasonics enhances autonomous navigation stability and safety in vineyard rows
[67] | Wheeled UGV | Autonomous crop harvesting | N/A | Neural adaptive PID control, multi-layer neural networks, and the prescribed performance control (PPC) technique | Neural adaptive PID controller ensures tractor–trailer tracking with collision avoidance, connectivity, and robustness
[68] | Wheeled UGV with skid-steering mechanism | Autonomous lawn mowing in agricultural applications | Depth cameras, RP-Lidar-S1, Piksi Multi RTK module, SICK incremental encoders, and Xsens MTi-7 IMU | DeepLabv3+ for semantic segmentation, point cloud reconstruction, and occupancy grid mapping | Improved obstacle detection accuracy with a 38 cm average error
[69] | Wheeled UGV with central articulated steering mechanism | Crop row detection and mapping for under-canopy navigation of agricultural robots | Depth camera, RTK-GPS module, IMU, wheel and steering encoders | RANSAC, slicing-based clustering, linear programming, Bayesian mapping, and Kalman filter | Achieved reliable crop row mapping (MAE: 3.4 cm in corn, 3.6 cm in sorghum) and inter-row positioning (MAE: 5.0 cm in corn, 4.2 cm in sorghum)
[70] | Wheeled UGV with four-wheel Ackermann steering mechanism (ASM) | Spraying and shredding operations within vineyard rows | Proximity sensors, mapping and navigation sensors | Dynamic-Window Approach (DWA), Rapidly-exploring Random Tree (RRT), and tracking controllers | Cooperative use of UAVs and UGVs in complex agricultural scenarios; developed innovative path planning and control systems
[71] | Three-wheeled robot with an off-centre rear castor wheel and differential drive mechanism | Precision in-row weed control in vegetable crops | RGB camera, forward-facing camera for row detection, GPS module, and wheel encoders | Support Vector Machine (SVM), Extended Kalman Filter (EKF), Hough Transform, line-following algorithm, and motion estimation | The UGV achieved over 90% reduction in herbicide use while effectively controlling weeds
[72] | Wheeled UGV with skid-steering mechanism | High-throughput crop phenotyping | LiDARs, RTK GPS, RGB cameras, inertial measurement unit (IMU), time-of-flight sensor, custom stereo camera, and fish-eye cameras | Pure Pursuit algorithm, Unscented Kalman Filter (UKF), RANSAC cylinder detection, and Extended Kalman Filter (EKF) | The Robotanist: demonstrated contact-based automated phenotyping using a ground robot capable of autonomous navigation in sorghum plots
[73] | Wheeled UGV with electro-hydraulic steering system | Pest and weed control | GNSS receivers, cameras, ultrasonic sensors, laser range finders, and inertial measurement units (IMU) | Simulated Annealing, NSGA-II, and Genetic Algorithms for planning; OBIA for weed detection; Hough Transformation and Theil–Sen Estimator for crop row detection | Multi-robot system reduced pesticide and herbicide use and improved precision in crop management
[74] | Wheeled UGV with electro-hydraulic steering mechanism | Accurate trajectory tracking between crop rows in challenging field conditions | RTK GPS antennas, potentiometer, inductive sensor, and wheel encoder | Robust trajectory tracking error-based Model Predictive Control (MPC) | Achieved accurate trajectory tracking for an autonomous tractor–trailer system
Table 6. Summary of studies on agricultural robotic arms and end-effectors.
Reference | Configuration | Application | Sensors | Algorithms | Key Outcomes
[84] | Parallel Cartesian robot arms | Tree fruit harvesting | Multiple sensors | Path planning optimization algorithms | Picking success rate of 90%, with cycle time reduced to 4.5–6 s using multiple arms operating in parallel
[85] | RGB image-based pose estimation system | Citrus pose estimation and harvesting | RGB cameras | Pose estimation algorithms combined with end-effector adjustment | System achieved an 85% success rate with an average picking time of 12 s
[80] | Pneumatic finger-like end-effector | Cherry tomato harvesting in greenhouse | Pressure sensors, RGB-D camera | Hand-picking dynamic measurement system, Arduino control | Average cycle time for picking a single cherry tomato: 6.4 s
[86] | Contact force modelling system | Apple harvesting | Force sensors | Variable damping impedance control, Burgers model | Improved force control and dynamic performance compared to traditional impedance control
[87] | Various end-effector types | Citrus harvesting | RGB cameras | Improved YOLOv3 for citrus detection and localization | Picking success rate of 87% with an average picking time of 10 s per fruit
[88] | Robotic arm system | Apple harvesting | RTK-GPS, IMU | Machine vision algorithm | Optimal performance according to feedback from the reverse kinematic equation algorithm
[18] | Thin-film pressure sensor system | Cherry tomato harvesting in greenhouse | High-precision thin-film pressure sensor, six-axis attitude sensor | Pressure-sensing algorithms | Enhanced precision in fruit handling and damage reduction
[89] | Multi-finger gripper | Pumpkin harvesting | N/A | Denavit–Hartenberg (D-H) method | Designed end-effector can harvest different varieties of pumpkin with sufficient capability
[90] | 4-DOF gripper system | Pot seedling transportation | Position sensors | 3D Bresenham algorithm, region-based inverse kinematic equations | Cycle time for pickup and dropping of each seedling: 3.5 s, with a 93.3% success rate
[91] | Cable-driven gripper | Strawberry harvesting | RGB-D camera | Machine vision algorithm | Average picking cycle time: 7.5 s; success rate of 96.8% for isolated strawberries
[92] | Bending mechanism | Agaricus bisporus mushroom harvesting | Force sensors | Force optimization algorithms | Bending method required the least picking force and picking time for detaching mushrooms
[93] | Custom robotic arm gripper | Strawberry harvesting | RGB cameras, wheel encoders, gyroscope, and ultra-wideband (UWB) indoor positioning system | Trilateration algorithm, machine vision algorithms | Robot capable of picking partially surrounded strawberries with first-attempt success rates ranging from 50.0% to 97.1%
[94] | PLC-controlled system | Strawberry harvesting | Machine vision sensors | Denavit–Hartenberg method, reverse kinematics | Current prototype picked a strawberry in 4 s
Table 7. Comparison of autonomous navigation techniques in agriculture, with typical accuracy levels, key strengths, limitations, and suited environments.
Navigation Method | Typical Accuracy | Strengths and Applications | Limitations and Suitable Environments
RTK GNSS (GPS) | ~2–5 cm (with RTK) | Absolute positioning; well suited to large open fields and straight rows (e.g., tractor guidance). Provides global coordinates for precise coverage. | Signal dropout under canopy or indoors; requires a clear sky view and a base station. Gives no information on crop-relative position. Best for open fields; unreliable in orchards and greenhouses.
Vision-Based (Camera) | ~5–10 cm relative (feature-dependent) | Low-cost sensor with rich information (colour/texture) for row following and visual odometry. Effective under canopy or in orchards where GPS fails; can detect crop alignments and landmarks for in-row navigation. | Sensitive to lighting changes and occlusion. Requires robust image processing or learning algorithms. Limited range and field of view. Works best in structured rows with consistent visual cues; challenged by night, fog, or uniform fields (e.g., mature wheat).
LiDAR-Based SLAM | ~1–10 cm locally (high-resolution mapping) | Provides precise distance measurements and 3D mapping. Excellent for obstacle detection and mapping in varied terrain. Not affected by light; useful for orchard navigation (detecting tree rows) and unstructured fields (creating maps). | High sensor cost and data rate. Performance drops in heavy dust or rain. Cannot identify object type (sees shape only). Suited to environments with geometric structure or when visual data are insufficient; may be overkill for simple open fields due to cost.
Multi-Sensor Fusion | N/A (improves consistency) | Combines complementary sensors (e.g., GNSS + INS + camera) to mitigate individual weaknesses. Yields robust localization: GPS provides a global fix, the IMU smooths short-term motion, and camera/LiDAR corrects drift. Enhances reliability in diverse conditions. | Increased system complexity (calibration and synchronization required). Still constrained by environmental limits (e.g., if all sensors degrade in certain conditions). Used across all environments; essential for high reliability on real farms but requires careful integration and tuning.
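To make the multi-sensor fusion row more concrete, the sketch below shows a variance-weighted combination of a GNSS fix with an odometry-based estimate, which is the simplest form of the fusion idea underlying full Kalman-filter pipelines. The variable names and noise values are assumptions for illustration, not parameters from any cited system.

```python
import numpy as np

def fuse_position(gnss_xy, odom_xy, gnss_var=0.05**2, odom_var=0.15**2):
    """Variance-weighted fusion of a GNSS fix and an odometry estimate.

    Both inputs are 2D positions (metres) assumed to be in the same local frame;
    gnss_var and odom_var are their scalar variances. Returns (fused, fused_var).
    """
    gnss_xy = np.asarray(gnss_xy, dtype=float)
    odom_xy = np.asarray(odom_xy, dtype=float)
    w_gnss = odom_var / (gnss_var + odom_var)      # weight GNSS more when odometry is noisier
    fused = w_gnss * gnss_xy + (1.0 - w_gnss) * odom_xy
    fused_var = (gnss_var * odom_var) / (gnss_var + odom_var)
    return fused, fused_var

fused, var = fuse_position([12.03, 4.98], [11.90, 5.10])
print(fused, var)   # fused estimate lies between the two, closer to the GNSS fix
```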
Table 8. Key sensor options for obstacle detection in agricultural robots, with their advantages and limitations (adapted from [147]).
Sensor | Advantages | Limitations
Vision (Camera, Stereo) | Provides rich visual and depth information for obstacle recognition and localization. Cameras can detect texture and colour (useful to identify obstacle type, e.g., animal vs. rock), and stereoscopic vision gives 3D structure. | Affected by lighting (night requires illumination, glare can blind cameras) and weather (fog, heavy rain). Depth range from stereo is moderate, and accuracy decreases with distance. Best suited to moderate speeds and known obstacle appearances; often combined with learning algorithms for classification.
LiDAR | Highly accurate distance measurements and 3D mapping of obstacles. Effective day or night, independent of ambient light. Particularly good for structural obstacles (walls, trees) and for creating a local map for path planning. | Expensive and power-demanding at high performance. Can be degraded by dust, smoke, or rain (loss of returns). Provides shape but cannot distinguish material or colour (e.g., cannot tell a black tarp from a water puddle except by shape). Typically used on larger platforms or when precise obstacle contours are needed (e.g., navigating close to tree rows or equipment).
Radar | Uses radio waves to detect obstacles at relatively long range and in all weather conditions. Robust to dust, fog, and rain where optical sensors struggle. Automotive-style radars can detect large obstacles (vehicles, humans) and measure their relative speed, useful for detecting moving hazards. | Lower resolution; small or thin objects (e.g., wires, slender plant stems) may be missed. Less effective for precise shape or terrain profiling. Often used as a complementary sensor for cases where vision/LiDAR are blinded by weather.
Ultrasonic | Emits high-frequency sound pulses; good for short-range obstacle detection (a few metres). Inexpensive and simple; commonly used on small robots or tractors as proximity sensors (e.g., to stop if an object is very close). Works in darkness and is unaffected by object colour or transparency. | Very limited range and detection cone, and poor angular resolution (hard to know the exact direction of an obstacle). Can be triggered by wind noise or certain ambient sounds. Suitable as a safety bumper or for slow-moving platforms in clutter where fine resolution beyond “object is near” is not needed.
Infrared/Thermal | Detects heat signatures, enabling obstacle sensing in the dark and potential identification of warm-blooded animals or humans as distinct from the cooler crop environment. Thermal cameras have been used to detect living obstacles (people, livestock) even through mild obstructions such as dust. Also useful for finding stressed plants or fires. | Lower spatial resolution and influenced by ambient temperature changes (a hot day can reduce contrast between objects and background). Not typically a primary obstacle sensor; used for specific detection tasks (e.g., wildlife detection to avoid collisions).
Table 10. Agro-UGV control types and working scenario.
Table 10. Agro-UGV control types and working scenario.
Control MethodAgricultural VehicleWorking ScenarioNoteSource
PIDCrawler-type robotGreenhouseFuzzy PID[195]
Two drive wheels and two caster wheelsSeeding and fertilizing operationOnline Particle Swarm Optimization Continuously Tuned PID[196]
Four-wheel skid-steer agrobotSimulationParticle swarm optimized PID[197]
Crawler-type robotHeadland turningFeedforward PID[198]
Four-wheel weederSimulation of pat tracking controlFuzzy adaptive PID[199]
Model Predictive ControlDifferential drive wheeled robotMathematical model simulation for robot trajectory trackingMPC[200]
Wheeled mobile robot for weed controlRow crop navigationNonlinear MPC[186]
TerraSentia robotHigh precision path tracking in the presence of unknown wheel-terrain interactionLearning-based Nonlinear MPC[201,202]
Crawler-type robotLettuce harvestingMPC[203]
PPATwo-wheeled robot modelSimulation of path tracking control for autonomous harvesting robotsLarge-angle steering control and PPA[204]
Non-holonomic robotAutonomous navigation in narrow space (greenhouse)Adaptive trajectory tracking and regulated PPA control[205]
Crawler-type robotAutonomous full-coverage path planningPPA based on linear interpolation
Sliding Model controlCrawler-type robotPlant protectionFractional-order sliding mode control[206]
Crawler-type robotSteering control of robotFuzzy-Sliding mode control[207]
Autonomous agricultural vehicleSimulation of path-trackingFirst- and second-order sliding mode control
Learning/AI-based controlMulti-agent systemPatrolling for crop health monitoringCentralized Convolutional Neural Network (CNN)-based Dual Deep Q-learning (DDQN)[208]
Differential-driven mobile robotAutomatic robot control in farms and greenhouse (simulation)Single-layer neural-network controller[209]
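Because PID variants appear throughout Table 10, a minimal discrete PID sketch for heading correction of a wheeled UGV is included below. The gains, output limits, and 20 Hz update rate are illustrative assumptions, not values taken from the cited studies.

```python
class PID:
    """Discrete PID controller, here used to correct the heading of a wheeled UGV."""

    def __init__(self, kp, ki, kd, dt, u_min=-1.0, u_max=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.u_min, self.u_max = u_min, u_max
        self._integral = 0.0
        self._prev_error = 0.0

    def step(self, error):
        """error = desired heading - measured heading (radians); returns a clamped command."""
        self._integral += error * self.dt
        derivative = (error - self._prev_error) / self.dt
        self._prev_error = error
        u = self.kp * error + self.ki * self._integral + self.kd * derivative
        return max(self.u_min, min(self.u_max, u))

# Illustrative use at a 20 Hz control rate with assumed gains.
pid = PID(kp=1.2, ki=0.05, kd=0.1, dt=0.05)
heading_error = 0.2                      # radians
angular_velocity_cmd = pid.step(heading_error)
print(angular_velocity_cmd)              # command sent to the steering or differential drive
```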
Table 11. Control methods for USV and their applications.
Control Methods | Application | Note | Source
PID control | Autonomous navigation to a predefined navigation map in a paddy field | Achieved in-system RMS lateral error ≤ 0.45 m and RMS heading error ≤ 4.4° in map-based navigation | [216]
Extended Kalman Filter (EKF) + PWM | Aquaculture water quality monitoring | Low-cost USV design with a total cost of approximately $1118; communication uses an RC system for control and a local wireless network (2.4 GHz ISM band) for telemetry/ROS data | [217]
Differential control | High-resolution water quality mapping in low- to mid-order streams | Small pontoon-style USV (AquaBOT/HyDrone) with a high payload capacity (16 kg); collects data at higher spatial resolution than manual grab sampling | [218]
Multi-objective evolutionary approach (NSGA-II) and genetic algorithms (GAs) | Water monitoring/patrolling | Multi-agent or fleet optimization; uses a graph-based formulation and messy individual representation for variable path lengths | [219]
Multiple Traveling Salesperson Problem (MTSP) formulation and guided local search metaheuristic | Mariculture water quality sampling | USV equipped with in situ water quality sensors; the method reaches near-optimal solutions in approximately 30 s; USV tours cover a larger spatial extent to maximize spatial information gain | [220]
Table 12. Control methods for various levels of control domain of robotic arm and end-effector.
Control Domain | Control Method | Source
High-Level Architecture & Safety | Hierarchical Quadratic Programming (HQP), Control Barrier Functions (CBFs), Admittance Control, Hand-Guiding Control, Finite-State Machine (FSM) | [223]
Motion & Trajectory Execution | Inverse Kinematics (IK), Rapidly-exploring Random Tree (RRT) algorithms, DSA-BiTRRT algorithm, SMC, Linear Model Predictive Control (LMPC), Model-Based Reinforcement Learning (MBRL) | [224,225,226,228,232,233,234]
End-Effector Grasping & Sensing | Sensor-based grasping force control, posture dynamic monitoring, binary code feedback motion stop, pneumatic actuation control (SF-PAM), vision-based depth control | [221,230,235,236]
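Inverse kinematics (IK) is listed above as a core motion-execution method. As a hedged illustration only, the closed-form IK of a planar two-link arm is sketched below; the link lengths and target point are arbitrary, and practical harvesting arms generally require full spatial (often 6-DOF) solvers rather than this planar case.

```python
import math

def two_link_ik(x, y, l1, l2, elbow_up=True):
    """Closed-form inverse kinematics for a planar two-link arm.

    Returns (theta1, theta2) in radians placing the end-effector at (x, y),
    or None if the target lies outside the reachable workspace.
    """
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(c2) > 1.0:
        return None                       # target out of reach
    s2 = math.sqrt(1.0 - c2 * c2)
    if not elbow_up:
        s2 = -s2
    theta2 = math.atan2(s2, c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * s2, l1 + l2 * c2)
    return theta1, theta2

# Illustrative link lengths (metres) and target point.
print(two_link_ik(0.4, 0.3, l1=0.35, l2=0.25))
```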
Table 13. Comparison of Networking Technologies for Agricultural Robotics.
Technology | Range | Throughput | Latency | Power | Notes
ZigBee/802.15.4 | ~10–100 m | 20–250 kbps | Low (~tens of ms) | Very low | Mesh, many nodes, field sensors
BLE | ~10–100 m | ~1 Mb/s | Low | Low | Mobile device interfaces, livestock tracking
Wi-Fi 6 | ~100–300 m | 100+ Mb/s | Low (~ms) | Moderate | Edge video, robot control in greenhouses/fields
LoRa/LoRaWAN | Several km | ~10–50 kbps | High (~s) | Very low | Environmental sensors, wide-area monitoring
NB-IoT/Sigfox | Up to km range | ~10 kbps | High (~s) | Very low | Simple remote device tracking or sensing
4G/LTE | Wide area | ~100 Mb/s | ~10 ms | Moderate | Baseline cellular connectivity for robots
5G (private/public) | Wide area | 1+ Gbps | <10 ms | Moderate | Real-time control, video, multi-robot coordination
THz/6G | Field/very remote | Multi-Gbps | Ultra-low | TBD | Experimental; future infrastructure for rural 6G
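One practical way to read Table 13 is as a lookup when matching a field task to a link technology. The sketch below encodes indicative figures mirroring the table and filters candidates against simple requirements; where the table is qualitative (e.g., “wide area”, “~ms”), the numbers used here are assumptions for illustration only.

```python
# Indicative figures mirroring Table 13; qualitative entries are mapped to rough
# numbers purely for illustration.
LINK_OPTIONS = {
    "ZigBee/802.15.4": {"range_m": 100,    "throughput_kbps": 250,       "latency_ms": 30},
    "BLE":             {"range_m": 100,    "throughput_kbps": 1_000,     "latency_ms": 20},
    "Wi-Fi 6":         {"range_m": 300,    "throughput_kbps": 100_000,   "latency_ms": 5},
    "LoRaWAN":         {"range_m": 5_000,  "throughput_kbps": 50,        "latency_ms": 1_000},
    "4G/LTE":          {"range_m": 10_000, "throughput_kbps": 100_000,   "latency_ms": 10},
    "5G":              {"range_m": 10_000, "throughput_kbps": 1_000_000, "latency_ms": 10},
}

def candidate_links(min_range_m, min_throughput_kbps, max_latency_ms):
    """Return the technologies that meet all three requirements."""
    return [
        name for name, spec in LINK_OPTIONS.items()
        if spec["range_m"] >= min_range_m
        and spec["throughput_kbps"] >= min_throughput_kbps
        and spec["latency_ms"] <= max_latency_ms
    ]

# Example: streaming video from a robot 2 km away with moderate latency needs.
print(candidate_links(min_range_m=2_000, min_throughput_kbps=5_000, max_latency_ms=50))
```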