Review

A Systematic Review of 59 Field Robots for Agricultural Tasks: Applications, Trends, and Future Directions

1 Department of Agricultural, Food, Environmental and Forest Sciences and Technologies (DAGRI), University of Florence, 50144 Florence, Italy
2 Department of Agriculture, Food and Environment (DAFE), University of Pisa, 56124 Pisa, Italy
* Author to whom correspondence should be addressed.
Agronomy 2025, 15(9), 2185; https://doi.org/10.3390/agronomy15092185
Submission received: 1 August 2025 / Revised: 9 September 2025 / Accepted: 11 September 2025 / Published: 13 September 2025
(This article belongs to the Special Issue Research Progress in Agricultural Robots in Arable Farming)

Abstract

Climate change and labour shortages are reshaping farming methods. Agricultural tasks are often hard, tedious, and repetitive, and farms struggle to find specialized operators for such work. For this and other reasons (e.g., the rising cost of agricultural labour), more and more farmers have decided to switch to autonomous (or semi-autonomous) field robots. In the past decade, an increasing number of robots have entered the market for agricultural machines all over the world. These machines can easily cover long and repetitive tasks, while operators can be employed in other jobs on the farm. This paper reviews the current state of the art of autonomous robots for agricultural operations, dividing them into categories based on their main task in order to analyze their main characteristics and fields of application. Seven main tasks were identified: multi-purpose, harvesting, mechanical weeding, pest control and chemical weeding, scouting and monitoring, transplanting, and tilling-sowing. Field robots were divided into these categories, and characteristics such as engine type, traction system, application field, safety sensors, navigation system, country of origin, and market availability were analyzed. The aim of this review is to provide a global view of the agricultural platforms developed in the past decade, analyzing their characteristics and offering perspectives for the next generation of robotic platforms. The analysis, conducted on 59 field robots (both those already available on the market and those that are not), revealed that one fifth of the platforms come from Asia, that 63% are powered by electricity (rechargeable batteries, not solar power), that many platforms (28 out of 59) base their navigation on the RTK-GPS signal, and that safety most often relies on a LiDAR sensor (12 out of 59). This review considered machines of different sizes, highlighting the possible choices for field operations and tasks.
It is difficult to predict market trends, as several possibilities exist, such as fleets of small robots or larger platforms. Future research and policies should focus on improving navigation and safety systems, reducing emissions, and raising the level of autonomy of robotic platforms.

1. Introduction

Contemporary agriculture is undergoing profound transformations due to multiple challenges affecting work dynamics and production systems. Climate change is modifying cultivable areas, altering precipitation patterns, and significantly impacting temperatures [1]. The increasing frequency of extreme weather events has resulted in substantial production losses across the agri-food sector. Global discussions have increasingly focused on reducing the carbon footprint, with the food system contributing approximately 34% of annual greenhouse gas emissions (18 Gt CO2 equivalent) [2]. One strategy for mitigating agricultural emissions is the deployment of fleets of small- and medium-sized autonomous robots. These systems, powered by electric engines and batteries, are capable of covering extensive areas despite their compact dimensions. The integration of electric propulsion is more feasible in smaller platforms than in large-scale machinery, which remains comparable in size to conventional agricultural equipment. Traditional farming relies predominantly on diesel-powered machinery, which is poorly aligned with the sustainability objectives set forth by the European Green Deal. The projected increase in global population—expected to reach nearly two billion additional individuals over the next thirty years—places considerable pressure on agricultural systems to enhance productivity, efficiency, and sustainability. Meeting this demand will require a substantial increase in current agricultural output. In this context, automation has emerged as a key driver for transforming agricultural practices [3]. The sector has simultaneously experienced a continuous decline in the availability of specialized labour. This workforce shortage represents a significant constraint for farms requiring skilled personnel to operate machinery such as tractors and other equipment, which demand both technical expertise and meticulous maintenance [4,5,6]. 
Furthermore, workplace accidents remain prevalent, frequently arising from operator inattention or equipment ageing. Small-scale farms are particularly affected by financial constraints, limiting access to modern machinery and resulting in the continued use of equipment aged 50–60 years, which frequently fails to comply with current safety and environmental standards. Occupational safety considerations extend beyond physical hazards to include exposure to high noise levels and toxic emissions.
Recent advancements in robotics have addressed these challenges through the development of autonomous agricultural robots. Field deployment of these systems reduces the need for manual labour, enhances operator safety, increases crop yields, and mitigates environmental impact. Collaborative robots (cobots) represent an alternative approach, providing enhanced productivity, efficiency, and sustainability while facilitating human–robot interaction [7]. Human–robot collaboration (HRC) has been identified as critical for optimizing agricultural operations, integrating human expertise with robotic precision in tasks such as planting, harvesting, and pest management. Studies highlight the necessity of ergonomic and safe working environments, particularly when humans and robots operate in close proximity. Wearable sensors and other safety technologies have been investigated to mitigate associated risks [8]. HRC is also considered a transitional strategy toward full automation, enabling systems to overcome current technological limitations. The integration of cobots requires careful consideration of stakeholder engagement and socio-cultural factors to ensure effective and acceptable adoption [9].
The present work aims to provide a comprehensive state-of-the-art review of autonomous agricultural robots, encompassing both research prototypes and commercial systems. The study systematically defines and evaluates their principal functional characteristics, while critically analyzing the advantages and limitations of current technologies. This review further addresses the potential role of these systems in advancing sustainable, efficient, and safe agricultural practices.

2. Methodology

To gather the available material about robotics in agriculture, two different methods were used: for commercial robots, the research was based on models available on the global market, while for prototypes, material was gathered from the Scopus and Web of Science platforms. The literature was filtered by document type (scientific articles and reviews), publication date in the range from 2015 to 2025 (inclusive), and language (English), and was required to contain at least one of the following keywords: agricultural field robots, field-scouting robots, harvesting robots, fruit-picking robots, multi-tasking field robots, tool-carrier robots, and spraying robots. Even though 2025 had not yet ended during the writing of this paper, results from that year were still collected and examined to include the latest articles available. Figure 1 shows a diagram of the analysis conducted on the robotic platforms retrieved.
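The filters described above can be expressed as a single advanced-search string. The sketch below assembles one in Scopus-style syntax; the field codes follow Scopus conventions, but the combined string is an illustrative reconstruction, not the authors' exact saved query.

```python
# Assemble a Scopus-style advanced-search string from the review's filters.
# The field codes (TITLE-ABS-KEY, PUBYEAR, DOCTYPE, LANGUAGE) follow Scopus
# syntax; the combined string is an illustrative reconstruction, not the
# authors' exact saved query.
KEYWORDS = [
    "agricultural field robots",
    "field-scouting robots",
    "harvesting robots",
    "fruit-picking robots",
    "multi-tasking field robots",
    "tool-carrier robots",
    "spraying robots",
]

def build_query(keywords):
    keyword_clause = " OR ".join(f'"{k}"' for k in keywords)
    return (
        f"TITLE-ABS-KEY({keyword_clause}) "
        "AND PUBYEAR > 2014 AND PUBYEAR < 2026 "   # 2015-2025 inclusive
        "AND (DOCTYPE(ar) OR DOCTYPE(re)) "        # articles and reviews
        "AND LANGUAGE(english)"
    )

query = build_query(KEYWORDS)
```

The same keyword list can be reused verbatim for the Web of Science topic search, adjusting only the field tags.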
The evaluation of autonomous agricultural robots requires a comprehensive understanding of various performance metrics that together determine the efficiency, reliability, and overall effectiveness of these systems in agricultural operations. Among the most critical parameters is operational efficiency, which measures the robot’s capacity to complete assigned tasks over a given area within a specific time frame. High operational efficiency directly correlates with increased productivity and optimized resource allocation, reducing labour costs and operational downtime, and it serves as a primary indicator of the system’s economic viability [10]. Equally important is the failure rate, encompassing both hardware and software malfunctions. A lower failure rate is indicative of higher reliability, fewer interruptions, and reduced maintenance costs, and it allows for more consistent and predictable deployment in the field [11]. Task adaptability is another essential metric, reflecting the robot’s ability to adjust to diverse environmental conditions, varying crop types, and different operational requirements. Robots with high adaptability can handle complex tasks, such as navigating uneven terrain, detecting and avoiding obstacles, and switching between operations like planting, weeding, or harvesting, without human intervention [12,13]. Precision is particularly critical for tasks that demand high spatial accuracy, including selective harvesting, targeted pesticide application, or precision weeding. Enhanced precision minimizes crop damage, reduces waste, and contributes to overall yield quality [12]. Safety considerations are paramount, especially when autonomous robots operate in proximity to human workers, livestock, or delicate crops. Integrating advanced sensors and safety protocols allows robots to detect obstacles, prevent collisions, and operate safely in dynamic environments, which is essential for regulatory compliance and risk mitigation [13]. 
Additional performance indicators that are often evaluated include energy efficiency, which measures the amount of energy consumed per task or area covered, and reliability over extended periods of operation, which can reveal trends in wear and tear or long-term system stability [14,15]. Environmental robustness, including resistance to weather conditions such as rain, dust, or extreme temperatures, is another factor that can significantly affect operational outcomes [16]. Collectively, these interrelated parameters provide a multidimensional framework for assessing the overall performance of autonomous agricultural robots, guiding improvements in design, control algorithms, and operational strategies to maximize both productivity and sustainability in modern agriculture [10,14,15,16].
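The interrelated metrics listed above can be made concrete with a few simple ratios. The snippet below computes operational efficiency (area per hour), energy efficiency (energy per hectare), and a failure rate from hypothetical trial data; all input figures are invented for illustration only.

```python
# Illustrative computation of the performance metrics discussed above.
# All input figures are hypothetical, chosen only to show the ratios.
def operational_efficiency(area_ha, hours):
    """Area covered per working hour (ha/h)."""
    return area_ha / hours

def energy_efficiency(energy_kwh, area_ha):
    """Energy consumed per hectare worked (kWh/ha)."""
    return energy_kwh / area_ha

def failure_rate(failures, hours):
    """Failures per 100 operating hours."""
    return 100.0 * failures / hours

eff = operational_efficiency(12.0, 8.0)   # 1.5 ha/h
energy = energy_efficiency(40.0, 12.0)    # about 3.33 kWh/ha
rate = failure_rate(2, 8.0)               # 25 failures per 100 h
```

In practice these ratios would be logged per mission, so that trends in wear, energy use, and reliability become visible over a season.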

2.1. Scopus

The literature search conducted on Scopus resulted in 336 scientific documents. Figure S1 shows the trend in the number of annual publications within the selected time range: from 2015 to 2018, the number of published articles remained below 20 per year. Starting from 2019, the curve shows a gradual increase, continuing until 2024, which marks the peak year with 78 publications. Understandably, 2025 does not follow the same trend since the analysis was conducted in March and therefore includes only the first three months of the year. This trend highlights a growing interest from research institutions and universities, which increasingly recognize that the future of agriculture will be fully based on these technologies, thus creating a greater need each year to improve previous platforms. Figure S2 shows the distribution of published articles by field of study: 30% of the articles belong to the field of Computer Science, 26% to Agricultural and Biological Sciences, and 24.6% to Engineering, which together account for 80.3% of the total topics. This indicates that the fields of agriculture, engineering, and computer science converge in this context, and their collaboration is essential, as each provides a crucial contribution to the advancement of this technology. The technology combines the principles of computer science and engineering (robotic platforms, environmental recognition software, AI, sensors, etc.) with those of agriculture (fruit maturation, crop growth stages, correct pesticide dosages, pathogen recognition, etc.). Figure S3 shows the total number of articles published during the selected time frame by country: China dominates all other countries, contributing 225 out of the total 336 documents. Japan and the United States follow, with 28 and 17 scientific contributions, respectively.

2.2. Web of Science

The literature search conducted on Web of Science produced 394 scientific documents. Figure S4 shows the trend in the number of publications per year: as in the previous search, the publication trend is roughly the same, with a more pronounced increase starting from 2019/2020, and a slight decline in 2021, likely due to the COVID-19 pandemic. The peak year is 2024, with 113 scientific documents published. Figure S5 shows how the documents resulting from the search are distributed across fields of study: the three dominant areas are Multidisciplinary Agriculture, Computer Science, and Engineering Electrical and Electronic. Other significant areas include Agronomy, Robotics, Agricultural Engineering, and Computer Science Artificial Intelligence. Figure S6 shows the geographical distribution of the documents: as in the previous case, China holds the lead, contributing around 68% of the total publications. The USA follows with 5.3%, and Japan with 4.5%.

3. Agricultural Field Robots: General Overview

3.1. Definition and History

The definition of robot is debated [17]. The term derives from the Czech word robota, meaning forced labour or drudgery. The official RIA definition is “a reprogrammable, multifunctional manipulator designed to move material, parts, tools or specialized devices through various programmed functions for the performance of a variety of tasks” [18]. Agricultural robots combine cutting-edge robotics, sensor technologies, artificial intelligence, and data analytics to enable precise environmental monitoring, autonomous decision-making, intelligent task control, and efficient automated operations. These systems enhance productivity, reduce labour dependency, and support more sustainable and data-driven farming practices [19,20,21,22]. Agricultural robots and automated machinery are mobile systems designed to efficiently carry out repetitive and labour-intensive tasks—such as weeding, tilling, planting, and harvesting—enhancing productivity and reducing the need for manual labour in modern farming operations. These machines are designed for crop management and agricultural production operations. They operate by integrating sensors such as cameras, LiDAR, sonars, and IMUs with big data, artificial intelligence, the Internet of Things, and motion automation technologies. This enables the development of autonomous machines capable of collecting and processing data, making decisions based on those data, and operating without manual commands from humans [23]. Agricultural autonomous vehicle research began in the early 1960s and focused on developing automatic steering systems [24,25,26]. Most mechanical field crop farming operations in the 1990s employed large, robust, high-capacity equipment that required a lot of energy and had significant handling and running costs.
Over the last ten years, however, research at many universities and research centres around the world has undergone a paradigm shift: agricultural robot automation is now considered crucial for increasing overall productivity, eliminating labour-intensive manual tasks, lowering production costs, and improving the quality of fresh produce [27,28].

3.2. Global Market

The global agricultural robot market is experiencing significant growth, driven by the need to increase agricultural productivity, address labour shortages, and adopt advanced technologies such as artificial intelligence and automation. Estimates for 2024 range between USD 10 and USD 16.6 billion. Ref. [29] values the market at USD 16.6 billion in 2024 and forecasts growth to USD 51 billion by 2029, with a compound annual growth rate (CAGR) of 25.2%. According to [30], the market is estimated at USD 14.74 billion in 2024, with a forecast to reach USD 48.06 billion by 2030, at a CAGR of 23%. Long-term projections suggest continued expansion: Ref. [31] predicts the market will reach USD 80.96 billion by 2033, with a CAGR of 20.1%.
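The growth figures above can be cross-checked with the standard CAGR formula. The sketch below verifies the first forecast (USD 16.6 billion in 2024 to USD 51 billion in 2029, i.e., a five-year horizon); the other forecasts use bases and horizons that differ between sources, so only this one is reproduced.

```python
# Cross-check of a reported CAGR figure using the standard formula:
#   CAGR = (end_value / start_value) ** (1 / years) - 1
def cagr(start, end, years):
    return (end / start) ** (1.0 / years) - 1.0

# Ref. [29]: USD 16.6 bn (2024) -> USD 51 bn (2029), a 5-year horizon.
growth = cagr(16.6, 51.0, 5)   # about 0.252, matching the reported 25.2%
```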

3.3. Current Limitations

Food is responsible for approximately 26% of global greenhouse gas emissions [32]. Of these emissions, 27% is due to crop production, which includes the use of agricultural machinery such as tractors, combines, etc. A key area of agricultural robotics research focuses on reducing the environmental impact of farming equipment. Conventional machinery is powered by diesel engines, which emit large quantities of greenhouse gases. These engines are difficult to replace because they provide high torque, making them well suited for demanding tasks that require significant power. Additionally, even under heavy use and high fuel consumption, conventional machines provide good operational autonomy, as they can be quickly refuelled using mobile fuel tanks. Research has been directed towards developing autonomous robots with hybrid and electric engines. Hybrid models still incorporate combustion engines, but run them at constant RPM, effectively using them as generators for high-efficiency electric motors. Fully electric models are gradually entering the agricultural landscape, though not without challenges. Electric motors eliminate greenhouse gas emissions and noise pollution, but their main drawbacks are limited battery life and long recharge times, which can take several hours—significantly increasing operational downtime and discouraging widespread adoption. One of the biggest challenges hindering the adoption of these new technologies in agriculture is the ageing farmer population [33]. Many farmers are reluctant to overhaul their business management methods, which are based on years of experience, in favour of a new system where humans oversee operations while machines perform the actual cultivation. Since these technologies are still developing, they can be difficult to understand and accept without firsthand experience, leading many to underestimate the potential of agricultural robots.
Moreover, the ageing farming workforce remains vulnerable in terms of physical health, social isolation, and poor access to health and support services [34]. The benefits of agricultural robotics are numerous, but adoption first requires acceptance and accessibility among farmers [35,36]. From the integration of robotics in manufacturing, it is possible to learn that even though autonomous robots eliminate manual and tedious tasks, the workforce needs to be educated on how to collaborate with robots and how to operate and maintain such systems [37]. The same should be done in agricultural robotics, by educating and training the workforce to cooperate with robots [38]. The costs associated with these new approaches to farming remain high, although they eventually pay for themselves by delivering a range of benefits. However, as these technologies are still in development, ongoing research and innovation costs make it difficult to introduce competitively priced alternatives to traditional machinery. This challenge is particularly significant for small farms, which often face constraints in experimenting with and investing in new equipment. As a result, many continue to rely on outdated but affordable machinery. Additionally, compared to traditional systems, smart farming technologies involve a vast array of inputs that must be managed and interpreted, along with numerous sensors that require monitoring. For many farmers, learning how to use these systems from scratch presents a significant barrier to adoption. Another critical issue is data management and privacy, a growing global concern. Many farmers are hesitant to adopt cloud-based systems, artificial intelligence, and data analysis tools due to concerns that personal and operational data may be misused, potentially putting their business at risk. Unlike structured industrial environments, agricultural settings are highly dependent on weather events and crop growth cycles.
This means that robots must operate in constantly changing environments, where leaves change colour or fall, fruits ripen and alter in size and hue, and each plant has a unique shape, even within structured cultivation systems. This complexity demands highly advanced artificial intelligence training and highly accurate recognition systems capable of identifying plants and their components throughout their life cycle.

4. Field Robots: Main Tasks and Description

There are several ways to place an agricultural field robot within a precise category: it is possible to divide and identify robots by their traction system (wheeled, tracked, legged), by their motorization (endothermic, hybrid, electric or solar powered), by the field of application (orchards, horticulture, open field, greenhouse), by the navigation system (essentially, which sensor they rely on: LiDAR, RGB cameras, RTK-GPS, sonar, IMU) and by their status as commercial robots (whether already available on the market, intended solely for scientific purposes, or developed as part of a start-up or spin-off project). For this review, field robots are categorized by their main function and divided into seven categories: multi-purpose robots (including multi-tasking and tool-carrier robots), harvesting robots, mechanical weeding robots, pest control and chemical weeding robots, transplanting robots, tilling-sowing robots and scouting-monitoring robots.
In this review, UAVs (Unmanned Aerial Vehicles) were not considered, as the focus was placed on Unmanned Ground Vehicles (UGVs). UAVs are aerial robots that can serve as field robots if equipped with cameras or optical sensors for the creation of prescription maps and canopy scouting, paired with various sensors to allow site-specific operations, or with tools for precision spraying using micro-sprayers.
Table S1 shows the comparison between all the 59 retrieved solutions.
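The classification axes described above (task, traction, motorization, application field, navigation sensors, market status) map naturally onto a simple record type. The sketch below is a minimal schema with one hypothetical entry; the actual comparison of all 59 robots is the subject of Table S1.

```python
# Minimal record type for the classification axes used in this review.
# The example entry is hypothetical and only illustrates the schema.
from dataclasses import dataclass, field

@dataclass
class FieldRobot:
    name: str
    task: str            # e.g. "multi-purpose", "harvesting", "mechanical weeding"
    traction: str        # "wheeled", "tracked", or "legged"
    motorization: str    # "endothermic", "hybrid", "electric", or "solar"
    application: str     # "orchard", "horticulture", "open field", "greenhouse"
    navigation: list = field(default_factory=list)   # e.g. ["RTK-GPS", "LiDAR"]
    on_market: bool = False

example = FieldRobot(
    name="HypotheticalBot",
    task="multi-purpose",
    traction="wheeled",
    motorization="electric",
    application="open field",
    navigation=["RTK-GPS", "LiDAR"],
    on_market=True,
)
```

A list of such records makes the aggregate statistics reported in the Abstract (share of electric platforms, RTK-GPS usage, etc.) straightforward to recompute.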

4.1. Multi-Purpose Robots

Ref. [39] developed a multipurpose robot for sowing, pruning and harvesting under solar panels that uses a mechanism with two orthogonal axes for tool positioning, four wheels and a camera. Ref. [40] is an autonomous electric vineyard straddle robot, from Vitibot [41], that navigates using RTK-GNSS signals and cameras to follow the rows of the vineyard. It can carry multiple attachments. Its safety system consists of a LiDAR sensor and bumpers placed around the tyres to prevent collisions with objects or people. Ref. [42] is a hybrid robot, designed by AgXeed, for open-field operations. Thanks to its two three-point linkages, it is able to lift up to 8 tons with the rear linkage and up to 3 tons with the front linkage. Its navigation system relies on the RTK-GNSS signal, and its obstacle detection system relies on multiple sensors: LiDAR, ultrasonic, radar and bumpers. Ref. [43] is an electric autonomous platform, from Mula, that uses RTK-GPS technology to navigate in the fields and complete selected tasks while using commercial tools thanks to its CAT I and CAT II hitches. Ref. [44] is a hybrid robotic platform, from BlackShire, mostly designed for orchards and vineyards, with front and rear hitches that make it able to carry multiple tools such as sprayers, mowers and tillers, while ensuring high safety with bumpers and an RTK-GPS system for navigation. Ref. [45] is a robotic crawler, developed by Pellenc for vineyards, that navigates between the vine rows using RTK-GNSS technology while ensuring the safety of nearby operators with a LiDAR sensor and contact bumpers; it is able to use different commercial tools inside the vineyard. Ref. [46] was designed and developed by Pek as an electric base platform for orchard management. It is capable of performing all agrocycle operations by carrying various specially designed tools and navigating in GNSS-denied environments, thanks to AI-powered sensors, an IMU, and a radio locator. Ref.
[47] is an unmanned electric field robot, designed by AutoAgri, for open-field crop operations, with a CAT 2 three-point hitch compatible with various tools. RTK-GPS ensures safe navigation, while a LiDAR sensor, a GPS fence and two HD cameras ensure safety for operators. Ref. [48] is an electric tool carrier for vegetable and large-scale crops [41], from Naio Technologies, designed to carry different tools thanks to its large ventral clearance, navigate through the field using RTK-GPS technology, and keep nearby operators safe with a LiDAR sensor, bumpers, and a geo-fencing module. Ref. [49] describes a small, fully electric robot designed for vegetable farms and small-scale growers, from Naio Technologies. It is equipped with GNSS RTK for centimetre-level autonomous navigation and likely uses basic vision or row-following sensors to maintain its path along crop rows. Oz can tow or operate weeding tools such as hoes, harrows, or seeders. Its lightweight design minimizes soil compaction, and its safety system includes automatic stop features in case of error or obstacle detection. Ref. [50] is an electric, straddle-type robot developed for vineyards by Naio Technologies. It features GNSS RTK, LiDAR, and vision sensors to navigate autonomously between vine rows and precisely perform mechanical weeding or canopy maintenance. Ted is compatible with a wide range of tools (e.g., discs, brush weeders, defoliators), and it includes obstacle detection systems, geofencing, and remote supervision capabilities. Its design helps reduce herbicide use and improves working efficiency in perennial crops like grapes. Ref. [51] describes a compact, tracked robot tailored for tight vineyard layouts and sloped terrain. It is equipped with dual 3 kW electric motors, GNSS RTK, and safety bumpers with obstacle detection. Jo operates autonomously between rows, using automated U-turn capabilities and real-time positioning to manage cultivation and weeding tasks.
Its geofencing system ensures safe operation, and it can work with precision tools while minimizing crop disturbance and manual labour needs. Ref. [52] describes a multitasking robot, designed by Free Green Nature specifically for vineyard operations with dedicated tools, able to navigate through the vine rows with an IMU sensor, RTK-GNSS, an AI-powered camera and radar, while also ensuring safety with bumpers and geo-fencing. It can be configured with two different traction settings: rubber tyres and semi-tracked. Ref. [53] is a diesel-powered autonomous robot, designed by Agrointelli [54] for open-field crops, featuring a 3-point hitch with a lifting capacity of up to 1.2 tons, which allows the use of a wide range of agricultural tools. Dual RTK-GPS ensures safe navigation, while a LiDAR sensor, bumpers, and geofencing ensure safety for people nearby. Ref. [55] is a fully electric autonomous tool-carrier, from Amos, powered by rechargeable batteries offering 4–8 h of runtime per charge. It is equipped with an advanced sensor suite including GPS, LiDAR, radar, and stereo vision for precise navigation and obstacle detection. Ref. [56] describes an autonomous multipurpose field robot designed by Earthsense to navigate all-terrain conditions by relying on computer vision and machine learning, thereby eliminating the dependence on RTK-GPS and enabling the robot to predict its navigation path. Suitable for orchards and open-field crops, it can be paired with a variety of tools for comprehensive crop management. Ref. [57] is an electric tool-carrier autonomous robot, from XAG, that can be configured and equipped with a cargo rack, a mower or a spraying system for different field applications, mainly orchard operations.

4.2. Harvesting Robots

Ref. [58] presented a prototype of an autonomous tracked robot for harvesting, fruit detection and monitoring. The complex is composed of a robotic mobile platform and a mobile meteorological station that allow the robot to autonomously harvest, monitor and acquire data about the cultivation. Ref. [59] developed a tracked robotic platform equipped with a 6-DOF arm and a gripping end effector to harvest tomatoes at night in greenhouses. The robot works with YOLOv5+HSV and a vision system composed of a ZED 2i stereo camera and an illumination system, both fixed on the lifting system of the robotic arm. The success rate and harvesting time at night were very similar to those during daytime, demonstrating the robustness of the system. Ref. [60] developed Vegebot, a mobile harvesting platform that uses a bespoke vision and learning system to localize and classify iceberg lettuce. Using a cutting end effector with a soft gripper and a camera, this platform demonstrated a localization success rate of 92% and a classification accuracy of 82%, with an average cycle time of 31.7 s. The harvest success rate was 88.2% and the damage rate was 38%. Ref. [61] proposed an autonomous robotic platform for strawberry picking that integrates a YOLOv4 model to locate mature fruits in RGB images and then operates a Delta-configuration manipulator with a five-finger structure. Tested over five different scenarios, it achieved an overall efficiency of 71.7%, with a minimum of 37.5% (scenario with occlusions and strawberry clusters) and a maximum of 94.0% (easiest scenario). Ref. [62] is a strawberry-harvesting platform, designed by Agrobot, that can deploy up to 24 robotic arms and autonomously navigate through the field or the greenhouse, recognizing the fruits thanks to AI and short-range integrated colour and infrared depth sensors. It also uses a LiDAR sensor for worker safety and has a virtual perimeter that works as a stop fence. Ref.
[63] is a solar-powered harvest-crate transporter, from Ant Robotics, able to follow the rows and detect humans thanks to stereo cameras, without a GPS system, reducing non-productive time and muscle strain on workers. Ref. [64] is a hybrid (diesel-electric) autonomous asparagus harvester, from AVL Motion, that uses an RGB camera, AI and LiDAR to follow the rows and harvest with 12 harvesting modules. Ref. [65] proposes a fully integrated, autonomous and innovative harvesting robot to overcome the challenges posed by the dense growth characteristics of Agaricus bisporus. Ref. [66] developed and validated the necessary components for an autonomous truss tomato harvesting robot, integrating an image processing method based on the YOLOv5 network. Ref. [67] designed and constructed a prototype apple harvesting manipulator that includes a vacuum end-effector with three rotational degrees of freedom, a 3-DoF Cartesian system, an RGB-D camera and a mobile vehicle.
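Several of the harvesters above pair a neural detector (e.g., YOLOv5) with HSV colour filtering to confirm ripeness. The sketch below shows only the HSV-threshold step, using Python's standard colorsys module; the thresholds are illustrative and not taken from the cited systems, which use camera-specific tuning.

```python
# HSV-based ripeness check, of the kind used downstream of a fruit
# detector. Thresholds are illustrative, not from the cited systems.
import colorsys

def is_ripe_red(r, g, b):
    """Classify an RGB pixel (channels in 0..1) as 'ripe red' via HSV thresholds."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    red_hue = h < 0.05 or h > 0.95   # red hue wraps around 0/1
    return red_hue and s > 0.5 and v > 0.2

ripe = is_ripe_red(0.80, 0.10, 0.10)    # deep red pixel -> True
unripe = is_ripe_red(0.20, 0.60, 0.20)  # green pixel -> False
```

In a real pipeline this test would be applied to the pixels inside each detector bounding box, and a fruit would be accepted when the ripe-pixel fraction exceeds a tuned threshold.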

4.3. Mechanical Weeding Robots

Ref. [68] tested an autonomous AWD mowing robot designed for sustainable under-vine weed control in vineyards. The robot is equipped with an RTK-GNSS system for accurate navigation assessment and ultrasonic sensors for obstacle avoidance. It performs mechanical mowing of inter-row and under-row areas using a ventral cutting disc with razor blades. The robot operates autonomously within boundaries predefined by a metal wire, and safety is ensured through emergency stop systems and real-time monitoring. This solution reduces herbicide use and promotes environmentally friendly weed management in perennial cropping systems. Ref. [69] developed and evaluated an autonomous weeding robot for strawberry fields. Based on the DIN-LW-YOLO model, the system uses computer vision to detect strawberry seedlings, weeds, irrigation pipes, and weed growth points, enabling precise laser targeting. By leveraging an irrigation pipe-based navigation system and enhancements in multi-scale attention and deformable convolutions, the robot achieves a 92.6% weed removal accuracy with a 1.2% crop damage rate. Ref. [70] introduced an intelligent intra-row weeding system designed to address challenges in mechanical weeding. The system features an electric swing-type opening and closing mechanism, integrating deep learning-based cabbage detection to enable precise weed removal while minimizing crop damage. Laboratory and field tests demonstrate a weeding accuracy of up to 96.67% at low speeds, with a minimum crop injury rate of 0.83%. However, as speed increases, accuracy declines while crop injury rises. The study highlights the system’s adaptability in real-field environments and suggests future optimizations in control algorithms to enhance response time and precision, supporting the advancement of precision agriculture. Ref. [71] presented the development of a four-wheeled autonomous robot designed for weed and grass management in vineyards.
The study derives a kinematic model of the Vitirover robot and proposes a Model Predictive Control (MPC) strategy to optimize trajectory tracking while prioritizing weed removal. The MPC formulation integrates dynamic weighting to adjust control objectives based on weed detection, ensuring precise navigation and obstacle avoidance. Simulation tests in Gazebo validate the approach, demonstrating effective weed management and path-following capabilities in vineyard environments. The study highlights the robot’s potential for sustainable and autonomous weed control, reducing reliance on chemical herbicides while enhancing efficiency in precision agriculture. Ref. [72] describes a solar-powered robot designed by Moondino for mechanical weeding in rice cultivation that uses RTK-GNSS technology to navigate inside the rows. Ref. [73] describes a lightweight mechanical weeding robot, from Aigro, equipped with a dual RTK-GNSS system and proximity sensors, which are involved in in-row navigation. The framework is a tool-carrier for weeding tools, like mowers or passive tools. Ref. [74] describes a solar-powered autonomous agricultural robot, designed by Aigen, which uses low-power AI to analyze field data and identify areas with weeds for mechanical removal. Ref. [75] describes a solar-powered robot, from Earth Rover, that uses concentrated light and eight built-in cameras to detect, identify and target weeds without harming the cultivation, while scouting the crop and creating a replica of the farm in the control system.
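The receding-horizon idea behind such an MPC strategy can be sketched in a few lines. The snippet below is a deliberately simplified illustration, not the controller from [71]: it assumes a unicycle kinematic model and picks, from a small discrete candidate set, the constant (v, w) pair whose short rollout best tracks a reference path, with a fixed steering-effort penalty standing in for the dynamic weighting described above.

```python
import math

def step(state, v, w, dt=0.1):
    """One step of a unicycle kinematic model; state = (x, y, heading)."""
    x, y, th = state
    return (x + v * math.cos(th) * dt,
            y + v * math.sin(th) * dt,
            th + w * dt)

def mpc_control(state, ref_path, horizon=5, dt=0.1):
    """Return the (v, w) candidate minimizing tracking cost over the horizon."""
    candidates = [(v, w) for v in (0.2, 0.5, 0.8)
                  for w in (-0.6, -0.3, 0.0, 0.3, 0.6)]
    best, best_cost = None, float("inf")
    for v, w in candidates:
        s, cost = state, 0.0
        for k in range(horizon):
            s = step(s, v, w, dt)
            xr, yr = ref_path[min(k, len(ref_path) - 1)]
            cost += (s[0] - xr) ** 2 + (s[1] - yr) ** 2  # tracking error
        cost += 0.1 * w ** 2  # steering-effort penalty (stand-in for MPC weights)
        if cost < best_cost:
            best, best_cost = (v, w), cost
    return best

# A robot at the origin, heading along +x, tracking a straight row:
# mpc_control((0.0, 0.0, 0.0), [(0.08 * (k + 1), 0.0) for k in range(5)])
# selects (0.8, 0.0): full speed ahead, no steering.
```

A production MPC would solve a continuous optimization (e.g., a quadratic program) at every control step rather than enumerating a handful of candidates, but the structure, roll the model forward, score each candidate, apply the best first action, is the same.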

4.4. Pest Control and Chemical Weeding Robots

The RHEA project [76] developed a fleet of autonomous Unmanned Ground Vehicles (UGVs) and Unmanned Aerial Vehicles (UAVs) designed to perform precise and eco-friendly weed and pest control. The UGVs were based on modified commercial tractors and equipped with RTK-GNSS, cameras, LiDAR, ultrasonic sensors, and safety programmable logic controllers. They carried implements for patch spraying, thermal weed control (using liquefied petroleum gas), and canopy spraying in crops like maize, wheat, and olive trees. The UAVs, hexacopter drones, featured multispectral cameras (RGB + NIR) and GNSS, enabling high-resolution weed mapping via image mosaicking and object-based image analysis (OBIA). The robots operated collaboratively through a centralized mission manager, enabling task planning, navigation, and real-time perception. Together, these machines reduced pesticide use by up to 75% while increasing safety and efficiency in precision agriculture. Ref. [77] developed an autonomous tracked spraying vehicle equipped with a stereo camera, GPS and ultrasonic sensors to move autonomously inside greenhouses and mountainous areas while using a smart spraying system capable of optimizing the spraying operations. Ref. [78] developed a weed detection and target spraying robot for cotton fields that employs a CBAM module, a BiFPN structure and a bilinear interpolation algorithm to learn and distinguish weeds from cotton seedlings with a mAP of 98.43%. This weed detection model is deployed on a wheeled spraying robot that achieves an effective spraying rate of 98.93%. Ref. [79] developed a semi-autonomous spraying robot equipped with an integrated anemometer, pressure gauge and flow meter linked to a microprocessor that allow it to make real-time decisions on variables such as pressure, speed and discharge. Ref. [80] provided an enhancement of a flexible robotic spraying platform designed for remote plant inspection using high-quality thermal imagery. 
The platform, originally developed for agricultural spraying, has been upgraded with a thermal/optical camera and advanced electronic systems to enable semi-autonomous remote monitoring. The proposed design ensures lightweight operation, minimal soil compaction, and precise thermal imaging while maintaining cost-effectiveness. The integration of machine vision, GPS, and autonomous navigation capabilities highlights its potential in precision agriculture, supporting sustainable and efficient farming practices. Ref. [81] presented a smart robotic system designed for efficient herbicide application in rice fields using a YOLOv5-based machine learning framework. The system integrates image recognition, AI-driven weed detection, and precise herbicide spraying to optimize weed control while minimizing chemical use. Field experiments demonstrated a high accuracy of 98% in weed identification and a 95% weed control rate, significantly outperforming traditional methods. The results highlight the potential of AI-powered robotics to enhance agricultural efficiency, reduce environmental impact, and improve crop yields, paving the way for sustainable and precision-based farming practices. Ref. [82] developed and evaluated an intelligent multivariable spraying robot for orchards and nurseries. The robot integrates 3D-LiDAR for precise plant detection and employs a multivariable spraying model that adjusts flow rate, air volume, droplet size, and spray direction dynamically. Experimental results demonstrate that the robot significantly reduces pesticide usage, by 83% compared with conventional spraying, while ensuring effective coverage and improved uniformity. The findings highlight the potential of this technology to enhance sustainability and efficiency in precision agriculture. Ref. [83] describes an all-electric, four-wheel-drive autonomous robot from Kilter [84] powered by batteries and guided by AI and vision systems for precision weeding. Ref. 
[85] describes a gasoline-powered, all-wheel-drive sprayer robot with LiDAR and RTK GPS, designed by Yanmar [41] for precision vineyard treatment on slopes and narrow rows. Ref. [86] is an electric, battery-powered robot from Agrobot that uses a vision-guided suction system to detect and remove pests in horticultural crops. Ref. [87] is a battery-electric autonomous sprayer, designed by Ant Robotics, with radar, GPS, and cameras, ideal for precise and quiet operations in tunnels and greenhouses. Ref. [88] is a hybrid diesel-electric 4WD platform, from AgBot, equipped with multispectral sensors and LiDAR for seeding, fertilization, and data collection in large fields. Ref. [89] developed a mobile platform with RGB and IR cameras able to calculate the plant bulk and distribute the most correct amount of chemical solution per plant based on its bulk volume. Ref. [90] is a solar-powered, electric-drive robot, developed by Ecorobotix [91], using AI and vision to identify and microdose weeds for ultra-targeted herbicide application.

4.5. Transplanting Robots

Ref. [92] designed a crawler-type sweet potato transplanting machine, which can accomplish a variety of transplanting trajectories and conduct automatic replanting, without real-time tracking of the machine, using only a proximity switch to mark the initial position of the manipulator arm. Ref. [93] proposes a selective transplanting robot and designs a selective intelligent seedling-picking framework based on deep learning.

4.6. Tillage-Sowing Robots

Ref. [94] describes a fully solar-powered electric robot, from FarmDroid [95], that uses RTK GPS to perform ultra-precise seeding and mechanical weeding without cameras or sensors. Ref. [96] describes a tracked electric robot, from EarthSense, with advanced soil sensors and cameras, designed for microbiological analysis and soil mapping in rugged terrain. Ref. [97] developed a robot capable of performing operations like automatic tilling-sowing and seed dispensing by navigating in the field thanks to ultrasonic sensors. Ref. [98] developed a seed-sowing robot that relies on infra-red sensors to keep track of the path it follows, while using a tillage-sowing attachment.

4.7. Scouting Monitoring Robots

Ref. [99] developed a novel bionic hexapod robot with three levels of sensors for crop scouting: LiDAR, three RGB cameras and three range sensors. Its structure allows it to adapt the clearance depending on the crop height, and the legged structure enables it to move through uneven terrains with ease while acquiring information about the crop and the environment. Ref. [100] describes an electric 4WD robot from Antobot with rechargeable batteries and multispectral vision, used for autonomous crop health and growth monitoring. Ref. [101] is a compact electric-drive robot, designed by EarthSense, with swappable batteries, 3D sensors, and LiDAR for detailed crop phenotyping and data collection. Ref. [102] developed an RGV (reconfigurable ground vehicle) able to navigate through corn rows thanks to its reconfigurable two-track structure and ultrasonic sensors.
Table 1 provides a general overview of the robotic platforms analyzed, with their main features and characteristics.

5. Trends and Discussion

In this section, the results of the analysis conducted on the main characteristics of the selected field robots are presented and discussed. The robotic platforms were analyzed based on eight characteristics: country of provenance, navigation system, engine type, presence on the market, safety sensors, application field (i.e., greenhouse, vineyard, etc.), main task (i.e., harvesting, multi-purpose, etc.) and traction system.

5.1. Country

The analysis of the country of production/design of the machines, shown in Figure 2, highlights China’s dominance in the manufacturing of this type of equipment: 11 out of 59 robots (approximately 19%) come from China. This is largely due to the investment plans that China has implemented in recent years. The Ministry of Agriculture and Rural Affairs (MARA) has launched a plan aimed at integrating digital technologies—such as artificial intelligence, big data, and GPS—into agriculture. By 2028, the plan envisions the creation of a national agricultural big data platform, the adoption of smart technologies in crop farming, livestock, and aquaculture, and the development of over 1000 digital agriculture factories and 100 “farms of the future”. The National Development and Reform Commission has also announced the creation of a state-backed venture capital fund worth around 1 trillion yuan (approximately 138 billion USD), aimed at supporting robotics, AI, and advanced innovation. This fund is intended to foster the development of cutting-edge technologies—including autonomous agricultural robots—and to strengthen China’s global competitiveness in the sector. In 2024, China released an action plan that promotes the use of technologies such as artificial intelligence, biotechnology, and robotics to improve agricultural productivity and food security. The plan includes investments in modern farming equipment and the adoption of digital technologies to better integrate agriculture with modern innovations [103]. The United States accounts for 14% of the robots analyzed (8 out of 59). There is also a strong push in the U.S. toward the development and implementation of these types of machines, driven by the shortage of skilled labour, rising food demand, and climate change, all of which are increasingly pushing farmers to adopt these technologies. This progress is supported by federal investment programmes that fund research in the sector: at the federal level, the USDA (U.S. 
Department of Agriculture) has implemented programmes such as AFRI (Agriculture and Food Research Initiative), SBIR (Small Business Innovation Research), and CSAG (Climate-Smart Agriculture Grants). In addition, there is strong venture capital investment (from DCVC, GV, AgFunder), university and R&D centre funding, public–private partnerships, and tax incentives for private entities (e.g., R&D tax credits) [104,105,106]. France accounts for 12% of the total, with 7 robots out of 59. France is one of the most active European countries in the development and adoption of agricultural robotics, mainly thanks to a national investment plan, France2030, launched in 2021 worth EUR 54 billion, of which around EUR 2 billion were allocated to agriculture and food [107].

5.2. Sensors: Navigation and Safety

The analyzed robot population includes machines equipped with more than one type of sensor (e.g., GPS-RTK + LiDAR + IMU). Therefore, the numbers shown in Figure 3 should be interpreted as percentages relative to the total number of robots, not as the sum of all the detected sensors. The analysis of the navigation systems found within the robot population highlights that GPS-RTK is the most widely adopted navigation system, used by 28 out of 59 robots (approximately 47%). However, GPS-based guidance systems for autonomous agricultural robots face significant limitations when operating under natural or artificial canopy cover, such as vineyard rows, tree canopies, or protective structures like anti-hail nets. In these environments, the satellite signal may be severely weakened or completely obstructed, leading to reduced positioning accuracy or total signal loss. This can cause the robot to deviate from its intended path, interrupt autonomous navigation, or even come to a complete stop, thereby compromising operational efficiency and reliability in real-world field conditions. The second most used sensor is the RGB camera, found in 14 out of 59 robots (24%). This is a low-cost solution for robots that use CNN-based models for plant species recognition, providing high-resolution images. Its main drawback is the lack of depth perception, meaning it cannot estimate object distance on its own. As a result, it typically needs to be combined with other sensors (such as LiDAR or RGB-D) to compensate for this shortcoming. Additionally, it is highly sensitive to ambient light: direct sunlight, strong shadows, and changes between day and night can affect image quality and thus the accuracy of target detection. The third most used navigation sensor, also employed as a standalone system in some robots, is LiDAR, present in 10 out of 59 robots (approximately 17%). 
This sensor is particularly suitable for environments where GPS-based navigation is not feasible due to natural or artificial cover, such as greenhouses. LiDAR accurately detects the distance between the sensor and surrounding objects, generating a real-time 3D map of the environment around the robot—even while it is in motion—thanks to its high refresh rate. Although it is highly reliable and precise, it is not widely implemented due to its higher cost compared to other sensors, as well as its greater computational demands. Moreover, it is sensitive to environmental conditions such as dust or rain, which can compromise the quality of the point cloud the sensor generates.
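The depth that a plain RGB camera cannot measure is exactly what stereo cameras, RGB-D sensors and LiDAR supply. For a rectified stereo pair, depth follows from simple triangulation, Z = f * B / d, where f is the focal length in pixels, B the baseline between the two lenses and d the disparity. A minimal sketch, with assumed example values for focal length and baseline:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth of a point seen by a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("point must appear in both views with positive disparity")
    return focal_px * baseline_m / disparity_px

# Assumed example: 700 px focal length, 12 cm baseline, 42 px disparity
# -> the point lies about 2.0 m from the camera.
```

The relation also explains why accuracy degrades with distance: disparity shrinks as depth grows, so a one-pixel matching error translates into an increasingly large depth error for far-away obstacles.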
Safety sensors are various types of devices used on autonomous robots to ensure safe navigation by allowing them to avoid collisions with people or objects along their working path. These sensors are essential when such machines operate in a designated area without operator supervision, as the absence or presence of people or obstacles cannot be verified in real time. Figure 4, generated from the data analysis, highlights which sensors are most used for safety purposes, and which are less preferred. Excluding the “not specified” category—which accounts for 37 out of 59 robots, where safety sensor information was not explicitly stated (although safety sensors are often the same as those used for navigation)—the most frequently used sensor is the LiDAR, found on 12 robots, like [62] and [90]. As mentioned earlier, LiDAR offers several advantages, such as providing accurate distance measurements from surrounding objects and generating a virtual 3D map of the environment. This enables the robot to assess whether it is operating in a clear area or if obstacles are present that might hinder its work. However, LiDAR is less effective in environments with a lot of dust—a common condition in agriculture, especially in hot and dry regions. Bumpers, or contact sensors, are also widely used (10 out of 59 robots). These detect physical impacts: the collision must occur at a reduced speed to avoid damage to people or objects [44]. Therefore, bumpers are usually paired with ultrasonic sensors or other types of proximity sensors that detect obstacles in advance, prompting the machine to slow down. If the obstacle remains in place, contact occurs, and the robot either stops or changes trajectory. The third most common safety sensor is the geo-fence, present on nine robots. This typically consists of a GPS module that defines a virtual perimeter within which the robot must stay during operation [47,52,53]. 
If the robot crosses outside of this perimeter, it stops because it no longer recognizes the designated safe work area. However, this sensor cannot detect obstacles along the path, so it is usually combined with other types of sensors that provide obstacle detection. This ensures the machine stays within its operational zone and avoids potential hazards. An increasing number of robots are also equipped with stereo cameras (8 out of 59) [45,63]. Unlike standard RGB cameras, stereo cameras can perceive depth and distance, making them well-suited for identifying both the work environment and surrounding obstacles. The only downside is the complexity of image processing, which requires significant computational power on the software side.
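The geo-fence check described above reduces to a point-in-polygon test on the robot's GNSS position. The sketch below uses the standard ray-casting algorithm on a planar approximation (adequate at field scale); the square field and the coordinate units are illustrative assumptions.

```python
def inside_geofence(point, fence):
    """Ray-casting point-in-polygon test.
    point: (x, y) position; fence: ordered list of (x, y) perimeter vertices."""
    x, y = point
    inside = False
    n = len(fence)
    for i in range(n):
        x1, y1 = fence[i]
        x2, y2 = fence[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray through point
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:       # crossing lies to the right of the point
                inside = not inside
    return inside

# Example perimeter: a robot whose fix falls outside this square would stop.
field = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
```

In practice the robot polls its RTK-GNSS fix at a fixed rate and halts, or re-plans, as soon as the test returns False, which is why geo-fencing is normally paired with obstacle-detecting sensors rather than used alone.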

5.3. Engine

This section analyses the different types of powertrains present within the selected pool of robots. Figure 5 shows that in 63% of the cases, the robots are powered by electric motors, supplied by rechargeable batteries. The advantage of using an electric motor is that it is completely independent of fossil fuels (diesel or gasoline), thereby eliminating direct CO2 emissions. Additionally, it simplifies the system architecture, making maintenance easier. Electric motors are particularly suitable for small to medium-sized robots like [39] or [61], which do not require high power, but can also be deployed on bigger machines like [40]. This allows for a reduction in the overall size of the system by eliminating the need for auxiliary components required by internal combustion engines. Moreover, electric motors are far more energy efficient: while diesel and gasoline engines typically reach an efficiency of only 30–33%, electric motors can exceed 90%. For large-sized robots, which require higher power due to the heavy-duty nature of their operations (e.g., large-scale soil preparation with heavy implements), hybrid systems are a good compromise. They help reduce fuel consumption (and thus greenhouse gas emissions) while ensuring the machine has sufficient operating autonomy—something that would be difficult to achieve with battery-powered electric motors alone. Battery charging times are still relatively long (2–3 h), whereas a fuel tank for a combustion engine can be refilled in just a few minutes. Hybrid systems (diesel + electric) are found in 8% of the robots analyzed in this study. A clear example is [42], an autonomous robot used for open-field operations, powered by a 115 kW (156 hp) diesel engine with a maximum torque of 610 Nm. Notably, there is also a category of electric robots powered by solar energy, which accounts for 12% of the robots analyzed. 
These robots are equipped with solar panels that convert sunlight into electricity, which then charges onboard storage batteries used to power the entire system [74,75,80]. These robots can operate for significantly longer periods than conventional electric robots, which rely on wired charging stations, as they recharge while in operation. However, they are weather-dependent: in the absence of sunlight or during bad weather, battery charging may be compromised. The integration of photovoltaic panels directly on autonomous platforms reduces dependence on fossil fuels and fixed charging infrastructures, thereby improving both environmental and operational sustainability [108]. Recent studies have highlighted how energy harvesting systems, and, in particular, flexible solar cells, can enhance the autonomy and adaptability of robots in complex agricultural scenarios [109]. Some practical applications already demonstrate the feasibility of this technology: [110] developed a solar-powered robot for weeding and crop monitoring, while the Agri.q project introduced a lightweight platform equipped with a manipulator and a photovoltaic drone-support base [111]. Other prototypes include multifunctional solar-powered units for irrigation and targeted spraying [112].
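The efficiency gap quoted above (30–33% for combustion engines versus over 90% for electric motors) can be made concrete with a back-of-the-envelope calculation. The daily workload and the diesel energy density below are illustrative assumptions, not figures from the reviewed platforms.

```python
DIESEL_KWH_PER_LITRE = 10.0  # approximate energy content of diesel fuel (assumption)

def input_energy_kwh(mechanical_work_kwh, efficiency):
    """Energy that must be supplied to deliver a given amount of mechanical work."""
    return mechanical_work_kwh / efficiency

# Illustrative day of field work requiring 50 kWh of mechanical output:
work = 50.0
diesel_in = input_energy_kwh(work, 0.32)    # ~156 kWh of fuel energy
litres = diesel_in / DIESEL_KWH_PER_LITRE   # ~16 L of diesel burned
battery_in = input_energy_kwh(work, 0.90)   # ~56 kWh drawn from the battery
```

The roughly threefold difference in input energy is what makes electric drives attractive despite their longer recharging times.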

5.4. Presence on the Market

The agricultural robots examined in this study originate from both commercial sources—i.e., machines already available on the market—and scientific sources, meaning prototypes developed by research centres and universities. These two categories account for 51% and 39% of all the machines analyzed, respectively. An additional 3% of the robots are developed by start-ups [96], while research prototypes [88] and commercial–research hybrids [71] account for 2% and 5% of the total, respectively. Figure 6 shows their distribution.
The observed distribution, in which 51% of the machines are classified as commercial and the remaining 49% as prototypes, can be interpreted as the outcome of a gradual technological maturation process, strongly influenced by targeted incentive policies and investments in the agritech sector. In recent years, numerous funding programmes have supported the transition from experimental research to industrial deployment, enabling several technological solutions to move beyond the prototypal stage and establish themselves in the market. Publicly funded research projects have played a key role in this process, fostering innovation initiatives through competitive grants and financial instruments aimed at agricultural digitalization and sustainability. At the same time, increasing private-sector involvement has accelerated commercialization, with industrial partnerships and venture capital investments directed toward agricultural robotics and automation. This synergy has facilitated the development of autonomous machines that are not only technically reliable but also economically viable for large-scale adoption. Another relevant factor explaining the predominance of commercial machines over prototypes is the growing market demand for solutions capable of reducing operational costs and addressing environmental sustainability challenges. These drivers have created a favourable context for the diffusion of consolidated agricultural robots, while prototypal developments continue to function as an innovation engine, introducing new functionalities and enabling adaptation to diverse agricultural scenarios.
The main differences between a commercial product and a prototype lie in the design stage: one is ready to be sold, while the other has yet to be tested and refined, and the costs differ accordingly. Producing a commercial robot costs less than building the same machine as a prototype: commercial production relies on industrial processes that speed up manufacturing, and components are selected after extensive testing, whereas building a prototype means choosing components to test, often replacing them afterwards, with considerable work required before the machine is operational. Since commercial products derive from earlier prototypes, features such as battery life or sensor precision are already tested and improved: because these products are sold, they are expected to last and to work reliably. Prototype robots are often tested with lead-acid batteries, which are cheaper but less performant than the lithium-ion batteries used in commercial products. During the initial testing phase, lead-acid batteries help reduce costs and allow verification of whether the robot's architecture can actually function.

5.5. Application Field

The field of application indicates the type of production for which the machine was designed—that is, the environment in which the machine performs best, considering the equipment it is fitted with, its size, and the type of traction system it uses. The analysis carried out in this study identified five main fields of application: vineyards, orchards, open-field crops, greenhouses, and crops under solar panels. Some robots are specialized not only by application field but also by specific crop, featuring AI and computer vision systems trained to recognize a single species. In such cases, only the general application field (e.g., greenhouses) was considered, to avoid overly fragmented analysis. As shown in Figure 7a, most autonomous robots are primarily (but not exclusively) designed to work in open-field crops [61,70,78,80] (e.g., arable and vegetable crops on large areas), whereas orchard applications show slightly less—though still significant—interest (16 out of 54 robots). In orchards, robots are mainly used during the harvesting phase, as this task is time-consuming and labour-intensive, resulting in high personnel costs for the farmer due to the need for specialized labour. In greenhouses, research focuses primarily on pesticide application and the harvesting of fruits or vegetables. Pesticide application in greenhouses poses a greater risk to the operator due to the enclosed environment and limited air circulation, while the rationale for automating harvesting is similar to that in orchards. As previously mentioned, in some cases a single robot can be used in more than one application field. Figure 7b shows the distribution of robots into three categories—mono-use, double-use, and triple-use—indicating whether a robot is used for one, two, or three purposes. This reflects the versatility of the robot. 
As shown in Figure 7b, most robots fall under the mono-use category (38 out of 54), 14 out of 54 are in the double-use category [44,47,56,82], and only 2 out of 54 are classified as triple-use [43,99]. In many double-use cases, robots can be deployed both in open-field crops and in orchards: these are often compact or versatile machines that can easily navigate between tree rows as well as crop rows in open fields (e.g., corn, sunflower, etc.). Even highly specific robots designed for vineyard use can often be employed in other tree crops. A clear example is [40], which can be used both in vineyards and in other wall-trained crops, such as intensive olive groves.

5.6. Main Task

The main task refers to the classification of robots based on the primary action they perform within their working environment. Seven different categories were identified, within which each robot was placed. Figure 8 illustrates the distribution of tasks performed by the robots selected for this study. The most common task is multi-purpose, with over a quarter of the robots falling into this category (26%) [39,40,42,43,44,45,46,47,48,52,53,55,56,57]. This group includes both multitasking robots, which are designed with multiple tools integrated into the robot’s structure to perform several operations simultaneously, and tool-carrier robots, which function similarly to traditional tractors and can be paired with different custom-made implements. Usually, they consist of a central body that serves as the driving unit and distributes power to the various working components through hydraulic, electric, or mechanical systems: the same machine, by changing the tool, can operate in different situations, adjusting engine speed, hydraulic oil flow, or electric current, thereby adapting to the various implements. Nearly a quarter of the robots (24%) are used for pest control and chemical weeding [77,78,79,80,81,82,83,85,86,87,88,89,90]—robots that spray plant protection products to combat fungal diseases, insect infestations, and weed growth. The innovation in this category lies in the sensor systems these robots are equipped with, allowing for precise and targeted applications of chemicals. This not only reduces the quantity of chemicals used but also eliminates the operator’s exposure to potentially harmful substances. Almost one fifth of the robots (19%) fall under the harvesting category [58,59,60,61,62,63,64,65,66,67]. These are primarily used to harvest fruits, vegetables, and mushrooms. 
Harvesting is one of the most critical stages in agricultural production due to the significant amount of time, specialized labour, and care it requires to avoid damaging the product during collection and storage. Robots in this category are equipped with image recognition sensors (e.g., RGB-D cameras, LiDAR) that enable them to identify and collect the target crops. Mechanical weeding accounts for 13% of the robots [69,70,71,72,73,74,75]. These robots control weed growth through mechanical methods such as soil-working tines, lasers, or cutting heads. This category is particularly important because it enables weed control without the use of herbicides, offering significant benefits for the crop, the soil, the final consumer, and farm workers. This type of machine can be equipped with implements for mowing, shallow soil tillage, or for the mechanical removal or devitalization of weeds. These implements can also be harmful to the cash crop; therefore, artificial intelligence systems are trained to recognize weeds in order to avoid damaging the crop, together with navigation systems such as RTK-GPS, which allow the robot to maintain straight trajectories within row crops planted in parallel lines. The scouting–monitoring category represents 7% of the robots [99,100,101,102] and is experiencing strong growth. These robots are used for crop monitoring, which is essential for forecasting models and supporting farmers’ decision-making processes. They do not perform mechanical actions on crops or soil, but rather observe the environment, providing processed data about crop health and the presence of pathogens through optical sensors, such as cameras. An equal share (7%) of robots falls under the tillage-sowing category [94,96,97,98], which includes robots for open-field sowing. Finally, 4% of the robots are used for transplanting [92,93], i.e., transplanting seedlings either in open fields or in greenhouses. 
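Maintaining straight trajectories within parallel crop rows, as RTK-GPS guidance allows, is typically implemented by steering on the cross-track error from an AB reference line. A minimal planar sketch follows; the coordinate frame and sign convention are assumptions for illustration, not taken from any specific robot.

```python
import math

def cross_track_error(a, b, p):
    """Signed lateral distance of point p from the A->B reference line.
    Positive values mean p lies to the left of the travel direction.
    All arguments are (x, y) tuples in a local planar frame."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)
    # 2D cross product of AB with AP, normalized by |AB|
    return (dx * (py - ay) - dy * (px - ax)) / length
```

A guidance loop feeds this error to a steering controller (e.g., proportional or pure-pursuit): a positive error steers the robot right, a negative error steers it left, keeping the platform centred between the planted lines.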
It is also essential to highlight the enabling intelligent technologies that underpin the autonomy, robustness, and scalability of agricultural field robots. Recent research has shown that advanced actuator control systems, particularly Model Predictive Control (MPC), are crucial for achieving precise and energy-efficient manipulation in highly variable and unstructured environments, allowing robots to stabilize soft manipulators, follow constrained trajectories, and optimize repetitive tasks such as spraying or harvesting [22,113]. Equally central are modern data processing methods, which integrate heterogeneous sensor inputs—ranging from RGB and depth cameras to LiDAR, GNSS/IMU, and multispectral sensors—through sensor fusion techniques supported by artificial intelligence. The adoption of edge computing solutions enables real-time inference directly on embedded devices, reducing latency and dependence on cloud services while sustaining computationally demanding tasks such as weed classification, fruit detection, and crop monitoring [114,115,116]. At the same time, the edge-cloud continuum provides the backbone for digital twin implementations, in which virtual replicas of fields, crops, or entire robotic fleets are employed to simulate scenarios, anticipate risks, and optimize decision-making, often in combination with reinforcement learning strategies that enhance long-term adaptability [25,117,118]. Communication architectures represent another key enabling technology: hybrid solutions that combine high-bandwidth 5G or Wi-Fi connectivity with low-power long-range protocols such as LoRa/LoRaWAN have been successfully deployed to guarantee reliable coverage in heterogeneous agricultural landscapes, while multi-agent systems and swarm robotics paradigms ensure fleet coordination, cooperative perception, and resilient task allocation [119,120]. 
Advances in artificial intelligence further contribute to crop and weed recognition, disease detection, and predictive yield estimation, increasingly relying on transformer architectures, multimodal learning, and extreme learning machines to cope with complex agricultural variability [121]. Human–robot interaction also plays a pivotal role, with intuitive interfaces such as augmented and virtual reality tools, gesture control, or wearable sensors supporting collaborative operations, while enhanced safety frameworks based on multimodal perception ensure that robots can operate safely in proximity to humans [7]. Finally, energy management technologies are becoming indispensable for large-scale deployment: hybrid propulsion strategies, solar-electric configurations, advanced battery management systems, and autonomous docking and recharging stations powered by renewable sources are enabling longer operational autonomy and more sustainable farming practices [26,28]. Collectively, these intelligent technologies form a cohesive ecosystem that not only increases the robustness and adaptability of autonomous agricultural robots but also ensures their scalable integration into the broader framework of digital agriculture.

5.7. Traction Systems

Traction systems are the components that allow agricultural machines to move, and they can be of various types. In this study, six different types were identified, some differing in shape and others in the number of driven components. Figure 9 shows the percentage distribution of traction systems within the robot population analyzed. In total, 37% of the robots are 4WD [39,61,66,70]: this is a significant advantage in agricultural environments, especially in less structured areas where the machine operates on rough terrain and therefore requires all movement components to be driven. A total of 29% of the robots are 2WD [60,63,65], specifically rear-wheel driven: these robots are suitable for more structured terrain where integral traction is not needed, allowing the front wheels to serve only for steering, which enables sharper and quicker turns. A further 26% use rubber tracks [58,67,77,82], which are well suited for operations requiring high grip and pulling power, while also providing greater machine stability and reducing ground pressure, which is particularly useful for heavy and very heavy robots. The remaining 8% is divided as follows: 4% three-wheeled (2WD) configurations [62,106], and 2% each for flexible legs [99] and a convertible platform running on four wheels (4WD) or on two wheels and two tracks [52].

5.8. Future Perspectives

Agricultural robotics addresses multiple needs, ranging from decision-making support to weed removal, and has become an essential component of the farms of the future. Alongside advances in robotics, artificial intelligence continues to evolve, providing robotic platforms with decision-making capabilities and environmental recognition systems that enable them to operate effectively in complex and unstructured situations. A key driver of this sector is the need to increase agricultural production in response to the continuous rise in global population, combined with the limited availability of arable land, which pushes research towards new methods of enhancing productivity. Robotics enables highly precise actions with superior control over agricultural practices, reducing input usage while increasing output, and facilitates the removal of humans from repetitive, low-value tasks, thereby minimizing occupational hazards. Assisted steering systems in conventional machinery have already improved productivity by ensuring accurate alignment during tasks such as sowing, and the implementation of autonomous guidance systems in robotic platforms further eliminates human errors while removing operators from potentially hazardous environments, enhancing safety and freeing labour for higher value, less physically demanding activities. The classification of agricultural robots according to their primary tasks reveals a growing diversity of autonomous systems, including multi-purpose, pest control, harvesting, mechanical weeding, scouting and monitoring, tillage and sowing, and transplanting robots.
Future developments are expected to increase the versatility, intelligence, and autonomy of these machines. Multi-purpose robots are adopting modular designs and advanced tool-switching mechanisms for flexible operations; pest control robots are integrating high-precision sensors and AI for targeted applications that reduce chemical use; harvesting robots are improving image recognition and dexterity for delicate crop handling; mechanical weeding robots are leveraging AI for precise navigation and weed detection; and scouting and monitoring robots are employing multi-spectral and hyperspectral sensors, predictive algorithms, and advanced data processing for real-time crop assessment. Robots for tillage, sowing, and transplanting are anticipated to achieve greater autonomy and precision through adaptive path planning, soil condition assessment, and optimized seedling placement, collectively enhancing efficiency, sustainability, and resilience in agricultural operations. Autonomous agricultural robots will also increasingly rely on multi-sensor integration and AI-driven systems to improve navigation, perception, and operational safety: GPS-RTK combined with LiDAR, stereo cameras, and IMUs will provide reliable positioning under canopy cover; vision systems are evolving toward multi- and hyperspectral imaging to enhance crop recognition; and LiDAR is becoming more cost-effective and resilient for high-resolution mapping and obstacle detection. Safety systems, including bumpers and geo-fences, are expected to incorporate predictive and adaptive algorithms for proactive collision avoidance, while advances in edge computing and onboard processing will enable real-time analysis of large datasets, supporting autonomous decision-making, adaptive behaviours, and coordinated multi-robot operations.
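As a concrete illustration of the geo-fence safety checks mentioned above, a minimal point-in-polygon test (a hypothetical sketch, not taken from any reviewed platform) can decide whether the latest positioning fix still lies inside the permitted field boundary, triggering an emergency stop otherwise:

```python
def inside_geofence(point, boundary):
    """Ray-casting point-in-polygon test: True if `point` (x, y) lies
    inside the closed polygon `boundary`, a list of (x, y) vertices
    in a local metric frame (e.g., projected RTK-GPS fixes)."""
    x, y = point
    inside = False
    n = len(boundary)
    for i in range(n):
        x1, y1 = boundary[i]
        x2, y2 = boundary[(i + 1) % n]
        # Count crossings of a horizontal ray cast from the point:
        # an odd number of crossings means the point is inside.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical rectangular field boundary, 50 m x 30 m, in metres.
field = [(0, 0), (50, 0), (50, 30), (0, 30)]
```

In a real deployment this check would run at the positioning rate and feed the same stop logic used by bumpers and LiDAR obstacle detection.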
Engine technologies in agricultural robots are also evolving, with a gradual shift toward electric motors for less demanding tasks, while hybrid or internal combustion systems provide power for high-load operations; this highlights the need for improved battery performance and faster recharging to extend operational time. The analysis of application fields demonstrates the adaptability of autonomous robots across vineyards, orchards, open-field crops, greenhouses, and crops under solar panels. Many machines can operate across multiple fields thanks to modular designs, AI, and crop-specific computer vision systems, enhancing precision agriculture, labour optimization, and cost efficiency. Traction systems further influence performance and versatility: four-wheel drive, two-wheel drive, rubber tracks, flexible legs, and hybrid wheel-track configurations enable robots to navigate diverse terrains while maintaining stability and minimizing soil compaction, and future innovations may include modular traction platforms and AI-assisted drive control. Finally, the introduction of autonomous agricultural robots raises ethical considerations, as automation reshapes rural production systems and labour structures, potentially reducing employment opportunities and impacting cultural and social cohesion. By conceptualizing robots as tools that alleviate strenuous tasks and support new skill development, rather than as replacements for human labour, automation can foster safer, more efficient, and socially sustainable agricultural practices, aligning technological progress with environmental, economic, and community resilience objectives.

6. Conclusions

Agricultural operations pose significant challenges for robotics, due to unpredictable weather, uneven terrains, and narrow time windows. Over the past decade, considerable progress has been achieved, with specialized and multi-tasking robots now addressing repetitive, hazardous, and labour-intensive tasks, improving worker safety and quality of life while reducing labour costs and greenhouse gas emissions. However, current European regulations (Machinery Regulation (EU) 2023/1230) still require direct operator supervision, preventing these platforms from being considered fully autonomous. What is needed instead is a supervisory figure capable of managing multiple robots simultaneously, rather than continuously monitoring each machine. From a technological standpoint, key challenges remain. Navigation systems must be made more robust to ensure reliable operation in unstructured environments where weather conditions, soil moisture, and canopy coverage affect perception, traction, and obstacle detection. Autonomy is strongly influenced by the interaction between motor type, power requirements, and operational load. Fleets of small, lightweight robots powered by electric motors are promising due to low energy demand and reduced soil impact, but remain limited by battery capacity and recharging needs. Larger machines, often based on internal combustion engines, provide the power required for demanding operations but are associated with high greenhouse gas emissions and greater soil compaction. The development of high-power electric motors for heavy tasks, together with improvements in battery density, charging efficiency, and autonomous docking at renewable-powered stations, represents a fundamental step toward sustainable automation. 
Different strategies will likely coexist: fleets of small robots for tasks such as weeding, medium-sized platforms with hybrid powertrains, large tool-carrier robots for extensive operations, and straddle-type machines for orchards and vineyards. Each solution presents trade-offs in terms of power, flexibility, and cost. Research is increasingly focused on small fleets for their adaptability and environmental benefits, while large robots remain necessary for high-energy operations but require significant innovation in propulsion and navigation systems. In conclusion, the future of agricultural robotics will depend on advances in autonomy, powertrain technologies, and navigation reliability, as well as on regulatory frameworks that enable safe but practical deployment. Addressing these aspects will allow robotic platforms to become fully integrated, effective, and sustainable tools for modern farming.
Machine proposal: the concept of an ideal autonomous agricultural machine envisions a modular and versatile platform capable of performing a wide range of operations—from sowing and weeding to spraying, harvesting, and crop monitoring—through interchangeable implements and advanced AI-based control systems. Such a machine should integrate computer vision, multi-sensor fusion, and adaptive decision-making to ensure reliable operation under unstructured and variable field conditions, including uneven terrain, fluctuating soil moisture, and canopy cover. From an energy perspective, a hybrid propulsion strategy appears most promising, with electric motors prioritized for low- and medium-load tasks to minimize emissions, complemented by hybrid systems for energy-intensive operations until high-capacity batteries and rapid charging technologies become widely available. To further enhance sustainability, autonomous docking and recharging stations powered by renewable energy sources such as solar or wind systems should be developed. Navigation and traction systems would also need to be highly adaptive, relying on GPS-RTK, LiDAR, multispectral imaging, and AI-driven path planning in combination with modular wheel, track, or hybrid drive configurations to guarantee stability, efficiency, and minimal soil compaction across diverse environments. Finally, this ideal machine should operate within a coordinated fleet paradigm, supervised by a single operator overseeing multiple units rather than continuously monitoring individual robots, thereby improving scalability, reducing labour demand, and ensuring safer, more sustainable, and economically viable agricultural practices.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/agronomy15092185/s1.

Author Contributions

Conceptualization, M.R., C.F. and A.P.; methodology, M.F. (Mattia Fontani) and M.F. (Marco Fontanelli); software, S.M.L.; validation, A.P., M.R., S.M.L. and L.G.; formal analysis, M.F. (Mattia Fontani).; investigation, M.F. (Mattia Fontani) and M.F. (Marco Fontanelli); resources, A.P.; data curation, S.M.L. and L.G.; writing—original draft preparation, M.F. (Marco Fontanelli) and M.F. (Mattia Fontani); writing—review and editing, S.M.L., A.P., M.F. (Marco Fontanelli) and M.R.; visualization, S.M.L., M.R., C.F. and L.G.; supervision, A.P. and M.F. (Marco Fontanelli); project administration, M.F. (Marco Fontanelli) and M.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Kennedy, J.; Hurtt, G.C.; Liang, X.-Z.; Chini, L.; Ma, L. Changing Cropland in Changing Climates: Quantifying Two Decades of Global Cropland Changes. Environ. Res. Lett. 2023, 18, 064010. [Google Scholar] [CrossRef]
  2. Crippa, M.; Solazzo, E.; Guizzardi, D.; Monforti-Ferrario, F.; Tubiello, F.N.; Leip, A. Food Systems Are Responsible for a Third of Global Anthropogenic GHG Emissions. Nat. Food 2021, 2, 198–209. [Google Scholar] [CrossRef]
  3. Raj, R.; Kumar, S.; Lal, S.P.; Singh, H.; Pradhan, J.; Bhardwaj, Y. A Brief Overview of Technologies in Automated Agriculture: Shaping the Farms of Tomorrow. Int. J. Environ. Clim. Change 2024, 14, 181–209. [Google Scholar] [CrossRef]
  4. Duckett, T.; Pearson, S.; Blackmore, S.; Grieve, B. Agricultural Robotics: The Future of Robotic Agriculture; UKRAS White Papers; EPSRC UK-RAS Network: London, UK, 2018. [Google Scholar]
  5. Hamilton, S.F.; Richards, T.J.; Shafran, A.P.; Vasilaky, K.N. Farm Labor Productivity and the Impact of Mechanization. Am. J. Agric. Econ. 2021, 104, 1435–1459. [Google Scholar] [CrossRef]
  6. Abella, M. The Prosperity Paradox: Fewer and More Vulnerable Farm Workers by Philip Martin, Oxford, Oxford University Press, 2021, xix + 213 pp.; Oxford University Press: Oxford, UK, 2021; Volume 59, pp. 230–233. [Google Scholar] [CrossRef]
  7. Yerebakan, M.O.; Hu, B. Human–Robot Collaboration in Modern Agriculture: A Review of the Current Research Landscape. Adv. Intell. Syst. 2024, 6, 2300823. [Google Scholar] [CrossRef]
  8. Tagarakis, A.C.; Benos, L.; Aivazidou, E.; Anagnostis, A.; Kateris, D.; Bochtis, D. Wearable Sensors for Identifying Activity Signatures in Human-Robot Collaborative Agricultural Environments. In Proceedings of the 13th EFITA International Conference, Online, 19 November 2021; MDPI: Basel, Switzerland, 2021; p. 5. [Google Scholar]
  9. Adamides, G.; Edan, Y. Human–Robot Collaboration Systems in Agricultural Tasks: A Review and Roadmap. Comput. Electron. Agric. 2023, 204, 107541. [Google Scholar] [CrossRef]
  10. Castro, H. Autonomous Field Robotics: Optimizing Efficiency and Safety Through Data; ResearchGate: Berlin, Germany, 2023. [Google Scholar]
  11. Li, X.; Ma, N.; Han, Y.; Yang, S.; Zheng, S. AHPPEBot: Autonomous Robot for Tomato Harvesting Based on Phenotyping and Pose Estimation. arXiv 2024, arXiv:2405.06959. [Google Scholar] [CrossRef]
  12. Zhang, Z.; Kayacan, E.; Thompson, B.; Chowdhary, G. High Precision Control and Deep Learning-Based Corn Stand Counting Algorithms for Agricultural Robot. Auton. Robot. 2020, 44, 1289–1302. [Google Scholar] [CrossRef]
  13. Ahmadi, A.; Halstead, M.; McCool, C. Towards Autonomous Visual Navigation in Arable Fields. arXiv 2022, arXiv:2109.11936. [Google Scholar] [CrossRef]
  14. Bras, A.; Montanaro, A.; Pierre, C.; Pradel, M.; Laconte, J. Toward a Better Understanding of Robot Energy Consumption in Agroecological Applications. arXiv 2024, arXiv:2410.07697. [Google Scholar] [CrossRef]
  15. Cole, J. Autonomous Robotics in Action: Performance and Safety Optimization with Data; ResearchGate: Berlin, Germany, 2023. [Google Scholar]
  16. Padhiary, M.; Kumar, A.; Sethi, L.N. Emerging Technologies for Smart and Sustainable Precision Agriculture. Discov. Robot. 2025, 1, 6. [Google Scholar] [CrossRef]
  17. What Is a Robot?|WIRED. Available online: https://www.wired.com/story/what-is-a-robot/ (accessed on 11 March 2025).
  18. Lowenberg-DeBoer, J.; Huang, I.Y.; Grigoriadis, V.; Blackmore, S. Economics of Robots and Automation in Field Crop Production. Precis. Agric. 2020, 21, 278–299. [Google Scholar] [CrossRef]
  19. Shaikh, T.A.; Mir, W.A.; Rasool, T.; Sofi, S. Machine Learning for Smart Agriculture and Precision Farming: Towards Making the Fields Talk. Arch. Comput. Methods Eng. 2022, 29, 4557–4597. [Google Scholar] [CrossRef]
  20. Ayoub Shaikh, T.; Rasool, T.; Rasheed Lone, F. Towards Leveraging the Role of Machine Learning and Artificial Intelligence in Precision Agriculture and Smart Farming. Comput. Electron. Agric. 2022, 198, 107119. [Google Scholar] [CrossRef]
  21. Saiz-Rubio, V.; Rovira-Más, F. From Smart Farming towards Agriculture 5.0: A Review on Crop Data Management. Agronomy 2020, 10, 207. [Google Scholar] [CrossRef]
  22. Xie, D.; Chen, L.; Liu, L.; Chen, L.; Wang, H. Actuators and Sensors for Application in Agricultural Robots: A Review. Machines 2022, 10, 913. [Google Scholar] [CrossRef]
  23. Vougioukas, S.G. Agricultural Robotics. Annu. Rev. Control Robot. Auton. Syst. Agric. Robot. 2019, 2, 365–392. [Google Scholar] [CrossRef]
  24. Upadhyay, A.; Zhang, Y.; Koparan, C.; Rai, N.; Howatt, K.; Bajwa, S.; Sun, X. Advances in Ground Robotic Technologies for Site-Specific Weed Management in Precision Agriculture: A Review. Comput. Electron. Agric. 2024, 225, 109363. [Google Scholar] [CrossRef]
  25. Liu, L.; Yang, F.; Liu, X.; Du, Y.; Li, X.; Li, G.; Chen, D.; Zhu, Z.; Song, Z. A Review of the Current Status and Common Key Technologies for Agricultural Field Robots. Comput. Electron. Agric. 2024, 227, 109630. [Google Scholar] [CrossRef]
  26. Gonzalez-de-Soto, M.; Emmi, L.; Benavides, C.; Garcia, I.; Gonzalez-de-Santos, P. Reducing Air Pollution with Hybrid-Powered Robotic Tractors for Precision Agriculture. Biosyst. Eng. 2016, 143, 79–94. [Google Scholar] [CrossRef]
  27. Wilson, J.N. Guidance of Agricultural Vehicles—A Historical Perspective. Comput. Electron. Agric. 2000, 25, 3–9. [Google Scholar] [CrossRef]
  28. Lochan, K.; Khan, A.; Elsayed, I.; Suthar, B.; Seneviratne, L.; Hussain, I. Advancements in Precision Spraying of Agricultural Robots: A Comprehensive Review. IEEE Access 2024, 12, 129447–129483. [Google Scholar] [CrossRef]
  29. MarketsandMarkets. Available online: https://www.marketsandmarkets.com/ (accessed on 11 July 2025).
  30. Grand View Research. Available online: https://www.researchandmarkets.com/ (accessed on 11 July 2025).
  31. Spherical Insights. Available online: https://www.sphericalinsights.com/ (accessed on 11 July 2025).
  32. Ritchie, H. Food Production Is Responsible for One-Quarter of the World’s Greenhouse Gas Emissions—Our World in Data. Available online: https://ourworldindata.org/food-ghg-emissions (accessed on 5 December 2024).
  33. 2022 Census of Agriculture|USDA/NASS. Available online: https://www.nass.usda.gov/Publications/AgCensus/2022/ (accessed on 11 July 2025).
  34. O’Meara, P. The Ageing Farming Workforce and the Health and Sustainability of Agricultural Communities: A Narrative Review. Aust. J. Rural Health 2019, 27, 281–289. [Google Scholar] [CrossRef]
  35. Adetunji, C.O.; Hefft, D.I.; Olugbemi, O.T. Agribots: A Gateway to the next Revolution in Agriculture. In AI, Edge and IoT-Based Smart Agriculture; Elsevier: Amsterdam, The Netherlands, 2022; pp. 301–311. ISBN 978-0-12-823694-9. [Google Scholar]
  36. Sivasangari, A.; Teja, A.K.; Gokulnath, S.; Ajitha, P.; Gomathi, R.M. Vignesh Revolutionizing Agriculture: Developing Autonomous Robots for Precise Farming. In Proceedings of the 2023 International Conference on Inventive Computation Technologies (ICICT), Lalitpur, Nepal, 26 April 2023; IEEE: New York City, NY, USA, 2023; pp. 1461–1468. [Google Scholar]
  37. Shahrooz, M.; Talaeizadeh, A.; Alasty, A. Agricultural Spraying Drones: Advantages and Disadvantages. In Proceedings of the 2020 Virtual Symposium in Plant Omics Sciences (OMICAS), Bogotá, Colombia, 23 November 2020; IEEE: New York City, NY, USA, 2020; pp. 1–5. [Google Scholar]
  38. Adekola Adebayo, R.; Constance Obiuto, N.; Clinton Festus-Ikhuoria, I.; Kayode Olajiga, O. Robotics in Manufacturing: A Review of Advances in Automation and Workforce Implications. Int. J. Adv. Multidiscip. Res. Stud. 2024, 4, 632–638. [Google Scholar] [CrossRef]
  39. Otani, T.; Itoh, A.; Mizukami, H.; Murakami, M.; Yoshida, S.; Terae, K.; Tanaka, T.; Masaya, K.; Aotake, S.; Funabashi, M.; et al. Agricultural Robot under Solar Panels for Sowing, Pruning, and Harvesting in a Synecoculture Environment. Agriculture 2022, 13, 18. [Google Scholar] [CrossRef]
  40. Vitibot Bakus L. Available online: https://vitibot.fr/ (accessed on 26 February 2025).
  41. Valero, C. Robótica en viñedo la ciencia ficción se hace realidad. VIDA MAQ 2022, 525, 50–54. [Google Scholar]
  42. AgXeed 5.115T2. Available online: https://www.agxeed.com/ (accessed on 21 February 2025).
  43. Mula. Available online: https://mula.ai/it/ (accessed on 26 February 2025).
  44. Black Shire RC3075. Available online: https://black-shire.com/ (accessed on 21 February 2025).
  45. Pellenc RX-20. Available online: https://www.pellenc.com/fr-fr/ (accessed on 26 February 2025).
  46. Pek SlopeHelper. Available online: https://slopehelper.com/ (accessed on 26 February 2025).
  47. AutoAgri IC20. Available online: https://autoagri.no/ (accessed on 21 February 2025).
  48. Naïo Technologies Orio. Available online: www.naio-technologies.com/en/orio-robot/ (accessed on 26 February 2025).
  49. Naïo Technologies Oz. Available online: www.naio-technologies.com/en/oz-robot/ (accessed on 26 February 2025).
  50. Naïo Technologies Ted. Available online: www.naio-technologies.com/en/ted-robot/ (accessed on 26 February 2025).
  51. Naïo Technologies Jo. Available online: www.naio-technologies.com/en/jo-robot/ (accessed on 26 February 2025).
  52. Free Green Nature Leonardo. Available online: https://www.freegreen-nature.it/ (accessed on 26 February 2025).
  53. Agrointelli Robotti 150D. Available online: https://agrointelli.com/robotti/ (accessed on 26 February 2025).
  54. Calleja-Huerta, A.; Lamandé, M.; Green, O.; Munkholm, L.J. Impacts of Load and Repeated Wheeling from a Lightweight Autonomous Field Robot on the Physical Properties of a Loamy Sand Soil. Soil Tillage Res. 2023, 233, 105791. [Google Scholar] [CrossRef]
  55. Amos A3. Available online: https://www.amospower.com/ (accessed on 21 February 2025).
  56. EarthSense TerraMax. Available online: https://www.earthsense.co/terramax (accessed on 26 February 2025).
  57. XAG R150. Available online: https://www.xa.com/en/r150-2022 (accessed on 26 February 2025).
  58. Goncharov, D.V.; Ivashchuk, O.A.; Kaliuzhnaya, E.V. Development of a Mobile Robotic Complex for Automated Monitoring and Harvesting of Agricultural Crops. In Proceedings of the 2024 International Russian Automation Conference (RusAutoCon), Sochi, Russia, 8 September 2024; IEEE: New York City, NY, USA, 2024; pp. 338–343. [Google Scholar]
  59. Liu, L.; Yang, Q.; He, W.; Yang, X.; Zhou, Q.; Addy, M.M. Design and Experiment of Nighttime Greenhouse Tomato Harvesting Robot. J. Eng. Technol. Sci. 2024, 56, 340–352. [Google Scholar] [CrossRef]
  60. Birrell, S.; Hughes, J.; Cai, J.Y.; Iida, F. A Field-tested Robotic Harvesting System for Iceberg Lettuce. J. Field Robot. 2020, 37, 225–245. [Google Scholar] [CrossRef] [PubMed]
  61. Tituaña, L.; Gholami, A.; He, Z.; Xu, Y.; Karkee, M.; Ehsani, R. A Small Autonomous Field Robot for Strawberry Harvesting. Smart Agric. Technol. 2024, 8, 100454. [Google Scholar] [CrossRef]
  62. Agrobot E-Series. Available online: https://www.agrobot.com/e-series (accessed on 11 February 2025).
  63. Ant Robotics ValeraFlex. Available online: https://www.antrobotics.de/wp-content/uploads/2023/11/Datenblatt_Valera-8_DE.pdf/ (accessed on 11 February 2025).
  64. AVL Motion S9000. Available online: https://www.avlmotion.com/nl/ (accessed on 11 February 2025).
  65. Zhong, M.; Han, R.; Liu, Y.; Huang, B.; Chai, X.; Liu, Y. Development, Integration, and Field Evaluation of an Autonomous Agaricus Bisporus Picking Robot. Comput. Electron. Agric. 2024, 220, 108871. [Google Scholar] [CrossRef]
  66. Miao, Z.; Yu, X.; Li, N.; Zhang, Z.; He, C.; Li, Z.; Deng, C.; Sun, T. Efficient Tomato Harvesting Robot Based on Image Processing and Deep Learning. Precis. Agric. 2023, 24, 254–287. [Google Scholar] [CrossRef]
  67. Hua, W.; Zhang, W.; Zhang, Z.; Liu, X.; Saha, C.; Hu, C.; Wang, X. Design, Assembly and Test of a Low-Cost Vacuum Based Apple Harvesting Robot. Smart Agric. 2024, 10, 27–48. [Google Scholar] [CrossRef]
  68. Sportelli, M.; Frasconi, C.; Fontanelli, M.; Pirchio, M.; Raffaelli, M.; Magni, S.; Caturegli, L.; Volterrani, M.; Mainardi, M.; Peruzzi, A. Autonomous Mowing and Complete Floor Cover for Weed Control in Vineyards. Agronomy 2021, 11, 538. [Google Scholar] [CrossRef]
  69. Zhao, P.; Chen, J.; Li, J.; Ning, J.; Chang, Y.; Yang, S. Design and Testing of an Autonomous Laser Weeding Robot for Strawberry Fields Based on DIN-LW-YOLO. Comput. Electron. Agric. 2025, 229, 109808. [Google Scholar] [CrossRef]
  70. Zheng, S.; Zhao, X.; Fu, H.; Tan, H.; Zhai, C.; Chen, L. Design and Experimental Evaluation of a Smart Intra-Row Weed Control System for Open-Field Cabbage. Agronomy 2025, 15, 112. [Google Scholar] [CrossRef]
  71. Gallou, J.; Lippi, M.; Galle, M.; Marino, A.; Gasparri, A. Modeling and Control of the Vitirover Robot for Weed Management in Precision Agriculture. In Proceedings of the 2024 10th International Conference on Control, Decision and Information Technologies (CoDIT), Vallette, Malta, 1 July 2024; IEEE: New York City, NY, USA, 2024; Volume 5, pp. 2670–2675. [Google Scholar]
  72. Moondino. Available online: https://www.moondino.it/ (accessed on 20 January 2025).
  73. Aigro, Up. Available online: https://www.aigro.nl/ (accessed on 20 January 2025).
  74. Aigen Element. Available online: https://www.aigen.io/ (accessed on 20 January 2025).
  75. Earth Rover Claws. Available online: https://www.earthrover.farm/ (accessed on 20 January 2025).
  76. Gonzalez-de-Santos, P.; Ribeiro, A.; Fernandez-Quintanilla, C.; Lopez-Granados, F.; Brandstoetter, M.; Tomic, S.; Pedrazzi, S.; Peruzzi, A.; Pajares, G.; Kaplanis, G.; et al. Fleets of Robots for Environmentally-Safe Pest Control in Agriculture. Precis. Agric. 2017, 18, 574–614. [Google Scholar] [CrossRef]
  77. Cantelli, L.; Bonaccorso, F.; Longo, D.; Melita, C.D.; Schillaci, G.; Muscato, G. A Small Versatile Electrical Robot for Autonomous Spraying in Agriculture. AgriEngineering 2019, 1, 391–402. [Google Scholar] [CrossRef]
  78. Fan, X.; Chai, X.; Zhou, J.; Sun, T. Deep Learning Based Weed Detection and Target Spraying Robot System at Seedling Stage of Cotton Field. Comput. Electron. Agric. 2023, 214, 108317. [Google Scholar] [CrossRef]
  79. Padhiary, M.; Tikute, S.V.; Saha, D.; Barbhuiya, J.A.; Sethi, L.N. Development of an IOT-Based Semi-Autonomous Vehicle Sprayer. Agric. Res. 2025, 14, 229–239. [Google Scholar] [CrossRef]
  80. Loukatos, D.; Templalexis, C.; Lentzou, D.; Xanthopoulos, G.; Arvanitis, K.G. Enhancing a Flexible Robotic Spraying Platform for Distant Plant Inspection via High-Quality Thermal Imagery Data. Comput. Electron. Agric. 2021, 190, 106462. [Google Scholar] [CrossRef]
  81. Mohanty, T.; Pattanaik, P.; Dash, S.; Tripathy, H.P.; Holderbaum, W. Smart Robotic System Guided with YOLOv5 Based Machine Learning Framework for Efficient Herbicide Usage in Rice (Oryza Sativa L.) under Precision Agriculture. Comput. Electron. Agric. 2025, 231, 110032. [Google Scholar] [CrossRef]
  82. Liu, H.; Du, Z.; Shen, Y.; Du, W.; Zhang, X. Development and Evaluation of an Intelligent Multivariable Spraying Robot for Orchards and Nurseries. Comput. Electron. Agric. 2024, 222, 109056. [Google Scholar] [CrossRef]
  83. Kilter AX-1. Available online: https://www.kiltersystems.com/ (accessed on 12 March 2025).
  84. Gerhards, R.; Andújar Sanchez, D.; Hamouz, P.; Peteinatos, G.G.; Christensen, S.; Fernandez-Quintanilla, C. Advances in Site-specific Weed Management in Agriculture—A Review. Weed Res. 2022, 62, 123–133. [Google Scholar] [CrossRef]
  85. Yanmar YV01. Available online: https://www.yanmar.com/eu/campaign/2021/10/vineyard/ (accessed on 12 March 2025).
  86. Agrobot Bug Vacuum. Available online: https://www.agrobot.com/bugvac (accessed on 12 March 2025).
  87. Ant Robotics Adir Tunnel Sprayer. Available online: https://www.antrobotics.de/wp-content/uploads/2025/08/Datenblatt_Adir-Power_DE-1.pdf (accessed on 12 March 2025).
  88. AgBot II. Available online: https://research.qut.edu.au/qcr/Projects/agbot-ii-robotic-site-specific-crop-and-weed-management-tool/ (accessed on 12 March 2025).
  89. Hejazipoor, H.; Massah, J.; Soryani, M.; Asefpour Vakilian, K.; Chegini, G. An Intelligent Spraying Robot Based on Plant Bulk Volume. Comput. Electron. Agric. 2021, 180, 105859. [Google Scholar] [CrossRef]
  90. Ecorobotix AVO. Available online: https://ecorobotix.com/ (accessed on 12 March 2025).
  91. Zhang, W.; Miao, Z.; Li, N.; He, C.; Sun, T. Review of Current Robotic Approaches for Precision Weed Management. Curr. Robot. Rep. 2022, 3, 139–151. [Google Scholar] [CrossRef] [PubMed]
  92. Liu, Z.; Wang, X.; Zheng, W.; Lv, Z.; Zhang, W. Design of a Sweet Potato Transplanter Based on a Robot Arm. Appl. Sci. 2021, 11, 9349. [Google Scholar] [CrossRef]
  93. Li, M.; Zhu, X.; Ji, J.; Jin, X.; Li, B.; Chen, K.; Zhang, W. Visual Perception Enabled Agriculture Intelligence: A Selective Seedling Picking Transplanting Robot. Comput. Electron. Agric. 2025, 229, 109821. [Google Scholar] [CrossRef]
  94. FarmDroid FD20. Available online: https://farmdroid.com/ (accessed on 11 May 2025).
  95. Gerhards, R.; Risser, P.; Spaeth, M.; Saile, M.; Peteinatos, G. A Comparison of Seven Innovative Robotic Weeding Systems and Reference Herbicide Strategies in Sugar Beet (Beta vulgaris subsp. vulgaris L.) and Rapeseed (Brassica napus L.). Weed Res. 2024, 64, 42–53. [Google Scholar] [CrossRef]
  96. EarthSense Terra Petra. Available online: https://www.earthsense.co/icover (accessed on 11 May 2025).
  97. Amrita, S.A.; Abirami, E.; Ankita, A.; Praveena, R.; Srimeena, R. Agricultural Robot for Automatic Ploughing and Seeding. In Proceedings of the 2015 IEEE Technological Innovation in ICT for Agriculture and Rural Development (TIAR), Chennai, India, 10–12 July 2015; IEEE: New York City, NY, USA, 2015; pp. 17–23. [Google Scholar]
  98. Shanmugasundar, G.; Kumar, G.M.; Gouthem, S.E.; Prakash, V.S. Design and Development of Solar Powered Autonomous Seed Sowing Robot. J. Pharm. Negat. Results 2022, 13, 1013–1016. [Google Scholar] [CrossRef]
  99. Zhang, Z.; He, W.; Wu, F.; Quesada, L.; Xiang, L. Development of a Bionic Hexapod Robot with Adaptive Gait and Clearance for Enhanced Agricultural Field Scouting. Front. Robot. AI 2024, 11, 1426269. [Google Scholar] [CrossRef] [PubMed]
  100. Antobot Insight. Available online: https://www.antobot.ai/ (accessed on 21 March 2025).
  101. EarthSense Terra Sentia+. Available online: https://www.earthsense.co/terrasentia (accessed on 21 March 2025).
  102. Schmitz, A.; Badgujar, C.; Mansur, H.; Flippo, D.; McCornack, B.; Sharda, A. Design of a Reconfigurable Crop Scouting Vehicle for Row Crop Navigation: A Proof-of-Concept Study. Sensors 2022, 22, 6203. [Google Scholar] [CrossRef]
  103. China Releases Smart Agriculture Action Plan. Available online: https://www.dcz-china.org/2024/10/31/china-releases-smart-agriculture-action-plan/ (accessed on 15 April 2025).
  104. Agriculture’s Technology Future: How Connectivity Can Yield New Growth|McKinsey. Available online: https://www.mckinsey.com/industries/agriculture/our-insights/agricultures-connected-future-how-technology-can-yield-new-growth?utm_source=chatgpt.com (accessed on 16 July 2025).
  105. Climate-Smart Agriculture and Forestry Resources. Available online: https://www.farmers.gov/conservation/climate-smart?utm_source=chatgpt.com (accessed on 16 July 2025).
  106. Farm Labor|Economic Research Service. Available online: https://www.ers.usda.gov/topics/farm-economy/farm-labor (accessed on 16 July 2025).
  107. France 2030. Available online: https://www.info.gouv.fr/actualite/la-french-agritech-au-service-de-l-innovation-agricole?utm_source=chatgpt.com (accessed on 17 July 2025).
  108. Gorjian, S.; Ebadi, H.; Trommsdorff, M.; Sharon, H.; Demant, M.; Schindele, S. The Advent of Modern Solar-Powered Electric Agricultural Machinery: A Solution for Sustainable Farm Operations. J. Clean. Prod. 2021, 292, 126030. [Google Scholar] [CrossRef]
  109. Liang, Z.; He, J.; Hu, C.; Pu, X.; Khani, H.; Dai, L.; Fan, D.; Manthiram, A.; Wang, Z.-L. Next-Generation Energy Harvesting and Storage Technologies for Robots Across All Scales. Adv. Intell. Syst. 2023, 5, 2200045. [Google Scholar] [CrossRef]
  110. Ummadi, V.; Gundlapalle, A.; Shaik, A.; B, S.M.R. Autonomous Agriculture Robot for Smart Farming. arXiv 2023, arXiv:2208.01708. [Google Scholar] [CrossRef]
  111. Quaglia, G.; Visconte, C.; Scimmi, L.S.; Melchiorre, M.; Cavallone, P.; Pastorelli, S. Design of a UGV Powered by Solar Energy for Precision Agriculture. Robotics 2020, 9, 13. [Google Scholar] [CrossRef]
  112. Chand, A.A.; Prasad, K.A.; Mar, E.; Dakai, S.; Mamun, K.A.; Islam, F.R.; Mehta, U.; Kumar, N.M. Design and Analysis of Photovoltaic Powered Battery-Operated Computer Vision-Based Multi-Purpose Smart Farming Robot. Agronomy 2021, 11, 530. [Google Scholar] [CrossRef]
  113. Bwambale, E.; Wanyama, J.; Adongo, T.A.; Umukiza, E.; Ntole, R.; Chikavumbwa, S.R.; Sibale, D.; Jeremaih, Z. A Review of Model Predictive Control in Precision Agriculture. Smart Agric. Technol. 2025, 10, 100716. [Google Scholar] [CrossRef]
  114. Young, S.N. Editorial: Intelligent Robots for Agriculture—Ag-Robot Development, Navigation, and Information Perception. Front. Robot. AI 2025, 12, 1597912. [Google Scholar] [CrossRef] [PubMed]
  115. Kim, J.; Kim, G.; Yoshitoshi, R.; Tokuda, K. Real-Time Object Detection for Edge Computing-Based Agricultural Automation: A Case Study Comparing the YOLOX and YOLOv12 Architectures and Their Performance in Potato Harvesting Systems. Sensors 2025, 25, 4586. [Google Scholar] [CrossRef]
  116. Li, X.; Zhu, L.; Chu, X.; Fu, H. Edge Computing-Enabled Wireless Sensor Networks for Multiple Data Collection Tasks in Smart Agriculture. J. Sens. 2020, 2020, 4398061. [Google Scholar] [CrossRef]
  117. Wang, L. Digital Twins in Agriculture: A Review of Recent Progress and Open Issues. Electronics 2024, 13, 2209. [Google Scholar] [CrossRef]
  118. Goldenits, G.; Mallinger, K.; Raubitzek, S.; Neubauer, T. Current Applications and Potential Future Directions of Reinforcement Learning-Based Digital Twins in Agriculture. Smart Agric. Technol. 2024, 8, 100512. [Google Scholar] [CrossRef]
  119. Liu, J.; Shu, L.; Lu, X.; Liu, Y. Survey of Intelligent Agricultural IoT Based on 5G. Electronics 2023, 12, 2336. [Google Scholar] [CrossRef]
  120. Gutiérrez Cejudo, J.; Enguix Andrés, F.; Lujak, M.; Carrascosa Casamayor, C.; Fernandez, A.; Hernández López, L. Towards Agrirobot Digital Twins: Agri-RO5—A Multi-Agent Architecture for Dynamic Fleet Simulation. Electronics 2023, 13, 80. [Google Scholar] [CrossRef]
  121. Nguyen, L.V. Swarm Intelligence-Based Multi-Robotics: A Comprehensive Review. AppliedMath 2024, 4, 1192–1210. [Google Scholar] [CrossRef]
Figure 1. Diagram of platforms’ analysis.
Figure 2. Number of robots produced/designed by country.
Figure 3. Number of robots per type of navigation sensor.
Figure 4. Number of robots per type of safety sensor.
Figure 5. Percentage distribution of robots per engine type.
Figure 6. Number of robots per type of presence on the market.
Figure 7. (a) Number of robots per application field; (b) number of robots per type of application field (mono-application, double-application and triple-application).
Figure 8. Number of robots per type of main task.
Figure 9. Number of robots per type of traction system.
Table 1. Overview of robotic platforms and their main features.
| Ref. | Traction System | Engine | Main Task | Application Field | Navigation System | Safety System | Presence on the Market | Country |
|---|---|---|---|---|---|---|---|---|
| [39] | 4WD | Electric | Multi-purpose | Crops under solar panels | 360° camera | - | Research | Japan |
| [40] | 4WD | Electric | Multi-purpose | Vineyards | LiDAR, RTK-GPS | Bumpers, LiDAR | Commercial | France |
| [42] | Tracks | Hybrid (diesel+electric) | Multi-purpose | Open-field crops | RTK-GPS | LiDAR, sonar, radar, bumpers | Commercial | Netherlands |
| [43] | 4WD | Electric | Multi-purpose | Orchards, open-field crops, greenhouse | RTK-GPS, 4G | - | Commercial | Spain |
| [44] | Tracks | Hybrid (diesel+electric) | Multi-purpose | Orchards, open-field crops | RTK-GPS, 4G, Wi-Fi | Bumpers, LiDAR | Commercial | Italy |
| [45] | Tracks | Hybrid (diesel+electric) | Multi-purpose | Vineyards | RTK-GPS, 4G | 360° camera, bumpers | Commercial | France |
| [46] | Tracks | Electric | Multi-purpose | Orchards, vineyards | IMU, radiolocator, AI-powered sensors | Bumpers | Commercial | Slovenia |
| [47] | 4WD | Electric | Multi-purpose | Orchards, open-field crops | RTK-GPS | LiDAR, GPS fence, 2× HD cameras | Commercial | Norway |
| [49] | 4WD | Electric | Multi-purpose | Greenhouse, open-field crops | RTK-GPS | Geo-fencing, RGB | Commercial | France |
| [51] | Tracks | Electric | Multi-purpose | Vineyards, orchards | RTK-GPS | RGB, LiDAR, bumpers | Commercial | France |
| [50] | 4WD | Electric | Multi-purpose | Vineyards | RTK-GPS, LiDAR, RGB | LiDAR, RGB, geo-fencing, ultrasonic sensors | Commercial | France |
| [48] | 4WD | Electric | Multi-purpose | Open-field crops | RTK-GPS | Bumpers, LiDAR, geo-fencing module | Commercial | France |
| [52] | 4WD, or 2WD and 2 tracks | Hybrid (diesel+electric) | Multi-purpose | Vineyards, orchards | IMU, RTK-GPS, LTE 4G | Radar, AI-powered camera, bumpers, geo-fencing | Commercial | Italy |
| [53] | 4WD | Endothermic (diesel) | Multi-purpose | Open-field crops | Dual RTK-GPS | LiDAR, bumpers, geo-fencing | Commercial | Denmark |
| [55] | Tracks | Electric | Multi-purpose | Orchards, open-field crops | - | - | Commercial | USA |
| [57] | 4WD | Electric | Multi-purpose | Orchards | RTK-GPS | - | Commercial | China |
| [56] | 4WD | Electric | Multi-purpose | Orchards, open-field crops | Computer vision, machine learning | Computer vision, machine learning | Commercial | USA |
| [58] | Tracks | Electric | Harvesting | Orchards | Wireless control, RGB camera | - | Research | Russia |
| [59] | Tracks | Electric | Harvesting | Greenhouse | Stereo camera | - | Research | China |
| [60] | 4WD | Gasoline generator | Harvesting | Open-field crops | Remote control, 2× cameras | - | Research | UK |
| [61] | 4WD | Electric | Harvesting | Open-field crops | RGB-D camera | - | Research | USA |
| [62] | 3 wheels | Electric | Harvesting | Greenhouse, open-field crops | LiDAR | LiDAR, virtual perimeter | Commercial | Spain |
| [63] | 2WD | Solar-powered | Harvesting | Open-field crops | Stereo camera | Stereo camera, near-field sensors | Commercial | Germany |
| [64] | 2WD | Hybrid (diesel+electric) | Harvesting | Open-field crops | Remote control, RGB camera, LiDAR | - | Commercial | Netherlands |
| [65] | 4WD | Electric | Harvesting | Greenhouse | Stereo camera | - | Research | China |
| [66] | 4WD | Electric | Harvesting | Greenhouse | RGB-D, LiDAR, IMU | - | Research | China |
| [67] | Tracks | Electric | Harvesting | Orchards | RGB-D | - | Research | China |
| [69] | Tracks | Electric | Mechanical weeding | Greenhouse | RGB-D | - | Research | China |
| [70] | 4WD | Solar-powered | Mechanical weeding | Open-field crops | Stereo camera | - | Research | China |
| [71] | 4WD | Solar-powered | Mechanical weeding | Orchards, vineyards | RTK-GPS, RGB, LiDAR | - | Commercial-Research | France |
| [72] | 2WD | Solar-powered | Mechanical weeding | Open-field crops (rice) | RTK-GPS | - | Commercial | Switzerland |
| [73] | 2WD | Electric | Mechanical weeding | Open-field crops, orchards | RTK-GPS | - | Commercial | Netherlands |
| [74] | 4WD | Solar-powered | Mechanical weeding | Open-field crops | RTK-GPS, AI vision | AI vision | Commercial | USA |
| [68] | 4WD | Electric | Mechanical weeding | Vineyards | Wire fence | Ultrasonic sensors, bumpers | Commercial-Research | Italy |
| [75] | 4WD | Solar-powered | Mechanical weeding | Open-field crops | RGB-D, IR, AI vision | - | Commercial | UK |
| [77] | Tracks | Electric | Pest control and chemical weeding | Orchards, greenhouse | RTK-DGPS GNSS, LiDAR, ultrasonic sensor, stereo camera | Ultrasonic sensor, laser scanner | Research | Italy |
| [78] | 4WD | Electric | Pest control and chemical weeding | Open-field crops | RGB-D | - | Research | China |
| [79] | 2WD | Electric | Pest control and chemical weeding | Open-field crops, greenhouse | Remote control | - | Research | India |
| [80] | 2WD | Electric, solar-panel assisted | Pest control and chemical weeding | Open-field crops, greenhouse | RTK-GPS, IMU, thermal/optical camera | - | Research | Greece |
| [81] | 2WD | Electric | Pest control and chemical weeding | Open-field crops, greenhouse | Web camera | - | Research | India |
| [82] | Tracks | Electric | Pest control and chemical weeding | Orchards | LiDAR, RTK-GPS | - | Research | China |
| [83] | 2WD | Electric | Pest control and chemical weeding | Open-field crops | RTK-GPS, camera | - | Commercial | Norway |
| [85] | Tracks | Endothermic (gasoline) | Pest control and chemical weeding | Vineyards | RTK-GPS | - | Commercial | Japan |
| [86] | 2WD | Endothermic (diesel) | Pest control and chemical weeding | Open-field crops | LiDAR | LiDAR, bumpers | Commercial | Spain |
| [87] | 4WD | Electric | Pest control and chemical weeding | Greenhouse | IMU | - | Commercial | Germany |
| [88] | 2WD | Electric | Pest control and chemical weeding | Open-field crops | RTK-GPS, cameras | - | Prototype | Australia |
| [89] | 2WD | Electric | Pest control and chemical weeding | Greenhouse | RGB-D, IR camera | - | Research | Iran |
| [76] | 4WD | Endothermic (diesel) | Pest control and chemical weeding | Open-field crops | GNSS RTK, LiDAR, IMU, RGB, ultrasonic sensors | LiDAR | Commercial-Research | Spain |
| [90] | 4WD | Solar-powered, interchangeable batteries | Pest control and chemical weeding | Open-field crops | RTK-GPS, LiDAR, sonar | LiDAR, sonar | Commercial | Switzerland |
| [92] | Tracks | Hybrid (gasoline+electric) | Transplanting | Open-field crops | 2× cameras | - | Research | China |
| [93] | 2WD | Electric | Transplanting | Open-field crops | CCD camera | - | Research | China |
| [94] | 2WD | Solar-powered, power banks | Ploughing-seeding | Open-field crops | RTK-GPS | - | Commercial | Denmark |
| [96] | 4WD | Electric | Ploughing-seeding | Open-field crops | RTK-GPS, thermal camera | - | Start-up | USA |
| [97] | 4WD | Electric | Ploughing-seeding | Open-field crops | IR camera, ultrasonic sensor | - | Research | India |
| [98] | 2WD | Solar-powered | Ploughing-seeding | Open-field crops | IR camera | - | Research | India |
| [99] | 6 flexible legs | Electric | Scouting-monitoring | Open-field crops, greenhouse, orchards | 3× RGB cameras, IMU, LiDAR | - | Research | USA |
| [100] | 4WD | Solar-powered, power banks | Scouting-monitoring | Open-field crops, orchards | RTK-GPS | - | Commercial | UK |
| [101] | 4WD | Electric | Scouting-monitoring | Open-field crops | 4× RGB cameras, GPS, LoRa, LiDAR | - | Start-up | USA |
| [102] | Tracks | Electric | Scouting-monitoring | Open-field crops | Ultrasonic sensors | Ultrasonic sensors | Research | USA |

Share and Cite

MDPI and ACS Style

Fontani, M.; Luglio, S.M.; Gagliardi, L.; Peruzzi, A.; Frasconi, C.; Raffaelli, M.; Fontanelli, M. A Systematic Review of 59 Field Robots for Agricultural Tasks: Applications, Trends, and Future Directions. Agronomy 2025, 15, 2185. https://doi.org/10.3390/agronomy15092185

