Review

A Review of Robots, Perception, and Tasks in Precision Agriculture

DIMEAS—Department of Mechanical and Aerospace Engineering, Politecnico di Torino, 10129 Torino, Italy
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Appl. Mech. 2022, 3(3), 830-854; https://doi.org/10.3390/applmech3030049
Received: 12 May 2022 / Revised: 21 June 2022 / Accepted: 30 June 2022 / Published: 5 July 2022

Abstract:
This review reports the recent state of the art in the field of mobile robots applied to precision agriculture. After a brief introduction to precision agriculture, the review focuses on two main topics. First, it provides a broad overview of the most widely used technologies in agriculture related to crop, field, and soil monitoring. Second, the main robotic solutions, with a focus on land-based robots, and their salient features are described. Finally, a short case study about a robot developed by the authors is introduced. This work aims to collect and highlight the most significant trends in research on robotics applied to agriculture. This review shows that the most studied perception solutions are those based on vision and point-cloud detection and, following the same trend, most robotic solutions are small robots dedicated exclusively to monitoring tasks. However, the robotisation of other agricultural tasks is growing.

1. Introduction

The world’s population is steadily increasing, necessitating an increased food supply. According to Sylvestere [1], agricultural production, particularly field agriculture [2], must increase by 70% by 2050, when the global population is predicted to exceed 9 billion people. Simultaneously, increased agricultural activity leads to the waste and exploitation of irrigation water, fertiliser, and other phytosanitary products, compromising environmental sustainability and farmers’ profits [3]. Because agriculture occupies the majority of the world’s land, the urgent need to intensify food production while using less land and water in the coming years will necessarily entail significant social, economic, and environmental consequences. It is, therefore, critical to identify strategies and technologies that lower such costs by increasing production and revenues while also preserving the environment [4].
Standard management practices in traditional agriculture include the uniform application of water and agrochemicals, neglecting the variety of crop conditions and demands within a field [5]. On a worldwide scale, this approach is no longer viable for achieving high levels of output while limiting environmental damage; hence, new sustainable alternatives, such as Precision Agriculture (PA) [6], must be proposed and implemented. Precision Agriculture, which comprises the application of contemporary information and communication technologies to boost agricultural yields and profitability, has received considerable attention in recent years [7]. Precision Agriculture minimises the resources needed to cultivate fields, such as arable land, water, fertilisers, and other agrochemicals [8]. The most popular definition of PA is given by Pierce and Nowak [9]: “Precision Agriculture is the application of farming strategies and methodologies to do the right thing, in the right place and at the right time”, while data and technologies detect and decide what is right [10]. PA is thus a data-driven strategy for acquiring, analysing, and better managing farm data in order to improve soil and resource management and to plan and manage crops. Consequently, it has the potential to boost yields and profits while also improving environmental quality, and it assists farmers in making smart decisions while automating some critical farming processes [11,12]. In other words, PA is a comprehensive and long-term approach through which farmers adjust and regulate their procedures and cultivation methods (such as planting, tillage, harvesting, and fertiliser, pesticide, and water application) to meet the changing soil and crop conditions and demands within the field [5].
According to Srinivasan [4], PA differs from conventional farming in that it is based on more precisely determining crop variations and linking spatial and temporal relationships to field actions, allowing farmers to manage their farms, crops, and practices from an entirely new perspective, resulting in
  • cost reductions,
  • optimisation of yields and quality concerning the productive capacity of each site,
  • better management of the resources, and
  • protection of the environment.
PA has the capacity to redefine agriculture by bringing plenty of advantages in terms of profitability, productivity, sustainability, crop quality, environmental protection, on-farm quality of life, food safety, and rural economic development [13,14]. On a regional level, the spatial and temporal categorisation of farms can help discover emerging patterns in factors that impact sustainability, whether in physically neighbouring regions or specific agro-ecological zones, and guide possible solutions. Creating geo-referenced datasets for each field in a given area will aid in identifying the reasons and elements that underpin each cropping system’s productive aspect, as well as providing a summary of the proportion and severity of the factors that hinder or improve sustainability within and between fields [4]. Many aspects of PA have been researched, including related technology, environmental consequences, economics, adoption rates, and motivations for adoption and non-adoption. Many studies [15,16,17] have demonstrated PA’s environmental and economic advantages. Despite this, academic studies and professional assessments continue to show that PA adoption is rare [18,19,20].

Sustainable Development Goals in Agriculture

The Sustainable Development Goals (SDGs) are “a universal call to action to end poverty, protect the planet, and improve the lives and prospects of everyone, everywhere. The 17 Goals were adopted by all UN Member States in 2015, as part of the 2030 Agenda for Sustainable Development, which set out a 15-year plan to achieve the Goals” [21]. Among the various scientific and technological challenges originating from the pursuit of the SDG agenda, the use of new technologies and methodologies in agricultural operations has captivated the interest of the engineering research community [22]. Indeed, such technologies fit the precision agriculture paradigm well, as it aims to increase agricultural production’s long-term profitability by employing specialised equipment, thereby improving both profitability and efficiency [23]. SDG2, SDG6, SDG12, SDG13, and SDG15 are addressed in the PA scenario as detailed below.
  • Adopting sustainable agriculture techniques that boost productivity and production, aid ecosystem sustainability, and strengthen the capacity to respond to climate change, extreme weather, droughts, floods, and other calamities, while progressively improving land and soil quality, is critical to reaching SDG2’s goal of Zero Hunger by 2030 [24]. Farmers and agricultural organisations that adopt PA gain access to technologies that help them increase yields, check product quality, enhance crop management, and minimise resource usage costs, all of which help to ensure global food security and alleviate hunger.
  • By 2030, global water demand is projected to rise by more than 50%, with agriculture alone requiring more than can be supplied. PA solutions therefore help farmers to manage agrochemicals and minimise the overuse or unnecessary application of fertilisers and pesticides, protecting water and soil quality in line with SDG6—Clean Water and Sanitation [25], which encourages access to and sustainable management of water and sanitation.
  • As part of SDG12—Responsible Consumption and Production [26], the UN aims to achieve the sustainable management and effective use of natural resources. The PA approach aids farmers in achieving the safe management of phytosanitary products and other waste and in avoiding the use of harmful substances where possible, reducing their negative effects on soil, water, and air, and thus on human health.
  • SDG13—Climate Action [27] necessitates immediate measures and activities to address climate change and its implications. The PA methods provide users with tools and technology that not only assure sustainability and increase output and profit, but also allow them to monitor the climate and take appropriate preventative steps to safeguard both fields and the environment.
  • Soil degradation is a severe and growing threat due to unsustainable land use and management practices, as well as catastrophic climatic events, which together introduce a variety of social, economic, and governance challenges. PA works towards SDG15—Life on Land [28] targets by encouraging measures that promote land and soil restoration, prevention of degradation, and sustainable usage through planned and appropriate crop and field interventions.

2. Review Methodology

The review was completed by identifying current research in the field of study, extracting the necessary material, and synthesising the data for the final step of review elaboration.
A total of 184 papers (64 on sensing technologies and 120 on robotic applications) were gathered by searching two databases, Google Scholar and Scopus. These two databases were chosen because the quantity of articles returned by the searches was substantial and the content was relevant to our research. Most of the selected papers were published between 2016 and 2021, although some relevant works are older.
Since the intent of this review is to collect in a single paper the most common combinations of sensors, tasks, and robotic architectures developed so far, and to identify the areas in which they are lacking, two distinct review topics can be defined: sensing technologies and robotic architectures. The 64 papers about monitoring sensors were selected by querying the databases using the following keywords: “agricultural monitoring sensors”, “agriculture remote monitoring”, and “agriculture proximal monitoring”. All results reporting how a technology works or showing an example of its application were selected. For the 120 papers about robots in agriculture, the following keywords were used: “Agricultural robot”, “Agricultural rover”, and “Agricultural drone”. All selected papers in this category presented a functional prototype or a finished product.

3. Enabling Technologies in Precision Agriculture

The precision agriculture process is divided into three distinct phases that occur in a cyclical pattern (Figure 1): monitoring and data gathering, data analysis and treatment planning, and precision treatment application.
To split the field into management zones (MZs) with homogeneous treatment requirements, the available data are post-processed and merged with soil and crop models in the first stage. MZs are used as intervention zones, each with specific, well-defined treatments [29,30]. Within each MZ, the soil is prepared by applying the appropriate amounts of fertiliser, water, or agrochemicals, and then planting occurs. During the growth of the crop, the soil and the crop are continually monitored and mapped with the goal of planning and carrying out any crop-management activity (such as watering, spraying herbicides or insecticides, or eradicating weeds) locally and only when necessary. The crop is harvested when it is ready, and the yield is measured and analysed. This phase does not necessitate autonomous treatment application; all that is required is that the treatments are carried out in accordance with the preceding phase’s plan. All data obtained in earlier phases and during any other monitoring activity are processed in the final phase to generate maps that describe field conditions such as topography, soil composition, soil nutrients, crop status, and yield. The raw data are subsequently delivered to the first step of a new cycle, where they are processed, analysed, and evaluated in order to identify new and relevant MZs and treatments.
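Delineating MZs amounts to clustering per-cell field measurements into groups with similar treatment needs. The sketch below is a minimal illustration, not the method of any cited work: it runs a plain k-means (with a deterministic farthest-point initialisation) on a hypothetical toy field whose features and values are invented for the example.

```python
import numpy as np

def kmeans_zones(features, k=3, iters=50):
    """Cluster per-cell field measurements (rows = grid cells) into k
    management zones with homogeneous treatment requirements."""
    # normalise each feature so no single measurement dominates
    x = (features - features.mean(axis=0)) / features.std(axis=0)
    # deterministic farthest-point initialisation: one centre per blob
    centres = [x[0]]
    for _ in range(k - 1):
        d = np.min([((x - c) ** 2).sum(axis=1) for c in centres], axis=0)
        centres.append(x[np.argmax(d)])
    centres = np.array(centres)
    # Lloyd iterations: assign cells, then recompute zone centres
    for _ in range(iters):
        labels = ((x[:, None, :] - centres[None]) ** 2).sum(axis=2).argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = x[labels == j].mean(axis=0)
    return labels

# toy field: 6 grid cells, features = (soil conductivity, last yield)
cells = np.array([[1.0, 0.9], [1.1, 1.0], [5.0, 4.8],
                  [5.2, 5.1], [9.0, 9.2], [8.8, 9.0]])
zones = kmeans_zones(cells, k=3)   # adjacent pairs fall in the same zone
```

In practice, the clustered features would come from the soil and yield maps produced in the final phase, and the zone labels would feed the treatment plan for the next cycle.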
PA is data-intensive and involves numerous technologies, yet the application of its concepts and principles does not require heavy equipment or large fields [4]. The PA method may be customised and used in a variety of situations. The essential features required by autonomous agricultural vehicles and systems functioning in the field tend to fall into four categories [31].
  • Navigation: The vehicle must move in the field independently, following a predetermined course, following key waypoints, and avoiding any obstacles or collisions.
  • Sensing: The PA vehicle must be capable of detecting, measuring, and sampling anything that might be relevant in planning crop and soil management operations.
  • Mapping: Sensing activities generate a great amount of data, which is generally processed by Geographic Information Systems (GIS), software systems that aggregate all essential field features into maps.
  • Action: The mapping creates numerous MZs, and then the necessary treatment is carried out in each MZ. As a result, PA vehicles must be equipped to carry out such operations on their own.
The navigation of all autonomous vehicles depends on the technologies that allow an agent to localise itself in an area and map it at the same time, a process known as Simultaneous Localisation and Mapping (SLAM). As a result, autonomous navigation relies on the proper integration of a localisation system, such as a GNSS, with a collection of exteroceptive sensors (e.g., image-based systems, sonars, radars, LIDARs, and so on) to localise and map simultaneously. Post-processing and mapping of the information offered by these sensors may provide farmers with knowledge of important resources, as well as agronomic advice on how much of each resource should be used and when [8]. This is the heart of PA, and it needs an in-depth understanding of the obtained sensory data in order to appropriately analyse it and determine the needed actions. Newer sensors are currently being created with user-friendliness in mind because of the complexity of this task and the rising popularity of PA [32]. The action category of PA is generally the progression, in terms of automation and local application, of traditional mechanised farm operations. As a result, autonomous systems are often specialised for a limited set of activities. To summarise, the sensing component of PA, together with mapping, is the critical component that characterises the precision agricultural method. The goal of PA sensing, regardless of the technology utilised, is to assess soil and crop statuses (i.e., soil pH, humidity, nutrient composition, and crop conditions and yield), which can be done remotely or proximally.
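The fusion of a localisation system with other sensors can be illustrated, in its very simplest form, by a one-dimensional complementary filter: odometry gives a smooth prediction, and each noisy GNSS fix pulls the estimate back towards the truth. This is only an illustrative sketch with invented numbers; real field robots use an extended Kalman filter or a factor-graph SLAM back end.

```python
def fuse(gnss_fixes, odom_steps, alpha=0.3):
    """Blend low-rate, noisy GNSS fixes with smooth wheel odometry:
    predict with the odometry increment, then pull the estimate a
    fraction alpha toward the latest GNSS reading."""
    est = gnss_fixes[0]
    track = [est]
    for fix, step in zip(gnss_fixes[1:], odom_steps):
        est = est + step                 # dead-reckoning prediction
        est = est + alpha * (fix - est)  # GNSS correction
        track.append(est)
    return track

# robot advances ~1 m per step along a crop row; the GNSS jitters
track = fuse([0.0, 1.3, 1.8, 3.2, 3.9], [1.0, 1.0, 1.0, 1.0])
```

With these toy values the fused track stays much closer to the true 0, 1, 2, 3, 4 m positions than the raw fixes do, which is exactly the benefit the integration above is after.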

3.1. Remote Sensing

Remote-sensing methods and image analytics are fascinating and ever-evolving trends in PA for analysing soil state and vegetation health from a distance using remotely captured images. One of the most extensively utilised remote-sensing systems is aerial monitoring, which uses images taken by satellites, manned aircraft, and unmanned aerial vehicles (UAVs) [33]. Unmanned ground vehicles are sometimes outfitted with remote sensors as well. Several previous studies [34,35,36,37,38,39] published review articles on various aspects of robotics and remote sensing in PA. Remote sensing is a wide term that incorporates all noninvasive and nondestructive approaches for analysing and parametrising crop and plant conditions. Optical visible- and near-visible-spectrum sensors and technologies for calculating phenotyping variables from intensity, spectral, and volumetric data may be classified into three groups, according to Yadun Navarez et al. [38].
  • Structural characterisation: Estimating characteristics such as canopy volume, plant height, leaf area coverage, and biomass, among others, allows farmers to make better decisions. A number of researchers [40,41] used canopy volume data to improve pesticide and fertiliser spraying on fruit trees in terms of input savings and environmental costs. Mora et al. [42] also used leaf area coverage for crop-growth monitoring and production prediction, since it reflects various physiological processes in plants. Furthermore, biomass mapping and monitoring enable the identification of changes in plantation conditions due to storms, drought, or diseases [43,44]. Finally, because bio-energy from certain crops has become one of the most extensively used energy sources, Kankare et al. [45] were able to use crop biomass as a productivity criterion.
  • Plant/Fruit detection: For automated actions such as pruning, harvesting, and sowing to be effective, precision in detecting things of interest in the environment is necessary. To achieve this aim, scientists have used a variety of plant and fruit features and qualities, including colour, shape, and temperature. Colour is a trait that may be used to identify the fruit inside the canopy [46,47] or in the agricultural field [48] in robotic fruit picking or crop harvesting. Furthermore, according to Karkee et al. [49], the morphology of the stems is the property that provides the cutting directions in the majority of instances for automated robotic pruning.
  • Physiology assessment: The canopy’s physical response to sunlight produces different spectral patterns that reveal information about the plant’s physiological health. As a result, a number of indices [50,51] based on crop spectral responses have been developed to assess parameters such as nitrogen deficiency, chlorophyll content, water stress, and insect infestation. In addition, numerous sensing tools (for example, infrared gas analysers) enable the direct assessment of a number of physiological characteristics in plants. Many of them need direct contact with the crop, which results in more exact readings; nevertheless, the measuring method takes a unique path, making this technique time-consuming in most cases [52].
Vision-based and range sensors are the two main types of remote-sensing technologies.

3.1.1. Vision-Based Sensors

RGB or colour cameras are the most basic vision-based sensors. Colour data may be used to determine parameters such as texture and geometrical characteristics, which are important in agricultural applications. The influence of changing ambient lighting conditions is the major drawback of these sorts of sensors, especially in outdoor situations. The use of colour cameras to detect fruits or vegetables inside the canopy for automated harvesting operations is a common example of how this technology may be employed. For example, Zhao et al. [53] reported on the use of a number of segmentation algorithms based on colour and form to reliably differentiate unripe green citrus. Furthermore, RGB vision was used to capture colour and form for robotic apple picking operations and berry yield estimation, respectively, in [46,54]. Barbedo et al. [55] proposed a method for recognising plant diseases, whereas Nandi et al. [56] demonstrated how to measure mango maturity using a colour camera.
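Colour-based segmentation of this kind often works in HSV space, where hue separates fruit from foliage more cleanly than raw RGB. The sketch below is a deliberately simplified illustration with invented thresholds and pixel values, not any of the cited algorithms, which combine colour with shape cues.

```python
import colorsys

def fruit_mask(rgb_pixels, hue_lo=0.0, hue_hi=0.10, sat_min=0.5):
    """Flag pixels whose hue falls in a 'ripe red' band; green canopy
    (hue around 0.33) and dull background are rejected. Thresholds are
    illustrative, and the hue wrap-around near 1.0 is ignored for brevity."""
    mask = []
    for r, g, b in rgb_pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        mask.append(hue_lo <= h <= hue_hi and s >= sat_min)
    return mask

pixels = [(200, 30, 30),    # saturated red: ripe fruit
          (40, 160, 40),    # green leaf
          (120, 110, 100)]  # dull soil / background
mask = fruit_mask(pixels)   # only the first pixel passes
```

The saturation check matters: the soil pixel above has a reddish hue but low saturation, which is exactly why hue alone is unreliable under changing illumination.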
While a single camera may estimate depth data, stereo vision systems are mostly employed to recreate the environment in three dimensions. Comparable to systems that combine a colour camera with a depth sensor, this measurement approach provides a 3D point cloud that shows the scene. Identifying plants or fruits and estimating structural characteristics that describe plant morphology, such as tree height, diameter, and volume, or plant height and leaf area, are examples of agricultural research applications. Wang et al. [57], for example, examined fruit-recognition applications in apple orchards using stereoscopic vision.
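The geometric core of stereo ranging is the pinhole relation Z = fB/d: depth is focal length times baseline over pixel disparity. The helper below illustrates it with made-up camera parameters; real pipelines first rectify the images and match features to obtain the disparity.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo model: a point matched d pixels apart in two
    rectified cameras separated by baseline B lies at depth
    Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("non-positive disparity: no valid match")
    return focal_px * baseline_m / disparity_px

# illustrative rig: f = 800 px, B = 12 cm; a fruit matched with
# 32 px of disparity sits about 3 m from the cameras
z = depth_from_disparity(800, 0.12, 32)
```

Note the inverse relationship: nearer fruit produce larger disparities, so depth resolution degrades quadratically with distance, which is why stereo rigs on field robots favour wide baselines for tall canopies.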
Outside the visible spectrum, temperature has proven to be a useful indicator for agricultural operations such as crop diagnostics and fruit identification. A thermal camera may be used to examine the canopy status as well as to segment a tree from the rest of the image. Baluja et al. [58] also found plant temperature to be a predictor of plant water availability, enabling localised irrigation based on plant temperature. Similarly, Jones et al. [50] used thermal cameras to show a relationship between leaf temperature and water stress. They did, however, argue that reliable crop water stress detection requires a multi-sensor approach. Fruit identification is another application for thermal cameras, as fruits absorb and irradiate sunlight differently from plants, allowing for the development of accurate detection methods. Wachs et al. [59] used a thermal camera in combination with RGB pictures to identify green apples using image processing techniques, achieving 74% accuracy. Reina et al. [60] stated that thermal imaging may be used to increase agricultural robots’ ambient awareness, for example, by distinguishing human operators or animals concealed by deep vegetation.
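The temperature–water-stress link is commonly quantified with the Crop Water Stress Index, which normalises canopy temperature between a fully transpiring (wet) and a non-transpiring (dry) baseline. The snippet shows one common empirical formulation with invented baseline temperatures; it is not necessarily the exact index used in the cited studies.

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Crop Water Stress Index (one common empirical formulation):
    0 = well-watered canopy at the cool, wet baseline;
    1 = non-transpiring canopy at the dry baseline."""
    return (t_canopy - t_wet) / (t_dry - t_wet)

# a canopy at 28 C between a 24 C wet and a 34 C dry baseline
stress = cwsi(28.0, 24.0, 34.0)   # 0.4: moderate stress
```

A grower could trigger localised irrigation in MZs whose CWSI exceeds a chosen threshold, which is precisely the kind of per-zone action described above.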
Structured light cameras work by projecting an infrared (IR) pattern over the area and analysing the distortion of the returning pattern to determine distances. Because their performance is heavily influenced by lighting conditions, they are mostly utilised in labs or greenhouses [61]. A structured light sensor is primarily used to identify structural parameters such as plant and tree size, height, and volume. For example, Chéné et al. [62] used data from a commercial structured light camera to construct a leaf segmentation algorithm. The same sensor was previously used to characterise sweet onions and cauliflowers in [63,64], demonstrating that it is capable of detecting fruits and vegetables.
Radiation absorption and reflection in certain bands of the electromagnetic spectrum are intimately linked to physiological parameters including chlorophyll levels, nitrogen content, and water stress. Chlorophyll, for example, reflects green light while absorbing red and blue light. Similarly, the reflectance in the middle infrared band is affected by crop water content. Spectrometers or cameras, which are classified as multispectral or hyperspectral, are used to measure the reflectance of crops, plants, and trees. Multispectral imaging can assess a scene’s reflectance using a few broad spectral bands that are not always continuous. Hyperspectral cameras, on the other hand, detect reflectance in a limited but continuous range of the spectrum. Mulla [65] provides a plethora of indices that may be used to infer physiological variables from spectral imaging. Many researchers have examined water fluctuation, vigour, chlorophyll detection, crop yield, nitrogen stress, plant stress, weed and pest infestations, and soil composition and nutrients using multispectral or hyperspectral pictures [58,66,67,68,69,70].
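The best known of these indices is the Normalised Difference Vegetation Index (NDVI), which exploits exactly the reflectance contrast described above: healthy canopy reflects near-infrared strongly while chlorophyll absorbs red. The reflectance values below are invented for illustration.

```python
def ndvi(nir, red):
    """Normalised Difference Vegetation Index: (NIR - Red)/(NIR + Red).
    Dense healthy vegetation approaches 1; bare soil sits near 0;
    water and clouds go negative."""
    return (nir - red) / (nir + red)

canopy = ndvi(0.50, 0.08)   # vigorous vegetation, NDVI ~ 0.72
soil = ndvi(0.30, 0.25)     # bare soil, NDVI ~ 0.09
```

Mapping NDVI cell by cell across a field is a typical input to the MZ delineation stage, since low-NDVI patches flag zones needing nitrogen, water, or pest inspection.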

3.1.2. Range Sensors

One of the main techniques is ultrasound range sensing. The sensor emits a short-duration, high-frequency acoustic pulse that travels through the air, hits the target, and returns as an echo; the distance is determined by the sensor’s electronics from the time elapsed between pulse emission and echo reception. Several studies have described it as a tool for estimating crop metrics such as volume, density, and height, among others. Palleja and Landers [71] proposed using ultrasonic sensors on a tractor to analyse the canopy densities of apple trees and grapevines in real time. Escolà et al. [72] used a similar technique in earlier studies to estimate tree volume and regulate the appropriate quantity of agrochemicals to be sprayed.
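The time-of-flight arithmetic behind ultrasonic ranging is a one-liner: the pulse covers the sensor-to-target path twice, so the range is half the round-trip distance. The echo time below is an invented example value.

```python
def echo_distance(echo_time_s, speed_of_sound=343.0):
    """Ultrasonic ranging: the pulse travels to the target and back,
    so range = speed_of_sound * round-trip time / 2.
    343 m/s is the speed of sound in air at roughly 20 C."""
    return speed_of_sound * echo_time_s / 2

# an echo returning after 10 ms puts the canopy ~1.7 m away
d = echo_distance(0.010)
```

In practice the speed of sound drifts with air temperature (roughly 0.6 m/s per degree Celsius), so outdoor sensors either compensate with a temperature probe or accept a small range bias.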
The integration of an array of detectors and a source of light to give 3D distance and intensity data has recently spurred interest in Time-of-Flight (ToF) cameras. These cameras have been employed in agricultural studies to discover the structural properties of plants and fruits in particular. For instance, according to some publications [73,74], using a time-of-flight camera to acquire the geometrical properties of the plant allows for individual leaf modelling and monitoring. Both studies were carried out in the lab, since light from the sun can cause detectors to become saturated, resulting in poor performance, as Kazmi et al. [75] demonstrated. Plant characterisation and identification increase dramatically when colour information is provided (the equipment is sometimes referred to as an RGB-D camera). Vitzrabin and Edan [76], for example, presented the use of RGB-D data to detect red sweet peppers in greenhouses, with true positive rates of up to 90.9%. Gongal et al. [47] detail the usage of RGB-D cameras mounted on a moveable platform to collect data in an over-the-row route inside apple orchards. Elfiky et al. [77] describe the use of an RGB-D camera for the 3D reconstruction and modelling of apple trees.
Crop structural parameters such as volume, leaf area coverage, and height are also calculated using LIDAR sensors [43,78]. There are two types of laser scanners in LIDAR sensors, 3D and 2D LIDARs; the latter is more commonly employed, since it is less expensive and may be utilised to obtain 3D data with the right configuration. The use of point clouds obtained from 2D or 3D LIDARs to extract structural information from the canopy, including volume, area, leaf density, and branch dimensions, has been documented in several publications. Numerous publications [79,80] have reported the use of terrestrial and aerial applications to identify vegetation in broad areas or provide a geometric description of the crop. Fieber et al. [81] showed how landscape back-scattering was utilised to identify trees, grass, and ground using a laser scanner. In [82,83], data from a LIDAR are used to develop allometric models for quantifying the biomass and volume of individual trees. The hyperspectral LIDAR, in particular, is a one-of-a-kind technology that is currently being investigated. The idea of this technology is to combine the benefits of classic laser scanners with the capacity to recognise various wavelengths [84]. The use of a hyperspectral LIDAR to analyse the status of vegetation in controlled circumstances was detailed by Livny et al. [85], demonstrating the use of this type of LIDAR in vegetation spectrum analysis. In addition, as Du et al. [51] showed for rice leaves, hyperspectral LIDAR may be utilised to assess factors such as nitrogen content (which is generally performed using spectral cameras or spectrometers).
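Extracting canopy height and volume from a LIDAR point cloud can be sketched very simply: height is the highest return above ground, and a crude volume estimate counts the voxel cells that contain at least one return. This is a toy illustration with an invented four-point cloud, not the allometric models of the cited works, which are far more sophisticated.

```python
def canopy_stats(points, voxel=0.25):
    """From a cloud of (x, y, z) returns in metres (z = 0 at ground):
    canopy height = highest return; occupied volume is approximated by
    counting distinct voxel cells that contain at least one return."""
    height = max(z for _, _, z in points)
    occupied = {(int(x // voxel), int(y // voxel), int(z // voxel))
                for x, y, z in points}
    return height, len(occupied) * voxel ** 3

# four returns off a small canopy, falling into two 25 cm voxels
cloud = [(0.10, 0.10, 1.90), (0.10, 0.12, 1.95),
         (0.60, 0.10, 2.40), (0.62, 0.11, 2.45)]
height, volume = canopy_stats(cloud)
```

Voxel counting deliberately over-estimates solid volume and under-estimates it for sparse scans, so the voxel size is usually tuned to the scanner's point density.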
Figure 2 summarises and groups previously introduced technologies based on phenotypic characteristics that can be estimated [38].

3.2. Proximal Sensing

Although remote sensing can provide extensive coverage, data are sometimes indirect and, when assessing soil qualities, are largely restricted to the surface. Furthermore, the resolution is insufficient to determine a field’s soil variability. The process of characterising fields using typical point-sampling methodologies, on the other hand, is extremely time-consuming, expensive, and impractical. As a result, proximal soil sensing is becoming more popular as a way to bridge the gap between high-resolution point data and lower-resolution remote-sensing data.
A proximal sensor is one that measures soil qualities directly or indirectly and is close to, or even in contact with, the ground [86].
Electromagnetic induction, electrical resistivity, and ground-penetrating radar are the methods most often used for proximal soil sensing, according to Allred et al. [87]. By sending brief pulses into the ground and analysing the returned signals, ground-penetrating radar has mostly been used to measure soil depth and structure [88]. Furthermore, as Klotzsche et al. observed [89], similar sensors have been effectively utilised to map soil water content. Electromagnetic induction techniques create eddy currents in the soil and measure the resulting electromagnetic field to determine the soil’s electrical conductivity, a property that can reveal much about the soil. They were originally intended to measure soil salinity, but they may now be used to examine soil texture, compaction, pH, and organic matter content, as well as water content [90]. Electrical resistivity approaches, like electromagnetic induction, measure soil resistance by injecting a current through electrodes in the ground or by utilising capacitive coupling [87]. Soil structure, chemistry, and water content have all been estimated using measured resistivity [91].
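A concrete example of turning an electrode measurement into a soil property is the classical Wenner four-electrode array, a standard textbook configuration rather than necessarily the exact instrument of the cited studies: current is driven through the outer electrode pair, voltage is read across the inner pair, and apparent resistivity follows from the spacing. All numeric values below are invented for illustration.

```python
import math

def wenner_resistivity(spacing_m, voltage_v, current_a):
    """Apparent soil resistivity (ohm-metres) for a Wenner array of
    four equally spaced surface electrodes:
    rho = 2 * pi * a * V / I."""
    return 2 * math.pi * spacing_m * voltage_v / current_a

def conductivity_mS_per_m(rho_ohm_m):
    """Resistivity -> conductivity in mS/m, the unit typical soil
    electrical-conductivity (EC) maps report."""
    return 1000.0 / rho_ohm_m

# 0.5 m electrode spacing, 0.2 V measured while driving 10 mA
rho = wenner_resistivity(0.5, 0.2, 0.010)   # ~62.8 ohm-m
ec = conductivity_mS_per_m(rho)             # ~15.9 mS/m
```

Repeating this reading on a grid and feeding the EC values into the MZ clustering stage is one of the most common routes from proximal sensing to a treatment map.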
Mechanical interaction, magnetic, time-domain reflectometry, optical reflectance, X-ray fluorescence, γ-ray spectroscopy, ion-selective potentiometry, and seismic sensing are among the less common soil-sensing methods, according to Adamchuk et al. [92]. Mechanical, acoustic, and pneumatic sensors may be used to determine soil mechanical resistance; from these measurements (together with auxiliary moisture sensors), it is also feasible to assess soil compaction and type, as well as water content. Magnetic sensors assess soil composition (particularly magnetic minerals) and water content by measuring magnetic susceptibility. Time-domain reflectometry is also used to detect soil moisture. Proximal optical sensors employ the same technologies as vision-based remote sensors, but they are used on the soil surface or even below it. Sensors of this type can detect a wide range of soil properties, including soil composition and water content. To determine soil composition and water content, X-ray fluorescence and γ-ray spectroscopy sensors analyse the reflected wave at the appropriate frequency. Ion-selective potentiometry, in turn, involves determining the presence of certain ions in order to characterise soil composition and water content. Finally, seismic sensors produce vibrations that may be used to determine soil compaction, porosity, and water content.
Figure 3 summarises the previously discussed proximal sensing methods and illustrates the features that they can examine.

4. Robotics and Agriculture

As a consequence of intensification, mechanisation, and automation, agricultural production has increased significantly over time [93,94]. Agricultural equipment has become more efficient, reliable, and precise thanks to automation, which has reduced the need for human involvement [95]. Despite this, agriculture, particularly the horticulture industry, continues to be plagued by a severe labour shortage. Trends such as increasing farm size, declining farmer numbers, and the growing environmental consequences of food production all exacerbate the challenges posed by a lack of workers, necessitating even more efficient agricultural techniques [96]. The productivity of traditional farming, in which farmers manually cultivate and manage crops, can be greatly improved by using intelligent machines [97]. Furthermore, it is necessary to develop a new agricultural system that leverages advanced automated techniques to eliminate human involvement, in addition to going beyond the already highly productive levels of conventional crop farming.
The employment of robots in agriculture appears to be a potential option in the context of precision agriculture [98,99] since it enables repetitive labour to be accomplished without losing precision throughout the working day. This sort of application is growing in popularity in robotics research, and a variety of robots are now available for purchase [100]. Agricultural robotics, in fact, intends to do more than just apply robotic technology to farming. Currently, the majority of agricultural vehicles used for weed identification, pesticide distribution, terrain levelling, irrigation, and other operations are operated manually. Due to the fact that information about the environment may be obtained autonomously and the robot can then conduct its job accordingly, the autonomous performance of such robots would enable continuous field management as well as better productivity and efficiency [31]. For robots to function adequately in agricultural settings and perform agricultural tasks, research must focus on the integration of numerous complementary sensors to achieve acceptable localisation and monitoring abilities, the design of simple manipulators to accomplish the required agricultural activity, the development of path planning, navigation, and guidance algorithms adapted to situations other than open fields, and the integration with workers and operators in this complex and highly dynamic scenario. This is a primary goal for the use of a variety of technologies targeted at enhancing crop yield and quality while cutting agricultural costs. Indeed, the primary long-term aim of food security in the face of climate change necessitates a shift in the existing agricultural paradigm, which focuses on lowering natural resource usage while increasing crop productivity. For example, Tremblay et al. 
[101,102] demonstrated that precision seeding and planting, combined with precise treatment application (i.e., adding only the water and plant nutrients required by the crop, at the optimal place and time), result not only in an increase in average plant size and in the uniformity of plant maturity, but also in a reduction in the ratio of phytosanitary products and water to crop production and, thus, in environmental impact. Furthermore, recent studies show that deploying robots or autonomous tractors to perform various agricultural tasks reduces fuel consumption and pollution [103,104].
Agricultural research on autonomous vehicles began in the early 1960s, with a focus on the development of automated steering systems [105]. In the 1990s, the great majority of mechanical activities in field crop farming used large, powerful, and high-capacity machinery that required a lot of energy and had high handling and operating costs. However, in the last decade, research at many universities and research institutes around the world has undergone a paradigm shift: agricultural robot automation is now regarded as critical for increasing overall productivity and should include the potential for improving fresh produce quality, lowering production costs, and eliminating the repetitive tasks of manual labour [106]. Cultivation and production processes, on the other hand, are sophisticated, diversified, labour-intensive, and typically specific to each crop. Crop characteristics and demands, geographical surroundings, and climatic and meteorological settings are all factors that determine the type of method and its components. This means that the technology, equipment, and procedures required to execute an agricultural operation involving a certain crop and environment may not be suitable for another crop or environment.
Nonetheless, a large body of recent research has demonstrated the technological viability of agricultural robots for a variety of crops, agricultural tasks, and robotic features [107]. However, only a few recent breakthroughs have been tested, adopted, and put into service, and automation solutions for field operations have yet to be applied commercially in an efficient and widespread manner [108,109]; from an industrial viewpoint, by contrast, the bulk of production processes has already been automated [110]. Over the previous three decades, many agricultural robot and intelligent automation research projects never made it to the implementation stage. The primary causes of these failures were the excessive cost of the intended systems, their inability to perform vital agricultural labour, low system durability, and the inability to reliably repeat the same activity in slightly varied circumstances or to meet mechanical, economic, and/or industrial criteria.
Bechar and Vigneault [111,112] outlined criteria that, in most cases, indicate that robot technology can be successfully deployed in agriculture.
  • The cost of employing robots is lower than the cost of employing any other method.
  • Using robots in agriculture improves agricultural production capacities, yields, profits, and survival while also boosting product quality and uniformity.
  • The use of robots in growth and production processes minimises uncertainty and volatility.
  • In comparison to the conventional method, the introduction of robots allows the farmer to make higher-resolution judgements and/or increase the quality of the output, allowing for growth and production phase optimisation.
  • The robot is capable of doing tasks that are either dangerous or impossible to execute manually.
There are two key trends in robotics for agriculture that can be identified in the literature: (1) flying drones for remote sensing and certain small crop management duties and (2) ground rovers for all-around local monitoring and multi-task specialisation.

4.1. Flying Drones

As previously noted, remote sensing using aerial images is a frequently used method for monitoring large areas. Satellite pictures are too expensive for the common farmer, and their resolution and quality are sometimes poor and impracticable because of weather conditions. Aerial footage from human-piloted aircraft is of greater quality than satellite images, but it is still prohibitively expensive for most farmers [33]. Unmanned Aerial Vehicles (UAVs) have relatively recently entered the market and have found applications in a variety of disciplines, with the majority of research confirming UAVs' enormous potential in PA [32,33,39]. UAVs can deliver centimetre-level resolution, integrate 3D canopy height and orthophoto data, deliver multiangular data, deliver high-quality hyperspectral data, and carry a variety of auxiliary sensors. UAVs have the potential to become the standard platform for remote sensing applications that need very high-resolution, thermal, or hyperspectral data, such as weed identification and early crop disease detection [32].
Several publications have attempted to categorise the most typical UAV uses in PA [32,33,39]. The most prevalent use is remote sensing for soil and crop monitoring, followed by patch spraying [33]. Remote sensing is based on the collection of multiple aerial pictures in various spectral ranges. Drought-stress detection, for example, is mostly based on thermal photography, whereas pathogen diagnosis based on the combination of thermal and hyperspectral data yields excellent results. Weed identification, on the other hand, relies on RGB cameras and machine-learning object-based image analysis. UAVs are of interest for nutrient status monitoring and yield prediction, and integration with crop models can broaden their application [32]. According to a recent study [113], combining electrical conductivity data with elevation and slope maps, as well as crop indices that characterise crop vigour, can improve the definition and quality of MZs in vineyards or fields with significant slopes. Except for irrigation management, where thermal and/or multispectral sensors are consistently used to monitor the crop's water demands [39], there appears to be a general lack of standardisation, with different methodologies used for the same application [32,39,114].
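To illustrate how crop-vigour indices of this kind are typically derived from UAV multispectral imagery, the following is a minimal sketch using the widely known NDVI (Normalised Difference Vegetation Index); it is a generic illustration, not the specific index of the cited studies, and all band values below are toy numbers.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-9) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red).

    Values near +1 indicate dense, vigorous vegetation; values near 0
    indicate bare soil; negative values typically indicate water.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)  # eps avoids division by zero

# Toy 2x2 "rasters" standing in for UAV near-infrared and red bands.
nir_band = np.array([[0.60, 0.55], [0.10, 0.50]])
red_band = np.array([[0.10, 0.12], [0.09, 0.48]])

vigour = ndvi(nir_band, red_band)
healthy = vigour > 0.4  # simple per-pixel threshold flagging vigorous canopy
```

Maps of such per-pixel indices, aggregated over a field, are one common input used to delineate MZs.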
Although UAVs can monitor individual plants, the typical spatial resolution at which management actions are applied within an MZ is roughly 10 metres (e.g., sprayers with independent nozzles). UAVs can therefore support technical improvements aimed at enhancing the resolution of management zones [32].

4.2. Land-Based Robots

Because they can be deployed locally and operate exactly where needed, unmanned ground vehicles (UGVs) can fill the UAV monitoring resolution gap. UGVs are also capable of proximal sensing and are frequently built to execute one or more specialised agricultural operations such as seeding, planting, weeding, treating, pruning, picking, handling, and harvesting. Many supporting tasks, including localisation and navigation with obstacle avoidance, detection of the object to treat, and the treatment or action to undertake, must be performed by the robot in order for its core functions to be completed. For example, the primary task of a disease-monitoring robot [115] is disease monitoring, but the robot must also be capable of self-localisation, trajectory planning, steering, navigating in the field, collaborating with workers and operators, and interacting with other robots or unexpected objects. Ceres et al. [116] developed and implemented a framework for a human-integrated citrus-harvesting robot, whereas Nguyen et al. [117] developed and implemented a framework for apple-harvesting robot motion and hierarchical job planning. A framework for agricultural and forestry robots was developed by Hellstrom and Ringdahl [118].
Many examples of the re-adaptation of agricultural devices with various degrees of automation are available on the market and in the academic literature. Tractors and agricultural machinery in general (autonomous or not) are large and powerful, but they tend to degrade the soil and make it more difficult to traverse [119]. As a result, customised robotic UGVs have recently been designed and tailored to specific jobs, decreasing bulkiness, weight, and soil deterioration from unwanted compaction while still fitting the criteria of the PA approach. Unlike UAVs, where the mobile robotic platform makes little difference and the focus is mostly on the remote sensing payload, agricultural UGVs feature more distinctive solutions that are strongly shaped by the work environment and the capabilities required by their tasks [120,121].
The following is a brief overview of the current state of the art in commercial and academic agricultural UGVs. In addition, several classifications of agricultural robots are given according to the most prominent trends.

4.2.1. Agricultural Robot Main Functions Taxonomy

The great majority of agricultural robots are entirely dedicated to remote or proximal soil or crop monitoring (also known as phenotyping, i.e., evaluating plant attributes to determine their state and potentially helpful treatments), since comparable technologies can be readily transferred from UAVs. Because the sensors are the primary emphasis of this type of robot, the designs are often simple. For almond orchards, Underwood et al. [122] presented a mobile robot with a scanning system that can map flower and fruit distributions, as well as estimate and anticipate yields for individual trees. Mueller-Sim et al. [123] showcased a robot that can travel beneath the canopy of row crops such as sorghum or maize and deploy a manipulator to collect plant phenotypic data using a modular array of non-contact sensors. Virlet et al. [124] created a massive robot based on an overhead gantry design that uses RGB, chlorophyll fluorescence, hyperspectral, and thermal cameras to generate high-throughput and extensive monitoring data. Cubero et al. [125] and Rey et al. [126] presented two separate robotic systems for detecting pests and diseases on carrot fields and olive trees, respectively. Barbosa et al. [127] successfully tested an autonomous robot for monitoring cotton and soy crops; Menendez-Aponte et al. [128] suggested a robot for strawberry field scouting; and ByeLab [129,130,131,132] is a robot designed to check plant volume and health in orchards. Commercially accessible robotic platforms have lately become available [133,134].
Mechanical and chemical weed eradication is the second-most popular use for mobile robots in agriculture. Researchers are primarily interested in systems that use a vision system to detect and classify weeds before removing them with a precision mechanical actuator or a localised chemical application. A decade ago, Bakker et al. [135] proposed a multipurpose research vehicle with a diesel engine, hydraulic transmission, four-wheel drive, and four-wheel steering to test one of the first autonomous weeding robots. Bawden et al. [136] used a vision-based online algorithm to develop a robot that eliminates weeds chemically or mechanically depending on the species, with 92.3% correct weed detection. Utstumo et al. [137] developed a three-wheeled robotic platform for chemical weeding in indoor carrot fields that significantly reduced herbicide consumption while avoiding chemical overloads that might otherwise damage the crops. Furthermore, autonomous weeding robots appear to be of special relevance to the industrial sector. Oz [138], DINO [139], and TED [140] are three distinct weeding robot models made by Naio Technologies. Carré built Anatis [141] to mechanically eliminate weeds without the need for a detection system, a self-contained solution closer to traditional agricultural gear, with a three-point hitch for mounting weeding equipment. AVO by Ecorobotix [142], on the other hand, is an autonomous robot with a large solar panel that sprays herbicide accurately to eradicate weeds using vision-based detection.
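Vision-based weeding pipelines such as those above typically begin by segmenting green vegetation from soil before classifying weed versus crop. A classic colour-based baseline for that first step is the Excess Green (ExG) index; the sketch below is a generic illustration of this common technique, not the algorithm of any cited system, and the pixel values are toy numbers.

```python
import numpy as np

def excess_green(rgb: np.ndarray) -> np.ndarray:
    """Excess Green index ExG = 2g - r - b on chromaticity-normalised channels.

    A classic colour feature for separating green vegetation from soil in
    field images; pixels with high ExG are likely plant material.
    """
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=-1, keepdims=True) + 1e-9
    r, g, b = np.moveaxis(rgb / total, -1, 0)  # chromaticity coordinates
    return 2.0 * g - r - b

# Toy 1x2 image: one green leaf pixel and one brown soil pixel (8-bit RGB).
pixels = np.array([[[40, 180, 30], [120, 90, 60]]])
mask = excess_green(pixels) > 0.1  # vegetation mask prior to weed/crop classification
```

In practice, the resulting vegetation mask feeds a downstream classifier (e.g., the object-based or machine-learning stages mentioned above) that decides which plants to remove.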
Seeding, planting, and transplanting are three more important areas where agricultural robots may be used. For instance, Haibo et al. [143] conducted a study and an experimental campaign on a robot for wheat precision planting, achieving 90% accuracy. Ruangurai et al. [144] used an unusual three-wheel rice-seeding robot to attain comparable seeding accuracy. Hassan et al. [145] presented a low-cost modular robot with a custom-designed seeding mechanism, whereas Srinivasan et al. [146] created a modular tracked robot for the same task. Several authors used extremely basic and low-cost autonomous robotic platforms to build and test their precision seeding methods [147,148,149,150,151,152]. Mohammed and Jassim [153] developed and tested a robot that can seed, fertilise, and irrigate. Li et al. [154] created a unique solar- and wind-powered tumbleweed-inspired robotic seeder for desert locations. Iqbal et al. [155] created a robotic pepper transplanter that moves on rails for greenhouse production, while Liu et al. [156] created a crawler-type sweet potato transplanting robot. Rowbot [157], an autonomous seeding robot for row crops, is one of the few commercial alternatives available.
Robotic harvesters are another popular topic, with strong emphasis on vision-based crop-recognition systems, specialised manipulators, and appropriate mobile platforms. In 2006, Foglia and Reina [48] created and tested a pneumatic robotic arm for harvesting red radicchio, guided by one of the first vision-based systems able to accurately detect the plants. Tomato-harvesting robots have lately been proposed by several researchers [158,159], and there are many other harvesters for strawberries [160,161,162], sweet peppers [163,164,165], carrots and cantaloupes [166], lettuce [167], asparagus [168], apples [169], and fruits in general [170].
Spraying agrochemicals is another activity that relies largely on vision-based systems to apply just the right amount of chemical products and avoid dangerous spread. Chemical weeding robots fall under this category; however, spraying can also be used for other treatments, particularly in orchards. Cantelli et al. [171] provided numerous insights into designing a re-configurable robot for distributing plant protection products in greenhouses and the associated spraying management system. Danton et al. [172] followed the development of a robot for vineyards and orchards that can execute autonomous spraying operations to treat vegetation while minimising the spread of contaminant substances by using an array of independent sprayers. Terra et al. [173] took an alternative approach, developing an autonomous sprayer, towed by a regular tractor, that applies only the needed quantity of pesticides. Some agricultural robots are not focused on a single duty but can complete a variety of them owing to modular tools. Amrita et al. [174] built a robot that can plough, sow, pick crops, and spray pesticides automatically. Thorvald II, a modular robot that can be customised depending on the environment and type of crop, was introduced by Grimstad and From [175]. Some robotic solutions based on traditional tractors that can be equipped with standard agricultural tools have also been developed by industrial producers, particularly agricultural machinery brands [176,177,178].
The great adaptability of robotic platforms has resulted in more one-of-a-kind solutions for unusual tasks. Bug Vacuum [179], a pest control robot that eliminates insects by vacuuming them up, was commercialised by Agrobot. Williams et al. [180] tested a unique kiwifruit pollination robot, Galati et al. [181] built a tracked robot to compress flax fibres, and Loukatos et al. [182] constructed a robot that works alongside farmers during harvesting as an autonomous carrier of fruits and vegetables.

4.2.2. Agricultural Robots and Vineyards

Among all the crops that may be grown, those with the highest market values appear to be the ones where robotics is used the most. Vineyards, in particular, appear to be attracting many agricultural robot concepts. A selective pesticide sprayer built by Oberti et al. [183], a semi-autonomous sprayer designed by Adamides et al. [184], and the European project Rovitis4.0 [185] are all examples of specialised agricultural robots. There are prototypes dedicated to phenotyping [186] and others meant for mechanical weeding tasks [187], but there are also robots built for pruning [188] and shoot thinning [189], as well as multi-purpose robots [190].

4.2.3. Agricultural Robots Classified by Size

When agricultural robots are classified by size, numerous patterns emerge in terms of capability, design, and characteristics. The vast majority of precision agriculture UGVs are small, electrically driven robots that are primarily used for crop and/or soil monitoring. Because of their small size and limited power, they are rarely used for activities other than remote and proximal sensing. On the other hand, they require less investment, are more agile, do not have particular energy supply issues, and can be adapted to a wide range of crops.
At the opposite end of the size spectrum are robots with the size and power of traditional agricultural machinery and tractors. Here, the most widely recognised design trend, especially among major agricultural equipment manufacturers, is to incorporate autonomous features into current agricultural vehicle architectures (e.g., tractors, combine harvesters) rather than developing wholly new designs [95,191]. Despite the advantages of farmers' familiarity with similar machinery and the adaptability of tractor implements (i.e., an autonomous tractor with a typical three-point hitch, or a similar interface, can easily employ the traditional agricultural machinery usually towed and powered by conventional tractors), these systems are expensive and work conveniently mostly in wide-open fields.
Medium-sized robots offer the best of both worlds, as well as the widest range of designs. Robots in this group can generally monitor the field using remote and proximal sensing, but they can also perform tasks in the field. As previously noted, many robots are highly specialised for a very small range of tasks, so the demands of the job drive their design. Other robots, however, are versatile platforms that can complete many tasks, resulting in designs that are more adaptable and modular.
Small-to-medium-sized multipurpose systems that can perform several tasks are seen as a viable alternative. Indeed, they have the potential to lower the required economic investment [192] and to promote a new production paradigm (especially in developing countries), one that employs profitable technology with significant environmental and social advantages [13].

4.2.4. Agricultural Robots by Mobility Layout Configurations

Fue et al. [193] and Oliveira et al. [194] published reports on agricultural robots that quantify the most frequently used mobility layouts. According to Oliveira et al. [194], 63% of agricultural robots are four-wheel-drive or four-steering-wheel robots, with the latter being favoured where tight manoeuvrability is required. All other layouts (such as tracked, legged, rolling, etc.) have far lower adoption rates. Fue et al. [193] also mentioned the widespread use of robots running on tracks in greenhouses or, more generally, in highly structured environments. Both reviews discussed legged robots and reached similar conclusions: although such robots are very light and thrive on extremely impervious terrain, their feet must be designed adequately to minimise soil damage from penetration and to prevent the robot sinking due to the high pressure in the small contact areas.
Vidoni et al. [195] used an ad hoc simulator to compare mobile robotic platform configurations in order to determine which mobility architecture is best for agricultural activities on steep slopes, embankments, or hills. According to the researchers, a tracked vehicle can handle the steepest inclines while maintaining high manoeuvrability, enhancing traction, and decreasing soil compaction. Nonetheless, track slippage is unavoidable, and it causes substantial soil degradation that can lead to erosion, landslips, and water exposure [196,197,198,199,200,201,202]. Wheeled layouts offer a number of benefits, including increased efficiency and maximum speed. However, when a wheeled platform encounters especially loose or rough terrain, its efficiency plummets, and the vehicle may become stuck. Among the wheeled alternatives examined by Vidoni et al. [195], the three-wheel arrangement was the most nimble and simple, although it overturned readily on steep inclines. Four-wheel platforms, on the other hand, preserve the benefits of wheeled mobility while providing substantially superior stability on severe inclines. Furthermore, according to the authors, articulated wheeled vehicles proved the best suited for uneven and side-slope terrains due to their greater steering capability, agility, and stability.
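The soil-compaction argument running through this section reduces to a simple relation: mean ground pressure is weight divided by contact area, p = W/A. The back-of-envelope sketch below compares three layouts for the same hypothetical 400 kg robot; all dimensions are illustrative numbers chosen for the example, not values from the cited studies.

```python
# Mean ground pressure p = W / A for three mobility layouts of the same
# hypothetical 400 kg robot (all contact dimensions are illustrative).
G = 9.81  # gravitational acceleration, m/s^2

def ground_pressure_kpa(mass_kg: float, contact_area_m2: float) -> float:
    """Mean contact pressure in kPa exerted by a vehicle of given mass."""
    return mass_kg * G / contact_area_m2 / 1000.0

mass = 400.0
layouts = {
    "two tracks (0.35 m x 1.2 m each)": 2 * 0.35 * 1.2,      # 0.84 m^2
    "four wheels (0.15 m x 0.20 m patch each)": 4 * 0.15 * 0.20,  # 0.12 m^2
    "four legs (0.05 m x 0.08 m foot each)": 4 * 0.05 * 0.08,     # 0.016 m^2
}
pressures = {name: ground_pressure_kpa(mass, area) for name, area in layouts.items()}
# Tracks spread the same weight over ~0.84 m^2 (roughly 5 kPa), while small
# feet concentrate it on 0.016 m^2 (hundreds of kPa) -- the reason legged
# robots need carefully designed feet to avoid soil penetration and sinking.
```

The two orders of magnitude between the tracked and legged cases make concrete why contact geometry, not just vehicle mass, drives soil damage.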

5. Collaboration of Multiple Robots in Precision Agriculture

UAVs may be used to monitor large areas; however, their onboard sensors are frequently hampered by airspeed, altitude, and position uncertainty, restricting the accuracy of their remote sensing. UGVs, on the other hand, can monitor the environment and interact with the soil and crops in the field over a variety of terrains. UGVs can accurately interact with objects, but they lack the rapid mobility of UAVs, and their navigation is severely limited by obstacles, barriers, and field characteristics in general. The logical progression of these ideas is to establish a diverse collaboration of flying and ground robots to obtain more effective results [203,204,205]. Using a system made up of a team of numerous heterogeneous aerial and ground mobile robots, a farm operator may supervise the entire process and autonomously control the activities necessary to execute a difficult operation with several phases, or even an entire mission [206,207]. Figure 4 shows how a team of robotic agents can carry out PA operations over the course of a growing period.
Due to the potential advantages of robot collaboration in PA, much recent academic research has focused on integrating fleets of cooperating robots into agricultural tasks. The European Flourish project [208], for example, optimised conventional aerial and ground robots to improve PA tasks such as crop monitoring. Tokekar et al. [12] and Bhandari et al. [209] looked at how commercial and low-cost UAVs and UGVs may be used to gather data in agricultural fields. The goal of Gonzales de Santos et al. [210] was to develop, deploy, and test a fleet of heterogeneous UGVs and UAVs to handle a variety of agricultural situations, such as weed and pest management, crop quality enhancement, and farm operator health and safety. Ribeiro et al. [206] created and tested a team of airborne and ground autonomous robots that coordinated efficiently to administer precise treatments to crops as part of the European RHEA project. Davoodi et al. [11] examined a team made up entirely of strategically positioned UGVs, allowing for maximum coverage of the whole field and monitoring of the MZs. Beyond open fields and orchards, some work has also addressed fleets of robots operating in greenhouses [211].
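A core sub-problem in the multi-UGV coverage work discussed above is dividing the field among the team members. A minimal static version of this (far simpler than the coverage-control formulations in the cited studies, and purely illustrative) assigns contiguous blocks of crop rows to each robot so workloads are balanced:

```python
def partition_rows(num_rows: int, num_robots: int) -> list:
    """Assign crop-row indices to robots in contiguous blocks so that each
    robot covers a near-equal share of the field (static coverage split)."""
    base, extra = divmod(num_rows, num_robots)
    assignments, start = [], 0
    for r in range(num_robots):
        count = base + (1 if r < extra else 0)  # first `extra` robots take one more row
        assignments.append(list(range(start, start + count)))
        start += count
    return assignments

# A field of 10 rows shared by 3 UGVs -> contiguous blocks of 4, 3, and 3 rows.
plan = partition_rows(10, 3)
```

Dynamic coverage control, as in Davoodi et al. [11], additionally adapts these assignments online to robot positions and MZ priorities.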

6. Case Study: Agri.Q

Almost all available robotic solutions for agriculture, except for a few small-scale prototypes used only for monitoring purposes, target flat fields or fields without significantly steep and impervious terrain. In Italy, however, substantial regions of agricultural production lie in hilly or mountainous areas, particularly where orchards, olive groves, and vineyards are concerned [212]. This situation may become more common in the future as climate change pushes cultivation to higher zones to meet temperature needs and maintain product quality [213]. Robotic platforms in these areas must therefore deal with a variety of challenges, such as traction concerns, wheel slippage, tight spaces between rows, instabilities caused by terrain irregularities and changing slopes, and poor GPS signal reception, and dedicated awareness (location, terrain, environment) systems and architectures must be built. Furthermore, activities in steep regions are carried out either manually or with small, compact vehicles, exposing the driver to dangerous conditions. For the latter, overturning incidents in Italy occur almost exclusively in hilly/mountainous locations, with a mean incidence of more than four accidents per 100,000 working hours [214]; field activities account for 68% of these. Such risks could be reduced or eliminated if autonomous and teleoperated systems were used. It is also worth noting that orchard and vineyard activities on steep slopes, terraces, and embankments often require repetitive tasks and the physical, non-ergonomic transport of considerable loads. This, together with the ageing of the population and the resulting shortage of competent workers, poses an issue that will need to be addressed soon.
The authors have developed Agri.Q, an innovative UGV (Figure 5) [215,216,217,218,219]. The rover is specifically developed for precision agriculture applications in vineyards and can function in unstructured environments on uneven terrain, collaborating with drones if necessary. It is outfitted with various instruments and sensors to perform specific activities, such as field mapping and crop monitoring. Thanks to a redundant seven-degrees-of-freedom (DOF) collaborative robot arm, the rover can also interact with the environment, for example to collect soil, leaf, and grape samples. Furthermore, it features a two-DOF photovoltaic (PV) panel that can self-orient, either to ensure a secure, always-flat drone landing platform or to maximise solar energy collection during the auto-charging phase. The robotic arm is fixed on this platform, so the robot's manipulation workspace can be dynamically adapted to different tasks and scenarios.
Agri.Q is an articulated robot made of two skid-steering modules; each module has two locomotion units, each driving two tyres by means of a chain drive. This architecture combines efficiency comparable to more traditional wheeled systems with an effective distribution of pressure on the ground similar to that of tracked vehicles. This last point, together with the robot's low weight of around 110 kg, allows it to move nimbly over rough and soft terrain, minimising soil degradation. The robot has been designed mainly for proximal soil and vine monitoring and sample collection, while a flying drone can use Agri.Q as a base whenever its remote sensing capabilities are not required. Its battery lasts about 3–4 h, depending mostly on terrain conditions, but its 2 × 1 m orientable PV panel can double the robot's autonomy if missions are planned properly, taking sun exposure and recharge phases into account.
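The review does not give Agri.Q's control equations, but the behaviour of a single skid-steering module can be sketched with the standard idealised (no-slip) kinematic model below; the 0.8 m track width is a hypothetical value for illustration only. Real skid steering necessarily involves lateral slip, which is exactly the soil-degradation mechanism discussed in Section 4.2.4.

```python
import math

def skid_steer_twist(v_left: float, v_right: float, track_width: float):
    """Idealised (no-slip) skid-steering kinematics for one module:
    returns forward speed v and yaw rate w from left/right side speeds."""
    v = (v_left + v_right) / 2.0          # body forward speed, m/s
    w = (v_right - v_left) / track_width  # yaw rate, rad/s
    return v, w

# Hypothetical module: 0.8 m between the left and right wheel pairs.
v, w = skid_steer_twist(v_left=0.4, v_right=0.6, track_width=0.8)
radius = v / w if w else math.inf  # instantaneous turning radius of the module centre
```

With these toy inputs the module advances at 0.5 m/s while turning about a point roughly 2 m to its left; commanding opposite side speeds (v_left = -v_right) gives a turn in place, the manoeuvre that stresses the soil most.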

7. Conclusions

This paper summarises the current state of the art in precision agriculture using mobile robots. After a brief introduction to precision agriculture, in which the elements of a precision agriculture system are analysed for their contribution to agricultural efficiency and sustainability, the review focuses on two primary aspects. First, state-of-the-art sensor and monitoring technologies are described and classified into two major categories. Second, the most prominent current robotic solutions in precision agriculture are illustrated, with particular regard to land-based robots, and some trends in their designs and features are discussed. Finally, the mobile robot for precision agriculture in vineyards named Agri.Q, developed by the authors, is presented as a valuable solution for fields characterised by significant inclines.
According to this study, the most researched perception solutions are those based on vision and point cloud sensors, often combined with machine learning approaches to interpret the collected data. Drones are the preferred robotic solution when large fields must be monitored, but nearly all of them are conventional drones carrying sensor payloads. Land-based robots, instead, show more unique designs shaped by their required tasks. Again, most land-based robotic solutions are small robots dedicated solely to monitoring activities. Other agricultural tasks, however, are becoming increasingly automated, particularly in orchards, vineyards, and other high-value crops.

Author Contributions

Conceptualisation, A.B. and P.C.; methodology, A.B.; investigation, A.B., P.C., L.B., G.C. and L.T.; resources, G.Q.; data curation, A.B.; writing—original draft preparation, A.B.; writing—review and editing, P.C., L.B., G.C., L.T. and G.Q.; visualisation, A.B.; supervision, G.Q.; project administration, G.Q.; funding acquisition, G.Q. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Sylvester, G. E-Agriculture in Action: Drones for Agriculture; FAO: Bangkok, Thailand, 2018. [Google Scholar]
  2. Crist, E.; Mora, C.; Engelman, R. The interaction of human population, food production, and biodiversity protection. Science 2017, 356, 260–264. [Google Scholar] [CrossRef]
  3. Ramankutty, N.; Mehrabi, Z.; Waha, K.; Jarvis, L.; Kremen, C.; Herrero, M.; Rieseberg, L.H. Trends in Global Agricultural Land Use: Implications for Environmental Health and Food Security. Annu. Rev. Plant Biol. 2018, 69, 789–815. [Google Scholar] [CrossRef][Green Version]
  4. Srinivasan, A. Handbook of Precision Agriculture: Principles and Applications; CRC Press: Boca Raton, FL, USA, 2006. [Google Scholar] [CrossRef]
  5. Corti, M.; Marino Gallina, P.; Cavalli, D.; Ortuani, B.; Cabassi, G.; Cola, G.; Vigoni, A.; Degano, L.; Bregaglio, S. Evaluation of In-Season Management Zones from High-Resolution Soil and Plant Sensors. Agronomy 2020, 10, 1124. [Google Scholar] [CrossRef]
  6. Griffin, T.W.; Yeager, E.A. Adoption of precision agriculture technology: A duration analysis. In Proceedings of the 14th International Conference on Precision Agriculture; International Society of Precision Agriculture: Monticello, IL, USA, 2018. [Google Scholar]
  7. Barrientos, A.; Colorado, J.; Cerro, J.d.; Martinez, A.; Rossi, C.; Sanz, D.; Valente, J. Aerial remote sensing in agriculture: A practical approach to area coverage and path planning for fleets of mini aerial robots. J. Field Robot. 2011, 28, 667–689. [Google Scholar] [CrossRef][Green Version]
  8. Beloev, I.; Kinaneva, D.; Georgiev, G.; Hristov, G.; Zahariev, P. Artificial Intelligence-Driven Autonomous Robot for Precision Agriculture. Acta Technol. Agric. 2021, 24, 48–54. [Google Scholar] [CrossRef]
  9. Pierce, F.J.; Nowak, P. Aspects of Precision Agriculture. In Advances in Agronomy; Academic Press: Cambridge, MA, USA, 1999; Volume 67, pp. 1–85. [Google Scholar] [CrossRef]
  10. Zhang, N.; Wang, M.; Wang, N. Precision agriculture—A worldwide overview. Comput. Electron. Agric. 2002, 36, 113–132. [Google Scholar] [CrossRef]
  11. Davoodi, M.; Mohammadpour Velni, J.; Li, C. Coverage control with multiple ground robots for precision agriculture. Mech. Eng. 2018, 140, S4–S8. [Google Scholar] [CrossRef][Green Version]
  12. Tokekar, P.; Hook, J.V.; Mulla, D.; Isler, V. Sensor Planning for a Symbiotic UAV and UGV System for Precision Agriculture. IEEE Trans. Robot. 2016, 32, 1498–1511. [Google Scholar] [CrossRef]
  13. Lowenberg-DeBoer, J.; Huang, I.Y.; Grigoriadis, V.; Blackmore, S. Economics of robots and automation in field crop production. Precis. Agric. 2020, 21, 278–299. [Google Scholar] [CrossRef][Green Version]
  14. Vecchio, Y.; Agnusdei, G.P.; Miglietta, P.P.; Capitanio, F. Adoption of precision farming tools: The case of Italian farmers. Int. J. Environ. Res. Public Health 2020, 17, 869. [Google Scholar] [CrossRef][Green Version]
  15. Batte, M.T.; Arnholt, M.W. Precision farming adoption and use in Ohio: Case studies of six leading-edge adopters. Comput. Electron. Agric. 2003, 38, 125–139. [Google Scholar] [CrossRef]
  16. Pierce, F.J.; Elliott, T.V. Regional and on-farm wireless sensor networks for agricultural systems in Eastern Washington. Comput. Electron. Agric. 2007, 61, 32–43. [Google Scholar] [CrossRef]
  17. Swinton, S.M.; Lowenberg-DeBoer, J. Evaluating the Profitability of Site-Specific Farming. J. Prod. Agric. 1998, 11, 439–446. [Google Scholar] [CrossRef]
  18. Fountas, S.; Blackmore, S.; Ess, D.; Hawkins, S.; Blumhoff, G.; Lowenberg-Deboer, J.; Sorensen, C.G. Farmer Experience with Precision Agriculture in Denmark and the US Eastern Corn Belt. Precis. Agric. 2005, 6, 121–141. [Google Scholar] [CrossRef]
  19. Ellis, K.; Baugher, T.A.; Lewis, K. Results from Survey Instruments Used to Assess Technology Adoption for Tree Fruit Production. HortTechnology 2010, 20, 1043–1048. [Google Scholar] [CrossRef][Green Version]
  20. Lamb, D.W.; Frazier, P.; Adams, P. Improving pathways to adoption: Putting the right P’s in precision agriculture. Comput. Electron. Agric. 2008, 61, 4–9. [Google Scholar] [CrossRef]
  21. United Nations. Sustainable Development Goals. 2016. Available online: https://www.un.org/development/desa/publications/sustainable-development-goals-report-2016.html (accessed on 13 April 2022).
  22. Gil, J.D.B.; Reidsma, P.; Giller, K.; Todman, L.; Whitmore, A.; van Ittersum, M. Sustainable development goal 2: Improved targets and indicators for agriculture and food security. Ambio 2019, 48, 685–698. [Google Scholar] [CrossRef][Green Version]
  23. Vecchio, Y.; De Rosa, M.; Adinolfi, F.; Bartoli, L.; Masi, M. Adoption of precision farming tools: A context-related analysis. Land Use Policy 2020, 94, 104481. [Google Scholar] [CrossRef]
  24. United Nations. SDG 2: Zero Hunger. 2016. Available online: https://www.un.org/sustainabledevelopment/hunger/ (accessed on 13 April 2022).
  25. United Nations. SDG 6: Clean Water and Sanitation. 2016. Available online: https://ec.europa.eu/eurostat/statistics-explained/index.php?title=SDG_6_-_Clean_water_and_sanitation (accessed on 13 April 2022).
  26. United Nations. SDG 12: Responsible Consumption and Production. 2016. Available online: https://ec.europa.eu/eurostat/statistics-explained/index.php?title=SDG_12_-_Responsible_consumption_and_production_(statistical_annex)&oldid=363908 (accessed on 13 April 2022).
  27. United Nations. SDG 13: Climate Action. 2016. Available online: https://www.un.org/sustainabledevelopment/climate-action/ (accessed on 13 April 2022).
  28. United Nations. SDG 15: Life on Land. 2016. Available online: https://ec.europa.eu/eurostat/statistics-explained/index.php?title=SDG_15_-_Life_on_land (accessed on 13 April 2022).
  29. Long, D.S.; Carlson, G.R.; DeGloria, S.D. Quality of Field Management Maps. In Site-Specific Management for Agricultural Systems; John Wiley & Sons, Ltd.: Hoboken, NJ, USA, 1995; pp. 251–271. [Google Scholar] [CrossRef]
  30. Nawar, S.; Corstanje, R.; Halcro, G.; Mulla, D.; Mouazen, A.M. Chapter Four—Delineation of Soil Management Zones for Variable-Rate Fertilization: A Review. In Advances in Agronomy; Academic Press: Cambridge, MA, USA, 2017; Volume 143, pp. 175–245. [Google Scholar] [CrossRef]
  31. Auat Cheein, F.A.; Carelli, R. Agricultural Robotics: Unmanned Robotic Service Units in Agricultural Tasks. IEEE Ind. Electron. Mag. 2013, 7, 48–58. [Google Scholar] [CrossRef]
  32. Maes, W.H.; Steppe, K. Perspectives for Remote Sensing with Unmanned Aerial Vehicles in Precision Agriculture. Trends Plant Sci. 2019, 24, 152–164. [Google Scholar] [CrossRef]
  33. Radoglou-Grammatikis, P.; Sarigiannidis, P.; Lagkas, T.; Moscholios, I. A compilation of UAV applications for precision agriculture. Comput. Netw. 2020, 172, 107148. [Google Scholar] [CrossRef]
  34. Rosell, J.R.; Sanz, R. A review of methods and applications of the geometric characterization of tree crops in agricultural activities. Comput. Electron. Agric. 2012, 81, 124–141. [Google Scholar] [CrossRef][Green Version]
  35. Bac, C.W.; van Henten, E.J.; Hemming, J.; Edan, Y. Harvesting Robots for High-value Crops: State-of-the-art Review and Challenges Ahead. J. Field Robot. 2014, 31, 888–911. [Google Scholar] [CrossRef]
  36. Gongal, A.; Amatya, S.; Karkee, M.; Zhang, Q.; Lewis, K. Sensors and systems for fruit detection and localization: A review. Comput. Electron. Agric. 2015, 116, 8–19. [Google Scholar] [CrossRef]
  37. Vázquez-Arellano, M.; Griepentrog, H.W.; Reiser, D.; Paraforos, D.S. 3-D Imaging Systems for Agricultural Applications—A Review. Sensors 2016, 16, 618. [Google Scholar] [CrossRef][Green Version]
  38. Yandun Narvaez, F.; Reina, G.; Torres-Torriti, M.; Kantor, G.; Cheein, F.A. A Survey of Ranging and Imaging Techniques for Precision Agriculture Phenotyping. IEEE/ASME Trans. Mechatronics 2017, 22, 2428–2439. [Google Scholar] [CrossRef]
  39. Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A review on UAV-based applications for precision agriculture. Information 2019, 10, 349. [Google Scholar] [CrossRef][Green Version]
  40. Chen, Y.; Zhu, H.; Ozkan, H.E. Development of variable-rate sprayer with laser scanning sensor to synchronize spray outputs to tree structures. Trans. ASABE 2012, 55, 773–781. [Google Scholar] [CrossRef][Green Version]
  41. Escolà, A.; Rosell-Polo, J.R.; Planas, S.; Gil, E.; Pomar, J.; Camp, F.; Llorens, J.; Solanelles, F. Variable rate sprayer. Part 1—Orchard prototype: Design, implementation and validation. Comput. Electron. Agric. 2013, 95, 122–135. [Google Scholar] [CrossRef][Green Version]
  42. Mora, M.; Avila, F.; Carrasco-Benavides, M.; Maldonado, G.; Olguín-Cáceres, J.; Fuentes, S. Automated computation of leaf area index from fruit trees using improved image processing algorithms applied to canopy cover digital photograpies. Comput. Electron. Agric. 2016, 123, 195–202. [Google Scholar] [CrossRef][Green Version]
  43. Eitel, J.U.H.; Magney, T.S.; Vierling, L.A.; Brown, T.T.; Huggins, D.R. LiDAR based biomass and crop nitrogen estimates for rapid, non-destructive assessment of wheat nitrogen status. Field Crop. Res. 2014, 159, 21–32. [Google Scholar] [CrossRef]
  44. Li, W.; Niu, Z.; Chen, H.; Li, D.; Wu, M.; Zhao, W. Remote estimation of canopy height and aboveground biomass of maize using high-resolution stereo images from a low-cost unmanned aerial vehicle system. Ecol. Indic. 2016, 67, 637–648. [Google Scholar] [CrossRef]
  45. Kankare, V.; Holopainen, M.; Vastaranta, M.; Puttonen, E.; Yu, X.; Hyyppä, J.; Vaaja, M.; Hyyppä, H.; Alho, P. Individual tree biomass estimation using terrestrial laser scanning. ISPRS J. Photogramm. Remote Sens. 2013, 75, 64–75. [Google Scholar] [CrossRef]
  46. Ji, W.; Zhao, D.; Cheng, F.; Xu, B.; Zhang, Y.; Wang, J. Automatic recognition vision system guided for apple harvesting robot. Comput. Electr. Eng. 2012, 38, 1186–1195. [Google Scholar] [CrossRef]
  47. Gongal, A.; Silwal, A.; Amatya, S.; Karkee, M.; Zhang, Q.; Lewis, K. Apple crop-load estimation with over-the-row machine vision system. Comput. Electron. Agric. 2016, 120, 26–35. [Google Scholar] [CrossRef]
  48. Foglia, M.M.; Reina, G. Agricultural robot for radicchio harvesting. J. Field Robot. 2006, 23, 363–377. [Google Scholar] [CrossRef]
  49. Karkee, M.; Adhikari, B.; Amatya, S.; Zhang, Q. Identification of pruning branches in tall spindle apple trees for automated pruning. Comput. Electron. Agric. 2014, 103, 127–135. [Google Scholar] [CrossRef]
  50. Jones, H.G.; Serraj, R.; Loveys, B.R.; Xiong, L.; Wheaton, A.; Price, A.H.; Jones, H.G.; Serraj, R.; Loveys, B.R.; Xiong, L.; et al. Thermal infrared imaging of crop canopies for the remote diagnosis and quantification of plant responses to water stress in the field. Funct. Plant Biol. 2009, 36, 978–989. [Google Scholar] [CrossRef][Green Version]
  51. Du, L.; Gong, W.; Shi, S.; Yang, J.; Sun, J.; Zhu, B.; Song, S. Estimation of rice leaf nitrogen contents based on hyperspectral LIDAR. Int. J. Appl. Earth Obs. Geoinf. 2016, 44, 136–143. [Google Scholar] [CrossRef]
  52. Weerakkody, W.A.P.; Suriyagoda, L.D.B. Estimation of leaf and canopy photosynthesis of pot chrysanthemum and its implication on intensive canopy management. Sci. Hortic. 2015, 192, 237–243. [Google Scholar] [CrossRef]
  53. Zhao, C.; Lee, W.S.; He, D. Immature green citrus detection based on colour feature and sum of absolute transformed difference (SATD) using colour images in the citrus grove. Comput. Electron. Agric. 2016, 124, 243–253. [Google Scholar] [CrossRef]
  54. Nuske, S.; Wilshusen, K.; Achar, S.; Yoder, L.; Narasimhan, S.; Singh, S. Automated Visual Yield Estimation in Vineyards. J. Field Robot. 2014, 31, 837–860. [Google Scholar] [CrossRef]
  55. Barbedo, J.G.A.; Koenigkan, L.V.; Santos, T.T. Identifying multiple plant diseases using digital image processing. Biosyst. Eng. 2016, 147, 104–116. [Google Scholar] [CrossRef]
  56. Nandi, C.S.; Tudu, B.; Koley, C. A Machine Vision-Based Maturity Prediction System for Sorting of Harvested Mangoes. IEEE Trans. Instrum. Meas. 2014, 63, 1722–1730. [Google Scholar] [CrossRef]
  57. Wang, Q.; Nuske, S.; Bergerman, M.; Singh, S. Automated Crop Yield Estimation for Apple Orchards. In Experimental Robotics: The 13th International Symposium on Experimental Robotics, Quebec City, Canada, 18–21 June 2012; Springer Tracts in Advanced Robotics; Springer International Publishing: Berlin/Heidelberg, Germany, 2013; pp. 745–758. [Google Scholar] [CrossRef]
  58. Baluja, J.; Diago, M.P.; Balda, P.; Zorer, R.; Meggio, F.; Morales, F.; Tardaguila, J. Assessment of vineyard water status variability by thermal and multispectral imagery using an unmanned aerial vehicle (UAV). Irrig. Sci. 2012, 30, 511–522. [Google Scholar] [CrossRef]
  59. Wachs, J.P.; Stern, H.I.; Burks, T.; Alchanatis, V. Low and high-level visual feature-based apple detection from multi-modal images. Precis. Agric. 2010, 11, 717–735. [Google Scholar] [CrossRef]
  60. Reina, G.; Milella, A.; Rouveure, R.; Nielsen, M.; Worst, R.; Blas, M.R. Ambient awareness for agricultural robotic vehicles. Biosyst. Eng. 2016, 146, 114–132. [Google Scholar] [CrossRef]
  61. Li, D.; Xu, L.; Tan, C.; Goodman, E.D.; Fu, D.; Xin, L. Digitization and Visualization of Greenhouse Tomato Plants in Indoor Environments. Sensors 2015, 15, 4019–4051. [Google Scholar] [CrossRef][Green Version]
  62. Chéné, Y.; Rousseau, D.; Lucidarme, P.; Bertheloot, J.; Caffier, V.; Morel, P. On the use of depth camera for 3D phenotyping of entire plants. Comput. Electron. Agric. 2012, 82, 122–127. [Google Scholar] [CrossRef]
  63. Wang, W.; Li, C. Size estimation of sweet onions using consumer-grade RGB-depth sensor. J. Food Eng. 2014, 142, 153–162. [Google Scholar] [CrossRef]
  64. Andújar, D.; Ribeiro, A.; Fernández-Quintanilla, C.; Dorado, J. Using depth cameras to extract structural parameters to assess the growth state and yield of cauliflower crops. Comput. Electron. Agric. 2016, 122, 67–73. [Google Scholar] [CrossRef]
  65. Mulla, D.J. Twenty five years of remote sensing in precision agriculture: Key advances and remaining knowledge gaps. Biosyst. Eng. 2013, 114, 358–371. [Google Scholar] [CrossRef]
  66. Hillnhütter, C.; Mahlein, A.K.; Sikora, R.A.; Oerke, E.C. Remote sensing to detect plant stress induced by Heterodera schachtii and Rhizoctonia solani in sugar beet fields. Field Crop. Res. 2011, 122, 70–77. [Google Scholar] [CrossRef]
  67. Zarco-Tejada, P.J.; González-Dugo, V.; Berni, J.A.J. Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera. Remote Sens. Environ. 2012, 117, 322–337. [Google Scholar] [CrossRef]
  68. Torres-Sánchez, J.; López-Granados, F.; Castro, A.I.D.; Peña-Barragán, J.M. Configuration and Specifications of an Unmanned Aerial Vehicle (UAV) for Early Site Specific Weed Management. PLoS ONE 2013, 8, e58210. [Google Scholar] [CrossRef][Green Version]
  69. Zaman-Allah, M.; Vergara, O.; Araus, J.L.; Tarekegne, A.; Magorokosho, C.; Zarco-Tejada, P.J.; Hornero, A.; Albà, A.H.; Das, B.; Craufurd, P.; et al. Unmanned aerial platform-based multi-spectral imaging for field phenotyping of maize. Plant Methods 2015, 11, 35. [Google Scholar] [CrossRef][Green Version]
  70. AghaKouchak, A.; Farahmand, A.; Melton, F.S.; Teixeira, J.; Anderson, M.C.; Wardlow, B.D.; Hain, C.R. Remote sensing of drought: Progress, challenges and opportunities. Rev. Geophys. 2015, 53, 452–480. [Google Scholar] [CrossRef][Green Version]
  71. Palleja, T.; Landers, A.J. Real time canopy density estimation using ultrasonic envelope signals in the orchard and vineyard. Comput. Electron. Agric. 2015, 115, 108–117. [Google Scholar] [CrossRef]
  72. Escolà, A.; Planas, S.; Rosell, J.R.; Pomar, J.; Camp, F.; Solanelles, F.; Gracia, F.; Llorens, J.; Gil, E. Performance of an Ultrasonic Ranging Sensor in Apple Tree Canopies. Sensors 2011, 11, 2459–2477. [Google Scholar] [CrossRef][Green Version]
  73. Alenyà, G.; Dellen, B.; Torras, C. 3D modelling of leaves from color and ToF data for robotized plant measuring. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; pp. 3408–3414. [Google Scholar] [CrossRef]
  74. Chaivivatrakul, S.; Tang, L.; Dailey, M.N.; Nakarmi, A.D. Automatic morphological trait characterization for corn plants via 3D holographic reconstruction. Comput. Electron. Agric. 2014, 109, 109–123. [Google Scholar] [CrossRef][Green Version]
  75. Kazmi, W.; Foix, S.; Alenyà, G. Plant leaf imaging using time of flight camera under sunlight, shadow and room conditions. In Proceedings of the 2012 IEEE International Symposium on Robotic and Sensors Environments Proceedings, Magdeburg, Germany, 16–18 November 2012; pp. 192–197. [Google Scholar] [CrossRef][Green Version]
  76. Vitzrabin, E.; Edan, Y. Adaptive thresholding with fusion using a RGBD sensor for red sweet-pepper detection. Biosyst. Eng. 2016, 146, 45–56. [Google Scholar] [CrossRef]
  77. Elfiky, N.M.; Akbar, S.A.; Sun, J.; Park, J.; Kak, A. Automation of Dormant Pruning in Specialty Crop Production: An Adaptive Framework for Automatic Reconstruction and Modeling of Apple Trees. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, Boston, MA, USA, 7–12 June 2015; pp. 65–73. [Google Scholar]
  78. Sanz-Cortiella, R.; Llorens-Calveras, J.; Escolà, A.; Arnó-Satorra, J.; Ribes-Dasi, M.; Masip-Vilalta, J.; Camp, F.; Gràcia-Aguilá, F.; Solanelles-Batlle, F.; Planas-DeMartí, S.; et al. Innovative LIDAR 3D Dynamic Measurement System to Estimate Fruit-Tree Leaf Area. Sensors 2011, 11, 5769–5791. [Google Scholar] [CrossRef] [PubMed]
  79. Pueschel, P.; Newnham, G.; Hill, J. Retrieval of Gap Fraction and Effective Plant Area Index from Phase-Shift Terrestrial Laser Scans. Remote Sens. 2014, 6, 2601–2627. [Google Scholar] [CrossRef][Green Version]
  80. Koenig, K.; Höfle, B.; Hämmerle, M.; Jarmer, T.; Siegmann, B.; Lilienthal, H. Comparative classification analysis of post-harvest growth detection from terrestrial LiDAR point clouds in precision agriculture. ISPRS J. Photogramm. Remote Sens. 2015, 104, 112–125. [Google Scholar] [CrossRef]
  81. Fieber, K.D.; Davenport, I.J.; Ferryman, J.M.; Gurney, R.J.; Walker, J.P.; Hacker, J.M. Analysis of full-waveform LiDAR data for classification of an orange orchard scene. ISPRS J. Photogramm. Remote Sens. 2013, 82, 63–82. [Google Scholar] [CrossRef][Green Version]
  82. Allouis, T.; Durrieu, S.; Véga, C.; Couteron, P. Stem Volume and Above-Ground Biomass Estimation of Individual Pine Trees From LiDAR Data: Contribution of Full-Waveform Signals. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2013, 6, 924–934. [Google Scholar] [CrossRef]
  83. Auat Cheein, F.A.; Guivant, J.; Sanz, R.; Escolà, A.; Yandún, F.; Torres-Torriti, M.; Rosell-Polo, J.R. Real-time approaches for characterization of fully and partially scanned canopies in groves. Comput. Electron. Agric. 2015, 118, 361–371. [Google Scholar] [CrossRef][Green Version]
  84. Lin, Y. LiDAR: An important tool for next-generation phenotyping technology of high potential for plant phenomics? Comput. Electron. Agric. 2015, 119, 61–73. [Google Scholar] [CrossRef]
  85. Livny, Y.; Yan, F.; Olson, M.; Chen, B.; Zhang, H.; El-Sana, J. Automatic reconstruction of tree skeletal structures from point clouds. In Proceedings of the ACM SIGGRAPH Asia 2010, Seoul, Korea, 15–18 December 2010; Association for Computing Machinery: New York, NY, USA, 2010. SIGGRAPH ASIA’10. pp. 1–8. [Google Scholar] [CrossRef][Green Version]
  86. Rossel, R.V.; Adamchuk, V.; Sudduth, K.; McKenzie, N.; Lobsey, C. Proximal soil sensing: An effective approach for soil measurements in space and time. Adv. Agron. 2011, 113, 243–291. [Google Scholar]
  87. Allred, B.; Daniels, J.J.; Ehsani, M.R. Handbook of Agricultural Geophysics; CRC Press: Boca Raton, FL, USA, 2008. [Google Scholar]
  88. Zajícová, K.; Chuman, T. Application of ground penetrating radar methods in soil studies: A review. Geoderma 2019, 343, 116–129. [Google Scholar] [CrossRef]
  89. Klotzsche, A.; Jonard, F.; Looms, M.C.; van der Kruk, J.; Huisman, J.A. Measuring soil water content with ground penetrating radar: A decade of progress. Vadose Zone J. 2018, 17, 1–9. [Google Scholar] [CrossRef][Green Version]
  90. Doolittle, J.A.; Brevik, E.C. The use of electromagnetic induction techniques in soils studies. Geoderma 2014, 223, 33–45. [Google Scholar] [CrossRef][Green Version]
  91. Samouëlian, A.; Cousin, I.; Tabbagh, A.; Bruand, A.; Richard, G. Electrical resistivity survey in soil science: A review. Soil Tillage Res. 2005, 83, 173–193. [Google Scholar] [CrossRef][Green Version]
  92. Adamchuk, V.; Allred, B.; Doolittle, J.; Grote, K.; Rossel, R.; Ditzler, C.; West, L. Tools for proximal soil sensing. In Soil Survey Manual. Natural Resources Conservation Service. US Department of Agriculture Handbook; USDA: Washington, DC, USA, 2015. [Google Scholar]
  93. Nof, S.Y. Springer Handbook of Automation; Springer: Berlin/Heidelberg, Germany, 2009. [Google Scholar]
  94. Zhang, Q. Opportunity of robotics in specialty crop production. IFAC Proc. Vol. 2013, 46, 38–39. [Google Scholar] [CrossRef]
  95. Schueller, J.K. CIGR handbook of agricultural engineering. Inf. Technol. 2006, 5, 330. [Google Scholar]
  96. Nagasaka, Y.; Umeda, N.; Kanetai, Y.; Taniwaki, K.; Sasaki, Y. Autonomous guidance for rice transplanting using global positioning and gyroscopes. Comput. Electron. Agric. 2004, 43, 223–234. [Google Scholar] [CrossRef]
  97. Xia, C.; Wang, L.; Chung, B.K.; Lee, J.M. In Situ 3D Segmentation of Individual Plant Leaves Using a RGB-D Camera for Agricultural Automation. Sensors 2015, 15, 20463–20479. [Google Scholar] [CrossRef]
  98. Blackmore, S. Towards robotic agriculture. In Proceedings of the Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, Baltimore, MD, USA, 18–19 April 2016; Volume 9866, pp. 8–15. [Google Scholar] [CrossRef]
  99. Cariou, C.; Lenain, R.; Thuilot, B.; Berducat, M. Automatic guidance of a four-wheel-steering mobile robot for accurate field operations. J. Field Robot. 2009, 26, 504–518. [Google Scholar] [CrossRef]
  100. Bergerman, M.; Billingsley, J.; Reid, J.; van Henten, E. Robotics in Agriculture and Forestry. In Springer Handbook of Robotics; Springer Handbooks; Springer International Publishing: Cham, Switzerland, 2016; pp. 1463–1492. [Google Scholar] [CrossRef]
  101. Tremblay, N.; Fallon, E.; Ziadi, N. Sensing of Crop Nitrogen Status: Opportunities, Tools, Limitations, and Supporting Information Requirements. HortTechnology 2011, 21, 274–281. [Google Scholar] [CrossRef]
  102. Tremblay, N.; Bouroubi, Y.M.; Bélec, C.; Mullen, R.W.; Kitchen, N.R.; Thomason, W.E.; Ebelhar, S.; Mengel, D.B.; Raun, W.R.; Francis, D.D.; et al. Corn Response to Nitrogen is Influenced by Soil Texture and Weather. Agron. J. 2012, 104, 1658–1671. [Google Scholar] [CrossRef][Green Version]
  103. Gonzalez-de Soto, M.; Emmi, L.; Garcia, I.; Gonzalez-de Santos, P. Reducing fuel consumption in weed and pest control using robotic tractors. Comput. Electron. Agric. 2015, 114, 96–113. [Google Scholar] [CrossRef]
  104. Gonzalez-de Soto, M.; Emmi, L.; Benavides, C.; Garcia, I.; Gonzalez-de Santos, P. Reducing air pollution with hybrid-powered robotic tractors for precision agriculture. Biosyst. Eng. 2016, 143, 79–94. [Google Scholar] [CrossRef]
  105. Wilson, J.N. Guidance of agricultural vehicles—A historical perspective. Comput. Electron. Agric. 2000, 25, 3–9. [Google Scholar] [CrossRef]
  106. Choi, K.H.; Han, S.K.; Han, S.H.; Park, K.H.; Kim, K.S.; Kim, S. Morphology-based guidance line extraction for an autonomous weeding robot in paddy fields. Comput. Electron. Agric. 2015, 113, 266–274. [Google Scholar] [CrossRef]
  107. Conesa-Muñoz, J.; Gonzalez-de Soto, M.; Gonzalez-de Santos, P.; Ribeiro, A. Distributed Multi-Level Supervision to Effectively Monitor the Operations of a Fleet of Autonomous Vehicles in Agricultural Tasks. Sensors 2015, 15, 5402–5428. [Google Scholar] [CrossRef][Green Version]
  108. Burks, T.; Villegas, F.; Hannan, M.; Flood, S.; Sivaraman, B.; Subramanian, V.; Sikes, J. Engineering and Horticultural Aspects of Robotic Fruit Harvesting: Opportunities and Constraints. HortTechnology 2005, 15, 79–87. [Google Scholar] [CrossRef]
  109. Xiang, R.; Jiang, H.; Ying, Y. Recognition of clustered tomatoes based on binocular stereo vision. Comput. Electron. Agric. 2014, 106, 75–90. [Google Scholar] [CrossRef]
  110. Billingsley, J.; Visala, A.; Dunn, M. Robotics in Agriculture and Forestry. In Springer Handbook of Robotics; Springer: Berlin/Heidelberg, Germany, 2008; pp. 1065–1077. [Google Scholar] [CrossRef][Green Version]
  111. Bechar, A.; Vigneault, C. Agricultural robots for field operations: Concepts and components. Biosyst. Eng. 2016, 149, 94–111. [Google Scholar] [CrossRef]
  112. Bechar, A.; Vigneault, C. Agricultural robots for field operations. Part 2: Operations and systems. Biosyst. Eng. 2017, 153, 110–128. [Google Scholar] [CrossRef]
  113. Ortuani, B.; Sona, G.; Ronchetti, G.; Mayer, A.; Facchi, A. Integrating geophysical and multispectral data to delineate homogeneous management zones within a vineyard in Northern Italy. Sensors 2019, 19, 3974. [Google Scholar] [CrossRef][Green Version]
  114. Belcore, E.; Angeli, S.; Colucci, E.; Musci, M.A.; Aicardi, I. Precision Agriculture Workflow, from Data Collection to Data Management Using FOSS Tools: An Application in Northern Italy Vineyard. ISPRS Int. J. Geo Inf. 2021, 10, 236. [Google Scholar] [CrossRef]
  115. Schor, N.; Bechar, A.; Ignat, T.; Dombrovsky, A.; Elad, Y.; Berman, S. Robotic Disease Detection in Greenhouses: Combined Detection of Powdery Mildew and Tomato Spotted Wilt Virus. IEEE Robot. Autom. Lett. 2016, 1, 354–360. [Google Scholar] [CrossRef]
  116. Ceres, R.; Pons, J.; Jiménez, A.; Martín, J.; Calderón, L. Design and implementation of an aided fruit-harvesting robot (Agribot). Ind. Robot 1998, 25, 337–346. [Google Scholar] [CrossRef]
  117. Nguyen, T.T.; Kayacan, E.; De Baedemaeker, J.; Saeys, W. Task and Motion Planning for Apple Harvesting Robot. IFAC Proc. Vol. 2013, 46, 247–252. [Google Scholar] [CrossRef]
  118. Hellström, T.; Ringdahl, O. A software framework for agricultural and forestry robots. Ind. Robot. 2013, 40, 20–26. [Google Scholar] [CrossRef][Green Version]
  119. Tazzari, R.; Mengoli, D.; Marconi, L. Design Concept and Modelling of a Tracked UGV for Orchard Precision Agriculture. In Proceedings of the 2020 IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Online, 4–6 November 2020; pp. 207–212. [Google Scholar] [CrossRef]
  120. Cuesta, F.; Gomez-Bravo, F.; Ollero, A. Parking maneuvers of industrial-like electrical vehicles with and without trailer. IEEE Trans. Ind. Electron. 2004, 51, 257–269. [Google Scholar] [CrossRef]
  121. Low, C.B.; Wang, D. GPS-Based Path Following Control for a Car-Like Wheeled Mobile Robot with Skidding and Slipping. IEEE Trans. Control. Syst. Technol. 2008, 16, 340–347. [Google Scholar] [CrossRef]
  122. Underwood, J.; Hung, C.; Whelan, B.; Sukkarieh, S. Mapping almond orchard canopy volume, flowers, fruit and yield using lidar and vision sensors. Comput. Electron. Agric. 2016, 130, 83–96. [Google Scholar] [CrossRef]
  123. Mueller-Sim, T.; Jenkins, M.; Abel, J.; Kantor, G. The Robotanist: A ground-based agricultural robot for high-throughput crop phenotyping. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 3634–3639. [Google Scholar] [CrossRef]
  124. Virlet, N.; Sabermanesh, K.; Sadeghi-Tehran, P.; Hawkesford, M. Field Scanalyzer: An automated robotic field phenotyping platform for detailed crop monitoring. Funct. Plant Biol. 2017, 44, 143–153. [Google Scholar] [CrossRef][Green Version]
  125. Cubero, S.; Marco-noales, E.; Aleixos, N.; Barbé, S.; Blasco, J. Robhortic: A field robot to detect pests and diseases in horticultural crops by proximal sensing. Agriculture 2020, 10, 276. [Google Scholar] [CrossRef]
  126. Rey, B.; Aleixos, N.; Cubero, S.; Blasco, J. XF-ROVIM. A field robot to detect olive trees infected by Xylella fastidiosa using proximal sensing. Remote Sens. 2019, 11, 221. [Google Scholar] [CrossRef][Green Version]
  127. Barbosa, W.S.; Oliveira, A.I.S.; Barbosa, G.B.P.; Leite, A.C.; Figueiredo, K.T.; Vellasco, M.M.B.R.; Caarls, W. Design and Development of an Autonomous Mobile Robot for Inspection of Soy and Cotton Crops. In Proceedings of the 2019 12th International Conference on Developments in eSystems Engineering (DeSE), Kazan, Russia, 7–10 October 2019; pp. 557–562. [Google Scholar] [CrossRef]
  128. Menendez-Aponte, P.; Kong, X.; Xu, Y. An approximated, control affine model for a strawberry field scouting robot considering wheel-terrain interaction. Robotica 2019, 37, 1545–1561. [Google Scholar] [CrossRef]
  129. Bietresato, M.; Carabin, G.; Vidoni, R.; Gasparetto, A.; Mazzetto, F. Evaluation of a LiDAR-based 3D-stereoscopic vision system for crop-monitoring applications. Comput. Electron. Agric. 2016, 124, 1–13. [Google Scholar] [CrossRef]
  130. Bietresato, M.; Carabin, G.; D’Auria, D.; Gallo, R.; Ristorto, G.; Mazzetto, F.; Vidoni, R.; Gasparetto, A.; Scalera, L. A tracked mobile robotic lab for monitoring the plants volume and health. In Proceedings of the 2016 12th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications (MESA), Auckland, New Zealand, 29–31 August 2016; pp. 1–6. [Google Scholar] [CrossRef][Green Version]
  131. Ristorto, G.; Gallo, R.; Gasparetto, A.; Scalera, L.; Vidoni, R.; Mazzetto, F. A Mobile Laboratory for Orchard Health Status Monitoring in Precision Farming. Chem. Eng. Trans. 2017, 58, 661–666. [Google Scholar] [CrossRef]
  132. Vidoni, R.; Gallo, R.; Ristorto, G.; Carabin, G.; Mazzetto, F.; Scalera, L.; Gasparetto, A. ByeLab: An agricultural mobile robot prototype for proximal sensing and precision farming. In Proceedings of the ASME International Mechanical Engineering Congress and Exposition, Tampa, FL, USA, 3–9 November 2017; American Society of Mechanical Engineers: New York, NY, USA, 2017; Volume 58370, p. V04AT05A057. [Google Scholar]
  133. EarthSense. Terra Sentia by EarthSense. 2019. Available online: https://researchpark.illinois.edu/article/earthsense-terrasentia-featured-in-successful-farming/ (accessed on 13 April 2022).
  134. Small Robot Company. Tom Robot by Small Robot Company. 2021. Available online: https://www.smallrobotcompany.com (accessed on 13 April 2022).
  135. Bakker, T.; Asselt, K.; Bontsema, J.; Müller, J.; Straten, G. Systematic design of an autonomous platform for robotic weeding. J. Terramech. 2010, 47, 63–73. [Google Scholar] [CrossRef]
  136. Bawden, O.; Kulk, J.; Russell, R.; McCool, C.; English, A.; Dayoub, F.; Lehnert, C.; Perez, T. Robot for weed species plant-specific management. J. Field Robot. 2017, 34, 1179–1199. [Google Scholar] [CrossRef]
  137. Utstumo, T.; Urdal, F.; Brevik, A.; Dørum, J.; Netland, J.; Overskeid, O.; Berge, T.; Gravdahl, J. Robotic in-row weed control in vegetables. Comput. Electron. Agric. 2018, 154, 36–45. [Google Scholar] [CrossRef]
  138. Naio Technologies. Autonomous Oz Weeding Robot. 2016. Available online: https://www.naio-technologies.com/en/oz/ (accessed on 13 April 2022).
  139. Naio Technologies. DINO Vegetable Weeding Robot for Large-Scale Vegetable Farms. 2016. Available online: https://www.naio-technologies.com/en/dino/ (accessed on 13 April 2022).
  140. Naio Technologies. TED, the Vineyard Weeding Robot. 2016. Available online: https://www.naio-technologies.com/en/ted/ (accessed on 13 April 2022).
  141. CARRÉ. ANATIS by CARRÉ. 2019. Available online: https://www.carre.fr/entretien-des-cultures-et-prairies/anatis/?lang=en (accessed on 13 April 2022).
  142. Ecorobotix. AVO, the Autonomous Robot Weeder from Ecorobotix. 2020. Available online: https://platform.innoseta.eu/product/400 (accessed on 13 April 2022).
  143. Haibo, L.; Shuliang, D.; Zunmin, L.; Chuijie, Y. Study and Experiment on a Wheat Precision Seeding Robot. J. Robot. 2015, 2015, 696301. [Google Scholar] [CrossRef]
  144. Ruangurai, P.; Ekpanyapong, M.; Pruetong, C.; Watewai, T. Automated three-wheel rice seeding robot operating in dry paddy fields. Maejo Int. J. Sci. Technol. 2015, 9, 403–412. [Google Scholar]
  145. Hassan, M.U.; Ullah, M.; Iqbal, J. Towards autonomy in agriculture: Design and prototyping of a robotic vehicle with seed selector. In Proceedings of the 2016 2nd International Conference on Robotics and Artificial Intelligence (ICRAI), Islamabad, Pakistan, 20–22 April 2016; pp. 37–44. [Google Scholar] [CrossRef]
  146. Srinivasan, N.; Prabhu, P.; Smruthi, S.S.; Sivaraman, N.V.; Gladwin, S.J.; Rajavel, R.; Natarajan, A.R. Design of an autonomous seed planting robot. In Proceedings of the 2016 IEEE Region 10 Humanitarian Technology Conference (R10-HTC), Agra, India, 21–23 December 2016; pp. 1–4. [Google Scholar] [CrossRef]
  147. Jayakrishna, P.V.S.; Reddy, M.S.; Sai, N.J.; Susheel, N.; Peeyush, K.P. Autonomous Seed Sowing Agricultural Robot. In Proceedings of the 2018 International Conference on Advances in Computing, Communications and Informatics (ICACCI), Bangalore, India, 19–22 September 2018; pp. 2332–2336. [Google Scholar] [CrossRef]
  148. Ali, A.A.; Zohaib, M.; Mehdi, S.A. An Autonomous Seeder for Maize Crop. In Proceedings of the 2019 5th International Conference on Robotics and Artificial Intelligence, Singapore, 22–24 November 2019; Association for Computing Machinery: New York, NY, USA, 2019; ICRAI’19, pp. 42–47. [Google Scholar] [CrossRef]
  149. Pramod, A.S.; Jithinmon, T.V. Development of mobile dual PR arm agricultural robot. J. Physics Conf. Ser. 2019, 1240, 012034. [Google Scholar] [CrossRef]
  150. Jose, C.M.; Sudheer, A.P.; Narayanan, M.D. Modelling and Analysis of Seeding Robot for Row Crops. In Proceedings of the Innovative Product Design and Intelligent Manufacturing Systems, Rourkela, India, 2–3 December 2020; Lecture Notes in Mechanical Engineering. Springer: Singapore, 2020; pp. 1003–1014. [Google Scholar] [CrossRef]
  151. Azmi, H.N.; Hajjaj, S.S.H.; Gsangaya, K.R.; Sultan, M.T.H.; Mail, M.F.; Hua, L.S. Design and fabrication of an agricultural robot for crop seeding. Mater. Today Proc. 2021. [Google Scholar] [CrossRef]
	152. Kumar, P.; Ashok, G. Design and fabrication of smart seed sowing robot. Mater. Today Proc. 2021, 39, 354–358. [Google Scholar] [CrossRef]
  153. Mohammed, I.I.; Jassim, A.R.A.L. Design and Testing of an Agricultural Robot to Operate a Combined Seeding Machine. Ann. Rom. Soc. Cell Biol. 2021, 25, 92–106. [Google Scholar]
  154. Li, S.; Li, S.; Jin, L. The Design and Physical Implementation of Seeding Robots in Deserts. In Proceedings of the 2020 39th Chinese Control Conference (CCC), Online, 27–29 July 2020; pp. 3892–3897. [Google Scholar] [CrossRef]
  155. Iqbal, M.Z.; Islam, M.N.; Ali, M.; Kabir, M.S.N.; Park, T.; Kang, T.G.; Park, K.S.; Chung, S.O. Kinematic analysis of a hopper-type dibbling mechanism for a 2.6 kW two-row pepper transplanter. J. Mech. Sci. Technol. 2021, 35, 2605–2614. [Google Scholar] [CrossRef]
  156. Liu, Z.; Wang, X.; Zheng, W.; Lv, Z.; Zhang, W. Design of a sweet potato transplanter based on a robot arm. Appl. Sci. 2021, 11, 9349. [Google Scholar] [CrossRef]
  157. Rowbot. Rowbot. 2020. Available online: https://www.rowbot.com/ (accessed on 13 April 2022).
  158. Yaguchi, H.; Nagahama, K.; Hasegawa, T.; Inaba, M. Development of an autonomous tomato harvesting robot with rotational plucking gripper. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; pp. 652–657. [Google Scholar] [CrossRef]
  159. Feng, Q.; Zou, W.; Fan, P.; Zhang, C.; Wang, X. Design and test of robotic harvesting system for cherry tomato. Int. J. Agric. Biol. Eng. 2018, 11, 96–100. [Google Scholar] [CrossRef]
  160. Agrobot Robotic Harvesters|Agrobot. 2018. Available online: https://www.agrobot.com/e-series (accessed on 13 April 2022).
  161. Ge, Y.; Xiong, Y.; Tenorio, G.L.; From, P.J. Fruit Localization and Environment Perception for Strawberry Harvesting Robots. IEEE Access 2019, 7, 147642–147652. [Google Scholar] [CrossRef]
  162. Xiong, Y.; Ge, Y.; Grimstad, L.; From, P.J. An autonomous strawberry-harvesting robot: Design, development, integration, and field evaluation. J. Field Robot. 2020, 37. [Google Scholar] [CrossRef][Green Version]
  163. Bac, C.; Hemming, J.; van Tuijl, B.; Barth, R.; Wais, E.; van Henten, E. Performance Evaluation of a Harvesting Robot for Sweet Pepper. J. Field Robot. 2017, 34, 1123–1139. [Google Scholar] [CrossRef]
  164. Arad, B.; Balendonck, J.; Barth, R.; Ben-Shahar, O.; Edan, Y.; Hellström, T.; Hemming, J.; Kurtser, P.; Ringdahl, O.; Tielen, T.; et al. Development of a sweet pepper harvesting robot. J. Field Robot. 2020, 37, 1027–1039. [Google Scholar] [CrossRef]
  165. Lehnert, C.; McCool, C.; Sa, I.; Perez, T. Performance improvements of a sweet pepper harvesting robot in protected cropping environments. J. Field Robot. 2020, 37. [Google Scholar] [CrossRef]
  166. Raja, V.; Bhaskaran, B.; Nagaraj, K.; Sampathkumar, J.; Senthilkumar, S. Agricultural harvesting using integrated robot system. Indones. J. Electr. Eng. Comput. Sci. 2022, 25, 152–158. [Google Scholar] [CrossRef]
  167. Birrell, S.; Hughes, J.; Cai, J.Y.; Iida, F. A field-tested robotic harvesting system for iceberg lettuce. J. Field Robot. 2020, 37, 225–245. [Google Scholar] [CrossRef][Green Version]
	168. Leu, A.; Razavi, M.; Langstädtler, L.; Ristić-Durrant, D.; Raffel, H.; Schenck, C.; Gräser, A.; Kuhfuss, B. Robotic Green Asparagus Selective Harvesting. IEEE/ASME Trans. Mechatron. 2017, 22, 2401–2410. [Google Scholar] [CrossRef]
  169. Kang, H.; Zhou, H.; Chen, C. Visual Perception and Modeling for Autonomous Apple Harvesting. IEEE Access 2020, 8, 62151–62163. [Google Scholar] [CrossRef]
  170. Bogue, R. Fruit picking robots: Has their time come? Ind. Robot. 2020, 47, 141–145. [Google Scholar] [CrossRef]
  171. Cantelli, L.; Bonaccorso, F.; Longo, D.; Melita, C.D.; Schillaci, G.; Muscato, G. A Small Versatile Electrical Robot for Autonomous Spraying in Agriculture. AgriEngineering 2019, 1, 391–402. [Google Scholar] [CrossRef][Green Version]
  172. Danton, A.; Roux, J.C.; Dance, B.; Cariou, C.; Lenain, R. Development of a spraying robot for precision agriculture: An edge following approach. In Proceedings of the 2020 IEEE Conference on Control Technology and Applications (CCTA), Montreal, QC, Canada, 24–26 August 2020; pp. 267–272. [Google Scholar] [CrossRef]
  173. Terra, F.; Nascimento, G.; Duarte, G.; Drews, P., Jr. Autonomous Agricultural Sprayer using Machine Vision and Nozzle Control. J. Intell. Robot. Syst. Theory Appl. 2021, 102. [Google Scholar] [CrossRef]
  174. Amrita, S.A.; Abirami, E.; Ankita, A.; Praveena, R.; Srimeena, R. Agricultural Robot for automatic ploughing and seeding. In Proceedings of the 2015 IEEE Technological Innovation in ICT for Agriculture and Rural Development (TIAR), Chennai, India, 10–12 July 2015; pp. 17–23. [Google Scholar] [CrossRef]
  175. Grimstad, L.; From, P.J. The Thorvald II Agricultural Robotic System. Robotics 2017, 6, 24. [Google Scholar] [CrossRef][Green Version]
  176. CASE IH. Case IH Autonomous Concept Vehicle. 2017. Available online: https://www.caseih.com/anz/en-au/innovations/autonomous-farming (accessed on 13 April 2022).
  177. Sitia. TREKTOR, 2020. Available online: https://www.sitia.fr/en/innovation-2/trektor/ (accessed on 13 April 2022).
  178. Deere, J. John Deere CES® 2022. Available online: https://ces2022.deere.com/ (accessed on 13 April 2022).
  179. Agrobot. Bug Vacuum, 2020. Available online: https://www.agrobot.com/bugvac (accessed on 13 April 2022).
  180. Williams, H.; Nejati, M.; Hussein, S.; Penhall, N.; Lim, J.Y.; Jones, M.H.; Bell, J.; Ahn, H.S.; Bradley, S.; Schaare, P.; et al. Autonomous pollination of individual kiwifruit flowers: Toward a robotic kiwifruit pollinator. J. Field Robot. 2020, 37, 246–262. [Google Scholar] [CrossRef]
  181. Galati, R.; Mantriota, G.; Reina, G. Design and Development of a Tracked Robot to Increase Bulk Density of Flax Fibers. J. Mech. Robot. 2021, 13. [Google Scholar] [CrossRef]
  182. Loukatos, D.; Petrongonas, E.; Manes, K.; Kyrtopoulos, I.V.; Dimou, V.; Arvanitis, K. A synergy of innovative technologies towards implementing an autonomous diy electric vehicle for harvester-assisting purposes. Machines 2021, 9, 82. [Google Scholar] [CrossRef]
  183. Oberti, R.; Marchi, M.; Tirelli, P.; Calcante, A.; Iriti, M.; Tona, E.; Hočevar, M.; Baur, J.; Pfaff, J.; Schütz, C.; et al. Selective spraying of grapevines for disease control using a modular agricultural robot. Biosyst. Eng. 2016, 146, 203–215. [Google Scholar] [CrossRef]
  184. Adamides, G.; Katsanos, C.; Constantinou, I.; Christou, G.; Xenos, M.; Hadzilacos, T.; Edan, Y. Design and development of a semi-autonomous agricultural vineyard sprayer: Human—Robot interaction aspects. J. Field Robot. 2017, 34, 1407–1426. [Google Scholar] [CrossRef]
  185. Azienda Agricola Pantano. Rovitis 4.0 by Azienda Agricola Pantano. 2018. Available online: https://www.aziendapantano.it/rovitis40.html (accessed on 13 April 2022).
  186. Shafiekhani, A.; Kadam, S.; Fritschi, F.; Desouza, G. Vinobot and vinoculer: Two robotic platforms for high-throughput field phenotyping. Sensors 2017, 17, 214. [Google Scholar] [CrossRef]
  187. Reiser, D.; Sehsah, E.S.; Bumann, O.; Morhard, J.; Griepentrog, H. Development of an autonomous electric robot implement for intra-row weeding in vineyards. Agriculture 2019, 9, 18. [Google Scholar] [CrossRef][Green Version]
  188. Botterill, T.; Paulin, S.; Green, R.; Williams, S.; Lin, J.; Saxton, V.; Mills, S.; Chen, X.; Corbett-Davies, S. A Robot System for Pruning Grape Vines. J. Field Robot. 2017, 34, 1100–1122. [Google Scholar] [CrossRef]
  189. Majeed, Y.; Karkee, M.; Zhang, Q.; Fu, L.; Whiting, M.D. Development and performance evaluation of a machine vision system and an integrated prototype for automated green shoot thinning in vineyards. J. Field Robot. 2021, 38, 898–916. [Google Scholar] [CrossRef]
  190. VitiBot. Bakus S by VitiBot. 2018. Available online: https://vitibot.fr/vineyards-robots-bakus/vineyard-robot-bakus-s/?lang=en (accessed on 13 April 2022).
  191. Thuilot, B.; Cariou, C.; Martinet, P.; Berducat, M. Automatic Guidance of a Farm Tractor Relying on a Single CP-DGPS. Auton. Robot. 2002, 13, 53–71. [Google Scholar] [CrossRef][Green Version]
  192. Fountas, S.; Mylonas, N.; Malounas, I.; Rodias, E.; Hellmann Santos, C.; Pekkeriet, E. Agricultural Robotics for Field Operations. Sensors 2020, 20, 2672. [Google Scholar] [CrossRef]
  193. Fue, K.G.; Porter, W.M.; Barnes, E.M.; Rains, G.C. An Extensive Review of Mobile Agricultural Robotics for Field Operations: Focus on Cotton Harvesting. AgriEngineering 2020, 2, 150–174. [Google Scholar] [CrossRef][Green Version]
  194. Oliveira, L.F.P.; Moreira, A.P.; Silva, M.F. Advances in Agriculture Robotics: A State-of-the-Art Review and Challenges Ahead. Robotics 2021, 10, 52. [Google Scholar] [CrossRef]
  195. Vidoni, R.; Bietresato, M.; Gasparetto, A.; Mazzetto, F. Evaluation and stability comparison of different vehicle configurations for robotic agricultural operations on side-slopes. Biosyst. Eng. 2015, 129, 197–211. [Google Scholar] [CrossRef]
  196. Braunack, M.V. Changes in physical properties of two dry soils during tracked vehicle passage. J. Terramech. 1986, 23, 141–151. [Google Scholar] [CrossRef]
  197. Braunack, M.V. The residual effects of tracked vehicles on soil surface properties. J. Terramech. 1986, 23, 37–50. [Google Scholar] [CrossRef]
  198. Braunack, M.V.; Williams, B.G. The effect of initial soil water content and vegetative cover on surface soil disturbance by tracked vehicles. J. Terramech. 1993, 30, 299–311. [Google Scholar] [CrossRef]
  199. Ayers, P.D. Environmental damage from tracked vehicle operation. J. Terramech. 1994, 31, 173–183. [Google Scholar] [CrossRef]
  200. Prosser, C.W.; Sedivec, K.K.; Barker, W.T. Tracked Vehicle Effects on Vegetation and Soil Characteristics. J. Range Manag. 2000, 53, 666–670. [Google Scholar] [CrossRef]
  201. Li, Q.; Ayers, P.D.; Anderson, A.B. Modeling of terrain impact caused by tracked vehicles. J. Terramech. 2007, 44, 395–410. [Google Scholar] [CrossRef]
  202. Molari, G.; Bellentani, L.; Guarnieri, A.; Walker, M.; Sedoni, E. Performance of an agricultural tractor fitted with rubber tracks. Biosyst. Eng. 2012, 111, 57–63. [Google Scholar] [CrossRef]
  203. Vu, Q.; Raković, M.; Delic, V.; Ronzhin, A. Trends in Development of UAV-UGV Cooperation Approaches in Precision Agriculture. In Proceedings of the Interactive Collaborative Robotics, Leipzig, Germany, 18–22 September 2018; Springer International Publishing: Cham, Switzerland, 2018. Lecture Notes in Computer Science. pp. 213–221. [Google Scholar] [CrossRef]
  204. Zoto, J.; Musci, M.A.; Khaliq, A.; Chiaberge, M.; Aicardi, I. Automatic path planning for unmanned ground vehicle using uav imagery. In Proceedings of the International Conference on Robotics in Alpe-Adria Danube Region, Kaiserlautern, Germany, 19–21 June 2019; pp. 223–230. [Google Scholar]
  205. Cinat, P.; Di Gennaro, S.F.; Berton, A.; Matese, A. Comparison of unsupervised algorithms for Vineyard Canopy segmentation from UAV multispectral images. Remote Sens. 2019, 11, 1023. [Google Scholar] [CrossRef][Green Version]
  206. Ribeiro, A.; Conesa-Muñoz, J. Multi-robot Systems for Precision Agriculture. In Innovation in Agricultural Robotics for Precision Agriculture: A Roadmap for Integrating Robots in Precision Agriculture; Progress in Precision Agriculture; Springer International Publishing: Cham, Switzerland, 2021; pp. 151–175. [Google Scholar] [CrossRef]
  207. Galceran, E.; Carreras, M. A survey on coverage path planning for robotics. Robot. Auton. Syst. 2013, 61, 1258–1276. [Google Scholar] [CrossRef][Green Version]
	208. ETH Zurich. Aerial Data Collection and Analysis, and Automated Ground Intervention for Precision Farming | Flourish Project | Fact Sheet | H2020 | CORDIS | European Commission; Flourish Project, 2018. [Google Scholar]
  209. Bhandari, S.; Raheja, A.; Green, R.L.; Do, D. Towards collaboration between unmanned aerial and ground vehicles for precision agriculture. In Proceedings of the Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping II, Anaheim, CA, USA, 9–13 April 2017; Volume 10218, pp. 26–39. [Google Scholar] [CrossRef]
  210. Gonzalez-de Santos, P.; Ribeiro, A.; Fernandez-Quintanilla, C.; Lopez-Granados, F.; Brandstoetter, M.; Tomic, S.; Pedrazzi, S.; Peruzzi, A.; Pajares, G.; Kaplanis, G.; et al. Fleets of robots for environmentally-safe pest control in agriculture. Precis. Agric. 2017, 18, 574–614. [Google Scholar] [CrossRef]
  211. Roldán, J.J.; Garcia-Aunon, P.; Garzón, M.; De León, J.; Del Cerro, J.; Barrientos, A. Heterogeneous Multi-Robot System for Mapping Environmental Variables of Greenhouses. Sensors 2016, 16, 1018. [Google Scholar] [CrossRef] [PubMed][Green Version]
  212. Vitali, G.; Cardillo, C.; Albertazzi, S.; Della Chiara, M.; Baldoni, G.; Signorotti, C.; Trisorio, A.; Canavari, M. Classification of Italian Farms in the FADN Database Combining Climate and Structural Information. Cartographica 2012, 47, 228–236. [Google Scholar] [CrossRef]
  213. Kaufmann, H.; Blanke, M. Chilling requirements of Mediterranean fruit crops in a changing climate. In Proceedings of the III International Symposium on Horticulture in Europe-SHE2016, Chania, Greece, 17–21 October 2016; pp. 275–280. [Google Scholar]
  214. Pessina, D.; Facchinetti, D. A survey on fatal accidents for overturning of agricultural tractors in Italy. Chem. Eng. Trans. 2017, 58, 79–84. [Google Scholar]
	215. Quaglia, G.; Visconte, C.; Carbonari, L.; Botta, A.; Cavallone, P. Agri.q: A Sustainable Rover for Precision Agriculture. In Solar Energy Conversion in Communities; Springer: Berlin/Heidelberg, Germany, 2020; pp. 81–91. [Google Scholar]
	216. Cavallone, P.; Visconte, C.; Carbonari, L.; Botta, A.; Quaglia, G. Design of the Mobile Robot Agri.q. In Proceedings of the Symposium on Robot Design, Dynamics and Control, Sapporo, Japan, 20–24 September 2020; pp. 288–296. [Google Scholar]
  217. Visconte, C.; Cavallone, P.; Carbonari, L.; Botta, A.; Quaglia, G. Design of a Mechanism with Embedded Suspension to Reconfigure the Agri_q Locomotion Layout. Robotics 2021, 10, 15. [Google Scholar] [CrossRef]
  218. Cavallone, P.; Botta, A.; Carbonari, L.; Visconte, C.; Quaglia, G. The Agri.q Mobile Robot: Preliminary Experimental Tests. In Proceedings of the Advances in Italian Mechanism Science; Niola, V., Gasparetto, A., Eds.; Springer International Publishing: Cham, Switzerland, 2021; pp. 524–532. [Google Scholar] [CrossRef]
  219. Botta, A.; Cavallone, P. Robotics Applied to Precision Agriculture: The Sustainable Agri.q Rover Case Study. In Proceedings of the I4SDG Workshop 2021, Online, 25–26 November 2021; Mechanisms and Machine Science. Springer International Publishing: Cham, Switzerland, 2022; pp. 41–50. [Google Scholar] [CrossRef]
Figure 1. The precision agriculture cycle and its main phases.
Figure 2. Remote sensing phenotype features and their corresponding technologies.
Figure 3. Main proximal sensors for precision agriculture.
Figure 4. Schematic overview of the different ways to extract spatial information and the useful robotic platforms throughout a growing season.
Figure 5. Agri.q in a vineyard in Castagnito, part of the Roero historical region in the south of the Italian region Piedmont.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Botta, A.; Cavallone, P.; Baglieri, L.; Colucci, G.; Tagliavini, L.; Quaglia, G. A Review of Robots, Perception, and Tasks in Precision Agriculture. Appl. Mech. 2022, 3, 830-854. https://doi.org/10.3390/applmech3030049
