Review

Systematic Mapping Study on Remote Sensing in Agriculture

by José Alberto García-Berná 1, Sofia Ouhbi 2, Brahim Benmouna 1, Ginés García-Mateos 1,*, José Luis Fernández-Alemán 1 and José Miguel Molina-Martínez 3

1 Department of Computer Science and Systems, University of Murcia, 30100 Murcia, Spain
2 Department of Computer Science and Software Engineering, CIT, United Arab Emirates University, Al Ain 15551, UAE
3 Food Engineering and Agricultural Equipment Department, Technical University of Cartagena, 30203 Cartagena, Spain
* Author to whom correspondence should be addressed.
Appl. Sci. 2020, 10(10), 3456; https://doi.org/10.3390/app10103456
Submission received: 16 April 2020 / Revised: 14 May 2020 / Accepted: 15 May 2020 / Published: 17 May 2020
(This article belongs to the Special Issue Applications of Remote Image Capture System in Agriculture)

Abstract:
The area of remote sensing techniques in agriculture has reached a significant degree of development and maturity, with numerous journals, conferences, and organizations specialized in it. Moreover, many review papers are available in the literature. The present work describes a literature review that adopts the form of a systematic mapping study, following a formal methodology. Eight mapping questions were defined, analyzing the main types of research, techniques, platforms, topics, and spectral information. A predefined search string was applied in the Scopus database, obtaining 1590 candidate papers. Afterwards, the 106 most relevant papers were selected, considering those with more than six citations per year. These are analyzed in more detail, answering the mapping questions for each paper. In this way, the current trends and new opportunities are discovered. As a result, increasing interest in the area has been observed since 2000; the most frequently addressed problems are those related to parameter estimation, growth vigor, and water usage, using classification techniques that are mostly applied to RGB and hyperspectral images, captured from drones and satellites. A general recommendation that emerges from this study is to build on existing resources, such as agricultural image datasets, public satellite imagery, and deep learning toolkits.

1. Introduction

Nowadays, precision agriculture (PA) has become an essential component of modern agricultural businesses and production management. Thanks to technological improvements, it has played an increasingly important role in agricultural production around the world by helping farmers increase crop yields, reduce costs and environmental impacts, and manage their land more efficiently. PA involves the integration of different areas such as geographic information systems (GIS), global positioning systems (GPS), and remote sensing (RS) technology [1]; decision support systems could also be added to this equation.
In general, GIS are computer systems that are used for storing, managing, analyzing, and displaying geospatial data [2]. In agriculture, they enable farmers and managers to handle data obtained from satellites and other types of sensors through georeferenced databases. Several research works have addressed PA problems from the perspective of GIS to reduce the environmental impact of agriculture, in applications such as disaster risk reduction [3], land use change monitoring and modeling [4], climate change detection [5], subsurface tile drains area detection [6], and identification of wetland areas [7].
GPS is closely related to GIS and RS, being used as input for both systems, i.e., GPS offers precise positioning of geospatial data and the collection of data in the field [8]. Some works have addressed PA problems from this point of view, such as solving weed management issues [9,10,11], but usually in conjunction with other technologies.
RS has been considered by some authors as the most cost-efficient technique for monitoring and analyzing large areas in the agricultural domain [12]. It can be considered part of the Earth observation domain, used for capturing and analyzing information about crops and soil features acquired from sensors mounted on different types of platforms such as satellites, aircraft, or ground-based equipment. Thus, the technologies related to remote sensing in agriculture (RSA) include the hardware design of the cameras and capturing vehicles, the communication technologies used to transfer the images [13], and the necessary tools of image processing, computer vision, and machine learning to analyze the images and additional available information [14]. The obtained information is later used in agricultural decision support systems [15]. As the number of tasks and activities involved in the efficient use of these technologies can be overwhelming (from study design to quality assurance), efforts have been made to harmonize these tasks and provide general recommendations [16].
The existing applications of RSA include almost all tasks of the cultivation process [17]: estimation of cropland parameters; drought stress and use of water resources; pathogen and disease detection; weed detection; monitoring nutrient status and growth vigor of the plants; and yield estimation. These applications are affected by a set of parameters specific for each sensor type [12]:
  • Type of platform where the sensor is mounted: in-field systems, ground vehicles, aircraft or satellites.
  • Wavelengths of the electromagnetic spectrum that are captured; most frequently, they include visible, infrared, ultraviolet and microwaves.
  • Number and width of the spectral bands captured: panchromatic (a single wide band), multispectral (a small number of broad bands), and hyperspectral (many narrow bands).
  • Spatial resolution, measured in meters per pixel, which can be roughly classified in high (less than 1 mm for in-field cameras), medium, and low (around 1 km in some satellites and bands).
  • Temporal resolution, i.e., capture frequency of the system, which can range from real-time (in-field cameras) to several weeks (in some satellites).
  • Radiometric resolution, i.e., the number of bits per pixel and band (typically 8, 12, or 16 bits), and the source of energy (passive sensors or active sensors).
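The sensor parameters listed above can be modeled as a simple data structure. The following sketch is purely illustrative; the field names and the example configuration are our own assumptions, not taken from the reviewed papers:

```python
from dataclasses import dataclass

@dataclass
class SensorConfig:
    """Illustrative model of the remote sensing parameters listed above."""
    platform: str          # "in-field", "ground vehicle", "aircraft", or "satellite"
    bands: int             # 1 = panchromatic, few = multispectral, many = hyperspectral
    spatial_res_m: float   # spatial resolution, meters per pixel
    revisit_days: float    # temporal resolution (0 = real time)
    bits_per_pixel: int    # radiometric resolution (typically 8, 12, or 16)
    active: bool           # active sensor (e.g., SAR, LiDAR) vs. passive

# Hypothetical example: a Sentinel-2-like multispectral satellite configuration
sentinel_like = SensorConfig("satellite", 13, 10.0, 5.0, 12, False)
print(sentinel_like.bands)  # -> 13
```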
Airborne remote sensing is mostly realized with unmanned aerial vehicles (UAV), but also with manned aircraft. UAVs are generally low-cost, light, and low-speed planes that are well suited for remote sensing data collection [18]. UAVs are normally equipped with sensors, and have been used in many problems such as mapping weeds [19,20], monitoring the vegetation growth and yield estimation [21,22,23], managing water and irrigation [24,25], detecting diseases and monitoring plant health [26,27], crop spraying [28,29], and field phenotyping of the temperature of the canopy using thermal images [30]. In any case, the hardware capabilities depend on parameters such as weight, payload, range of flight, configuration, and cost [31]. Different kinds of UAVs have been used in recent decades in PA applications, such as fixed-wing drones [32], single rotors [33], quad rotors (or quadcopters) [34], hexa rotors (or hexacopters) [35], and octo rotors (or octocopters) [36]. Normally, a larger number of rotors implies better maneuverability, a greater payload, and ease of use. However, such vehicles require more energy and, therefore, have less flight autonomy.
An alternative to drones is the use of satellites, which have gained popularity in RSA research thanks to projects such as MODIS [37], the Landsat series [38], Gaofen-1/2 [39], ATLAS [40], and many others. Although they are considerably more expensive, many of them are controlled by public or private institutions that provide free access to the obtained images. These systems have a large coverage, lower spatial and temporal resolution than UAVs, and normally each satellite includes many different capturing devices. Additionally, ground-based sensing devices have also been used in PA for certain applications and research studies [13,41,42], for example, mounted on mobile vehicles or static sensor networks.
The sensors most frequently embedded on RSA platforms are RGB cameras, multispectral and hyperspectral cameras, thermal cameras, Light Detection and Ranging (LiDAR), and Synthetic Aperture Radar (SAR) [43]. Multispectral cameras are useful to estimate parameters such as chlorophyll content, leaf area index (LAI), leaf water content, and normalized difference vegetation index (NDVI), while thermal images are applied to study water stress in the plants. RGB cameras can be combined with LiDAR to obtain digital terrain/surface models (DTM/DSM) of the area being monitored [44]. SAR systems have the advantage that their quality is independent of light and weather conditions. The most basic applications of agricultural SAR remote sensing are crop identification and mapping [45], crop-type classification [46,47], and crop recognition [48].
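As an illustration of the indices mentioned above, NDVI is computed per pixel from the near-infrared (NIR) and red reflectances. A minimal sketch in plain Python:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Values near 1 indicate dense, healthy vegetation; values near 0 or
    below indicate bare soil or water."""
    if nir + red == 0:
        return 0.0  # avoid division by zero on dark pixels
    return (nir - red) / (nir + red)

# Healthy vegetation reflects strongly in NIR and absorbs red light
print(ndvi(0.50, 0.10))  # approx. 0.667
```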
In addition to hardware, the other major component of remote sensing systems is software. Image processing and computer vision have proven to be effective tools for analysis in PA applications, with photogrammetry techniques, vegetation indices, and machine learning being the most common areas in RSA. Photogrammetry consists of computing 3-dimensional digital terrain models [49,50,51] and orthophotos [52,53]. Other systems are based on vegetation indices, which are then used to classify the land cover or the crop type, with applications such as obtaining the crop growing pattern [54,55], managing environmental issues [56,57], and estimating crop yield [55,58].
However, the area in which most research can be classified is machine learning. It is extensively used in PA in order to provide smart solutions for the tasks of interest. Unsupervised and supervised methods have been successfully applied, such as classification, clustering, and regression models [59]. For example, in [60], regression models are used to estimate vegetation indices, and in [24], regression is used to predict crop water status. Classification techniques are the other major category; they have been used for weed detection [61,62], identification and quantification of the leaf area [63], disease detection [26,64], and identification of rapeseed [65]. Some of the most common classification techniques are listed as follows.
  • Artificial Neural Networks (ANN). ANN models have shown great potential in various RS applications in PA. For example, Hassan-Esfahani et al. [66] used an ANN to compute surface soil moisture. Poblete et al. [67] developed an ANN system to predict vine water status. In [68], the authors used ANNs to separate maize plants from weeds.
  • Random Forest algorithm (RF). It is an ensemble classification model that consists of a set of randomized decision trees. It has been used in [35] to estimate biomass and the amount of nitrogen in plants. In [36], RFs are applied to estimate the content of nitrogen in the leaf of the plants.
  • Support vector machines (SVM), naïve Bayes classifier, and k-means clustering. These methods have also been applied in different areas of agricultural machine learning systems. Sannakki et al. [69] proposed an SVM classifier to detect diseases in pomegranate leaves at an early stage. Mokhtar et al. [70] presented an SVM-based technique for detecting diseases in tomato leaves. The k-nearest neighbors algorithm (kNN) was used in [71] to classify large agricultural land cover types. A system to discriminate weeds from crops using naïve Bayesian classifiers is presented in [72]. Moreover, in [73], Mondal et al. proposed a naïve Bayes classifier to detect gourd leaf diseases using color and texture features.
  • Deep Learning (DL). The use of DL in agriculture is a recent and promising alternative to traditional methods [74,75]. It has been used in several applications in the domain of PA. For example, a fully convolutional neural network for mapping weed is used in [76]. Castro et al. [77] used a CNN model for the classification of crops using multitemporal optical and SAR data. Mortensen et al. [78] addressed the problem of segmenting mixed crops applying CNN methods. dos Santos Ferreira et al. [20] proposed a deep learning-based CNN algorithm to classify weeds from grass and broadleaf. Moreover, Kussul et al. [79] dealt with the crop mapping problem using a multi-level DL network.
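Of the classical techniques listed above, k-nearest neighbors is simple enough to sketch in a few lines of plain Python. The feature vectors and class labels below are synthetic, purely for illustration:

```python
import math
from collections import Counter

def knn_predict(train, labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training
    samples (Euclidean distance on the feature vectors)."""
    dists = sorted((math.dist(x, query), y) for x, y in zip(train, labels))
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical 2-D features per image region, e.g., (NDVI, texture contrast)
features = [(0.80, 0.20), (0.75, 0.25), (0.20, 0.70), (0.15, 0.80)]
classes  = ["crop", "crop", "weed", "weed"]
print(knn_predict(features, classes, (0.70, 0.30)))  # -> crop
```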
This paper describes a systematic mapping study in the area of remote sensing in agriculture. Many recent and interesting review papers can be found in the literature regarding RSA research [43,45,74,80,81,82,83]. However, the present paper is the first to adopt the form of a systematic mapping. These studies are characterized by following a formalized methodology, whose objective is to find the current trends in techniques, problems, applications, publication channels, etc., obtaining recommendations for researchers and practitioners in this area. The rest of the paper is organized as follows. In Section 2, the steps of the methodology used are explained. Then, Section 3 presents the quantitative results of the study. The main findings, suggestions, and limitations are discussed in Section 4. Finally, the conclusions and future perspectives are drawn in Section 5.

2. Research Methodology

The bibliographic review carried out in the present work has taken the form of a systematic mapping study [84], with the purpose of providing an overview of the field of remote sensing in agriculture (RSA) to identify the quantity and channels of the papers published, the type of research that is currently being done, and the results available in the literature. Systematic mapping studies follow a well-established methodology [85], consisting of the following main steps: (i) study planning by determining the mapping questions, the source databases, and the search string; (ii) searching for the relevant papers in the predefined databases; (iii) defining a classification scheme of the papers; (iv) mapping the selected papers; and (v) extracting the main findings, implications, and limitations of the study. All these steps are described in the following sections.

2.1. Formulation of the Mapping Questions

After analyzing the most interesting aspects to extract from the papers, a total of eight mapping questions (MQs) were defined. These questions help to perform the subsequent search and analysis processes. The first four questions (MQ1–4) extract general information about the publication channels, the frequency of the approaches, the research types, and the empirical validation of the RSA studies. The rest of the questions (MQ5–8) are related to more specific aspects of RSA, such as the techniques used, the devices for image capturing, the problems addressed, and the type of spectral information considered. All these MQs were formulated to cover the key factors that comprise the field of RSA. Table 1 presents these MQs with the rationale that motivates their importance.

2.2. Definition of the Search Strategy

After analyzing different bibliographic databases, the search was performed in Scopus (https://www.scopus.com/). This database indexes a substantial number of journals and conferences with a certain level of rigor [86], many of them coinciding with those of the other databases. The search was done in December 2019.
Another key factor in the bibliographic search is the definition of the search string used in the database. Scopus makes it possible to define a complex search string with Boolean operators and wildcards. This string takes a form similar to a sentence with subject–adjective–verb–complement, where all the main possibilities are considered for each component. Thus, it was formulated as follows.
TITLE ((sensing OR sensor* OR imaging OR imagery OR image*) AND (remote OR satellit* OR SAR OR UAV OR airborne OR hyperspectral OR thermal OR infrared OR “hyperspectral”) AND (detect* OR management OR monitor* OR estimat* OR classification OR recognition OR diagnosis OR identif*) AND (agricultur* OR plant OR crop* OR cultivar* OR plague OR canopy OR leaves OR infestation)).
Observe that this string is applied to the title of the paper rather than the abstract or the content, as this is more specific and produces fewer false positives. The combination of the four groups of words with an AND requires that at least one word of each group appears in the title. The first group corresponds to the subject, including terms related to the scope of images and capture devices: sensing, sensor, imaging, imagery, and image. The second group is the adjective, and it is used to refine the previous set by introducing the property of being remote. It contains the words remote, satellite, SAR, UAV, airborne, hyperspectral, thermal, and infrared. Although the last three terms do not necessarily involve "remote", these types of spectral information are more common in remote sensing applications. The third group is the verb, so the terms correspond to the actions being performed with the images. These terms are the main tasks of RSA applications: detection, management, monitoring, estimation, classification, recognition, diagnosis, and identification. Finally, the fourth group is the complement, which indicates some property of the task. This makes it possible to exclude research works in remote sensing that are not related to agriculture. The terms in this group are agriculture, plant, crop, cultivar, plague, canopy, leaves, and infestation. The final search string was refined in a trial-and-error process, verifying that the papers found are in the area of RSA and that no relevant papers are lost. For example, the terms "plague" and "infestation" were included after observing that some papers did not include any of the other terms in the complement.
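The subject–adjective–verb–complement structure described above can be reproduced programmatically. The sketch below assembles the same four OR-groups joined by AND (omitting the duplicated "hyperspectral" term that appears in the printed string); the function name is our own:

```python
def build_search_string(groups):
    """Join word groups into a Scopus-style TITLE query:
    words within a group are OR-ed, groups are AND-ed."""
    parts = ["(" + " OR ".join(g) + ")" for g in groups]
    return "TITLE (" + " AND ".join(parts) + ")"

subject    = ["sensing", "sensor*", "imaging", "imagery", "image*"]
adjective  = ["remote", "satellit*", "SAR", "UAV", "airborne",
              "hyperspectral", "thermal", "infrared"]
verb       = ["detect*", "management", "monitor*", "estimat*",
              "classification", "recognition", "diagnosis", "identif*"]
complement = ["agricultur*", "plant", "crop*", "cultivar*", "plague",
              "canopy", "leaves", "infestation"]

query = build_search_string([subject, adjective, verb, complement])
print(query[:43])  # TITLE ((sensing OR sensor* OR imaging OR ...
```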

2.3. Study Selection

The next task after defining the search string is to establish the inclusion and exclusion criteria. Inclusion criteria are the conditions that should be met by the selected papers, whereas exclusion criteria indicate which candidate papers should be removed from the review. The inclusion criteria (IC) were limited to the search string (IC1) and to papers written in English (IC2). On the other hand, papers that met one or more of the following exclusion criteria (EC) were discarded:
  • EC1. Editorial papers, papers about colloquium and international meetings, and summer school papers.
  • EC2. Papers that have a citation ratio of less than 6 citations per year.
In EC1, editorial papers, papers about colloquia and international meetings, and summer school papers were not considered, as the material provided in these manuscripts may not be of sufficient relevance and novelty. EC2 was based on the idea of selecting the most highly cited publications. In addition, the impact of the literature on RSA was also kept in mind. For this purpose, a citation ratio, defined as the number of citations divided by the number of years since publication, was employed. This ratio defines an objective criterion that makes it possible to rank the papers according to their relevance in the literature, taking into account that recent papers can have fewer citations than older papers.
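The EC2 filter reduces to simple arithmetic. A sketch with hypothetical paper records follows; the handling of very recent papers (counting at least one year) is our own assumption:

```python
def citation_ratio(citations, pub_year, current_year=2019):
    """Citations per year since publication (the search was run in 2019)."""
    years = max(current_year - pub_year, 1)  # count at least one year
    return citations / years

def passes_ec2(citations, pub_year, threshold=6.0):
    """EC2: keep papers with at least `threshold` citations per year."""
    return citation_ratio(citations, pub_year) >= threshold

# A 2015 paper with 40 citations: 40 / 4 = 10 citations/year -> kept
print(passes_ec2(40, 2015))  # -> True
# A 2010 paper with 30 citations: 30 / 9 = 3.3 citations/year -> discarded
print(passes_ec2(30, 2010))  # -> False
```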
The PRISMA methodology [87] was followed in the selection of the papers, providing a formal protocol that ensures accuracy and impartiality in the search of the titles in Scopus. Figure 1 shows the steps followed during the study selection.

2.4. Data Extraction Strategy

The data extraction strategy refers to the way in which each question should be answered for each selected paper. This step requires a previous classification of the possible answers to each MQ and some indications to extract this information from the papers. The extraction strategy developed for the present study was as follows.
  • MQ1. To answer this question, the publication source and channel of each paper should be identified. The channel can be classified into journals, books, and conferences. The source refers to the name of the corresponding journal, book, or conference.
  • MQ2. In order to draw conclusions about the publication trends, articles should be classified per publication year. Therefore, this question extracts the year of each paper.
  • MQ3. Research works can be of different types, for example, a paper can propose new methods and techniques, it can evaluate existing solutions in a new application, or it can describe a specific experience that could be useful for other researchers. According to the authors of [88], the types of research can be classified into the following categories.
    -
    Evaluation Research. In this case, the research consists of the evaluation of an approach in RSA. This class also includes identifying new problems in RSA.
    -
    Solution Proposal. Research works which involve proposing a new solution for an existing problem in RSA. The proposed approach must be new, or it can be a relevant modification of some existing method. Extensive experimentation is not required.
    -
    Experience Papers. These articles describe the personal experience of the authors. The paper explains what has been done and how it has been done in practice, together with the obtained results.
    -
    Other. Other types of research can include, for example, reviews, opinion papers, etc.
    It is also possible to find some papers that can be classified into different categories, for example, an article can propose a new technique and perform an extensive experimental validation.
  • MQ4. Most of the research works are expected to include an empirical validation of the theoretical advances and proposals. This experimentation can be done in different ways. According to the authors of [89], the empirical research types can be classified into the following.
    -
    Case study. It is an empirical inquiry that investigates a phenomenon in its real-life context. One or many case studies can be described.
    -
    Survey. A survey is a method for collecting quantitative information related to aspects in RSA research, for example, through a questionnaire.
    -
    Experiment. This case refers to an empirical method applied under controlled conditions to observe its effects and the results of certain processes or treatments.
    -
    Data-based experiments. This is a different case from the previous category, as the research does not involve new experiments, but the data available from previous experiments is used. It can be either a public or private database.
    -
    Other. Other types can include meta-analysis, history-based evaluation, etc. It is also possible that some papers do not report any empirical validation.
  • MQ5. Another interesting aspect to analyze is the type of techniques that are used in the papers, that is, the computer vision or machine learning tasks that are addressed in [14]. Many different classifications can be found in the literature. Following the work in [82], in the present review, the techniques are classified as follows.
    -
    Image preprocessing and segmentation. Although they are different problems, the two are closely related, since both the input and the output are images. Besides, they are typically the first steps of many computer vision systems. Image preprocessing includes the techniques whose purpose is to improve the quality of the captured images [90], e.g., to remove noise, enhance image contrast, correct geometric deformations, or remove artifacts. Image segmentation consists of separating image regions into different categories [78], e.g., separating plants and background, or detecting the regions of a crop of interest. Segmentation can be considered a result by itself, or it can be the input for further processing.
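    A minimal example of this segmentation step: thresholding a per-pixel NDVI map to separate vegetation from background. The 0.3 threshold and the tiny map are illustrative choices, not values from the reviewed papers:

```python
def segment_vegetation(ndvi_map, threshold=0.3):
    """Return a binary mask: True where NDVI exceeds the threshold
    (vegetation), False elsewhere (soil, water, background)."""
    return [[value > threshold for value in row] for row in ndvi_map]

# A 2x3 synthetic NDVI map: vegetation on the left, bare soil on the right
ndvi_map = [
    [0.82, 0.75, 0.10],
    [0.64, 0.05, 0.02],
]
mask = segment_vegetation(ndvi_map)
print(mask)  # [[True, True, False], [True, False, False]]
```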
    -
    Feature extraction. Most frequently, after segmenting the regions of interest in the images, a set of features are extracted from them, although it can also be applied to the entire image. Feature extractors are a set of techniques to obtain relevant and high-level data from the images. The most usual types of features in RSA are color, texture, shape, and spectral features [91]. In many cases, the features are not explicitly predefined by the human experts, but they are given by a machine learning algorithm [75]. The extracted features can be used later for computing parameters of interest from the images, such as the water stress of the plants, or the crop yield.
    -
    Similarity measures and maximum likelihood. Most empirical research has been dedicated to finding effective similarity measures on the extracted features. Then, the similarity values can be used in a maximum likelihood approach [92]. This can be used, for example, to predict the evolution of a certain crop from other previously observed cases with similar characteristics.
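    A maximum-likelihood decision on a single extracted feature can be sketched as follows: each class is modeled by a Gaussian fitted to training samples, and the query is assigned to the class under which it is most likely. The NDVI samples and class names are synthetic, purely for illustration:

```python
import math
import statistics

def gaussian_pdf(x, mean, std):
    """Probability density of x under a normal distribution."""
    return math.exp(-((x - mean) ** 2) / (2 * std ** 2)) / (std * math.sqrt(2 * math.pi))

def max_likelihood_class(training, query):
    """training: {class_name: [feature samples]}. Fit a Gaussian per class
    and return the class with the highest likelihood for `query`."""
    best_class, best_p = None, -1.0
    for name, samples in training.items():
        mean = statistics.mean(samples)
        std = statistics.pstdev(samples)
        p = gaussian_pdf(query, mean, std)
        if p > best_p:
            best_class, best_p = name, p
    return best_class

# Synthetic NDVI samples for two hypothetical land-cover classes
training = {"irrigated": [0.70, 0.75, 0.80], "dry": [0.20, 0.25, 0.30]}
print(max_likelihood_class(training, 0.68))  # -> irrigated
```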
    -
    Classification systems. Given an image, or an image region, classification consists of determining the most likely class among a predefined set of classes [32,39,40]. For example, it can be used to classify a segmented region of plants in crop or weed, it can be used to classify a plot in dry land or irrigated, or to classify a fruit in unripe/ripe/overripe. Common classifiers used in RSA include support vector machines (SVM) [69,70], decision trees (DT), and artificial neural networks (ANN) [52], although they can also be used in the other problems.
    -
    Recognition systems. The purpose of a recognition system is to find the specific identity of an object of a given class. For example, a segmentation step can be used to separate an image into plant/background; then, a classifier is applied to determine whether a plant region is a tree, a grass, or a weed; finally, the recognition step would determine the specific type of tree, grass, or weed [77]. Obviously, a recognizer cannot be expected to deal with all the instances of all the classes, but only with those species of interest on which it has been trained.
    -
    Other machine learning algorithms. In this category, we include additional applications of machine learning algorithms [14]. These can include regression algorithms (e.g., for estimating the crop evapotranspiration), decision support systems (e.g., for deciding the fertigation schedules), or methods to automate different processes (e.g., harvesting or fumigation machines).
    A complete computer vision system in agriculture should include many (if not all) of these techniques. Therefore, the papers have been classified according to the area where the most important contributions are done, although they could be classified into different categories.
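    As a concrete instance of the feature extraction technique discussed above, the sketch below computes simple per-channel color statistics over a segmented region. The pixel values are synthetic, purely for illustration:

```python
import statistics

def color_features(pixels):
    """Mean and standard deviation of each RGB channel over a region.
    `pixels` is a list of (R, G, B) tuples from the segmented region."""
    features = []
    for channel in range(3):
        values = [p[channel] for p in pixels]
        features.append(statistics.mean(values))
        features.append(statistics.pstdev(values))
    return features  # [mean_R, std_R, mean_G, std_G, mean_B, std_B]

# A small, mostly green region (channel values in 0-255)
region = [(30, 180, 40), (35, 170, 45), (25, 190, 35)]
feats = color_features(region)
print(feats[2])  # mean of the green channel -> 180
```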
  • MQ6. The platforms typically used to capture the images in agriculture are highly diversified [93]. They can be classified according to different criteria, such as the type of information captured (spectral or depth maps), the spatial, spectral and photometric resolution, or the type of cameras. However, they are most commonly classified considering the type of vehicles or devices in which they are mounted [80]. The main categories are listed as follows.
    -
    Satellite imagery. Satellites are characterized by offering images of very large areas, with lower temporal resolution compared to the other platforms [94,95]. The high cost of this kind of device places them beyond the reach of farmers, being controlled instead by governmental or international institutions. However, in many cases, these organizations provide free access to the obtained satellite images for research purposes. Another characteristic of satellites is that most of them are equipped with multispectral or hyperspectral cameras [96].
    -
    Drones, UAVs, and manned aircraft. The use of these types of devices in agriculture has experienced a huge growth in the last decade [18]. In general, an aircraft is any vehicle which is able to fly. When they include a human pilot, they are referred to as manned aircraft, while the term Unmanned Aerial Vehicle (UAV) is used when the vehicle can fly remotely (controlled by a human) or autonomously (without human control) [81]. The term drone is normally used as a synonym of UAV; however, it can also be used for other types of aquatic or land vehicles. Thus, all UAVs are drones, but not all drones are UAVs. The use of the term Unmanned Aerial System (UAS) is also frequent [97], which refers not only to the flying vehicle, but also to the ground control, communication units, support systems, etc. Compared to manned aircraft, UAVs are normally less expensive, less invasive, and safer tools, so they can be used in sensitive areas such as the polar regions [98]. The most common type of operation is the so-called visual line of sight (VLOS), where the pilot can directly see the UAV at all times; however, some systems are prepared to operate beyond visual line of sight (BVLOS) [99], allowing larger areas to be covered.
    -
    Other types of vehicles. In many cases, remote capture systems can be incorporated into existing farm machinery [41], such as trucks, tractors, combine harvesters, etc. In this case, the images are typically used in real time during the agricultural processes of plowing, irrigation, planting, weeding, or harvesting, rather than for offline analysis. We also include in this category other types of autonomous vehicles that cannot be considered UAVs, such as aerial balloons.
    -
    In-field installations. Remote image capture systems in agriculture also include field installations of fixed cameras. They can be considered remote in the sense that they are used and controlled remotely, not in the capture distance. They are usually based on inexpensive cameras communicating wirelessly, which are able to perform real-time monitoring of the crops [13]. On the other hand, they have lower resolution than the other modalities, they only capture a small portion of the plots, and normally only RGB images are used. In some cases, they can be integrated into a wider Wireless Sensor Network (WSN) installed on the farms; these include other types of sensors (thermometers, barometers, lysimeters, etc.) that are out of the scope of the present review.
  • MQ7. To date, a large number of different problems have been addressed with the RSA techniques listed above [83]. However, this fact does not limit the possibility that other new topics and areas of application will appear in the future. According to the recent reviews [17,100], the main applications of interest can be classified as follows.
    -
    Agricultural parameters estimation. In this case, remote images are used to estimate parameters of large plots that would be difficult or expensive to be obtained using in-field methods. These parameters of interest can include crops or cropland parameters [45], for example, the height of the plants, the leaf area index (LAI), the percentage of green cover (PGC), the total biomass, the depth of the roots, or the surface roughness can be estimated.
    -
    Drought stress, irrigation, and water productivity. Due to the great importance of water in agriculture, this category includes all applications related to water and irrigation (although some of them could also be understood as parameter estimation) [91,101]. Optimization of water resources is an essential aspect of global sustainability due to the great water shortage in many regions. A key parameter is water balance, which measures the water incomes and outcomes, including the crop evapotranspiration (ET).
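    The water balance mentioned above can be illustrated with a highly simplified bookkeeping function. All terms are in mm; lumping drainage and runoff into one loss term is an assumption of this sketch, not a model from the reviewed papers:

```python
def soil_water_balance(initial_storage, precipitation, irrigation,
                       evapotranspiration, losses):
    """Simplified field-scale water balance (all quantities in mm):
    new storage = initial + inputs (rain, irrigation)
                  - outputs (crop ET, drainage/runoff losses)."""
    return initial_storage + precipitation + irrigation - evapotranspiration - losses

# Hypothetical week: 120 mm stored, 10 mm rain, 25 mm irrigation,
# 30 mm crop ET, 5 mm drainage/runoff -> storage is unchanged
print(soil_water_balance(120, 10, 25, 30, 5))  # -> 120
```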
    -
    Nutrient status. Nutrient efficiency and the avoidance of nutrient losses are other topics that have received much attention in the RSA literature. The proper use of nutrients can also be aimed at reducing environmental pollution. The use of nitrogen (N) is particularly relevant, as it has been shown to affect the leaf and plant reflectance signatures [17].
    -
    Growth vigor. Monitoring plant vigor during the different stages of growth is another of the principal applications of RSA [23]. It can be based on different parameters such as the growth in plant height, the total biomass, and the PGC. We distinguish this category from parameter estimation in that these works perform a temporal analysis of the images.
    -
    Detection of pathogens, diseases, and insect pests. Early detection of these problems can help reduce losses. Precision agriculture systems are able to reduce pesticide use by performing site-specific spraying [102]. Thus, the effectiveness of these systems is related to the quality, yield, and sustainability of the crops.
    -
    Weed detection. The emergence of weeds is another problem that can arise during the cultivation process, leading to a reduction in the water and nutrients available for the crops of interest [103]. As weeds are also plants, the distinction between crops and weeds must be made using color, texture, shape, or spectral features.
    -
    Yield prediction. Regarding the last stages of the cultivation process, remote sensing images have been used to predict the yield before the actual harvesting [58]. These systems are usually based on regression models using parameters extracted from the images, although the most precise methods use accumulated temporal information and crop growth models.
    -
    Automatic crop harvesting. Intelligent harvesting machinery and picking robots have emerged in recent years as a feasible alternative to traditional harvesting methods [80], although the first experimental systems for automatic harvesting using machine vision date back to the 1980s.
  • MQ8. Computer vision systems in agriculture are not exclusively based on the use of visible light; a wide range of the electromagnetic spectrum has proven to be effective in different RSA applications, normally at frequencies lower than those of visible light (i.e., longer wavelengths). Several reviews have analyzed the suitability of spectral information for different RSA problems [17,80,104]. The main types can be classified as follows.
    -
    RGB (visible spectrum). The visible spectrum corresponds to the wavelengths between 380 and 740 nm, which are visible to the human eye [105]. RGB cameras do not capture a complete spectrum of these wavelengths, but only three bands corresponding to the red, green, and blue colors. The main advantages of this category are the high availability, high spatial resolution, and low cost of the cameras with respect to the other types of sensors. For these reasons, it is the predominant class in computer vision in general.
    -
    Red edge spectrum. This class corresponds to a small part of the visible spectrum, located at the long-wavelength (lowest-frequency) end, approximately from 670 to 740 nm. It is particularly important in agriculture [104], as the chlorophyll contained in vegetation reflects most of these wavelengths, while it absorbs a great part of the rest of the visible spectrum. Therefore, several vegetation indices have been defined based on the relationship between the reflectance of the red edge and the red band.
    -
    Near-infrared (NIR) and Vis-NIR. NIR includes the part of the infrared spectrum nearest to the visible region, approximately from 740 to 1500 nm. This class is also characterized by a high reflectance from plants. The normalized difference vegetation index (NDVI) [23] is based on the NIR and red bands, and is a very common parameter for studying the amount and healthiness of vegetation. Consequently, most works include NIR and visible bands, a typical range being from 400 to 1500 nm; this is usually called visible-NIR or Vis-NIR.
    -
    Short-wave infrared. The term infrared refers to a broad slice of the electromagnetic spectrum ranging from 740 nm to 1 mm [93]. It is subdivided into near, short-wave, mid-wave, long-wave, and far infrared, from shortest to longest wavelength. Short-wave infrared is located approximately from 1.5 to 3 μm. This range is characterized by a high absorption by water, so it is especially interesting for moisture analysis.
    -
    Long-wave infrared. This range corresponds to 8–15 μm. It is also called thermal infrared [106], as it contains the wavelengths of the thermal emission of objects. It is widely used in studies of soil moisture, crop evapotranspiration, and water balance, which can be estimated from the relative temperatures [107].
    -
    Synthetic aperture radar (SAR). Unlike the previous passive sensing methods, SAR is an active sensing technique [45]. This means that the capture device emits some kind of radiation and receives the echo; normally, microwave radiation in different bands is used. This type of radar is called synthetic aperture because it takes advantage of the motion of the satellite or aircraft to simulate a large antenna, thus providing higher-resolution images. The polarization properties of the waves are also used to provide more information about the land. The captured images are unaffected by clouds, and the technique can be used in night-time operation. Although passive microwave capture is also possible, it is less used in RSA.
    -
    Light Detection and Ranging (LiDAR). This method also belongs to the category of active remote sensing, usually mounted on satellites and aircraft. In this case, the radiation is emitted by a laser beam, and the echo time is measured to calculate the distance to the target. Unlike the other methods, which obtain radiation/absorption images at different wavelengths, the data obtained are depth images [108]. These images are also called digital elevation models (DEM). They can be used, for example, to estimate the height and volume of the plants.
    In addition, two other related terms are multispectral and hyperspectral images. These categories do not correspond to specific wavelengths, but to the number of channels that are captured.
    -
    Multispectral images (broad band). When the number of channels captured for each pixel is small, usually between three and 10 channels, we call them multispectral images [18]. Each channel corresponds to a broad range of the spectrum, which can have a descriptive name. For example, an RGB image can be understood as a multispectral image with three channels. In this review, this category has been used only when the paper cannot be classified into the previous classes. For example, the Landsat-8 satellite (https://www.usgs.gov/land-resources/nli/landsat/landsat-8) is able to capture 11 different bands (although not all of them with the same spatial resolution).
    -
    Hyperspectral images (narrow band). These images are characterized by having a large number of channels, which can number in the hundreds or even thousands [18]. For example, the Hyperion imaging spectrometer is able to capture 224 bands at 10 nm wavelength intervals [109]. This high number of channels allows obtaining the spectral signature of the observed objects, in order to analyze their chemical composition. However, most computer vision techniques are designed for images with few channels. Specific methods should be applied when the spatial resolution of the images is low but the number of channels is very large.
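To make the spectral indices mentioned in the NIR category concrete, the NDVI can be computed per pixel as (NIR − red)/(NIR + red). The following minimal NumPy sketch uses illustrative reflectance values of our own choosing; the small epsilon in the denominator is an added safeguard, not part of the index definition.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, computed per pixel.

    Values close to 1 indicate dense, healthy vegetation; values near
    zero or below typically correspond to bare soil, water, or clouds.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids 0/0

# Toy reflectance values: a vegetated pixel and a bare-soil pixel
nir_band = np.array([0.6, 0.3])
red_band = np.array([0.1, 0.25])
values = ndvi(nir_band, red_band)
```

The same normalized-difference pattern underlies red-edge indices, substituting the red-edge band for the red band.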
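Regarding the crop/weed and plant/soil distinctions based on color features (MQ7 above), a classic RGB-only approach is the Excess Green index. The sketch below is a pedagogical illustration; the 0.1 threshold and the toy pixel values are our own assumptions, and in practice the threshold is often chosen automatically (e.g., by Otsu's method).

```python
import numpy as np

def excess_green_mask(rgb, threshold=0.1):
    """Segment green vegetation from soil using the Excess Green index.

    ExG = 2g - r - b on normalized chromatic coordinates; pixels whose
    ExG exceeds the (illustrative) threshold are labeled as vegetation.
    """
    rgb = np.asarray(rgb, dtype=float)
    total = rgb.sum(axis=-1) + 1e-9          # avoid division by zero
    r, g, b = (rgb[..., i] / total for i in range(3))
    exg = 2.0 * g - r - b
    return exg > threshold

# Toy 1x2 image: a green (vegetation) pixel and a brownish soil pixel
img = np.array([[[40, 180, 30], [120, 100, 80]]], dtype=np.uint8)
mask = excess_green_mask(img)
```

Separating vegetation from soil this way is a common first step before distinguishing crops from weeds with texture or shape features.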
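The water balance mentioned under drought stress and irrigation can be illustrated with a simplified daily soil water budget. The function below is a hedged sketch under stated assumptions: the FAO-56-style estimate ETc = Kc × ET0 and all numeric values are illustrative, not drawn from the reviewed papers.

```python
def daily_water_balance(storage, precipitation, irrigation, et0, kc, capacity):
    """One step of a simplified daily soil water balance (all values in mm).

    Crop evapotranspiration is approximated as ETc = Kc * ET0; any water
    exceeding the soil storage capacity is treated as drainage.
    """
    etc = kc * et0                                   # crop evapotranspiration
    storage = storage + precipitation + irrigation - etc
    drainage = max(0.0, storage - capacity)          # excess water drains
    storage = min(max(storage, 0.0), capacity)       # keep storage in bounds
    return storage, etc, drainage

# Example: 60 mm stored, 5 mm rain, 10 mm irrigation, ET0 = 6 mm, Kc = 0.8
s, etc, d = daily_water_balance(60.0, 5.0, 10.0, 6.0, 0.8, capacity=100.0)
```

Remote sensing typically feeds such a budget by estimating Kc or ETc from vegetation indices or thermal imagery.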
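The use of LiDAR depth images (DEMs) to estimate plant height can be sketched as the difference between a surface model (top of canopy) and a terrain model (bare ground). The elevation values below are invented for illustration.

```python
import numpy as np

def canopy_height_model(dsm, dtm, min_height=0.0):
    """Estimate per-pixel plant height from LiDAR-derived elevation models.

    dsm: digital surface model (top of canopy), dtm: digital terrain
    model (bare ground), both in meters. Small negative differences
    (sensor noise) are clipped to min_height.
    """
    chm = np.asarray(dsm, dtype=float) - np.asarray(dtm, dtype=float)
    return np.clip(chm, min_height, None)

# Toy 2x2 plot: flat terrain at 100 m, two plants of 1.5 m and 0.8 m
dsm = np.array([[101.5, 100.0], [100.8, 99.9]])
dtm = np.array([[100.0, 100.0], [100.0, 100.0]])
heights = canopy_height_model(dsm, dtm)
```

Summing or averaging such a canopy height model over a plot gives simple proxies for plant volume and biomass.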

2.5. Synthesis Procedure

After defining the mapping questions of interest, selecting the candidate papers, and performing the data extraction, the last step of the systematic mapping study is to synthesize the results. For each MQ, the papers are classified into the corresponding category (or categories, if more than one is applicable), and the results are presented in charts. Afterwards, these results are discussed using a variety of evaluation approaches. Finally, a narrative summary presents the main findings of the mapping study.

3. Results of the Systematic Mapping Study

As shown in Figure 1, 1590 candidate papers were first obtained by applying the search string in the Scopus database. From these, 1131 publications were selected after the application of exclusion criterion EC1. However, due to this large number, the more restrictive criterion EC2 was also applied; recall that this second criterion requires an average of 6 citations per year, so it is expected to extract the most relevant works. Finally, a total of 106 studies were selected and analyzed to answer the MQs. The results obtained in the classification are presented in the following subsections.

3.1. MQ1. What Publication Channels Are the Main Targets for RSA?

This question refers both to the type of channel and to the name of the publication. Figure 2 shows that almost all the selected papers were published in scientific journals, except for two conference papers and one book. The names of the journals with more than one publication are shown in Table 2. It is interesting to observe that all these journals are indexed in the Journal Citation Reports, most of them in quartiles Q1 and Q2.

3.2. MQ2. How Has the Frequency of Approaches Related to RSA Changed over Time?

For this mapping question, it is interesting to consider both the set of 1131 candidate papers after applying EC1 and the final set of 106 papers after applying EC2. Figure 3 presents the number of articles published per year until 2019. This figure shows that there has been an important increase in the number of publications in the RSA field in the last decade. Since 2000, this growth has followed a linear trend. Although the first papers date back to the 1970s, no paper meets the strict EC2 criterion until 1997; from 2002 onwards, there are always selected papers.
There is an evident decrease in the number of publications in 2019. However, this is a consequence of the study having been carried out in the first months of 2020: it is possible that many publications from the end of 2019, particularly proceedings, were yet to be indexed in the database used. The same reasoning explains the small number of selected papers from that year, compounded by the fact that they have not had time to receive a sufficient number of citations.

3.3. MQ3. What Are the Main Research Types of RSA Studies?

Four standard categories were defined for the types of research: evaluation research, solution proposal, experience papers, and others. Figure 4 shows that only three of these types were identified in the highly cited papers about RSA. Most of the papers were evaluation research (57%), and almost one-third of selected publications were solution proposals (28%). Reviews represented the remaining 15% of the selected papers.
The large proportion of review papers found may be surprising; it can be explained by the large number of citations that reviews typically receive.

3.4. MQ4. Are RSA Studies Empirically Validated?

This question is closely related to the previous one, as both give an overview of how research is done. For this reason, Figure 4 shows the relationship between research types and empirical validation. Except for the review papers (which do not require validation), all the selected works were empirically evaluated. Most of the papers were evaluated through experiments, particularly data-based experiments. One paper explicitly stated using a meta-analysis approach in its evaluation. Moreover, only 2% of the selected papers conducted case studies. These results demonstrate the importance of creating complete, verified, and publicly available remote image databases for RSA research.

3.5. MQ5. What Types of Techniques Were Reported in RSA Research?

The most frequent types of techniques identified in the selected papers are presented in Table 3. More than half of the papers (54/106) focused on classification systems. It has to be noted that the type does not necessarily correspond to the final result of a proposed system. For example, a classification system can be used to classify crops and weeds, with the final result presented as a method for detecting weeds, or it can perform a binary plant/soil classification in order to estimate the crop coefficient. Therefore, the content of the papers was analyzed in detail to extract their main contributions.
The second most frequent computer vision task is feature extraction (16/106), which can be used for a subsequent classification, estimation, recognition, or monitoring process. The remaining categories (similarity measures and maximum likelihood, image processing and segmentation, recognition systems, and other machine learning algorithms) present very similar numbers of papers. In addition, a total of 10 papers were classified under more than one technique [108,110,111,112,113,114,115,116,117,118], considering their main contributions.

3.6. MQ6. What Are the Platforms Used to Capture the Images for RSA?

Figure 5 depicts the most frequent types of systems used in RSA to capture the images. In some cases, different capture devices are used, so the total number of systems is higher than the number of papers. Moreover, the capture process was not necessarily carried out by the authors, as the research could be based on existing datasets.
Concerning the obtained results, the small number of research works based on in-field low-cost cameras is remarkable. Although much research has been done with such systems, this may be due to the fact that, in some contexts, they are not considered to belong to the remote sensing category. On the other hand, the most frequent type of platform used in the research is the satellite, followed by drones and manned aircraft, and finally other types of vehicles.

3.7. MQ7. What Are the Research Topics Addressed by RSA?

Again, this mapping question may be subject to different interpretations, since a paper can address different topics or be on the borderline between some of them. Thus, a careful inspection of the literature was done to classify the papers into the most appropriate category. As a result, the main problems detected are shown in Table 4. In the case of automatic crop harvesting, no papers were found in the present mapping study, possibly because they do not consider the use of remote images.
The results indicate that the different categories are not equally distributed. The topics that received the most attention are the estimation of agricultural parameters, the analysis of crop vigor, and the problems related to water usage; these represent more than 77% of the papers. The works related to the detection of pathogens, diseases, and insect pests are about 10% of the total. At the other end, the classes with relatively fewer publications are yield prediction, weed detection, and the analysis of nutrient status. Therefore, these types of problems represent a good opportunity to advance RSA research.

3.8. MQ8. What Are the Different Types of Spectral Information Used?

This last mapping question refers to the type of spectral information of the images used in the research. As described, a wide range of the electromagnetic spectrum has proved to be useful in RSA. Each class can be suitable for a specific problem, or it can be applied to different tasks. The classification of the papers is presented in Table 5. As the labels of multispectral and hyperspectral data are not incompatible with the rest of the categories, some papers are classified into more than one class. In addition, many works use different types of images, so they can also be classified into different classes. For example, in [204], three types of images (RGB, multispectral, and thermal) are compared for the problem of high-throughput plant phenotyping, using UAV and in-field cameras.
According to these results, standard RGB images (i.e., the visible spectrum) are clearly the most frequent type of images employed (46/106). Hyperspectral images are also found very frequently (28/106), in many cases captured from UAVs or acquired from satellites. Apart from the visible spectrum, the bands most used in the research are thermal and near infrared. On the opposite side, LiDAR and short-wave infrared are less commonly used.

4. Discussion

4.1. Main Findings and Implications for Researchers and Practitioners

The ultimate purpose of the study is to gain an in-depth understanding of the current state of research in remote sensing in agriculture, in order to give suggestions about future lines of research and finding new possibilities and application areas. This is achieved by an analysis and discussion of the results presented in the previous section. The major findings that can be extracted are the following.
  • The main publication channels of the selected papers are journals, far ahead of books and conferences. This is caused by the introduction of the strict exclusion criterion EC2 of six citations per year; publications in journals are known to be cited more than those in conferences. Although conferences are important publication venues for computer science researchers [205], the research community tends to prefer publishing in journals due to the tenure and promotion guidelines of many institutions, which only consider publications in high-impact-factor journals [206]. However, the role of conferences as a means of spreading new ideas, showing ongoing research, and connecting researchers should not be dismissed.
  • The research field of RSA has gained an increasing interest since the beginning of the millennium. This can be explained by the new technologies that appeared in this period (cameras, satellites, and UAVs) in addition to the improvements in telecommunications and data transmission. The decrease of publications observed in 2019 is a collateral effect of the review procedure and the minimum required number of citations. Thus, the increasing interest in RSA is expected to continue in the near future, favoring the appearance of new journals and conferences more specialized in the different areas of RSA.
  • Most of the selected papers are evaluation research using datasets. Solution proposals represent almost one-third of the selected papers, which indicates that the field has reached a certain maturity and researchers are more interested in evaluating existing technologies than in proposing new ones. This is also supported by the large number of reviews identified. On the other hand, this highlights the importance of creating public and comprehensive datasets on which the results of different authors can be compared. It would be advisable for this effort to be carried out by existing institutions and associations, rather than by particular research groups. Examples of these public resources include EuroSAT [40], a public dataset of 27,000 labeled and georeferenced images from the Sentinel-2 satellite, useful for the classification of land usages; the DeepSat Airborne Dataset [94], with 500,000 image patches in the Vis-NIR range; and the Copernicus Programme, which offers satellite and in-situ images for land monitoring (https://land.copernicus.eu/).
  • The majority of the empirically evaluated publications were conducted through experiments. Only two case studies were identified among the selected papers, which suggests that this type of study is difficult to perform in RSA, as the research is normally done under uncontrolled settings. The small number of meta-analysis papers found indicates an interesting opportunity to apply this type of statistical analysis, whose purpose is to combine the results of multiple previous scientific works in order to assess them and derive stronger conclusions.
  • The computer vision task most frequently found in the selected papers is classification: given an image patch or region, classify it into a predefined set of classes of interest. This is an expected observation, as it is one of the most studied machine learning problems, it has a simple and clear definition, and its results can be used in different applications. Decision trees, support vector machines, classical neural networks, k-nearest neighbors, and Bayes classifiers are among the most frequent techniques. However, deep learning methods are gaining popularity [20,74,76,77,78,79], having proved to outperform other techniques in many domains; nevertheless, they should be applied when they are really of interest and not as a fad. An interesting alternative could be the use of ensemble classification systems, which have not been widely used in RSA research. Other problems identified in the selected papers include feature extraction, maximum likelihood, image preprocessing and segmentation, and recognition systems. As a general recommendation, we advise making use of free tools and libraries for machine learning and computer vision, taking advantage of the great effort made by the free software community, for example using Python with tools such as the scientific programming environment Scikit-Learn (https://scikit-learn.org/stable/) and the deep learning ecosystem PyTorch (https://pytorch.org/).
  • The main types of platforms employed in RSA to capture the images are satellites, UAVs, and manned aircraft, whereas in-field cameras and ground vehicles have not been widely used. The preference for airborne and spaceborne images over ground-based ones can be explained by the broader view of the land that they provide. In addition, the resolution of the cameras makes it possible to go from a global perspective to a more detailed view of a specific area. The fusion of satellite and UAV imagery [207] is an emerging field that would be very useful to harness the power of both capture systems. It is interesting to observe that much satellite imagery is freely available for research purposes, so this would be a convenient source for beginners. Among the most cited satellites in the selected papers, NASA’s Landsat missions (https://landsat.gsfc.nasa.gov), ESA’s Sentinel missions (https://sentinel.esa.int/), and ESA’s Envisat satellite (https://www.esa.int/Applications/Observing_the_Earth/Envisat/Mission_overview) can be mentioned. Other satellites referred to in several papers include Proba-1/2, Spot, QuickBird, Ikonos, TerraSAR-X, and Radarsat-2. In the domain of UAVs, some research teams are specifically dedicated to the hardware development of capture systems that can be applied to different tasks, such as the PhenoFly system (https://kp.ethz.ch/infrastructure/uav-phenofly.html), which has been used in many publications.
  • The main research topics addressed by the RSA community were growth vigor, cropland parameter extraction, and water usage. From a general perspective, these problems have the potential to impact the sustainability of agriculture: better water usage, together with adequate knowledge of the cropland, leads to better growth vigor, crop quality, and efficiency. Indeed, sustainability is one of the main goals of precision agriculture, in line with the United Nations’ Sustainable Development Goals (SDG2: end hunger, achieve food security and improved nutrition, and promote sustainable agriculture; SDG15: protect, restore, and promote sustainable use of terrestrial ecosystems, sustainably manage forests, combat desertification, and halt and reverse land degradation and halt biodiversity loss). Sustainability was not a topic specifically addressed in the present mapping study, although all the problems are in some way related to it. The detection of pathogens, diseases, and insect pests attempts to reduce the amount of pesticides and insecticides; weed detection allows the site-specific spraying of herbicides; and nutrient analysis is related to the optimal use of fertilizers. On the other hand, yield prediction and automatic crop harvesting seek to optimize the productivity of farming. It cannot be ruled out that new problems and applications will appear in the future with the advance of technology, such as those involved in a completely automated cropping cycle.
  • Concerning the types of images used, standard RGB images continue to be the most frequent. This can be explained by the low cost and high availability of RGB cameras, and by the fact that they are the main source for computer vision in general. In this sense, RSA research is commonly seen as a sub-domain of computer vision and image processing. New methodologies should be developed that are more specific to the agricultural domain, for example considering the spectral, temporal, and phenotypic dimensions. In many works, the NIR channel is added to the RGB channels, forming a 4-valued tuple for each pixel. The visible spectrum allows results to be validated by simple inspection, but important information may be lost or not detected with this type of images. For example, the temperature of the objects cannot be measured with Vis-NIR images, although it could be useful to estimate the water status of the plants. Hyperspectral images and thermal infrared were found at a second level of usage, with near infrared, the red edge spectrum, and multispectral images in a third group. When the spectral bands of interest are known a priori, multispectral images can be more interesting than hyperspectral ones, focusing on the wavelengths of interest. In this way, an interesting domain of RSA research that deserves much more work is the determination of the optimal spectral bands for each problem [208]. Ideally, new cameras could be made that are specific to the selected and reduced number of wavelengths for each problem.
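As an illustration of the per-pixel classification task discussed above, the following sketch implements a k-nearest-neighbors classifier in pure NumPy. The two-band reflectance values and class labels are toy assumptions for illustration only; a real study would typically rely on the Scikit-Learn implementations recommended in the text.

```python
import numpy as np

def knn_classify(train_x, train_y, samples, k=3):
    """Minimal k-nearest-neighbors classifier for per-pixel spectral features.

    train_x: (n, bands) reference spectra; train_y: (n,) integer labels;
    samples: (m, bands) pixels to classify. Each sample is assigned the
    majority label among its k closest training spectra.
    """
    train_x = np.asarray(train_x, dtype=float)
    samples = np.asarray(samples, dtype=float)
    # Pairwise Euclidean distances, shape (m, n)
    dists = np.linalg.norm(samples[:, None, :] - train_x[None, :, :], axis=2)
    nearest = np.argsort(dists, axis=1)[:, :k]     # indices of k closest
    votes = np.asarray(train_y)[nearest]           # their labels
    return np.array([np.bincount(v).argmax() for v in votes])

# Toy training data: [red, NIR] reflectances for soil (0) and crop (1)
x = [[0.30, 0.35], [0.28, 0.30], [0.32, 0.33],    # soil: flat spectrum
     [0.08, 0.55], [0.10, 0.60], [0.07, 0.50]]    # crop: strong NIR
y = [0, 0, 0, 1, 1, 1]
labels = knn_classify(x, y, [[0.09, 0.58], [0.31, 0.32]])
```

The same pixel-feature/label interface carries over directly to the decision trees, SVMs, and ensemble classifiers mentioned above.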

4.2. Limitations of the Mapping Study

The last step in the execution of a systematic mapping study is the evaluation of the limitations and weak points of the study itself. This analysis has to consider all the steps of the process. Several possible limitations have been recognized for the present study:
  • Using only Scopus as a source of publications. Other relevant publications that are not indexed in Scopus could have impacted the final results. However, our focus was to provide an overview of the most highly cited papers. For this reason, we chose to focus on Scopus, as it is one of the largest databases available. Moreover, it has one of the most complete search interfaces, allowing Boolean combinations and wildcards.
  • Some missing terms in the search string might have impacted the results. In order to reduce this limitation, we formed the search string to include a broad range of terms of interest to this study. Besides, the search string was refined in several trials, by observing some papers that were not initially included. Therefore, we consider this threat to be low.
  • Other classification criteria not present in this study might have provided interesting views on the selected papers. The eight mapping questions included provide interesting findings to researchers and practitioners, although other questions could be also useful.
  • The exclusion criterion of six citations per year could have rejected some interesting papers. This criterion was added to select only the most relevant works in the literature. It has proved to be very restrictive both for recent papers (only two papers from 2019 received more than six citations per year) and for older publications (e.g., papers published before the year 2000 require more than 120 citations). However, as the purpose of the mapping study is to analyze the trends of the most relevant works, this is not a threat to the validity of the study.

5. Conclusions

Precision agriculture is a very active area of research with a significant impact on the improvement of global sustainability and the optimization of natural resources. It is based on information and communication technologies to achieve its goals, with remote image capture systems being one of its main branches. This includes the development of cameras and capture devices, the remote transmission of the images, image processing and computer vision tasks, and machine learning methods to automate farming decisions.
The present systematic mapping study has presented a quantitative and qualitative analysis of the state of the art in this rapidly evolving area. Since the year 2000 alone, more than 1400 journal and conference papers were found, and this trend is expected to continue in the future. A selection of the 106 most highly cited papers was made to obtain an in-depth view of the state of the research. The archetypal paper is a journal manuscript describing a classification problem on a dataset of satellite or UAV imagery, using existing computer vision and machine learning techniques, possibly with minor adaptations, applied to problems of parameter estimation, growth vigor, and water usage. Standard RGB and hyperspectral images are the most frequently found, although many works use different modalities.
Current trends are towards the popularization of UAVs and the increasing availability of satellite imagery. However, we believe that a solution should integrate in-field cameras and airborne images in order to achieve high spatial and temporal resolution, to cover large areas, and to reduce operational costs. Deep neural networks are also a very marked tendency, as they can obtain excellent results in tasks of classification, segmentation, feature extraction, recognition, and analysis of time series. Finally, this integration should also extend to the development of more holistic approaches that consider all the aspects involved in the cultivation cycle, rather than the problems in isolation.

Author Contributions

Conceptualization, J.A.G.-B. and S.O.; methodology, J.A.G.-B. and S.O.; validation, B.B., G.G.-M., and J.M.M.-M.; formal analysis, J.A.G.-B., S.O., and B.B.; investigation, J.A.G.-B. and S.O.; data curation, J.A.G.-B., S.O., G.G.-M., and J.L.F.-A.; writing—original draft preparation, J.A.G.-B., S.O., B.B., and G.G.-M.; writing—review and editing, J.A.G.-B., S.O., B.B., G.G.-M., J.L.F.-A., and J.M.M.-M.; visualization, J.A.G.-B. and S.O.; supervision, G.G.-M., J.L.F.-A., and J.M.M.-M.; project administration, J.L.F.-A.; funding acquisition, G.G.-M., J.L.F.-A., and J.M.M.-M. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Spanish MICINN, as well as European Commission FEDER funds, under grant RTI2018-098156-B-C53. This research is part of the BIZDEVOPS-GLOBAL-UMU (RTI2018-098309- B-C33) project, and the Network of Excellence in Software Quality and Sustainability (TIN2017-90689-REDT). Both projects are supported by the Spanish Ministry of Science, Innovation and Universities and the European Regional Development Fund (ERDF).

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Brisco, B.; Brown, R.; Hirose, T.; McNairn, H.; Staenz, K. Precision agriculture and the role of remote sensing: A review. Can. J. Remote Sens. 1998, 24, 315–327. [Google Scholar] [CrossRef]
  2. Chang, K.T. Introduction to Geographic Information Systems; McGraw-Hill Higher Education: Boston, FL, USA, 2006. [Google Scholar]
  3. Der Sarkissian, R.; Zaninetti, J.M.; Abdallah, C. The use of geospatial information as support for Disaster Risk Reduction; contextualization to Baalbek-Hermel Governorate/Lebanon. Appl. Geogr. 2019, 111, 102075.
  4. Chaikaew, P. Land Use Change Monitoring and Modelling using GIS and Remote Sensing Data for Watershed Scale in Thailand. In Land Use-Assessing the Past, Envisioning the Future; IntechOpen: London, UK, 2019.
  5. Bai, Y.; Kaneko, I.; Kobayashi, H.; Kurihara, K.; Takayabu, I.; Sasaki, H.; Murata, A. A Geographic Information System (GIS)-based approach to adaptation to regional climate change: A case study of Okutama-machi, Tokyo, Japan. Mitig. Adapt. Strateg. Glob. Chang. 2014, 19, 589–614.
  6. Gökkaya, K.; Budhathoki, M.; Christopher, S.F.; Hanrahan, B.R.; Tank, J.L. Subsurface tile drained area detection using GIS and remote sensing in an agricultural watershed. Ecol. Eng. 2017, 108, 370–379.
  7. Wu, Q. GIS and remote sensing applications in wetland mapping and monitoring. In Comprehensive Geographic Information Systems; Elsevier: Amsterdam, The Netherlands, 2018; pp. 140–157.
  8. Sanga, B.; Mohanty, D.; Singh, A.; Singh, R. Nexgen Technologies for Mining and Fuel Industries; Allied Publishers: New Delhi, India, 2017.
  9. Rocha, F.; Oliveira Neto, A.; Bottega, E.; Guerra, N.; Rocha, R.; Vilar, C. Weed mapping using techniques of precision agriculture. Planta Daninha 2015, 33, 157–164.
  10. Slaughter, D.C.; Pérez Ruiz, M.; Fathallah, F.; Upadhyaya, S.; Gliever, C.J.; Miller, B. GPS-based intra-row weed control system: Performance and labor savings. In Automation Technology for Off-Road Equipment; ASABE: St. Joseph, MO, USA, 2012.
  11. De Castro, A.I.; Torres-Sánchez, J.; Peña, J.M.; Jiménez-Brenes, F.M.; Csillik, O.; López-Granados, F. An automatic random forest-OBIA algorithm for early weed mapping between and within crop rows using UAV imagery. Remote Sens. 2018, 10, 285.
  12. Khanal, S.; Fulton, J.; Shearer, S. An overview of current and potential applications of thermal remote sensing in precision agriculture. Comput. Electron. Agric. 2017, 139, 22–32.
  13. Mateo-Aroca, A.; García-Mateos, G.; Ruiz-Canales, A.; Molina-García-Pardo, J.M.; Molina-Martínez, J.M. Remote Image Capture System to Improve Aerial Supervision for Precision Irrigation in Agriculture. Water 2019, 11, 255.
  14. Liakos, K.G.; Busato, P.; Moshou, D.; Pearson, S.; Bochtis, D. Machine learning in agriculture: A review. Sensors 2018, 18, 2674.
  15. Lindblom, J.; Lundström, C.; Ljung, M.; Jonsson, A. Promoting sustainable intensification in precision agriculture: Review of decision support systems development and strategies. Precis. Agric. 2017, 18, 309–331.
  16. Tmušić, G.; Manfreda, S.; Aasen, H.; James, M.R.; Gonçalves, G.; Ben-Dor, E.; Brook, A.; Polinova, M.; Arranz, J.J.; Mészáros, J.; et al. Current Practices in UAS-based Environmental Monitoring. Remote Sens. 2020, 12, 1001.
  17. Maes, W.H.; Steppe, K. Perspectives for remote sensing with unmanned aerial vehicles in precision agriculture. Trends Plant Sci. 2019, 24, 152–164.
  18. Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J.J. Hyperspectral imaging: A review on UAV-based sensors, data processing and applications for agriculture and forestry. Remote Sens. 2017, 9, 1110.
  19. Bah, M.D.; Hafiane, A.; Canals, R. Weeds detection in UAV imagery using SLIC and the hough transform. In Proceedings of the 2017 Seventh International Conference on Image Processing Theory, Tools and Applications (IPTA), Montreal, QC, Canada, 28 November–1 December 2017; pp. 1–6.
  20. dos Santos Ferreira, A.; Freitas, D.M.; da Silva, G.G.; Pistori, H.; Folhes, M.T. Weed detection in soybean crops using ConvNets. Comput. Electron. Agric. 2017, 143, 314–324.
  21. Jung, J.; Maeda, M.; Chang, A.; Landivar, J.; Yeom, J.; McGinty, J. Unmanned aerial system assisted framework for the selection of high yielding cotton genotypes. Comput. Electron. Agric. 2018, 152, 74–81.
  22. Han, L.; Yang, G.; Yang, H.; Xu, B.; Li, Z.; Yang, X. Clustering field-based maize phenotyping of plant-height growth and canopy spectral dynamics using a UAV remote-sensing approach. Front. Plant Sci. 2018, 9, 1638.
  23. Wahab, I.; Hall, O.; Jirström, M. Remote sensing of yields: Application of UAV imagery-derived NDVI for estimating maize vigor and yields in complex farming systems in Sub-Saharan Africa. Drones 2018, 2, 28.
  24. Quebrajo, L.; Perez-Ruiz, M.; Pérez-Urrestarazu, L.; Martínez, G.; Egea, G. Linking thermal imaging and soil remote sensing to enhance irrigation management of sugar beet. Biosyst. Eng. 2018, 165, 77–87.
  25. Albornoz, C.; Giraldo, L.F. Trajectory design for efficient crop irrigation with a UAV. In Proceedings of the 2017 IEEE 3rd Colombian Conference on Automatic Control (CCAC), Cartagena, Colombia, 18–20 October 2017; pp. 1–6.
  26. Kerkech, M.; Hafiane, A.; Canals, R. Deep leaning approach with colorimetric spaces and vegetation indices for vine diseases detection in UAV images. Comput. Electron. Agric. 2018, 155, 237–243.
  27. Montero, D.; Rueda, C. Detection of palm oil bud rot employing artificial vision. In Proceedings of the IOP Conference Series: Materials Science and Engineering, Constanta, Romania, 13–16 June 2018; Volume 437, p. 012004.
  28. Xue, X.; Lan, Y.; Sun, Z.; Chang, C.; Hoffmann, W.C. Develop an unmanned aerial vehicle based automatic aerial spraying system. Comput. Electron. Agric. 2016, 128, 58–66.
  29. Garre, P.; Harish, A. Autonomous agricultural pesticide spraying UAV. In Proceedings of the IOP Conference Series: Materials Science and Engineering, Constanta, Romania, 13–16 June 2018; Volume 455, p. 012030.
  30. Perich, G.; Hund, A.; Anderegg, J.; Roth, L.; Boer, M.P.; Walter, A.; Liebisch, F.; Aasen, H. Assessment of multi-image UAV based high-throughput field phenotyping of canopy temperature. Front. Plant Sci. 2020, 11, 150.
  31. Maurya, P. Hardware Implementation of a Flight Control System for an Unmanned Aerial Vehicle. 2015. Available online: http://www.cse.iitk.ac.in/users/moona/students/Y2258.pdf (accessed on 14 March 2020).
  32. Park, J.K.; Park, J.H. Crops classification using imagery of unmanned aerial vehicle (UAV). J. Korean Soc. Agric. Eng. 2015, 57, 91–97.
  33. Punjani, A.; Abbeel, P. Deep learning helicopter dynamics models. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 3223–3230.
  34. Peña Barragán, J.M.; Kelly, M.; Castro, A.I.d.; López Granados, F. Object-based approach for crop row characterization in UAV images for site-specific weed management. In Proceedings of the 4th GEOBIA, Rio de Janeiro, Brazil, 7–9 May 2012; pp. 426–430.
  35. Näsi, R.; Viljanen, N.; Kaivosoja, J.; Alhonoja, K.; Hakala, T.; Markelin, L.; Honkavaara, E. Estimating biomass and nitrogen amount of barley and grass using UAV and aircraft based spectral and photogrammetric 3D features. Remote Sens. 2018, 10, 1082.
  36. Zheng, H.; Li, W.; Jiang, J.; Liu, Y.; Cheng, T.; Tian, Y.; Zhu, Y.; Cao, W.; Zhang, Y.; Yao, X. A comparative assessment of different modeling algorithms for estimating leaf nitrogen content in winter wheat using multispectral images from an unmanned aerial vehicle. Remote Sens. 2018, 10, 2026.
  37. Wang, D.C.; Zhang, G.L.; Zhao, M.S.; Pan, X.Z.; Zhao, Y.G.; Li, D.C.; Macmillan, B. Retrieval and mapping of soil texture based on land surface diurnal temperature range data from MODIS. PLoS ONE 2015, 10, 1–14.
  38. Shafian, S.; Maas, S.J. Index of soil moisture using raw Landsat image digital count data in Texas high plains. Remote Sens. 2015, 7, 2352–2372.
  39. Tong, X.Y.; Xia, G.S.; Lu, Q.; Shen, H.; Li, S.; You, S.; Zhang, L. Land-cover classification with high-resolution remote sensing images using transferable deep models. Remote Sens. Environ. 2020, 237, 111322.
  40. Helber, P.; Bischke, B.; Dengel, A.; Borth, D. Eurosat: A novel dataset and deep learning benchmark for land use and land cover classification. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 2217–2226.
  41. Maes, W.; Steppe, K. Estimating evapotranspiration and drought stress with ground-based thermal remote sensing in agriculture: A review. J. Exp. Bot. 2012, 63, 4671–4712.
  42. Soliman, A.; Heck, R.J.; Brenning, A.; Brown, R.; Miller, S. Remote sensing of soil moisture in vineyards using airborne and ground-based thermal inertia data. Remote Sens. 2013, 5, 3729–3748.
  43. Daponte, P.; De Vito, L.; Glielmo, L.; Iannelli, L.; Liuzza, D.; Picariello, F.; Silano, G. A review on the use of drones for precision agriculture. In Proceedings of the IOP Conference Series: Earth and Environmental Science, Bogor, Indonesia, 10–11 September 2019; Volume 275, p. 012022.
  44. Daponte, P.; De Vito, L.; Mazzilli, G.; Picariello, F.; Rapuano, S. A height measurement uncertainty model for archaeological surveys by aerial photogrammetry. Measurement 2017, 98, 192–198.
  45. Liu, C.A.; Chen, Z.X.; Yun, S.; Chen, J.S.; Hasi, T.; Pan, H.Z. Research advances of SAR remote sensing for agriculture applications: A review. J. Integr. Agric. 2019, 18, 506–525.
  46. Tan, C.P.; Ewe, H.T.; Chuah, H.T. Agricultural crop-type classification of multi-polarization SAR images using a hybrid entropy decomposition and support vector machine technique. Int. J. Remote Sens. 2011, 32, 7057–7071.
  47. Kussul, N.; Lemoine, G.; Gallego, F.J.; Skakun, S.V.; Lavreniuk, M.; Shelestov, A.Y. Parcel-based crop classification in Ukraine using Landsat-8 data and Sentinel-1A data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2016, 9, 2500–2508.
  48. Stankiewicz, K.A. The efficiency of crop recognition on ENVISAT ASAR images in two growing seasons. IEEE Trans. Geosci. Remote Sens. 2006, 44, 806–814.
  49. Ziliani, M.G.; Parkes, S.D.; Hoteit, I.; McCabe, M.F. Intra-season crop height variability at commercial farm scales using a fixed-wing UAV. Remote Sens. 2018, 10, 2007.
  50. Torres-Sánchez, J.; de Castro, A.I.; Pena, J.M.; Jiménez-Brenes, F.M.; Arquero, O.; Lovera, M.; López-Granados, F. Mapping the 3D structure of almond trees using UAV acquired photogrammetric point clouds and object-based image analysis. Biosyst. Eng. 2018, 176, 172–184.
  51. De Castro, A.I.; Jiménez-Brenes, F.M.; Torres-Sánchez, J.; Peña, J.M.; Borra-Serrano, I.; López-Granados, F. 3-D characterization of vineyards using a novel UAV imagery-based OBIA procedure for precision viticulture applications. Remote Sens. 2018, 10, 584.
  52. Sa, I.; Popović, M.; Khanna, R.; Chen, Z.; Lottes, P.; Liebisch, F.; Nieto, J.; Stachniss, C.; Walter, A.; Siegwart, R. Weedmap: A large-scale semantic weed mapping framework using aerial multispectral imaging and deep neural network for precision farming. Remote Sens. 2018, 10, 1423.
  53. Marino, S.; Alvino, A. Detection of homogeneous wheat areas using multitemporal UAS images and ground truth data analyzed by cluster analysis. Eur. J. Remote Sens. 2018, 51, 266–275.
  54. de Souza, C.H.W.; Mercante, E.; Johann, J.A.; Lamparelli, R.A.C.; Uribe-Opazo, M.A. Mapping and discrimination of soya bean and corn crops using spectro-temporal profiles of vegetation indices. Int. J. Remote Sens. 2015, 36, 1809–1824.
  55. Zheng, Y.; Zhang, M.; Zhang, X.; Zeng, H.; Wu, B. Mapping winter wheat biomass and yield using time series data blended from PROBA-V 100-and 300-m S1 products. Remote Sens. 2016, 8, 824.
  56. Gouveia, C.; Trigo, R.; Beguería, S.; Vicente-Serrano, S.M. Drought impacts on vegetation activity in the Mediterranean region: An assessment using remote sensing data and multi-scale drought indicators. Glob. Planet. Chang. 2017, 151, 15–27.
  57. Zhang, C.; Ren, H.; Qin, Q.; Ersoy, O.K. A new narrow band vegetation index for characterizing the degree of vegetation stress due to copper: The copper stress vegetation index (CSVI). Remote Sens. Lett. 2017, 8, 576–585.
  58. Rembold, F.; Atzberger, C.; Savin, I.; Rojas, O. Using low resolution satellite imagery for yield prediction and yield anomaly detection. Remote Sens. 2013, 5, 1704–1733.
  59. Mavridou, E.; Vrochidou, E.; Papakostas, G.A.; Pachidis, T.; Kaburlasos, V.G. Machine Vision Systems in Precision Agriculture for Crop Farming. J. Imaging 2019, 5, 89.
  60. Khan, Z.; Rahimi-Eichi, V.; Haefele, S.; Garnett, T.; Miklavcic, S.J. Estimation of vegetation indices for high-throughput phenotyping of wheat using aerial imaging. Plant Methods 2018, 14, 20.
  61. Bah, M.D.; Hafiane, A.; Canals, R. Deep learning with unsupervised data labeling for weed detection in line crops in UAV images. Remote Sens. 2018, 10, 1690.
  62. Bah, M.D.; Dericquebourg, E.; Hafiane, A.; Canals, R. Deep learning based classification system for identifying weeds using high-resolution UAV imagery. In Proceedings of the Science and Information Conference, Las Vegas, NV, USA, 25–26 April 2018; pp. 176–187.
  63. Kruse, O.M.O.; Prats-Montalbán, J.M.; Indahl, U.G.; Kvaal, K.; Ferrer, A.; Futsaether, C.M. Pixel classification methods for identifying and quantifying leaf surface injury from digital images. Comput. Electron. Agric. 2014, 108, 155–165.
  64. Albetis, J.; Jacquin, A.; Goulard, M.; Poilvé, H.; Rousseau, J.; Clenet, H.; Dedieu, G.; Duthoit, S. On the potentiality of UAV multispectral imagery to detect Flavescence dorée and Grapevine Trunk Diseases. Remote Sens. 2019, 11, 23.
  65. Kurtulmuş, F.; Ünal, H. Discriminating rapeseed varieties using computer vision and machine learning. Expert Syst. Appl. 2015, 42, 1880–1891.
  66. Hassan-Esfahani, L.; Torres-Rua, A.; Jensen, A.; McKee, M. Assessment of surface soil moisture using high-resolution multi-spectral imagery and artificial neural networks. Remote Sens. 2015, 7, 2627–2646.
  67. Poblete, T.; Ortega-Farías, S.; Moreno, M.A.; Bardeen, M. Artificial neural network to predict vine water status spatial variability using multispectral information obtained from an unmanned aerial vehicle (UAV). Sensors 2017, 17, 2488.
  68. Jeon, H.Y.; Tian, L.F.; Zhu, H. Robust crop and weed segmentation under uncontrolled outdoor illumination. Sensors 2011, 11, 6270–6283.
  69. Sannakki, S.S.; Rajpurohit, V.S.; Nargund, V. SVM-DSD: SVM Based diagnostic system for the detection of pomegranate leaf diseases. In Proceedings of the International Conference on Advances in Computing, Mumbai, India, 18–19 January 2013; pp. 715–720.
  70. Mokhtar, U.; El Bendary, N.; Hassenian, A.E.; Emary, E.; Mahmoud, M.A.; Hefny, H.; Tolba, M.F. SVM-based detection of tomato leaves diseases. In Intelligent Systems’ 2014; Springer: Berlin/Heidelberg, Germany, 2015; pp. 641–652.
  71. Dingle Robertson, L.; King, D.J. Comparison of pixel-and object-based classification in land cover change mapping. Int. J. Remote Sens. 2011, 32, 1505–1529.
  72. De Rainville, F.M.; Durand, A.; Fortin, F.A.; Tanguy, K.; Maldague, X.; Panneton, B.; Simard, M.J. Bayesian classification and unsupervised learning for isolating weeds in row crops. Pattern Anal. Appl. 2014, 17, 401–414.
  73. Mondal, D.; Kole, D.K.; Roy, K. Gradation of yellow mosaic virus disease of okra and bitter gourd based on entropy based binning and Naive Bayes classifier after identification of leaves. Comput. Electron. Agric. 2017, 142, 485–493.
  74. Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A review on UAV-based applications for precision agriculture. Information 2019, 10, 349.
  75. Kamilaris, A.; Prenafeta-Boldú, F.X. Deep learning in agriculture: A survey. Comput. Electron. Agric. 2018, 147, 70–90.
  76. Huang, H.; Deng, J.; Lan, Y.; Yang, A.; Deng, X.; Zhang, L. A fully convolutional network for weed mapping of unmanned aerial vehicle (UAV) imagery. PLoS ONE 2018, 13, e0196302.
  77. Castro, J.D.B.; Feitoza, R.Q.; La Rosa, L.C.; Diaz, P.M.A.; Sanches, I.D.A. A Comparative analysis of deep learning techniques for sub-tropical crop types recognition from multitemporal optical/SAR image sequences. In Proceedings of the 2017 30th SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI), Niteroi, Brazil, 17–20 October 2017; pp. 382–389.
  78. Mortensen, A.K.; Dyrmann, M.; Karstoft, H.; Jørgensen, R.N.; Gislum, R. Semantic segmentation of mixed crops using deep convolutional neural network. In Proceedings of the International Conference of Agricultural Engineering (CIGR), Aarhus, Denmark, 26–29 June 2016.
  79. Kussul, N.; Lavreniuk, M.; Skakun, S.; Shelestov, A. Deep learning classification of land cover and crop types using remote sensing data. IEEE Geosci. Remote Sens. Lett. 2017, 14, 778–782.
  80. Tian, H.; Wang, T.; Liu, Y.; Qiao, X.; Li, Y. Computer Vision Technology in Agricultural Automation—A review. Inf. Process. Agric. 2019, 7, 1–19.
  81. Mogili, U.R.; Deepak, B. Review on application of drone systems in precision agriculture. Procedia Comput. Sci. 2018, 133, 502–509.
  82. Tripathi, M.K.; Maktedar, D.D. A role of computer vision in fruits and vegetables among various horticulture products of agriculture fields: A survey. Inf. Process. Agric. 2019.
  83. Weiss, M.; Jacob, F.; Duveiller, G. Remote sensing for agricultural applications: A meta-review. Remote Sens. Environ. 2020, 236, 111402.
  84. Ouhbi, S.; Idri, A.; Fernández-Alemán, J.L.; Toval, A. Predicting software product quality: A systematic mapping study. Comput. Sist. 2015, 19, 547–562.
  85. Petersen, K.; Vakkalanka, S.; Kuzniarz, L. Guidelines for conducting systematic mapping studies in software engineering: An update. Inf. Softw. Technol. 2015, 64, 1–18.
  86. Mongeon, P.; Paul-Hus, A. The journal coverage of Web of Science and Scopus: A comparative analysis. Scientometrics 2016, 106, 213–228.
  87. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Ann. Intern. Med. 2009, 151, 264–269.
  88. Ouhbi, S.; Idri, A.; Fernández-Alemán, J.L.; Toval, A. Requirements engineering education: A systematic mapping study. Requir. Eng. 2015, 20, 119–138.
  89. Ouhbi, S.; Idri, A.; Aleman, J.L.F.; Toval, A. Evaluating software product quality: A systematic mapping study. In Proceedings of the 2014 Joint Conference of the International Workshop on Software Measurement and the International Conference on Software Process and Product Measurement, Rotterdam, The Netherlands, 6–8 October 2014; pp. 141–151.
  90. Minu, S.; Shetty, A.; Gopal, B. Review of preprocessing techniques used in soil property prediction from hyperspectral data. Cogent Geosci. 2016, 2, 1145878.
  91. Atzberger, C. Advances in remote sensing of agriculture: Context description, existing operational monitoring systems and major information needs. Remote Sens. 2013, 5, 949–981.
  92. Zhao, K.; García, M.; Liu, S.; Guo, Q.; Chen, G.; Zhang, X.; Zhou, Y.; Meng, X. Terrestrial lidar remote sensing of forests: Maximum likelihood estimates of canopy profile, leaf area index, and leaf angle distribution. Agric. For. Meteorol. 2015, 209, 100–113.
  93. Angelopoulou, T.; Tziolas, N.; Balafoutis, A.; Zalidis, G.; Bochtis, D. Remote sensing techniques for soil organic carbon estimation: A review. Remote Sens. 2019, 11, 676.
  94. Basu, S.; Ganguly, S.; Mukhopadhyay, S.; DiBiano, R.; Karki, M.; Nemani, R. Deepsat: A learning framework for satellite imagery. In Proceedings of the 23rd SIGSPATIAL International Conference on Advances in Geographic Information Systems, Seattle, WA, USA, 3–6 November 2015; pp. 1–10.
  95. Shelestov, A.; Lavreniuk, M.; Kussul, N.; Novikov, A.; Skakun, S. Exploring Google Earth Engine platform for big data processing: Classification of multitemporal satellite imagery for crop mapping. Front. Earth Sci. 2017, 5, 17.
  96. Stagakis, S.; Markos, N.; Sykioti, O.; Kyparissis, A. Monitoring canopy biophysical and biochemical parameters in ecosystem scale using satellite hyperspectral imagery: An application on a Phlomis fruticosa Mediterranean ecosystem using multiangular CHRIS/PROBA observations. Remote Sens. Environ. 2010, 114, 977–994.
  97. Zmarz, A. Introduction to the special issue UAS for mapping and monitoring. Eur. J. Remote Sens. 2019, 52, 1.
  98. Korczak-Abshire, M.; Zmarz, A.; Rodzewicz, M.; Kycko, M.; Karsznia, I.; Chwedorzewska, K.J. Study of fauna population changes on Penguin Island and Turret Point Oasis (King George Island, Antarctica) using an unmanned aerial vehicle. Polar Biol. 2019, 42, 217–224.
  99. Fang, S.X.; O’Young, S.; Rolland, L. Development of small UAS beyond-visual-line-of-sight (BVLOS) flight operations: System requirements and procedures. Drones 2018, 2, 13.
  100. Rehman, T.U.; Mahmud, M.S.; Chang, Y.K.; Jin, J.; Shin, J. Current and future applications of statistical machine learning algorithms for agricultural machine vision systems. Comput. Electron. Agric. 2019, 156, 585–605. [Google Scholar] [CrossRef]
  101. Alchanatis, V.; Cohen, Y.; Cohen, S.; Moller, M.; Sprinstin, M.; Meron, M.; Tsipris, J.; Saranga, Y.; Sela, E. Evaluation of different approaches for estimating and mapping crop water status in cotton with thermal imaging. Precis. Agric. 2010, 11, 27–41. [Google Scholar] [CrossRef]
  102. Lamichhane, J.R.; Dachbrodt-Saaydeh, S.; Kudsk, P.; Messéan, A. Toward a reduced reliance on conventional pesticides in European agriculture. Plant Dis. 2016, 100, 10–24. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  103. Sabzi, S.; Abbaspour-Gilandeh, Y.; García-Mateos, G. A fast and accurate expert system for weed identification in potato crops using metaheuristic algorithms. Comput. Ind. 2018, 98, 80–89. [Google Scholar] [CrossRef]
  104. Huang, Y.; Chen, Z.X.; Tao, Y.; Huang, X.Z.; Gu, X.F. Agricultural remote sensing big data: Management and applications. J. Integr. Agric. 2018, 17, 1915–1931. [Google Scholar] [CrossRef]
  105. Hernández-Hernández, J.; García-Mateos, G.; González-Esquiva, J.; Escarabajal-Henarejos, D.; Ruiz-Canales, A.; Molina-Martínez, J.M. Optimal color space selection method for plant/soil segmentation in agriculture. Comput. Electron. Agric. 2016, 122, 124–132. [Google Scholar] [CrossRef]
  106. Jones, H.G.; Serraj, R.; Loveys, B.R.; Xiong, L.; Wheaton, A.; Price, A.H. Thermal infrared imaging of crop canopies for the remote diagnosis and quantification of plant responses to water stress in the field. Funct. Plant Biol. 2009, 36, 978–989. [Google Scholar] [CrossRef] [Green Version]
  107. Mangus, D.L.; Sharda, A.; Zhang, N. Development and evaluation of thermal infrared imaging system for high spatial and temporal resolution crop water stress monitoring of corn within a greenhouse. Comput. Electron. Agric. 2016, 121, 149–159. [Google Scholar] [CrossRef]
  108. Liu, X.; Bo, Y. Object-based crop species classification based on the combination of airborne hyperspectral images and LiDAR data. Remote Sens. 2015, 7, 922–950. [Google Scholar] [CrossRef] [Green Version]
  109. Pearlman, J.; Carman, S.; Segal, C.; Jarecke, P.; Clancy, P.; Browne, W. Overview of the Hyperion imaging spectrometer for the NASA EO-1 mission. In Proceedings of the IGARSS 2001. Scanning the Present and Resolving the Future. IEEE 2001 International Geoscience and Remote Sensing Symposium (Cat. No. 01CH37217), Piscataway, NJ, USA, 9–13 July 2001; Volume 7, pp. 3036–3038. [Google Scholar]
  110. Poblete-Echeverría, C.; Olmedo, G.; Ingram, B.; Bardeen, M. Detection and segmentation of vine canopy in ultra-high spatial resolution RGB imagery obtained from unmanned aerial vehicle (UAV): A case study in a commercial vineyard. Remote Sens. 2017, 9, 268. [Google Scholar] [CrossRef] [Green Version]
  111. Lucas, R.; Rowlands, A.; Brown, A.; Keyworth, S.; Bunting, P. Rule-based classification of multitemporal satellite imagery for habitat and agricultural land cover mapping. ISPRS J. Photogramm. Remote Sens. 2007, 62, 165–185. [Google Scholar] [CrossRef]
  112. Asaari, M.S.M.; Mishra, P.; Mertens, S.; Dhondt, S.; Inzé, D.; Wuyts, N.; Scheunders, P. Close-range hyperspectral image analysis for the early detection of stress responses in individual plants in a high-throughput phenotyping platform. ISPRS J. Photogramm. Remote Sens. 2018, 138, 121–138. [Google Scholar] [CrossRef]
  113. Müllerová, J.; Brŭna, J.; Bartaloš, T.; Dvořák, P.; Vítková, M.; Pyšek, P. Timing is important: Unmanned aircraft vs. satellite imagery in plant invasion monitoring. Front. Plant Sci. 2017, 8, 887. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  114. Duveiller, G.; Defourny, P. A conceptual framework to define the spatial resolution requirements for agricultural monitoring using remote sensing. Remote Sens. Environ. 2010, 114, 2637–2650. [Google Scholar] [CrossRef]
  115. Wei, Z.; Han, Y.; Li, M.; Yang, K.; Yang, Y.; Luo, Y.; Ong, S.H. A small UAV based multitemporal image registration for dynamic agricultural terrace monitoring. Remote Sens. 2017, 9, 904. [Google Scholar] [CrossRef] [Green Version]
  116. Ji, S.; Zhang, C.; Xu, A.; Shi, Y.; Duan, Y. 3D convolutional neural networks for crop classification with multitemporal remote sensing images. Remote Sens. 2018, 10, 75. [Google Scholar] [CrossRef] [Green Version]
  117. Yang, C.; Everitt, J.H.; Murden, D. Evaluating high resolution SPOT 5 satellite imagery for crop identification. Comput. Electron. Agric. 2011, 75, 347–354. [Google Scholar] [CrossRef]
  118. Zhang, X.; Liu, F.; He, Y.; Gong, X. Detecting macronutrients content and distribution in oilseed rape leaves based on hyperspectral imaging. Biosyst. Eng. 2013, 115, 56–65. [Google Scholar] [CrossRef]
  119. Ač, A.; Malenovskỳ, Z.; Olejníčková, J.; Gallé, A.; Rascher, U.; Mohammed, G. Meta-analysis assessing potential of steady-state chlorophyll fluorescence for remote sensing detection of plant water, temperature and nitrogen stress. Remote Sens. Environ. 2015, 168, 420–436. [Google Scholar] [CrossRef] [Green Version]
  120. Baghdadi, N.; Boyer, N.; Todoroff, P.; El Hajj, M.; Bégué, A. Potential of SAR sensors TerraSAR-X, ASAR/ENVISAT and PALSAR/ALOS for monitoring sugarcane crops on Reunion Island. Remote Sens. Environ. 2009, 113, 1724–1738. [Google Scholar] [CrossRef]
  121. Baret, F.; Houles, V.; Guérif, M. Quantification of plant stress using remote sensing observations and crop models: The case of nitrogen management. J. Exp. Bot. 2007, 58, 869–880. [Google Scholar] [CrossRef] [Green Version]
  122. Baret, F.; Buis, S. Estimating canopy characteristics from remote sensing observations: Review of methods and associated problems. In Advances in Land Remote Sensing; Springer: Berlin/Heidelberg, Germany, 2008; pp. 173–201. [Google Scholar]
  123. Behmann, J.; Steinrücken, J.; Plümer, L. Detection of early plant stress responses in hyperspectral images. ISPRS J. Photogramm. Remote Sens. 2014, 93, 98–111. [Google Scholar] [CrossRef]
  124. Bellvert, J.; Marsal, J.; Girona, J.; Gonzalez-Dugo, V.; Fereres, E.; Ustin, S.; Zarco-Tejada, P. Airborne thermal imagery to detect the seasonal evolution of crop water status in peach, nectarine and Saturn peach orchards. Remote Sens. 2016, 8, 39. [Google Scholar] [CrossRef] [Green Version]
  125. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging. Remote Sens. 2014, 6, 10395–10412. [Google Scholar] [CrossRef] [Green Version]
  126. Blaes, X.; Vanhalle, L.; Defourny, P. Efficiency of crop identification based on optical and SAR image time series. Remote Sens. Environ. 2005, 96, 352–365. [Google Scholar] [CrossRef]
  127. Clevers, J.G.; Kooistra, L.; Schaepman, M.E. Estimating canopy water content using hyperspectral remote sensing data. Int. J. Appl. Earth Obs. Geoinf. 2010, 12, 119–125. [Google Scholar] [CrossRef]
128. Er-Raki, S.; Chehbouni, A.; Guemouria, N.; Duchemin, B.; Ezzahar, J.; Hadria, R. Combining FAO-56 model and ground-based remote sensing to estimate water consumptions of wheat crops in a semi-arid region. Agric. Water Manag. 2007, 87, 41–54.
129. Garrigues, S.; Allard, D.; Baret, F.; Weiss, M. Influence of landscape spatial heterogeneity on the non-linear estimation of leaf area index from moderate spatial resolution remote sensing data. Remote Sens. Environ. 2006, 105, 286–298.
130. Glenn, E.P.; Neale, C.M.; Hunsaker, D.J.; Nagler, P.L. Vegetation index-based crop coefficients to estimate evapotranspiration by remote sensing in agricultural and natural ecosystems. Hydrol. Process. 2011, 25, 4050–4062.
131. Gonzalez-Dugo, M.; Neale, C.; Mateos, L.; Kustas, W.; Prueger, J.; Anderson, M.; Li, F. A comparison of operational remote sensing-based models for estimating crop evapotranspiration. Agric. For. Meteorol. 2009, 149, 1843–1853.
132. Houborg, R.; Anderson, M.; Daughtry, C. Utility of an image-based canopy reflectance modeling tool for remote estimation of LAI and leaf chlorophyll content at the field scale. Remote Sens. Environ. 2009, 113, 259–274.
133. Kaliramesh, S.; Chelladurai, V.; Jayas, D.; Alagusundaram, K.; White, N.; Fields, P. Detection of infestation by Callosobruchus maculatus in mung bean using near-infrared hyperspectral imaging. J. Stored Prod. Res. 2013, 52, 107–111.
134. Kong, W.; Zhang, C.; Liu, F.; Nie, P.; He, Y. Rice seed cultivar identification using near-infrared hyperspectral imaging and multivariate data analysis. Sensors 2013, 13, 8916–8927.
135. Kullberg, E.G.; DeJonge, K.C.; Chávez, J.L. Evaluation of thermal remote sensing indices to estimate crop evapotranspiration coefficients. Agric. Water Manag. 2017, 179, 64–73.
136. Lanorte, A.; De Santis, F.; Nolè, G.; Blanco, I.; Loisi, R.V.; Schettini, E.; Vox, G. Agricultural plastic waste spatial estimation by Landsat 8 satellite images. Comput. Electron. Agric. 2017, 141, 35–45.
137. Lawrence, R.L.; Wood, S.D.; Sheley, R.L. Mapping invasive plants using hyperspectral imagery and Breiman Cutler classifications (RandomForest). Remote Sens. Environ. 2006, 100, 356–362.
138. Li, H.; Zheng, L.; Lei, Y.; Li, C.; Liu, Z.; Zhang, S. Estimation of water consumption and crop water productivity of winter wheat in North China Plain using remote sensing technology. Agric. Water Manag. 2008, 95, 1271–1278.
139. Li, Y.; Zhou, Q.; Zhou, J.; Zhang, G.; Chen, C.; Wang, J. Assimilating remote sensing information into a coupled hydrology-crop growth model to estimate regional maize yield in arid regions. Ecol. Model. 2014, 291, 15–27.
140. Lobell, D.B.; Asner, G.P.; Ortiz-Monasterio, J.I.; Benning, T.L. Remote sensing of regional crop production in the Yaqui Valley, Mexico: Estimates and uncertainties. Agric. Ecosyst. Environ. 2003, 94, 205–220.
141. López-López, M.; Calderón, R.; González-Dugo, V.; Zarco-Tejada, P.; Fereres, E. Early detection and quantification of almond red leaf blotch using high-resolution hyperspectral and thermal imagery. Remote Sens. 2016, 8, 276.
142. Löw, F.; Duveiller, G. Defining the spatial resolution requirements for crop identification using optical remote sensing. Remote Sens. 2014, 6, 9034–9063.
143. Lowe, A.; Harrison, N.; French, A.P. Hyperspectral image analysis techniques for the detection and classification of the early onset of plant disease and stress. Plant Methods 2017, 13, 80.
144. Mahesh, S.; Jayas, D.; Paliwal, J.; White, N. Hyperspectral imaging to classify and monitor quality of agricultural materials. J. Stored Prod. Res. 2015, 61, 17–26.
145. Marshall, M.; Thenkabail, P. Developing in situ non-destructive estimates of crop biomass to address issues of scale in remote sensing. Remote Sens. 2015, 7, 808–835.
146. Mehl, P.; Chao, K.; Kim, M.; Chen, Y. Detection of defects on selected apple cultivars using hyperspectral and multispectral image analysis. Appl. Eng. Agric. 2002, 18, 219.
147. Moran, M.S.; Inoue, Y.; Barnes, E. Opportunities and limitations for image-based remote sensing in precision crop management. Remote Sens. Environ. 1997, 61, 319–346.
148. Moudrý, V.; Gdulová, K.; Fogl, M.; Klápště, P.; Urban, R.; Komárek, J.; Moudrá, L.; Štroner, M.; Barták, V.; Solský, M. Comparison of leaf-off and leaf-on combined UAV imagery and airborne LiDAR for assessment of a post-mining site terrain and vegetation structure: Prospects for monitoring hazards and restoration success. Appl. Geogr. 2019, 104, 32–41.
149. Müllerová, J.; Pergl, J.; Pyšek, P. Remote sensing as a tool for monitoring plant invasions: Testing the effects of data resolution and image classification approach on the detection of a model plant species Heracleum mantegazzianum (giant hogweed). Int. J. Appl. Earth Obs. Geoinf. 2013, 25, 55–65.
150. Pandey, A.; Chowdary, V.; Mal, B. Identification of critical erosion prone areas in the small agricultural watershed using USLE, GIS and remote sensing. Water Resour. Manag. 2007, 21, 729–746.
151. Pinter, P.J., Jr.; Hatfield, J.L.; Schepers, J.S.; Barnes, E.M.; Moran, M.S.; Daughtry, C.S.; Upchurch, D.R. Remote sensing for crop management. Photogramm. Eng. Remote Sens. 2003, 69, 647–664.
152. Prasad, A.K.; Chai, L.; Singh, R.P.; Kafatos, M. Crop yield estimation model for Iowa using remote sensing and surface parameters. Int. J. Appl. Earth Obs. Geoinf. 2006, 8, 26–33.
153. Schlerf, M.; Atzberger, C. Inversion of a forest reflectance model to estimate structural canopy variables from hyperspectral remote sensing data. Remote Sens. Environ. 2006, 100, 281–294.
154. Siachalou, S.; Mallinis, G.; Tsakiri-Strati, M. A hidden Markov models approach for crop classification: Linking crop phenology to time series of multi-sensor remote sensing data. Remote Sens. 2015, 7, 3633–3650.
155. Thomas, S.; Kuska, M.T.; Bohnenkamp, D.; Brugger, A.; Alisaac, E.; Wahabzada, M.; Behmann, J.; Mahlein, A.K. Benefits of hyperspectral imaging for plant disease detection and plant protection: A technical perspective. J. Plant Dis. Prot. 2018, 125, 5–20.
156. Xie, C.; Yang, C.; He, Y. Hyperspectral imaging for classification of healthy and gray mold diseased tomato leaves with different infection severities. Comput. Electron. Agric. 2017, 135, 154–162.
157. Yue, J.; Feng, H.; Jin, X.; Yuan, H.; Li, Z.; Zhou, C.; Yang, G.; Tian, Q. A comparison of crop parameters estimation using images from UAV-mounted snapshot hyperspectral sensor and high-definition digital camera. Remote Sens. 2018, 10, 1138.
158. Ma, Y.; Wang, S.; Zhang, L.; Hou, Y.; Zhuang, L.; He, Y.; Wang, F. Monitoring winter wheat growth in North China by combining a crop model and remote sensing data. Int. J. Appl. Earth Obs. Geoinf. 2008, 10, 426–437.
159. Zhao, Y.; Chen, S.; Shen, S. Assimilating remote sensing information with crop model using Ensemble Kalman Filter for improving LAI monitoring and yield estimation. Ecol. Model. 2013, 270, 30–42.
160. Anderson, L.O.; Malhi, Y.; Aragão, L.E.; Ladle, R.; Arai, E.; Barbier, N.; Phillips, O. Remote sensing detection of droughts in Amazonian forest canopies. New Phytol. 2010, 187, 733–750.
161. Bendig, J.; Bolten, A.; Bareth, G. UAV-based imaging for multitemporal, very high resolution crop surface models to monitor crop growth variability. Photogramm. Fernerkund. Geoinf. 2013, 2013, 551–562.
162. Calera, A.; Campos, I.; Osann, A.; D’Urso, G.; Menenti, M. Remote sensing for crop water management: From ET modelling to services for the end users. Sensors 2017, 17, 1104.
163. Kamble, B.; Kilic, A.; Hubbard, K. Estimating crop coefficients using remote sensing-based vegetation index. Remote Sens. 2013, 5, 1588–1602.
164. Leinonen, I.; Jones, H.G. Combining thermal and visible imagery for estimating canopy temperature and identifying plant stress. J. Exp. Bot. 2004, 55, 1423–1431.
165. Möller, M.; Alchanatis, V.; Cohen, Y.; Meron, M.; Tsipris, J.; Naor, A.; Ostrovsky, V.; Sprintsin, M.; Cohen, S. Use of thermal and visible imagery for estimating crop water status of irrigated grapevine. J. Exp. Bot. 2006, 58, 827–838.
166. Park, S.; Ryu, D.; Fuentes, S.; Chung, H.; Hernández-Montes, E.; O’Connell, M. Adaptive estimation of crop water stress in nectarine and peach orchards using high-resolution imagery from an unmanned aerial vehicle (UAV). Remote Sens. 2017, 9, 828.
167. Santesteban, L.; Di Gennaro, S.; Herrero-Langreo, A.; Miranda, C.; Royo, J.; Matese, A. High-resolution UAV-based thermal imaging to estimate the instantaneous and seasonal variability of plant water status within a vineyard. Agric. Water Manag. 2017, 183, 49–59.
168. Suárez, L.; Zarco-Tejada, P.J.; Sepulcre-Cantó, G.; Pérez-Priego, O.; Miller, J.; Jiménez-Muñoz, J.; Sobrino, J. Assessing canopy PRI for water stress detection with diurnal airborne imagery. Remote Sens. Environ. 2008, 112, 560–575.
169. Torres-Sánchez, J.; López-Granados, F.; Peña, J.M. An automatic object-based method for optimal thresholding in UAV images: Application for vegetation detection in herbaceous crops. Comput. Electron. Agric. 2015, 114, 43–52.
170. Capolupo, A.; Kooistra, L.; Berendonk, C.; Boccia, L.; Suomalainen, J. Estimating plant traits of grasslands from UAV-acquired hyperspectral images: A comparison of statistical approaches. ISPRS Int. J. Geo-Inf. 2015, 4, 2792–2820.
171. Carreiras, J.M.; Pereira, J.M.; Pereira, J.S. Estimation of tree canopy cover in evergreen oak woodlands using remote sensing. For. Ecol. Manag. 2006, 223, 45–53.
172. Cohen, Y.; Alchanatis, V.; Meron, M.; Saranga, Y.; Tsipris, J. Estimation of leaf water potential by thermal imagery and spatial analysis. J. Exp. Bot. 2005, 56, 1843–1852.
173. Di Gennaro, S.F.; Battiston, E.; Di Marco, S.; Facini, O.; Matese, A.; Nocentini, M.; Palliotti, A.; Mugnai, L. Unmanned Aerial Vehicle (UAV)-based remote sensing to monitor grapevine leaf stripe disease within a vineyard affected by esca complex. Phytopathol. Mediterr. 2016, 55, 262–275.
174. Drake, J.B.; Knox, R.G.; Dubayah, R.O.; Clark, D.B.; Condit, R.; Blair, J.B.; Hofton, M. Above-ground biomass estimation in closed canopy neotropical forests using lidar remote sensing: Factors affecting the generality of relationships. Glob. Ecol. Biogeogr. 2003, 12, 147–159.
175. Hütt, C.; Koppe, W.; Miao, Y.; Bareth, G. Best accuracy land use/land cover (LULC) classification to derive crop types using multitemporal, multisensor, and multi-polarization SAR satellite images. Remote Sens. 2016, 8, 684.
176. Possoch, M.; Bieker, S.; Hoffmeister, D.; Bolten, A.; Schellberg, J.; Bareth, G. Multi-temporal crop surface models combined with the RGB vegetation index from UAV-based images for forage monitoring in grassland. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 41, 991.
177. Sagan, V.; Maimaitijiang, M.; Sidike, P.; Eblimit, K.; Peterson, K.T.; Hartling, S.; Esposito, F.; Khanal, K.; Newcomb, M.; Pauli, D.; et al. UAV-based high resolution thermal imaging for vegetation monitoring, and plant phenotyping using ICI 8640 P, FLIR Vue Pro R 640, and thermoMap cameras. Remote Sens. 2019, 11, 330.
178. Schirrmann, M.; Giebel, A.; Gleiniger, F.; Pflanz, M.; Lentschke, J.; Dammer, K.H. Monitoring agronomic parameters of winter wheat crops with low-cost UAV imagery. Remote Sens. 2016, 8, 706.
179. Swain, K.C.; Thomson, S.J.; Jayasuriya, H.P. Adoption of an unmanned helicopter for low-altitude remote sensing to estimate yield and total biomass of a rice crop. Trans. ASAE Am. Soc. Agric. Eng. 2010, 53, 21.
180. Yebra, M.; Van Dijk, A.; Leuning, R.; Huete, A.; Guerschman, J.P. Evaluation of optical remote sensing to estimate actual evapotranspiration and canopy conductance. Remote Sens. Environ. 2013, 129, 250–261.
181. Agapiou, A.; Alexakis, D.D.; Hadjimitsis, D.G. Spectral sensitivity of ALOS, ASTER, IKONOS, LANDSAT and SPOT satellite imagery intended for the detection of archaeological crop marks. Int. J. Digit. Earth 2014, 7, 351–372.
182. Chianucci, F.; Disperati, L.; Guzzi, D.; Bianchini, D.; Nardino, V.; Lastri, C.; Rindinella, A.; Corona, P. Estimation of canopy attributes in beech forests using true colour digital images from a small fixed-wing UAV. Int. J. Appl. Earth Obs. Geoinf. 2016, 47, 60–68.
183. Inglada, J.; Vincent, A.; Arias, M.; Marais-Sicre, C. Improved early crop type identification by joint use of high temporal resolution SAR and optical image time series. Remote Sens. 2016, 8, 362.
184. Li, W.; Niu, Z.; Chen, H.; Li, D.; Wu, M.; Zhao, W. Remote estimation of canopy height and aboveground biomass of maize using high-resolution stereo images from a low-cost unmanned aerial vehicle system. Ecol. Indic. 2016, 67, 637–648.
185. Mintenig, S.; Int-Veen, I.; Löder, M.G.; Primpke, S.; Gerdts, G. Identification of microplastic in effluents of waste water treatment plants using focal plane array-based micro-Fourier-transform infrared imaging. Water Res. 2017, 108, 365–372.
186. Wu, B.; Meng, J.; Li, Q.; Yan, N.; Du, X.; Zhang, M. Remote sensing-based global crop monitoring: Experiences with China’s CropWatch system. Int. J. Digit. Earth 2014, 7, 113–137.
  188. Bovensmann, H.; Buchwitz, M.; Burrows, J.; Reuter, M.; Krings, T.; Gerilowski, K.; Schneising, O.; Heymann, J.; Tretner, A.; Erzinger, J. A remote sensing technique for global monitoring of power plant CO2 emissions from space and related applications. Atmos. Meas. Tech. 2010, 3, 781–811. [Google Scholar] [CrossRef] [Green Version]
  189. Chaerle, L.; Leinonen, I.; Jones, H.G.; Van Der Straeten, D. Monitoring and screening plant populations with combined thermal and chlorophyll fluorescence imaging. J. Exp. Bot. 2006, 58, 773–784. [Google Scholar] [CrossRef] [Green Version]
  190. Jin, X.; Liu, S.; Baret, F.; Hemerlé, M.; Comar, A. Estimates of plant density of wheat crops at emergence from very low altitude UAV imagery. Remote Sens. Environ. 2017, 198, 105–114. [Google Scholar] [CrossRef] [Green Version]
  191. Lasaponara, R.; Masini, N. Detection of archaeological crop marks by using satellite QuickBird multispectral imagery. J. Archaeol. Sci. 2007, 34, 214–221. [Google Scholar] [CrossRef]
  192. Prince, G.; Clarkson, J.P.; Rajpoot, N.M. Automatic detection of diseased tomato plants using thermal and stereo visible light images. PLoS ONE 2015, 10, e0123262. [Google Scholar]
  193. Rhee, J.; Im, J.; Carbone, G.J. Monitoring agricultural drought for arid and humid regions using multi-sensor remote sensing data. Remote Sens. Environ. 2010, 114, 2875–2887. [Google Scholar] [CrossRef]
  194. Smith, M.L.; Ollinger, S.V.; Martin, M.E.; Aber, J.D.; Hallett, R.A.; Goodale, C.L. Direct estimation of aboveground forest productivity through hyperspectral remote sensing of canopy nitrogen. Ecol. Appl. 2002, 12, 1286–1302. [Google Scholar] [CrossRef]
  195. Yue, J.; Yang, G.; Li, C.; Li, Z.; Wang, Y.; Feng, H.; Xu, B. Estimation of winter wheat above-ground biomass using unmanned aerial vehicle-based snapshot hyperspectral sensor and crop height improved models. Remote Sens. 2017, 9, 708. [Google Scholar] [CrossRef] [Green Version]
  196. Zarco-Tejada, P.J.; Guillén-Climent, M.L.; Hernández-Clemente, R.; Catalina, A.; González, M.; Martín, P. Estimating leaf carotenoid content in vineyards using high resolution hyperspectral imagery acquired from an unmanned aerial vehicle (UAV). Agric. For. Meteorol. 2013, 171, 281–294. [Google Scholar] [CrossRef] [Green Version]
  197. Elarab, M.; Ticlavilca, A.M.; Torres-Rua, A.F.; Maslova, I.; McKee, M. Estimating chlorophyll with thermal and broadband multispectral high resolution imagery from an unmanned aerial system using relevance vector machines for precision agriculture. Int. J. Appl. Earth Obs. Geoinf. 2015, 43, 32–42. [Google Scholar] [CrossRef] [Green Version]
  198. Kalacska, M.; Lalonde, M.; Moore, T. Estimation of foliar chlorophyll and nitrogen content in an ombrotrophic bog from hyperspectral data: Scaling from leaf to image. Remote Sens. Environ. 2015, 169, 270–279. [Google Scholar] [CrossRef]
  199. Moeckel, T.; Dayananda, S.; Nidamanuri, R.; Nautiyal, S.; Hanumaiah, N.; Buerkert, A.; Wachendorf, M. Estimation of vegetable crop parameter by multitemporal UAV-borne images. Remote Sens. 2018, 10, 805. [Google Scholar] [CrossRef] [Green Version]
  200. Moshou, D.; Bravo, C.; Oberti, R.; West, J.; Bodria, L.; McCartney, A.; Ramon, H. Plant disease detection based on data fusion of hyperspectral and multi-spectral fluorescence imaging using Kohonen maps. Real-Time Imaging 2005, 11, 75–83. [Google Scholar] [CrossRef]
  201. Rußwurm, M.; Korner, M. Temporal vegetation modelling using long short-term memory networks for crop identification from medium-resolution multi-spectral satellite images. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, Honolulu, HI, USA, 21–26 July 2017; pp. 11–19. [Google Scholar]
  202. Skakun, S.; Kussul, N.; Shelestov, A.Y.; Lavreniuk, M.; Kussul, O. Efficiency assessment of multitemporal C-band Radarsat-2 intensity and Landsat-8 surface reflectance satellite imagery for crop classification in Ukraine. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2015, 9, 3712–3719. [Google Scholar] [CrossRef]
  203. Xie, C.; Shao, Y.; Li, X.; He, Y. Detection of early blight and late blight diseases on tomato leaves using hyperspectral imaging. Sci. Rep. 2015, 5, 16564. [Google Scholar] [CrossRef]
  204. Gracia-Romero, A.; Kefauver, S.C.; Fernandez-Gallego, J.A.; Vergara-Díaz, O.; Nieto-Taladriz, M.T.; Araus, J.L. UAV and ground image-based phenotyping: A proof of concept with Durum wheat. Remote Sens. 2019, 11, 1244. [Google Scholar] [CrossRef] [Green Version]
  205. Franceschet, M. The role of conference publications in CS. Commun. ACM 2010, 53, 129–132. [Google Scholar] [CrossRef]
  206. Vrettas, G.; Sanderson, M. Conferences versus journals in computer science. J. Assoc. Inf. Sci. Technol. 2015, 66, 2674–2684. [Google Scholar] [CrossRef]
  207. Zou, Y.; Li, G.; Wang, S. The Fusion of Satellite and Unmanned Aerial Vehicle (UAV) Imagery for Improving Classification Performance. In Proceedings of the 2018 IEEE International Conference on Information and Automation (ICIA), Wuyishan, China, 11–13 August 2018; pp. 836–841. [Google Scholar]
  208. Abbaspour-Gilandeh, Y.; Sabzi, S.; Benmouna, B.; García-Mateos, G.; Hernández-Hernández, J.L.; Molina-Martínez, J.M. Estimation of the Constituent Properties of Red Delicious Apples Using a Hybrid of Artificial Neural Networks and Artificial Bee Colony Algorithm. Agronomy 2020, 10, 267. [Google Scholar] [CrossRef] [Green Version]
Figure 1. PRISMA flow chart resulting in the present mapping study.
Figure 2. Publication channels of the selected studies.
Figure 3. Publication trends throughout the years for the candidate and selected papers.
Figure 4. Research types and empirical validation types of the selected papers.
Figure 5. Frequency of the main types of capture platforms for the research in RSA.
Table 1. Mapping questions defined in the present review.
ID | Mapping Question | Rationale
MQ1 | What publication channels are the main targets for RSA? | Identifying where RSA research can be found, and the most adequate publication channels for future works
MQ2 | How has the frequency of approaches related to RSA changed over time? | Identifying publication trends over time related to RSA
MQ3 | What are the main research types of RSA studies? | Exploring the different types of research existing in the literature about RSA
MQ4 | Are RSA studies empirically validated? | Discovering whether research works on RSA have been validated with empirical methods
MQ5 | What types of techniques were reported in RSA research? | Detecting the most important types of computer vision and machine learning techniques reported in the existing RSA literature
MQ6 | What are the platforms used to capture the images for RSA? | Exposing the main types of devices employed to obtain the images in RSA
MQ7 | What are the research topics addressed by RSA? | Studying the most prominent topics currently tackled in RSA research
MQ8 | What are the different types of spectral information used? | Analyzing which types of images are the most frequently used in RSA research
Table 2. Publication sources with more than one selected paper.
Journal Name | Total
Remote Sensing | 20
Remote Sensing of Environment | 15
Journal of Experimental Botany | 6
International Journal of Applied Earth Observation and Geoinformation | 6
Computers and Electronics in Agriculture | 5
Agricultural Water Management | 4
Agricultural and Forest Meteorology | 3
ISPRS Journal of Photogrammetry and Remote Sensing | 3
Journal of Stored Products Research | 2
Ecological Modelling | 2
International Journal of Digital Earth | 2
Table 3. Classification of the types of techniques used in the selected papers.
Techniques | Ref. | Total
Classification systems | [101,104,106,108,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159] | 54
Feature extraction | [41,91,107,108,111,155,160,161,162,163,164,165,166,167,168,169] | 16
Similarity and maximum likelihood | [92,117,118,170,171,172,173,174,175,176,177,178,179,180] | 14
Preprocessing and segmentation | [108,112,113,114,115,181,182,183,184,185,186] | 11
Recognition systems | [96,187,188,189,190,191,192,193,194,195,196] | 11
Other machine learning algorithms | [79,95,110,116,197,198,199,200,201,202,203] | 11
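The "similarity and maximum likelihood" family in Table 3 can be illustrated with a toy nearest-centroid classifier: each class is represented by its mean spectrum, and a pixel is assigned to the class with the closest reference. The band values, class names, and helper functions below are hypothetical, a minimal sketch rather than any particular paper's method.

```python
# Minimal nearest-centroid classifier for labelled spectral pixels.
# Each pixel is a tuple of per-band reflectances; the class whose mean
# spectrum is closest in Euclidean distance wins.

def train_centroids(pixels, labels):
    """Compute the mean spectrum (centroid) of each class."""
    sums, counts = {}, {}
    for spectrum, label in zip(pixels, labels):
        acc = sums.setdefault(label, [0.0] * len(spectrum))
        for i, v in enumerate(spectrum):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {c: [s / counts[c] for s in acc] for c, acc in sums.items()}

def classify(spectrum, centroids):
    """Assign the class with the nearest centroid."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda c: dist2(spectrum, centroids[c]))

# Toy 3-band training data: vegetation reflects strongly in band 3 (NIR).
pixels = [(0.05, 0.08, 0.60), (0.06, 0.10, 0.55),   # vegetation
          (0.20, 0.25, 0.30), (0.22, 0.28, 0.33)]   # bare soil
labels = ["veg", "veg", "soil", "soil"]
centroids = train_centroids(pixels, labels)
print(classify((0.04, 0.09, 0.58), centroids))  # -> veg
```

Maximum-likelihood classifiers used in the surveyed papers additionally model per-class covariance; the centroid version is the special case of equal, spherical covariances.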
Table 4. Classification of the main types of research topics addressed in remote sensing in agriculture (RSA) papers.
Research Topic | Ref. | Total
Agricultural parameters extraction | [79,91,95,96,104,108,110,111,114,120,122,125,126,129,136,142,147,148,150,153,154,157,161,175,178,181,182,183,184,188,191,196,201,202] | 34
Growth vigor | [91,92,114,115,116,117,132,139,142,144,145,152,158,159,161,171,174,176,177,184,189,190,194,195,197,199] | 26
Drought stress, irrigation and water productivity | [91,101,106,107,119,124,127,128,130,131,135,138,151,155,162,163,165,166,167,172,180] | 21
Detection of pathogens, diseases and insect pests | [133,141,143,146,155,156,173,187,192,200,203] | 11
Yield prediction | [91,134,140,151,152,159,179,186] | 8
Weed detection | [113,137,149,169,170] | 5
Nutrient status | [118,121,151,198] | 4
Automatic crop harvesting | - | 0
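The weed and vegetation detection topic often reduces to thresholding a greenness index, as in the automatic thresholding study of [169]. The sketch below, with synthetic pixel values and a deliberately simple, unoptimized implementation of Otsu's method, only illustrates the general recipe, not that paper's object-based algorithm.

```python
# Sketch: separate vegetation from soil pixels in an RGB image by
# (1) computing the Excess Green (ExG) index per pixel and
# (2) choosing the binarization threshold automatically with Otsu's method.

def excess_green(r, g, b):
    """ExG = 2g - r - b on chromatic (sum-normalised) coordinates."""
    s = r + g + b
    if s == 0:
        return 0.0
    return 2 * (g / s) - (r / s) - (b / s)

def otsu_threshold(values, bins=64):
    """Return the threshold that maximises between-class variance."""
    lo, hi = min(values), max(values)
    hist = [0] * bins
    for v in values:
        hist[min(int((v - lo) / (hi - lo + 1e-12) * bins), bins - 1)] += 1
    total = len(values)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var, w_b, sum_b = 0, -1.0, 0, 0.0
    for t in range(bins):
        w_b += hist[t]                           # background pixel count
        if w_b == 0 or w_b == total:
            continue
        sum_b += t * hist[t]
        m_b = sum_b / w_b                        # background mean bin
        m_f = (sum_all - sum_b) / (total - w_b)  # foreground mean bin
        var = w_b * (total - w_b) * (m_b - m_f) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return lo + (best_t + 1) / bins * (hi - lo)

# Green-dominant (vegetation) and brownish (soil) RGB pixels.
pixels = [(40, 120, 30), (50, 140, 40), (45, 130, 35),
          (120, 100, 80), (130, 110, 90), (125, 100, 85)]
exg = [excess_green(*p) for p in pixels]
t = otsu_threshold(exg)
mask = [v > t for v in exg]
print(mask)  # first three pixels flagged as vegetation
```

On real imagery the same two steps run over millions of pixels, and the threshold is computed once per image (or per object, in the object-based variant).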
Table 5. Classification of types of spectral information used in RSA research.
Spectral Information | Ref. | Total
RGB (visible spectrum) | [79,91,95,104,110,111,113,115,116,117,118,121,122,125,126,130,132,140,142,145,147,148,150,151,152,160,161,162,164,165,169,171,176,178,179,180,182,183,184,186,187,188,190,192,193,199] | 46
Hyperspectral (narrow band) | [91,96,104,108,112,118,123,127,134,137,141,143,144,146,153,155,156,157,168,170,187,194,195,196,198,200,201,203] | 28
Long-wave infrared (thermal) | [41,101,106,107,124,131,135,136,138,141,151,155,164,165,166,167,168,172,177,189,193,197] | 22
Near infrared (NIR) | [116,117,118,121,130,133,134,142,145,151,158,181,186,193] | 14
Multispectral (broad band) | [114,129,139,146,149,154,159,163,171,173,197,200,201] | 13
Red edge spectrum | [101,104,119,128,132,142,158,160,181,185,188,191] | 12
Synthetic aperture radar (SAR) | [79,104,120,121,126,183,202] | 7
Light detection and ranging (LiDAR) | [92,108,148,174] | 4
Short-wave infrared | [117,121] | 2
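Many of the NIR and multispectral studies in Table 5 build on vegetation indices, the best known being the Normalized Difference Vegetation Index (NDVI), which takes one line to compute from the NIR and red bands. The reflectance values in this sketch are hypothetical.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    if nir + red == 0:
        return 0.0
    return (nir - red) / (nir + red)

# Healthy canopies reflect strongly in NIR and absorb red light,
# pushing NDVI towards +1; bare soil stays near 0.
print(round(ndvi(0.50, 0.08), 3))  # dense vegetation -> 0.724
print(round(ndvi(0.30, 0.25), 3))  # sparse cover     -> 0.091
```

Indices such as the vegetation index-based crop coefficients of [130,163] apply this per-pixel computation over satellite or UAV reflectance bands.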

Share and Cite

García-Berná, J.A.; Ouhbi, S.; Benmouna, B.; García-Mateos, G.; Fernández-Alemán, J.L.; Molina-Martínez, J.M. Systematic Mapping Study on Remote Sensing in Agriculture. Appl. Sci. 2020, 10, 3456. https://doi.org/10.3390/app10103456