Review

A Review on UAV-Based Applications for Plant Disease Detection and Monitoring

1 Centre for Applied Climate Sciences, University of Southern Queensland, Toowoomba, QLD 4350, Australia
2 Water, Environment and Development Unit, SPHERES Research Unit, Department of Environmental Sciences and Management, University of Liège, 6700 Arlon, Belgium
3 Plant Protection Laboratory, Regional Center of Agricultural Research of Meknes, National Institute of Agricultural Research, Km 13, Route Haj Kaddour, BP 578, Meknes 50001, Morocco
4 Phytopathology Unit, Department of Plant Protection, Ecole Nationale d’Agriculture de Meknes, Meknes 50001, Morocco
5 Horticultural Sciences Department, University of Florida, Gainesville, FL 32611-0690, USA
6 Department of Agricultural Economics, Ecole Nationale d’Agriculture de Meknes, BP S/40, Meknes 50001, Morocco
7 Nematology Laboratory, Biotechnology Unit, National Institute of Agricultural Research, CRRA-Rabat, Rabat 10101, Morocco
8 Environmental Research and Innovation, Luxembourg Institute of Science and Technology, 4422 Belvaux, Luxembourg
9 Plant Pathology Laboratory, AgroBiosciences, College of Sustainable Agriculture and Environmental Sciences, Mohammed VI Polytechnic University, Lot 660, Hay Moulay Rachid, Ben Guerir 43150, Morocco
* Author to whom correspondence should be addressed.
Remote Sens. 2023, 15(17), 4273; https://doi.org/10.3390/rs15174273
Submission received: 24 July 2023 / Revised: 17 August 2023 / Accepted: 29 August 2023 / Published: 31 August 2023
(This article belongs to the Special Issue Machine Learning for Multi-Source Remote Sensing Images Analysis)

Abstract

Remote sensing technology is vital for precision agriculture, aiding in early issue detection, resource management, and environmentally friendly practices. Recent advances in remote sensing technology and data processing have turned unmanned aerial vehicles (UAVs) into valuable tools for obtaining detailed data on plant diseases with high spatial, temporal, and spectral resolution. Given the growing body of scholarly research centered on UAV-based disease detection, a comprehensive review and analysis of current studies becomes imperative to provide a panoramic view of evolving methodologies in plant disease monitoring and to strategically evaluate the potential and limitations of such strategies. This study undertakes a systematic quantitative literature review to summarize existing literature and discern current research trends in UAV-based applications for plant disease detection and monitoring. Results reveal a global disparity in research on the topic, with Asian countries being the top contributing countries (43 out of 103 papers). World regions such as Oceania and Africa are comparatively under-represented. To date, research has largely focused on diseases affecting wheat, sugar beet, potato, maize, and grapevine. Multispectral, red-green-blue, and hyperspectral sensors were most often used to detect and identify disease symptoms, with current trends pointing to approaches integrating multiple sensors and the use of machine learning and deep learning techniques. Future research should prioritize (i) development of cost-effective and user-friendly UAVs, (ii) integration with emerging agricultural technologies, (iii) improved data acquisition and processing efficiency, (iv) diverse testing scenarios, and (v) ethical considerations through proper regulations.

Graphical Abstract

1. Introduction

Plant diseases have multifaceted and far-reaching consequences, impacting agriculture, ecosystems, economies, and human well-being. They can lead to reduced crop yields, lower crop quality, and even complete crop failures, which can disrupt the supply chain, result in increased food prices and potential food shortages, and negatively impact food security and the livelihood of stakeholders engaged in agricultural sectors [1,2]. Globally, the economic impact of crop yield loss due to plant diseases is estimated to be around US$220 billion each year [3]. Annual yield losses due to plant diseases and pests in the top staple food crops rice, maize, and wheat range from 24.6% to 40.9% for rice, from 19.5% to 41.1% for maize, and from 10.1% to 28.1% for wheat worldwide [4]. Plant diseases can also alter ecosystems by affecting the abundance and distribution of plant species and disrupting the food web and ecosystem dynamics [5,6]. Some plant diseases may cause health issues in humans and livestock. For example, mycotoxins produced by certain fungi can contaminate crops, leading to the ingestion of toxins through food consumption [7]. It is, therefore, essential to adopt good management practices to reduce disease risk and potential epidemic outbreaks in order to minimize their impact and ensure good crop production [8,9].
Besides characterizing the main factors conducive to potential outbreaks, managing plant disease epidemics in farms also requires early and rapid detection of the disease, as well as a good knowledge of the patterns of the disease incidence and severity over time and space [1,10]. Over the past decades, remote sensing technology has emerged as a valuable data source in precision agriculture by providing spatially explicit and unbiased information on crops, soils, and environmental conditions across various spatial (ranging from individual fields to watersheds) and temporal scales [11]. For instance, remote sensing data can be used to detect and track plant disease outbreaks, assess disease severity, and verify the effectiveness of fungicide treatments [10,12,13,14]. Indeed, sensors mounted on remote sensing platforms such as satellites, aircraft, and unmanned aerial systems can detect changes in spectral reflectance, chlorophyll fluorescence, and plant temperature, which can indicate stress caused by a pathogenic organism [13,14,15,16].
In recent years, unmanned aerial vehicles (UAVs), or drones, have been increasingly used in precision agriculture [13,17] as they give the opportunity to bridge the existing gap between satellite remote sensing data and field monitoring. UAVs offer the ability to cover large areas quickly and efficiently and to collect high-resolution images in real-time [13,17,18,19,20,21,22]. They can fly at specific altitudes and angles, providing consistent and precise image data, and can be deployed regularly to monitor crops. Another advantage of using UAVs is related to data storage and documentation. UAVs provide a digital record of crop health over time, which can be useful for future analysis, research, and even insurance claims in case of crop losses due to diseases or extreme weather events (e.g., drought, flood, frost, etc.). In plant disease management, they are revolutionizing traditional methods of disease monitoring and treatment as they help in quantifying the extent of disease outbreaks and in detecting and identifying disease symptoms when human assessment is unsuitable or unavailable [17,23]. Since they can be deployed regularly, UAVs provide frequent updates on the spatial distribution of diseases, which enables farmers to make timely decisions about disease management strategies. Moreover, UAVs can access areas that are difficult to reach by traditional means, such as hilly terrain, dense vegetation, or large fields, enabling comprehensive disease monitoring across the entire agricultural landscape. By enabling the early detection of disease outbreaks and timely monitoring of disease progress, UAV-based imagery provides critical data that can be used to improve management practices and to increase time efficiency and crop yields, ultimately leading to profitable and sustainable farming activities [13,19,21].
There have been multiple review articles dealing with the use of UAV for monitoring and assessing biotic plant stresses, including plant diseases (e.g., [13,17,24,25,26,27]). For example, Barbedo [13] discussed UAV imagery-based monitoring of different plant stresses caused by drought, nutrition disorders, and diseases and the detection of pests and weeds using UAVs. Their review involved more than 100 research articles, including 25 papers related to UAV-based applications for disease detection and quantification. The types of sensors used to capture the images, the methods employed in data processing and analysis, and the challenges inherent to automatic plant disease detection were the focus of the section dedicated to plant diseases in their review (Section 2.3 in [13]). The latter point of discussion in their study aligned with their previous study (Barbedo [24]) where the author provided a comprehensive review of the main challenges associated with automatic plant disease identification using visible range images. In their review, Neupane and Baysal-Gurel [17] presented an overview of how UAVs can be employed to monitor plant diseases in the field while also discussing the fundamental principles related to UAV components like peripherals, sensors, and cameras, their constraints and practical usage. They also reflected on the main challenges associated with the automatic detection of plant diseases [17]. While acknowledging the challenges already discussed in [24], these authors focused on the issues related to image analysis and result interpretation, flight regulations, and privacy issues while operating UAVs and suggested possible solutions to address these challenges. Other review articles [25,26,27] reported on specific aspects of UAV-based disease classification using deep learning (DL), an advanced machine learning (ML) technique. While the reviews in [25,26] provided a broader scope as they encompass all existing sensors and cameras, the study of Kuswidiyanto et al. [27] focused on the use of UAV hyperspectral data and DL to detect plant diseases.
None of the reviews listed above provides an overview of the types of plants and diseases investigated using UAV imagery, the trends in sensor and camera types, or the related data analysis methods. Furthermore, as UAV-based plant stress detection is still a subject of ongoing research, a comprehensive overview and interpretation of current research on UAV-based applications for plant disease detection and monitoring is of particular interest. For farmers willing to adopt such approaches, such a comprehensive review can serve as a repository of knowledge, elucidating the evolving landscape of technological advancements and methodologies pertinent to disease management. It also offers a strategic perspective on the potential and limitations of these approaches. For agribusinesses, comprehensive reviews can facilitate informed decision-making regarding investment, implementation, and integration of UAV systems within farm activities. For researchers, in addition to providing potential research avenues, the findings of the review can help create and/or foster collaboration and information exchange, encouraging innovation and cross-sectoral synergy.
In this study, we aimed to analyze current research on UAV-based approaches for plant disease detection, identification, and quantification using a systematic quantitative literature review (SQLR) [28,29]. Through the SQLR we complemented existing reviews by quantitatively assessing the literature on the topic. More specifically, we systematically examined the literature to provide a comprehensive overview of the types of plants and diseases investigated using UAV imagery and to characterize the types of sensors used and methods employed to analyze the images and quantify disease incidence and/or severity. From the analysis, we then discussed future research directions for improved management of plant diseases using UAV-based approaches.

2. Methods

A systematic quantitative literature review of the scientific literature was carried out following the methods outlined in Pickering and Byrne [28]. A set of key search terms was applied to survey the literature in two scholarly databases, Scopus and Web of Science, to identify the relevant literature published in peer-reviewed English language academic journals about UAV-based applications for plant disease detection and monitoring. The literature search was done by article title, abstract, and keywords using the search string ["UAV" AND "plant" AND "disease"]. We limited our search to the period ending in December 2022. We used the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) diagram [30] to track the process of identifying and selecting relevant papers for this study (Figure 1). The literature search returned 450 publications (245 from Scopus and 205 from Web of Science), from which 148 duplicates were removed. Publications such as review articles, book chapters, and conference proceeding papers were also excluded. Then, we read the abstracts and screened each paper (i.e., reading the materials and methods section). This resulted in the exclusion of 104 articles because they were either irrelevant or did not focus enough on the detection and/or monitoring of plant diseases using UAV imagery. A total of 103 relevant peer-reviewed articles were selected to be fully examined in this study.
Results of the literature search in Scopus and Web of Science were handled using EndNote (version X9.3.3; Clarivate, London, UK). This included the removal of duplicates and conference papers and screening of abstracts. Data analysis was carried out in MS Excel (Microsoft, Redmond, WA, USA) and R (version 4.3.1 [31]).
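Although the quantitative analysis in this review was carried out in MS Excel and R, the sketch below illustrates, in Python, the kind of simple tallying that underpins a systematic quantitative literature review: counting reviewed papers per publication year and per country. The CSV file name and the column names ("year", "country") are hypothetical placeholders and not part of the authors' actual workflow.

```python
# Minimal sketch (hypothetical file and column names): tallying reviewed papers
# by publication year and country of study from a spreadsheet of extracted metadata.
import pandas as pd

papers = pd.read_csv("reviewed_papers.csv")        # one row per reviewed article

papers_per_year = papers["year"].value_counts().sort_index()
papers_per_country = papers["country"].value_counts()

print(papers_per_year)                  # temporal trend of publications (cf. Figure 2)
print(papers_per_country.head(10))      # top contributing countries (cf. Figure 3)
```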

3. Results

3.1. Increased Research Interest in Recent Years

One hundred and three peer-reviewed research articles that specifically discussed UAV-based approaches to identify and/or monitor plant diseases were fully examined. The temporal distribution of scholarly articles published on the review topic is presented in Figure 2. The first research article dealing with the review topic was published in 2013 (Figure 2) and was about the use of UAV imagery-derived information to assess Verticillium wilt infection and severity in olive orchards [32]. From 2013 to 2017, the rate of publications was low, with barely two research articles published per year. Between 2018 and 2022, a sharp increase in peer-reviewed articles published on the topic was observed, with 93% of the total articles reviewed published during that period (Figure 2), indicating the growing interest in plant disease research involving UAV-derived information. This trend from the year 2013 onward can be explained by different factors, including the costs of UAVs, which are becoming more affordable, the improvement of techniques for handling and processing UAV imagery data, and the need for convenient and cost-effective solutions to manage plant diseases in agricultural production [17,33].
Despite the increase in research interest in recent years, a global disparity in such research was found. Studies were conducted in 29 different countries. From a world region perspective, as of December 2022, Asian countries have contributed the most to research on plant disease detection and monitoring based on UAV imagery (43 papers in total, Table 1). World regions with few research articles are Oceania (one related paper [34]) and Africa (four related papers [35,36,37,38]) (Table 1). From a country perspective, the top two countries that most contributed to research on plant disease detection and monitoring based on UAV imagery were China and the United States of America (USA), with 25 and 18 research articles, respectively, as of December 2022 (Figure 3). Research on the use of drones in agriculture in these two countries is being driven by the potential drone market and the need for improved crop management practices to cost-effectively address productivity issues while relying less on manual labor [39]. The other top countries were Brazil (seven papers), Malaysia (six papers), Germany, and Italy (five papers each) (Figure 3). For the majority (15) of the remaining countries of study, only one research article was identified in our SQLR.

3.2. Plants of Interest Found in the Reviewed Research Articles

The SQLR indicated that current research has dealt with disease symptoms on 35 different plants (Figure 4). Not surprisingly, diseases in cereal crops were most often investigated in the articles reviewed, with wheat and maize being the cereal crops that were most investigated (Figure 4). Other plant species most often studied included potato and sugar beet (Figure 4). When breaking down the number of research articles by plant species investigated for the top countries of study (China, the USA, Brazil, Malaysia, Germany, and Italy), the analysis showed that in both China and the USA, diseases in 10 different plant species were investigated. Diseases on wheat, pine tree, and banana were the most studied in China, whereas in the USA, research on maize diseases dominated (Figure 4a). In the USA, the number of research articles reporting on UAV-based approaches for disease monitoring was the same for apple, citrus, cotton, tomato, and watermelon (Figure 4a). In Brazil, diseases on five plant species were investigated, with coffee and soybean dominating. For Malaysia, research on UAV-based monitoring of diseases affecting oil palm ranked first among the three plant species of study (rice and eucalyptus were the two other plant species). A distinct trait was found for Germany, where most studies (four out of five) concerned sugar beet (Figure 4a).
While one would expect that the economic importance of the crop in a given country would have resulted, to some extent, in greater research on addressing key production challenges, such as plant disease management using emerging technologies and techniques, our analysis showed mixed results. In countries such as China, the USA, Brazil, and Malaysia, diseases affecting economically important crops were among the most researched. For example, China, which is among the top 10 wheat-producing countries worldwide [41], had wheat as the top researched crop when it comes to UAV-based approaches for disease management. However, countries such as France, the Russian Federation, Canada, India, or Ukraine, which also ranked among the world’s leading wheat producers, were missing from our analysis (Figure 4). Such under-representation of the world’s leading producing countries holds true for the vast majority of the study crops reported in the papers reviewed in our SQLR. This highlights the need for further investigation into the feasibility and performance of UAV-based approaches for disease management in these countries and others where such studies have yet to be carried out.

3.3. Diseases and Groups of Pathogens Investigated

The list of plant diseases whose symptoms and/or severity were assessed using UAV-based imagery is presented in Table 2. Overall, the symptoms and/or severity of more than 80 plant diseases have been monitored using UAV-based approaches. Depending on the plant and the disease, the studies involved disease symptoms visible on either leaf, stem, or fruit, with most of the studies focusing on leaf diseases. In wheat, six main diseases were investigated, including leaf rust (caused by Puccinia triticina) [42], yellow rust (caused by P. striiformis f. sp. tritici) [42,43,44,45,46,47,48,49,50,51], powdery mildew (caused by Blumeria graminis f. sp. tritici) [52], tan spot (caused by Pyrenophora tritici-repentis) [53], Septoria leaf blotch (caused by Zymoseptoria tritici) [53], and Fusarium head blight (caused by a complex of Fusarium graminearum Schwabe and F. culmorum) [54,55] (Table 2). The first four diseases typically attack wheat leaves, although yellow rust can also damage the stems, whereas symptoms of Fusarium head blight are visible on infected spikelets. For potatoes, symptoms of five diseases have been investigated using UAV-based approaches (Table 2). These diseases include potato early blight (caused by Alternaria solani Sorauer) [56], late blight (caused by Phytophthora infestans (Mont.) De Bary) [57,58,59,60], potato virus Y disease (caused by the potato virus Y) [61], soft rot (caused by Erwinia bacteria) [56], and vascular wilt (caused by Pseudomonas solanacearum) [62]. While some of these diseases can also affect potato tubers (i.e., late blight and potato virus Y), only symptoms visible on the above-ground organs (stems and leaves) were assessed using UAVs. Most of the articles reviewed focused on the symptoms of a single disease. Exceptions include the study of Heidarian Dehkordi et al. [42] regarding the identification of plants infected by yellow rust and wheat leaf rust in winter wheat using UAV-based red-green-blue (RGB) imagery, and that of Kalischuk et al. [63] in which symptoms of various diseases, including gummy stem blight, anthracnose, Fusarium wilt, Phytophthora fruit rot, Alternaria leaf spot, and cucurbit leaf crumple disease of watermelon, were assessed using UAV-based multispectral imagery.
The causal pathogens of the diseases found in the articles reviewed in the SQLR belong to six biological groups: fungi, fungus-like oomycetes, bacteria, viruses, stramenopiles, and nematodes (Table S1). Fungi accounted for more than half of the pathogens, followed by bacteria. The majority of these pathogenic organisms are biotrophs (Table S1). Diseases resulting from infections by hemibiotrophic and necrotrophic pathogens were also identified using UAV imagery data.

3.4. Sensors Used for the Detection and Monitoring of Plant Diseases

Various types of sensors mounted on UAVs have been used to collect high spatial and spectral resolution data for plant disease detection and monitoring (Figure 5). The most used sensors were multispectral, RGB, hyperspectral, and digital cameras. Wheat was the plant whose diseases were investigated using the widest range of sensor types (individually or in combination) (Figure 5). For instance, symptoms of yellow rust on wheat leaves have been investigated using data from multispectral sensors [43,45,51], RGB cameras [42,48,50], hyperspectral sensors [44,46,47], and RGB + multispectral sensors [49]. Symptoms of Fusarium head blight were identified using data captured by hyperspectral sensors [55] and thermal infrared + RGB sensors [54], whereas symptoms of Septoria leaf blotch and tan spot were detected using RGB + multispectral sensors [53] (Figure 5). Images acquired using multispectral and RGB sensors were most often used to derive vegetation indices (VIs), which allowed for the detection of changes in vegetation health indicative of disease (e.g., discoloration, wilting, spots). Owing to their capability to capture images in many narrow spectral bands, hyperspectral sensors were used to detect more subtle changes in vegetation health that may not be visible with other sensors. Such data were used to create spectral signatures characteristic of a given disease.
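As a hedged illustration of the spectral-signature idea mentioned above (not a method taken from any specific reviewed study), the sketch below computes the spectral angle between each pixel spectrum of a hyperspectral cube and a reference "diseased" signature and flags spectrally similar pixels. The cube dimensions, the reference spectrum, and the angle threshold are all assumptions for illustration.

```python
# Minimal sketch of spectral-angle matching against a reference disease signature.
# The cube shape (rows, cols, bands), reference spectrum, and threshold are hypothetical;
# in practice the reference would be derived from labelled diseased pixels.
import numpy as np

def spectral_angle(cube: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Angle (radians) between each pixel spectrum in `cube` and `reference`."""
    dot = np.tensordot(cube, reference, axes=([-1], [0]))
    norms = np.linalg.norm(cube, axis=-1) * np.linalg.norm(reference)
    cos = np.clip(dot / (norms + 1e-12), -1.0, 1.0)
    return np.arccos(cos)

rng = np.random.default_rng(0)
cube = rng.random((100, 100, 150))       # hypothetical 150-band hyperspectral image
reference = rng.random(150)              # hypothetical mean spectrum of diseased canopy

angles = spectral_angle(cube, reference)
disease_mask = angles < 0.1              # hypothetical angle threshold (radians)
print(f"{disease_mask.mean():.1%} of pixels flagged as spectrally similar to the reference")
```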
There were multiple studies in which combinations of different sensors were used. Combinations of sensors included multispectral + digital, multispectral + thermal, multispectral + RGB, hyperspectral + RGB, RGB + near infrared, thermal + multispectral + RGB, thermal infrared + RGB, multispectral + hyperspectral + thermal, and hyperspectral + RGB + Light Detection and Ranging (LiDAR) (Figure 5). In the majority of studies involving multiple sensors, the information derived from them was used directly for detecting disease symptoms. However, there were instances where data from one sensor complemented the processing of data from other sensors in order to identify disease symptoms. Examples include the study of Yu et al. [114], in which hyperspectral, RGB, and LiDAR sensors were involved. Data from the LiDAR sensor (i.e., digital elevation model data) were used during the pre-processing step of hyperspectral data. The identification of pine wilt disease symptoms was then conducted based on information retrieved from the hyperspectral and RGB sensors [114].
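The following sketch hints at how ancillary LiDAR products can support the pre-processing of data from another sensor, in the spirit of (but not reproducing) the workflow in [114]: a LiDAR-derived digital elevation model is subtracted from a digital surface model to obtain canopy height, and near-ground pixels are masked out before any spectral analysis. The grids and the 1 m height threshold are assumptions.

```python
# Minimal sketch: using a LiDAR-derived DEM to keep only canopy pixels before
# analysing co-registered spectral data. Grids and the height threshold are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
dsm = rng.uniform(100.0, 120.0, size=(200, 200))   # digital surface model (top of canopy), metres
dem = rng.uniform(100.0, 105.0, size=(200, 200))   # LiDAR-derived ground elevation, metres
spectral = rng.random((200, 200, 5))               # co-registered 5-band multispectral image

canopy_height = dsm - dem                          # canopy height model
canopy_mask = canopy_height > 1.0                  # hypothetical: keep pixels taller than 1 m

# Subsequent spectral analysis is restricted to canopy pixels only.
canopy_spectra = spectral[canopy_mask]             # shape: (n_canopy_pixels, 5)
print(canopy_spectra.shape)
```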

3.5. Methods Used for Image Processing and Data Analysis

By capturing high spatial and spectral resolution images, sensors and cameras mounted on UAVs provide valuable data that can be leveraged to analyze and detect plant disease symptoms. Results of our SQLR showed that various techniques, including visual analysis, computer vision, and VI-based analysis, have been used to process and analyze UAV-based imagery data for plant disease detection. Among these, computer vision was the most used technique, being employed in 52 of the 103 research articles reviewed. In computer vision, the algorithms used for image classification and object recognition were machine learning (ML) algorithms that enabled the extraction of meaningful information from the images by automatically identifying and classifying visual patterns associated with disease symptoms. Generally, after image pre-processing, feature extraction techniques were employed to identify the relevant visual characteristics associated with disease symptoms. ML algorithms were then trained on labeled datasets in which regions of interest had been annotated manually as healthy or diseased by human experts. Through the training process, the algorithms learned to recognize and distinguish between healthy and diseased plant organs, and the extracted features were classified into different categories (e.g., healthy, diseased). Depending on the extracted features, the classification analysis was either color, texture, shape, or spectral-based. Color-based analysis examines variations in coloration of the plant organ of interest (e.g., leaf) that may indicate the presence of disease. Examples of the reviewed articles employing this method are [52,57,85,92,96]. Shape-based analysis helps to discern irregularities in plant structures caused by diseases (e.g., [61,97]). Texture analysis techniques (e.g., the gray level co-occurrence matrix) enable the quantification of textural differences between healthy and diseased plant organs. Such a technique was employed by Guo et al. [47]. Spectral-based analysis involves the identification of specific disease symptoms based on the distinctive spectral patterns associated with different pathogens or physiological responses of plants to infections (e.g., [59,78,109,117]). Combining multiple features (e.g., color and texture features) was also adopted to achieve a comprehensive representation of the disease symptoms and enhance the accuracy of subsequent classification algorithms (e.g., [46,116]).
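To make the feature-based workflow above concrete, the following minimal sketch extracts simple color statistics and gray level co-occurrence matrix (GLCM) texture features from labelled image patches and trains a random forest classifier to separate healthy from diseased canopy. The patch data, labels, and hyper-parameters are hypothetical placeholders rather than values from the reviewed studies.

```python
# Minimal sketch: color + GLCM texture features and a random-forest classifier
# for healthy vs. diseased patches. Patches and labels are randomly generated here.
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def extract_features(patch_rgb: np.ndarray) -> np.ndarray:
    """Mean/std per RGB channel plus GLCM contrast and homogeneity of the grey patch."""
    color = np.concatenate([patch_rgb.mean(axis=(0, 1)), patch_rgb.std(axis=(0, 1))])
    grey = patch_rgb.mean(axis=2).astype(np.uint8)
    glcm = graycomatrix(grey, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    texture = np.array([graycoprops(glcm, "contrast")[0, 0],
                        graycoprops(glcm, "homogeneity")[0, 0]])
    return np.concatenate([color, texture])

# Hypothetical data: 200 random 64x64 RGB patches with binary labels (0 healthy, 1 diseased).
rng = np.random.default_rng(42)
patches = rng.integers(0, 256, size=(200, 64, 64, 3)).astype(np.uint8)
labels = rng.integers(0, 2, size=200)

X = np.array([extract_features(p) for p in patches])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, labels, cv=5).mean())   # chance-level on random data; real patches needed
```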
The second most used method for plant disease detection based on UAV imagery was the use of VIs. In studies employing this method (23 in total), commonly used VIs, such as the normalized difference vegetation index (NDVI), green NDVI, triangular greenness index, and simple ratio index, or new VIs, were calculated and used as potential predictors in regression and ML-based analyses to explain the variability in disease severity (e.g., [32,69,77,79,80,107,112]). VIs were also used to assess the homogeneity of matched-pair data (visual and UAV imagery-based disease severity) (e.g., [54,63,84,94,119]) and for mapping the vegetation health status of the study area (e.g., [42,45,83]). To monitor the progress of the disease over time, temporal analysis involving either computer vision or VI-based techniques was used. In studies involving temporal analysis (e.g., [42]), changes in the severity and spatial distribution of disease symptoms were quantified by comparing the outputs of images captured on different dates.
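As an illustration of the VI-based analyses described above, the sketch below computes NDVI from red and near-infrared reflectance and regresses visually scored disease severity on plot-mean NDVI. The reflectance arrays and severity scores are randomly generated placeholders, not data from any reviewed study.

```python
# Minimal sketch (hypothetical bands and severity scores): NDVI computation and a
# simple regression of disease severity on plot-mean NDVI.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
red = rng.uniform(0.02, 0.20, size=(50, 100, 100))   # 50 hypothetical plots, red reflectance
nir = rng.uniform(0.30, 0.60, size=(50, 100, 100))   # near-infrared reflectance

ndvi = (nir - red) / (nir + red + 1e-12)             # normalized difference vegetation index
plot_mean_ndvi = ndvi.mean(axis=(1, 2)).reshape(-1, 1)

severity = rng.uniform(0, 100, size=50)              # hypothetical visual severity scores (%)

model = LinearRegression().fit(plot_mean_ndvi, severity)
print("R^2 =", model.score(plot_mean_ndvi, severity))  # near zero here because data are random
```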

3.6. Increasing Popularity of Machine Learning-Based Approaches

Table 3 provides an overview of the statistical, ML, and mapping approaches used to analyze UAV imagery data for plant disease detection, along with the temporal distribution of related research articles, as reviewed in this study. Unsurprisingly, ML-based approaches were the most adopted, accounting for two-thirds of the related reviewed papers (Table 3, Figure 6). They have been used to analyze data acquired from virtually all the sensors surveyed in the SQLR (Figure 6). Another method employed for UAV data analysis was regression modeling (Figure 6). The choice of an ML-based approach for disease detection using UAV imagery is guided by various factors, including data complexity and scale, diversity of training data, computational resources, model interpretability, scalability, and domain expertise [25]. Moreover, in the articles reviewed in our study, the ML algorithms most often used included support vector machine, random forest, K-nearest neighbors, Naive Bayes, artificial neural networks, and deep learning (DL) algorithms (i.e., convolutional neural network (CNN), back-propagation, and multilayer perceptron), with the latter approaches being increasingly used in recent years. For example, in 2022 alone, there were 11 research articles in which DL algorithms were employed to identify symptoms of plant diseases (Table 3). In comparison, for the same year, there were 12 research articles dealing with ML-based approaches other than DL-based ones (Table 3).
Of particular interest in DL-based approaches are the CNNs, which are gaining in popularity for image classification tasks in plant disease monitoring because of their good capability to learn hierarchical features from raw data, compared to traditional ML algorithms, and their ability to automatically and simultaneously extract spatial and spectral features [25,27,137]. The CNN architectures used in the research articles reviewed in this study comprised classical and customized architectures. Classical CNN architectures included the ResNet (e.g., [56,91,92,93,101,111,132]), U-Net (e.g., [49,56]), DarkNet53 (e.g., [111,132]), LeNet-5 (e.g., [85]), SqueezeNet (e.g., [101]), GoogLeNet (e.g., [116]), and DeepLabv3+ (e.g., [50]). Customized architectures were a combination or modified version of the classical architectures. Examples of combined architectures included the EfficientNet-EfficientDet network (see [64]), the vine disease detection network (VddNet), which is a parallel architecture based on the Visual Geometry Group (VGG) network encoder (see [86]), the MobiRes-Net, which combines the ResNet50 and MobileNet architectures (e.g., [104]), the Inception-ResNet architecture, which combines the Inception and ResNet architectures (see [46]), and the CropdocNet, which consists of three encoders (i.e., spectral information, spectral–spatial feature, and class-capsule encoders) and a decoder (see [60]). The modified versions of classical architectures used in the reviewed research articles were the “enhanced CenterNet”, which was employed in [113], the fully convolutional DenseNet, employed in [126], and the efficient dual flow U-Net, employed in [51]. The performance of ML algorithms heavily relies on the quality and representativeness of the training dataset, as well as the choice of appropriate features [25]. In recent years, there have been more concerted undertakings in employing DL techniques to identify plant diseases using publicly available UAV imagery datasets (e.g., [133]). Efforts to make methodologically collected and annotated data publicly available for research (e.g., PlantVillage dataset [40]) are commendable and must be sustained, as such datasets would enhance the benchmarking of ML methods and allow for a better orientation of research.
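As a hedged example of the transfer-learning style of CNN classification discussed above (not a reproduction of any reviewed architecture), the sketch below fine-tunes a pre-trained torchvision ResNet-18 to classify UAV canopy patches as healthy or diseased. The dataset directory layout, image size, and training settings are assumptions made for illustration.

```python
# Minimal sketch: fine-tuning a pre-trained ResNet-18 on UAV canopy patches.
# Dataset layout (ImageFolder with "healthy"/"diseased" sub-directories), image size,
# and hyper-parameters are hypothetical.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# Hypothetical directory: uav_patches/{healthy,diseased}/*.png
train_set = datasets.ImageFolder("uav_patches", transform=transform)
train_loader = DataLoader(train_set, batch_size=16, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)  # torchvision >= 0.13
model.fc = nn.Linear(model.fc.in_features, 2)                     # healthy vs. diseased

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(5):                                            # hypothetical epoch count
    for images, targets in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), targets)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch + 1}: loss {loss.item():.3f}")
```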

4. Discussion

4.1. Promising Means for Improving Plant Disease Management

Eleven years on from the work of Mahlein et al. [14], which critically reviewed the use of non-invasive sensors for the detection, identification, and quantification of plant diseases, there has been noticeable progress in the field of plant disease detection and monitoring using remote sensing derived information. In recent years, UAV-based imagery has become the new norm for plot and field-level studies. UAV-based approaches for plant disease detection and identification have several advantages over traditional methods as sensors mounted on UAVs provide images with high spatial and spectral resolution that can be used to identify small-scale changes in crop health. UAVs also provide a fast and effective solution for capturing images over larger farmland areas, which can be challenging when using ground-based methods, though the use of UAVs in larger areas can be limited by the payload capacity and battery resources [13]. Other advantages of UAV-based approaches for plant disease monitoring include the reduced reliance on manual inspection and scouting, thereby saving time and resources. While initial investments in UAV technology might be significant, they can lead to long-term cost savings. As such, UAVs offer a promising approach for improved plant disease management. The variety of approaches used in the research articles reviewed in this study, the limited number of plant species whose diseases have been investigated (35 plant species for the period ending in December 2022, Figure 4), and the different plant diseases involved (over 80 diseases, Table 2) indicate that the use of UAVs for plant disease detection and monitoring is most likely to become widespread in the coming years, that is, being used in different regions and/or extended to other plant species and diseases.
While UAV-based approaches for plant disease monitoring offer several advantages, it is important to acknowledge their limitations [13,17,24]. Challenges related to background interference, weather conditions, sensor constraints, resource limitations (e.g., peripherals, sensors), disparities between ML-based model training and validation stages, and variations in disease symptoms over time and space have been addressed in [13,17,24]. In papers published after these review articles, we found that researchers were still facing similar challenges and constraints. Therefore, these challenges are not discussed extensively here; we only briefly address some of them. Adverse weather, such as strong winds, rain, or low light conditions during UAV flights, can hinder image acquisition and potentially impact the accuracy of disease detection. Another limitation is related to image annotation consistency. Because the accuracy of disease detection relies on the expertise and experience of the human annotators who label the training datasets, variations in annotations among different operators can introduce inconsistencies and affect the generalization capabilities of the classification models. To overcome limitations associated with weather conditions, careful consideration and planning are required to avoid unfavorable weather conditions as much as possible and ensure a representative sampling of the field. Another potential solution would be to develop autonomous UAV systems that can operate in complex environments (e.g., under reduced light conditions) and adapt to changing conditions to improve flight operations. To address annotation consistency, regular training and calibration sessions are possible solutions to help overcome such a challenge.

4.2. Addressing the Potential Ethical Implications and Privacy Concerns

From a broader perspective, the increasing use of UAVs in precision agriculture can raise several ethical and privacy concerns [138,139]. These include (1) privacy intrusion: Images of individuals or private properties can inadvertently be captured by UAVs, resulting in privacy infringement and concerns about surveillance, (2) consent and awareness: Farmers and individuals residing in areas where UAVs are deployed might not be adequately informed about the data collection activities, raising concerns about informed consent and awareness, (3) data collection and ownership: Sensitive information about crops, farms and even land use can be collected by UAVs, which can lead to concerns over data access and usage rights, (4) data security: The transmission and storage of data collected by UAVs could be susceptible to cyberattacks or unauthorized access, leading to the compromise of sensitive agricultural information, (5) regulations and oversight: Insufficient regulations and oversight could lead to misuse of UAVs for unauthorized activities, such as trespassing or unauthorized surveillance, and (6) confidentiality: Competing agricultural businesses might be concerned about their proprietary techniques or practices being revealed through UAV surveillance. Addressing these ethical and privacy concerns involves technology development, regulatory measures, public awareness, and responsible practices. Governments and regulatory bodies should establish clear guidelines and regulations governing the use of UAVs for agricultural purposes, including data collection, storage, and sharing. These regulations should address privacy, data ownership, and informed consent. For some countries, regulations governing the use of UAVs to facilitate data collection and monitoring in agricultural activities do exist [140] (https://www.droneregulations.info/index.html; accessed on 14 August 2023). A potential solution to address privacy intrusion concerns is to conduct thorough privacy impact assessments before deploying UAVs. This would involve evaluating potential privacy risks and developing mitigation strategies to minimize data collection and privacy intrusion. Likewise, before operating UAVs in fields, one should ensure that farmers, landowners, and individuals in the vicinity are informed about the purpose of data collection and provide their informed consent. Regarding data security concerns, they could be addressed by implementing strong data anonymization and encryption techniques to protect the identity of individuals and property captured by UAVs and ensure that collected data cannot be easily traced back to specific individuals or locations. Minimizing data collection, e.g., collecting only the necessary data for plant disease monitoring and minimizing the amount of personal or sensitive information collected, can also help protect farmers. Lastly, collaboration with experts in ethics, law, privacy, and technology is needed to develop comprehensive strategies for addressing ethical and privacy concerns effectively.

4.3. The Way Forward

Research on using UAV-based approaches to detect and monitor plant stress caused by diseases is still underway, and there are ample opportunities to develop innovative solutions and improve the effectiveness and efficiency of these approaches. Current image analysis techniques for plant disease detection can be time-consuming, labor-intensive, and computationally demanding, particularly when it comes to sophisticated CNN-based approaches, which require graphics processing units (GPUs) to train models. Balancing the trade-offs between resource requirements, model complexity, performance, interpretability, and transfer learning opportunities has guided the choice of the most suitable ML technique for analyzing UAV imagery data. Future research can focus on improving the efficiency of ML-based approaches through the development of more advanced ML algorithms that can analyze images quickly and accurately. This will allow for the development of methods for real-time data analysis and decision-making tools that can be integrated with UAV systems. In this line, future research can investigate the use of reinforcement learning algorithms for plant disease management, which will involve training the models to learn from past actions and make decisions that optimize long-term plant health and minimize disease outbreaks.
There have been encouraging outcomes in integrating multiple sensors to provide more detailed and accurate data for plant disease detection, as highlighted by the number of related research articles, though this remains limited to a small number of plant species and diseases (Table 2). Future research can explore extending such approaches to economically important plant diseases of major food crops, such as rice, wheat, maize, cassava, plantains, potatoes, sorghum, soybeans, sweet potatoes, and yams, around the world. Research can also focus on integrating UAV data from multiple sensors or with satellite imagery (i.e., data fusion) for plant disease detection, as it has been explored for crop yield forecasting [141] and crop monitoring [142]. Such UAV and satellite data fusion will allow for a better understanding of crop health patterns and trends over large areas [143].
When a plant is infected with a disease, the infection can cause changes in the plant’s physiological functions, including the way it uses and loses heat. These changes are often reflected in the plant’s temperature, which can be detected by thermal imaging [32,115]. As of December 2022 (end year of the literature search for this study), the use of thermal cameras as a single source of data or in combination with other sensors was reported in only six studies [32,54,89,100,107,115] (Table 2). This limited number of studies can be explained by the cost of thermal sensors, which can hamper their widespread use. Indeed, the price of a drone equipped with a thermal camera can range between US$5,000 and US$15,000. Our SQLR also revealed that there are multiple diseases whose symptoms are yet to be investigated using the sensors and cameras currently available (Table 1), pointing to potential research directions in the future. However, the selection of sensors and/or cameras to investigate such research questions remains contingent upon the crop type, the target disease and level of precision desired, and the study resources. There was no report on the use of UAVs equipped with electrochemical sensors for plant disease detection. Electrochemical sensors detect changes in the concentration of certain chemicals, such as enzymes or metabolites, that are released by plants in response to infections by a pathogen. There have been several studies dealing with the use of biosensors for pathogen recognition at the plant level (see Cardoso et al. [144] for a comprehensive review). As the field of remote sensing of physiological markers of plants through electrochemical sensors continues to evolve, it is worth investigating the integration of such sensors with UAV-based systems for early detection of plant diseases in different environmental conditions. Similarly, there is a need to develop real-time disease detection systems that can be integrated with UAVs, ground-based sensors, and other Internet of Things (IoT) devices [18].
Operating a drone requires a certain level of technical skill and training, making it difficult for some farmers to utilize drones for plant disease management effectively. This may require additional investment in training and education programs to ensure that farmers are able to use drones safely and effectively. Moreover, our analysis showed that research on UAV-based approaches for plant disease detection and monitoring is not evenly distributed globally (Figure 3). This highlights the need for new and/or more assessments of UAV systems in diverse agricultural settings to ascertain their effectiveness and adaptability to different crops, weather conditions, and topography. The global disparity in research on UAV-based approaches for plant disease monitoring also suggests that such research in a given country is not driven by the country’s scientific and technical capacity or wealth. This is evident as countries with a relatively small number of reported research articles included both wealthy, technologically advanced countries and developing ones (Figure 4b). Implementing suitable strategies to overcome the different barriers to UAV use in farming activities (e.g., by establishing enabling guidelines and regulations governing the use of UAVs) can help to address such disparity. Additionally, assessing the performance of UAV-based approaches for plant disease management could be carried out for the world’s major socio-economically important crops (e.g., food staple crops), which would improve the current state of research in the field. This will require strong and continuous support from funding institutions willing to promote and back the creation of consistent and extensive reference UAV imagery data and other remote sensing technologies over large spatial scales.

5. Conclusions

We fully examined over 100 peer-reviewed research articles that specifically discussed UAV-based approaches for detecting, identifying, and quantifying plant diseases. Current research has covered a range of diseases affecting various plant species in different regions worldwide, with diseases affecting wheat, sugar beet, potato, and maize being the most investigated. The choice of sensors and cameras depends on the target disease, the desired level of detail, the type of crop being monitored, and the resources available for the study. Our systematic quantitative literature review showed that UAVs equipped with multispectral sensors, RGB cameras, hyperspectral sensors, and digital cameras were most often used to capture the data. While UAVs have the potential to greatly improve plant disease management and crop protection, there are several limitations that must be considered to fully realize the potential benefits of UAVs for plant disease management in farms. These limitations include those associated with weather and flight conditions, the variability in disease symptoms under different agricultural settings and environmental conditions, operational costs, the need for qualified personnel, and the constraints related to data management and analysis. Possible solutions to address these limitations include careful consideration and planning to avoid unfavorable weather conditions as much as possible, the use of combinations of sensors to leverage their individual capabilities, and regular training and calibration sessions to help improve annotation consistency. Addressing such limitations will require close and continuous collaboration between farmers, researchers, and industry. As future research directions, we suggest the development of autonomous UAV systems that can operate in complex environments, the development of more advanced ML algorithms to improve the efficiency of UAV-based approaches and allow for real-time data analysis and decision-making, leveraging emerging technologies to improve the overall decision-making process for plant disease management in fields, and the evaluation of UAV-based approaches for plant disease management in diverse agricultural settings. Furthermore, with the increased use of UAVs in precision agriculture, it is important to address potential ethical and privacy concerns related to data collection and sharing through responsible and ethical use of the technology, proper regulations and guidelines, and transparent and fair decision-making processes.
This review, although comprehensive, is subject to some limitations. First, it is based on a limited number of research articles, and only peer-reviewed articles were considered. Secondly, the findings presented in our review are derived exclusively from English academic literature, which may introduce a bias in our results. Lastly, it is worth noting that our review did not encompass gray literature, which includes reports, theses, and dissertations, thus reflecting primarily the corpus of published peer-reviewed academic research. Nonetheless, we believe that the sample of relevant research articles reviewed accurately represents the current body of literature pertaining to the review topic. As technology continues to improve, the potential of UAVs for plant disease detection and monitoring will keep growing, providing farmers with valuable tools to help manage crop health and ensure sustainable and profitable agricultural production.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/rs15174273/s1, Table S1: List of the plant diseases and their causal pathogenic organisms as reviewed in the systematic quantitative literature review.

Author Contributions

Conceptualization, L.K., M.E.J. and R.L.; methodology, L.K.; software, L.K.; validation, L.K.; formal analysis, L.K.; investigation, L.K. and Z.B.; data curation, L.K.; writing—original draft preparation, L.K. and M.E.J.; writing—review and editing, M.E.J., Z.B., R.L., S.-E.L., M.Z.K.R., I.D.I.A., N.M., F.M. and J.J. All authors have read and agreed to the published version of the manuscript.

Funding

The research was financially supported by the European Project FoodLand/H2020.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is therefore not applicable to this article.

Acknowledgments

We gratefully acknowledge the support of the Department of Plant Protection of the Ecole Nationale d’Agriculture de Meknes, Morocco.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ristaino, J.B.; Anderson, P.K.; Bebber, D.P.; Brauman, K.A.; Cunniffe, N.J.; Fedoroff, N.V.; Finegold, C.; Garrett, K.A.; Gilligan, C.A.; Jones, C.M.; et al. The persistent threat of emerging plant disease pandemics to global food security. Proc. Natl. Acad. Sci. USA 2021, 118, e2022239118. [Google Scholar] [CrossRef]
  2. Chaloner, T.M.; Gurr, S.J.; Bebber, D.P. Plant pathogen infection risk tracks global crop yields under climate change. Nat. Clim. Chang. 2021, 11, 710–715. [Google Scholar] [CrossRef]
  3. FAO. New Standards to Curb the Global Spread of Plant Pests and Diseases; The Food and Agriculture Organization of the United Nations (FAO): Rome, Italy, 2019; Available online: https://www.fao.org/news/story/en/item/1187738/icode/ (accessed on 16 January 2023).
  4. Savary, S.; Willocquet, L.; Pethybridge, S.J.; Esker, P.; McRoberts, N.; Nelson, A. The global burden of pathogens and pests on major food crops. Nat. Ecol. Evol. 2019, 3, 430–439. [Google Scholar] [CrossRef] [PubMed]
  5. Chakraborty, S.; Newton, A.C. Climate change, plant diseases and food security: An overview. Plant Pathol. 2011, 60, 2–14. [Google Scholar] [CrossRef]
  6. Gilbert, G.S. Evolutionary ecology of plant diseases in natural ecosystems. Annu. Rev. Phytopathol. 2002, 40, 13–43. [Google Scholar] [CrossRef] [PubMed]
  7. Bennett, J.W.; Klich, M. Mycotoxins. In Encyclopedia of Microbiology, 3rd ed.; Schaechter, M., Ed.; Academic Press: Oxford, UK, 2009; pp. 559–565. [Google Scholar]
  8. Cao, S.; Luo, H.; Jin, M.A.; Jin, S.; Duan, X.; Zhou, Y.; Chen, W.; Liu, T.; Jia, Q.; Zhang, B.; et al. Intercropping influenced the occurrence of stripe rust and powdery mildew in wheat. Crop Prot. 2015, 70, 40–46. [Google Scholar] [CrossRef]
  9. Verreet, J.A.; Klink, H.; Hoffmann, G.M. Regional monitoring for disease prediction and optimization of plant protection measures: The IPM wheat model. Plant Dis. 2000, 84, 816–826. [Google Scholar] [CrossRef]
  10. Jones, R.A.C.; Naidu, R.A. Global dimensions of plant virus diseases: Current status and future perspectives. Annu. Rev. Virol. 2019, 6, 387–409. [Google Scholar] [CrossRef]
  11. Moran, M.S.; Inoue, Y.; Barnes, E.M. Opportunities and limitations for image-based remote sensing in precision crop management. Remote Sens. Environ. 1997, 61, 319–346. [Google Scholar] [CrossRef]
  12. Seelan, S.K.; Laguette, S.; Casady, G.M.; Seielstad, G.A. Remote sensing applications for precision agriculture: A learning community approach. Remote Sens. Environ. 2003, 88, 157–169. [Google Scholar] [CrossRef]
  13. Barbedo, J.G.A. A review on the use of unmanned aerial vehicles and imaging sensors for monitoring and assessing plant stresses. Drones 2019, 3, 40. [Google Scholar] [CrossRef]
  14. Mahlein, A.-K.; Oerke, E.-C.; Steiner, U.; Dehne, H.-W. Recent advances in sensing plant diseases for precision crop protection. Eur. J. Plant Pathol. 2012, 133, 197–209. [Google Scholar] [CrossRef]
  15. Bock, C.H.; Poole, G.H.; Parker, P.E.; Gottwald, T.R. Plant disease severity estimated visually, by digital photography and image analysis, and by hyperspectral imaging. Crit. Rev. Plant Sci. 2010, 29, 59–107. [Google Scholar] [CrossRef]
  16. Chang, C.Y.; Zhou, R.; Kira, O.; Marri, S.; Skovira, J.; Gu, L.; Sun, Y. An Unmanned Aerial System (UAS) for concurrent measurements of solar-induced chlorophyll fluorescence and hyperspectral reflectance toward improving crop monitoring. Agric. For. Meteorol. 2020, 294, 108145. [Google Scholar] [CrossRef]
  17. Neupane, K.; Baysal-Gurel, F. Automatic identification and monitoring of plant diseases using unmanned aerial vehicles: A review. Remote Sens. 2021, 13, 3841. [Google Scholar] [CrossRef]
  18. Boursianis, A.D.; Papadopoulou, M.S.; Diamantoulakis, P.; Liopa-Tsaalidi, A.; Barouchas, P.; Salahas, G.; Karagiannidis, G.; Wan, S.; Goudos, S.K. Internet of Things (IoT) and agricultural unmanned aerial vehicles (UAVs) in smart farming: A comprehensive review. Internet Things 2022, 18, 100187. [Google Scholar] [CrossRef]
  19. Panday, U.S.; Pratihast, A.K.; Aryal, J.; Kayastha, R.B. A review on drone-based data solutions for cereal crops. Drones 2020, 4, 41. [Google Scholar] [CrossRef]
  20. Feng, L.; Chen, S.; Zhang, C.; Zhang, Y.; He, Y. A comprehensive review on recent applications of unmanned aerial vehicle remote sensing with various sensors for high-throughput plant phenotyping. Comput. Electron. Agric. 2021, 182, 106033. [Google Scholar] [CrossRef]
  21. Mahlein, A.-K. Plant disease detection by imaging sensors-Parallels and specific demands for precision agriculture and plant phenotyping. Plant Dis. 2015, 100, 241–251. [Google Scholar] [CrossRef]
  22. Maes, W.H.; Steppe, K. Perspectives for remote sensing with unmanned aerial vehicles in precision agriculture. Trends Plant Sci. 2019, 24, 152–164. [Google Scholar] [CrossRef]
  23. Barbedo, J.G.A. Factors influencing the use of deep learning for plant disease recognition. Biosyst. Eng. 2018, 172, 84–91. [Google Scholar] [CrossRef]
  24. Barbedo, J.G.A. A review on the main challenges in automatic plant disease identification based on visible range images. Biosyst. Eng. 2016, 144, 52–60. [Google Scholar] [CrossRef]
  25. Bouguettaya, A.; Zarzour, H.; Kechida, A.; Taberkit, A.M. A survey on deep learning-based identification of plant and crop diseases from UAV-based aerial images. Cluster Comp. 2022, 26, 1297–1317. [Google Scholar] [CrossRef] [PubMed]
  26. Shahi, T.B.; Xu, C.-Y.; Neupane, A.; Guo, W. Recent advances in crop disease detection using UAV and deep learning techniques. Remote Sens. 2023, 15, 2450. [Google Scholar] [CrossRef]
  27. Kuswidiyanto, L.W.; Noh, H.H.; Han, X.Z. Plant disease diagnosis using deep learning based on aerial hyperspectral images: A review. Remote Sens. 2022, 14, 6031. [Google Scholar] [CrossRef]
  28. Pickering, C.; Byrne, J. The benefits of publishing systematic quantitative literature reviews for PhD candidates and other early-career researchers. High. Educ. Res. Dev. 2014, 33, 534–548. [Google Scholar] [CrossRef]
  29. Pickering, C.; Grignon, J.; Steven, R.; Guitart, D.; Byrne, J. Publishing not perishing: How research students transition from novice to knowledgeable using systematic quantitative literature reviews. Stud. High. Educ. 2015, 40, 1756–1769. [Google Scholar] [CrossRef]
  30. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. BMJ 2009, 339, b2535. [Google Scholar] [CrossRef]
  31. R Core Team. R: A Language and Environment for Statistical Computing; R Foundation for Statistical Computing: Vienna, Austria, 2023. [Google Scholar]
  32. Calderón, R.; Navas-Cortés, J.A.; Lucena, C.; Zarco-Tejada, P.J. High-resolution airborne hyperspectral and thermal imagery for early detection of Verticillium wilt of olive using fluorescence, temperature and narrow-band spectral indices. Remote Sens. Environ. 2013, 139, 231–245. [Google Scholar] [CrossRef]
  33. Messina, G.; Modica, G. Applications of UAV thermal imagery in precision agriculture: State of the art and future research outlook. Remote Sens. 2020, 12, 1491. [Google Scholar] [CrossRef]
  34. Sandino, J.; Pegg, G.; Gonzalez, F.; Smith, G. Aerial mapping of forests affected by pathogens using UAVs, hyperspectral sensors, and artificial intelligence. Sensors 2018, 18, 944. [Google Scholar] [CrossRef]
  35. Chivasa, W.; Mutanga, O.; Biradar, C. UAV-based multispectral phenotyping for disease resistance to accelerate crop improvement under changing climate conditions. Remote Sens. 2020, 12, 2445. [Google Scholar] [CrossRef]
  36. Gomez Selvaraj, M.; Vergara, A.; Montenegro, F.; Alonso Ruiz, H.; Safari, N.; Raymaekers, D.; Ocimati, W.; Ntamwira, J.; Tits, L.; Omondi, A.B.; et al. Detection of banana plants and their major diseases through aerial images and machine learning methods: A case study in DR Congo and Republic of Benin. ISPRS J. Photogramm. Remote Sens. 2020, 169, 110–124. [Google Scholar] [CrossRef]
  37. Chivasa, W.; Mutanga, O.; Burgueño, J. UAV-based high-throughput phenotyping to increase prediction and selection accuracy in maize varieties under artificial MSV inoculation. Comput. Electron. Agric. 2021, 184, 106128. [Google Scholar] [CrossRef]
  38. Alabi, T.R.; Adewopo, J.; Duke, O.P.; Kumar, P.L. Banana mapping in heterogenous smallholder farming systems using high-resolution remote sensing imagery and machine learning models with implications for banana bunchy top disease surveillance. Remote Sens. 2022, 14, 5206. [Google Scholar] [CrossRef]
  39. Rejeb, A.; Abdollahi, A.; Rejeb, K.; Treiblmaier, H. Drones in agriculture: A review and bibliometric analysis. Comput. Electron. Agric. 2022, 198, 107017. [Google Scholar] [CrossRef]
  40. Hughes, D.P.; Salathé, M. An open access repository of images on plant health to enable the development of mobile disease diagnostics. arXiv 2016, arXiv:1511.08060. [Google Scholar]
  41. FAOSTAT. Crops and Livestock Products; The Food and Agriculture Organization of the United Nations (FAO): Rome, Italy, 2023; Available online: https://www.fao.org/faostat/en/#data/QCL (accessed on 14 August 2023).
  42. Heidarian Dehkordi, R.; El Jarroudi, M.; Kouadio, L.; Meersmans, J.; Beyer, M. Monitoring wheat leaf rust and stripe rust in winter wheat using high-resolution UAV-based Red-Green-Blue imagery. Remote Sens. 2020, 12, 3696. [Google Scholar] [CrossRef]
  43. Su, J.Y.; Liu, C.J.; Coombes, M.; Hu, X.P.; Wang, C.H.; Xu, X.M.; Li, Q.D.; Guo, L.; Chen, W.H. Wheat yellow rust monitoring by learning from multispectral UAV aerial imagery. Comput. Electron. Agric. 2018, 155, 157–166. [Google Scholar] [CrossRef]
  44. Bohnenkamp, D.; Behmann, J.; Mahlein, A.K. In-field detection of yellow rust in wheat on the ground canopy and UAV scale. Remote Sens. 2019, 11, 2495. [Google Scholar] [CrossRef]
  45. Su, J.Y.; Liu, C.J.; Hu, X.P.; Xu, X.M.; Guo, L.; Chen, W.H. Spatio-temporal monitoring of wheat yellow rust using UAV multispectral imagery. Comput. Electron. Agric. 2019, 167, 105035. [Google Scholar] [CrossRef]
  46. Zhang, X.; Han, L.; Dong, Y.; Shi, Y.; Huang, W.; Han, L.; González-Moreno, P.; Ma, H.; Ye, H.; Sobeih, T. A deep learning-based approach for automated yellow rust disease detection from high-resolution hyperspectral UAV images. Remote Sens. 2019, 11, 1554. [Google Scholar] [CrossRef]
  47. Guo, A.T.; Huang, W.J.; Dong, Y.Y.; Ye, H.C.; Ma, H.Q.; Liu, B.; Wu, W.B.; Ren, Y.; Ruan, C.; Geng, Y. Wheat yellow rust detection using UAV-based hyperspectral technology. Remote Sens. 2021, 13, 123. [Google Scholar] [CrossRef]
  48. Pan, Q.; Gao, M.; Wu, P.; Yan, J.; Li, S. A deep-learning-based approach for wheat yellow rust disease recognition from unmanned aerial vehicle images. Sensors 2021, 21, 6540. [Google Scholar] [CrossRef]
  49. Su, J.Y.; Yi, D.W.; Su, B.F.; Mi, Z.W.; Liu, C.J.; Hu, X.P.; Xu, X.M.; Guo, L.; Chen, W.H. Aerial visual perception in smart farming: Field study of wheat yellow rust monitoring. IEEE Trans. Indus. Inform. 2021, 17, 2242–2249. [Google Scholar] [CrossRef]
  50. Deng, J.; Zhou, H.R.; Lv, X.; Yang, L.J.; Shang, J.L.; Sun, Q.Y.; Zheng, X.; Zhou, C.Y.; Zhao, B.Q.; Wu, J.C.; et al. Applying convolutional neural networks for detecting wheat stripe rust transmission centers under complex field conditions using RGB-based high spatial resolution images from UAVs. Comput. Electron. Agric. 2022, 200, 107211. [Google Scholar] [CrossRef]
  51. Zhang, T.; Yang, Z.; Xu, Z.; Li, J. Wheat yellow rust severity detection by efficient DF-Unet and UAV multispectral imagery. IEEE Sens. J. 2022, 22, 9057–9068. [Google Scholar] [CrossRef]
  52. Liu, W.; Cao, X.R.; Fan, J.R.; Wang, Z.H.; Yan, Z.Y.; Luo, Y.; West, J.S.; Xu, X.M.; Zhou, Y.L. Detecting wheat powdery mildew and predicting grain yield using unmanned aerial photography. Plant Dis. 2018, 102, 1981–1988. [Google Scholar] [CrossRef]
  53. Vagelas, I.; Cavalaris, C.; Karapetsi, L.; Koukidis, C.; Servis, D.; Madesis, P. Protective effects of Systiva® seed treatment fungicide for the control of winter wheat foliar diseases caused at early stages due to climate change. Agronomy 2022, 12, 2000. [Google Scholar] [CrossRef]
  54. Francesconi, S.; Harfouche, A.; Maesano, M.; Balestra, G.M. UAV-based thermal, RGB imaging and gene expression analysis allowed detection of Fusarium head blight and gave new insights into the physiological responses to the disease in durum wheat. Front. Plant Sci. 2021, 12, 628575. [Google Scholar] [CrossRef]
  55. Zhang, H.S.; Huang, L.S.; Huang, W.J.; Dong, Y.Y.; Weng, S.Z.; Zhao, J.L.; Ma, H.Q.; Liu, L.Y. Detection of wheat Fusarium head blight using UAV-based spectral and image feature fusion. Front. Plant Sci. 2022, 13, 4427. [Google Scholar] [CrossRef] [PubMed]
  56. Van De Vijver, R.; Mertens, K.; Heungens, K.; Nuyttens, D.; Wieme, J.; Maes, W.H.; Van Beek, J.; Somers, B.; Saeys, W. Ultra-high-resolution UAV-based detection of Alternaria solani infections in potato fields. Remote Sens. 2022, 14, 6232. [Google Scholar] [CrossRef]
  57. Sugiura, R.; Tsuda, S.; Tamiya, S.; Itoh, A.; Nishiwaki, K.; Murakami, N.; Shibuya, Y.; Hirafuji, M.; Nuske, S. Field phenotyping system for the assessment of potato late blight resistance using RGB imagery from an unmanned aerial vehicle. Biosyst. Eng. 2016, 148, 1–10. [Google Scholar] [CrossRef]
  58. Duarte-Carvajalino, J.M.; Alzate, D.F.; Ramirez, A.A.; Santa-Sepulveda, J.D.; Fajardo-Rojas, A.E.; Soto-Suárez, M. Evaluating late blight severity in potato crops using unmanned aerial vehicles and machine learning algorithms. Remote Sens. 2018, 10, 1513. [Google Scholar] [CrossRef]
  59. Franceschini, M.H.D.; Bartholomeus, H.; van Apeldoorn, D.F.; Suomalainen, J.; Kooistra, L. Feasibility of unmanned aerial vehicle optical imagery for early detection and severity assessment of late blight in Potato. Remote Sens. 2019, 11, 224. [Google Scholar] [CrossRef]
  60. Shi, Y.; Han, L.X.; Kleerekoper, A.; Chang, S.; Hu, T.L. Novel CropdocNet model for automated potato late blight disease detection from unmanned aerial vehicle-based hyperspectral imagery. Remote Sens. 2022, 14, 396. [Google Scholar] [CrossRef]
  61. Siebring, J.; Valente, J.; Franceschini, M.H.D.; Kamp, J.; Kooistra, L. Object-based image analysis applied to low altitude aerial imagery for potato plant trait retrieval and pathogen detection. Sensors 2019, 19, 5477. [Google Scholar] [CrossRef]
  62. León-Rueda, W.A.; León, C.; Caro, S.G.; Ramírez-Gil, J.G. Identification of diseases and physiological disorders in potato via multispectral drone imagery using machine learning tools. Trop. Plant Pathol. 2022, 47, 152–167. [Google Scholar] [CrossRef]
  63. Kalischuk, M.; Paret, M.L.; Freeman, J.H.; Raj, D.; Da Silva, S.; Eubanks, S.; Wiggins, D.J.; Lollar, M.; Marois, J.J.; Mellinger, H.C.; et al. An improved crop scouting technique incorporating unmanned aerial vehicle-assisted multispectral crop imaging into conventional scouting practice for gummy stem blight in watermelon. Plant Dis. 2019, 103, 1642–1650. [Google Scholar] [CrossRef]
  64. Prasad, A.; Mehta, N.; Horak, M.; Bae, W.D. A two-step machine learning approach for crop disease detection using GAN and UAV technology. Remote Sens. 2022, 14, 4765. [Google Scholar] [CrossRef]
  65. Yağ, İ.; Altan, A. Artificial intelligence-based robust hybrid algorithm design and implementation for real-time detection of plant diseases in agricultural environments. Biology 2022, 11, 1732. [Google Scholar] [CrossRef]
  66. Xiao, D.Q.; Pan, Y.Q.; Feng, J.Z.; Yin, J.J.; Liu, Y.F.; He, L. Remote sensing detection algorithm for apple fire blight based on UAV multispectral image. Comput. Electron. Agric. 2022, 199, 107137. [Google Scholar] [CrossRef]
  67. Lei, S.H.; Luo, J.B.; Tao, X.J.; Qiu, Z.X. Remote sensing detecting of yellow leaf disease of arecanut based on UAV multisource sensors. Remote Sens. 2021, 13, 4562. [Google Scholar] [CrossRef]
  68. Calou, V.B.C.; Teixeira, A.d.S.; Moreira, L.C.J.; Lima, C.S.; de Oliveira, J.B.; de Oliveira, M.R.R. The use of UAVs in monitoring yellow sigatoka in banana. Biosyst. Eng. 2020, 193, 115–125. [Google Scholar] [CrossRef]
  69. Ye, H.C.; Huang, W.J.; Huang, S.Y.; Cui, B.; Dong, Y.Y.; Guo, A.T.; Ren, Y.; Jin, Y. Recognition of banana fusarium wilt based on UAV remote sensing. Remote Sens. 2020, 12, 938. [Google Scholar] [CrossRef]
  70. Ye, H.C.; Huang, W.J.; Huang, S.Y.; Cui, B.; Dong, Y.Y.; Guo, A.T.; Ren, Y.; Jin, Y. Identification of banana fusarium wilt using supervised classification algorithms with UAV-based multi-spectral imagery. Int. J. Agric. Biol. Eng. 2020, 13, 136–142. [Google Scholar] [CrossRef]
  71. Zhang, S.M.; Li, X.H.; Ba, Y.X.; Lyu, X.G.; Zhang, M.Q.; Li, M.Z. Banana fusarium wilt disease detection by supervised and unsupervised methods from UAV-based multispectral imagery. Remote Sens. 2022, 14, 1231. [Google Scholar] [CrossRef]
  72. Booth, J.C.; Sullivan, D.; Askew, S.A.; Kochersberger, K.; McCall, D.S. Investigating targeted spring dead spot management via aerial mapping and precision-guided fungicide applications. Crop Sci. 2021, 61, 3134–3144. [Google Scholar] [CrossRef]
  73. Abdulridha, J.; Batuman, O.; Ampatzidis, Y. UAV-based remote sensing technique to detect citrus canker disease utilizing hyperspectral imaging and machine learning. Remote Sens. 2019, 11, 1373. [Google Scholar] [CrossRef]
  74. DadrasJavan, F.; Samadzadegan, F.; Seyed Pourazar, S.H.; Fazeli, H. UAV-based multispectral imagery for fast citrus greening detection. J. Plant Dis. Prot. 2019, 126, 307–318. [Google Scholar] [CrossRef]
  75. Pourazar, H.; Samadzadegan, F.; Javan, F.D. Aerial multispectral imagery for plant disease detection: Radiometric calibration necessity assessment. Eur. J. Remote Sens. 2019, 52, 17–31. [Google Scholar] [CrossRef]
  76. Deng, X.L.; Zhu, Z.H.; Yang, J.C.; Zheng, Z.; Huang, Z.X.; Yin, X.B.; Wei, S.J.; Lan, Y.B. Detection of Citrus Huanglongbing based on multi-input neural network model of UAV hyperspectral remote sensing. Remote Sens. 2020, 12, 2678. [Google Scholar] [CrossRef]
  77. Garza, B.N.; Ancona, V.; Enciso, J.; Perotto-Baldivieso, H.L.; Kunta, M.; Simpson, C. Quantifying citrus tree health using true color UAV images. Remote Sens. 2020, 12, 170. [Google Scholar] [CrossRef]
  78. Moriya, É.A.S.; Imai, N.N.; Tommaselli, A.M.G.; Berveglieri, A.; Santos, G.H.; Soares, M.A.; Marino, M.; Reis, T.T. Detection and mapping of trees infected with citrus gummosis using UAV hyperspectral data. Comput. Electron. Agric. 2021, 188, 106298. [Google Scholar] [CrossRef]
  79. Marin, D.B.; Ferraz, G.A.E.S.; Santana, L.S.; Barbosa, B.D.S.; Barata, R.A.P.; Osco, L.P.; Ramos, A.P.M.; Guimarães, P.H.S. Detecting coffee leaf rust with UAV-based vegetation indices and decision tree machine learning models. Comput. Electron. Agric. 2021, 190, 106476. [Google Scholar] [CrossRef]
  80. Soares, A.D.S.; Vieira, B.S.; Bezerra, T.A.; Martins, G.D.; Siquieroli, A.C.S. Early detection of coffee leaf rust caused by Hemileia vastatrix using multispectral images. Agronomy 2022, 12, 2911. [Google Scholar] [CrossRef]
  81. Wang, T.Y.; Thomasson, J.A.; Isakeit, T.; Yang, C.H.; Nichols, R.L. A plant-by-plant method to identify and treat cotton root rot based on UAV remote sensing. Remote Sens. 2020, 12, 2453. [Google Scholar] [CrossRef]
  82. Wang, T.Y.; Thomasson, J.A.; Yang, C.H.; Isakeit, T.; Nichols, R.L. Automatic classification of cotton root rot disease based on UAV remote sensing. Remote Sens. 2020, 12, 1310. [Google Scholar] [CrossRef]
  83. Megat Mohamed Nazir, M.N.; Terhem, R.; Norhisham, A.R.; Mohd Razali, S.; Meder, R. Early monitoring of health status of plantation-grown eucalyptus pellita at large spatial scale via visible spectrum imaging of canopy foliage using unmanned aerial vehicles. Forests 2021, 12, 1393. [Google Scholar] [CrossRef]
  84. di Gennaro, S.F.; Battiston, E.; di Marco, S.; Facini, O.; Matese, A.; Nocentini, M.; Palliotti, A.; Mugnai, L. Unmanned Aerial Vehicle (UAV)-based remote sensing to monitor grapevine leaf stripe disease within a vineyard affected by esca complex. Phytopathol. Mediterr. 2016, 55, 262–275. [Google Scholar] [CrossRef]
  85. Kerkech, M.; Hafiane, A.; Canals, R. Deep leaning approach with colorimetric spaces and vegetation indices for vine diseases detection in UAV images. Comput. Electron. Agric. 2018, 155, 237–243. [Google Scholar] [CrossRef]
  86. Kerkech, M.; Hafiane, A.; Canals, R. VddNet: Vine disease detection network based on multispectral images and depth map. Remote Sens. 2020, 12, 3305. [Google Scholar] [CrossRef]
  87. Dwivedi, R.; Dey, S.; Chakraborty, C.; Tiwari, S. Grape disease detection network based on multi-task learning and attention features. IEEE Sens. J. 2021, 21, 17573–17580. [Google Scholar] [CrossRef]
  88. Albetis, J.; Duthoit, S.; Guttler, F.; Jacquin, A.; Goulard, M.; Poilvé, H.; Féret, J.B.; Dedieu, G. Detection of Flavescence dorée grapevine disease using Unmanned Aerial Vehicle (UAV) multispectral imagery. Remote Sens. 2017, 9, 308. [Google Scholar] [CrossRef]
  89. Savian, F.; Martini, M.; Ermacora, P.; Paulus, S.; Mahlein, A.K. Prediction of the kiwifruit decline syndrome in diseased orchards by remote sensing. Remote Sens. 2020, 12, 2194. [Google Scholar] [CrossRef]
  90. Carmo, G.J.D.; Castoldi, R.; Martins, G.D.; Jacinto, A.C.P.; Tebaldi, N.D.; Charlo, H.C.D.; Zampiroli, R. Detection of lesions in lettuce caused by Pectobacterium carotovorum subsp. carotovorum by supervised classification using multispectral images. Can. J. Remote Sens. 2022, 48, 144–157. [Google Scholar] [CrossRef]
  91. Stewart, E.L.; Wiesner-Hanks, T.; Kaczmar, N.; DeChant, C.; Wu, H.; Lipson, H.; Nelson, R.J.; Gore, M.A. Quantitative phenotyping of northern leaf blight in uav images using deep learning. Remote Sens. 2019, 11, 2209. [Google Scholar] [CrossRef]
  92. Wiesner-Hanks, T.; Wu, H.; Stewart, E.L.; DeChant, C.; Kaczmar, N.; Lipson, H.; Gore, M.A.; Nelson, R.J. Millimeter-level plant disease detection from aerial photographs via deep learning and crowdsourced data. Front. Plant Sci. 2019, 10, 1550. [Google Scholar] [CrossRef]
  93. Wu, H.; Wiesner-Hanks, T.; Stewart, E.L.; DeChant, C.; Kaczmar, N.; Gore, M.A.; Nelson, R.J.; Lipson, H. Autonomous detection of plant disease symptoms directly from aerial imagery. Plant Phenome J. 2019, 2, 1–9. [Google Scholar] [CrossRef]
  94. Gao, J.M.; Ding, M.L.; Sun, Q.Y.; Dong, J.Y.; Wang, H.Y.; Ma, Z.H. Classification of southern corn rust severity based on leaf-level hyperspectral data collected under solar illumination. Remote Sens. 2022, 14, 2551. [Google Scholar] [CrossRef]
  95. Oh, S.; Lee, D.Y.; Gongora-Canul, C.; Ashapure, A.; Carpenter, J.; Cruz, A.P.; Fernandez-Campos, M.; Lane, B.Z.; Telenko, D.E.P.; Jung, J.; et al. Tar spot disease quantification using unmanned aircraft systems (UAS) data. Remote Sens. 2021, 13, 2567. [Google Scholar] [CrossRef]
  96. Ganthaler, A.; Losso, A.; Mayr, S. Using image analysis for quantitative assessment of needle bladder rust disease of Norway spruce. Plant Pathol. 2018, 67, 1122–1130. [Google Scholar] [CrossRef]
  97. Izzuddin, M.A.; Hamzah, A.; Nisfariza, M.N.; Idris, A.S. Analysis of multispectral imagery from unmanned aerial vehicle (UAV) using object-based image analysis for detection of ganoderma disease in oil palm. J. Oil Palm Res. 2020, 32, 497–508. [Google Scholar] [CrossRef]
  98. Ahmadi, P.; Mansor, S.; Farjad, B.; Ghaderpour, E. Unmanned aerial vehicle (UAV)-based remote sensing for early-stage detection of Ganoderma. Remote Sens. 2022, 14, 1239. [Google Scholar] [CrossRef]
  99. Kurihara, J.; Koo, V.C.; Guey, C.W.; Lee, Y.P.; Abidin, H. Early detection of basal stem rot disease in oil palm tree using unmanned aerial vehicle-based hyperspectral imaging. Remote Sens. 2022, 14, 799. [Google Scholar] [CrossRef]
  100. Cao, F.; Liu, F.; Guo, H.; Kong, W.W.; Zhang, C.; He, Y. Fast detection of Sclerotinia sclerotiorum on oilseed rape leaves using low-altitude remote sensing technology. Sensors 2018, 18, 4464. [Google Scholar] [CrossRef] [PubMed]
  101. Rangarajan, A.K.; Balu, E.J.; Boligala, M.S.; Jagannath, A.; Ranganathan, B.N. A low-cost UAV for detection of Cercospora leaf spot in okra using deep convolutional neural network. Multimed. Tools Appl. 2022, 81, 21565–21589. [Google Scholar] [CrossRef]
  102. Di Nisio, A.; Adamo, F.; Acciani, G.; Attivissimo, F. Fast detection of olive trees affected by Xylella fastidiosa from UAVs using multispectral imaging. Sensors 2020, 20, 4915. [Google Scholar] [CrossRef]
  103. Castrignanò, A.; Belmonte, A.; Antelmi, I.; Quarto, R.; Quarto, F.; Shaddad, S.; Sion, V.; Muolo, M.R.; Ranieri, N.A.; Gadaleta, G.; et al. A geostatistical fusion approach using UAV data for probabilistic estimation of Xylella fastidiosa subsp. pauca infection in olive trees. Sci. Total Environ. 2021, 752, 141814. [Google Scholar] [CrossRef]
  104. Ksibi, A.; Ayadi, M.; Soufiene, B.O.; Jamjoom, M.M.; Ullah, Z. MobiRes-Net: A hybrid deep learning model for detecting and classifying olive leaf diseases. Appl. Sci. 2022, 12, 10278. [Google Scholar] [CrossRef]
  105. Alberto, R.T.; Rivera, J.C.E.; Biagtan, A.R.; Isip, M.F. Extraction of onion fields infected by anthracnose-twister disease in selected municipalities of Nueva Ecija using UAV imageries. Spat. Inf. Res. 2020, 28, 383–389. [Google Scholar] [CrossRef]
  106. McDonald, M.R.; Tayviah, C.S.; Gossen, B.D. Human vs. Machine, the eyes have it. Assessment of Stemphylium leaf blight on onion using aerial photographs from an NIR camera. Remote Sens. 2022, 14, 293. [Google Scholar] [CrossRef]
  107. Calderón, R.; Montes-Borrego, M.; Landa, B.B.; Navas-Cortés, J.A.; Zarco-Tejada, P.J. Detection of downy mildew of opium poppy using high-resolution multi-spectral and thermal imagery acquired with an unmanned aerial vehicle. Precision Agric. 2014, 15, 639–661. [Google Scholar] [CrossRef]
  108. Bagheri, N. Application of aerial remote sensing technology for detection of fire blight infected pear trees. Comput. Electron. Agric. 2020, 168, 105147. [Google Scholar] [CrossRef]
  109. Chen, T.; Yang, W.; Zhang, H.; Zhu, B.; Zeng, R.; Wang, X.; Wang, S.; Wang, L.; Qi, H.; Lan, Y.; et al. Early detection of bacterial wilt in peanut plants through leaf-level hyperspectral and unmanned aerial vehicle data. Comput. Electron. Agric. 2020, 177, 105708. [Google Scholar] [CrossRef]
  110. Li, F.D.; Liu, Z.Y.; Shen, W.X.; Wang, Y.; Wang, Y.L.; Ge, C.K.; Sun, F.G.; Lan, P. A remote sensing and airborne edge-computing based detection system for pine wilt disease. IEEE Access 2021, 9, 66346–66360. [Google Scholar] [CrossRef]
  111. Wu, B.Z.; Liang, A.J.; Zhang, H.F.; Zhu, T.F.; Zou, Z.Y.; Yang, D.M.; Tang, W.Y.; Li, J.; Su, J. Application of conventional UAV-based high-throughput object detection to the early diagnosis of pine wilt disease by deep learning. For. Ecol. Manag. 2021, 486, 118986. [Google Scholar] [CrossRef]
  112. Yu, R.; Ren, L.L.; Luo, Y.Q. Early detection of pine wilt disease in Pinus tabuliformis in North China using a field portable spectrometer and UAV-based hyperspectral imagery. For. Ecosyst. 2021, 8, 44. [Google Scholar] [CrossRef]
  113. Liang, D.; Liu, W.; Zhao, L.; Zong, S.; Luo, Y. An improved convolutional neural network for plant disease detection using unmanned aerial vehicle images. Nat. Environ. Pollut. Technol. 2022, 21, 899–908. [Google Scholar] [CrossRef]
  114. Yu, R.; Huo, L.N.; Huang, H.G.; Yuan, Y.; Gao, B.T.; Liu, Y.J.; Yu, L.F.; Li, H.A.; Yang, L.Y.; Ren, L.L.; et al. Early detection of pine wilt disease tree candidates using time-series of spectral signatures. Front. Plant Sci. 2022, 13, 1000093. [Google Scholar] [CrossRef]
  115. Smigaj, M.; Gaulton, R.; Suárez, J.C.; Barr, S.L. Canopy temperature from an Unmanned Aerial Vehicle as an indicator of tree stress associated with red band needle blight severity. For. Ecol. Manag. 2019, 433, 699–708. [Google Scholar] [CrossRef]
  116. Dang, L.M.; Hassan, S.I.; Suhyeon, I.; Sangaiah, A.K.; Mehmood, I.; Rho, S.; Seo, S.; Moon, H. UAV based wilt detection system via convolutional neural networks. Sustain. Comput.-Infor. 2020, 28, 100250. [Google Scholar] [CrossRef]
  117. Dang, L.M.; Wang, H.; Li, Y.; Min, K.; Kwak, J.T.; Lee, O.N.; Park, H.; Moon, H. Fusarium wilt of radish detection using RGB and near infrared images from unmanned aerial vehicles. Remote Sens. 2020, 12, 2863. [Google Scholar] [CrossRef]
  118. Zhang, D.; Zhou, X.; Zhang, J.; Lan, Y.; Xu, C.; Liang, D. Detection of rice sheath blight using an unmanned aerial system with high-resolution color and multispectral imaging. PLoS ONE 2018, 13, e0187470. [Google Scholar] [CrossRef] [PubMed]
  119. Kharim, M.N.A.; Wayayok, A.; Fikri Abdullah, A.; Rashid Mohamed Shariff, A.; Mohd Husin, E.; Razif Mahadi, M. Predictive zoning of pest and disease infestations in rice field based on UAV aerial imagery. Egypt. J. Remote Sens. Space Sci. 2022, 25, 831–840. [Google Scholar] [CrossRef]
  120. Tetila, E.C.; Machado, B.B.; Menezes, G.K.; Da Silva Oliveira, A., Jr.; Alvarez, M.; Amorim, W.P.; De Souza Belete, N.A.; Da Silva, G.G.; Pistori, H. Automatic recognition of soybean leaf diseases using UAV images and deep convolutional neural networks. IEEE Geosci. Remote Sens. 2020, 17, 903–907. [Google Scholar] [CrossRef]
  121. Castelao Tetila, E.; Brandoli Machado, B.; Belete, N.A.D.S.; Guimaraes, D.A.; Pistori, H. Identification of soybean foliar diseases using unmanned aerial vehicle images. IEEE Geosci. Remote Sens. 2017, 14, 2190–2194. [Google Scholar] [CrossRef]
  122. Babu, R.G.; Chellaswamy, C. Different stages of disease detection in squash plant based on machine learning. J. Biosci. 2022, 47, 9. [Google Scholar] [CrossRef] [PubMed]
  123. Jay, S.; Comar, A.; Benicio, R.; Beauvois, J.; Dutartre, D.; Daubige, G.; Li, W.; Labrosse, J.; Thomas, S.; Henry, N.; et al. Scoring Cercospora leaf spot on sugar beet: Comparison of UGV and UAV phenotyping systems. Plant Phenomics 2020, 2020, 9452123. [Google Scholar] [CrossRef]
  124. Rao Pittu, V.S. Image processing system integrated multicopter for diseased area and disease recognition in agricultural farms. Int. J. Control Autom. 2020, 13, 219–230. [Google Scholar]
  125. Rao Pittu, V.S.; Gorantla, S.R. Diseased area recognition and pesticide spraying in farming lands by multicopters and image processing system. J. Eur. Syst. Autom. 2020, 53, 123–130. [Google Scholar] [CrossRef]
  126. Görlich, F.; Marks, E.; Mahlein, A.K.; König, K.; Lottes, P.; Stachniss, C. UAV-based classification of Cercospora leaf spot using RGB images. Drones 2021, 5, 34. [Google Scholar] [CrossRef]
  127. Gunder, M.; Yamati, F.R.I.; Kierdorf, J.; Roscher, R.; Mahlein, A.K.; Bauckhage, C. Agricultural plant cataloging and establishment of a data framework from UAV-based crop images by computer vision. Gigascience 2022, 11, giac054. [Google Scholar] [CrossRef]
  128. Ispizua Yamati, F.R.; Barreto, A.; Günder, M.; Bauckhage, C.; Mahlein, A.K. Sensing the occurrence and dynamics of Cercospora leaf spot disease using UAV-supported image data and deep learning. Zuckerindustrie 2022, 147, 79–86. [Google Scholar] [CrossRef]
  129. Joalland, S.; Screpanti, C.; Varella, H.V.; Reuther, M.; Schwind, M.; Lang, C.; Walter, A.; Liebisch, F. Aerial and ground based sensing of tolerance to beet cyst nematode in sugar beet. Remote Sens. 2018, 10, 787. [Google Scholar] [CrossRef]
  130. Narmilan, A.; Gonzalez, F.; Salgadoe, A.S.A.; Powell, K. Detection of white leaf disease in sugarcane using machine learning techniques over UAV multispectral images. Drones 2022, 6, 230. [Google Scholar] [CrossRef]
  131. Xu, Y.P.; Shrestha, V.; Piasecki, C.; Wolfe, B.; Hamilton, L.; Millwood, R.J.; Mazarei, M.; Stewart, C.N. Sustainability trait modeling of field-grown switchgrass (Panicum virgatum) using UAV-based imagery. Plants 2021, 10, 2726. [Google Scholar] [CrossRef]
  132. Zhao, X.H.; Zhang, J.C.; Tang, A.L.; Yu, Y.F.; Yan, L.J.; Chen, D.M.; Yuan, L. The stress detection and segmentation strategy in tea plant at canopy level. Front. Plant Sci. 2022, 13, 9054. [Google Scholar] [CrossRef]
  133. Yamamoto, K.; Togami, T.; Yamaguchi, N. Super-resolution of plant disease images for the acceleration of image-based phenotyping and vigor diagnosis in agriculture. Sensors 2017, 17, 2557. [Google Scholar] [CrossRef]
  134. Abdulridha, J.; Ampatzidis, Y.; Kakarla, S.C.; Roberts, P. Detection of target spot and bacterial spot diseases in tomato using UAV-based and benchtop-based hyperspectral imaging techniques. Precis. Agric. 2020, 21, 955–978. [Google Scholar] [CrossRef]
  135. Abdulridha, J.; Ampatzidis, Y.; Qureshi, J.; Roberts, P. Laboratory and UAV-based identification and classification of tomato yellow leaf curl, bacterial spot, and target spot diseases in tomato utilizing hyperspectral imaging and machine learning. Remote Sens. 2020, 12, 2732. [Google Scholar] [CrossRef]
  136. Abdulridha, J.; Ampatzidis, Y.; Qureshi, J.; Roberts, P. Identification and classification of downy mildew severity stages in watermelon utilizing aerial and ground remote sensing and machine learning. Front. Plant Sci. 2022, 13, 791018. [Google Scholar] [CrossRef]
  137. Lu, J.; Tan, L.; Jiang, H. Review on convolutional neural network (CNN) applied to plant leaf disease classification. Agriculture 2021, 11, 707. [Google Scholar] [CrossRef]
  138. Frankelius, P.; Norrman, C.; Johansen, K. Agricultural Innovation and the Role of Institutions: Lessons from the Game of Drones. J. Agric. Environ. Ethics 2019, 32, 681–707. [Google Scholar] [CrossRef]
  139. Ayamga, M.; Tekinerdogan, B.; Kassahun, A. Exploring the challenges posed by regulations for the use of drones in agriculture in the African context. Land 2021, 10, 164. [Google Scholar] [CrossRef]
  140. Jeanneret, C.; Rambaldi, G. Drone Governance: A Scan of Policies, Laws and Regulations Governing the Use of Unmanned Aerial Vehicles (UAVs) in 79 ACP Countries; CTA Working Paper; 16/12; CTA: Wageningen, The Netherlands, 2016; Available online: https://hdl.handle.net/10568/90121 (accessed on 14 August 2023).
  141. Maimaitijiang, M.; Sagan, V.; Sidike, P.; Hartling, S.; Esposito, F.; Fritschi, F.B. Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sens. Environ. 2020, 237, 111599. [Google Scholar] [CrossRef]
  142. Maimaitijiang, M.; Sagan, V.; Sidike, P.; Daloye, A.M.; Erkbol, H.; Fritschi, F.B. Crop monitoring using satellite/UAV data fusion and machine learning. Remote Sens. 2020, 12, 1357. [Google Scholar] [CrossRef]
  143. Alvarez-Vanhard, E.; Corpetti, T.; Houet, T. UAV & satellite synergies for optical remote sensing applications: A literature review. Sci. Remote Sens. 2021, 3, 100019. [Google Scholar] [CrossRef]
  144. Cardoso, R.M.; Pereira, T.S.; Facure, M.H.M.; dos Santos, D.M.; Mercante, L.A.; Mattoso, L.H.C.; Correa, D.S. Current progress in plant pathogen detection enabled by nanomaterials-based (bio)sensors. Sens. Actuators Rep. 2022, 4, 100068. [Google Scholar] [CrossRef]
Figure 1. Steps taken for the systematic quantitative literature review (adapted from Moher et al. [30]). N refers to the number of research papers.
Figure 2. The number of research articles published per year on the review topic for the period ending in December 2022. The red line indicates the cumulative total.
Figure 3. Geographical distribution of the reviewed research articles according to the country of study. Note: In four articles, the country of study was either undefined or the data were sourced from the PlantVillage dataset [40].
Figure 4. The proportion of plant species whose diseases were investigated in the research articles reviewed per country of study. (a) Countries with more than one study plant; (b) countries with one study plant. The segments in each ring are proportionate to the number of related research articles reviewed in the systematic quantitative literature review (SQLR).
Figure 5. The distribution of sensor types and plants whose diseases were investigated in the research articles reviewed. The segments in each ring are proportionate to the number of related research articles reviewed in the SQLR. RGB, NIR, and LiDAR stand for red-green-blue, near-infrared, and light detection and ranging, respectively.
Figure 6. The distribution of sensor types and methods used for data analysis. The segments in each ring are proportionate to the number of related research articles reviewed in the SQLR.
Table 1. Number of research articles reviewed per world geographic region.
Region | Count of Articles
Asia | 43
Europe | 23
North America | 19
South America | 9
Africa | 4
Oceania | 1
Table 2. List of plant diseases whose symptoms and/or severity were investigated in the research articles reviewed in the systematic quantitative literature review.
Plant | Disease | Related Reviewed Study
Apple tree | Cedar rust | [64,65]
Apple tree | Scab | [64]
Apple tree | Fire blight | [66]
Areca palm | Yellow leaf disease | [67]
Banana | Yellow sigatoka | [68]
Banana | Xanthomonas wilt of banana | [36]
Banana | Banana bunchy top virus | [36,38]
Banana | Fusarium wilt | [69,70,71]
Bermudagrass | Spring dead spot | [72]
Citrus | Citrus canker | [73]
Citrus | Citrus huanglongbing disease | [74,75,76,77]
Citrus | Phytophthora foot rot | [77]
Citrus | Citrus gummosis disease | [78]
Coffee | Coffee leaf rust | [79,80]
Cotton | Cotton root rot | [81,82]
Eucalyptus | Various leaf diseases | [83]
Grapevine | Grapevine leaf stripe | [84,85,86,87]
Grapevine | Flavescence dorée phytoplasma | [88]
Grapevine | Black rot | [65,87]
Grapevine | Isariopsis leaf spot | [86,87]
Kiwifruit | Kiwifruit decline | [89]
Lettuce | Soft rot | [90]
Maize | Northern leaf blight | [91,92,93]
Maize | Southern corn rust | [94]
Maize | Maize streak virus disease | [35,37]
Maize | Tar spot | [95]
Norway spruce | Needle bladder rust | [96]
Oil palm | Basal stem rot | [97,98,99]
Oilseed rape | Sclerotinia | [100]
Okra | Cercospora leaf spot | [101]
Olive tree | Verticillium wilt | [32]
Olive tree | Xylella fastidiosa | [102,103]
Olive tree | Peacock spot | [104]
Onion | Anthracnose-twister | [105]
Onion | Stemphylium leaf blight | [106]
Opium poppy | Downy mildew | [107]
Paperbark tree | Myrtle rust | [34]
Pear tree | Fire blight | [108]
Peanut | Bacterial wilt | [109]
Pine tree | Pine wilt disease | [110,111,112,113,114]
Pine tree | Red band needle blight | [115]
Potato | Potato late blight | [57,58,59,60]
Potato | Potato early blight | [56]
Potato | Potato Y virus | [61]
Potato | Vascular wilt | [62]
Potato | Soft rot | [61]
Radish | Fusarium wilt | [116,117]
Rice | Sheath blight | [118]
Rice | Bacterial leaf blight | [119]
Rice | Bacterial panicle blight | [119]
Soybean | Target spot | [120,121]
Soybean | Powdery mildew | [120,121]
Squash | Powdery mildew | [122]
Sugar beet | Cercospora leaf spot | [123,124,125,126,127,128]
Sugar beet | Anthracnose | [124,125]
Sugar beet | Alternaria leaf spot | [124,125]
Sugar beet | Beet cyst nematode | [129]
Sugarcane | White leaf phytoplasma | [130]
Switchgrass | Rust disease | [131]
Tea | Anthracnose | [132]
Tomato | Bacterial spot | [133,134,135]
Tomato | Early blight | [133]
Tomato | Late blight | [133]
Tomato | Septoria leaf spot | [133]
Tomato | Tomato mosaic virus | [133]
Tomato | Leaf mold | [133]
Tomato | Target leaf spot | [133,134,135]
Tomato | Tomato yellow leaf curl virus | [133,135]
Watermelon | Gummy stem blight | [63]
Watermelon | Anthracnose | [63]
Watermelon | Fusarium wilt | [63]
Watermelon | Phytophthora fruit rot | [63]
Watermelon | Alternaria leaf spot | [63]
Watermelon | Cucurbit leaf crumple | [63]
Watermelon | Downy mildew | [136]
Wheat | Yellow rust | [42,43,44,45,46,47,48,49,50,51]
Wheat | Leaf rust | [42]
Wheat | Septoria leaf spot | [53]
Wheat | Powdery mildew | [52]
Wheat | Tan spot | [53]
Wheat | Fusarium head blight | [54,55]
Table 3. Methods used to identify plant disease symptoms from UAV imagery and the number of related research articles published between 2013 and 2022.
Method ¹ | Number of Articles (2013–2022)
ANOVA | 5
Clustering analysis | 2
Correlation analysis | 3
Geostatistics/GIS | 3
Machine learning (ML) ² | 43
ML/Deep learning ³ | 23
ML and Deep learning | 2
ML and regression models | 1
ML and statistical comparison | 1
Pixel-wise comparison | 4
Regression models | 9
Threshold-based colour analysis | 2
Vegetation health mapping | 2
Visual analysis | 3
¹ ANOVA: analysis of variance. GIS: geographic information system. ² Examples of ML used include support vector machine, random forest, K-nearest neighbors, Naive Bayes, and artificial neural networks. ³ Although deep learning is a subset of machine learning, the differentiation made here aims to highlight its relative importance among ML-based approaches over the years.
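To illustrate the kind of machine-learning workflow tallied under footnote ², the following minimal Python sketch classifies plots as healthy or diseased from vegetation indices derived from UAV multispectral bands. It is not drawn from any of the reviewed studies: the band set, index choices, synthetic reflectance values, and the scikit-learn random forest estimator are illustrative assumptions only.

```python
# Minimal, self-contained sketch (not from the reviewed studies): it illustrates the
# general ML workflow counted in Table 3, in which per-plot features derived from UAV
# imagery (here, vegetation indices) are fed to a classifier such as a random forest.
# All data are synthetic; band names and index choices are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)
n_plots = 600

# Synthetic per-plot reflectance for four multispectral bands: green, red, red-edge, NIR.
healthy = rng.normal(loc=[0.10, 0.06, 0.25, 0.45], scale=0.02, size=(n_plots // 2, 4))
diseased = rng.normal(loc=[0.12, 0.10, 0.20, 0.32], scale=0.02, size=(n_plots // 2, 4))
reflectance = np.vstack([healthy, diseased]).clip(0.01, 1.0)
labels = np.array([0] * (n_plots // 2) + [1] * (n_plots // 2))  # 0 = healthy, 1 = diseased

green, red, red_edge, nir = reflectance.T
ndvi = (nir - red) / (nir + red)            # normalized difference vegetation index
ndre = (nir - red_edge) / (nir + red_edge)  # red-edge index, often sensitive to early stress
gndvi = (nir - green) / (nir + green)       # green NDVI
features = np.column_stack([ndvi, ndre, gndvi])

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.3, stratify=labels, random_state=0
)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test), target_names=["healthy", "diseased"]))
```

In practice, such features would be extracted from orthomosaic plots or individual plants segmented from the UAV imagery, and the deep-learning entries in Table 3 typically replace hand-crafted indices with convolutional networks applied directly to image patches.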