Special Issue "UAVs for Vegetation Monitoring"

A special issue of Remote Sensing (ISSN 2072-4292). This special issue belongs to the section "Remote Sensing in Agriculture and Vegetation".

Deadline for manuscript submissions: 31 December 2020.

Special Issue Editors

Dr. Ana de Castro Megías
Guest Editor
IMAPING Group (Remote Sensing for Precision Agriculture), Crop Protection, Institute for Sustainable Agriculture – CSIC (Spanish National Research Council), Campus Alameda del Obispo, 14004 Córdoba, Spain
Interests: remote sensing; crop monitoring; precision agriculture; weeds; plant disease; crop phenotyping; site-specific weed management; object-based image analysis (OBIA)
Dr. Yeyin Shi
Guest Editor
Biological Systems Engineering, University of Nebraska-Lincoln, 3605 Fair Street, Lincoln, NE 68583, USA
Interests: remote sensing; UAVs; data analysis; machine learning; abiotic and biotic stress monitoring and management
Dr. José M. Peña
Guest Editor
Institute of Agricultural Sciences, CSIC, Plant Protection Department, 28006 Madrid, Spain
Interests: precision agriculture; UAV and satellite remote sensing; object-based image analysis (OBIA); digitization and sensors in agriculture; crop protection; weed mapping; sustainable agriculture
Prof. Dr. Joe Maja
Guest Editor
Agricultural Sciences Department, Edisto Research and Education Center, Clemson University, 64 Research Road, Blackville, SC 29817, USA
Interests: sensors; automation; emerging technologies; small unmanned aircraft systems (sUAS) and robotics

Special Issue Information

Dear Colleagues,

Global crop production faces a major sustainability challenge in the context of a rapidly growing world population and the gradual depletion of natural resources. Remote sensing plays a fundamental role in changing the plant production model through the development of new technologies (robots, UAVs, sensors), making products more profitable, more competitive, and more sustainable. Among these advances, unmanned aerial vehicles (UAVs) equipped with perception systems have proven well suited to the timely assessment and monitoring of vegetation. They can be operated at low altitudes to provide ultra-high-spatial-resolution imagery, they offer great flexibility in flight scheduling so that data can be collected at critical moments, and they enable the generation of digital surface models (DSMs) from highly overlapping images using photo-reconstruction or computer-vision techniques. It is therefore essential to advance research on the technical configuration of UAVs, as well as on the processing and analysis of UAV imagery of agricultural and forest scenarios, in order to strengthen our knowledge of these ecosystems and thereby improve farmers' decision-making.
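
Where DSMs are mentioned above, a common downstream step is to subtract a terrain model to obtain canopy height. A minimal sketch, assuming two already co-registered GeoTIFFs (the file names are hypothetical):

```python
# Sketch: canopy height model (CHM) from UAV photogrammetry products.
# "dsm.tif", "dtm.tif", and "chm.tif" are hypothetical file names.
import numpy as np
import rasterio

with rasterio.open("dsm.tif") as src:      # digital surface model (top of canopy)
    dsm = src.read(1).astype("float32")
    profile = src.profile
with rasterio.open("dtm.tif") as src:      # digital terrain model (bare ground)
    dtm = src.read(1).astype("float32")

chm = np.clip(dsm - dtm, 0.0, None)        # per-pixel canopy height, negatives clipped

profile.update(dtype="float32", count=1)
with rasterio.open("chm.tif", "w", **profile) as dst:
    dst.write(chm, 1)
```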

We encourage all researchers involved in the development and application of UAVs to present their most recent findings on promising developments related to vegetation monitoring. This Special Issue welcomes original and innovative papers demonstrating the use of UAVs for remote sensing applications in agriculture, forestry, and natural resource management. The selection of papers for publication will depend on the quality and rigor of both the research and the manuscript. Specific topics include, but are not limited to:

  • UAV configuration and specifications for forest or agricultural applications;
  • Object- or pixel-based image analysis approaches for vegetation monitoring;
  • Artificial intelligence-based image-processing approaches;
  • Integration of UAV images with ground-based datasets or other remote and proximal measurements;
  • Biotic (weeds, disease) and abiotic (water and nutrient deficiencies) stress factors—sensing and modeling;
  • Crop yield estimation or prediction;
  • High-throughput phenotyping;
  • UAV-based prescription map development for site-specific management;
  • Precision agriculture applications;
  • UAV image pre-processing for radiometric, spectral and spatial calibration, and mosaicking;
  • Development, integration, and testing of new and emerging sensors and technologies for UAV-based crop management.

Dr. Ana de Castro Megías
Dr. Yeyin Shi
Dr. José M. Peña
Prof. Dr. Joe Maja
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Remote Sensing is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2200 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • UAV
  • crop mapping
  • image analysis
  • multispectral
  • hyperspectral
  • thermal
  • reflectance
  • classification
  • digital surface model
  • precision agriculture

Published Papers (10 papers)


Research


Open Access Article (Feature Paper)
Comparison of Object Detection and Patch-Based Classification Deep Learning Models on Mid- to Late-Season Weed Detection in UAV Imagery
Remote Sens. 2020, 12(13), 2136; https://doi.org/10.3390/rs12132136 - 03 Jul 2020
Abstract
Mid- to late-season weeds that escape routine early-season weed management threaten agricultural production by producing a large number of seeds that persist over several future growing seasons. Rapid and accurate detection of weed patches in the field is the first step of site-specific weed management. In this study, object-detection-based convolutional neural network models were trained and evaluated on low-altitude unmanned aerial vehicle (UAV) imagery for mid- to late-season weed detection in soybean fields. Two object detection models, Faster RCNN and the Single Shot Detector (SSD), were evaluated and compared in terms of weed detection performance, using mean Intersection over Union (IoU), and inference speed. The Faster RCNN model with 200 box proposals performed comparably to the SSD model in precision, recall, F1 score, and IoU, with a similar inference time. The precision, recall, F1 score, and IoU were 0.65, 0.68, 0.66, and 0.85 for Faster RCNN with 200 proposals, and 0.66, 0.68, 0.67, and 0.84 for SSD, respectively. However, the optimal confidence threshold of the SSD model was much lower than that of the Faster RCNN model, which indicates that SSD may generalize less well than Faster RCNN for mid- to late-season weed detection in soybean fields using UAV imagery. The object detection models were also compared with a patch-based CNN model. Faster RCNN yielded better weed detection performance than the patch-based CNN both with and without patch overlap; its inference time was similar to that of the patch-based CNN without overlap but significantly shorter than that of the patch-based CNN with overlap. Hence, Faster RCNN was the best of the compared models in terms of both weed detection performance and inference time. This work is important for understanding the potential of, and identifying suitable algorithms for, on-farm, near-real-time weed detection and management.
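
The precision/recall/F1 figures above come from matching predicted boxes to ground-truth boxes at an IoU threshold. A minimal sketch of that bookkeeping (greedy one-to-one matching; the box format and threshold are generic assumptions, not the paper's exact protocol):

```python
# Sketch: precision/recall/F1 for detections matched to ground truth by IoU.
# Boxes are (x1, y1, x2, y2); predictions assumed sorted by confidence.

def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def detection_metrics(pred_boxes, gt_boxes, iou_thr=0.5):
    matched, tp = set(), 0
    for p in pred_boxes:
        best_j, best_iou = -1, iou_thr
        for j, g in enumerate(gt_boxes):
            if j not in matched and iou(p, g) >= best_iou:
                best_j, best_iou = j, iou(p, g)
        if best_j >= 0:                      # matched a so-far-unclaimed GT box
            matched.add(best_j)
            tp += 1
    fp = len(pred_boxes) - tp                # unmatched predictions
    fn = len(gt_boxes) - len(matched)        # missed ground-truth boxes
    precision = tp / (tp + fp + 1e-9)
    recall = tp / (tp + fn + 1e-9)
    f1 = 2 * precision * recall / (precision + recall + 1e-9)
    return precision, recall, f1
```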

Open Access Article
Assessing the Operation Parameters of a Low-altitude UAV for the Collection of NDVI Values Over a Paddy Rice Field
Remote Sens. 2020, 12(11), 1850; https://doi.org/10.3390/rs12111850 - 08 Jun 2020
Abstract
Unmanned aerial vehicle (UAV) remote sensing platforms allow normalized difference vegetation index (NDVI) values to be mapped at relatively high resolution, enabling an unprecedented ability to evaluate the influence of operation parameters on the quality of the acquired data. To better understand these effects, we comprehensively evaluated the pixel-scale effects of the solar zenith angle (SZA), the time of day (TOD), the flight altitude (FA), and the growth level of paddy rice on UAV-acquired NDVI values. Our results show that (1) there was an inverse relationship between the FA (≤100 m) and the mean NDVI values; (2) TOD and SZA had a greater impact on UAV NDVI values than the FA and the growth level; and (3) better rice growth levels, as measured by NDVI, reduced the effects of the FA, TOD, and SZA. We expect that our results can be used to better plan flight campaigns that aim to collect NDVI values over paddy rice fields.
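
For reference, the NDVI mapped here is the standard two-band ratio. A minimal sketch of computing it per pixel and comparing mosaic means across flight altitudes (the random arrays are stand-ins for real orthomosaics):

```python
# Sketch: per-pixel NDVI from near-infrared and red reflectance bands.
import numpy as np

def ndvi(nir, red):
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + 1e-9)   # small epsilon avoids divide-by-zero

# Stand-in mosaics for flights at two altitudes (real data would be co-registered rasters).
rng = np.random.default_rng(0)
low  = ndvi(rng.random((100, 100)), rng.random((100, 100)))
high = ndvi(rng.random((100, 100)), rng.random((100, 100)))
print(low.mean(), high.mean())                # compare mean NDVI per flight altitude
```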

Open Access Article
Closing the Phenotyping Gap: High Resolution UAV Time Series for Soybean Growth Analysis Provides Objective Data from Field Trials
Remote Sens. 2020, 12(10), 1644; https://doi.org/10.3390/rs12101644 - 20 May 2020
Abstract
Close-range remote sensing approaches can be used for high-throughput in-field phenotyping in the context of plant breeding and biological research. Data on canopy cover (CC) and canopy height (CH), and their temporal changes throughout the growing season, can yield information about crop growth and performance. In the present study, sigmoid models were fitted to multi-temporal CC and CH data obtained from drone-captured RGB imagery for a broad set of soybean genotypes. The Gompertz and Beta functions were used to fit the CC and CH data, respectively. Overall, 90.4% of the fits for CC and 99.4% of the fits for CH reached an adjusted R² > 0.70, demonstrating the good performance of the chosen models. Using these growth curves, parameters including maximum absolute growth rate, early vigor, maximum height, and senescence were calculated for a collection of soybean genotypes. This information was also used to estimate seed yield and maturity (R8 stage) (adjusted R² = 0.51 and 0.82, respectively). Combinations of parameter values were tested to identify genotypes with interesting traits. This integrative approach of fitting a curve to a multi-temporal dataset yielded biologically interpretable parameters that were informative for relevant traits.
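
As a concrete illustration of the curve-fitting step, here is a minimal sketch of fitting a Gompertz function to a CC time series with SciPy. The parameterization is one common form, and the toy data are illustrative; neither is necessarily the paper's.

```python
# Sketch: Gompertz fit to a canopy cover (CC) time series.
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, A, k, t0):
    # A: asymptotic maximum CC; k: growth rate; t0: inflection time
    return A * np.exp(-np.exp(-k * (t - t0)))

days = np.array([10, 20, 30, 40, 50, 60, 75, 90], float)  # days after sowing (toy)
cc   = np.array([2, 8, 25, 55, 78, 90, 95, 96], float)    # canopy cover, % (toy)

popt, _ = curve_fit(gompertz, days, cc, p0=[100, 0.1, 40])
A, k, t0 = popt
max_agr = A * k / np.e    # maximum absolute growth rate, reached at the inflection
print(A, k, t0, max_agr)
```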

Open Access Article
Segmenting Purple Rapeseed Leaves in the Field from UAV RGB Imagery Using Deep Learning as an Auxiliary Means for Nitrogen Stress Detection
Remote Sens. 2020, 12(9), 1403; https://doi.org/10.3390/rs12091403 - 29 Apr 2020
Abstract
Crop leaf purpling is a common phenotypic change when plants are subject to certain biotic and abiotic stresses during their growth. Extracting purple leaves makes it possible to monitor crop stress as an apparent trait, and it also contributes to crop phenotype analysis, monitoring, and yield estimation. Because of the complexity of the field environment and differences in size, shape, texture, and color gradation among leaves, purple leaf segmentation is difficult. In this study, we used a U-Net model for pixel-level segmentation of purple rapeseed leaves during the seedling stage from unmanned aerial vehicle (UAV) RGB imagery. Given the limited spatial resolution of the UAV-acquired rapeseed images and the small object size, the input patch size was carefully selected. Experiments showed that the U-Net model with a patch size of 256 × 256 pixels obtained better and more stable results, with an F-measure of 90.29% and an Intersection over Union (IoU) of 82.41%. To further explore the influence of image spatial resolution, we evaluated the performance of the U-Net model with different image resolutions and patch sizes. The U-Net model performed better than four other commonly used image segmentation approaches: support vector machine, random forest, HSeg, and SegNet. Moreover, regression analysis was performed between the purple rapeseed leaf ratios and the measured N content. The negative exponential model had a coefficient of determination (R²) of 0.858, thereby explaining much of the variation in rapeseed leaf purpling in this study. This purple leaf phenotype could serve as an auxiliary means of monitoring crop growth status so that crops can be managed in a timely and effective manner when nitrogen stress occurs. The results demonstrate that the U-Net model is a robust method for purple rapeseed leaf segmentation and that the accurate segmentation of purple leaves provides a new method for crop nitrogen stress monitoring.
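
The patch-size choice discussed above implies a tile-and-stitch workflow around the network. A minimal sketch of that plumbing (non-overlapping 256 × 256 tiles with reflect padding; the model call itself is omitted, and the helper names are ours, not the paper's):

```python
# Sketch: tile a UAV orthomosaic into fixed-size patches for per-patch
# segmentation, then stitch the predicted masks back together.
import numpy as np

def tile(image, size=256):
    h, w = image.shape[:2]
    ph, pw = -h % size, -w % size                  # pad up to a multiple of size
    padded = np.pad(image, ((0, ph), (0, pw), (0, 0)), mode="reflect")
    tiles, coords = [], []
    for y in range(0, padded.shape[0], size):
        for x in range(0, padded.shape[1], size):
            tiles.append(padded[y:y + size, x:x + size])
            coords.append((y, x))
    return np.stack(tiles), coords, (h, w)

def stitch(masks, coords, orig_shape, size=256):
    h, w = orig_shape
    out = np.zeros((h + (-h % size), w + (-w % size)), masks.dtype)
    for m, (y, x) in zip(masks, coords):
        out[y:y + size, x:x + size] = m
    return out[:h, :w]                             # crop the padding back off
```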

Open Access Article
Influence of Model Grid Size on the Estimation of Surface Fluxes Using the Two Source Energy Balance Model and sUAS Imagery in Vineyards
Remote Sens. 2020, 12(3), 342; https://doi.org/10.3390/rs12030342 - 21 Jan 2020
Abstract
Evapotranspiration (ET) is a key variable for hydrology and irrigation water management, with particular importance in drought-stricken regions of the western US. This is especially true for California, which grows much of the high-value perennial crops in the US. The advent of small unmanned aerial systems (sUAS) with sensor technology similar to satellite platforms allows for the estimation of high-resolution ET at the plant-spacing scale for individual fields. However, while multiple efforts have been made to estimate ET from sUAS products, the sensitivity of ET models to different model grid sizes/resolutions in complex canopies, such as vineyards, is still unknown. The variability of row spacing, canopy structure, and distance between fields makes this information necessary because of the additional complexity of processing individual fields; processing the entire image at a fixed resolution that is potentially larger than the plant-row separation is therefore more efficient. From a computational perspective, there would also be an advantage to running models at much coarser resolutions than the very fine native pixel size of sUAS imagery for operational applications. In this study, the Two-Source Energy Balance model with a dual temperature (TSEB2T), which uses remotely sensed soil/substrate and canopy temperatures from sUAS imagery, was used to estimate ET and to identify the impact of spatial domain scale under different vine phenological conditions. The analysis relies upon high-resolution imagery collected over multiple years and dates by the Utah State University AggieAir™ sUAS program over a commercial vineyard located near Lodi, California, as part of the USDA Agricultural Research Service Grape Remote Sensing Atmospheric Profile and Evapotranspiration eXperiment (GRAPEX). The original spectral and thermal imagery from the sUAS were at 10 cm and 60 cm per pixel, respectively, and multiple spatial domain scales (3.6, 7.2, 14.4, and 30 m) were evaluated and compared against eddy covariance (EC) measurements. Results indicated that the TSEB2T estimates of net radiation (Rn) and soil heat flux (G) are only slightly affected by spatial resolution, while the sensible and latent heat fluxes (H and LE, respectively) are significantly affected by coarse grid sizes, with H overestimated and LE underestimated, particularly at the Landsat scale (30 m). This is attributed to the non-linear relationship between land surface temperature (LST) and the normalized difference vegetation index (NDVI) at coarse model resolutions. Another predominant reason for the LE reduction in TSEB2T was the decrease in the aerodynamic resistance (Ra), which is a function of the friction velocity (u*), which in turn varies with mean canopy height and roughness length. While a small increase in grid size can be tolerated, the increase should be limited to less than twice the smallest row spacing present in the sUAS imagery. The results also indicated that the mean LE at field scale is reduced by 10% to 20% at coarser resolutions, while the within-field variability in LE values decreased significantly at the larger grid sizes, ranging between approximately 15% and 45%. This implies that, while the field-scale values of LE are fairly reliable at larger grid sizes, the loss of within-field variability limits their use for precision agriculture applications.
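
The aggregation bias described above is a Jensen-type effect: H is a non-linear function of land surface temperature, so computing H from block-averaged temperatures differs from averaging fine-scale H. A toy sketch of that effect, with a deliberately simplified flux formula that is a stand-in, not the TSEB2T formulation:

```python
# Sketch: why coarsening the grid biases H (and hence residual LE = Rn - G - H).
import numpy as np

rng = np.random.default_rng(0)
lst = rng.uniform(295, 320, (120, 120))   # fine-scale LST, K (stand-in values)
ta = 298.0                                # air temperature, K

def h_flux(ts):
    # Toy sensible heat flux: rho * cp * (Ts - Ta) / Ra, with a resistance term
    # that depends non-linearly on surface temperature.
    ra = 50.0 / (1.0 + 0.05 * np.maximum(ts - ta, 0.0))   # aerodynamic resistance, s m-1
    return 1.2 * 1004.0 * (ts - ta) / ra

h_fine   = h_flux(lst).mean()   # average of H computed at fine scale
h_coarse = h_flux(lst.mean())   # H computed from the block-averaged temperature
print(h_fine, h_coarse)         # the two differ: the aggregation bias
```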

Open Access Article
A Comparative Study of RGB and Multispectral Sensor-Based Cotton Canopy Cover Modelling Using Multi-Temporal UAS Data
Remote Sens. 2019, 11(23), 2757; https://doi.org/10.3390/rs11232757 - 23 Nov 2019
Cited by 1
Abstract
This study compares multispectral and RGB (red, green, and blue) sensor-based cotton canopy cover modelling using multi-temporal unmanned aircraft systems (UAS) imagery. Additionally, a canopy cover model using an RGB sensor is proposed that combines an RGB-based vegetation index with morphological closing. The field experiment was established in 2017 and 2018, with the whole study area divided into grids of approximately 1 × 1 m. Grid-wise percentage canopy cover was computed using both RGB and multispectral sensors over multiple flights during the cotton growing season. First, normalized difference vegetation index (NDVI)-based canopy cover was estimated and used as the reference for the RGB-based canopy cover estimations. To test the maximum achievable performance of RGB-based canopy cover estimation, a pixel-wise classification method was implemented. Four RGB-based canopy cover estimation methods were then implemented: Canopeo, the excessive greenness index, the modified red green vegetation index, and the red green blue vegetation index. The multispectral sensor-based canopy cover model proved more stable and accurate, whereas the RGB-based canopy cover model was unstable and failed to identify canopy when cotton leaves changed color after canopy maturation. Applying a morphological closing operation after thresholding significantly improved the RGB-based canopy cover modeling. The red green blue vegetation index proved the most efficient vegetation index for extracting canopy cover, with a very low average root mean square error (2.94% for the 2017 dataset and 2.82% for the 2018 dataset) with respect to the multispectral sensor-based canopy cover estimation. The proposed canopy cover model provides an affordable alternative to multispectral sensors, which are more sensitive and expensive.
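
The index-threshold-plus-closing pipeline described above can be sketched in a few lines. The excess green index and the threshold value used here are generic stand-ins, not the paper's specific choices:

```python
# Sketch: RGB-based canopy cover via vegetation-index thresholding followed by
# morphological closing, in the spirit of the pipeline described above.
import numpy as np
from scipy import ndimage

def canopy_cover_pct(rgb, thr=0.1, closing_size=5):
    r, g, b = [rgb[..., i].astype(float) / 255.0 for i in range(3)]
    exg = 2 * g - r - b                            # excess green index (stand-in VI)
    mask = exg > thr                               # binary vegetation mask
    mask = ndimage.binary_closing(                 # fill small holes in the canopy
        mask, structure=np.ones((closing_size, closing_size)))
    return 100.0 * mask.mean()                     # % of grid pixels flagged as canopy
```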

Open Access Article
Using Vegetation Indices and a UAV Imaging Platform to Quantify the Density of Vegetation Ground Cover in Olive Groves (Olea europaea L.) in Southern Spain
Remote Sens. 2019, 11(21), 2564; https://doi.org/10.3390/rs11212564 - 01 Nov 2019
Cited by 2
Abstract
In olive groves, vegetation ground cover (VGC) plays an important ecological role. The EU Common Agricultural Policy, through cross-compliance, acknowledges the importance of this factor, but, to determine the real impact of VGC, it must first be quantified. Accordingly, in the present study, eleven vegetation indices (VIs) were applied to quantify the density of VGC in olive groves (Olea europaea L.) from high-spatial-resolution (10–12 cm) multispectral images obtained by an unmanned aerial vehicle (UAV). The fieldwork was conducted in early spring in a Mediterranean mountain olive grove in southern Spain presenting various VGC densities. A five-step method was applied: (1) generate image mosaics using UAV technology; (2) apply the VIs; (3) quantify VGC density by means of sampling plots (ground truth); (4) calculate the mean reflectance of the spectral bands and of the VIs in each sampling plot; and (5) quantify VGC density according to the VIs. The most sensitive index was IRVI, which accounted for 82% (p < 0.001) of the variability of VGC density. The capability of the VIs to differentiate VGC densities increased with the cover interval range. RVI most accurately distinguished VGC densities > 80% in a cover interval range of 10% (p < 0.001), while IRVI was most accurate for VGC densities < 30% in a cover interval range of 15% (p < 0.01). IRVI, NRVI, NDVI, GNDVI, and SAVI differentiated the complete series of VGC densities when the cover interval range was 30% (p < 0.001 and p < 0.05).
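
Several of the indices compared above are simple red/NIR band combinations. A minimal sketch of computing a few of them from reflectance arrays (the formulas follow their common definitions; the paper's exact band choices may differ):

```python
# Sketch: common red/NIR vegetation indices from reflectance bands.
import numpy as np

def indices(nir, red, L=0.5, eps=1e-9):
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    rvi  = nir / (red + eps)                            # ratio vegetation index
    ndvi = (nir - red) / (nir + red + eps)              # normalized difference VI
    nrvi = (rvi - 1) / (rvi + 1)                        # normalized RVI
    savi = (1 + L) * (nir - red) / (nir + red + L)      # soil-adjusted VI (L = 0.5)
    return {"RVI": rvi, "NDVI": ndvi, "NRVI": nrvi, "SAVI": savi}
```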

Open Access Article
Estimating and Examining the Sensitivity of Different Vegetation Indices to Fractions of Vegetation Cover at Different Scaling Grids for Early Stage Acacia Plantation Forests Using a Fixed-Wing UAS
Remote Sens. 2019, 11(15), 1816; https://doi.org/10.3390/rs11151816 - 03 Aug 2019
Cited by 3
Abstract
Understanding land conditions, and especially green vegetation cover, is important for monitoring ecosystem dynamics. The fraction of vegetation cover (FVC) is a key variable that can be used to observe vegetation cover trends. Conventionally, satellite data are utilized to compute these variables, although in regions such as the tropics frequent cloud coverage can limit the amount of available observation information. Unmanned aerial systems (UASs) have become increasingly prominent in recent research; they can remotely sense using the same methods as satellites but at a lower altitude, are not limited by clouds, and have a much higher resolution. This study utilizes a UAS to determine emerging trends for FVC estimates at an industrial plantation site in Indonesia, which grows fast-growing Acacia trees that can rapidly change the land conditions. First, the UAS was used to collect high-resolution RGB imagery and multispectral images of the study area. The data were used to develop general land use/land cover (LULC) information for the site. Multispectral data were converted to various vegetation indices (VIs), and within the determined resolution grids (5, 10, 30, and 60 m), the fraction of each LULC type was analyzed for its correlation with the different VIs. Finally, a simple empirical model was developed to estimate the FVC from the UAS data. The results show correlations between the FVC (acacias) and the different VIs ranging from R² = 0.66–0.74, 0.76–0.80, 0.84–0.89, and 0.93–0.94 for the 5, 10, 30, and 60 m grid resolutions, respectively. This study indicates that UAS-based FVC estimation can be used for observing fast-growing Acacia trees at a fine-scale resolution, which may assist current restoration programs in Indonesia.
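
A simple empirical FVC model of the kind mentioned above can be as plain as an ordinary least-squares fit of grid-scale FVC against a VI. A minimal sketch with illustrative numbers (not the paper's data):

```python
# Sketch: empirical FVC model as a linear fit of grid-scale FVC against a VI.
import numpy as np

vi  = np.array([0.21, 0.35, 0.42, 0.55, 0.63, 0.71, 0.80])  # grid-mean VI (toy)
fvc = np.array([0.10, 0.28, 0.35, 0.52, 0.61, 0.72, 0.85])  # observed FVC (toy)

slope, intercept = np.polyfit(vi, fvc, 1)
pred = slope * vi + intercept
r2 = 1 - np.sum((fvc - pred) ** 2) / np.sum((fvc - fvc.mean()) ** 2)
print(f"FVC ~ {slope:.2f} * VI + {intercept:.2f}, R2 = {r2:.2f}")
```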

Other


Open Access Letter
Recognition of Banana Fusarium Wilt Based on UAV Remote Sensing
Remote Sens. 2020, 12(6), 938; https://doi.org/10.3390/rs12060938 - 13 Mar 2020
Cited by 2
Abstract
Fusarium wilt (Panama disease) of banana currently threatens banana production areas worldwide. Timely monitoring of Fusarium wilt is important for treating the disease and adjusting banana planting methods. The objective of this study was to establish a method for identifying banana regions infested or not infested with Fusarium wilt using unmanned aerial vehicle (UAV)-based multispectral imagery. Two experiments were conducted. In experiment 1, 120 sample plots were surveyed, of which 75% were used as the modeling dataset for model fitting and the remainder as validation dataset 1 (VD1). In experiment 2, 35 sample plots were surveyed and used as validation dataset 2 (VD2). A UAV equipped with a five-band multispectral camera was used to capture the multispectral imagery. Eight vegetation indices (VIs) related to pigment absorption and plant growth changes were chosen to characterize the biophysical and biochemical properties of the plants. The binary logistic regression (BLR) method was used to assess the spatial relationships between the VIs and the plants infested or not infested with Fusarium wilt. The results showed that banana Fusarium wilt can be easily identified using VIs including the green chlorophyll index (CIgreen), the red-edge chlorophyll index (CIRE), the normalized difference vegetation index (NDVI), and the normalized difference red-edge index (NDRE). The fitting overall accuracies of the models were greater than 80%. Among the investigated VIs, CIRE exhibited the best performance for both VD1 (OA = 91.7%, Kappa = 0.83) and VD2 (OA = 80.0%, Kappa = 0.59). For the same type of VI, indices including a red-edge band performed better than those excluding one. A simulation of imagery with different spatial resolutions (0.5-m, 1-m, 2-m, 5-m, and 10-m) showed that good identification accuracy was obtained when the resolution was finer than 2 m; as the resolution coarsened, the identification accuracy of Fusarium wilt decreased. The findings indicate that UAV-based remote sensing with a red-edge band is suitable for identifying banana Fusarium wilt, and the results provide guidance for detecting the disease and adjusting crop planting.
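
As an illustration of the BLR step, here is a minimal sketch fitting a logistic model to a single VI feature and reporting overall accuracy and Cohen's kappa. The data are synthetic, and the CIRE definition used in the comment (NIR/red-edge − 1) is its common form, not necessarily the paper's exact implementation:

```python
# Sketch: binary logistic regression on a vegetation-index feature
# (infested vs. not infested), mirroring the BLR approach described above.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(1)
n = 120
# Stand-in CIRE values (commonly CIRE = NIR / red-edge - 1) for the two classes.
cire_healthy  = rng.normal(1.8, 0.3, n // 2)
cire_infested = rng.normal(1.1, 0.3, n // 2)
X = np.concatenate([cire_healthy, cire_infested]).reshape(-1, 1)
y = np.array([0] * (n // 2) + [1] * (n // 2))   # 0 = not infested, 1 = infested

model = LogisticRegression().fit(X, y)
kappa = cohen_kappa_score(y, model.predict(X))
print(model.score(X, y), kappa)                 # overall accuracy and kappa
```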

Open Access Letter
Watson on the Farm: Using Cloud-Based Artificial Intelligence to Identify Early Indicators of Water Stress
Remote Sens. 2019, 11(22), 2645; https://doi.org/10.3390/rs11222645 - 13 Nov 2019
Cited by 1
Abstract
As demand for freshwater increases while supply remains stagnant, the critical need for sustainable water use in agriculture has led the EPA Strategic Plan to call for new technologies that can optimize water allocation in real time. This work assesses the use of cloud-based artificial intelligence to detect early indicators of water stress across six container-grown ornamental shrub species. Near-infrared images were previously collected with modified Canon and MAPIR Survey II cameras deployed via a small unmanned aircraft system (sUAS) at an altitude of 30 meters. Cropped images of plants in no-, low-, and high-water-stress conditions were split into four-fold cross-validation sets and used to train models through IBM Watson's Visual Recognition service. Despite constraints such as a small sample size (36 plants, 150 images) and low image resolution (150 × 150 pixels per plant), the Watson-generated models were able to detect indicators of stress after 48 hours of water deprivation, with a significant to marginally significant degree of separation in four of the five species tested (p < 0.10). Two models were also able to detect indicators of water stress after only 24 hours, with models trained on images of as few as eight water-stressed Buddleia plants achieving an average area under the curve (AUC) of 0.9884 across four folds. The ease of pre-processing, the minimal amount of training data required, and the outsourced computation make cloud-based artificial intelligence services such as IBM Watson Visual Recognition an attractive tool for agricultural analytics. Cloud-based artificial intelligence can be combined with technologies such as sUAS and spectral imaging to help crop producers identify deficient irrigation strategies and intervene before crop value is diminished. When brought to scale, frameworks such as these can drive responsive irrigation systems that monitor crop status in real time and maximize sustainable water use.
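
The four-fold evaluation above generalizes to any classifier. A minimal sketch of per-fold AUC averaging, using a local scikit-learn model as a generic stand-in for the cloud service (synthetic features and labels, not the study's data):

```python
# Sketch: four-fold cross-validation with per-fold AUC, averaged across folds.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedKFold

rng = np.random.default_rng(2)
X = rng.normal(size=(150, 64))    # stand-in image features (150 images)
y = rng.integers(0, 2, 150)       # stand-in labels: stressed vs. not stressed

aucs = []
for train, test in StratifiedKFold(n_splits=4, shuffle=True,
                                   random_state=0).split(X, y):
    clf = RandomForestClassifier(random_state=0).fit(X[train], y[train])
    aucs.append(roc_auc_score(y[test], clf.predict_proba(X[test])[:, 1]))
print(np.mean(aucs))              # average AUC across the four folds
```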
