
Estimation of Crop Phenotyping Traits using Unmanned Ground Vehicle and Unmanned Aerial Vehicle Imagery

A special issue of Remote Sensing (ISSN 2072-4292). This special issue belongs to the section "Remote Sensing in Agriculture and Vegetation".

Deadline for manuscript submissions: closed (31 December 2018) | Viewed by 226177

Special Issue Editors


Guest Editor
INRA, UMR-EMMAH, UMT-CAPTE, 228 Route de l'aérodrome, CS 40509, F-84914 Avignon, France
Interests: hyperspectral and multispectral remote sensing; crop models; data assimilation; crop phenotyping traits; UAV-based sensors; UGV-based sensors; precision farming

Guest Editor
National Engineering Research Center for Information Technology in Agriculture (NERCITA), Beijing Research Center for Information Technology in Agriculture, Beijing Academy of Agriculture and Forestry Sciences, 11 Middle Road, Haidian District, Beijing 100097, China
Interests: remote sensing; crop growth model; crop yield and quality predicting; precision agriculture; agronomy

Guest Editor
Institute of Geomatics, University of Natural Resources and Life Sciences, 1090 Vienna, Austria
Interests: remote sensing of vegetation with focus on time series analysis and use of physically based radiative transfer models for mapping biochemical and biophysical traits

Special Issue Information

Dear Colleagues,

To meet global food security challenges under changing climatic scenarios, enhancing crop yield under limited resources is of paramount importance. Accurate and precise measurement of crop phenotyping traits plays a key role in harnessing the potential of genomic resources for the genetic improvement of crop yield. In traditional crop phenotyping, traits are assessed manually and evaluated with statistical methods, which demands considerable human effort, time, and resources. With the rapid development of Unmanned Ground Vehicles (UGVs), Unmanned Aerial Vehicles (UAVs), sensor technologies, and image-processing algorithms, integrated UGV/UAV platforms for automatic crop phenotyping are being deployed to overcome the limitations of manual techniques. These high-throughput, non-invasive crop phenotyping platforms have been used to estimate LAI, canopy cover, nitrogen, chlorophyll, biomass, plant structure, plant density, phenology, leaf health, canopy/leaf temperature, and the physiological state of the photosynthetic machinery under different stress conditions. They have matured to the point of supporting genomics-enabled improvement and addressing the need for precise and efficient phenotyping of crop plants, and they will also help find solutions to the major problems currently limiting crop production.

This Special Issue focuses on the latest innovative research in remote sensing technology, sensor technologies, and imagery algorithm development and application, specifically addressing the estimation of crop phenotyping traits from UGV and UAV imagery. The list below provides a general (but not exhaustive) overview of the topics solicited for this Special Issue:

  • Application of UGV and UAV platforms for estimating crop phenotyping traits
  • Imagery algorithms (data fusion, segmentation, classification, machine learning, deep learning, etc.) for estimating crop phenotyping traits
  • Application of sensors (RGB, multispectral, hyperspectral, thermal, LiDAR, fluorescence, etc.) for crop phenotyping traits
  • Combination of data from different sensors to improve the estimation accuracy of crop phenotyping traits
  • Data assimilation of multisource images into two- or three-dimensional crop models
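As a minimal illustration of the imagery-algorithm topic above, the sketch below computes the Normalized Difference Vegetation Index (NDVI) from near-infrared and red reflectance arrays — one of the most common multispectral indicators used for crop phenotyping traits. The NumPy implementation and the synthetic reflectance values are assumptions for illustration, not tied to any particular submission.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index from NIR and red reflectance bands."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    # eps guards against division by zero over dark pixels.
    return (nir - red) / (nir + red + eps)

# Synthetic 2x2 reflectance patches: dense canopy (top row) vs. bare soil (bottom row).
nir = np.array([[0.50, 0.48], [0.10, 0.12]])
red = np.array([[0.05, 0.06], [0.08, 0.09]])
vi = ndvi(nir, red)
```

For a dense canopy pixel (high NIR, low red) the index approaches 1, while bare soil stays near 0 — the contrast most phenotyping pipelines exploit.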

Dr. Xiuliang Jin
Dr. Zhenhai Li
Prof. Dr. Clement Atzberger
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Remote Sensing is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2700 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • crop phenotyping traits
  • unmanned ground vehicle
  • unmanned aerial vehicle imagery
  • imagery algorithms
  • segmentation
  • classification
  • machine learning
  • multi-sensor data
  • data assimilation
  • two- or three-dimensional crop models

Published Papers (31 papers)


Editorial


9 pages, 1408 KiB  
Editorial
Editorial for the Special Issue “Estimation of Crop Phenotyping Traits using Unmanned Ground Vehicle and Unmanned Aerial Vehicle Imagery”
by Xiuliang Jin, Zhenhai Li and Clement Atzberger
Remote Sens. 2020, 12(6), 940; https://doi.org/10.3390/rs12060940 - 13 Mar 2020
Cited by 11 | Viewed by 3162
Abstract
High-throughput crop phenotyping is harnessing the potential of genomic resources for the genetic improvement of crop production under changing climate conditions. As global food security is not yet assured, crop phenotyping has received increased attention during the past decade. This special issue (SI) collects 30 papers reporting research on the estimation of crop phenotyping traits using unmanned ground vehicle (UGV) and unmanned aerial vehicle (UAV) imagery. Such platforms were previously not widely available. The special issue includes papers presenting recent advances in the field, with 22 UAV-based papers and 12 UGV-based articles. It covers 16 RGB sensor papers, 11 papers on multi-spectral imagery, and a further 4 papers on hyperspectral and 3D data acquisition systems. A total of 13 plant phenotyping traits, including morphological, structural, and biochemical traits, are covered. Twenty different data processing and machine learning methods are presented. In this way, the special issue provides a good overview of potential applications of the platforms and sensors for delivering crop phenotyping traits in a timely, cost-efficient, and objective manner. With the fast development of sensor technology and image processing algorithms, we expect that the estimation of crop phenotyping traits in support of crop breeding will gain even more attention in the future. Full article

Research


24 pages, 7524 KiB  
Article
High-Throughput Phenotyping Analysis of Potted Soybean Plants Using Colorized Depth Images Based on A Proximal Platform
by Xiaodan Ma, Kexin Zhu, Haiou Guan, Jiarui Feng, Song Yu and Gang Liu
Remote Sens. 2019, 11(9), 1085; https://doi.org/10.3390/rs11091085 - 07 May 2019
Cited by 27 | Viewed by 5609
Abstract
Canopy color and structure can strongly reflect plant functions. Color characteristics, plant height, and canopy breadth are important aspects of the canopy phenotype of soybean plants. High-throughput phenotyping systems with imaging capabilities providing color and depth information can rapidly acquire data on soybean plants, making it possible to quantify and monitor soybean canopy development. The goal of this study was to develop a 3D imaging approach to quantitatively analyze soybean canopy development under natural light conditions. Thus, a Kinect sensor-based high-throughput phenotyping (HTP) platform was developed for soybean plant phenotyping. To calculate color traits accurately, the distortion of the color images was first corrected by registration in accordance with the principle of three primary colors and color constancy. Then, the registered color images were applied to the depth images to reconstruct the colorized three-dimensional canopy structure. Furthermore, the 3D point cloud of the soybean canopies was extracted from the background according to an adjusted threshold, and each area of individual potted soybean plants in the depth images was segmented for the calculation of phenotypic traits. Finally, color indices, plant height, and canopy breadth were assessed based on the 3D point cloud of the soybean canopies. The results showed that the maximum registration error for the R, G, and B bands in the dataset was 1.26%, 1.09%, and 0.75%, respectively. Correlation analysis between the sensors and manual measurements yielded R2 values of 0.99, 0.89, and 0.89 for plant height, canopy breadth in the west-east (W-E) direction, and canopy breadth in the north-south (N-S) direction, and R2 values of 0.82, 0.79, and 0.80 for the color indices h, s, and i, respectively. Given these results, the proposed approaches provide new opportunities for the identification of the quantitative traits that control canopy structure in genetic/genomic studies or for soybean yield prediction in breeding programs. Full article
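The height and canopy-breadth extraction described in the abstract above can be sketched as below. The 99th-percentile height rule, the `canopy_metrics` helper, and the synthetic point cloud are illustrative assumptions, not the paper's actual processing chain (which also handles color registration and background removal).

```python
import numpy as np

def canopy_metrics(points, ground_z=0.0):
    """Plant height and canopy breadth (W-E, N-S) from an Nx3 canopy point cloud.

    points: array of (x, y, z) coordinates in metres; ground_z: soil plane height.
    Height is taken as the 99th z-percentile to suppress stray high points.
    """
    pts = np.asarray(points, dtype=float)
    height = np.percentile(pts[:, 2], 99) - ground_z
    breadth_we = pts[:, 0].max() - pts[:, 0].min()  # west-east extent
    breadth_ns = pts[:, 1].max() - pts[:, 1].min()  # north-south extent
    return height, breadth_we, breadth_ns

# Synthetic potted-plant cloud: 0.6 m wide, 0.5 m deep, ~0.9 m tall.
rng = np.random.default_rng(0)
cloud = rng.uniform([0.0, 0.0, 0.0], [0.6, 0.5, 0.9], size=(5000, 3))
h, we, ns = canopy_metrics(cloud)
```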

19 pages, 5776 KiB  
Article
Comparing Nadir and Multi-Angle View Sensor Technologies for Measuring in-Field Plant Height of Upland Cotton
by Alison L. Thompson, Kelly R. Thorp, Matthew M. Conley, Diaa M. Elshikha, Andrew N. French, Pedro Andrade-Sanchez and Duke Pauli
Remote Sens. 2019, 11(6), 700; https://doi.org/10.3390/rs11060700 - 23 Mar 2019
Cited by 27 | Viewed by 5460
Abstract
Plant height is a morphological characteristic of plant growth that is a useful indicator of plant stress resulting from water and nutrient deficit. While height is a relatively simple trait, it can be difficult to measure accurately, especially in crops with complex canopy architectures like cotton. This paper describes the deployment of four nadir view ultrasonic transducers (UTs), two light detection and ranging (LiDAR) systems, and an unmanned aerial system (UAS) with a digital color camera to characterize plant height in an upland cotton breeding trial. The comparison of the UTs with manual measurements demonstrated that the Honeywell and Pepperl+Fuchs sensors provided more precise estimates of plant height than the MaxSonar and db3 Pulsar sensors. Performance of the multi-angle view LiDAR and UAS technologies demonstrated that the UAS-derived 3-D point clouds had stronger correlations (0.980) with the UTs than the proximal LiDAR sensors. As manual measurements require increased time and labor in large breeding trials and are prone to human error that reduces repeatability, UT and UAS technologies are an efficient and effective means of characterizing cotton plant height. Full article

18 pages, 8292 KiB  
Article
Wind Field Distribution of Multi-rotor UAV and Its Influence on Spectral Information Acquisition of Rice Canopies
by Lei Feng, Weikang Wu, Junmin Wang, Chu Zhang, Yiying Zhao, Susu Zhu and Yong He
Remote Sens. 2019, 11(6), 602; https://doi.org/10.3390/rs11060602 - 13 Mar 2019
Cited by 10 | Viewed by 3247
Abstract
Unmanned aerial vehicles (UAVs) are widely used as remote sensing platforms to effectively monitor agricultural conditions. The wind field generated by the rotors in low-altitude operations causes deformation of rice crops and may affect the acquisition of true spectral information. In this study, a low-altitude UAV remote sensing simulation platform and a triple-direction wind field wireless sensor network system were built to explore the wind field distribution pattern. Combined with multi-spectral images of the rice canopy, the influence of the wind field on spectral information acquisition was analyzed through variance and regression analysis. The results showed that the Z-direction wind field of the UAV rotors dominated among the three directions (X, Y, and Z). The coefficients of determination (R2) of three linear regression models for the Normalized Difference Vegetation Index (NDVI), Ratio Vegetation Index (RVI), and Canopy Coverage Rate (CCR) were 0.782, 0.749, and 0.527, respectively. Therefore, the multi-rotor UAV wind field had an impact on the spectral information acquisition of the rice canopy, and this influence could eventually affect the assessment of rice growth status. The models established in this study could provide a reference for revised spectral index models and offer guidance for practical low-altitude multi-rotor UAV operations. Full article

18 pages, 6402 KiB  
Article
Quantifying Lodging Percentage and Lodging Severity Using a UAV-Based Canopy Height Model Combined with an Objective Threshold Approach
by Norman Wilke, Bastian Siegmann, Lasse Klingbeil, Andreas Burkart, Thorsten Kraska, Onno Muller, Anna van Doorn, Sascha Heinemann and Uwe Rascher
Remote Sens. 2019, 11(5), 515; https://doi.org/10.3390/rs11050515 - 03 Mar 2019
Cited by 62 | Viewed by 6705
Abstract
Unmanned aerial vehicles (UAVs) open new opportunities in precision agriculture and phenotyping because of their flexibility and low cost. In this study, the potential of UAV imagery was evaluated to quantify lodging percentage and lodging severity of barley using structure from motion (SfM) techniques. Traditionally, lodging quantification is based on time-consuming manual field observations. Our UAV-based approach makes use of a quantitative threshold to determine lodging percentage in a first step. The derived lodging estimates showed a very high correlation to reference data (R2 = 0.96, root mean square error (RMSE) = 7.66%) when applied to breeding trials, which could also be confirmed under realistic farming conditions. As a second step, an approach was developed that allows the assessment of lodging severity, information that is important to estimate yield impairment, which also takes the intensity of lodging events into account. Both parameters were tested on three ground sample distances. The lowest spatial resolution acquired from the highest flight altitude (100 m) still led to high accuracy, which increases the practicability of the method for large areas. Our new lodging assessment procedure can be used for insurance applications, precision farming, and selecting for genetic lines with greater lodging resistance in breeding research. Full article
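The objective-threshold idea behind the lodging quantification above can be sketched as follows, assuming a canopy height model (CHM) is already available from SfM processing. The 50% height threshold and the toy CHM are hypothetical, not the thresholding rule calibrated in the paper.

```python
import numpy as np

def lodging_percentage(chm, expected_height, frac=0.5):
    """Percent of plot area classified as lodged from a canopy height model (CHM).

    A pixel counts as lodged when its height drops below `frac` times the
    expected (non-lodged) canopy height -- a simple objective threshold,
    not the exact rule derived in the paper.
    """
    chm = np.asarray(chm, dtype=float)
    lodged = chm < frac * expected_height
    return 100.0 * lodged.mean()

# Synthetic 4x4 barley CHM (m): one lodged corner at ~0.3 m in a 1.0 m stand.
chm = np.full((4, 4), 1.0)
chm[2:, 2:] = 0.3
pct = lodging_percentage(chm, expected_height=1.0, frac=0.5)
```

Here a quarter of the plot falls below half the expected stand height, so the function reports 25% lodging.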

19 pages, 2315 KiB  
Article
Spectral Reflectance Modeling by Wavelength Selection: Studying the Scope for Blueberry Physiological Breeding under Contrasting Water Supply and Heat Conditions
by Gustavo A. Lobos, Alejandro Escobar-Opazo, Félix Estrada, Sebastián Romero-Bravo, Miguel Garriga, Alejandro del Pozo, Carlos Poblete-Echeverría, Jaime Gonzalez-Talice, Luis González-Martinez and Peter Caligari
Remote Sens. 2019, 11(3), 329; https://doi.org/10.3390/rs11030329 - 07 Feb 2019
Cited by 20 | Viewed by 4505
Abstract
To overcome the environmental changes occurring now and predicted for the future, it is essential that fruit breeders develop cultivars with better physiological performance. During the last few decades, high-throughput plant phenotyping and phenomics have been developed primarily in cereal breeding programs. In this study, plant reflectance at the leaf level was used to assess several physiological traits in five Vaccinium spp. cultivars growing under four controlled conditions (no stress, water deficit, heat stress, and combined stress). Two modeling methodologies [Multiple Linear Regression (MLR) and Partial Least Squares (PLS)] with or without (W/O) prior wavelength selection (multicollinearity, genetic algorithms, or in combination) were considered. PLS generated better estimates than MLR, although prior wavelength selection improved MLR predictions. When data from the environments were combined, PLS W/O gave the best assessment for most of the traits, while in individual environments, the results varied according to the trait and methodology considered. The highest validation predictions were obtained for chlorophyll a/b (R2Val ≤ 0.87), maximum electron transport rate (R2Val ≤ 0.60), and the irradiance at which the electron transport rate is saturated (R2Val ≤ 0.59). The results of this study, the first to model modulated chlorophyll fluorescence by reflectance, confirm the potential for implementing this tool in blueberry breeding programs, at least for the estimation of a number of important physiological traits. Additionally, the differential effects of the environment on the spectral signature of each cultivar show that this tool could be directly used to assess tolerance to specific environments. Full article

17 pages, 4383 KiB  
Article
Maize Plant Phenotyping: Comparing 3D Laser Scanning, Multi-View Stereo Reconstruction, and 3D Digitizing Estimates
by Yongjian Wang, Weiliang Wen, Sheng Wu, Chuanyu Wang, Zetao Yu, Xinyu Guo and Chunjiang Zhao
Remote Sens. 2019, 11(1), 63; https://doi.org/10.3390/rs11010063 - 31 Dec 2018
Cited by 67 | Viewed by 9303
Abstract
High-throughput phenotyping technologies have become an increasingly important topic of crop science in recent years. Various sensors and data acquisition approaches have been applied to acquire phenotyping traits, and it can be difficult for crop phenotyping researchers to determine an appropriate approach for their application. In this study, three representative three-dimensional (3D) data acquisition approaches, including 3D laser scanning, multi-view stereo (MVS) reconstruction, and 3D digitizing, were evaluated for maize plant phenotyping at multiple growth stages. Phenotyping trait accuracy, post-processing difficulty, device cost, data acquisition efficiency, and automation were considered during the evaluation process. 3D scanning provided satisfactory point clouds for medium and high maize plants with acceptable efficiency, while the results were not satisfactory for small maize plants. The equipment used in 3D scanning is expensive but highly automatic. MVS reconstruction provided satisfactory point clouds for small and medium plants, while point deviations were observed in the upper parts of taller plants. MVS data acquisition, using low-cost cameras, exhibited the highest efficiency among the three evaluated approaches, and its one-by-one pipeline data acquisition pattern makes MVS suitable for high-throughput use in future phenotyping platforms. Undoubtedly, enhancement of point cloud processing technologies is required to improve the accuracy of the extracted phenotyping traits for both 3D scanning and MVS reconstruction. Finally, 3D digitizing was time-consuming and labor intensive; however, it does not depend on any post-processing algorithms to extract phenotyping parameters, and reliable phenotyping traits could be derived. The promising accuracy of 3D digitizing makes it a good verification reference for other 3D phenotyping approaches. Our study provides a clear reference for phenotyping data acquisition of maize plants, especially for the affordable and portable field phenotyping platforms to be developed. Full article

25 pages, 10811 KiB  
Article
Intra-Season Crop Height Variability at Commercial Farm Scales Using a Fixed-Wing UAV
by Matteo G. Ziliani, Stephen D. Parkes, Ibrahim Hoteit and Matthew F. McCabe
Remote Sens. 2018, 10(12), 2007; https://doi.org/10.3390/rs10122007 - 11 Dec 2018
Cited by 55 | Viewed by 7624
Abstract
Monitoring the development of vegetation height through time provides a key indicator of crop health and overall condition. Traditional manual approaches for monitoring crop height are generally time consuming, labor intensive, and impractical for large-scale operations. Dynamic crop heights collected through the season allow for the identification of within-field problems at critical stages of the growth cycle, providing a mechanism for remedial action to be taken against end-of-season yield losses. With advances in unmanned aerial vehicle (UAV) technologies, routine monitoring of height is now feasible at any time throughout the growth cycle. To demonstrate this capability, five digital surface maps (DSM) were reconstructed from high-resolution RGB imagery collected over a field of maize during the course of a single growing season. The UAV retrievals were compared against LiDAR scans for the purpose of evaluating the derived point clouds' capacity to capture ground surface variability and spatially variable crop height. A strong correlation was observed between structure-from-motion (SfM) derived heights and pixel-to-pixel comparison against LiDAR scan data for the intra-season bare-ground surface (R2 = 0.77-0.99, rRMSE = 0.44%-0.85%), while there was reasonable agreement between canopy comparisons (R2 = 0.57-0.65, rRMSE = 37%-50%). To examine the effect of resolution on retrieval accuracy and processing time, an evaluation of several ground sampling distances (GSD) was also performed. Our results indicate that a 10 cm resolution retrieval delivers a reliable product that provides a compromise between computational cost and spatial fidelity. Overall, UAV retrievals were able to accurately reproduce the observed spatial variability of crop heights within the maize field through the growing season and provide a valuable source of information with which to inform precision agricultural management in an operational context. Full article
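The SfM height-retrieval step underlying the study above — differencing a canopy digital surface model against a bare-ground model — can be sketched as follows; the toy elevation grids are assumptions for illustration.

```python
import numpy as np

def crop_height_model(dsm, dtm):
    """Per-pixel crop height as the difference between a digital surface model
    (DSM, canopy top) and a digital terrain model (DTM, bare ground), clipped
    at zero -- the usual SfM height-retrieval step, simplified."""
    chm = np.asarray(dsm, dtype=float) - np.asarray(dtm, dtype=float)
    return np.clip(chm, 0.0, None)

# Toy 2x3 elevation grids (m above datum): sloping ground under a maize canopy,
# with one bare-soil pixel where the DSM sits slightly below the DTM.
dtm = np.array([[10.0, 10.1, 10.2], [10.0, 10.1, 10.2]])
dsm = np.array([[12.0, 12.3, 10.1], [11.5, 12.1, 12.4]])
heights = crop_height_model(dsm, dtm)
```

Clipping at zero absorbs small DSM/DTM reconstruction noise over bare soil, where a raw difference can come out slightly negative.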

20 pages, 7947 KiB  
Article
Automated Open Cotton Boll Detection for Yield Estimation Using Unmanned Aircraft Vehicle (UAV) Data
by Junho Yeom, Jinha Jung, Anjin Chang, Murilo Maeda and Juan Landivar
Remote Sens. 2018, 10(12), 1895; https://doi.org/10.3390/rs10121895 - 27 Nov 2018
Cited by 50 | Viewed by 6066
Abstract
Unmanned aerial vehicle (UAV) images have great potential for various agricultural applications. In particular, UAV systems facilitate timely and precise data collection in agriculture fields at high spatial and temporal resolutions. In this study, we propose an automatic open cotton boll detection algorithm using ultra-fine spatial resolution UAV images. Seed points for a region growing algorithm were generated hierarchically with a random base for computation efficiency. Cotton boll candidates were determined based on the spatial features of each region growing segment. Spectral threshold values that automatically separate cotton bolls from other non-target objects were derived based on input images for adaptive application. Finally, a binary cotton boll classification was performed using the derived threshold values and other morphological filters to reduce noise from the results. The open cotton boll classification results were validated using reference data and the results showed an accuracy higher than 88% in various evaluation measures. Moreover, the UAV-extracted cotton boll area and actual crop yield had a strong positive correlation (0.8). The proposed method leverages UAV characteristics such as high spatial resolution and accessibility by applying automatic and unsupervised procedures using images from a single date. Additionally, this study verified the extraction of target regions of interest from UAV images for direct yield estimation. Cotton yield estimation models had R2 values between 0.63 and 0.65 and RMSE values between 0.47 kg and 0.66 kg per plot grid. Full article
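A seeded region-growing segmentation of the kind used above for open cotton boll detection can be sketched as follows. The fixed intensity threshold and the synthetic image patch are assumptions — the paper derives its thresholds automatically from the input imagery and adds morphological filtering.

```python
import numpy as np
from collections import deque

def region_grow(img, seed, thresh):
    """Grow a 4-connected region from `seed`, accepting pixels whose value is
    within `thresh` of the seed value. Returns a boolean mask."""
    img = np.asarray(img, dtype=float)
    seed_val = img[seed]
    mask = np.zeros(img.shape, dtype=bool)
    q = deque([seed])
    mask[seed] = True
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < img.shape[0] and 0 <= nc < img.shape[1]
                    and not mask[nr, nc] and abs(img[nr, nc] - seed_val) <= thresh):
                mask[nr, nc] = True
                q.append((nr, nc))
    return mask

# Toy grayscale patch: a bright cotton boll (0.9) on darker canopy (0.2).
img = np.full((5, 5), 0.2)
img[1:3, 1:4] = 0.9
mask = region_grow(img, seed=(1, 2), thresh=0.1)
```

The segment's spatial features (area, shape) can then be checked to decide whether the grown region is a plausible boll candidate, as the abstract describes.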

12 pages, 1570 KiB  
Article
How Far Can Consumer-Grade UAV RGB Imagery Describe Crop Production? A 3D and Multitemporal Modeling Approach Applied to Zea mays
by Adrien Michez, Sébastien Bauwens, Yves Brostaux, Marie-Pierre Hiel, Sarah Garré, Philippe Lejeune and Benjamin Dumont
Remote Sens. 2018, 10(11), 1798; https://doi.org/10.3390/rs10111798 - 13 Nov 2018
Cited by 23 | Viewed by 4419
Abstract
In recent decades, remote sensing has increasingly been used to estimate the spatio-temporal evolution of crop biophysical parameters such as the above-ground biomass (AGB). On a local scale, the advent of unmanned aerial vehicles (UAVs) seems to be a promising trade-off between satellite/airborne and terrestrial remote sensing. This study aims to evaluate the potential of a low-cost UAV RGB solution to predict the final AGB of Zea mays. Besides evaluating the value of 3D data and multitemporality, our study aims to answer operational questions such as when to plan a combination of two UAV flights for AGB modeling. In this case study, final AGB prediction model performance reached an R-square of 0.55 using only UAV information and 0.8 when combining UAV information from a single flight with a single-field AGB measurement. Adding UAV height information to the model improved the quality of the AGB prediction, and performing two flights almost systematically improved AGB prediction ability compared with most single flights. Our study provides clear insight into how the low spectral resolution of consumer-grade RGB cameras can be countered using height information and multitemporality. Our results highlight the importance of the height information that can be derived from UAV data on the one hand and, on the other hand, the lower relative importance of the RGB spectral information. Full article

19 pages, 2062 KiB  
Article
High-Throughput Phenotyping of Crop Water Use Efficiency via Multispectral Drone Imagery and a Daily Soil Water Balance Model
by Kelly R. Thorp, Alison L. Thompson, Sara J. Harders, Andrew N. French and Richard W. Ward
Remote Sens. 2018, 10(11), 1682; https://doi.org/10.3390/rs10111682 - 25 Oct 2018
Cited by 47 | Viewed by 6962
Abstract
Improvement of crop water use efficiency (CWUE), defined as crop yield per volume of water used, is an important goal for both crop management and breeding. While many technologies have been developed for measuring crop water use in crop management studies, rarely have these techniques been applied at the scale of breeding plots. The objective was to develop a high-throughput methodology for quantifying water use in a cotton breeding trial at Maricopa, AZ, USA in 2016 and 2017, using evapotranspiration (ET) measurements from a co-located irrigation management trial to evaluate the approach. Approximately weekly overflights with an unmanned aerial system provided multispectral imagery from which plot-level fractional vegetation cover (fc) was computed. The fc data were used to drive a daily ET-based soil water balance model for seasonal crop water use quantification. A mixed model statistical analysis demonstrated that differences in ET and CWUE could be discriminated among eight cotton varieties (p < 0.05), which were sown at two planting dates and managed with four irrigation levels. The results permitted breeders to identify cotton varieties with more favorable water use characteristics and higher CWUE, indicating that the methodology could become a useful tool for breeding selection. Full article
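The fc-driven daily soil water balance described above can be sketched in a strongly simplified form. The crop-coefficient rule Kc ≈ 0.2 + fc, the total-available-water value, and the input series below are assumptions for illustration, not the calibrated FAO-56-style model used in the paper.

```python
def soil_water_balance(fc_series, et0_series, irrigation, taw=150.0):
    """Cumulative crop ET (mm) from a minimal daily root-zone water balance.

    fc_series: daily fractional vegetation cover (0-1); et0_series: daily
    reference ET (mm); irrigation: daily water applied (mm); taw: total
    available water in the root zone (mm). Crop coefficient is approximated
    as Kc ~ 0.2 + fc (a simplifying assumption), and daily ET is capped by
    the water actually available in the root zone.
    """
    depletion, total_et = 0.0, 0.0
    for fc, et0, irr in zip(fc_series, et0_series, irrigation):
        kc = 0.2 + fc
        etc = min(kc * et0, taw - depletion + irr)  # cannot exceed available water
        depletion = max(0.0, depletion + etc - irr)
        total_et += etc
    return total_et

# Ten days: cover ramps 0.10 -> 0.55, ET0 = 6 mm/day, irrigation on days 1 and 6.
fc = [0.1 + 0.05 * d for d in range(10)]
et0 = [6.0] * 10
irr = [20.0 if d in (0, 5) else 0.0 for d in range(10)]
et_total = soil_water_balance(fc, et0, irr)
```

Dividing end-of-season yield by a seasonal total like this is what produces the per-plot CWUE estimates the study compares across varieties.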

19 pages, 2972 KiB  
Article
Quantitative Identification of Maize Lodging-Causing Feature Factors Using Unmanned Aerial Vehicle Images and a Nomogram Computation
by Liang Han, Guijun Yang, Haikuan Feng, Chengquan Zhou, Hao Yang, Bo Xu, Zhenhai Li and Xiaodong Yang
Remote Sens. 2018, 10(10), 1528; https://doi.org/10.3390/rs10101528 - 23 Sep 2018
Cited by 46 | Viewed by 4946
Abstract
Maize (Zea mays L.) is one of the most important grain crops in China. Lodging is a natural disaster that can cause significant yield losses and threaten food security. Lodging identification and analysis contribute to evaluating disaster losses and cultivating lodging-resistant maize varieties. In this study, we collected visible and multispectral images with an unmanned aerial vehicle (UAV), and introduce a comprehensive methodology and workflow to extract lodging features from UAV imagery. We use statistical methods to screen several potential feature factors (e.g., texture, canopy structure, spectral characteristics, and terrain), and construct two nomograms (i.e., Model-1 and Model-2) with better validation performance based on selected feature factors. Model-2 was superior to Model-1 in terms of its discrimination ability, but showed over-fitting when the predicted probability of lodging ranged from 0.2 to 0.4. The results show that the nomogram could not only predict the occurrence probability of lodging, but also explore the underlying association between maize lodging and the selected feature factors. Compared with spectral features, terrain features, texture features, canopy cover, and genetic background, canopy structural features were more conclusive in discriminating whether maize lodging occurs at the plot scale. Using nomogram analysis, we identified protective factors (i.e., normalized difference vegetation index, NDVI and canopy elevation relief ratio, CRR) and risk factors (i.e., Hcv) related to maize lodging, and also found a problem of terrain spatial variability that is easily overlooked in lodging-resistant breeding trials. Full article
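A nomogram is a graphical rendering of a generalized linear model: each feature contributes points that sum to a score, which maps to a probability. The computation underneath can be sketched as below; the weights, bias, and feature set are hypothetical, not the fitted Model-1/Model-2 coefficients:

```python
import math

def lodging_probability(features, weights, bias=0.0):
    """Logistic model behind a nomogram: weighted feature values sum to a
    log-odds score, which is mapped to a lodging probability in (0, 1)."""
    score = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-score))
```

For example, with features = [NDVI, CRR, Hcv], a fitted model would assign negative weights to protective factors and positive weights to risk factors.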

17 pages, 6217 KiB  
Article
Evaluating Late Blight Severity in Potato Crops Using Unmanned Aerial Vehicles and Machine Learning Algorithms
by Julio M. Duarte-Carvajalino, Diego F. Alzate, Andrés A. Ramirez, Juan D. Santa-Sepulveda, Alexandra E. Fajardo-Rojas and Mauricio Soto-Suárez
Remote Sens. 2018, 10(10), 1513; https://doi.org/10.3390/rs10101513 - 21 Sep 2018
Cited by 84 | Viewed by 10340
Abstract
This work presents quantitative prediction of the severity of the disease caused by Phytophthora infestans in potato crops using machine learning algorithms such as multilayer perceptron, deep learning convolutional neural networks, support vector regression, and random forests. The machine learning algorithms are trained using datasets extracted from multispectral data captured at the canopy level with an unmanned aerial vehicle carrying an inexpensive digital camera. The results indicate that deep learning convolutional neural networks, random forests and multilayer perceptron using band differences can predict the severity of Phytophthora infestans infection in potato crops with acceptable accuracy. Full article
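The band-difference predictors fed to models like the multilayer perceptron can be generated generically from per-plot mean reflectances; the band names in the usage example are placeholders, not the camera's actual channels:

```python
from itertools import combinations

def band_difference_features(bands):
    """All pairwise band differences from a dict of mean plot reflectances.
    Returns a dict keyed by 'a-b' for each band pair."""
    return {f"{a}-{b}": bands[a] - bands[b] for a, b in combinations(bands, 2)}
```

For n bands this yields n(n-1)/2 difference features, e.g. three for a hypothetical {"g", "r", "nir"} set.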

18 pages, 7214 KiB  
Article
Combining UAV-Based Vegetation Indices and Image Classification to Estimate Flower Number in Oilseed Rape
by Liang Wan, Yijian Li, Haiyan Cen, Jiangpeng Zhu, Wenxin Yin, Weikang Wu, Hongyan Zhu, Dawei Sun, Weijun Zhou and Yong He
Remote Sens. 2018, 10(9), 1484; https://doi.org/10.3390/rs10091484 - 17 Sep 2018
Cited by 99 | Viewed by 11486
Abstract
Remote estimation of flower number in oilseed rape under different nitrogen (N) treatments is imperative in precision agriculture and field remote sensing, which can help to predict the yield of oilseed rape. In this study, an unmanned aerial vehicle (UAV) equipped with Red Green Blue (RGB) and multispectral cameras was used to acquire a series of field images at the flowering stage, and the flower number was manually counted as a reference. Images of the rape field were first classified using the K-means method based on Commission Internationale de l’Éclairage (CIE) L*a*b* space, and the result showed that the classified flower coverage area (FCA) possessed a high correlation with the flower number (r2 = 0.89). The relationships between ten commonly used vegetation indices (VIs) extracted from UAV-based RGB and multispectral images and the flower number were investigated, and the VIs of Normalized Green Red Difference Index (NGRDI), Red Green Ratio Index (RGRI) and Modified Green Red Vegetation Index (MGRVI) exhibited the highest correlation to the flower number with an absolute correlation coefficient (r) of 0.91. A random forest (RF) model was developed to predict the flower number, and a good performance was achieved with all UAV variables (r2 = 0.93 and RMSEP = 16.18), while the optimal subset regression (OSR) model was further proposed to simplify the RF model, and a better result with r2 = 0.95 and RMSEP = 14.13 was obtained with the variable combination of RGRI, normalized difference spectral index (NDSI (944, 758)) and FCA. Our findings suggest that combining VIs and image classification from UAV-based RGB and multispectral images has the potential to estimate flower number in oilseed rape. Full article
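The three best-performing RGB indices have simple closed forms, computable per plot from mean green and red digital numbers; this is a sketch of the standard formulas, not the paper's full preprocessing chain:

```python
def ngrdi(g, r):
    """Normalized Green Red Difference Index: (G - R) / (G + R)."""
    return (g - r) / (g + r)

def rgri(r, g):
    """Red Green Ratio Index: R / G."""
    return r / g

def mgrvi(g, r):
    """Modified Green Red Vegetation Index: (G^2 - R^2) / (G^2 + R^2)."""
    return (g ** 2 - r ** 2) / (g ** 2 + r ** 2)
```

Flowering canopies shift reflectance toward red/yellow, which is why these green-red contrasts track flower coverage.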

20 pages, 5036 KiB  
Article
Three-Dimensional Reconstruction of Soybean Canopies Using Multisource Imaging for Phenotyping Analysis
by Haiou Guan, Meng Liu, Xiaodan Ma and Song Yu
Remote Sens. 2018, 10(8), 1206; https://doi.org/10.3390/rs10081206 - 01 Aug 2018
Cited by 22 | Viewed by 5671
Abstract
Geometric three-dimensional (3D) reconstruction has emerged as a powerful tool for plant phenotyping and plant breeding. Although laser scanning is one of the most intensely used sensing techniques for 3D reconstruction projects, it still has many limitations, such as the high investment cost. To overcome such limitations, in the present study, a low-cost, novel, and efficient imaging system consisting of a red-green-blue (RGB) camera and a photonic mixer detector (PMD) was developed, and its usability for plant phenotyping was demonstrated via a 3D reconstruction of a soybean plant that contains color information. To reconstruct soybean canopies, a density-based spatial clustering of applications with noise (DBSCAN) algorithm was used to extract canopy information from the raw 3D point cloud. Principal component analysis (PCA) and iterative closest point (ICP) algorithms were then used to register the multisource images for the 3D reconstruction of a soybean plant from both the side and top views. We then assessed phenotypic traits such as plant height and the greenness index based on the deviations of test samples. The results showed that compared with manual measurements, the side view-based assessments yielded a determination coefficient (R2) of 0.9890 for the estimation of soybean height and an R2 of 0.6059 for the estimation of soybean canopy greenness index; the top view-based assessment yielded an R2 of 0.9936 for the estimation of soybean height and an R2 of 0.8864 for the estimation of soybean canopy greenness. Together, the results indicated that an assembled 3D imaging device applying the algorithms developed in this study could be used as a reliable and robust platform for plant phenotyping, and potentially for automated and high-throughput applications under both natural light and indoor conditions. Full article
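Once the point cloud is cleaned and registered, a trait such as plant height reduces to a simple statistic over the z-coordinates. A minimal sketch follows; the percentile choice and flat ground reference are assumptions, not the authors' exact procedure:

```python
def plant_height(points, ground_z=0.0, percentile=0.99):
    """Estimate plant height as a high z-percentile of the canopy point cloud
    above a ground reference; more robust to stray noise points than the max."""
    zs = sorted(z for _, _, z in points)
    idx = min(len(zs) - 1, int(percentile * len(zs)))
    return zs[idx] - ground_z
```

Using a percentile rather than the maximum is a common guard against residual outliers that DBSCAN-style denoising may miss.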

24 pages, 8308 KiB  
Article
A Comparison of Crop Parameters Estimation Using Images from UAV-Mounted Snapshot Hyperspectral Sensor and High-Definition Digital Camera
by Jibo Yue, Haikuan Feng, Xiuliang Jin, Huanhuan Yuan, Zhenhai Li, Chengquan Zhou, Guijun Yang and Qingjiu Tian
Remote Sens. 2018, 10(7), 1138; https://doi.org/10.3390/rs10071138 - 18 Jul 2018
Cited by 127 | Viewed by 8821
Abstract
Timely and accurate estimates of crop parameters are crucial for agriculture management. Unmanned aerial vehicles (UAVs) carrying sophisticated cameras are very pertinent for this work because they can obtain remote-sensing images with higher temporal, spatial, and ground resolution than satellites. In this study, we evaluated (i) the performance of crop parameter estimates using near-surface spectroscopy (350~2500 nm, 3 nm at 700 nm, 8.5 nm at 1400 nm, 6.5 nm at 2100 nm), a UAV-mounted snapshot hyperspectral sensor (450~950 nm, 8 nm at 532 nm) and a high-definition digital camera (Visible, R, G, B); (ii) the crop surface models (CSMs), RGB-based vegetation indices (VIs), hyperspectral-based VIs, and methods combined therefrom to make multi-temporal estimates of crop parameters and to map the parameters. The estimated leaf area index (LAI) and above-ground biomass (AGB) are obtained by using linear and exponential equations, random forest (RF) regression, and partial least squares regression (PLSR) to combine the UAV-based spectral VIs and crop heights (from the CSMs). The results show that: (i) spectral VIs correlate strongly with LAI and AGB over single growing stages, whereas crop height correlates positively with AGB over multiple growth stages; (ii) the correlation between the VIs multiplied by crop height and AGB is greater than that between a single VI and crop height; (iii) the AGB estimate from the UAV-mounted snapshot hyperspectral sensor and high-definition digital camera is similar to the results from the ground spectrometer when using the combined methods (i.e., using VIs multiplied by crop height, and RF and PLSR to combine VIs and crop heights); and (iv) the spectral performance of the sensors is crucial in LAI estimates (the wheat LAI cannot be accurately estimated over multiple growing stages when using only crop height). 
The LAI estimates ranked from best to worst are ground spectrometer, UAV snapshot hyperspectral sensor, and UAV high-definition digital camera. Full article
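The best-performing combination multiplies a VI by crop height and regresses AGB on the product. A minimal least-squares sketch of that idea (toy data; the paper's fitted coefficients are not reproduced):

```python
def fit_line(x, y):
    """Ordinary least squares for y = a + b * x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def agb_predictor(vi, height):
    """Combined predictor: vegetation index multiplied by crop height."""
    return vi * height
```

Regressing AGB on `agb_predictor(vi, h)` rather than on VI alone folds the multi-stage height signal into the single predictor, which is why the combination outperforms either input over multiple growth stages.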

19 pages, 4301 KiB  
Article
Quantitative Estimation of Wheat Phenotyping Traits Using Ground and Aerial Imagery
by Zohaib Khan, Joshua Chopin, Jinhai Cai, Vahid-Rahimi Eichi, Stephan Haefele and Stanley J. Miklavcic
Remote Sens. 2018, 10(6), 950; https://doi.org/10.3390/rs10060950 - 14 Jun 2018
Cited by 33 | Viewed by 6276
Abstract
This study evaluates an aerial and ground imaging platform for assessment of canopy development in a wheat field. The dependence of two canopy traits, height and vigour, on fertilizer treatment was observed in a field trial comprised of ten varieties of spring wheat. A custom-built mobile ground platform (MGP) and an unmanned aerial vehicle (UAV) were deployed at the experimental site for standard red, green and blue (RGB) image collection on five occasions. Meanwhile, reference field measurements of canopy height and vigour were manually recorded during the growing season. Canopy level estimates of height and vigour for each variety and treatment were computed by image analysis. The agreement between estimates from each platform and reference measurements was statistically analysed. Estimates of canopy height derived from MGP imagery were more accurate (RMSE = 3.95 cm, R2 = 0.94) than estimates derived from UAV imagery (RMSE = 6.64 cm, R2 = 0.85). In contrast, vigour was better estimated using the UAV imagery (RMSE = 0.057, R2 = 0.57), compared to MGP imagery (RMSE = 0.063, R2 = 0.42), albeit with a significant fixed and proportional bias. The ability of the platforms to capture differential development of traits as a function of fertilizer treatment was also investigated. Both imaging methodologies observed a higher median canopy height of treated plots compared with untreated plots throughout the season, and a greater median vigour of treated plots compared with untreated plots exhibited in the early growth stages. While the UAV imaging provides a high-throughput method for canopy-level trait determination, the MGP imaging captures subtle canopy structures, potentially useful for fine-grained analyses of plants. Full article
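The agreement statistics used to compare the two platforms against reference measurements are straightforward to compute; a sketch:

```python
def rmse(obs, pred):
    """Root mean square error between observed and predicted values."""
    return (sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs)) ** 0.5

def r_squared(obs, pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot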

21 pages, 15356 KiB  
Article
Using Multi-Spectral UAV Imagery to Extract Tree Crop Structural Properties and Assess Pruning Effects
by Kasper Johansen, Tri Raharjo and Matthew F. McCabe
Remote Sens. 2018, 10(6), 854; https://doi.org/10.3390/rs10060854 - 01 Jun 2018
Cited by 101 | Viewed by 9817
Abstract
Unmanned aerial vehicles (UAV) provide an unprecedented capacity to monitor the development and dynamics of tree growth and structure through time. It is generally thought that the pruning of tree crops encourages new growth, has a positive effect on fruiting, makes fruit-picking easier, and may increase yield, as it increases light interception and tree crown surface area. To establish the response to pruning in an orchard of lychee trees, an assessment of changes in tree structure, i.e., tree crown perimeter, width, height, area and Plant Projective Cover (PPC), was undertaken using multi-spectral UAV imagery collected before and after a pruning event. While tree crown perimeter, width and area could be derived directly from the delineated tree crowns, height was estimated from a produced canopy height model and PPC was most accurately predicted based on the NIR band. Pre- and post-pruning results showed significant differences in all measured tree structural parameters, including an average decrease in tree crown perimeter of 1.94 m, tree crown width of 0.57 m, tree crown height of 0.62 m, tree crown area of 3.5 m2, and PPC of 14.8%. In order to provide guidance on data collection protocols for orchard management, the impact of flying height variations was also examined, offering some insight into the influence of scale and the scalability of this UAV-based approach for larger orchards. The different flying heights (i.e., 30, 50 and 70 m) produced similar measurements of tree crown width and PPC, while tree crown perimeter, area and height measurements decreased with increasing flying height. Overall, these results illustrate that routine collection of multi-spectral UAV imagery can provide a means of assessing pruning effects on changes in tree structure in commercial orchards, and highlight the importance of collecting imagery with consistent flight configurations, as varying flying heights may cause changes to tree structural measurements. Full article
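Tree height from a canopy height model is the per-pixel difference between a digital surface model (DSM) and a digital terrain model (DTM). A sketch with elevation grids as nested lists; the crown-pixel lookup is a simplification of the crown delineation step:

```python
def canopy_height_model(dsm, dtm):
    """Per-pixel canopy height: surface elevation minus terrain elevation."""
    return [[s - t for s, t in zip(srow, trow)]
            for srow, trow in zip(dsm, dtm)]

def crown_height(chm, crown_pixels):
    """Tree height as the maximum CHM value inside a delineated crown,
    given as (row, col) pixel indices."""
    return max(chm[r][c] for r, c in crown_pixels)
```

Comparing `crown_height` before and after pruning gives the per-tree height change reported in the study.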

19 pages, 3382 KiB  
Article
Time-Series Multispectral Indices from Unmanned Aerial Vehicle Imagery Reveal Senescence Rate in Bread Wheat
by Muhammad Adeel Hassan, Mengjiao Yang, Awais Rasheed, Xiuliang Jin, Xianchun Xia, Yonggui Xiao and Zhonghu He
Remote Sens. 2018, 10(6), 809; https://doi.org/10.3390/rs10060809 - 23 May 2018
Cited by 102 | Viewed by 9171
Abstract
Detecting the dynamics of senescence in crop breeding is time-consuming and requires considerable detail regarding its rate of progression and intensity. Normalized difference red-edge index (NDREI), along with four other spectral vegetative indices (SVIs) derived from unmanned aerial vehicle (UAV) based spatial imagery, was evaluated for rapid and accurate prediction of senescence. For this, 32 selected winter wheat genotypes were planted under full and limited irrigation treatments. Significant variations in all five SVIs: green normalized difference vegetation index (GNDVI), simple ratio (SR), green chlorophyll index (GCI), red-edge chlorophyll index (RECI), and normalized difference red-edge index (NDREI), among genotypes and between treatments, were observed from heading to late grain filling stages. The SVIs showed a strong relationship (R2 = 0.69 to 0.78) with handheld measurements of chlorophyll and leaf area index (LAI), and correlated negatively (R2 = 0.75 to 0.77) with canopy temperature (CT) across the treatments. NDREI, as a new SVI, showed correlations with ground data under both treatments similar to those exhibited by the other four SVIs. There were medium to strong correlations (r = 0.23–0.63) among SVIs, thousand grain weight (TGW) and grain yield (GY) under both treatments. Senescence rate was calculated from the decrease in SVI values from their peaks at the heading stage, and variance in senescence rate among genotypes and between treatments could be explained by SVI variations. Under limited irrigation, a 10% to 15% higher senescence rate was detected compared with full irrigation. Principal component analysis corroborated the negative association of high senescence rate with TGW and GY. Some genotypes, such as Beijing 0045, Nongda 5181, and Zhongmai 175, were selected for low senescence rate and stable TGW and GY in both full and limited irrigation treatments, nearly in accordance with the actual performance of these cultivars in the field. 
Thus, SVIs derived from UAV appeared as a promising tool for rapid and precise estimation of senescence rate at maturation stages. Full article
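The senescence-rate bookkeeping — the decline of an SVI relative to its peak around heading — can be sketched in a few lines (a simplification; the paper tracks the decline per observation date, not only the final one):

```python
def senescence_rate(svi_series):
    """Relative decline of a spectral vegetative index from its peak value
    (assumed to occur around heading) to the last observation."""
    peak = max(svi_series)
    return (peak - svi_series[-1]) / peak
```

A genotype whose SVI falls from 0.8 at heading to 0.4 at late grain filling thus has a senescence rate of 0.5; comparing this value between irrigation treatments reproduces the 10–15% contrast reported above.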

18 pages, 5928 KiB  
Article
Estimation of Vegetable Crop Parameter by Multi-temporal UAV-Borne Images
by Thomas Moeckel, Supriya Dayananda, Rama Rao Nidamanuri, Sunil Nautiyal, Nagaraju Hanumaiah, Andreas Buerkert and Michael Wachendorf
Remote Sens. 2018, 10(5), 805; https://doi.org/10.3390/rs10050805 - 22 May 2018
Cited by 58 | Viewed by 8320
Abstract
3D point cloud analysis of imagery collected by unmanned aerial vehicles (UAV) has been shown to be a valuable tool for estimation of crop phenotypic traits, such as plant height, in several species. Spatial information about these phenotypic traits can be used to derive information about other important crop characteristics, like fresh biomass yield, which could not be derived directly from the point clouds. Previous approaches have often only considered single date measurements using a single point cloud derived metric for the respective trait. Furthermore, most of the studies focused on plant species with a homogenous canopy surface. The aim of this study was to assess the applicability of UAV imagery for capturing crop height information of three vegetable crops (eggplant, tomato, and cabbage) with a complex vegetation canopy surface during a complete crop growth cycle to infer biomass. Additionally, the effect of crop development stage on the relationship between estimated crop height and field measured crop height was examined. Our study was conducted in an experimental layout at the University of Agricultural Science in Bengaluru, India. For all the crops, crop height and biomass were measured at five dates during one crop growth cycle between February and May 2017 (average crop height was 42.5, 35.5, and 16.0 cm for eggplant, tomato, and cabbage). Using a structure from motion approach, a 3D point cloud was created for each crop and sampling date. In total, 14 crop height metrics were extracted from the point clouds. Machine learning methods were used to create prediction models for vegetable crop height. The study demonstrates that the monitoring of crop height using a UAV during an entire growing period results in detailed and precise estimates of crop height and biomass for all three crops (R2 ranging from 0.87 to 0.97, bias ranging from −0.66 to 0.45 cm). 
The effect of crop development stage on the predicted crop height was found to be substantial (e.g., median deviation increased from 1% to 20% for eggplant), influencing the strength and consistency of the relationship between point cloud metrics and crop height estimates and, thus, should be further investigated. Altogether, the results of the study demonstrate that point clouds generated from UAV-based RGB imagery can be used to effectively measure vegetable crop biomass in larger areas (relative error = 17.6%, 19.7%, and 15.2% for eggplant, tomato, and cabbage, respectively) with a similar accuracy as biomass prediction models based on measured crop height (relative error = 21.6%, 18.8%, and 15.2% for eggplant, tomato, and cabbage). Full article
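Most point-cloud height metrics of the kind described are order statistics over the z-coordinates. A sketch of a few such metrics; the selection and names here are illustrative, not the paper's 14-metric set:

```python
def height_metrics(zs):
    """A few canopy height metrics from point-cloud z-values
    (illustrative subset; metric names are not the paper's)."""
    s = sorted(zs)
    def pct(p):
        return s[min(len(s) - 1, int(p * len(s)))]
    return {
        "h_max": s[-1],
        "h_mean": sum(s) / len(s),
        "h_p50": pct(0.50),
        "h_p90": pct(0.90),
    }
```

Feeding several such metrics per plot and date into a regression model, rather than a single metric, is what lets the approach cope with the uneven canopies of eggplant, tomato, and cabbage.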

21 pages, 12732 KiB  
Article
Aerial and Ground Based Sensing of Tolerance to Beet Cyst Nematode in Sugar Beet
by Samuel Joalland, Claudio Screpanti, Hubert Vincent Varella, Marie Reuther, Mareike Schwind, Christian Lang, Achim Walter and Frank Liebisch
Remote Sens. 2018, 10(5), 787; https://doi.org/10.3390/rs10050787 - 19 May 2018
Cited by 41 | Viewed by 6730
Abstract
The rapid development of image-based phenotyping methods based on ground-operating devices or unmanned aerial vehicles (UAV) has increased our ability to evaluate traits of interest for crop breeding in the field. A field site infested with beet cyst nematode (BCN) and planted with four nematode susceptible cultivars and five tolerant cultivars was investigated at different times during the growing season. We compared the ability of spectral, hyperspectral, canopy height- and temperature information derived from handheld and UAV-borne sensors to discriminate susceptible and tolerant cultivars and to predict the final sugar beet yield. Spectral indices (SIs) related to chlorophyll, nitrogen or water allowed differentiating nematode susceptible and tolerant cultivars (cultivar type) from the same genetic background (breeder). Discrimination between the cultivar types was easier at advanced stages when the nematode pressure was stronger and the plants and canopies further developed. The canopy height (CH) allowed differentiating cultivar type as well but was much more efficient from the UAV compared to manual field assessment. Canopy temperatures also allowed ranking cultivars according to their nematode tolerance level. Combinations of SIs in multivariate analysis and decision trees improved differentiation of cultivar type and classification of genetic background. Thereby, SIs and canopy temperature proved to be suitable proxies for sugar yield prediction. The spectral information derived from handheld and the UAV-borne sensor did not match perfectly, but both analysis procedures allowed for discrimination between susceptible and tolerant cultivars. This was possible due to successful detection of traits related to BCN tolerance like chlorophyll, nitrogen and water content, which were reduced in cultivars with a low tolerance to BCN. 
The high correlation between SIs and final sugar beet yield makes the UAV hyperspectral imaging approach very suitable to improve farming practice via maps of yield potential or diseases. Moreover, the study shows the high potential of multi-sensor and parameter combinations for plant phenotyping purposes, in particular for data from UAV-borne sensors that allow for standardized and automated high-throughput data extraction procedures. Full article
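One of the simplest screens described — ranking cultivars by canopy temperature, where a cooler canopy suggests better water status under nematode pressure — can be sketched as follows (a toy illustration, not the study's multivariate analysis):

```python
def rank_by_canopy_temperature(plot_temps):
    """Rank cultivars from coolest to warmest mean canopy temperature;
    cooler canopies here stand in for higher BCN tolerance."""
    means = {cv: sum(t) / len(t) for cv, t in plot_temps.items()}
    return sorted(means, key=means.get)
```

In practice the study combined such temperature rankings with spectral indices and canopy height in decision trees to separate tolerant from susceptible cultivars.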

21 pages, 6496 KiB  
Article
Phenotyping Conservation Agriculture Management Effects on Ground and Aerial Remote Sensing Assessments of Maize Hybrids Performance in Zimbabwe
by Adrian Gracia-Romero, Omar Vergara-Díaz, Christian Thierfelder, Jill E. Cairns, Shawn C. Kefauver and José L. Araus
Remote Sens. 2018, 10(2), 349; https://doi.org/10.3390/rs10020349 - 24 Feb 2018
Cited by 41 | Viewed by 8461
Abstract
In the coming decades, Sub-Saharan Africa (SSA) faces challenges to sustainably increase food production while keeping pace with continued population growth. Conservation agriculture (CA) has been proposed to enhance soil health and productivity to respond to this situation. Maize is the main staple food in SSA. To increase maize yields, the selection of suitable genotypes and management practices for CA conditions has been explored using remote sensing tools. They may play a fundamental role towards overcoming the traditional limitations of data collection and processing in large scale phenotyping studies. We present the result of a study in which Red-Green-Blue (RGB) and multispectral indexes were evaluated for assessing maize performance under conventional ploughing (CP) and CA practices. Eight hybrids under different planting densities and tillage practices were tested. The measurements were conducted on seedlings at ground level (0.8 m) and from an unmanned aerial vehicle (UAV) platform (30 m), causing a platform proximity effect on image resolution that did not have any negative impact on the performance of the indexes. Most of the calculated indexes (Green Area (GA) and Normalized Difference Vegetation Index (NDVI)) were significantly affected by tillage conditions, increasing their values from CP to CA. Indexes derived from the RGB images related to canopy greenness performed better at assessing yield differences, potentially due to the greater resolution of the RGB compared with the multispectral data, although this performance was more precise for CP than CA. The correlations of the multispectral indexes with yield were improved by applying a soil mask derived from an NDVI threshold with the aim of retaining only vegetation pixels. The results of this study highlight the applicability of remote sensing approaches based on RGB images to the assessment of crop performance and hybrid choice. Full article
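The soil-masking step — keeping only pixels whose NDVI exceeds a threshold before averaging an index — can be sketched as below; the 0.3 threshold is a placeholder, not the study's value:

```python
def soil_mask(ndvi, threshold=0.3):
    """Boolean vegetation mask from per-pixel NDVI (threshold is illustrative)."""
    return [v > threshold for v in ndvi]

def masked_index_mean(index_values, mask):
    """Mean of a spectral index over vegetation pixels only."""
    kept = [v for v, keep in zip(index_values, mask) if keep]
    return sum(kept) / len(kept)
```

Excluding soil pixels removes background reflectance from the plot mean, which is why the masked multispectral indexes correlated better with yield.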

13 pages, 5074 KiB  
Article
High-Throughput Phenotyping of Canopy Cover and Senescence in Maize Field Trials Using Aerial Digital Canopy Imaging
by Richard Makanza, Mainassara Zaman-Allah, Jill E. Cairns, Cosmos Magorokosho, Amsal Tarekegne, Mike Olsen and Boddupalli M. Prasanna
Remote Sens. 2018, 10(2), 330; https://doi.org/10.3390/rs10020330 - 23 Feb 2018
Cited by 88 | Viewed by 10515
Abstract
In the crop breeding process, the use of data collection methods that allow reliable assessment of crop adaptation traits, faster and cheaper than those currently in use, can significantly improve resource use efficiency by reducing selection cost and can contribute to increased genetic gain through improved selection efficiency. Current methods to estimate crop growth (ground canopy cover) and leaf senescence are essentially manual and/or by visual scoring, and are therefore often subjective, time consuming, and expensive. Aerial sensing technologies offer radically new perspectives for assessing these traits at low cost, faster, and in a more objective manner. We report the use of an unmanned aerial vehicle (UAV) equipped with an RGB camera for crop cover and canopy senescence assessment in maize field trials. Aerial-imaging-derived data showed a moderately high heritability for both traits with a significant genetic correlation with grain yield. In addition, in some cases, the correlation between the visual assessment (prone to subjectivity) of crop senescence and the senescence index, calculated from aerial imaging data, was significant. We concluded that the UAV-based aerial sensing platforms have great potential for monitoring the dynamics of crop canopy characteristics like crop vigor through ground canopy cover and canopy senescence in breeding trial plots. This is anticipated to assist in improving selection efficiency through higher accuracy and precision, as well as reduced time and cost of data collection. Full article
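Ground canopy cover from an RGB image reduces to classifying pixels as vegetation and taking the green fraction. A common route, used here for illustration rather than as the authors' exact classifier, is thresholding the excess green index on chromatic coordinates:

```python
def excess_green(r, g, b):
    """Excess green index ExG = 2g - r - b on chromatic (sum-normalized) coordinates."""
    total = r + g + b
    if total == 0:
        return 0.0
    rn, gn, bn = r / total, g / total, b / total
    return 2 * gn - rn - bn

def canopy_cover(pixels, exg_threshold=0.1):
    """Fraction of (r, g, b) pixels classified as green canopy
    (the threshold value is an illustrative assumption)."""
    flags = [excess_green(*px) > exg_threshold for px in pixels]
    return sum(flags) / len(flags)
```

Tracking the decline of this cover fraction late in the season gives an objective senescence index to replace visual scoring.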

16 pages, 5858 KiB  
Article
Estimating Barley Biomass with Crop Surface Models from Oblique RGB Imagery
by Sebastian Brocks and Georg Bareth
Remote Sens. 2018, 10(2), 268; https://doi.org/10.3390/rs10020268 - 09 Feb 2018
Cited by 63 | Viewed by 6415
Abstract
Non-destructive monitoring of crop development is of key interest for agronomy and crop breeding. Crop Surface Models (CSMs) representing the absolute height of the plant canopy are a tool for this. In this study, fresh and dry barley biomass per plot are estimated from CSM-derived plot-wise plant heights. The CSMs are generated in a semi-automated manner using Structure-from-Motion (SfM)/Multi-View-Stereo (MVS) software from oblique stereo RGB images. The images were acquired automatically from consumer-grade smart cameras mounted at an elevated position on a lifting hoist. Fresh and dry biomass were measured destructively at four dates each in 2014 and 2015. We used exponential and simple linear regression based on different calibration/validation splits. Coefficients of determination R2 between 0.55 and 0.79 and root mean square errors (RMSE) between 97 and 234 g/m2 are reached for the validation of predicted vs. observed dry biomass, while Willmott’s refined index of model performance dr ranges between 0.59 and 0.77. For fresh biomass, R2 values between 0.34 and 0.61 are reached, with RMSEs between 312 and 785 g/m2 and dr between 0.39 and 0.66. We therefore established the possibility of using this novel low-cost system to estimate barley dry biomass over time. Full article
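The exponential biomass-height regression can be fitted by ordinary least squares after a log transform of the biomass values. A sketch on toy data; the fitted coefficients in the paper are not reproduced:

```python
import math

def fit_exponential(heights, biomass):
    """Fit biomass = a * exp(b * height) by linear least squares on ln(biomass);
    returns (a, b)."""
    logs = [math.log(y) for y in biomass]
    n = len(heights)
    mx, my = sum(heights) / n, sum(logs) / n
    b = sum((x - mx) * (y - my) for x, y in zip(heights, logs)) / \
        sum((x - mx) ** 2 for x in heights)
    a = math.exp(my - b * mx)
    return a, b
```

Log-linearization keeps the fit in closed form; a nonlinear solver would instead minimize error in the original biomass units, which weights large plots more heavily.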

24 pages, 9483 KiB  
Article
Recognition of Wheat Spike from Field Based Phenotype Platform Using Multi-Sensor Fusion and Improved Maximum Entropy Segmentation Algorithms
by Chengquan Zhou, Dong Liang, Xiaodong Yang, Bo Xu and Guijun Yang
Remote Sens. 2018, 10(2), 246; https://doi.org/10.3390/rs10020246 - 06 Feb 2018
Cited by 48 | Viewed by 6289
Abstract
To obtain an accurate count of wheat spikes, which is crucial for estimating yield, this paper proposes a new computer-vision algorithm for counting spikes from an image. First, a home-built semi-autonomous multi-sensor field-based phenotype platform (FPP) is used to obtain orthographic images of wheat plots at the filling stage. The data acquisition system of the FPP provides high-definition RGB images and multispectral images of the corresponding quadrats. High-definition panchromatic images are then obtained by fusing the three RGB channels, and the Gram–Schmidt fusion algorithm is used to fuse these multispectral and panchromatic images, thereby improving the color separability of the targets. Next, the maximum entropy segmentation method is used for coarse segmentation; its threshold is determined by a firefly algorithm based on chaos theory (FACT), and a morphological filter is then used to de-noise the coarse-segmentation results. Finally, morphological reconstruction theory is applied to segment the adhering regions of the de-noised image and realize fine segmentation. The computer-generated counts for the wheat plots, obtained with the independent regional statistical function in Matlab R2017b, are then compared with field measurements, which indicate that the proposed method counts wheat spikes more accurately than the other fusion and segmentation methods considered in this paper.
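The coarse-segmentation step can be illustrated with Kapur's maximum-entropy thresholding. The paper tunes the threshold with a chaotic firefly algorithm (FACT); this sketch substitutes an exhaustive search, which is exact for 8-bit histograms:

```python
# Sketch: maximum-entropy (Kapur) thresholding. Exhaustive search replaces
# the paper's firefly optimizer; the toy image is synthetic.
import numpy as np

def max_entropy_threshold(gray):
    """Return the threshold maximising the summed entropy of the
    background and foreground histogram partitions (Kapur's criterion)."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, 255):
        p0, p1 = p[:t].sum(), p[t:].sum()
        if p0 <= 0 or p1 <= 0:
            continue
        q0, q1 = p[:t] / p0, p[t:] / p1
        h0 = -np.sum(q0[q0 > 0] * np.log(q0[q0 > 0]))
        h1 = -np.sum(q1[q1 > 0] * np.log(q1[q1 > 0]))
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t

# Bimodal toy image: dark soil background near 40, bright spikes near 200
rng = np.random.default_rng(0)
img = np.clip(np.concatenate([rng.normal(40, 5, 5000),
                              rng.normal(200, 5, 1000)]), 0, 255)
t = max_entropy_threshold(img)
print(t)  # falls between the two intensity modes
```

A metaheuristic such as the firefly algorithm becomes useful when the threshold search space is larger (e.g., multi-level thresholding), where exhaustive search is no longer cheap.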

23 pages, 7046 KiB  
Article
A Comparison of Regression Techniques for Estimation of Above-Ground Winter Wheat Biomass Using Near-Surface Spectroscopy
by Jibo Yue, Haikuan Feng, Guijun Yang and Zhenhai Li
Remote Sens. 2018, 10(1), 66; https://doi.org/10.3390/rs10010066 - 05 Jan 2018
Cited by 145 | Viewed by 8996
Abstract
Above-ground biomass (AGB) provides a vital link between solar energy consumption and yield, so its correct estimation is crucial for accurately monitoring crop growth and predicting yield. In this work, we estimate AGB by using 54 vegetation indexes (e.g., Normalized Difference Vegetation Index, Soil-Adjusted Vegetation Index) and eight statistical regression techniques: artificial neural network (ANN), multivariable linear regression (MLR), decision-tree regression (DT), boosted binary regression tree (BBRT), partial least squares regression (PLSR), random forest regression (RF), support vector machine regression (SVM), and principal component regression (PCR), applied to hyperspectral data acquired with a field spectrophotometer. The vegetation indexes (VIs) determined from the spectra were first used to train the regression techniques for modeling and validation, in order to select the best VI input, and were then summed with white Gaussian noise to study how remote-sensing errors affect the techniques. Next, the VIs were divided into groups of different sizes by using various sampling methods for modeling and validation, to test the stability of the techniques. Finally, AGB was estimated by using leave-one-out cross-validation with these techniques. The results demonstrate that, of the eight techniques investigated, PLSR and MLR perform best in terms of stability and are most suitable when high-accuracy, stable estimates are required from relatively few samples. In addition, RF is extremely robust against noise and is best suited to repeated observations involving remote-sensing data (i.e., data affected by atmosphere, clouds, observation times, and/or sensor noise).
Finally, the leave-one-out cross-validation indicates that PLSR provides the highest accuracy (R2 = 0.89, RMSE = 1.20 t/ha, MAE = 0.90 t/ha, NRMSE = 0.07, CV(RMSE) = 0.18); thus, PLSR is best suited for work requiring high-accuracy estimation models. Overall, all eight techniques achieve good accuracy. The comparison and analysis provided herein reveal the advantages and disadvantages of the ANN, MLR, DT, BBRT, PLSR, RF, SVM, and PCR techniques and can help researchers build efficient AGB-estimation models.

4420 KiB  
Article
High Throughput Phenotyping of Blueberry Bush Morphological Traits Using Unmanned Aerial Systems
by Aaron Patrick and Changying Li
Remote Sens. 2017, 9(12), 1250; https://doi.org/10.3390/rs9121250 - 02 Dec 2017
Cited by 32 | Viewed by 6953
Abstract
Phenotyping morphological traits of blueberry bushes in the field is important for selecting genotypes that are easily harvested by mechanical harvesters. Morphological data can also be used to assess the effects of crop treatments such as plant growth regulators, fertilizers, and environmental conditions. This paper investigates the feasibility and accuracy of an inexpensive unmanned aerial system for determining the morphological characteristics of blueberry bushes. Color images collected by a quadcopter are processed into three-dimensional point clouds via structure-from-motion algorithms. Bush height, extents, canopy area, and volume, as well as crown diameter and width, are derived and referenced to ground truth. In an experimental farm, twenty-five bushes were imaged by a quadcopter. Height and width dimensions achieved a mean absolute error of 9.85 cm before, and 5.82 cm after, correction for systematic under-estimation. Strong correlation was found between manual and image-derived bush volumes and their traditional growth indices. Hedgerows of three Southern Highbush varieties were imaged at a commercial farm to extract five morphological features (base angle, blockiness, crown percent height, crown ratio, and vegetation ratio) associated with cultivation and machine harvestability. The bushes were found to be partially separable by multivariate analysis. The methodology developed in this study is valuable not only for plant breeders screening genotypes with bush morphological traits suitable for machine harvest, but also for producers in crop management tasks such as pruning and plot layout organization.
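The point-cloud-to-morphology step can be sketched as below, using a convex-hull volume as one common proxy for canopy volume; the paper's exact trait definitions may differ:

```python
# Sketch: bush height, extents and canopy volume from an SfM point cloud.
# Convex-hull volume is an assumed proxy; the toy "bush" is a unit cube.
import numpy as np
from scipy.spatial import ConvexHull

def bush_morphology(points):
    """points: (N, 3) array of x, y, z coordinates in metres."""
    z0 = points[:, 2].min()                      # ground reference
    height = points[:, 2].max() - z0
    extent_x = np.ptp(points[:, 0])              # widest x spread
    extent_y = np.ptp(points[:, 1])
    volume = ConvexHull(points).volume           # canopy volume proxy
    return height, extent_x, extent_y, volume

# Toy "bush": the eight corner points of a unit cube
cube = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)],
                dtype=float)
h, ex, ey, v = bush_morphology(cube)
print(h, ex, ey, v)
```

Real SfM clouds need a ground/vegetation separation first; here the lowest point is simply taken as the ground reference.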

14036 KiB  
Article
Evaluation of Seed Emergence Uniformity of Mechanically Sown Wheat with UAV RGB Imagery
by Tao Liu, Rui Li, Xiuliang Jin, Jinfeng Ding, Xinkai Zhu, Chengming Sun and Wenshan Guo
Remote Sens. 2017, 9(12), 1241; https://doi.org/10.3390/rs9121241 - 01 Dec 2017
Cited by 39 | Viewed by 7645
Abstract
The uniformity of wheat seed emergence is an important characteristic used to evaluate cultivars, cultivation mode, and field management. Currently, researchers typically investigate the uniformity of seed emergence by manual measurement, a time-consuming and laborious process. This study employed field RGB images from unmanned aerial vehicles (UAVs) to obtain information on the uniformity of wheat seed emergence and on missing seedlings. The lengths of areas with missing seedlings in both drill and broadcast sowing were calculated with an area localization algorithm, which facilitated a comprehensive evaluation of the uniformity of seed emergence. Comparing the UAV images against manual surveys of the uniformity of seed emergence, the root-mean-square error (RMSE) was 0.44 for broadcast sowing and 0.64 for drill sowing. The RMSEs of the numbers of missing-seedling regions for broadcast and drill sowing were 1.39 and 3.99, respectively, and the RMSEs of the lengths of the missing-seedling regions were 12.39 cm for drill sowing and 0.20 cm for broadcast sowing. The UAV image-based method provides a new and greatly improved way to measure the uniformity of wheat seed emergence efficiently, and could serve as a guideline for its intelligent evaluation.

Other


19 pages, 8034 KiB  
Technical Note
Measuring Canopy Structure and Condition Using Multi-Spectral UAS Imagery in a Horticultural Environment
by Yu-Hsuan Tu, Kasper Johansen, Stuart Phinn and Andrew Robson
Remote Sens. 2019, 11(3), 269; https://doi.org/10.3390/rs11030269 - 30 Jan 2019
Cited by 54 | Viewed by 7935
Abstract
Tree condition, pruning, and orchard management practices within intensive horticultural tree crop systems can be determined via measurements of tree structure. Multi-spectral imagery acquired from an unmanned aerial system (UAS) has been demonstrated as an accurate and efficient platform for measuring various tree structural attributes, but research in complex horticultural environments has been limited. This research established a methodology for accurately estimating tree crown height, extent, plant projective cover (PPC), and condition of avocado tree crops from a UAS platform. Individual tree crowns were delineated using object-based image analysis. In comparison to field-measured canopy heights, an image-derived canopy height model provided a coefficient of determination (R2) of 0.65 and a relative root mean squared error of 6%. Tree crown length perpendicular to the hedgerow was accurately mapped. PPC was measured using spectral and textural image information and produced an R2 value of 0.62 against field data. A random forest classifier was applied to assign tree condition to four categories in accordance with industry standards, producing out-of-bag accuracies >96%. Our results demonstrate the potential of UAS-based mapping to provide information that supports the horticulture industry and facilitates orchard-based assessment and management.
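The canopy-height-model comparison described above follows the usual construction CHM = DSM − DTM (surface minus terrain elevation); a toy sketch with illustrative array values:

```python
# Sketch: per-pixel canopy height model (CHM) from a digital surface model
# (DSM) and digital terrain model (DTM). All raster values are illustrative.
import numpy as np

dsm = np.array([[102.0, 103.5, 102.8],
                [101.9, 104.1, 103.0],
                [100.2, 100.3, 100.1]])   # surface elevation (m)
dtm = np.full_like(dsm, 100.0)            # terrain elevation (m)

chm = dsm - dtm                           # canopy height model
crown_mask = chm > 1.0                    # crude crown delineation
crown_height = chm[crown_mask].max()      # per-crown height estimate
ppc = crown_mask.mean()                   # plant-projective-cover proxy
print(round(float(crown_height), 1), round(float(ppc), 2))
```

In the paper the crown delineation uses object-based image analysis rather than a fixed height cut-off, and PPC additionally draws on spectral and textural features.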

16 pages, 6248 KiB  
Technical Note
Prediction of Chlorophyll Content in Different Light Areas of Apple Tree Canopies based on the Color Characteristics of 3D Reconstruction
by Xiaodan Ma, Jiarui Feng, Haiou Guan and Gang Liu
Remote Sens. 2018, 10(3), 429; https://doi.org/10.3390/rs10030429 - 10 Mar 2018
Cited by 34 | Viewed by 6645
Abstract
Improving the speed and accuracy of chlorophyll (Chl) content prediction in different light areas of apple trees is a central priority for understanding the growth response to light intensity and, in turn, increasing the primary production of apples. In vitro assessment by wet chemical extraction is the standard method for leaf chlorophyll determination, but this measurement is expensive, laborious, and time-consuming. Over the years, rapid and nondestructive alternatives have been explored, and many vegetation indices (VIs) were developed to retrieve Chl content at the canopy level from meter- to decameter-scale reflectance observations; these have lower accuracy due to the possible confounding influence of canopy structure. Thus, the spatially continuous distribution of Chl content in different light areas within an apple tree canopy remains unresolved. The objective of this study is therefore to develop methods for estimating Chl content in areas of different light intensity by using 3D models with color characteristics acquired by a 3D laser scanner at centimeter spatial resolution. Firstly, to characterize relative light intensity (RLI), canopies were scanned with a FARO Focus3D 120 laser scanner on a calm day without strong light and then divided into 180 cubic units per canopy according to distance information in three-dimensional space, and four RLI classes were defined: 0–30%, 30–60%, 60–85%, and 85–100%. Secondly, Chl content in the 180 cubic units of each apple tree was measured with a leaf chlorophyll meter (soil and plant analyzer development, SPAD). Then, color characteristics were extracted from each cubic unit of the 3D model and expressed by two color variables, which can serve as effective indicators of Chl content in field crops.
Finally, to address the complexity and fuzziness of the relationship between color characteristics and the Chl content of apple tree canopies (which could not be expressed by an exact mathematical model), a three-layer artificial neural network (ANN) was constructed as a predictive model of Chl content in different light areas of the canopy. The results indicated that the highest and lowest mean Chl contents occurred in the 60–85% and 0–30% RLI areas, respectively, and that there was no significant difference between adjacent RLI areas. Additionally, color characteristics changed regularly as RLI rose within canopies. Moreover, predicted Chl content was strongly correlated with actual SPAD measurements (R = 0.9755). In summary, color characteristics of 3D apple tree canopies combined with ANN technology could serve as a rapid technique for predicting Chl content in different light areas of apple tree canopies.
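The ANN step, mapping two color variables to SPAD-measured Chl content, can be sketched with scikit-learn's MLPRegressor standing in for the authors' three-layer network; the data are synthetic:

```python
# Sketch: a small neural network regressing SPAD-measured chlorophyll on
# two color variables. Data and network size are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
color = rng.uniform(0, 1, size=(200, 2))            # two color variables
spad = 20 + 30 * color[:, 0] - 10 * color[:, 1]     # synthetic Chl (SPAD)

net = MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                   max_iter=5000, random_state=0).fit(color, spad)
pred = net.predict(color)
r = np.corrcoef(pred, spad)[0, 1]
print(f"training correlation R = {r:.3f}")
```

In practice the network would be trained on the cubic-unit color features and validated against held-out SPAD readings rather than evaluated on the training set as here.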

4023 KiB  
Technical Note
Estimation of Wheat LAI at Middle to High Levels Using Unmanned Aerial Vehicle Narrowband Multispectral Imagery
by Xia Yao, Ni Wang, Yong Liu, Tao Cheng, Yongchao Tian, Qi Chen and Yan Zhu
Remote Sens. 2017, 9(12), 1304; https://doi.org/10.3390/rs9121304 - 12 Dec 2017
Cited by 108 | Viewed by 8805
Abstract
Leaf area index (LAI) is a significant biophysical variable in models of hydrology, climatology, and crop growth, and its rapid monitoring is critical in modern precision agriculture. Remote sensing from satellites, aircraft, and unmanned aerial vehicles (UAVs) has become a popular technique for monitoring crop LAI; among these platforms, UAVs are highly attractive to researchers and agriculturists. However, some UAV vegetation index (VI)-derived LAI models have relatively low accuracy because of the limited number of multispectral bands, especially as they tend to saturate at middle to high LAI levels, which are the LAI levels of high-yielding wheat crops in China. This study aims to effectively estimate wheat LAI from UAV narrowband multispectral imagery (400–800 nm spectral region, 10 cm resolution) under varying growth conditions during five critical growth stages, and to provide potential technical support for optimizing nitrogen fertilization. Results demonstrated that a newly developed LAI model based on the modified triangular vegetation index (MTVI2) has better accuracy, with a higher coefficient of determination (Rc2 = 0.79, Rv2 = 0.80), a lower relative root mean squared error (RRMSE = 24%), and higher sensitivity across LAI values from 2 to 7, which broadens the applicable range of the model. Furthermore, the model performed stably across sub-categories of growth stage, variety, and eco-site. In conclusion, this study provides effective technical support for precisely monitoring crop growth with UAVs across crop yield levels, which should prove helpful to family farms in modern agriculture.
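The MTVI2 index used above is, as commonly attributed to Haboudane et al., computed from reflectance at 550, 670, and 800 nm; a sketch with illustrative band values:

```python
# Sketch: MTVI2 from green (550 nm), red (670 nm) and NIR (800 nm)
# reflectance. Band values below are illustrative, not from the paper.
import numpy as np

def mtvi2(r550, r670, r800):
    num = 1.5 * (1.2 * (r800 - r550) - 2.5 * (r670 - r550))
    den = np.sqrt((2 * r800 + 1) ** 2 - (6 * r800 - 5 * np.sqrt(r670)) - 0.5)
    return num / den

# Typical dense-canopy reflectances: low red, moderate green, high NIR
print(round(float(mtvi2(0.08, 0.05, 0.50)), 3))  # -> 0.683
```

The soil-adjustment term in the denominator is what keeps MTVI2 sensitive at the middle-to-high LAI levels where simpler ratio indices saturate.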
