Special Issue "Estimation of Crop Phenotyping Traits using Unmanned Ground Vehicle and Unmanned Aerial Vehicle Imagery"

A special issue of Remote Sensing (ISSN 2072-4292). This special issue belongs to the section "Remote Sensing in Agriculture and Vegetation".

Deadline for manuscript submissions: closed (31 December 2018)

Special Issue Editors

Guest Editor
Dr. Xiuliang Jin

INRA, UMR-EMMAH,UMT-CAPTE, 228 Route de l'aérodrome, CS 40509, F-84914, Avignon, France
Interests: hyperspectral and multispectral remote sensing; crop models; data assimilation; crop phenotyping traits; UAV-based sensors; UGV-based sensors; precision farming
Guest Editor
Dr. Zhenhai Li

National Engineering Research Center for Information Technology in Agriculture (NERCITA), Beijing Research Center for Information Technology in Agriculture, Beijing Academy of Agriculture and Forestry Sciences, 11 Middle Road, Haidian District, Beijing 100097, China
Interests: hyperspectral and multispectral remote sensing; crop models; grain yield and quality prediction; crop nitrogen monitoring; radiative transfer model
Guest Editor
Prof. Dr. Clement Atzberger

University of Natural Resources and Life Sciences (BOKU), Vienna, Austria
Phone: +43 (1) 47654 5101
Interests: time series analysis; vegetation monitoring and dynamics; land surface phenology; drought early warning systems; EO for agriculture, forestry and natural resource management; imaging spectroscopy; radiative transfer modeling; machine learning; neural nets; vegetation biophysical variables

Special Issue Information

Dear Colleagues,

To meet global food security challenges under changing climatic scenarios, it is essential to enhance crop yield while using limited resources efficiently. Accurate and precise measurement of crop phenotyping traits plays an important role in harnessing the potential of genomic resources for the genetic improvement of crop yield. In traditional crop phenotyping, traits are measured manually and assessed with statistical methods, which demands considerable human effort, time, and resources. With the rapid development of Unmanned Ground Vehicles (UGVs), Unmanned Aerial Vehicles (UAVs), sensor technologies, and image algorithms, integrated UGV/UAV platforms combining sensors and algorithms are increasingly being adopted for automatic crop phenotyping, overcoming the limitations of manual techniques. These high-throughput, non-invasive crop phenotyping platforms have been used to estimate LAI, canopy cover, nitrogen, chlorophyll, biomass, plant structure, plant density, phenology, leaf health, canopy/leaf temperature, and the physiological state of the photosynthetic machinery under different stress conditions. They have advanced to the point of supporting genomics-enabled improvement and addressing the need for precise and efficient phenotyping of crop plants. They will also help in finding solutions to the major problems that currently limit crop production.

This Special Issue focuses on the latest innovative research results in remote sensing technology, sensor technologies, and imagery algorithm development and applications, specifically addressing the estimation of crop phenotyping traits from UGV and UAV imagery. The list below provides a general (but not exhaustive) overview of the topics solicited for this Special Issue:

  • Applications of UGV and UAV platforms for crop phenotyping traits
  • Imagery algorithms (data fusion, segmentation, classification, machine learning, deep learning, etc.) to estimate crop phenotyping traits
  • Applications of sensors (RGB, multispectral, hyperspectral, thermal, LiDAR, fluorescence, etc.) for crop phenotyping traits
  • Combining data from different sensors to improve the estimation accuracy of crop phenotyping traits
  • Assimilation of multisource images into two- or three-dimensional crop models

Dr. Xiuliang Jin
Dr. Zhenhai Li
Prof. Dr. Clement Atzberger
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Remote Sensing is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • crop phenotyping traits
  • unmanned ground vehicle
  • unmanned aerial vehicle imagery
  • imagery algorithms
  • segmentation
  • classification
  • machine learning
  • multi-sensor data
  • data assimilation
  • two- or three-dimensional crop models

Published Papers (26 papers)


Research


Open Access Article: Spectral Reflectance Modeling by Wavelength Selection: Studying the Scope for Blueberry Physiological Breeding under Contrasting Water Supply and Heat Conditions
Remote Sens. 2019, 11(3), 329; https://doi.org/10.3390/rs11030329
Received: 19 December 2018 / Revised: 28 January 2019 / Accepted: 30 January 2019 / Published: 7 February 2019
Abstract
To overcome the environmental changes occurring now and predicted for the future, it is essential that fruit breeders develop cultivars with better physiological performance. During the last few decades, high-throughput plant phenotyping and phenomics have been developed primarily in cereal breeding programs. In this study, plant reflectance, at the level of the leaf, was used to assess several physiological traits in five Vaccinium spp. cultivars growing under four controlled conditions (no-stress, water deficit, heat stress, and combined stress). Two modeling methodologies [Multiple Linear Regression (MLR) and Partial Least Squares (PLS)] with or without (W/O) prior wavelength selection (multicollinearity, genetic algorithms, or in combination) were considered. PLS generated better estimates than MLR, although prior wavelength selection improved MLR predictions. When data from the environments were combined, PLS W/O gave the best assessment for most of the traits, while in individual environments, the results varied according to the trait and methodology considered. The highest validation predictions were obtained for chlorophyll a/b (R2Val ≤ 0.87), maximum electron transport rate (R2Val ≤ 0.60), and the irradiance at which the electron transport rate is saturated (R2Val ≤ 0.59). The results of this study, the first to model modulated chlorophyll fluorescence by reflectance, confirm the potential for implementing this tool in blueberry breeding programs, at least for the estimation of a number of important physiological traits. Additionally, the differential effects of the environment on the spectral signature of each cultivar show that this tool could be directly used to assess cultivar tolerance to specific environments. Full article
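The wavelength-selection-plus-regression workflow described in this abstract can be illustrated with a minimal sketch. The synthetic reflectance data and the correlation-based band filter below are assumptions for illustration only; the paper itself screens bands via multicollinearity and genetic algorithms, and finds PLS to be the stronger regressor.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_bands = 60, 200

# synthetic leaf reflectance spectra and a trait driven by two informative bands
reflectance = rng.random((n_samples, n_bands))
trait = (3.0 * reflectance[:, 50]
         - 2.0 * reflectance[:, 120]
         + 0.1 * rng.standard_normal(n_samples))

# wavelength selection: keep the k bands most correlated with the trait
k = 5
corr = np.array([abs(np.corrcoef(reflectance[:, i], trait)[0, 1])
                 for i in range(n_bands)])
selected = np.argsort(corr)[-k:]

# multiple linear regression (MLR) on the selected bands, via least squares
X = np.column_stack([reflectance[:, selected], np.ones(n_samples)])
coef, *_ = np.linalg.lstsq(X, trait, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((trait - pred) ** 2) / np.sum((trait - trait.mean()) ** 2)
```

Restricting MLR to a handful of pre-selected bands is what keeps it stable on hyperspectral data, where using all bands at once would make the normal equations badly conditioned.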

Open Access Article: Maize Plant Phenotyping: Comparing 3D Laser Scanning, Multi-View Stereo Reconstruction, and 3D Digitizing Estimates
Remote Sens. 2019, 11(1), 63; https://doi.org/10.3390/rs11010063
Received: 27 November 2018 / Revised: 27 December 2018 / Accepted: 28 December 2018 / Published: 31 December 2018
Abstract
High-throughput phenotyping technologies have become an increasingly important topic of crop science in recent years. Various sensors and data acquisition approaches have been applied to acquire phenotyping traits, and it can be difficult for crop phenotyping researchers to determine which approach is appropriate for their application. In this study, three representative three-dimensional (3D) data acquisition approaches, including 3D laser scanning, multi-view stereo (MVS) reconstruction, and 3D digitizing, were evaluated for maize plant phenotyping at multiple growth stages. Phenotyping trait accuracy, post-processing difficulty, device cost, data acquisition efficiency, and automation were considered during the evaluation process. 3D laser scanning provided satisfactory point clouds for medium and tall maize plants with acceptable efficiency, while the results were not satisfactory for small maize plants; the equipment is expensive but highly automatic. MVS reconstruction provided satisfactory point clouds for small and medium plants, although point deviations were observed in the upper parts of taller plants. MVS data acquisition, using low-cost cameras, exhibited the highest efficiency among the three evaluated approaches, and its one-by-one pipeline acquisition pattern allows high-throughput use of MVS in future phenotyping platforms. Undoubtedly, enhancement of point cloud processing technologies is required to improve the accuracy of the extracted phenotyping traits for both 3D laser scanning and MVS reconstruction. Finally, 3D digitizing was time-consuming and labor-intensive; however, it does not depend on any post-processing algorithms to extract phenotyping parameters, and reliable phenotyping traits could be derived. The promising accuracy of 3D digitizing makes it a suitable verification reference for other 3D phenotyping approaches. Our study provides a clear reference for phenotyping data acquisition of maize plants, especially for the affordable and portable field phenotyping platforms to be developed. Full article

Open Access Article: Intra-Season Crop Height Variability at Commercial Farm Scales Using a Fixed-Wing UAV
Remote Sens. 2018, 10(12), 2007; https://doi.org/10.3390/rs10122007
Received: 24 September 2018 / Revised: 3 December 2018 / Accepted: 4 December 2018 / Published: 11 December 2018
Abstract
Monitoring the development of vegetation height through time provides a key indicator of crop health and overall condition. Traditional manual approaches for monitoring crop height are generally time consuming, labor intensive and impractical for large-scale operations. Dynamic crop heights collected through the season allow for the identification of within-field problems at critical stages of the growth cycle, providing a mechanism for remedial action to be taken against end of season yield losses. With advances in unmanned aerial vehicle (UAV) technologies, routine monitoring of height is now feasible at any time throughout the growth cycle. To demonstrate this capability, five digital surface maps (DSM) were reconstructed from high-resolution RGB imagery collected over a field of maize during the course of a single growing season. The UAV retrievals were compared against LiDAR scans for the purpose of evaluating the derived point clouds' capacity to capture ground surface variability and spatially variable crop height. A strong correlation was observed between structure-from-motion (SfM) derived heights and pixel-to-pixel comparison against LiDAR scan data for the intra-season bare-ground surface (R2 = 0.77–0.99, rRMSE = 0.44–0.85%), while there was reasonable agreement between canopy comparisons (R2 = 0.57–0.65, rRMSE = 37–50%). To examine the effect of resolution on retrieval accuracy and processing time, an evaluation of several ground sampling distances (GSD) was also performed. Our results indicate that a 10 cm resolution retrieval delivers a reliable product that provides a compromise between computational cost and spatial fidelity. Overall, UAV retrievals were able to accurately reproduce the observed spatial variability of crop heights within the maize field through the growing season and provide a valuable source of information with which to inform precision agricultural management in an operational context. Full article
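At its core, the height retrieval and LiDAR comparison described above reduce to differencing a canopy surface model against a bare-ground terrain model and scoring pixel-to-pixel agreement. The toy 3×3 arrays and the offset "reference" below are hypothetical values for illustration, not the study's data:

```python
import numpy as np

# digital surface model (canopy top) and digital terrain model (bare ground), metres
dsm = np.array([[2.1, 2.3, 1.9],
                [2.0, 2.4, 2.2],
                [1.8, 2.1, 2.0]])
dtm = np.array([[0.3, 0.3, 0.2],
                [0.2, 0.3, 0.3],
                [0.2, 0.2, 0.3]])
chm = dsm - dtm  # crop height model: per-pixel canopy height

# pixel-to-pixel agreement against a reference height map (e.g. from LiDAR);
# here the reference is the CHM plus a constant 5 cm bias, purely for illustration
ref = chm + 0.05
ss_res = np.sum((chm - ref) ** 2)
ss_tot = np.sum((ref - ref.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
rrmse = np.sqrt(np.mean((chm - ref) ** 2)) / ref.mean() * 100  # percent
```

The same two metrics (R2 and relative RMSE) are the ones the abstract reports for the bare-ground and canopy comparisons.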

Open Access Article: Automated Open Cotton Boll Detection for Yield Estimation Using Unmanned Aircraft Vehicle (UAV) Data
Remote Sens. 2018, 10(12), 1895; https://doi.org/10.3390/rs10121895
Received: 31 October 2018 / Revised: 23 November 2018 / Accepted: 25 November 2018 / Published: 27 November 2018
Abstract
Unmanned aerial vehicle (UAV) images have great potential for various agricultural applications. In particular, UAV systems facilitate timely and precise data collection in agriculture fields at high spatial and temporal resolutions. In this study, we propose an automatic open cotton boll detection algorithm using ultra-fine spatial resolution UAV images. Seed points for a region growing algorithm were generated hierarchically with a random base for computation efficiency. Cotton boll candidates were determined based on the spatial features of each region growing segment. Spectral threshold values that automatically separate cotton bolls from other non-target objects were derived based on input images for adaptive application. Finally, a binary cotton boll classification was performed using the derived threshold values and other morphological filters to reduce noise from the results. The open cotton boll classification results were validated using reference data and the results showed an accuracy higher than 88% in various evaluation measures. Moreover, the UAV-extracted cotton boll area and actual crop yield had a strong positive correlation (0.8). The proposed method leverages UAV characteristics such as high spatial resolution and accessibility by applying automatic and unsupervised procedures using images from a single date. Additionally, this study verified the extraction of target regions of interest from UAV images for direct yield estimation. Cotton yield estimation models had R2 values between 0.63 and 0.65 and RMSE values between 0.47 kg and 0.66 kg per plot grid. Full article
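The detection pipeline summarized above (image-derived brightness threshold, spatial grouping of candidate pixels, size filtering to suppress noise) can be sketched in miniature. The toy image, the mean-plus-one-standard-deviation threshold, and the minimum component size are illustrative assumptions, not the paper's actual parameters, which use hierarchical region growing from random seed points:

```python
import numpy as np
from collections import deque

# toy grayscale UAV patch: bright pixels (white cotton bolls) on darker canopy
img = np.array([
    [30, 32, 210, 215, 31],
    [28, 212, 220, 31, 29],
    [30, 31, 33, 30, 200],
    [29, 30, 31, 205, 210],
], dtype=float)

# threshold derived from the image itself (assumption: bolls form the bright tail)
thresh = img.mean() + img.std()
binary = img > thresh

def label_components(mask, min_size=2):
    """4-connected component labelling; components below min_size are noise."""
    seen = np.zeros_like(mask, dtype=bool)
    comps = []
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not seen[r, c]:
                queue, comp = deque([(r, c)]), []
                seen[r, c] = True
                while queue:
                    y, x = queue.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                if len(comp) >= min_size:
                    comps.append(comp)
    return comps

bolls = label_components(binary)
boll_area = sum(len(c) for c in bolls)  # pixel area, the yield proxy in the study
```

The extracted boll pixel area is the quantity the paper correlates with measured yield (r = 0.8 per plot grid).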

Open Access Article: How Far Can Consumer-Grade UAV RGB Imagery Describe Crop Production? A 3D and Multitemporal Modeling Approach Applied to Zea mays
Remote Sens. 2018, 10(11), 1798; https://doi.org/10.3390/rs10111798
Received: 24 September 2018 / Revised: 7 November 2018 / Accepted: 7 November 2018 / Published: 13 November 2018
Abstract
In recent decades, remote sensing has increasingly been used to estimate the spatio-temporal evolution of crop biophysical parameters such as the above-ground biomass (AGB). On a local scale, the advent of unmanned aerial vehicles (UAVs) seems to be a promising trade-off between satellite/airborne and terrestrial remote sensing. This study aims to evaluate the potential of a low-cost UAV RGB solution to predict the final AGB of Zea mays. Besides evaluating the interest of 3D data and multitemporality, our study aims to answer operational questions such as when one should plan a combination of two UAV flights for AGB modeling. In this case study, final AGB prediction model performance reached 0.55 (R-square) using only UAV information and 0.8 (R-square) when combining UAV information from a single flight with a single-field AGB measurement. Adding UAV height information to the model improves the quality of the AGB prediction. Performing two flights almost systematically improves AGB prediction ability in comparison to most single flights. Our study provides clear insight into how the low spectral resolution of consumer-grade RGB cameras can be countered using height information and multitemporality. Our results highlight the importance of the height information that can be derived from UAV data on the one hand, and, on the other hand, the lower relative importance of RGB spectral information. Full article

Open Access Article: High-Throughput Phenotyping of Crop Water Use Efficiency via Multispectral Drone Imagery and a Daily Soil Water Balance Model
Remote Sens. 2018, 10(11), 1682; https://doi.org/10.3390/rs10111682
Received: 12 September 2018 / Revised: 19 October 2018 / Accepted: 22 October 2018 / Published: 25 October 2018
Abstract
Improvement of crop water use efficiency (CWUE), defined as crop yield per volume of water used, is an important goal for both crop management and breeding. While many technologies have been developed for measuring crop water use in crop management studies, rarely have these techniques been applied at the scale of breeding plots. The objective was to develop a high-throughput methodology for quantifying water use in a cotton breeding trial at Maricopa, AZ, USA in 2016 and 2017, using evapotranspiration (ET) measurements from a co-located irrigation management trial to evaluate the approach. Approximately weekly overflights with an unmanned aerial system provided multispectral imagery from which plot-level fractional vegetation cover (fc) was computed. The fc data were used to drive a daily ET-based soil water balance model for seasonal crop water use quantification. A mixed model statistical analysis demonstrated that differences in ET and CWUE could be discriminated among eight cotton varieties (p < 0.05), which were sown at two planting dates and managed with four irrigation levels. The results permitted breeders to identify cotton varieties with more favorable water use characteristics and higher CWUE, indicating that the methodology could become a useful tool for breeding selection. Full article
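An fc-driven daily soil water balance can be sketched as a simple bookkeeping loop over the season. All numbers and the linear fc-to-crop-coefficient mapping below are hypothetical placeholders, not the study's calibrated model:

```python
# daily inputs: reference ET (mm), rain + irrigation (mm), and UAV-derived
# fractional vegetation cover interpolated to each day (all values hypothetical)
et_ref = [6.0, 6.5, 7.0, 6.8, 7.2]
water_in = [0.0, 10.0, 0.0, 0.0, 5.0]
f_c = [0.30, 0.32, 0.35, 0.38, 0.40]

taw = 120.0        # total available water in the root zone (mm)
depletion = 20.0   # initial root-zone depletion (mm)
season_et = 0.0    # accumulates seasonal crop water use

for et0, win, fc in zip(et_ref, water_in, f_c):
    kc = 0.1 + 1.1 * fc   # crude crop coefficient from cover (assumed mapping)
    etc = kc * et0        # crop evapotranspiration for the day (mm)
    season_et += etc
    # update root-zone depletion, clamped between 0 and total available water
    depletion = min(max(depletion + etc - win, 0.0), taw)
```

Summing the daily crop ET over the season gives the water-use denominator of CWUE; dividing plot yield by it ranks varieties, which is the selection signal the abstract describes.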

Open Access Article: Quantitative Identification of Maize Lodging-Causing Feature Factors Using Unmanned Aerial Vehicle Images and a Nomogram Computation
Remote Sens. 2018, 10(10), 1528; https://doi.org/10.3390/rs10101528
Received: 26 August 2018 / Revised: 19 September 2018 / Accepted: 20 September 2018 / Published: 23 September 2018
Cited by 1
Abstract
Maize (Zea mays L.) is one of the most important grain crops in China. Lodging is a natural disaster that can cause significant yield losses and threaten food security. Lodging identification and analysis contribute to evaluating disaster losses and cultivating lodging-resistant maize varieties. In this study, we collected visible and multispectral images with an unmanned aerial vehicle (UAV), and introduce a comprehensive methodology and workflow to extract lodging features from UAV imagery. We use statistical methods to screen several potential feature factors (e.g., texture, canopy structure, spectral characteristics, and terrain), and construct two nomograms (i.e., Model-1 and Model-2) with better validation performance based on the selected feature factors. Model-2 was superior to Model-1 in terms of its discrimination ability, but exhibited over-fitting when the predicted probability of lodging ranged from 0.2 to 0.4. The results show that the nomogram could not only predict the occurrence probability of lodging, but also explore the underlying association between maize lodging and the selected feature factors. Compared with spectral features, terrain features, texture features, canopy cover, and genetic background, canopy structural features were more conclusive in discriminating whether maize lodging occurs at the plot scale. Using nomogram analysis, we identified protective factors (i.e., normalized difference vegetation index, NDVI, and canopy elevation relief ratio, CRR) and risk factors (i.e., Hcv) related to maize lodging, and also found a problem of terrain spatial variability that is easily overlooked in lodging-resistant breeding trials. Full article

Open Access Article: Evaluating Late Blight Severity in Potato Crops Using Unmanned Aerial Vehicles and Machine Learning Algorithms
Remote Sens. 2018, 10(10), 1513; https://doi.org/10.3390/rs10101513
Received: 10 July 2018 / Revised: 12 September 2018 / Accepted: 17 September 2018 / Published: 21 September 2018
Cited by 1
Abstract
This work presents quantitative prediction of severity of the disease caused by Phytophthora infestans in potato crops using machine learning algorithms such as multilayer perceptron, deep learning convolutional neural networks, support vector regression, and random forests. The machine learning algorithms are trained using datasets extracted from multispectral data captured at the canopy level with an unmanned aerial vehicle, carrying an inexpensive digital camera. The results indicate that deep learning convolutional neural networks, random forests and multilayer perceptron using band differences can predict the level of Phytophthora infestans affectation on potato crops with acceptable accuracy. Full article

Open Access Article: Combining UAV-Based Vegetation Indices and Image Classification to Estimate Flower Number in Oilseed Rape
Remote Sens. 2018, 10(9), 1484; https://doi.org/10.3390/rs10091484
Received: 14 August 2018 / Revised: 8 September 2018 / Accepted: 14 September 2018 / Published: 17 September 2018
Abstract
Remote estimation of flower number in oilseed rape under different nitrogen (N) treatments is imperative in precision agriculture and field remote sensing, which can help to predict the yield of oilseed rape. In this study, an unmanned aerial vehicle (UAV) equipped with Red Green Blue (RGB) and multispectral cameras was used to acquire a series of field images at the flowering stage, and the flower number was manually counted as a reference. Images of the rape field were first classified using the K-means method based on the Commission Internationale de l’Éclairage (CIE) L*a*b* color space, and the result showed that the classified flower coverage area (FCA) possessed a high correlation with the flower number (r2 = 0.89). The relationships between ten commonly used vegetation indices (VIs) extracted from UAV-based RGB and multispectral images and the flower number were investigated, and the VIs of Normalized Green Red Difference Index (NGRDI), Red Green Ratio Index (RGRI) and Modified Green Red Vegetation Index (MGRVI) exhibited the highest correlation to the flower number, with an absolute correlation coefficient (r) of 0.91. A random forest (RF) model was developed to predict the flower number, and a good performance was achieved with all UAV variables (r2 = 0.93 and RMSEP = 16.18), while an optimal subset regression (OSR) model was further proposed to simplify the RF model, and a better result with r2 = 0.95 and RMSEP = 14.13 was obtained with the variable combination of RGRI, normalized difference spectral index (NDSI (944, 758)) and FCA. Our findings suggest that combining VIs and image classification from UAV-based RGB and multispectral images has the potential to estimate flower number in oilseed rape. Full article
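A minimal, self-contained version of the first step, clustering pixels in a CIELAB-like chromaticity space to isolate flower pixels, might look as follows. The cluster centers, noise level, and the tiny hand-rolled 2-means loop are illustrative assumptions; the study's images would first be converted from RGB to CIE L*a*b*:

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical per-pixel chromaticity features (a*, b*): yellow flowers cluster
# around (-10, 60), green canopy around (-30, 20)
flowers = rng.normal([-10.0, 60.0], 3.0, size=(200, 2))
canopy = rng.normal([-30.0, 20.0], 3.0, size=(800, 2))
pixels = np.vstack([flowers, canopy])

# minimal 2-means clustering with a deterministic spread-out initialization
centers = np.array([pixels.min(axis=0), pixels.max(axis=0)])
for _ in range(20):
    dists = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    centers = np.array([pixels[labels == k].mean(axis=0) for k in range(2)])

# the cluster with the higher mean b* (more yellow) is taken as flowers;
# its pixel count is the flower coverage area (FCA)
flower_cluster = int(centers[:, 1].argmax())
fca = int((labels == flower_cluster).sum())
```

Counting the flower-cluster pixels per image yields the FCA values that the abstract correlates with manually counted flower numbers.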

Open Access Article: Three-Dimensional Reconstruction of Soybean Canopies Using Multisource Imaging for Phenotyping Analysis
Remote Sens. 2018, 10(8), 1206; https://doi.org/10.3390/rs10081206
Received: 11 June 2018 / Revised: 29 July 2018 / Accepted: 30 July 2018 / Published: 1 August 2018
Cited by 2
Abstract
Geometric three-dimensional (3D) reconstruction has emerged as a powerful tool for plant phenotyping and plant breeding. Although laser scanning is one of the most intensely used sensing techniques for 3D reconstruction projects, it still has many limitations, such as the high investment cost. To overcome such limitations, in the present study, a low-cost, novel, and efficient imaging system consisting of a red-green-blue (RGB) camera and a photonic mixer detector (PMD) was developed, and its usability for plant phenotyping was demonstrated via a 3D reconstruction of a soybean plant that contains color information. To reconstruct soybean canopies, a density-based spatial clustering of applications with noise (DBSCAN) algorithm was used to extract canopy information from the raw 3D point cloud. Principal component analysis (PCA) and iterative closest point (ICP) algorithms were then used to register the multisource images for the 3D reconstruction of a soybean plant from both the side and top views. We then assessed phenotypic traits such as plant height and the greenness index based on the deviations of test samples. The results showed that compared with manual measurements, the side view-based assessments yielded a determination coefficient (R2) of 0.9890 for the estimation of soybean height and an R2 of 0.6059 for the estimation of soybean canopy greenness index; the top view-based assessment yielded an R2 of 0.9936 for the estimation of soybean height and an R2 of 0.8864 for the estimation of soybean canopy greenness. Together, the results indicated that an assembled 3D imaging device applying the algorithms developed in this study could be used as a reliable and robust platform for plant phenotyping, and potentially for automated and high-throughput applications under both natural light and indoor conditions. Full article

Open Access Article: A Comparison of Crop Parameters Estimation Using Images from UAV-Mounted Snapshot Hyperspectral Sensor and High-Definition Digital Camera
Remote Sens. 2018, 10(7), 1138; https://doi.org/10.3390/rs10071138
Received: 30 May 2018 / Revised: 9 July 2018 / Accepted: 9 July 2018 / Published: 18 July 2018
Cited by 2
Abstract
Timely and accurate estimates of crop parameters are crucial for agriculture management. Unmanned aerial vehicles (UAVs) carrying sophisticated cameras are very pertinent for this work because they can obtain remote-sensing images with higher temporal, spatial, and ground resolution than satellites. In this study, we evaluated (i) the performance of crop parameters estimates using a near-surface spectroscopy (350~2500 nm, 3 nm at 700 nm, 8.5 nm at 1400 nm, 6.5 nm at 2100 nm), a UAV-mounted snapshot hyperspectral sensor (450~950 nm, 8 nm at 532 nm) and a high-definition digital camera (Visible, R, G, B); (ii) the crop surface models (CSMs), RGB-based vegetation indices (VIs), hyperspectral-based VIs, and methods combined therefrom to make multi-temporal estimates of crop parameters and to map the parameters. The estimated leaf area index (LAI) and above-ground biomass (AGB) are obtained by using linear and exponential equations, random forest (RF) regression, and partial least squares regression (PLSR) to combine the UAV based spectral VIs and crop heights (from the CSMs). The results show that: (i) spectral VIs correlate strongly with LAI and AGB over single growing stages when crop height correlates positively with AGB over multiple growth stages; (ii) the correlation between the VIs multiplying crop height and AGB is greater than that between a single VI and crop height; (iii) the AGB estimate from the UAV-mounted snapshot hyperspectral sensor and high-definition digital camera is similar to the results from the ground spectrometer when using the combined methods (i.e., using VIs multiplying crop height, RF and PLSR to combine VIs and crop heights); and (iv) the spectral performance of the sensors is crucial in LAI estimates (the wheat LAI cannot be accurately estimated over multiple growing stages when using only crop height). 
The LAI estimates rank, from best to worst: ground spectrometer, UAV snapshot hyperspectral sensor, and UAV high-definition digital camera. Full article
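As a hedged illustration of the combined approach described above, the sketch below regresses above-ground biomass against a single predictor formed by multiplying a vegetation index by crop height. All plot values are synthetic stand-ins, and plain least squares replaces the study's RF/PLSR machinery.

```python
# Sketch of the "VI multiplied by crop height" idea: regress AGB on the
# product NDVI * height with ordinary least squares.
# Data below are hypothetical plot-level observations, not the study's.

def ols_fit(x, y):
    """Return (slope, intercept) of a simple least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

ndvi   = [0.45, 0.60, 0.72, 0.80, 0.85]   # hypothetical vegetation index
height = [0.20, 0.45, 0.70, 0.90, 1.05]   # hypothetical crop height, m
agb    = [1.1, 3.0, 5.6, 7.9, 9.8]        # hypothetical AGB, t/ha

x = [v * h for v, h in zip(ndvi, height)]  # combined predictor: VI * height
slope, intercept = ols_fit(x, agb)
agb_hat = [slope * xi + intercept for xi in x]
```

With these synthetic values the product predictor tracks AGB almost linearly, which is the behaviour the abstract reports for the combined method.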
Open Access Article Quantitative Estimation of Wheat Phenotyping Traits Using Ground and Aerial Imagery
Remote Sens. 2018, 10(6), 950; https://doi.org/10.3390/rs10060950
Received: 16 April 2018 / Revised: 28 May 2018 / Accepted: 12 June 2018 / Published: 14 June 2018
Cited by 1
Abstract
This study evaluates an aerial and ground imaging platform for assessment of canopy development in a wheat field. The dependence of two canopy traits, height and vigour, on fertilizer treatment was observed in a field trial comprising ten varieties of spring wheat. A custom-built mobile ground platform (MGP) and an unmanned aerial vehicle (UAV) were deployed at the experimental site for standard red, green and blue (RGB) image collection on five occasions. Meanwhile, reference field measurements of canopy height and vigour were manually recorded during the growing season. Canopy-level estimates of height and vigour for each variety and treatment were computed by image analysis. The agreement between estimates from each platform and reference measurements was statistically analysed. Estimates of canopy height derived from MGP imagery were more accurate (RMSE = 3.95 cm, R2 = 0.94) than estimates derived from UAV imagery (RMSE = 6.64 cm, R2 = 0.85). In contrast, vigour was better estimated using the UAV imagery (RMSE = 0.057, R2 = 0.57) than using MGP imagery (RMSE = 0.063, R2 = 0.42), albeit with a significant fixed and proportional bias. The ability of the platforms to capture differential development of traits as a function of fertilizer treatment was also investigated. Both imaging methodologies observed a higher median canopy height in treated plots than in untreated plots throughout the season, and a greater median vigour in treated plots than in untreated plots during the early growth stages. While UAV imaging provides a high-throughput method for canopy-level trait determination, MGP imaging captures subtle canopy structures that are potentially useful for fine-grained analyses of plants. Full article
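The platform-versus-reference agreement statistics reported above (RMSE and R2) can be reproduced with two short helpers; the canopy-height values below are invented for illustration, not the trial's data.

```python
import math

def rmse(pred, obs):
    """Root-mean-square error between predictions and observations."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def r_squared(pred, obs):
    """Coefficient of determination of predictions against observations."""
    mo = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for p, o in zip(pred, obs))
    ss_tot = sum((o - mo) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

ref = [32.0, 45.0, 58.0, 70.0, 83.0]   # hypothetical manual heights, cm
mgp = [30.5, 44.0, 60.0, 68.5, 85.0]   # hypothetical image-derived heights, cm
```

Calling `rmse(mgp, ref)` and `r_squared(mgp, ref)` gives the agreement measures in the same units and form as quoted in the abstract.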
Open Access Article Using Multi-Spectral UAV Imagery to Extract Tree Crop Structural Properties and Assess Pruning Effects
Remote Sens. 2018, 10(6), 854; https://doi.org/10.3390/rs10060854
Received: 15 April 2018 / Revised: 24 May 2018 / Accepted: 25 May 2018 / Published: 1 June 2018
Cited by 3
Abstract
Unmanned aerial vehicles (UAVs) provide an unprecedented capacity to monitor the development and dynamics of tree growth and structure through time. It is generally thought that pruning tree crops encourages new growth, has a positive effect on fruiting, makes fruit-picking easier, and may increase yield, as it increases light interception and tree crown surface area. To establish the response of lychee trees to pruning, an assessment of changes in tree structure, i.e., tree crown perimeter, width, height, area and Plant Projective Cover (PPC), was undertaken using multi-spectral UAV imagery collected before and after a pruning event. While tree crown perimeter, width and area could be derived directly from the delineated tree crowns, height was estimated from a derived canopy height model, and PPC was most accurately predicted from the NIR band. Pre- and post-pruning results showed significant differences in all measured tree structural parameters, including an average decrease in tree crown perimeter of 1.94 m, tree crown width of 0.57 m, tree crown height of 0.62 m, tree crown area of 3.5 m2, and PPC of 14.8%. To provide guidance on data collection protocols for orchard management, the impact of flying-height variations was also examined, offering some insight into the influence of scale and the scalability of this UAV-based approach for larger orchards. The different flying heights (i.e., 30, 50 and 70 m) produced similar measurements of tree crown width and PPC, while tree crown perimeter, area and height measurements decreased with increasing flying height. Overall, these results illustrate that routine collection of multi-spectral UAV imagery can provide a means of assessing pruning effects on changes in tree structure in commercial orchards, and they highlight the importance of collecting imagery with consistent flight configurations, as varying flying heights may change tree structural measurements. Full article
Open Access Article Time-Series Multispectral Indices from Unmanned Aerial Vehicle Imagery Reveal Senescence Rate in Bread Wheat
Remote Sens. 2018, 10(6), 809; https://doi.org/10.3390/rs10060809
Received: 11 April 2018 / Revised: 19 May 2018 / Accepted: 21 May 2018 / Published: 23 May 2018
Cited by 3
Abstract
Detecting the dynamics of senescence in crop breeding is time consuming and requires considerable detail regarding the rate of progression and intensity of senescence. The normalized difference red-edge index (NDREI), along with four other spectral vegetative indices (SVIs) derived from unmanned aerial vehicle (UAV) based spatial imagery, was evaluated for rapid and accurate prediction of senescence. For this, 32 selected winter wheat genotypes were planted under full and limited irrigation treatments. Significant variations in all five SVIs: green normalized difference vegetation index (GNDVI), simple ratio (SR), green chlorophyll index (GCI), red-edge chlorophyll index (RECI), and normalized difference red-edge index (NDREI), among genotypes and between treatments were observed from heading to late grain-filling stages. The SVIs showed strong relationships (R2 = 0.69 to 0.78) with handheld measurements of chlorophyll and leaf area index (LAI), while correlating negatively (R2 = 0.75 to 0.77) with canopy temperature (CT) across the treatments. NDREI, as a new SVI, showed correlations with ground data under both treatments as high as those exhibited by the other four SVIs. There were medium to strong correlations (r = 0.23–0.63) among SVIs, thousand grain weight (TGW) and grain yield (GY) under both treatments. Senescence rate was calculated from the decrease of SVI values from their peaks at the heading stage, and variance in senescence rate among genotypes and between treatments could be explained by SVI variations. Under limited irrigation, a 10% to 15% higher senescence rate was detected compared with full irrigation. Principal component analysis corroborated the negative association of high senescence rate with TGW and GY. Some genotypes, such as Beijing 0045, Nongda 5181, and Zhongmai 175, were selected for low senescence rate and stable TGW and GY in both full and limited irrigation treatments, nearly in accordance with the actual performance of these cultivars in the field. 
Thus, SVIs derived from UAV imagery appear to be a promising tool for rapid and precise estimation of senescence rate at maturation stages. Full article
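A minimal sketch of how a senescence rate can be derived from an SVI time series: take the fractional decline of NDREI from its peak near heading. The reflectance values are hypothetical, and the one-number rate is a simplified stand-in for the paper's procedure.

```python
def ndrei(nir, red_edge):
    """Normalized difference red-edge index."""
    return (nir - red_edge) / (nir + red_edge)

# Hypothetical plot-mean reflectances (NIR, red-edge) at four flights,
# from heading to late grain filling.
series = [
    ndrei(0.50, 0.20),   # heading (peak greenness)
    ndrei(0.48, 0.22),
    ndrei(0.45, 0.27),
    ndrei(0.40, 0.30),   # late grain filling
]

peak = max(series)                              # peak SVI near heading
senescence_rate = (peak - series[-1]) / peak    # fractional decline from peak
```

A genotype whose SVI falls further below its heading-stage peak gets a larger rate, matching the abstract's description of senescence as decline from the peak value.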
Open Access Article Estimation of Vegetable Crop Parameter by Multi-temporal UAV-Borne Images
Remote Sens. 2018, 10(5), 805; https://doi.org/10.3390/rs10050805
Received: 18 April 2018 / Revised: 17 May 2018 / Accepted: 17 May 2018 / Published: 22 May 2018
Cited by 4
Abstract
3D point cloud analysis of imagery collected by unmanned aerial vehicles (UAVs) has been shown to be a valuable tool for estimating crop phenotypic traits, such as plant height, in several species. Spatial information about these phenotypic traits can be used to derive information about other important crop characteristics, like fresh biomass yield, which cannot be derived directly from the point clouds. Previous approaches have often considered only single-date measurements, using a single point-cloud-derived metric for the respective trait. Furthermore, most studies have focused on plant species with a homogeneous canopy surface. The aim of this study was to assess the applicability of UAV imagery for capturing crop height information of three vegetables (eggplant, tomato, and cabbage) with a complex canopy surface during a complete crop growth cycle, in order to infer biomass. Additionally, the effect of crop development stage on the relationship between estimated crop height and field-measured crop height was examined. Our study was conducted in an experimental layout at the University of Agricultural Science in Bengaluru, India. For all crops, crop height and biomass were measured at five dates during one crop growth cycle between February and May 2017 (average crop height was 42.5, 35.5, and 16.0 cm for eggplant, tomato, and cabbage). Using a structure-from-motion approach, a 3D point cloud was created for each crop and sampling date. In total, 14 crop height metrics were extracted from the point clouds. Machine learning methods were used to create prediction models for vegetable crop height. The study demonstrates that monitoring crop height using a UAV during an entire growing period results in detailed and precise estimates of crop height and biomass for all three crops (R2 ranging from 0.87 to 0.97, bias ranging from −0.66 to 0.45 cm). 
The effect of crop development stage on the predicted crop height was found to be substantial (e.g., median deviation increased from 1% to 20% for eggplant), influencing the strength and consistency of the relationship between point cloud metrics and crop height estimates, and thus should be further investigated. Altogether, the results of the study demonstrate that point clouds generated from UAV-based RGB imagery can be used to effectively measure vegetable crop biomass over larger areas (relative error = 17.6%, 19.7%, and 15.2% for eggplant, tomato, and cabbage, respectively) with an accuracy similar to that of biomass prediction models based on measured crop height (relative error = 21.6%, 18.8%, and 15.2% for eggplant, tomato, and cabbage). Full article
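The point-cloud height metrics mentioned above can be illustrated with a toy example: given the z-coordinates of a plot's 3D points, compute a few summary statistics. The metric set and values here are illustrative assumptions, not the study's 14 metrics.

```python
# Sketch: plot-level height metrics from the z-coordinates of a 3D point
# cloud. A nearest-rank percentile keeps the example dependency-free.

def height_metrics(z_values):
    """Return a small dictionary of height summary metrics."""
    z = sorted(z_values)
    n = len(z)

    def pct(p):  # simple nearest-rank percentile
        return z[min(n - 1, int(p / 100 * n))]

    return {
        "max": z[-1],
        "mean": sum(z) / n,
        "p50": pct(50),
        "p90": pct(90),
    }

# Hypothetical z-coordinates (m above ground) for one sampling plot.
cloud_z = [0.02, 0.05, 0.10, 0.22, 0.30, 0.33, 0.35, 0.38, 0.40, 0.42]
m = height_metrics(cloud_z)
```

In a pipeline like the one described, such per-plot metrics would become the feature vector fed to the machine-learning biomass model.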
Open Access Article Aerial and Ground Based Sensing of Tolerance to Beet Cyst Nematode in Sugar Beet
Remote Sens. 2018, 10(5), 787; https://doi.org/10.3390/rs10050787
Received: 1 April 2018 / Revised: 18 May 2018 / Accepted: 18 May 2018 / Published: 19 May 2018
Cited by 2
Abstract
The rapid development of image-based phenotyping methods based on ground-operating devices or unmanned aerial vehicles (UAVs) has increased our ability to evaluate traits of interest for crop breeding in the field. A field site infested with beet cyst nematode (BCN) and planted with four nematode-susceptible cultivars and five tolerant cultivars was investigated at different times during the growing season. We compared the ability of spectral, hyperspectral, canopy height, and temperature information derived from handheld and UAV-borne sensors to discriminate susceptible and tolerant cultivars and to predict the final sugar beet yield. Spectral indices (SIs) related to chlorophyll, nitrogen or water made it possible to differentiate nematode-susceptible and nematode-tolerant cultivars (cultivar type) from the same genetic background (breeder). Discrimination between the cultivar types was easier at advanced stages, when the nematode pressure was stronger and the plants and canopies further developed. Canopy height (CH) also differentiated cultivar type, and was obtained much more efficiently from the UAV than by manual field assessment. Canopy temperatures likewise allowed cultivars to be ranked according to their nematode tolerance level. Combinations of SIs in multivariate analysis and decision trees improved differentiation of cultivar type and classification of genetic background. SIs and canopy temperature thereby proved to be suitable proxies for sugar yield prediction. The spectral information derived from the handheld and the UAV-borne sensor did not match perfectly, but both analysis procedures allowed discrimination between susceptible and tolerant cultivars. This was possible due to successful detection of traits related to BCN tolerance, such as chlorophyll, nitrogen and water content, which were reduced in cultivars with low tolerance to BCN. 
The high correlation between SIs and final sugar beet yield makes the UAV hyperspectral imaging approach well suited to improving farming practice via maps of yield potential or diseases. Moreover, the study shows the high potential of multi-sensor and parameter combinations for plant phenotyping purposes, in particular for data from UAV-borne sensors that allow standardized and automated high-throughput data extraction procedures. Full article
Open Access Article Phenotyping Conservation Agriculture Management Effects on Ground and Aerial Remote Sensing Assessments of Maize Hybrids Performance in Zimbabwe
Remote Sens. 2018, 10(2), 349; https://doi.org/10.3390/rs10020349
Received: 27 December 2017 / Revised: 8 February 2018 / Accepted: 14 February 2018 / Published: 24 February 2018
Cited by 1
Abstract
In the coming decades, Sub-Saharan Africa (SSA) faces challenges to sustainably increase food production while keeping pace with continued population growth. Conservation agriculture (CA) has been proposed to enhance soil health and productivity in response to this situation. Maize is the main staple food in SSA. To increase maize yields, the selection of suitable genotypes and management practices for CA conditions has been explored using remote sensing tools, which may play a fundamental role in overcoming the traditional limitations of data collection and processing in large-scale phenotyping studies. We present the results of a study in which Red-Green-Blue (RGB) and multispectral indexes were evaluated for assessing maize performance under conventional ploughing (CP) and CA practices. Eight hybrids under different planting densities and tillage practices were tested. The measurements were conducted on seedlings at ground level (0.8 m) and from an unmanned aerial vehicle (UAV) platform (30 m); the resulting platform-proximity effect on image resolution did not have any negative impact on the performance of the indexes. Most of the calculated indexes (Green Area (GA) and Normalized Difference Vegetation Index (NDVI)) were significantly affected by tillage conditions, with values increasing from CP to CA. Indexes derived from the RGB images and related to canopy greenness performed better at assessing yield differences, potentially due to the greater resolution of the RGB compared with the multispectral data, although this performance was more precise for CP than for CA. The correlations of the multispectral indexes with yield were improved by applying a soil mask derived from an NDVI threshold, so that only vegetation pixels were retained. The results of this study highlight the applicability of remote sensing approaches based on RGB images to the assessment of crop performance and hybrid choice. Full article
Open Access Article High-Throughput Phenotyping of Canopy Cover and Senescence in Maize Field Trials Using Aerial Digital Canopy Imaging
Remote Sens. 2018, 10(2), 330; https://doi.org/10.3390/rs10020330
Received: 22 November 2017 / Revised: 8 February 2018 / Accepted: 8 February 2018 / Published: 23 February 2018
Cited by 4
Abstract
In the crop breeding process, data collection methods that allow reliable assessment of crop adaptation traits faster and more cheaply than those currently in use can significantly improve resource use efficiency by reducing selection costs, and can contribute to increased genetic gain through improved selection efficiency. Current methods of estimating crop growth (ground canopy cover) and leaf senescence are essentially manual and/or based on visual scoring, and are therefore often subjective, time consuming, and expensive. Aerial sensing technologies offer radically new perspectives for assessing these traits at lower cost, faster, and more objectively. We report the use of an unmanned aerial vehicle (UAV) equipped with an RGB camera for crop cover and canopy senescence assessment in maize field trials. Aerial-imaging-derived data showed moderately high heritability for both traits, with a significant genetic correlation with grain yield. In addition, in some cases, the correlation between the visual assessment (prone to subjectivity) of crop senescence and the senescence index calculated from aerial imaging data was significant. We conclude that UAV-based aerial sensing platforms have great potential for monitoring the dynamics of crop canopy characteristics, such as crop vigour through ground canopy cover and canopy senescence, in breeding trial plots. This is anticipated to improve selection efficiency through higher accuracy and precision, as well as reduced time and cost of data collection. Full article
Open Access Article Estimating Barley Biomass with Crop Surface Models from Oblique RGB Imagery
Remote Sens. 2018, 10(2), 268; https://doi.org/10.3390/rs10020268
Received: 20 November 2017 / Revised: 15 January 2018 / Accepted: 7 February 2018 / Published: 9 February 2018
Cited by 3
Abstract
Non-destructive monitoring of crop development is of key interest for agronomy and crop breeding. Crop Surface Models (CSMs), representing the absolute height of the plant canopy, are a tool for this. In this study, fresh and dry barley biomass per plot are estimated from CSM-derived plot-wise plant heights. The CSMs are generated in a semi-automated manner using Structure-from-Motion (SfM)/Multi-View-Stereo (MVS) software from oblique stereo RGB images. The images were acquired automatically by consumer-grade smart cameras mounted at an elevated position on a lifting hoist. Fresh and dry biomass were measured destructively at four dates each in 2014 and 2015. We used exponential and simple linear regression based on different calibration/validation splits. Coefficients of determination R2 between 0.55 and 0.79 and root mean square errors (RMSE) between 97 and 234 g/m2 are reached for the validation of predicted vs. observed dry biomass, while Willmott’s refined index of model performance dr ranges between 0.59 and 0.77. For fresh biomass, R2 values between 0.34 and 0.61 are reached, with RMSEs between 312 and 785 g/m2 and dr between 0.39 and 0.66. We therefore established the possibility of using this novel low-cost system to estimate barley dry biomass over time. Full article
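Willmott's refined index of agreement dr, quoted above alongside R2 and RMSE, is straightforward to compute; the sketch below follows the refined (2012) definition, with made-up biomass values standing in for the study's data.

```python
def willmott_dr(pred, obs):
    """Willmott's refined index of agreement d_r.

    d_r = 1 - SAE / (2 * SDEV)        if SAE <= 2 * SDEV
    d_r = (2 * SDEV) / SAE - 1        otherwise
    where SAE is the sum of absolute errors and SDEV the sum of absolute
    observation deviations from the observed mean.
    """
    o_mean = sum(obs) / len(obs)
    sae = sum(abs(p - o) for p, o in zip(pred, obs))
    sdev2 = 2 * sum(abs(o - o_mean) for o in obs)
    if sae <= sdev2:
        return 1 - sae / sdev2
    return sdev2 / sae - 1

# Hypothetical predicted vs. observed dry biomass, g/m2 (not the study's data).
obs  = [210.0, 340.0, 480.0, 150.0]
pred = [230.0, 320.0, 450.0, 190.0]
dr = willmott_dr(pred, obs)
```

The index is bounded on (-1, 1], with 1 indicating perfect agreement, which makes the 0.59 to 0.77 range reported above easy to interpret.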
Open Access Article Recognition of Wheat Spike from Field Based Phenotype Platform Using Multi-Sensor Fusion and Improved Maximum Entropy Segmentation Algorithms
Remote Sens. 2018, 10(2), 246; https://doi.org/10.3390/rs10020246
Received: 20 November 2017 / Revised: 31 January 2018 / Accepted: 2 February 2018 / Published: 6 February 2018
Cited by 2
Abstract
To obtain an accurate count of wheat spikes, which is crucial for estimating yield, this paper proposes a new computer-vision algorithm that achieves this goal from an image. First, a home-built semi-autonomous multi-sensor field-based phenotype platform (FPP) is used to obtain orthographic images of wheat plots at the filling stage. The data acquisition system of the FPP provides high-definition RGB images and multispectral images of the corresponding quadrats. High-definition panchromatic images are then obtained by fusing the three RGB channels. The Gram–Schmidt fusion algorithm is then used to fuse these multispectral and panchromatic images, thereby making the targets easier to identify by color. Next, the maximum entropy segmentation method is used to perform coarse segmentation. The threshold of this method is determined by a firefly algorithm based on chaos theory (FACT), and a morphological filter is then used to de-noise the coarse-segmentation results. Finally, morphological reconstruction theory is applied to segment the adhesive parts of the de-noised image and realize fine segmentation. The computer-generated counts for the wheat plots, obtained using the independent regional statistical function in Matlab R2017b, are then compared with field measurements; the comparison indicates that the proposed method counts wheat spikes more accurately than the other fusion and segmentation methods considered in this paper. Full article
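The maximum-entropy thresholding step can be sketched as follows: a Kapur-style criterion picks the gray level that maximises the summed entropies of the two classes it induces. The 8-bin histogram is a toy invention, and the paper's firefly optimisation and morphological post-processing are omitted.

```python
import math

def max_entropy_threshold(hist):
    """Kapur-style maximum-entropy threshold over a grayscale histogram.

    Returns the bin t that maximises H(background [0..t]) + H(foreground
    [t+1..end]), each class entropy computed on its renormalised histogram.
    """
    total = sum(hist)
    p = [h / total for h in hist]
    best_t, best_h = 0, float("-inf")
    for t in range(len(hist) - 1):
        w0 = sum(p[: t + 1])          # background probability mass
        w1 = 1.0 - w0                 # foreground probability mass
        if w0 <= 0 or w1 <= 0:
            continue
        h0 = -sum(pi / w0 * math.log(pi / w0) for pi in p[: t + 1] if pi > 0)
        h1 = -sum(pi / w1 * math.log(pi / w1) for pi in p[t + 1:] if pi > 0)
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t

# Bimodal toy histogram: soil/background peak near bin 1,
# wheat-spike peak near bin 6.
hist = [5, 40, 10, 1, 1, 12, 50, 8]
t = max_entropy_threshold(hist)
```

Exhaustive search over thresholds is fine for an 8-bin example; the paper's FACT optimiser addresses the same argmax for full 256-level histograms.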
Open Access Article A Comparison of Regression Techniques for Estimation of Above-Ground Winter Wheat Biomass Using Near-Surface Spectroscopy
Remote Sens. 2018, 10(1), 66; https://doi.org/10.3390/rs10010066
Received: 16 November 2017 / Revised: 25 December 2017 / Accepted: 2 January 2018 / Published: 5 January 2018
Cited by 8
Abstract
Above-ground biomass (AGB) provides a vital link between solar energy consumption and yield, so its correct estimation is crucial to accurately monitor crop growth and predict yield. In this work, we estimate AGB by using 54 vegetation indexes (e.g., Normalized Difference Vegetation Index, Soil-Adjusted Vegetation Index) and eight statistical regression techniques: artificial neural network (ANN), multivariable linear regression (MLR), decision-tree regression (DT), boosted binary regression tree (BBRT), partial least squares regression (PLSR), random forest regression (RF), support vector machine regression (SVM), and principal component regression (PCR), applied to hyperspectral data acquired with a field spectrophotometer. The vegetation indexes (VIs) determined from the spectra were first used to train the regression techniques for modeling and validation, to select the best VI input, and then summed with white Gaussian noise to study how remote-sensing errors affect the regression techniques. Next, the VIs were divided into groups of different sizes by using various sampling methods for modeling and validation, to test the stability of the techniques. Finally, AGB was estimated by using leave-one-out cross-validation with these techniques. The results of the study demonstrate that, of the eight techniques investigated, PLSR and MLR perform best in terms of stability and are most suitable when high-accuracy, stable estimates are required from relatively few samples. In addition, RF is extremely robust against noise and is best suited to repeated observations involving remote-sensing data (i.e., data affected by atmosphere, clouds, observation times, and/or sensor noise). 
Finally, the leave-one-out cross-validation method indicates that PLSR provides the highest accuracy (R2 = 0.89, RMSE = 1.20 t/ha, MAE = 0.90 t/ha, NRMSE = 0.07, CV (RMSE) = 0.18); thus, PLSR is best suited for work requiring high-accuracy estimation models. The results indicate that all of these techniques provide impressive accuracy. The comparison and analysis provided herein thus reveal the advantages and disadvantages of the ANN, MLR, DT, BBRT, PLSR, RF, SVM, and PCR techniques and can help researchers to build efficient AGB-estimation models. Full article
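The leave-one-out cross-validation scheme used above can be sketched with a univariate linear model standing in for PLSR (a deliberate simplification); the VI and AGB values are synthetic.

```python
# Leave-one-out cross-validation (LOOCV): fit on all samples but one,
# predict the held-out sample, and aggregate the errors into an RMSE.

def loocv_rmse(x, y):
    """LOOCV RMSE of a simple least-squares line fitted to (x, y)."""
    sq_errs = []
    for i in range(len(x)):
        xt = [v for j, v in enumerate(x) if j != i]   # training predictors
        yt = [v for j, v in enumerate(y) if j != i]   # training responses
        mx, my = sum(xt) / len(xt), sum(yt) / len(yt)
        a = (sum((xi - mx) * (yi - my) for xi, yi in zip(xt, yt))
             / sum((xi - mx) ** 2 for xi in xt))
        b = my - a * mx
        sq_errs.append((a * x[i] + b - y[i]) ** 2)    # held-out error
    return (sum(sq_errs) / len(sq_errs)) ** 0.5

vi  = [0.2, 0.35, 0.5, 0.65, 0.8]   # hypothetical vegetation index values
agb = [2.1, 3.9, 6.2, 8.0, 10.1]    # hypothetical AGB, t/ha
err = loocv_rmse(vi, agb)
```

Because every sample is held out exactly once, LOOCV gives an almost unbiased error estimate from a small dataset, which is why the paper uses it for the final comparison.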
Open Access Article High Throughput Phenotyping of Blueberry Bush Morphological Traits Using Unmanned Aerial Systems
Remote Sens. 2017, 9(12), 1250; https://doi.org/10.3390/rs9121250
Received: 6 October 2017 / Revised: 20 November 2017 / Accepted: 23 November 2017 / Published: 2 December 2017
Cited by 2
Abstract
Phenotyping morphological traits of blueberry bushes in the field is important for selecting genotypes that are easily harvested by mechanical harvesters. Morphological data can also be used to assess the effects of crop treatments such as plant growth regulators, fertilizers, and environmental conditions. This paper investigates the feasibility and accuracy of an inexpensive unmanned aerial system for determining the morphological characteristics of blueberry bushes. Color images collected by a quadcopter are processed into three-dimensional point clouds via structure-from-motion algorithms. Bush height, extents, canopy area, and volume, in addition to crown diameter and width, are derived and referenced to ground truth. In an experimental farm, twenty-five bushes were imaged by a quadcopter. Height and width dimensions achieved a mean absolute error of 9.85 cm before, and 5.82 cm after, correction for systematic under-estimation. A strong correlation was found between manual and image-derived bush volumes and their traditional growth indices. Hedgerows of three Southern Highbush varieties were imaged at a commercial farm to extract five morphological features (base angle, blockiness, crown percent height, crown ratio, and vegetation ratio) associated with cultivation and machine harvestability. The bushes were found to be partially separable by multivariate analysis. The methodology developed in this study is not only valuable for plant breeders screening genotypes with bush morphological traits suitable for machine harvest, but can also aid producers in crop management tasks such as pruning and plot layout organization. Full article
Open Access Article Evaluation of Seed Emergence Uniformity of Mechanically Sown Wheat with UAV RGB Imagery
Remote Sens. 2017, 9(12), 1241; https://doi.org/10.3390/rs9121241
Received: 29 September 2017 / Revised: 27 November 2017 / Accepted: 28 November 2017 / Published: 1 December 2017
Cited by 3
Abstract
The uniformity of wheat seed emergence is an important characteristic used to evaluate cultivars, cultivation mode and field management. Currently, researchers typically investigate the uniformity of seed emergence by manual measurement, a time-consuming and laborious process. This study employed field RGB images from unmanned aerial vehicles (UAVs) to obtain information on the uniformity of wheat seed emergence and on missing seedlings. The lengths of areas with missing seedlings in both drill and broadcast sowing were calculated with an area-localization algorithm, which facilitated a comprehensive evaluation of the uniformity of seed emergence. In a comparison between UAV images and the results of manual surveys of the uniformity of seed emergence, the root-mean-square error (RMSE) was 0.44 for broadcast sowing and 0.64 for drill sowing. The RMSEs of the numbers of missing-seedling regions for broadcast and drill sowing were 1.39 and 3.99, respectively. The RMSEs of the extents of the missing-seedling regions were 12.39 cm for drill sowing and 0.20 cm2 for broadcast sowing. The UAV image-based approach provides a new and greatly improved method for efficiently measuring the uniformity of wheat seed emergence, and could serve as a guideline for its intelligent evaluation. Full article
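The area-localization idea can be illustrated in one dimension: scan along a drill row for runs of missing seedlings and report every gap above a minimum length. The step size, threshold, and emergence flags below are assumptions for illustration, not the paper's algorithm.

```python
# 1-D sketch of missing-seedling localization along a drill row sampled at
# a fixed ground step. A run of 0-flags is a candidate gap; only gaps at
# least min_len_cm long are reported as (start_cm, length_cm).

def missing_regions(emerged, step_cm=2.0, min_len_cm=10.0):
    """emerged: list of 0/1 emergence flags along the row."""
    regions, run_start = [], None
    for i, flag in enumerate(emerged + [1]):   # sentinel closes a final run
        if flag == 0 and run_start is None:
            run_start = i                      # gap begins
        elif flag == 1 and run_start is not None:
            length = (i - run_start) * step_cm
            if length >= min_len_cm:
                regions.append((run_start * step_cm, length))
            run_start = None                   # gap ends
    return regions

# Hypothetical emergence flags every 2 cm along one row.
row = [1, 1, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1]
gaps = missing_regions(row)
```

Here the 4 cm gap in the middle falls below the threshold and is ignored, while the two longer runs are reported with their start positions and lengths.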
Other


Open Access Technical Note Measuring Canopy Structure and Condition Using Multi-Spectral UAS Imagery in a Horticultural Environment
Remote Sens. 2019, 11(3), 269; https://doi.org/10.3390/rs11030269
Received: 20 December 2018 / Revised: 19 January 2019 / Accepted: 28 January 2019 / Published: 30 January 2019
Abstract
Tree condition, pruning, and orchard management practices within intensive horticultural tree crop systems can be determined via measurements of tree structure. Multi-spectral imagery acquired from an unmanned aerial system (UAS) has been demonstrated to be an accurate and efficient means of measuring various tree structural attributes, but research in complex horticultural environments has been limited. This research established a methodology for accurately estimating the tree crown height, extent, plant projective cover (PPC), and condition of avocado tree crops from a UAS platform. Individual tree crowns were delineated using object-based image analysis. In comparison to field-measured canopy heights, an image-derived canopy height model provided a coefficient of determination (R²) of 0.65 and a relative root mean squared error of 6%. Tree crown length perpendicular to the hedgerow was also accurately mapped. PPC was measured using spectral and textural image information and produced an R² value of 0.62 against field data. A random forest classifier was applied to assign tree condition to one of four categories in accordance with industry standards, producing out-of-bag accuracies >96%. Our results demonstrate the potential of UAS-based mapping to provide information that supports the horticulture industry and facilitates orchard-based assessment and management. Full article
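The condition-classification step can be sketched with scikit-learn's `RandomForestClassifier`, whose `oob_score` option reports the out-of-bag accuracy the abstract quotes. The features and data below are invented stand-ins: the actual crown-level attributes used by the study are not enumerated in the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical per-crown features (e.g. height, PPC, texture metrics)
# for four well-separated condition classes, 50 crowns per class.
n_per_class = 50
X = np.vstack([rng.normal(loc=c * 5.0, scale=0.5, size=(n_per_class, 3))
               for c in range(4)])
y = np.repeat(np.arange(4), n_per_class)

# oob_score=True makes each tree's held-out (out-of-bag) samples serve
# as a built-in validation set, giving the accuracy metric quoted (>96%).
clf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
clf.fit(X, y)
print(f"out-of-bag accuracy: {clf.oob_score_:.2f}")
```

Out-of-bag scoring is a convenient design choice here because it avoids setting aside a separate validation split from a small per-orchard sample.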
Open Access Technical Note: Prediction of Chlorophyll Content in Different Light Areas of Apple Tree Canopies Based on the Color Characteristics of 3D Reconstruction
Remote Sens. 2018, 10(3), 429; https://doi.org/10.3390/rs10030429
Received: 12 January 2018 / Revised: 16 February 2018 / Accepted: 8 March 2018 / Published: 10 March 2018
Cited by 1 | PDF Full-text (6248 KB) | HTML Full-text | XML Full-text
Abstract
Improving the speed and accuracy of chlorophyll (Chl) content prediction in different light areas of apple trees is a central priority for understanding the growth response to light intensity and, in turn, increasing the primary production of apples. In vitro assessment by wet chemical extraction is the standard method for leaf chlorophyll determination, but this measurement is expensive, laborious, and time-consuming. Over the years, rapid and nondestructive alternatives have been explored, and many vegetation indices (VIs) have been developed to retrieve Chl content at the canopy level from meter- to decameter-scale reflectance observations; these have lower accuracy because of the confounding influence of canopy structure. Thus, the spatially continuous distribution of Chl content in different light areas within an apple tree canopy remains unresolved. The objective of this study was therefore to develop methods for estimating Chl content in areas of different light intensity using 3D models with color characteristics, acquired by a 3D laser scanner at centimeter spatial resolution. First, to characterize relative light intensity (RLI), canopies were scanned with a FARO Focus3D 120 laser scanner on a calm day without strong light and divided into 180 cubic units per canopy, based on distance information and following the division method actually used for tree canopies in three-dimensional space. Four RLI classes were defined: 0–30%, 30–60%, 60–85%, and 85–100%. Second, Chl content in the 180 cubic units of each apple tree was measured with a leaf chlorophyll meter (soil and plant analyzer development, SPAD). Color characteristics were then extracted from each cubic area of the 3D model and summarized by two color variables, which can serve as effective indicators of Chl content in field crops. Finally, because the relationship between color characteristics and Chl content in apple tree canopies is complex and fuzzy and cannot be expressed by an exact mathematical model, a three-layer artificial neural network (ANN) was constructed to predict Chl content in different light areas of apple tree canopies. The results indicated that the highest and lowest mean Chl contents occurred in the 60–85% and 0–30% RLI areas, respectively, and that there was no significant difference between adjacent RLI areas. Color characteristics also changed regularly as RLI increased within canopies. Moreover, the predicted Chl content was strongly correlated with actual SPAD measurements (R = 0.9755). In summary, the color characteristics of 3D apple tree canopies combined with ANN techniques could serve as a rapid method for predicting Chl content in different light areas of apple tree canopies. Full article
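A three-layer network of the kind described (input, one hidden layer, output) can be sketched in plain NumPy. The two color variables, network size, training data, and SPAD values below are illustrative assumptions, not the paper's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical inputs: two color variables per canopy cube; targets are
# synthetic SPAD readings, standardized for stable training.
X = rng.uniform(0.0, 1.0, size=(200, 2))
spad = 30.0 + 15.0 * X[:, 0] - 8.0 * X[:, 1] + rng.normal(0, 0.5, 200)
y = ((spad - spad.mean()) / spad.std()).reshape(-1, 1)

# Three-layer network: 2 inputs -> 8 tanh hidden units -> 1 linear output.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

initial_mse = float(np.mean((forward(X)[1] - y) ** 2))
lr = 0.1
for _ in range(2000):                   # plain batch gradient descent
    h, pred = forward(X)
    err = (pred - y) / len(X)           # gradient of 0.5*MSE w.r.t. pred
    dW2, db2 = h.T @ err, err.sum(0)
    dh = (err @ W2.T) * (1.0 - h ** 2)  # backpropagate through tanh
    dW1, db1 = X.T @ dh, dh.sum(0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1
final_mse = float(np.mean((forward(X)[1] - y) ** 2))
print(f"MSE: {initial_mse:.3f} -> {final_mse:.3f}")
```

In practice, predictions would be de-standardized back to SPAD units before comparison with the meter readings.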
Open Access Technical Note: Estimation of Wheat LAI at Middle to High Levels Using Unmanned Aerial Vehicle Narrowband Multispectral Imagery
Remote Sens. 2017, 9(12), 1304; https://doi.org/10.3390/rs9121304
Received: 29 September 2017 / Revised: 30 November 2017 / Accepted: 7 December 2017 / Published: 12 December 2017
Cited by 7 | PDF Full-text (4023 KB) | HTML Full-text | XML Full-text
Abstract
Leaf area index (LAI) is a significant biophysical variable in models of hydrology, climatology, and crop growth, and rapid monitoring of LAI is critical in modern precision agriculture. Remote sensing (RS) from satellite, aerial, and unmanned aerial vehicle (UAV) platforms has become a popular technique for monitoring crop LAI; among these, UAVs are highly attractive to researchers and agriculturists. However, some UAV vegetation index (VI)-derived LAI models have relatively low accuracy because of the limited number of multispectral bands, and especially because they tend to saturate at middle to high LAI levels, which are typical of high-yielding wheat crops in China. This study aims to estimate wheat LAI effectively with UAV narrowband multispectral imagery (400–800 nm spectral region, 10 cm resolution) under varying growth conditions during five critical growth stages, and to provide potential technical support for optimizing nitrogen fertilization. Results demonstrated that the newly developed LAI model based on the modified triangular vegetation index 2 (MTVI2) achieved better accuracy, with higher coefficients of determination (Rc² = 0.79, Rv² = 0.80), lower relative root mean squared error (RRMSE = 24%), and higher sensitivity across LAI values from 2 to 7, which broadens the applicable range of the new model. Furthermore, this LAI model performed stably across sub-categories of growth stage, variety, and eco-site. In conclusion, this study provides effective technical support for precise UAV-based monitoring of crop growth at various yield levels, which should prove helpful for family farms in modern agriculture. Full article
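MTVI2 itself is a published index (Haboudane et al., 2004) computed from green, red, and near-infrared reflectance; the study then regresses LAI against it. A sketch of the index computation, with illustrative reflectance values rather than the study's data:

```python
import math

def mtvi2(r550, r670, r800):
    """Modified Triangular Vegetation Index 2 (Haboudane et al., 2004),
    from reflectance in the green (550 nm), red (670 nm) and
    near-infrared (800 nm) bands."""
    num = 1.5 * (1.2 * (r800 - r550) - 2.5 * (r670 - r550))
    den = math.sqrt((2 * r800 + 1) ** 2
                    - (6 * r800 - 5 * math.sqrt(r670)) - 0.5)
    return num / den

# Illustrative reflectances for a dense wheat canopy (not from the paper):
print(round(mtvi2(0.08, 0.05, 0.45), 3))  # → 0.63
```

The denominator is a soil-adjustment term that keeps the index sensitive as the canopy closes, which is why MTVI2-based LAI models saturate less at middle to high LAI than simple-ratio indices.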
Remote Sens. EISSN 2072-4292, published by MDPI AG, Basel, Switzerland.