Article

High-Throughput Phenotyping of Soybean Maturity Using Time Series UAV Imagery and Convolutional Neural Networks

by Rodrigo Trevisan, Osvaldo Pérez, Nathan Schmitz, Brian Diers and Nicolas Martin
1 Department of Crop Sciences, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA
2 Estación Experimental INIA La Estanzuela, Instituto Nacional de Investigación Agropecuaria (INIA), Ruta 50 km 11, Colonia 70000, Uruguay
3 GDM Seeds Inc., Gibson City, IL 60936, USA
* Author to whom correspondence should be addressed.
Remote Sens. 2020, 12(21), 3617; https://doi.org/10.3390/rs12213617
Received: 18 September 2020 / Revised: 28 October 2020 / Accepted: 29 October 2020 / Published: 4 November 2020
Soybean maturity is a trait of critical importance for the development of new soybean cultivars; nevertheless, its characterization based on visual ratings presents many challenges. High-throughput phenotyping methodologies based on unmanned aerial vehicle (UAV) imagery have been proposed as an alternative to traditional visual ratings of pod senescence. However, the lack of scalable and accurate methods to extract the desired information from the images remains a significant bottleneck in breeding programs. The objective of this study was to develop an image-based high-throughput phenotyping system for evaluating soybean maturity in breeding programs. Images were acquired twice a week, from when the earliest lines began to mature until the latest ones reached maturity. Two complementary convolutional neural networks (CNNs) were developed to predict the maturity date: the first uses imagery from a single date, and the second uses the five best image dates identified by the first model. The proposed CNN architecture was validated using more than 15,000 ground truth observations from five trials, including data from three growing seasons and two countries. The trained model showed good generalization capability, with a root mean squared error lower than two days in four out of five trials. Four methods of estimating prediction uncertainty showed potential for identifying different sources of error in the maturity date predictions. The architecture developed addresses limitations of previous research and can be used at scale in commercial breeding programs.
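
The following is a minimal sketch, not the authors' implementation, of the two-stage idea described in the abstract: a CNN regresses the maturity date from a single-date plot image, and a second model combines predictions from the five most informative image dates. It assumes a PyTorch implementation; layer sizes, input resolution, the target encoding (e.g., days after planting), and the way per-date estimates are fused are illustrative assumptions not specified in the abstract.

import torch
import torch.nn as nn

class SingleDateCNN(nn.Module):
    """Predicts a maturity date (e.g., days after planting) from one plot image."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.regressor = nn.Linear(32, 1)

    def forward(self, x):                # x: (batch, 3, H, W)
        z = self.features(x).flatten(1)  # (batch, 32) plot-level features
        return self.regressor(z)         # (batch, 1) predicted maturity date

class MultiDateCNN(nn.Module):
    """Fuses the five best image dates (selected using the single-date model)."""
    def __init__(self, n_dates=5):
        super().__init__()
        self.backbone = SingleDateCNN()
        self.head = nn.Linear(n_dates, 1)

    def forward(self, x):                # x: (batch, n_dates, 3, H, W)
        b, t, c, h, w = x.shape
        per_date = self.backbone(x.view(b * t, c, h, w)).view(b, t)
        return self.head(per_date)       # combine per-date estimates into one prediction

# Example: 8 plots, 5 image dates, 64x64 RGB crops per plot
preds = MultiDateCNN()(torch.randn(8, 5, 3, 64, 64))
print(preds.shape)  # torch.Size([8, 1])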
Keywords: machine learning; physiological maturity; computer vision; plant breeding; soybean phenology; Glycine max (L.) Merr.

MDPI and ACS Style

Trevisan, R.; Pérez, O.; Schmitz, N.; Diers, B.; Martin, N. High-Throughput Phenotyping of Soybean Maturity Using Time Series UAV Imagery and Convolutional Neural Networks. Remote Sens. 2020, 12, 3617. https://doi.org/10.3390/rs12213617

AMA Style

Trevisan R, Pérez O, Schmitz N, Diers B, Martin N. High-Throughput Phenotyping of Soybean Maturity Using Time Series UAV Imagery and Convolutional Neural Networks. Remote Sensing. 2020; 12(21):3617. https://doi.org/10.3390/rs12213617

Chicago/Turabian Style

Trevisan, Rodrigo, Osvaldo Pérez, Nathan Schmitz, Brian Diers, and Nicolas Martin. 2020. "High-Throughput Phenotyping of Soybean Maturity Using Time Series UAV Imagery and Convolutional Neural Networks" Remote Sensing 12, no. 21: 3617. https://doi.org/10.3390/rs12213617

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
