Article

Estimation of the Botanical Composition of Clover-Grass Leys from RGB Images Using Data Simulation and Fully Convolutional Neural Networks

1 Department of Engineering, Aarhus University, Finlandsgade 22, 8200 Aarhus N, Denmark
2 Department of Agroecology, Aarhus University, Forsøgsvej 1, 4200 Slagelse, Denmark
3 Agro Intelligence ApS, Agro Food Park 13, 8200 Aarhus N, Denmark
4 Department of Agroecology, Aarhus University, Blichers Allé 20, 8830 Tjele, Denmark
* Author to whom correspondence should be addressed.
Sensors 2017, 17(12), 2930; https://doi.org/10.3390/s17122930
Received: 31 October 2017 / Revised: 29 November 2017 / Accepted: 12 December 2017 / Published: 17 December 2017
(This article belongs to the Special Issue Sensors in Agriculture)
Optimal fertilization of clover-grass fields relies on knowledge of the clover and grass fractions. This study shows how this knowledge can be obtained automatically by analyzing images collected in the field. A fully convolutional neural network was trained to produce a pixel-wise classification of clover, grass, and weeds in red, green, and blue (RGB) images of clover-grass mixtures. The clover fractions of the dry matter estimated from the images were highly correlated with the measured clover fractions of the dry matter, making this a cheap and non-destructive way of monitoring clover-grass fields. The network was trained solely on simulated top-down images of clover-grass fields, yet it distinguishes clover, grass, and weed pixels in real images. Training on simulated images reduces the manual labor to a few hours, compared with the more than 3000 h that annotating all the real images for training would require. The network was tested on images with varied clover/grass ratios and achieved an overall pixel classification accuracy of 83.4%, while estimating the dry matter clover fraction with a standard deviation of 7.8%.
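The last step of the pipeline described in the abstract, turning a pixel-wise segmentation into per-species canopy fractions, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the class labels (`SOIL`, `CLOVER`, `GRASS`, `WEED`) and the restriction to vegetation pixels are assumptions for the example, and the mapping from canopy fraction to dry-matter fraction (which the paper calibrates against field measurements) is omitted.

```python
import numpy as np

# Hypothetical integer class labels for a predicted segmentation map;
# the paper's actual label encoding is not given here.
SOIL, CLOVER, GRASS, WEED = 0, 1, 2, 3

def canopy_fractions(seg_map: np.ndarray) -> dict:
    """Per-class pixel fractions, computed over vegetation pixels only."""
    veg = seg_map[seg_map != SOIL]          # drop soil/background pixels
    total = veg.size
    return {
        "clover": np.count_nonzero(veg == CLOVER) / total,
        "grass":  np.count_nonzero(veg == GRASS) / total,
        "weed":   np.count_nonzero(veg == WEED) / total,
    }

# Toy 4x4 prediction: 8 clover, 4 grass, 2 weed, 2 soil pixels.
seg = np.array([
    [1, 1, 1, 1],
    [1, 1, 1, 1],
    [2, 2, 2, 2],
    [3, 3, 0, 0],
])
print(canopy_fractions(seg))  # clover = 8/14, grass = 4/14, weed = 2/14
```

In the paper, a fraction of this kind is then related to the dry-matter clover fraction via a calibration against destructively sampled plots, which is where the reported 7.8% standard deviation is measured.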
Keywords: deep learning; clover-grass; precision agriculture; dry matter composition; proximity sensing
MDPI and ACS Style

Skovsen, S.; Dyrmann, M.; Mortensen, A.K.; Steen, K.A.; Green, O.; Eriksen, J.; Gislum, R.; Jørgensen, R.N.; Karstoft, H. Estimation of the Botanical Composition of Clover-Grass Leys from RGB Images Using Data Simulation and Fully Convolutional Neural Networks. Sensors 2017, 17, 2930. https://doi.org/10.3390/s17122930

