Open Access Article
Sensors 2017, 17(12), 2930; https://doi.org/10.3390/s17122930

Estimation of the Botanical Composition of Clover-Grass Leys from RGB Images Using Data Simulation and Fully Convolutional Neural Networks

1 Department of Engineering, Aarhus University, Finlandsgade 22, 8200 Aarhus N, Denmark
2 Department of Agroecology, Aarhus University, Forsøgsvej 1, 4200 Slagelse, Denmark
3 Agro Intelligence ApS, Agro Food Park 13, 8200 Aarhus N, Denmark
4 Department of Agroecology, Aarhus University, Blichers Allé 20, 8830 Tjele, Denmark
* Author to whom correspondence should be addressed.
Received: 31 October 2017 / Revised: 29 November 2017 / Accepted: 12 December 2017 / Published: 17 December 2017
(This article belongs to the Special Issue Sensors in Agriculture)

Abstract

Optimal fertilization of clover-grass fields relies on knowledge of the clover and grass fractions. This study shows how this knowledge can be obtained automatically by analyzing images collected in the field. A fully convolutional neural network was trained to produce a pixel-wise classification of clover, grass, and weeds in red, green, and blue (RGB) images of clover-grass mixtures. The clover fractions of the dry matter estimated from the images were found to be highly correlated with the actual clover fractions of the dry matter, making this a cheap and non-destructive way of monitoring clover-grass fields. The network was trained solely on simulated top-down images of clover-grass fields, yet it learned to distinguish clover, grass, and weed pixels in real images. Training on simulated images reduces the manual labor to a few hours, compared to more than 3000 h if all the real training images were annotated by hand. The network was tested on images with varied clover/grass ratios and achieved an overall pixel classification accuracy of 83.4%, while estimating the dry-matter clover fraction with a standard deviation of 7.8%.
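The abstract describes predicting the clover fraction from a pixel-wise classification of the image. As a minimal sketch of the downstream step, the snippet below computes per-class pixel fractions from a predicted class map; the class labels, array layout, and function name are illustrative assumptions, not taken from the paper, and the paper's actual pipeline additionally maps pixel fractions to dry-matter fractions.

```python
import numpy as np

# Hypothetical class labels for a pixel-wise prediction map, as might be
# produced by a fully convolutional network (not the paper's own encoding).
GRASS, CLOVER, WEED = 0, 1, 2

def pixel_fractions(class_map: np.ndarray) -> dict:
    """Return the fraction of pixels assigned to each class."""
    total = class_map.size
    return {
        "grass": np.count_nonzero(class_map == GRASS) / total,
        "clover": np.count_nonzero(class_map == CLOVER) / total,
        "weed": np.count_nonzero(class_map == WEED) / total,
    }

# Toy 4x4 prediction map: 8 grass, 6 clover, and 2 weed pixels.
pred = np.array([
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 2, 2],
])
fractions = pixel_fractions(pred)
```

In the paper, such pixel fractions are then related to the measured dry-matter composition; the reported standard deviation of 7.8% refers to that dry-matter clover-fraction estimate, not to raw pixel counts.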
Keywords: deep learning; clover-grass; precision agriculture; dry matter composition; proximity sensing
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

MDPI and ACS Style

Skovsen, S.; Dyrmann, M.; Mortensen, A.K.; Steen, K.A.; Green, O.; Eriksen, J.; Gislum, R.; Jørgensen, R.N.; Karstoft, H. Estimation of the Botanical Composition of Clover-Grass Leys from RGB Images Using Data Simulation and Fully Convolutional Neural Networks. Sensors 2017, 17, 2930.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Sensors EISSN 1424-8220, published by MDPI AG, Basel, Switzerland.