Open Access Article
Remote Sens. 2018, 10(9), 1423; https://doi.org/10.3390/rs10091423

WeedMap: A Large-Scale Semantic Weed Mapping Framework Using Aerial Multispectral Imaging and Deep Neural Network for Precision Farming

1 Autonomous Systems Lab., Department of Mechanical and Process Engineering, ETHZ, Zurich 8092, Switzerland
2 Vision for Robotics Lab., Department of Mechanical and Process Engineering, ETHZ, Zurich 8092, Switzerland
3 Institute of Geodesy and Geoinformation, University of Bonn, Bonn 53115, Germany
4 Crop Science, Department of Environmental Systems Science, ETHZ, Zurich 8092, Switzerland
Current address: Leonhardstrasse 21, Building LEE, J, 8092 Zurich, Switzerland.
* Author to whom correspondence should be addressed.
Received: 27 July 2018 / Revised: 20 August 2018 / Accepted: 26 August 2018 / Published: 7 September 2018
(This article belongs to the Special Issue Deep Learning for Remote Sensing)

Abstract

The ability to automatically monitor agricultural fields is an important capability in precision farming, enabling steps towards more sustainable agriculture. Precise, high-resolution monitoring is a key prerequisite for targeted intervention and the selective application of agro-chemicals. The main goal of this paper is to develop a novel crop/weed segmentation and mapping framework that processes multispectral images obtained from an unmanned aerial vehicle (UAV) using a deep neural network (DNN). Most studies on crop/weed semantic segmentation only consider single images for processing and classification. Images taken by UAVs often cover only a few hundred square meters with either color only or color and near-infrared (NIR) channels. Although a map can be generated by processing single segmented images incrementally, this requires additional complex information fusion techniques which struggle to handle high-fidelity maps due to their computational costs and problems in ensuring global consistency. Moreover, computing a single large and accurate vegetation map (e.g., crop/weed) using a DNN is non-trivial due to difficulties arising from: (1) limited ground sample distances (GSDs) in high-altitude datasets, (2) sacrificed resolution resulting from downsampling high-fidelity images, and (3) multispectral image alignment. To address these issues, we adopt a standard sliding window approach that operates on only small portions of multispectral orthomosaic maps (tiles), which are channel-wise aligned and calibrated radiometrically across the entire map. We define the tile size to be the same as that of the DNN input to avoid resolution loss. Compared to our baseline model (i.e., SegNet with 3-channel RGB (red, green, and blue) inputs) yielding an area under the curve (AUC) of [background=0.607, crop=0.681, weed=0.576], our proposed model with 9 input channels achieves [0.839, 0.863, 0.782].
Additionally, we provide an extensive analysis of 20 trained models, both qualitatively and quantitatively, in order to evaluate the effects of varying input channels and tunable network hyperparameters. Furthermore, we release a large sugar beet/weed aerial dataset with expertly guided annotations for further research in the fields of remote sensing, precision agriculture, and agricultural robotics.
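The tiling strategy described above can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, array sizes, and channel composition (e.g., RGB + NIR + red-edge plus derived vegetation indices stacked channel-wise to 9 channels) are illustrative assumptions. The key idea is that each tile matches the DNN input size, so no downsampling of the high-fidelity orthomosaic is needed.

```python
import numpy as np

def tile_orthomosaic(ortho, tile_size):
    """Split a channel-aligned orthomosaic of shape (H, W, C) into
    non-overlapping square tiles of shape (tile_size, tile_size, C).

    Choosing tile_size equal to the network's input size avoids the
    resolution loss that downsampling the full map would incur."""
    h, w, _ = ortho.shape
    tiles = []
    for y in range(0, h - tile_size + 1, tile_size):
        for x in range(0, w - tile_size + 1, tile_size):
            tiles.append(ortho[y:y + tile_size, x:x + tile_size, :])
    return np.stack(tiles)

# Hypothetical 9-channel orthomosaic; the 960x1280 extent and 480-pixel
# tile size are placeholders, not values from the paper.
ortho = np.zeros((960, 1280, 9), dtype=np.float32)
tiles = tile_orthomosaic(ortho, 480)
print(tiles.shape)  # (4, 480, 480, 9): a 2x2 grid of input-sized tiles
```

After segmentation, the per-tile predictions can be stitched back at their original offsets to form the full semantic weed map, since the tiles partition the orthomosaic without resampling.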
Keywords: precision farming; weed management; multispectral imaging; semantic segmentation; deep neural network; unmanned aerial vehicle; remote sensing
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Supplementary Materials

  • Supplementary File 1: ZIP-Document (ZIP, 2369 KB)
  • Externally hosted supplementary file 1
    Link: https://goo.gl/ZsgeCV
    Description: Remote Sensing 2018 ETH ASL, Weed Map Dataset

Share & Cite This Article

MDPI and ACS Style

Sa, I.; Popović, M.; Khanna, R.; Chen, Z.; Lottes, P.; Liebisch, F.; Nieto, J.; Stachniss, C.; Walter, A.; Siegwart, R. WeedMap: A Large-Scale Semantic Weed Mapping Framework Using Aerial Multispectral Imaging and Deep Neural Network for Precision Farming. Remote Sens. 2018, 10, 1423.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Remote Sens. EISSN 2072-4292. Published by MDPI AG, Basel, Switzerland.