The development and improvement of methods to map agricultural land cover are currently major challenges, especially for radar images. This is due to the speckle noise inherent to radar, which has led to radar images being used less intensively than optical images. The European Space Agency Sentinel-1 constellation, which recently became operational, is a satellite system providing global Synthetic Aperture Radar (SAR) coverage with a 6-day revisit period at a high spatial resolution of about 20 m. These data are valuable because they provide spatial information on agricultural crops. The aim of this paper is to provide a better understanding of the capabilities of Sentinel-1 radar images for agricultural land cover mapping through the use of deep learning techniques. The analysis is carried out on multitemporal Sentinel-1 data over an area in the Camargue, France. The data set was processed to produce a stack of radar intensity images from May 2017 to September 2017. We improved this radar time series by applying temporal filtering to reduce noise while retaining, as much as possible, the fine structures present in the images. We show that even classical machine learning approaches (k-nearest neighbors, random forest, and support vector machines) achieve good classification performance, with an F-measure/accuracy greater than 86% and a Kappa coefficient above 0.82. The two deep recurrent neural network (RNN)-based classifiers clearly outperformed these classical approaches. Finally, our analyses of the Camargue area show that both RNN-based classifiers achieved the same performance on the Rice class, the dominant crop of this region, with an F-measure of 96%. These results highlight that, in the near future, such RNN-based techniques will play an important role in the analysis of remote sensing time series.
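The classical baselines and metrics mentioned above can be sketched as follows. This is a hypothetical illustration, not the paper's code: it trains the three classical classifiers on synthetic stand-in data (each sample mimics a per-pixel vector of backscatter intensities across acquisition dates) and reports the same metrics the abstract cites (F-measure, accuracy, Kappa coefficient).

```python
# Hypothetical sketch (not the authors' pipeline): classical baselines
# on synthetic SAR-like time-series features, evaluated with F-measure,
# accuracy, and Cohen's Kappa as in the abstract.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.metrics import f1_score, accuracy_score, cohen_kappa_score

# Synthetic stand-in for per-pixel Sentinel-1 time series:
# 20 features ~ intensity values across 20 acquisition dates.
X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=12, n_classes=4,
                           n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)

classifiers = {
    "kNN": KNeighborsClassifier(n_neighbors=5),
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM": SVC(kernel="rbf", gamma="scale"),
}
for name, clf in classifiers.items():
    pred = clf.fit(X_tr, y_tr).predict(X_te)
    print(f"{name}: F1={f1_score(y_te, pred, average='weighted'):.3f} "
          f"Acc={accuracy_score(y_te, pred):.3f} "
          f"Kappa={cohen_kappa_score(y_te, pred):.3f}")
```

On real Sentinel-1 data the feature vectors would come from the temporally filtered intensity stack rather than `make_classification`, and the scores would be computed against field-survey reference labels.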
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.