Open Access Article

Classifying Wheat Hyperspectral Pixels of Healthy Heads and Fusarium Head Blight Disease Using a Deep Neural Network in the Wild Field

1 College of Information and Computer Science, Anhui Agricultural University, Hefei 230036, China
2 College of Agronomy, Anhui Agricultural University, Hefei 230036, China
* Author to whom correspondence should be addressed.
Remote Sens. 2018, 10(3), 395; https://doi.org/10.3390/rs10030395
Received: 21 January 2018 / Revised: 27 February 2018 / Accepted: 1 March 2018 / Published: 4 March 2018
Rapid and non-destructive classification of healthy and diseased wheat heads for the early diagnosis of Fusarium head blight disease is difficult. Our work applies a deep neural network classification algorithm to the pixels of hyperspectral images to accurately discern the diseased areas. The spectra of hyperspectral image pixels in a manually selected region of interest are preprocessed via mean removal to eliminate interference due to the time interval and the environment. The generalization of the classification model is considered, and two improvements are made to the model framework. First, the pixel spectra are reshaped into a two-dimensional data structure for the input layer of a Convolutional Neural Network (CNN). After training two types of CNNs, the assessment shows that the two-dimensional CNN model is more efficient than the one-dimensional CNN. Second, a hybrid neural network with a convolutional layer and a bidirectional recurrent layer is constructed to improve the generalization of the model. Considering the characteristics of the dataset and models, the confusion matrices based on the testing dataset indicate that the classification model is effective for background and disease classification of hyperspectral image pixels. The two-dimensional convolutional bidirectional gated recurrent unit neural network (2D-CNN-BidGRU) achieves an F1 score of 0.75 and an accuracy of 0.743 on the total testing dataset. A comparison of all the models shows that the hybrid 2D-CNN-BidGRU neural network is the best at preventing over-fitting and optimizing generalization. Our results illustrate that the hybrid-structure deep neural network is an excellent classification algorithm for distinguishing healthy from Fusarium head blight diseased wheat heads in field hyperspectral imagery.
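The abstract describes the 2D-CNN-BidGRU pipeline only at a high level, so the sketch below is an unofficial illustration of that idea rather than the authors' published configuration. The per-pixel mean-removal step, the reshape of each band vector into a 16 × 16 patch, the layer widths, and the three-class output (healthy / diseased / background) are all assumptions made for the example.

```python
# Illustrative sketch only: band count, patch shape, layer widths, and
# class count are assumptions, not the authors' exact configuration.
import torch
import torch.nn as nn

N_BANDS = 256          # assumed number of hyperspectral bands per pixel
PATCH   = (16, 16)     # assumed 2-D reshape of the 1-D spectrum (16*16 = 256)

def mean_removal(spectra: torch.Tensor) -> torch.Tensor:
    """Subtract each pixel's mean reflectance (one plausible reading of
    the paper's 'mean removal' preprocessing)."""
    return spectra - spectra.mean(dim=1, keepdim=True)

class CNNBidGRU(nn.Module):
    """Hybrid 2-D convolution + bidirectional GRU classifier (2D-CNN-BidGRU)."""
    def __init__(self, n_classes: int = 3):   # e.g. healthy / diseased / background (assumed)
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                             # -> 16 x 8 x 8
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                             # -> 32 x 4 x 4
        )
        # Treat the 4 feature-map rows as a sequence of length 4,
        # each step a 32*4 = 128-dimensional feature vector.
        self.gru = nn.GRU(input_size=32 * 4, hidden_size=64,
                          batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * 64, n_classes)

    def forward(self, spectra: torch.Tensor) -> torch.Tensor:
        x = mean_removal(spectra)                        # (B, N_BANDS)
        x = x.view(-1, 1, *PATCH)                        # (B, 1, 16, 16)
        x = self.conv(x)                                 # (B, 32, 4, 4)
        x = x.permute(0, 2, 1, 3).flatten(2)             # (B, 4, 128)
        out, _ = self.gru(x)                             # (B, 4, 128)
        return self.fc(out[:, -1])                       # class logits

logits = CNNBidGRU()(torch.rand(8, N_BANDS))             # e.g. a batch of 8 pixels
```

The bidirectional GRU reads the convolutional feature maps as a short sequence in both directions, which reflects the hybrid convolutional-plus-recurrent structure the abstract credits with reducing over-fitting and improving generalization.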
Keywords: Fusarium head blight disease; hyperspectral image; classification; deep convolution recurrent neural network (DCRNN); bidirectional recurrent neural network

MDPI and ACS Style

Jin, X.; Jie, L.; Wang, S.; Qi, H.J.; Li, S.W. Classifying Wheat Hyperspectral Pixels of Healthy Heads and Fusarium Head Blight Disease Using a Deep Neural Network in the Wild Field. Remote Sens. 2018, 10, 395.
