Reshaping Hyperspectral Data into a Two-Dimensional Image for a CNN Model to Classify Plant Species from Reflectance
Abstract
1. Introduction
2. Materials and Methods
2.1. Dataset Description
2.1.1. Data Source
2.1.2. Plant Species Selection
2.1.3. Image Data Preparation for CNN Models
2.2. CNN Model Architectures
2.3. Model Comparison
2.3.1. Comparison of CNN Models with DCN
2.3.2. Comparison of CNN Models with Conventional Models
2.4. Model Evaluation
2.5. Process Flowchart
3. Results
3.1. Comparison of CNN Models
3.2. Comparison of CNN with DCN and Conventional Models
3.3. Identification Results
4. Discussion and Future Work
4.1. Reshaping Leaf Spectroscopy Data to Feed CNN Models
4.2. Superiority of CNN Model on Species Classification Using Leaf Spectroscopy Data
4.3. Advantages and Uncertainty of Approach; Future Studies
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Latin Name | Symbol | Code | Group | Training and Validation | Prediction |
|---|---|---|---|---|---|
| Acer pseudoplatanus L. | ACPS | 0 | Tree | 181 | 10 |
| Acer rubrum L. | ACRU | 1 | Tree | 156 | 18 |
| Acer shirasawanum Koidzumi | ACSH | 2 | Shrub-tree | 100 | 9 |
| Andropogon gerardii Vitman | ANGE | 3 | Grass | 89 | 16 |
| Fagus crenata Blume | FACR | 4 | Tree | 214 | 24 |
| Quercus rubra L. | QURU | 5 | Tree | 105 | 23 (5 + 18 from two sources) |
| Total | | | | 845 | 100 |

By source dataset, the training and validation samples comprise AN 181, NE 350, and SH 314 (845 in total); the prediction samples comprise LO 52, CC 15, and SH 33 (100 in total).
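The CNN inputs listed in the architecture table below are 45 × 45 × 3 images obtained by reshaping each one-dimensional reflectance spectrum into a two-dimensional array. As a minimal sketch only (the exact band ordering, resampling, and channel handling used in the study may differ), a spectrum resampled or zero-padded to 45 × 45 = 2025 bands can be wrapped row by row into a square plane and replicated across three channels:

```python
import numpy as np

def spectrum_to_image(reflectance, size=45):
    """Wrap a 1-D reflectance spectrum into a (size, size, 3) array.

    Illustrative only: assumes the spectrum is resampled or zero-padded
    to size * size bands and that the single plane is simply replicated
    to three channels to match the 45 x 45 x 3 CNN input shape.
    """
    n = size * size
    spec = np.asarray(reflectance, dtype=np.float32)
    if spec.size < n:                       # pad short spectra with zeros
        spec = np.pad(spec, (0, n - spec.size))
    plane = spec[:n].reshape(size, size)    # row-by-row wrapping
    return np.stack([plane, plane, plane], axis=-1)
```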
| Layer | Parameter | CNN1A | CNN1B | CNN2A | CNN2B | CNN2C | CNN3A | CNN3B | CNN3C |
|---|---|---|---|---|---|---|---|---|---|
| Input | | 45 × 45 × 3 | 45 × 45 × 3 | 45 × 45 × 3 | 45 × 45 × 3 | 45 × 45 × 3 | 45 × 45 × 3 | 45 × 45 × 3 | 45 × 45 × 3 |
| Rescaling | | 45 × 45 × 3 | 45 × 45 × 3 | 45 × 45 × 3 | 45 × 45 × 3 | 45 × 45 × 3 | 45 × 45 × 3 | 45 × 45 × 3 | 45 × 45 × 3 |
| Conv1 | Kernel | 3 × 3 | 3 × 3 | 3 × 3 | 3 × 3 | 3 × 3 | 3 × 3 | 3 × 3 | 3 × 3 |
| Conv1 | Stride | 1 × 1 | 1 × 1 | 1 × 1 | 1 × 1 | 1 × 1 | 1 × 1 | 1 × 1 | 1 × 1 |
| Conv1 | Output | 45 × 45 × 32 | 45 × 45 × 32 | 45 × 45 × 32 | 45 × 45 × 32 | 45 × 45 × 32 | 45 × 45 × 32 | 45 × 45 × 32 | 45 × 45 × 32 |
| Pooling | Output | - | 22 × 22 × 32 | - | 22 × 22 × 32 | 22 × 22 × 32 | - | 22 × 22 × 32 | 22 × 22 × 32 |
| Conv2 | Kernel | - | - | 3 × 3 | 3 × 3 | 3 × 3 | 3 × 3 | 3 × 3 | 3 × 3 |
| Conv2 | Stride | - | - | 1 × 1 | 1 × 1 | 1 × 1 | 1 × 1 | 1 × 1 | 1 × 1 |
| Conv2 | Output | - | - | 45 × 45 × 32 | 22 × 22 × 32 | 22 × 22 × 32 | 45 × 45 × 32 | 22 × 22 × 32 | 22 × 22 × 32 |
| Pooling | Output | - | - | - | - | 11 × 11 × 32 | - | 11 × 11 × 32 | 11 × 11 × 32 |
| Conv3 | Kernel | - | - | - | - | - | 3 × 3 | 3 × 3 | 3 × 3 |
| Conv3 | Stride | - | - | - | - | - | 1 × 1 | 1 × 1 | 1 × 1 |
| Conv3 | Output | - | - | - | - | - | 45 × 45 × 64 | 11 × 11 × 64 | 11 × 11 × 64 |
| Pooling | Output | - | - | - | - | - | - | - | 5 × 5 × 64 |
| Dropout | Rate | 0.2 | 0.2 | 0.2 | 0.2 | 0.2 | 0.2 | 0.2 | 0.2 |
| Flatten | Output | 64,800 | 15,488 | 64,800 | 15,488 | 3872 | 129,600 | 7744 | 1600 |
| Dense | Weights | 64,800 × 128 | 15,488 × 128 | 64,800 × 128 | 15,488 × 128 | 3872 × 128 | 129,600 × 128 | 7744 × 128 | 1600 × 128 |
| Output | | 1 × 6 | 1 × 6 | 1 × 6 | 1 × 6 | 1 × 6 | 1 × 6 | 1 × 6 | 1 × 6 |
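As a concrete example, CNN2C (the configuration that reached the highest prediction accuracy in the results below) can be sketched in Keras roughly as follows. The layer sizes match the table; the activation functions, "same" padding, 1/255 rescaling factor, and compile settings are assumptions for illustration, not necessarily the exact settings used in the study.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn2c(num_classes=6):
    """Sketch of the CNN2C column: two conv+pool blocks, dropout, dense head."""
    model = models.Sequential([
        layers.Input(shape=(45, 45, 3)),
        layers.Rescaling(1.0 / 255),                               # assumed scaling
        layers.Conv2D(32, 3, padding="same", activation="relu"),   # 45 x 45 x 32
        layers.MaxPooling2D(),                                     # 22 x 22 x 32
        layers.Conv2D(32, 3, padding="same", activation="relu"),   # 22 x 22 x 32
        layers.MaxPooling2D(),                                     # 11 x 11 x 32
        layers.Dropout(0.2),
        layers.Flatten(),                                          # 3872
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),           # 1 x 6
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```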
| | Predicted Positive | Predicted Negative |
|---|---|---|
| Actual Positive | True positive (TP) | False negative (FN) |
| Actual Negative | False positive (FP) | True negative (TN) |
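The accuracy, precision, and F1-score reported below follow from these confusion-matrix counts in the standard way (shown per class; the multi-class figures are assumed to be averaged over the six species):

\[
\text{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN}, \qquad
\text{Precision} = \frac{TP}{TP + FP},
\]
\[
\text{Recall} = \frac{TP}{TP + FN}, \qquad
F_1 = \frac{2 \cdot \text{Precision} \cdot \text{Recall}}{\text{Precision} + \text{Recall}}.
\]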
In the two tables below, validation accuracy refers to the training phase; prediction accuracy, precision, and F1-score refer to the application phase.

| Model | Validation Accuracy (%) | Prediction Accuracy (%) | Precision (%) | F1-Score |
|---|---|---|---|---|
CNN1A | 98.4 ± 1.5 | 87.6 ± 1.7 | 62.9 ± 5.0 | 0.54 ± 0.04 |
CNN1B | 98.6 ± 0.7 | 88.9 ± 1.7 | 66.6 ± 5.2 | 0.58 ± 0.05 |
CNN2A | 98.0 ± 2.0 | 87.6 ± 2.3 | 62.7 ± 7.0 | 0.54 ± 0.07 |
CNN2B | 98.6 ± 0.9 | 90.5 ± 2.6 | 71.5 ± 7.7 | 0.62 ± 0.07 |
CNN2C | 98.6 ± 0.8 | 91.6 ± 2.7 | 74.9 ± 8.2 | 0.65 ± 0.08 |
CNN3A | 98.5 ± 1.2 | 88.6 ± 3.0 | 65.9 ± 8.9 | 0.57 ± 0.09 |
CNN3B | 98.5 ± 1.1 | 91.2 ± 2.2 | 73.6 ± 6.7 | 0.63 ± 0.08 |
CNN3C | 98.2 ± 1.1 | 90.8 ± 2.9 | 72.5 ± 8.6 | 0.62 ± 0.09 |
| Model | Validation Accuracy (%) | Prediction Accuracy (%) | Precision (%) | F1-Score |
|---|---|---|---|---|
DCN | 94.0 ± 13.9 | 85.7 ± 2.5 | 57.1 ± 7.5 | 0.48 ± 0.09 |
SVM | 98.6 ± 1.0 | 88.8 ± 1.2 | 66.4 ± 3.7 | 0.64 ± 0.04 |
RF | 86.7 ± 2.6 | 83.6 ± 0.9 | 50.7 ± 2.8 | 0.46 ± 0.03 |
GBDT | 91.6 ± 1.9 | 83.4 ± 1.1 | 50.4 ± 3.2 | 0.44 ± 0.03 |
DT | 83.4 ± 2.6 | 80.9 ± 1.6 | 42.8 ± 4.9 | 0.38 ± 0.05 |
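For comparison, the conventional baselines (SVM, RF, GBDT, DT) can be trained directly on the flattened spectra with scikit-learn. The sketch below is illustrative only: the placeholder data, hyperparameters, and cross-validation scheme are assumptions, not the settings used to produce the table above.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Placeholder arrays: one reflectance spectrum per row, species codes 0-5.
X = np.random.rand(845, 2025)
y = np.random.randint(0, 6, size=845)

baselines = {
    "SVM": SVC(kernel="rbf"),
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "GBDT": GradientBoostingClassifier(random_state=0),
    "DT": DecisionTreeClassifier(random_state=0),
}
for name, clf in baselines.items():
    scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross-validated accuracy
    print(f"{name}: {scores.mean():.3f} ± {scores.std():.3f}")
```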
Identification accuracy by species (%):

| Species | CNN2C | DCN | SVM | RF | GBDT | DT |
|---|---|---|---|---|---|---|
ACPS | 0.1 | 0.0 | 77.0 | 59.0 | 26.0 | 23.0 |
ACRU | 88.9 | 72.2 | 72.2 | 48.9 | 54.4 | 26.7 |
ACSH | 74.4 | 87.8 | 44.4 | 7.2 | 0.1 | 13.3 |
ANGE | 68.8 | 29.4 | 81.3 | 93.8 | 93.8 | 87.5 |
FACR | 100.0 | 83.3 | 95.8 | 79.2 | 87.5 | 70.8 |
QURU | 65.2 | 47.8 | 17.4 | 8.7 | 8.3 | 13.9 |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Yuan, S.; Song, G.; Huang, G.; Wang, Q. Reshaping Hyperspectral Data into a Two-Dimensional Image for a CNN Model to Classify Plant Species from Reflectance. Remote Sens. 2022, 14, 3972. https://doi.org/10.3390/rs14163972