Imaging Using Unmanned Aerial Vehicles for Agriculture Land Use Classification
Abstract
1. Introduction
2. Methods
2.1. Research Sites
2.2. Photographing Tools
2.3. Research Procedure
2.4. Maximum Likelihood Method
2.5. Single Feature Probability
2.6. Accuracy Assessment
- (1) Producer's accuracy: the accuracy of classification with respect to the ground-truth reference data. It is obtained by dividing the number of correctly classified samples by the total number of samples in the corresponding reference data.
- (2) User's accuracy: the accuracy with which the computer-based interpretation represents actual land cover. It is calculated by dividing the number of correctly classified samples of a specific type by the total number of samples classified as that type.
- (3) Overall accuracy: the most straightforward and general parameter, obtained by dividing the number of samples on the diagonal of the confusion matrix by the total number of samples. Because this index weights all land use types, the results are relatively objective. A higher overall accuracy corresponds to a more accurate interpretation result.
- (4) Kappa value: this index compares computer classification with random classification and calculates the percentage by which the errors of the former are lower than those of the latter. In general, the Kappa value ranges between 0 and 1; a high value indicates high agreement between the two classifications and therefore a highly accurate computer-based interpretation. A worked sketch computing all four indices follows this list.
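As a concrete illustration, the minimal Python sketch below computes all four indices from a small hypothetical confusion matrix (rows taken as classification results and columns as reference data, matching the confusion matrix table in this section); the sample counts are invented for illustration and are not data from this study.

```python
import numpy as np

# Hypothetical 3-class confusion matrix: rows = classification results,
# columns = ground-truth reference data. Counts are illustrative only.
cm = np.array([
    [50,  3,  2],
    [ 4, 45,  6],
    [ 1,  5, 40],
])

n = cm.sum()                   # total number of samples (N)
diag = np.diag(cm)             # correctly classified samples per type
row_totals = cm.sum(axis=1)    # totals per classified type (X_i+)
col_totals = cm.sum(axis=0)    # totals per reference type (X_+i)

users_accuracy = diag / row_totals      # per type, relative to classified data
producers_accuracy = diag / col_totals  # per type, relative to reference data
overall_accuracy = diag.sum() / n

# Kappa: observed agreement corrected for chance (random) agreement
expected = (row_totals * col_totals).sum() / n**2
kappa = (overall_accuracy - expected) / (1 - expected)

print("Producer's accuracy:", producers_accuracy.round(3))
print("User's accuracy:    ", users_accuracy.round(3))
print(f"Overall accuracy: {overall_accuracy:.3f}, Kappa: {kappa:.3f}")
```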
3. Results
3.1. Analysis of Land Use Interpretation at the Lucao Sampling Site
3.2. Analysis of Land Use Interpretation at the Minxiong Sampling Site
3.3. Analysis of Land Use Interpretation at Xingang Sampling Site
3.4. Analysis of Land Use Interpretation at the Budai Sampling Site
3.5. Analysis of Land Use Interpretation at the Yizhu Sampling Site
4. Discussion
The interpretation results for the five sampling sites demonstrated that images captured from August to October yielded high interpretation accuracies. Moreover, interpretation accuracy varied with the environmental features of the sampling sites; even at a single sampling site, it varied among crop growth stages.
The classification accuracy of RGB images alone was unsatisfactory (60–88%). However, after the NIR band data were added, the classification accuracy increased to over 80%, and adding the DSM elevation data further improved it to approximately 90%. Multispectral and elevation data were therefore verified to effectively enhance the accuracy of land use classification. These results also showed that RGB-only classification is prone to the "salt and pepper" effect, which can be ameliorated by including multispectral images: RGB images offer low spectral classification accuracy because they lack the NIR band that improves the separability of land cover types. These findings are consistent with past studies [11,18,28,29].
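For readers reproducing this workflow outside ArcGIS, the sketch below illustrates, under assumed array shapes, how co-registered RGB, NIR, and DSM rasters can be stacked into progressively richer per-pixel feature vectors; the arrays are random stand-ins, not the study's imagery.

```python
import numpy as np

# Stand-ins for co-registered rasters of the same sampling site.
rgb = np.random.rand(512, 512, 3)  # R, G, B bands of an orthomosaic
nir = np.random.rand(512, 512, 1)  # co-registered NIR band
dsm = np.random.rand(512, 512, 1)  # digital surface model (elevation)

# The three composites compared in this study, as per-pixel feature stacks.
features_rgb = rgb                                       # 3 bands
features_rgb_nir = np.concatenate([rgb, nir], axis=-1)   # 4 bands
features_all = np.concatenate([rgb, nir, dsm], axis=-1)  # 5 bands

# Each pixel becomes one feature vector; richer stacks give a classifier
# more dimensions along which land cover types can separate.
pixels = features_all.reshape(-1, features_all.shape[-1])
print(pixels.shape)  # (262144, 5)
```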
Adding different types of image information exerts distinct effects on land use classification. For example, paddy fields and water bodies exhibit highly similar spectral characteristics, often leading to misclassification; however, the addition of DSM elevation data helps distinguish these two land types. Similarly, buildings with green roofs are likely to be misinterpreted as vegetation cover, and adding NIR band and DSM elevation data can effectively distinguish such buildings from vegetation. Without DSM elevation data, misclassification often occurs because terrain height and plant height cannot be taken into account [15,16]. When multiple types of images are integrated, such as multispectral and elevation images, the accuracy of land cover classification increases [30].
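A small worked example of the NIR effect described above: live vegetation reflects strongly in the NIR band, whereas green paint does not, so an index such as NDVI separates the two. The reflectance values below are assumed, typical magnitudes, not measurements from this study.

```python
import numpy as np

# Assumed reflectances for [live vegetation, green-painted roof].
red = np.array([0.05, 0.30])
nir = np.array([0.50, 0.32])

# NDVI = (NIR - Red) / (NIR + Red): high for vegetation, near zero for paint.
ndvi = (nir - red) / (nir + red)
print(ndvi.round(3))  # ~[0.818, 0.032] under these assumed values
```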
Paddy fields and fish farms are easily misclassified because of the presence of water on the land surface. In this study, the addition of NIR band data did not prevent fish farms from being misclassified as paddy fields; this is attributable to fish farms containing various aquatic plants and algae that are difficult to distinguish using multispectral images. However, this problem can be addressed through the addition of elevation data [20,31]. In addition, water bodies reflect light, which can lead to errors in image interpretation. We therefore suggest that researchers avoid capturing images at noon to reduce direct light reflection from water.
The image classification tool in ArcGIS 10.0 was used to identify land uses. This tool classifies each cell by its spectral values, using the known spectra of delineated training samples to interpret the remaining parts of the image. However, because different land cover types can yield similar spectra, misinterpretation is possible; for example, a vegetable field can be misidentified as open space owing to the bare soil in the field. The interpretation results were relatively accurate only when the crops were flourishing; newly planted seedlings were too small to be effectively classified by computer interpretation [32]. Therefore, to increase interpretation accuracy, crop samples at various growth stages should be used for computer learning.
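To make the training-sample mechanism concrete, the following minimal sketch implements Gaussian maximum likelihood classification, the principle behind the method described in Section 2.4: a multivariate normal distribution is fitted to each delineated training class, and each pixel is assigned to the class with the highest log-likelihood. Class names, band count, and training data are hypothetical stand-ins, not the study's samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training samples: class name -> (n_samples, n_bands) spectra.
train = {
    "paddy": rng.normal(0.2, 0.05, (100, 5)),
    "building": rng.normal(0.6, 0.05, (100, 5)),
}

# Fit a multivariate Gaussian to each class: mean, inverse covariance,
# and log-determinant of the covariance (for the log-likelihood).
params = {}
for name, x in train.items():
    mu = x.mean(axis=0)
    cov = np.cov(x, rowvar=False)
    params[name] = (mu, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])

def classify(pixel):
    # Gaussian log-likelihood up to a constant: -0.5*(logdet + Mahalanobis^2)
    scores = {}
    for name, (mu, cov_inv, logdet) in params.items():
        d = pixel - mu
        scores[name] = -0.5 * (logdet + d @ cov_inv @ d)
    return max(scores, key=scores.get)

print(classify(rng.normal(0.2, 0.05, 5)))  # most likely "paddy"
```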
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
1. Comber, A.; Fisher, P.; Brunsdon, C.; Khmag, A. Spatial analysis of remote sensing image classification accuracy. Remote Sens. Environ. 2012, 127, 237–246.
2. Thorp, K.R.; Tian, L.F. A review on remote sensing of weeds in agriculture. Precis. Agric. 2004, 5, 477–508.
3. Wu, C.C. Study on Land Utilizes and Changes by Satellite Image Automation Classification. Master's Thesis, National Cheng Kung University, Tainan, Taiwan, 2007. Unpublished.
4. Lee, R.Y.; Ou, D.Y.; Hsu, C.H. Utilizing unmanned aerial vehicle images to interpret crop types in the hillside area. ISPRS J. Photogramm. Remote Sens. 2018, 23, 245–256.
5. Council of Agriculture, Executive Yuan. Agriculture Statistics Yearbook 2019; Taipei, Taiwan. Available online: https://agrstat.coa.gov.tw/sdweb/public/book/Book.aspx (accessed on 2 September 2020).
6. Liu, S.; Li, L.; Gao, W.; Zhang, Y.; Liu, Y.; Wang, S.; Lu, J. Diagnosis of nitrogen status in winter oilseed rape (Brassica napus L.) using in-situ hyperspectral data and unmanned aerial vehicle (UAV) multispectral images. Comput. Electron. Agric. 2018, 151, 185–195.
7. Torres-Sánchez, J.; Peña, J.M.; de Castro, A.I.; López-Granados, F. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric. 2014, 103, 104–113.
8. Ma, L.; Li, M.; Ma, X.; Cheng, L.; Du, P.; Liu, Y. A review of supervised object-based land-cover image classification. ISPRS J. Photogramm. Remote Sens. 2017, 130, 277–293.
9. Tian, Z.K.; Fu, Y.Y.; Liu, S.H.; Liu, F. Rapid crops classification based on UAV low-altitude remote sensing. Trans. Chin. Soc. Agric. Eng. 2013, 29, 109–116.
10. Bryson, M.; Reid, A.; Ramos, F.; Sukkarieh, S. Airborne vision-based mapping and classification of large farmland environments. J. Field Robot. 2015, 27, 632–655.
11. Yang, M.D.; Huang, K.S.; Kuo, Y.H.; Hui, T.; Lin, L.M. Spatial and spectral hybrid image classification for rice lodging assessment through UAV imagery. Remote Sens. 2017, 9, 583.
12. Lin, F.Y.; Chang, S.C.; Feng, Y.Y.; Chen, Y.W. Evaluation for Application of Sensing Technology to Monitor on Agricultural Loss. In Proceedings of the Symposium on Agricultural Engineering and Automation Project Achievements, Taipei, Taiwan, 1 March 2015.
13. Kuo, Y.H. Application of UAV Images to Cultivated Field Classification. Master's Thesis, National Chung Hsing University, Taichung, Taiwan, 2011. Unpublished.
14. Congalton, R.G. A review of assessing the accuracy of classifications of remotely sensed data. Remote Sens. Environ. 1991, 37, 35–46.
15. Al-Najjar, H.A.H.; Kalantar, B.; Pradhan, B.; Saeidi, V.; Halin, A.A.; Ueda, N.; Mansor, S. Land cover classification from fused DSM and UAV images using convolutional neural networks. Remote Sens. 2019, 11, 1461.
16. Michez, A.; Piégay, H.; Lisein, J.; Claessens, H.; Lejeune, P. Classification of riparian forest species and health condition using multi-temporal and hyperspatial imagery from unmanned aerial system. Environ. Monit. Assess. 2016, 188, 146.
17. Torres-Sánchez, J.; López-Granados, F.; Peña, J.M. An automatic object-based method for optimal thresholding in UAV images: Application for vegetation detection in herbaceous crops. Comput. Electron. Agric. 2015, 114, 43–52.
18. Hunt, E.R., Jr.; Hively, W.D.; Fujikawa, S.J.; Linden, D.S.; Daughtry, C.S.T.; McCarty, G.W. Acquisition of NIR-green-blue digital photographs from unmanned aircraft for crop monitoring. Remote Sens. 2010, 2, 290–305.
19. Kalantar, B.; Mansor, S.B.; Sameen, M.I.; Pradhan, B.; Shafri, H.Z.M. Drone-based land-cover mapping using a fuzzy unordered rule induction algorithm integrated into object-based image analysis. Int. J. Remote Sens. 2017, 38, 2535–2556.
20. Yang, M.D.; Huang, K.S.; Wan, J.; Tsai, H.P.; Lin, L.M. Timely and quantitative damage assessment of oyster racks using UAV images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 2862–2868.
21. Hubert-Moy, L.; Cotonnec, A.; Le Du, L.; Chardin, A.; Pérez, P. A comparison of parametric classification procedures of remotely sensed data applied on different landscape units. Remote Sens. Environ. 2001, 75, 174–187.
22. Peña-Barragán, J.M.; López-Granados, F.; García-Torres, L.; Jurado-Expósito, M.; Sánchez de La Orden, M.; García-Ferrer, A. Discriminating cropping systems and agro-environmental measures by remote sensing. Agron. Sustain. Dev. 2008, 28, 355–362.
23. Lillesand, T.M.; Kiefer, R.W. Remote Sensing and Image Interpretation, 5th ed.; John Wiley & Sons: New York, NY, USA, 2004.
24. Lin, G.C. Factors and integrated managements of rice blast disease in Yunlin, Chiayi, and Tainan region. Tainan Agric. News 2014, 87, 22–25.
25. Jovanovic, D.; Govedarica, M.; Sabo, F.; Bugarinovic, Z.; Novovic, O.; Beker, T.; Lauter, M. Land cover change detection by using remote sensing: A case study of Zlatibor (Serbia). Geogr. Pannonica 2015, 19, 162–173.
26. McGwire, K.C.; Fisher, P. Spatially Variable Thematic Accuracy: Beyond the Confusion Matrix. In Spatial Uncertainty in Ecology: Implications for Remote Sensing and GIS Applications; Hunsaker, C.T., Goodchild, M.F., Friedl, M.A., Case, T.J., Eds.; Springer: New York, NY, USA, 2001; pp. 308–329.
27. Congalton, R.G.; Green, K. Assessing the Accuracy of Remotely Sensed Data: Principles and Practices; Lewis Publishers: Boca Raton, FL, USA, 1999.
28. Liu, T.; Abd-Elrahman, A. Multi-view object-based classification of wetland land covers using unmanned aircraft system images. Remote Sens. Environ. 2018, 216, 122–138.
29. Feng, Q.; Liu, J.; Gong, J. UAV remote sensing for urban vegetation mapping using random forest and texture analysis. Remote Sens. 2015, 7, 1074–1094.
30. Baron, J.; Hill, D.J.; Elmiligi, H. Combining image processing and machine learning to identify invasive plants in high-resolution images. Int. J. Remote Sens. 2018, 39, 5099–5118.
31. Meyer, G.E.; Neto, J.C. Verification of color vegetation indices for automated crop imaging applications. Comput. Electron. Agric. 2008, 63, 282–293.
32. Zheng, H.; Cheng, T.; Zhou, M.; Li, D.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Improved estimation of rice aboveground biomass combining textural and spectral analysis of UAV imagery. Precis. Agric. 2019, 20, 611–629.
| Results \ Real Data | Type A | Type B | … | Type N | Total | User's Accuracy |
|---|---|---|---|---|---|---|
| Type A | $X_{11}$ | $X_{12}$ | … | $X_{1n}$ | $X_{1+}$ | $X_{11}/X_{1+}$ |
| Type B | $X_{21}$ | $X_{22}$ | … | $X_{2n}$ | $X_{2+}$ | $X_{22}/X_{2+}$ |
| Type N | $X_{n1}$ | $X_{n2}$ | … | $X_{nn}$ | $X_{n+}$ | $X_{nn}/X_{n+}$ |
| Total | $X_{+1}$ | $X_{+2}$ | … | $X_{+n}$ | $N$ | |
| Producer's Accuracy | $X_{11}/X_{+1}$ | $X_{22}/X_{+2}$ | … | $X_{nn}/X_{+n}$ | | |

Here, rows denote classification results, columns denote real (reference) data, $X_{i+}$ and $X_{+i}$ are the row and column totals, and $N$ is the total number of samples. The overall accuracy and Kappa value are computed as

$$\text{Overall accuracy} = \frac{\sum_{i=1}^{n} X_{ii}}{N}, \qquad \text{Kappa} = \frac{N\sum_{i=1}^{n} X_{ii} - \sum_{i=1}^{n} X_{i+}X_{+i}}{N^{2} - \sum_{i=1}^{n} X_{i+}X_{+i}}$$
| Image Composites | Overall Accuracy (17 August) | Kappa (17 August) | Overall Accuracy (28 September) | Kappa (28 September) |
|---|---|---|---|---|
| RGB | 74% | 0.546 | 78% | 0.359 |
| RGB + NIR | 80% | 0.725 | 83% | 0.514 |
| RGB + NIR + DSM | 88% | 0.857 | 90% | 0.881 |
| Image Composites | Overall Accuracy (7 September) | Kappa (7 September) | Overall Accuracy (5 October) | Kappa (5 October) |
|---|---|---|---|---|
| RGB | 76% | 0.448 | 88% | 0.668 |
| RGB + NIR | 82% | 0.763 | 89% | 0.714 |
| RGB + NIR + DSM | 88% | 0.875 | 95% | 0.903 |
| Image Composites | Overall Accuracy (13 September) | Kappa (13 September) | Overall Accuracy (26 October) | Kappa (26 October) |
|---|---|---|---|---|
| RGB | 74% | 0.555 | 62% | 0.382 |
| RGB + NIR | 80% | 0.745 | 81% | 0.715 |
| RGB + NIR + DSM | 85% | 0.814 | 87% | 0.863 |
| Image Composites | Overall Accuracy (6 September) | Kappa (6 September) | Overall Accuracy (15 October) | Kappa (15 October) |
|---|---|---|---|---|
| RGB | 86% | 0.783 | 76% | 0.649 |
| RGB + NIR | 91% | 0.907 | 82% | 0.798 |
| RGB + NIR + DSM | 94% | 0.935 | 88% | 0.863 |
| Image Composites | Overall Accuracy (10 August) | Kappa (10 August) | Overall Accuracy (21 September) | Kappa (21 September) |
|---|---|---|---|---|
| RGB | 60% | 0.387 | 83% | 0.745 |
| RGB + NIR | 85% | 0.814 | 94% | 0.900 |
| RGB + NIR + DSM | 92% | 0.891 | 96% | 0.948 |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).