A Field Weed Density Evaluation Method Based on UAV Imaging and Modified U-Net
Abstract
1. Introduction
- To develop a semantic weed segmentation algorithm based on deep learning;
- To design a weed density calculation and mapping method based on segmented UAV images.
2. Materials and Methods
2.1. Marigold Field Image Acquisition and Sample Preparation
2.2. Process of Weed Density Evaluation from UAV Images
2.3. Green Plant Segmentation Method
2.4. Crop Segmentation Network Structure
2.5. Modified U-Net Training
2.6. Crop Segmentation Network Performance Evaluation
2.7. Weed Density Calculation and Mapping
3. Results
3.1. Green Plant Segmentation Results
3.2. Training Process of the Modified U-Net
3.3. Comparison of Modified U-Net with State-of-the-Art Methods
3.4. Weed Mapping and Accuracy Evaluation Results
4. Discussion
5. Conclusions
- (1) The combination of the excess green minus excess red (ExG - ExR) index and the minimum error thresholding method can separate green plants from bare land, with a segmentation accuracy of 93.5% (a minimal sketch of this step is given after this list).
- (2) The proposed modified U-Net can effectively segment weeds and crops in UAV images, with a segmentation IoU of 93.40% and a single-image segmentation time of 40.90 ms.
- (3) Weed density in the field can be effectively evaluated from UAV images: the coefficient of determination (R²) was 0.94 and the root mean square error (RMSE) was 0.03 (a density-mapping sketch is also given after this list).
- (4) Weed density can thus be calculated and mapped by combining UAV imaging and image segmentation, and the resulting maps provide effective information for precise weed management and precision weeding.
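As referenced in conclusion (1), the following is a minimal sketch of the green-plant segmentation step: the ExG - ExR index (on chromatic r, g, b coordinates, ExG - ExR = 3g - 2.4r - b) thresholded with the Kittler-Illingworth minimum error criterion. The function names and histogram bin count are illustrative assumptions, not taken from the paper.

```python
# Sketch of green-plant vs. bare-land segmentation: ExG - ExR index
# plus minimum error (Kittler-Illingworth) thresholding.
import numpy as np

def exg_minus_exr(rgb):
    """ExG - ExR on chromatic coordinates: 3g - 2.4r - b."""
    rgb = rgb.astype(np.float64)
    s = rgb.sum(axis=2) + 1e-12                 # avoid division by zero
    r, g, b = rgb[..., 0] / s, rgb[..., 1] / s, rgb[..., 2] / s
    return (2 * g - r - b) - (1.4 * r - g)      # ExG minus ExR

def min_error_threshold(values, bins=256):
    """Kittler-Illingworth minimum error threshold on a 1-D sample."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    best_t, best_j = centers[0], np.inf
    for t in range(1, bins - 1):
        p1, p2 = p[:t].sum(), p[t:].sum()
        if p1 <= 0 or p2 <= 0:
            continue
        m1 = (p[:t] * centers[:t]).sum() / p1   # class means
        m2 = (p[t:] * centers[t:]).sum() / p2
        v1 = (p[:t] * (centers[:t] - m1) ** 2).sum() / p1  # class variances
        v2 = (p[t:] * (centers[t:] - m2) ** 2).sum() / p2
        if v1 <= 0 or v2 <= 0:
            continue
        # J(t) = 1 + 2(P1 ln s1 + P2 ln s2) - 2(P1 ln P1 + P2 ln P2)
        j = 1 + 2 * (p1 * np.log(np.sqrt(v1)) + p2 * np.log(np.sqrt(v2))) \
              - 2 * (p1 * np.log(p1) + p2 * np.log(p2))
        if j < best_j:
            best_j, best_t = j, centers[t]
    return best_t

def segment_green(rgb):
    """Binary mask: True for green plants, False for bare land."""
    index = exg_minus_exr(rgb)
    return index > min_error_threshold(index.ravel())
```

On a UAV image tile, `segment_green(tile)` would return the binary green-plant mask that the crop segmentation network then splits into crop and weed.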
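Conclusion (3) evaluates grid-level density estimates with R² and RMSE. The sketch below assumes weed pixels are the green pixels the network did not label as crop, and that density is the weed-pixel fraction per grid cell; the cell size and function names are illustrative, not the paper's exact protocol.

```python
# Sketch of weed density mapping and its accuracy evaluation.
import numpy as np

def weed_density_map(green_mask, crop_mask, cell=256):
    """Per-cell weed density: weed pixels / cell pixels."""
    weed = green_mask & ~crop_mask              # green but not crop
    h, w = weed.shape
    rows, cols = h // cell, w // cell
    density = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            block = weed[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
            density[i, j] = block.mean()        # fraction of weed pixels
    return density

def r2_rmse(pred, truth):
    """Coefficient of determination and root mean square error."""
    pred, truth = np.ravel(pred), np.ravel(truth)
    ss_res = np.sum((truth - pred) ** 2)
    ss_tot = np.sum((truth - truth.mean()) ** 2)
    return 1 - ss_res / ss_tot, np.sqrt(np.mean((truth - pred) ** 2))
```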
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Structure of the modified U-Net for crop segmentation (cf. Section 2.4):

Layer Name | Layer Type | Output Shape | Connected To |
---|---|---|---|
Input_1 | Input | (256, 256, 3) | None |
Convolution_1 | Conv2D | (256, 256, 64) | Input_1 |
Convolution_2 | Conv2D | (256, 256, 64) | Convolution_1 |
MaxPooling_1 | MaxPooling2D | (128, 128, 64) | Convolution_2 |
Convolution_3 | Conv2D | (128, 128, 128) | MaxPooling_1 |
Convolution_4 | Conv2D | (128, 128, 128) | Convolution_3 |
MaxPooling_2 | MaxPooling2D | (64, 64, 128) | Convolution_4 |
Convolution_5 | Dilated convolution | (64, 64, 256) | MaxPooling_2 |
Convolution_6 | Dilated convolution | (64, 64, 256) | Convolution_5 |
Convolution_7 | Dilated convolution | (64, 64, 256) | Convolution_6 |
MaxPooling_3 | MaxPooling2D | (32, 32, 256) | Convolution_7 |
Convolution_8 | Dilated convolution | (32, 32, 256) | MaxPooling_3 |
Convolution_9 | Dilated convolution | (32, 32, 256) | Convolution_8 |
Convolution_10 | Dilated convolution | (32, 32, 256) | Convolution_9 |
UpSampling_1 | UpSampling2D | (64, 64, 256) | Convolution_10 |
Concatenate_1 | Concatenate | (64, 64, 512) | UpSampling_1 & Convolution_7 |
Convolution_11 | Conv2D | (64, 64, 256) | Concatenate_1 |
UpSampling_2 | UpSampling2D | (128, 128, 256) | Convolution_11 |
Concatenate_2 | Concatenate | (128, 128, 384) | UpSampling_2 & Convolution_4 |
Convolution_12 | Conv2D | (128, 128, 256) | Concatenate_2 |
UpSampling_3 | UpSampling2D | (256, 256, 256) | Convolution_12 |
Concatenate_3 | Concatenate | (256, 256, 320) | UpSampling_3 & Convolution_2 |
Convolution_13 | Conv2D | (256, 256, 128) | Concatenate_3 |
Concatenate_4 | Concatenate | (256, 256, 131) | Convolution_13 & Input_1 |
Convolution_14 | Conv2D | (256, 256, 2) | Concatenate_4 |
Activation | Softmax | (256, 256, 2) | Convolution_14 |
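To make the table concrete, below is a hedged tf.keras reconstruction of the network it describes. The table fixes layer types, output shapes, and connections; the kernel sizes (3×3), dilation rate (2), ReLU activations, and the 1×1 final convolution are assumptions the table does not specify.

```python
# Sketch of the modified U-Net per the table above, in tf.keras.
from tensorflow.keras import layers, Model

def conv(x, filters, dilation=1):
    # 3x3 convolution preserving spatial size, as the output shapes imply
    return layers.Conv2D(filters, 3, padding="same",
                         dilation_rate=dilation, activation="relu")(x)

inputs = layers.Input((256, 256, 3))             # Input_1
c1 = conv(inputs, 64)                            # Convolution_1
c2 = conv(c1, 64)                                # Convolution_2
p1 = layers.MaxPooling2D()(c2)                   # MaxPooling_1
c3 = conv(p1, 128)                               # Convolution_3
c4 = conv(c3, 128)                               # Convolution_4
p2 = layers.MaxPooling2D()(c4)                   # MaxPooling_2
c5 = conv(p2, 256, dilation=2)                   # Convolution_5 (dilated)
c6 = conv(c5, 256, dilation=2)                   # Convolution_6 (dilated)
c7 = conv(c6, 256, dilation=2)                   # Convolution_7 (dilated)
p3 = layers.MaxPooling2D()(c7)                   # MaxPooling_3
c8 = conv(p3, 256, dilation=2)                   # Convolution_8 (dilated)
c9 = conv(c8, 256, dilation=2)                   # Convolution_9 (dilated)
c10 = conv(c9, 256, dilation=2)                  # Convolution_10 (dilated)
u1 = layers.UpSampling2D()(c10)                  # UpSampling_1
c11 = conv(layers.Concatenate()([u1, c7]), 256)  # Concatenate_1, Convolution_11
u2 = layers.UpSampling2D()(c11)                  # UpSampling_2
c12 = conv(layers.Concatenate()([u2, c4]), 256)  # Concatenate_2, Convolution_12
u3 = layers.UpSampling2D()(c12)                  # UpSampling_3
c13 = conv(layers.Concatenate()([u3, c2]), 128)  # Concatenate_3, Convolution_13
m4 = layers.Concatenate()([c13, inputs])         # Concatenate_4: (256, 256, 131)
c14 = layers.Conv2D(2, 1, padding="same")(m4)    # Convolution_14
outputs = layers.Activation("softmax")(c14)      # Softmax over 2 classes
model = Model(inputs, outputs)
```

Calling `model.summary()` should reproduce the output shapes listed in the table, including the 131-channel Concatenate_4 that reinjects the raw RGB input before the final classification layer.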
Segmentation performance compared with state-of-the-art methods (cf. Section 3.3):

Segmentation Method | IoU (%) | Accuracy (%) | Precision (%) | Recall (%) | Segmentation Time (ms) |
---|---|---|---|---|---|
Threshold | 71.49 | 85.97 | 64.25 | 92.02 | 0.24 |
Color texture and shape + SVM | 75.02 | 89.80 | 70.96 | 84.31 | 1.74 |
FCN | 68.78 | 90.89 | 54.61 | 77.15 | 40.79 |
SegNet | 84.43 | 96.69 | 74.64 | 72.21 | 41.43 |
U-Net | 92.33 | 98.62 | 82.43 | 80.55 | 44.24 |
Proposed algorithm | 93.40 | 98.84 | 84.29 | 80.85 | 40.90 |
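The four accuracy columns follow the standard pixel-wise definitions. A minimal sketch of how they can be computed from a predicted and a ground-truth binary weed mask (the paper's exact per-image averaging protocol is not reproduced here):

```python
# Standard pixel-wise segmentation metrics from binary masks (weed = 1).
import numpy as np

def segmentation_metrics(pred, truth):
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.sum(pred & truth)      # weed pixels correctly found
    fp = np.sum(pred & ~truth)     # background called weed
    fn = np.sum(~pred & truth)     # weed pixels missed
    tn = np.sum(~pred & ~truth)    # background correctly kept
    return {
        "IoU": tp / (tp + fp + fn),
        "Accuracy": (tp + tn) / (tp + tn + fp + fn),
        "Precision": tp / (tp + fp),
        "Recall": tp / (tp + fn),
    }
```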