Application of UAV RGB Images and Improved PSPNet Network to the Identification of Wheat Lodging Areas
Abstract
1. Introduction
2. Materials and Methods
2.1. Study Area
2.2. Data Acquisition and Preprocessing
- (1)
- Acquisition of UAV images. Wheat lodging images were collected at the grain-filling stage (4 May 2021) and the mature stage (24 May 2021). The flight platform was a DJI Phantom 4 Pro; the image acquisition equipment is shown in Figure 2. The onboard RGB sensor captures images in red, green and blue spectral channels at a resolution of 5472 × 3648 pixels. Flight altitudes of 20 m and 40 m were used. Before each UAV flight, the route was planned in the DJI GS Pro software: the flight area of the farm was selected first, the flight altitude was set to 20 m or 40 m, the flight speed to 2 m/s or 4 m/s, the forward overlap to 85%, the side overlap to 85%, and the shooting mode was set to capture photos at equal time intervals. Once all parameters were set, the UAV wheat imagery was acquired. Both acquisition days were sunny with a light breeze, meeting the requirements for UAV flight and remote sensing data acquisition.
- (2)
- Data annotation. The UAV wheat images acquired during the grain-filling period include data at the two flight altitudes of 20 m and 40 m, with original image sizes of 5472 × 3648 and 8049 × 5486 pixels, respectively. Owing to hardware constraints and to suit subsequent deep learning training, the images were uniformly cropped to the same size. Because the UAV images have high spatial resolution and clearly show the wheat lodging, Expert Visual Interpretation (EVI) was used to obtain the ground-truth lodging areas. LabelMe [19], an open annotation tool created by the MIT Computer Science and Artificial Intelligence Laboratory (MIT CSAIL), was chosen to manually label the data. It is a JavaScript tool for online image annotation; compared with traditional annotation tools, it can be used anywhere and does not require installing software or copying large datasets onto a local computer. The lodging areas were marked in LabelMe to generate JSON files, which were then converted to PNG images to prepare the training data for the deep learning network. As shown in Figure 3, the red areas represent lodging and the black areas represent non-lodging.
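The flight parameters above determine the ground sample distance (GSD) and image footprint at each altitude. A minimal sketch of that calculation follows; the Phantom 4 Pro sensor figures used (13.2 mm sensor width, 8.8 mm focal length) are nominal published specifications assumed for illustration, not values stated in this paper.

```python
# GSD sketch for the two flight altitudes used in the study.
# Sensor specs below are assumed nominal Phantom 4 Pro values.
SENSOR_WIDTH_MM = 13.2
FOCAL_LENGTH_MM = 8.8
IMAGE_WIDTH_PX = 5472

def gsd_cm_per_px(altitude_m: float) -> float:
    """Ground sample distance in cm/pixel at a given flight altitude."""
    return (SENSOR_WIDTH_MM * altitude_m * 100) / (FOCAL_LENGTH_MM * IMAGE_WIDTH_PX)

def footprint_width_m(altitude_m: float) -> float:
    """Ground width covered by a single image."""
    return gsd_cm_per_px(altitude_m) * IMAGE_WIDTH_PX / 100

for h in (20, 40):
    print(f"{h} m: GSD ≈ {gsd_cm_per_px(h):.2f} cm/px, "
          f"swath ≈ {footprint_width_m(h):.1f} m")
```

Under these assumed specs, doubling the altitude from 20 m to 40 m exactly doubles the GSD, which is why the 40 m imagery shows slightly lower segmentation accuracy in the later experiments.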
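The JSON-to-PNG conversion step can be sketched as rasterizing each annotated polygon into a binary mask. The pure-Python ray-casting version below is illustrative only: the polygon is made up, and in practice LabelMe's own `json_to_dataset` utility (or an image library) performs this conversion on the `shapes` list of the exported JSON.

```python
# Sketch: rasterize one LabelMe-style lodging polygon into a 0/1 mask.
def point_in_polygon(x, y, poly):
    """Ray-casting point-in-polygon test; poly is a list of (x, y) vertices."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the horizontal ray at y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def polygon_to_mask(poly, width, height):
    """Rasterize a lodging polygon into a row-major 2D 0/1 mask."""
    return [[1 if point_in_polygon(x + 0.5, y + 0.5, poly) else 0
             for x in range(width)] for y in range(height)]

# Hypothetical 8 x 8 tile containing a square lodging region
mask = polygon_to_mask([(2, 2), (6, 2), (6, 6), (2, 6)], 8, 8)
print(sum(map(sum, mask)))  # prints 16 lodging pixels
```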
3. Methodology
3.1. Technical Workflow
3.2. NAM Attention Module
3.3. Structure of the NAM-PSPNet
- (1)
- The lightweight neural network MobileNetV2 replaces ResNet as the backbone feature extraction network, and depthwise separable convolutions replace the standard convolutions: depthwise separable convolutions with a stride of 1 are used at the beginning and end of the feature extraction network, and depthwise separable convolutions with a stride of 2 are used three consecutive times in the middle layers. This reduces the number of model parameters and the amount of computation, improving segmentation speed. Residual connections are introduced to retain features that the original PSPNet may overlook.
- (2)
- Construct a pyramid pooling structure with multi-dimensional feature fusion. Borrowing the multi-dimensional feature cascade of U-Net [23], multi-scale features are further extracted.
- (3)
- Add the NAM attention module to recognize less obvious features, thereby compressing the model, reducing computation and improving segmentation accuracy.
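The parameter saving behind step (1) comes from factorizing each standard convolution into a depthwise and a pointwise part. A worked count under illustrative channel sizes (not the paper's exact layers) shows the effect:

```python
# Parameter counts: standard 3x3 convolution vs. its depthwise
# separable factorization (biases omitted). Channel sizes are
# illustrative, not taken from the NAM-PSPNet architecture.
def standard_conv_params(c_in, c_out, k=3):
    return k * k * c_in * c_out

def depthwise_separable_params(c_in, c_out, k=3):
    depthwise = k * k * c_in   # one k x k filter per input channel
    pointwise = c_in * c_out   # 1x1 convolution mixes channels
    return depthwise + pointwise

c_in, c_out = 128, 256
std = standard_conv_params(c_in, c_out)
sep = depthwise_separable_params(c_in, c_out)
print(std, sep, round(std / sep, 1))  # roughly an 8-9x reduction for k=3
```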
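Step (2)'s pyramid pooling can be sketched in numpy as averaging the feature map into several grid sizes, upsampling each grid back to the input resolution and concatenating along the channel axis. This is a toy illustration of the PSPNet pooling scheme on random features, not the trained network:

```python
import numpy as np

# Toy PSPNet-style pyramid pooling: pool to 1x1, 2x2, 3x3 and 6x6 grids,
# upsample (nearest neighbour) and fuse with the original features.
def adaptive_avg_pool(feat, bins):
    """feat: (C, H, W) -> (C, bins, bins) by averaging rectangular regions."""
    c, h, w = feat.shape
    out = np.zeros((c, bins, bins))
    for i in range(bins):
        for j in range(bins):
            hs, he = i * h // bins, (i + 1) * h // bins
            ws, we = j * w // bins, (j + 1) * w // bins
            out[:, i, j] = feat[:, hs:he, ws:we].mean(axis=(1, 2))
    return out

def pyramid_pool(feat, bin_sizes=(1, 2, 3, 6)):
    c, h, w = feat.shape
    branches = [feat]
    for b in bin_sizes:
        pooled = adaptive_avg_pool(feat, b)
        up = pooled.repeat(h // b, axis=1).repeat(w // b, axis=2)  # nearest upsample
        branches.append(up)
    return np.concatenate(branches, axis=0)  # channel-wise fusion

feat = np.random.rand(4, 6, 6)
print(pyramid_pool(feat).shape)  # (20, 6, 6): original + 4 pooled branches
```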
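The NAM module in step (3) reweights channels using the batch-normalization scale factors, which indicate each channel's variance contribution [21]. A minimal numpy sketch of that channel-attention idea follows; the gamma values are made up here, whereas in the network they are learned BN parameters:

```python
import numpy as np

# NAM-style channel attention sketch: channels are reweighted by
# gamma_i / sum(gamma) and gated with a sigmoid. Illustrative only.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def nam_channel_attention(feat, gamma):
    """feat: (C, H, W); gamma: (C,) batch-norm scale factors."""
    weights = gamma / gamma.sum()                  # normalization-based weights
    gate = sigmoid(weights[:, None, None] * feat)  # per-channel gating
    return feat * gate

feat = np.random.rand(3, 4, 4)
gamma = np.array([0.5, 1.5, 3.0])
out = nam_channel_attention(feat, gamma)
print(out.shape)  # same shape as the input: (3, 4, 4)
```

Because the weights come from parameters the network already learns, this attention adds essentially no extra parameters, which is why it suits a lightweight model.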
4. Training of NAM-PSPNet and Evaluation Metrics
4.1. Image-Label Datasets
4.2. Model Training
4.3. Evaluation Metrics
4.4. Comparison Models
5. Results and Discussion
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Zhang, H.-J.; Li, T.; Liu, H.-W.; Mai, C.-Y.; Yu, G.-J.; Li, H.-L.; Yu, L.-Q.; Meng, L.-Z.; Jian, D.-W.; Yang, L.; et al. Genetic progress in stem lodging resistance of the dominant wheat cultivars adapted to Yellow-Huai River Valleys winter wheat zone in China since 1964. J. Integr. Agric. 2020, 19, 438–448.
- Shewry, P.R.; Hey, S.J. The contribution of wheat to human diet and health. Food Energy Secur. 2015, 4, 178–202.
- Shah, L.; Yahya, M.; Shah, S.M.A.; Nadeem, M.; Ali, A.; Ali, A.; Wang, J.; Riaz, M.W.; Rehman, S.; Wu, W.; et al. Improving lodging resistance: Using wheat and rice as classical examples. Int. J. Mol. Sci. 2019, 20, 4211.
- Zhao, X.; Yuan, Y.; Song, M.; Ding, Y.; Lin, F.; Liang, D.; Zhang, D. Use of unmanned aerial vehicle imagery and deep learning UNet to extract rice lodging. Sensors 2019, 19, 3859.
- Yang, M.D.; Boubin, J.G.; Tsai, H.P.; Tseng, H.H.; Hsu, Y.C.; Stewart, C.C. Adaptive autonomous UAV scouting for rice lodging assessment using edge computing with deep learning EDANet. Comput. Electron. Agric. 2020, 179, 105817.
- Liu, T.; Li, R.; Zhong, X.; Jiang, M.; Jin, X.; Zhou, P.; Liu, S.; Sun, C.; Guo, W. Estimates of rice lodging using indices derived from UAV visible and thermal infrared images. Agric. For. Meteorol. 2018, 252, 144–154.
- Burkart, A.; Aasen, H.; Alonso, L.; Menz, G.; Bareth, G.; Rascher, U. Angular dependency of hyperspectral measurements over wheat characterized by a novel UAV based goniometer. Remote Sens. 2015, 7, 725–746.
- Li, Z.; Chen, Z.; Ren, G.; Li, Z.; Wang, X. Estimation of maize lodging area based on Worldview-2 image. Trans. Chin. Soc. Agric. Eng. 2016, 32, 1–5.
- Dai, X.; Chen, S.; Jia, K.; Jiang, H.; Sun, Y.; Li, D.; Zheng, Q.; Huang, J. A decision-tree approach to identifying paddy rice lodging with multiple pieces of polarization information derived from Sentinel-1. Remote Sens. 2022, 15, 240.
- Sun, Y.; Liu, P.; Zhang, Y.; Song, C.; Zhang, D.; Ma, X. Extraction of winter wheat planting area in Weifang based on Sentinel-2A remote sensing image. J. Chin. Agric. Mech. 2022, 43, 98–105.
- Tang, Z.; Sun, Y.; Wan, G.; Zhang, K.; Shi, H.; Zhao, Y.; Chen, S.; Zhang, X. Winter wheat lodging area extraction using deep learning with GaoFen-2 satellite imagery. Remote Sens. 2022, 14, 4887.
- Gao, L.; Yang, G.; Yu, H.; Xu, B.; Zhao, X.; Dong, J.; Ma, Y. Winter wheat leaf area index retrieval based on UAV hyperspectral remote sensing. Trans. Chin. Soc. Agric. Eng. 2016, 32, 113–120.
- Tang, Z.; Sun, Y.; Wan, G.; Zhang, K.; Shi, H.; Zhao, Y.; Chen, S.; Zhang, X. A quantitative monitoring method for determining maize lodging in different growth stages. Remote Sens. 2020, 12, 3149.
- Zhang, D.; Ding, Y.; Chen, P.; Zhang, X.; Pan, Z.; Liang, D. Automatic extraction of wheat lodging area based on transfer learning method and DeepLabv3+ network. Comput. Electron. Agric. 2020, 179, 105845.
- Zheng, E.G.; Tian, Y.F.; Chen, T. Region extraction of corn lodging in UAV images based on deep learning. J. Henan Agric. Sci. 2018, 47, 155–160.
- Mardanisamani, S.; Maleki, F.; Kassani, S.H.; Rajapaksa, S.; Duddu, H.; Wang, M.; Shirtliffe, S.; Ryu, S.; Josuttes, A.; Zhang, T.; et al. Crop lodging prediction from UAV-acquired images of wheat and canola using a DCNN augmented with handcrafted texture features. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, Long Beach, CA, USA, 16–20 June 2019.
- Yang, B.; Zhu, Y.; Zhou, S. Accurate wheat lodging extraction from multi-channel UAV images using a lightweight network model. Sensors 2021, 21, 6826.
- Varela, S.; Pederson, T.L.; Leakey, A.D.B. Implementing spatio-temporal 3D-convolution neural networks and UAV time series imagery to better predict lodging damage in sorghum. Remote Sens. 2022, 14, 733.
- Torralba, A.; Russell, B.C.; Yuen, J. LabelMe: Online image annotation and applications. Proc. IEEE 2010, 98, 1467–1484.
- Woo, S.; Park, J.; Lee, J.Y.; Kweon, I.S. CBAM: Convolutional block attention module. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 3–19.
- Liu, Y.; Shao, Z.; Teng, Y.; Hoffmann, N. NAM: Normalization-based attention module. arXiv 2021, arXiv:2111.12419.
- Zhao, H.; Shi, J.; Qi, X.; Wang, X.; Jia, J. Pyramid scene parsing network. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 2881–2890.
- Ronneberger, O.; Fischer, P.; Brox, T. U-Net: Convolutional networks for biomedical image segmentation. In Medical Image Computing and Computer-Assisted Intervention–MICCAI 2015: 18th International Conference, Munich, Germany, 5–9 October 2015; Proceedings, Part III; Springer International Publishing: Cham, Switzerland, 2015; pp. 234–241.
| Parameter | Value |
|---|---|
| Batch_size | 32 |
| Learning_rate | 5 × 10⁻³ |
| Input_shape | 256 × 256 |
| Num_classes | 2 |
| Epoch | 150 |
| Approach | Growth Period | MPA (%) | MIoU (%) | Accuracy (%) | Precision (%) | Recall (%) |
|---|---|---|---|---|---|---|
| U-Net | Grain-filling | 84.19 | 72.91 | 89.95 | 84.46 | 79.52 |
| U-Net | Maturity | 90.79 | 83.81 | 91.60 | 91.58 | 92.77 |
| SegNet | Grain-filling | 81.88 | 71.25 | 85.33 | 82.48 | 76.83 |
| SegNet | Maturity | 83.84 | 72.70 | 89.29 | 83.22 | 77.59 |
| DeepLabv3+ | Grain-filling | 88.87 | 81.81 | 91.66 | 85.36 | 80.55 |
| DeepLabv3+ | Maturity | 91.44 | 84.07 | 92.01 | 86.43 | 81.40 |
| NAM-PSPNet | Grain-filling | 93.73 | 88.96 | 94.12 | 93.78 | 94.82 |
| NAM-PSPNet | Maturity | 94.62 | 89.32 | 94.95 | 94.30 | 95.43 |
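The MPA and MIoU figures in the tables are computed from the per-class confusion matrix of the predicted segmentation. A short sketch of those two metrics for the two-class (lodging / non-lodging) case follows; the confusion-matrix values are hypothetical, not the paper's results:

```python
import numpy as np

# Mean pixel accuracy (MPA) and mean IoU (MIoU) from a 2x2 confusion
# matrix for the non-lodging / lodging classes. Values are made up.
def mpa_miou(cm):
    """cm[i, j] = number of pixels of true class i predicted as class j."""
    per_class_acc = cm.diagonal() / cm.sum(axis=1)
    iou = cm.diagonal() / (cm.sum(axis=1) + cm.sum(axis=0) - cm.diagonal())
    return per_class_acc.mean(), iou.mean()

cm = np.array([[900, 100],   # true non-lodging pixels
               [ 50, 450]])  # true lodging pixels
mpa, miou = mpa_miou(cm)
print(f"MPA = {mpa:.4f}, MIoU = {miou:.4f}")
```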
| Flight Altitude | Evaluation Metric | U-Net | SegNet | DeepLabv3+ | NAM-PSPNet |
|---|---|---|---|---|---|
| 20 m | MPA (%) | 90.79 | 81.88 | 91.44 | 94.62 |
| 20 m | MIoU (%) | 83.81 | 71.25 | 84.07 | 89.32 |
| 40 m | MPA (%) | 89.65 | 83.49 | 89.81 | 94.35 |
| 40 m | MIoU (%) | 82.64 | 72.19 | 81.18 | 88.96 |
| Flight Altitude | Evaluation Metric | PSPNet | NAM-PSPNet |
|---|---|---|---|
| 20 m | MPA (%) | 92.15 | 94.62 |
| 20 m | MIoU (%) | 85.33 | 89.32 |
| 40 m | MPA (%) | 92.07 | 94.35 |
| 40 m | MIoU (%) | 85.28 | 88.96 |
Zhao, J.; Li, Z.; Lei, Y.; Huang, L. Application of UAV RGB Images and Improved PSPNet Network to the Identification of Wheat Lodging Areas. Agronomy 2023, 13, 1309. https://doi.org/10.3390/agronomy13051309