Economic Fruit Forest Classification Based on Improved U-Net Model in UAV Multispectral Imagery
Abstract
1. Introduction
2. Materials and Methods
2.1. Study Area
2.2. Data
2.2.1. Data Source
2.2.2. Data Processing
2.3. Methods
2.3.1. Sample Set Construction and Sample Labeling
2.3.2. U-Net Model Improvement
2.3.3. Experimental Environment
3. Results
3.1. Comparative Analysis of Improved U-Net and Classical Network Model
3.2. Comparative Analysis of Improved U-Net Optimization Model
3.3. Analysis of the Influence of Network Parameters on Classification Accuracy
3.4. Classification and Mapping of Economic Fruit Forests in the Study Area
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
Flight Parameter | Value |
---|---|
Resolution ratio | 5 cm/pix |
Flight altitude | 95 m |
Heading overlap rate | 70% |
Lateral overlap rate | 50% |
Flight speed | 5 m/s |
Number of control points | 521 |
Flight distance | 10,669.0 m |
Land-Cover Class | Grape | Peach | Apple | Cherry | Corn | Other Trees | Other Features |
---|---|---|---|---|---|---|---|
RGB value | (255, 0, 255) | (255, 0, 120) | (255, 0, 0) | (255, 150, 0) | (255, 255, 0) | (0, 255, 0) | (0, 0, 0) |
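The RGB codes above define the palette used to annotate the sample labels. A minimal sketch of converting such RGB-coded label images into per-pixel class indices for training follows; the class index order and the file name are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from PIL import Image

# Label palette from the table above; the 0-6 index order is an assumption.
PALETTE = {
    (255, 0, 255): 0,  # grape
    (255, 0, 120): 1,  # peach
    (255, 0, 0): 2,    # apple
    (255, 150, 0): 3,  # cherry
    (255, 255, 0): 4,  # corn
    (0, 255, 0): 5,    # other trees
    (0, 0, 0): 6,      # other features
}

def rgb_mask_to_class_ids(path):
    """Convert an RGB-coded label image into a 2-D array of class indices."""
    rgb = np.array(Image.open(path).convert("RGB"))
    ids = np.full(rgb.shape[:2], 6, dtype=np.uint8)  # default: other features
    for color, cls in PALETTE.items():
        ids[np.all(rgb == np.array(color), axis=-1)] = cls
    return ids

# Hypothetical usage:
# label_ids = rgb_mask_to_class_ids("tile_0001_label.png")
```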
Item | Hardware and Software Configuration |
---|---|
CPU | Intel(R) Xeon(R) Gold 5218 CPU @ 2.30 GHz |
GPU | Nvidia GeForce RTX 2080 Ti (10 GB) × 2 |
RAM | 128 GB |
Operating system | Linux 4.15.0 |
Computing platform | CUDA 10.1 + cuDNN 7.6.5 |
Development language | Python 3.7.10 |
Deep learning framework | TensorFlow 1.14.0 + Keras 2.3.1 |
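A quick sanity check of this stack can be run as follows; this is a generic snippet, not part of the authors' pipeline.

```python
import tensorflow as tf
import keras

# Verify the environment listed above (TensorFlow 1.14.0 + Keras 2.3.1, GPU).
print("TensorFlow:", tf.__version__)
print("Keras:", keras.__version__)
print("GPU available:", tf.test.is_gpu_available())
```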
Model | Pixel Accuracy/% | Mean Intersection over Union/% | Frequency Weighted Intersection over Union/% | Kappa Coefficient |
---|---|---|---|---|
FCN | 81.47 | 55.74 | 70.36 | 0.75 |
SegNet | 81.35 | 55.86 | 70.02 | 0.75 |
U-Net | 84.54 | 61.78 | 74.18 | 0.79 |
ISDU-Net | 87.73 | 70.68 | 78.69 | 0.84 |
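These metrics are computed from the class confusion matrix; the formulas below are the standard definitions, shown for reference under the assumption that the paper uses the conventional forms. Let $p_{ij}$ be the number of pixels of true class $i$ predicted as class $j$, $k$ the number of classes, and $N=\sum_{i}\sum_{j} p_{ij}$ the total pixel count:

$$\mathrm{PA}=\frac{\sum_{i=1}^{k} p_{ii}}{N},\qquad \mathrm{mIoU}=\frac{1}{k}\sum_{i=1}^{k}\frac{p_{ii}}{\sum_{j} p_{ij}+\sum_{j} p_{ji}-p_{ii}}$$

$$\mathrm{FWIoU}=\sum_{i=1}^{k}\frac{\sum_{j} p_{ij}}{N}\cdot\frac{p_{ii}}{\sum_{j} p_{ij}+\sum_{j} p_{ji}-p_{ii}},\qquad \kappa=\frac{\mathrm{PA}-p_e}{1-p_e},\quad p_e=\frac{1}{N^{2}}\sum_{i=1}^{k}\Big(\sum_{j} p_{ij}\Big)\Big(\sum_{j} p_{ji}\Big)$$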
Per-class IoU (%)

Model | Cherry | Grape | Corn | Apple | Peach | Other Trees | Other Features
---|---|---|---|---|---|---|---
FCN | 14.93 | 86.34 | 70.12 | 59.31 | 68.45 | 23.30 | 67.73 |
SegNet | 17.21 | 85.18 | 68.72 | 60.31 | 69.14 | 23.33 | 67.11 |
U-Net | 29.99 | 88.28 | 71.27 | 75.03 | 71.72 | 26.41 | 69.73 |
ISDU-Net | 56.84 | 90.16 | 78.52 | 77.56 | 78.83 | 39.48 | 73.40 |
Per-class F1 Score (%)

Model | Cherry | Grape | Corn | Apple | Peach | Other Trees | Other Features
---|---|---|---|---|---|---|---
FCN | 25.98 | 92.67 | 82.43 | 74.46 | 81.26 | 37.80 | 80.76 |
SegNet | 29.36 | 92.00 | 81.46 | 75.24 | 81.76 | 37.84 | 80.32 |
U-Net | 46.14 | 93.78 | 83.23 | 85.74 | 83.53 | 41.79 | 82.17 |
ISDU-Net | 72.48 | 94.82 | 87.97 | 87.36 | 88.16 | 56.61 | 84.64 |
FCN confusion matrix (true class in rows, predicted class in columns)

True \ Predicted | Cherry | Grape | Corn | Apple | Peach | Other Trees | Other Features
---|---|---|---|---|---|---|---
Cherry | 38,219 | 267 | 913 | 720 | 81,827 | 4699 | 42,638
Grape | 407 | 3,314,878 | 3307 | 2599 | 15,911 | 8567 | 226,266
Corn | 104 | 3439 | 276,079 | 28 | 4024 | 54 | 19,895
Apple | 86,931 | 26,528 | 1813 | 662,252 | 48,523 | 64,535 | 161,605
Peach | 58,386 | 1523 | 8052 | 78,333 | 1,619,909 | 120,330 | 233,845
Other trees | 1905 | 73,132 | 587 | 14,166 | 47,808 | 165,018 | 163,304
Other features | 19,293 | 195,321 | 71,182 | 40,802 | 125,900 | 94,931 | 2,999,177
SegNet confusion matrix (true class in rows, predicted class in columns)

True \ Predicted | Cherry | Grape | Corn | Apple | Peach | Other Trees | Other Features
---|---|---|---|---|---|---|---
Cherry | 59,256 | 545 | 603 | 652 | 78,320 | 1748 | 28,159
Grape | 2902 | 3,267,104 | 4973 | 6047 | 12,683 | 8289 | 269,937
Corn | 214 | 2813 | 280,070 | 492 | 4577 | 381 | 15,076
Apple | 89,391 | 30,007 | 3447 | 687,355 | 51,259 | 28,299 | 162,429
Peach | 56,256 | 372 | 1969 | 91,978 | 1,664,241 | 76,622 | 228,940
Other trees | 9107 | 24,563 | 2107 | 18,820 | 78,901 | 142,616 | 189,806
Other features | 21,319 | 168,981 | 101,801 | 42,857 | 132,575 | 91,114 | 2,987,959
U-Net confusion matrix (true class in rows, predicted class in columns)

True \ Predicted | Cherry | Grape | Corn | Apple | Peach | Other Trees | Other Features
---|---|---|---|---|---|---|---
Cherry | 56,454 | 1210 | 69 | 386 | 79,624 | 2018 | 29,522
Grape | 122 | 3,412,449 | 1236 | 811 | 16,672 | 14,784 | 190,924
Corn | 0 | 2606 | 275,121 | 0 | 4028 | 596 | 21,272
Apple | 30,963 | 26,964 | 9211 | 887,943 | 30,259 | 22,641 | 128,466
Peach | 43,470 | 167 | 13,050 | 50,699 | 1,703,473 | 75,686 | 166,763
Other trees | 458 | 64 | 392 | 5806 | 31,764 | 164,794 | 99,069
Other features | 13,551 | 203,362 | 103,461 | 41,259 | 161,487 | 73,342 | 3,028,692
ISDU-Net confusion matrix (true class in rows, predicted class in columns)

True \ Predicted | Cherry | Grape | Corn | Apple | Peach | Other Trees | Other Features
---|---|---|---|---|---|---|---
Cherry | 152,100 | 432 | 28 | 0 | 32,632 | 1597 | 30,933
Grape | 30 | 3,479,393 | 343 | 457 | 11,155 | 20,037 | 119,805
Corn | 794 | 1364 | 348,203 | 1030 | 3327 | 178 | 33,876
Apple | 14,116 | 16,363 | 5299 | 931,750 | 28,536 | 8999 | 91,152
Peach | 18,198 | 569 | 3369 | 45,988 | 1,723,641 | 53,381 | 138,054
Other trees | 0 | 426 | 40 | 782 | 24,953 | 227,871 | 187,493
Other features | 16,738 | 218,791 | 45,609 | 56,823 | 102,671 | 51,385 | 2,984,775
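The per-class IoU and F1 scores reported in the earlier tables can be reproduced from confusion matrices such as these. A minimal Python sketch using the standard formulas follows; the function name and the small example matrix are illustrative, not from the paper.

```python
import numpy as np

def per_class_iou_f1(cm):
    """Per-class IoU and F1 from a confusion matrix cm, where cm[i, j] is the
    number of pixels of true class i predicted as class j."""
    cm = np.asarray(cm, dtype=np.float64)
    tp = np.diag(cm)                 # correctly classified pixels per class
    fn = cm.sum(axis=1) - tp         # true class i predicted as another class
    fp = cm.sum(axis=0) - tp         # other classes predicted as class i
    iou = tp / (tp + fp + fn)
    f1 = 2 * tp / (2 * tp + fp + fn)
    return iou, f1

# Hypothetical 3-class example:
# iou, f1 = per_class_iou_f1([[50, 2, 3], [4, 40, 6], [1, 2, 60]])
```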
Model | Pixel Accuracy/% | Mean Intersection over Union/% | Frequency Weighted Intersection over Union/% | Kappa Coefficient |
---|---|---|---|---|
U-Net | 84.54 | 61.78 | 74.18 | 0.79 |
INCU-Net | 86.18 | 67.99 | 76.30 | 0.82 |
SEU-Net | 86.06 | 66.18 | 76.29 | 0.81 |
DeepU-Net | 86.20 | 66.59 | 76.16 | 0.82 |
ISDU-Net | 87.73 | 70.68 | 78.69 | 0.84 |
Per-class IoU (%)

Model | Cherry | Grape | Corn | Apple | Peach | Other Trees | Other Features
---|---|---|---|---|---|---|---
U-Net | 29.99 | 88.28 | 71.27 | 75.03 | 71.72 | 26.41 | 69.73 |
INCU-Net | 54.37 | 87.94 | 74.48 | 75.76 | 75.56 | 36.43 | 71.36 |
SEU-Net | 45.08 | 89.35 | 74.36 | 71.99 | 76.77 | 34.54 | 71.20 |
DeepU-Net | 48.02 | 88.27 | 75.74 | 76.25 | 75.56 | 30.91 | 71.39 |
ISDU-Net | 56.84 | 90.16 | 78.52 | 77.56 | 78.83 | 39.48 | 73.40 |
Per-class F1 Score (%)

Model | Cherry | Grape | Corn | Apple | Peach | Other Trees | Other Features
---|---|---|---|---|---|---|---
U-Net | 46.14 | 93.78 | 83.23 | 85.74 | 83.53 | 41.79 | 82.17 |
INCU-Net | 70.44 | 93.58 | 85.37 | 86.21 | 86.08 | 53.40 | 83.29 |
SEU-Net | 62.15 | 94.37 | 85.30 | 83.71 | 86.86 | 51.35 | 83.18 |
DeepU-Net | 64.89 | 93.77 | 86.18 | 86.53 | 86.08 | 47.23 | 83.30 |
ISDU-Net | 72.48 | 94.82 | 87.97 | 87.36 | 88.16 | 56.61 | 84.64 |
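Judging by the variant names, INCU-Net, SEU-Net, and DeepU-Net presumably add Inception-style multi-scale convolution, squeeze-and-excitation (SE) channel attention, and a deeper backbone to U-Net, respectively, with ISDU-Net combining them; this is an inference from the names, not a statement of the authors' exact design. As an illustration of one such building block, here is a minimal Keras (2.3.1-style) sketch of a standard SE block that could be attached to a U-Net feature map; the reduction ratio and placement are assumptions.

```python
from keras import backend as K
from keras import layers

def se_block(x, ratio=16):
    """Standard squeeze-and-excitation channel attention, shown only as an
    illustration of the kind of block an SE-augmented U-Net might insert."""
    channels = K.int_shape(x)[-1]
    s = layers.GlobalAveragePooling2D()(x)               # squeeze: global context
    s = layers.Dense(channels // ratio, activation="relu")(s)
    s = layers.Dense(channels, activation="sigmoid")(s)  # excitation: channel weights
    s = layers.Reshape((1, 1, channels))(s)
    return layers.Multiply()([x, s])                      # recalibrate feature map
```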
Initial Learning Rate | Pixel Accuracy/% | Mean Intersection over Union/% | Frequency Weighted Intersection over Union/% | Kappa Coefficient |
---|---|---|---|---|
0.01 | 80.44 | 61.03 | 69.28 | 0.74 |
0.001 | 85.78 | 69.47 | 77.54 | 0.83 |
0.0001 | 87.73 | 70.68 | 78.69 | 0.84 |
0.00001 | 86.38 | 66.71 | 76.36 | 0.82 |
Batch Size | Pixel Accuracy/% | Mean Intersection over Union/% | Frequency Weighted Intersection over Union/% | Kappa Coefficient |
---|---|---|---|---|
4 | 85.56 | 65.62 | 75.10 | 0.81 |
6 | 86.18 | 69.44 | 77.54 | 0.83 |
8 | 87.73 | 70.68 | 78.69 | 0.84 |
10 | 85.97 | 68.63 | 76.11 | 0.83 |
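Taken together, the two tables indicate that an initial learning rate of 0.0001 combined with a batch size of 8 gave the best accuracy. A minimal Keras sketch of applying that setting follows; the model, data arrays, loss, and epoch count are hypothetical placeholders, not the authors' configuration.

```python
from keras.optimizers import Adam

# Best-performing setting from the two tables above.
optimizer = Adam(lr=1e-4)
BATCH_SIZE = 8

# Hypothetical usage; `model` and the training arrays are placeholders:
# model.compile(optimizer=optimizer, loss="categorical_crossentropy", metrics=["accuracy"])
# model.fit(x_train, y_train, batch_size=BATCH_SIZE, validation_data=(x_val, y_val))
```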
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).