Research on Segmentation Method of Maize Seedling Plant Instances Based on UAV Multispectral Remote Sensing Images
Abstract
1. Introduction
2. Materials and Methods
2.1. Data Acquisition
2.2. Data Preprocessing
2.3. Dataset Construction
2.4. The SAM Model
2.4.1. Image Encoder
2.4.2. Prompt Encoder
2.4.3. Mask Decoder
2.5. The YOLOv8 Model
2.5.1. Input
2.5.2. Backbone Module
- (A) CBS convolutional module
- (B) C2f module
- (C) SPPF module
2.5.3. Neck Module
2.5.4. Head Module
2.6. Evaluation Indicators
2.7. Experimental Environment
3. Experiments and Analysis of Results
3.1. Comparison and Analysis of Different Instance Segmentation Models
3.2. Analysis of Overcrowded Seedling Plants
4. Discussion
4.1. Effect of Different Parametric Quantities on Segmentation Accuracy
4.2. Generalization Experiment
4.3. Model Performance at Different Spatial Resolutions
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
Method | Average Annotation Time per Plant (s)
---|---
Manual annotation | 28.43
SAM-based semi-automatic annotation | 5.5
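The table compares per-plant annotation cost for the two labeling workflows. A minimal sketch of the implied speedup, using only the timings reported in the table (the calculation itself is our illustration, not from the paper):

```python
# Per-plant annotation times taken from the table above (seconds).
manual_s = 28.43
sam_assisted_s = 5.5

# Speedup factor of SAM-based semi-automatic annotation over manual labeling.
speedup = manual_s / sam_assisted_s
print(f"SAM-assisted annotation is about {speedup:.1f}x faster")  # ~5.2x
```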
Dataset | Class | Instances | Small Targets | Medium Targets | Large Targets
---|---|---|---|---|---
Train Set | Maize | 19,310 | 1620 | 931 | 16,759
Val Set | Maize | 4754 | 389 | 197 | 4168
Test Set | Maize | 29,226 | 2278 | 1360 | 25,588
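The small/medium/large split above follows, we assume, the COCO area convention (small < 32² px, medium 32²–96² px, large > 96² px); the paper's exact thresholds are not restated here. A minimal sketch of that bucketing:

```python
def coco_size_bucket(area_px: float) -> str:
    """Classify an instance by mask area using the COCO convention
    (assumed here; the paper's thresholds may differ)."""
    if area_px < 32 ** 2:       # < 1024 px
        return "small"
    if area_px < 96 ** 2:       # 1024-9216 px
        return "medium"
    return "large"

# Hypothetical maize-plant mask areas (pixels), one per bucket.
areas = [500, 5000, 20000]
print([coco_size_bucket(a) for a in areas])  # ['small', 'medium', 'large']
```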
Parameter Name | Parameter Value | Parameter Name | Parameter Value |
---|---|---|---|
Epochs | 200 | Batch Size | 2 |
Momentum | 0.937 | Weight_Decay | 0.0005 |
Model | Band | Box mAP50 (val) | Box mAP50–95 (val) | Seg mAP50 (val) | Seg mAP50–95 (val) | Parameters (MB)
---|---|---|---|---|---|---
YOLOv8m | NRG | 0.952 | 0.794 | 0.94 | 0.618 | 27.240
YOLOv8m | NER | 0.951 | 0.793 | 0.94 | 0.615 | 27.240
YOLOv8m | RGB | 0.949 | 0.769 | 0.932 | 0.567 | 27.240
YOLOv5m | NRG | 0.952 | 0.788 | 0.942 | 0.611 | 26.531
YOLOv5m | NER | 0.952 | 0.786 | 0.941 | 0.609 | 26.531
YOLOv5m | RGB | 0.947 | 0.755 | 0.933 | 0.56 | 26.531
PointRend | NRG | 0.918 | 0.665 | 0.897 | 0.524 | 55.755
PointRend | NER | 0.918 | 0.666 | 0.906 | 0.524 | 55.755
PointRend | RGB | 0.909 | 0.631 | 0.895 | 0.463 | 55.755
Mask Scoring R-CNN | NRG | 0.906 | 0.639 | 0.891 | 0.499 | 60.230
Mask Scoring R-CNN | NER | 0.913 | 0.638 | 0.892 | 0.498 | 60.230
Mask Scoring R-CNN | RGB | 0.904 | 0.6 | 0.88 | 0.435 | 60.230
Mask R-CNN | NRG | 0.910 | 0.637 | 0.895 | 0.511 | 43.971
Mask R-CNN | NER | 0.915 | 0.640 | 0.902 | 0.515 | 43.971
Mask R-CNN | RGB | 0.903 | 0.596 | 0.884 | 0.454 | 43.971
Cascade Mask R-CNN | NRG | 0.805 | 0.619 | 0.782 | 0.437 | 77.021
Cascade Mask R-CNN | NER | 0.777 | 0.605 | 0.764 | 0.429 | 77.021
Cascade Mask R-CNN | RGB | 0.766 | 0.576 | 0.752 | 0.379 | 77.021
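The mAP50 and mAP50–95 columns are standard COCO-style metrics: average precision at an IoU threshold of 0.50, and AP averaged over thresholds from 0.50 to 0.95 in steps of 0.05. A minimal mask-IoU sketch (NumPy; our own illustration, not the paper's evaluation code):

```python
import numpy as np

def mask_iou(pred: np.ndarray, gt: np.ndarray) -> float:
    """Intersection-over-union between two boolean instance masks."""
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return float(inter) / float(union) if union else 0.0

# Toy 4x4 masks: pred overlaps gt on 2 of 3 foreground pixels each.
gt = np.zeros((4, 4), dtype=bool); gt[0, 0:3] = True
pred = np.zeros((4, 4), dtype=bool); pred[0, 1:4] = True
print(mask_iou(pred, gt))  # intersection 2 / union 4 = 0.5

# IoU thresholds averaged for mAP50-95 (COCO convention): 0.50, 0.55, ..., 0.95.
thresholds = np.arange(0.50, 1.00, 0.05)
print(len(thresholds))  # 10
```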
Model | Band | Box mAP50 (val) | Box mAP50–95 (val) | Seg mAP50 (val) | Seg mAP50–95 (val) | Speed (ms) | Parameters (MB) | FLOPs (G)
---|---|---|---|---|---|---|---|---
YOLOv8n | NRG | 0.942 | 0.75 | 0.928 | 0.576 | 11.7 | 3.263 | 12.1
YOLOv8n | NER | 0.941 | 0.751 | 0.928 | 0.579 | 12.4 | 3.263 | 12.1
YOLOv8n | RGB | 0.937 | 0.718 | 0.919 | 0.524 | 14.1 | 3.263 | 12.1
YOLOv8s | NRG | 0.947 | 0.778 | 0.937 | 0.604 | 20.9 | 11.790 | 42.7
YOLOv8s | NER | 0.949 | 0.777 | 0.935 | 0.601 | 23.4 | 11.790 | 42.7
YOLOv8s | RGB | 0.946 | 0.744 | 0.931 | 0.551 | 22.3 | 11.790 | 42.7
YOLOv8m | NRG | 0.952 | 0.794 | 0.94 | 0.618 | 30.8 | 27.240 | 110.4
YOLOv8m | NER | 0.951 | 0.793 | 0.94 | 0.615 | 30.0 | 27.240 | 110.4
YOLOv8m | RGB | 0.949 | 0.769 | 0.932 | 0.567 | 31.1 | 27.240 | 110.4
YOLOv8l | NRG | 0.954 | 0.807 | 0.941 | 0.623 | 36.2 | 45.937 | 220.8
YOLOv8l | NER | 0.953 | 0.806 | 0.943 | 0.622 | 36.1 | 45.937 | 220.8
YOLOv8l | RGB | 0.95 | 0.778 | 0.936 | 0.575 | 38.6 | 45.937 | 220.8
YOLOv8x | NRG | 0.952 | 0.811 | 0.942 | 0.624 | 59.9 | 71.752 | 344.5
YOLOv8x | NER | 0.953 | 0.812 | 0.943 | 0.626 | 59.6 | 71.752 | 344.5
YOLOv8x | RGB | 0.952 | 0.79 | 0.94 | 0.584 | 58.8 | 71.752 | 344.5
Model | Box mAP50 (test) | Box mAP50–95 (test) | Seg mAP50 (test) | Seg mAP50–95 (test) | Speed (ms) | FLOPs (G) | Parameters (MB)
---|---|---|---|---|---|---|---
YOLOv8m | 0.795 | 0.585 | 0.788 | 0.462 | 25.2 | 110.4 | 27.240
YOLOv5m | 0.772 | 0.554 | 0.763 | 0.428 | 26.1 | 95.4 | 26.531
PointRend | 0.73 | 0.428 | 0.714 | 0.348 | 55.5 | 90.299 | 55.755
Mask Scoring R-CNN | 0.725 | 0.411 | 0.698 | 0.327 | 47.6 | 183.296 | 60.230
Mask R-CNN | 0.701 | 0.389 | 0.682 | 0.322 | 66.7 | 145.408 | 43.971
Cascade Mask R-CNN | 0.602 | 0.387 | 0.582 | 0.285 | 43.5 | 240.64 | 77.021
Model | Band | GSD (cm/pixel) | Box mAP50 (val) | Box mAP50–95 (val) | Seg mAP50 (val) | Seg mAP50–95 (val)
---|---|---|---|---|---|---
YOLOv8m | NRG | 0.400 | 0.952 | 0.794 | 0.94 | 0.618
YOLOv8m | NRG | 0.444 | 0.952 | 0.794 | 0.945 | 0.62
YOLOv8m | NRG | 0.500 | 0.95 | 0.784 | 0.938 | 0.61
YOLOv8m | NRG | 0.533 | 0.948 | 0.783 | 0.939 | 0.605
YOLOv8m | NRG | 0.571 | 0.949 | 0.778 | 0.938 | 0.604
YOLOv8m | NRG | 0.615 | 0.944 | 0.773 | 0.935 | 0.598
YOLOv8m | NRG | 0.800 | 0.94 | 0.75 | 0.931 | 0.573
YOLOv8m | NRG | 1.000 | 0.933 | 0.722 | 0.921 | 0.548
YOLOv8m | NRG | 1.143 | 0.925 | 0.696 | 0.913 | 0.526
YOLOv8m | NRG | 1.333 | 0.914 | 0.668 | 0.897 | 0.494
YOLOv8m | NRG | 1.600 | 0.901 | 0.623 | 0.864 | 0.442
YOLOv8m | NRG | 2.000 | 0.876 | 0.562 | 0.793 | 0.356
YOLOv8m | NRG | 2.667 | 0.823 | 0.464 | 0.616 | 0.232
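The GSD column relates flight altitude, sensor pixel pitch, and lens focal length via GSD = H · p / f. A sketch of that relation with hypothetical sensor parameters (the altitude, pitch, and focal length below are illustrative, not the paper's platform):

```python
def gsd_cm_per_px(altitude_m: float, pixel_pitch_um: float, focal_mm: float) -> float:
    """Ground sample distance GSD = H * p / f, returned in cm/pixel.

    Units are converted to centimetres before dividing:
    altitude m -> cm, pixel pitch um -> cm, focal length mm -> cm.
    """
    return (altitude_m * 100.0) * (pixel_pitch_um * 1e-4) / (focal_mm * 0.1)

# Hypothetical example: 30 m altitude, 3.0 um pixel pitch, 8 mm focal length.
print(round(gsd_cm_per_px(30, 3.0, 8.0), 3), "cm/pixel")
```

Halving the flight altitude halves the GSD, which is why the table's finer resolutions correspond to lower (simulated) flying heights.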
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Geng, T.; Yu, H.; Yuan, X.; Ma, R.; Li, P. Research on Segmentation Method of Maize Seedling Plant Instances Based on UAV Multispectral Remote Sensing Images. Plants 2024, 13, 1842. https://doi.org/10.3390/plants13131842