Modeling the Effect of Vegetation Coverage on Unmanned Aerial Vehicles-Based Object Detection: A Study in the Minefield Environment
Abstract
1. Introduction
1.1. UAV-Based Object Detection
1.2. The Effect of Vegetation and Occlusion on Object Detection
1.3. Landmines and Cluster Munitions
1.4. Motivation
2. Materials and Methods
2.1. The Recall—Vegetated Occlusion Relationship
2.2. Synthetic Vegetation Growth
2.3. Extracting Vegetation Height and Cover from UAV Imagery
2.4. Vegetation Height and Cover Verification
3. Results
3.1. Effect of Occlusion on Recall
Explained Variance Based on Occlusion Factors
3.2. Empirical Recall with Occlusion
3.3. Vegetation Height and Cover Error
4. Discussion
4.1. Interpretation of Results
4.1.1. Effect of Occlusion on Recall
4.1.2. Empirical Recall with Occlusion
4.1.3. Vegetation Height and Cover Error
4.2. Limitations
4.3. Vegetation Height Model Approaches
4.4. Improving Robustness of Object Detection from Vegetation Occlusion
4.5. Application to Humanitarian Mine Action
4.6. Broader Applications and Implications
5. Conclusions
- Developing a novel algorithm to extract vegetation height and foliar cover from a UAV-derived Digital Surface Model.
- Generating synthetic grass growth over an object to quantify the effect of occlusion with increasing foliar cover on detection rates of small objects in the natural environment.
- Developing an occlusion-based vegetation uncertainty model that combines (1) and (2) to create a “detectability” map over an orthomosaic for a deep learning object detection model.
- Applying the uncertainty model in (3) to a real-world test case for UAV-based landmine detection.
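The chain from points (1) to (3) can be sketched in a few lines of Python. This is an illustrative sketch only, not the authors' implementation: the bare-ground surface is approximated here by a sliding local minimum of the DSM, the 4 cm threshold stands in for the height of a small target object, and the exponential cover-to-detectability mapping is a hypothetical stand-in for the empirical relationship fitted from the synthetic grass-growth experiments.

```python
import numpy as np

def vegetation_height(dsm, window=5):
    # Approximate the bare-ground surface with a sliding local minimum of
    # the Digital Surface Model, then take vegetation height as DSM minus
    # ground. (Illustrative only; the paper's algorithm may differ.)
    h, w = dsm.shape
    r = window // 2
    ground = np.empty_like(dsm)
    for i in range(h):
        for j in range(w):
            ground[i, j] = dsm[max(0, i - r):i + r + 1,
                               max(0, j - r):j + r + 1].min()
    return dsm - ground

def foliar_cover(heights, threshold_cm=4.0):
    # Fraction of pixels where vegetation is tall enough to occlude an
    # object of the given height (4 cm here, matching a small munition).
    return float((heights >= threshold_cm).mean())

def detectability(cover, k=8.0):
    # Hypothetical monotone mapping from occluding foliar cover to expected
    # recall; in the paper this curve is fitted empirically from synthetic
    # vegetation growth over annotated objects.
    return float(np.exp(-k * cover))
```

Applied per tile of an orthomosaic, the last function yields the "detectability" map described in point (3); only the general shape (recall falling as occluding cover rises) is taken from the paper.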
Supplementary Materials
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| | Environment | Hv Mean (cm) | Hv Stdv (cm) | Hv 0.95 (cm) | Hv 0.05 (cm) | Fcover ≥ 4 cm Mean | Fcover ≥ 4 cm Std | Detectability (%) Mean for 4 cm Object | Detectability (%) Std for 4 cm Object | Number of UXO Detected | Total Number of UXO |
|---|---|---|---|---|---|---|---|---|---|---|---|
| (A) | Tarp | 0.88 | 0.50 | 1.43 | 0.44 | 0.61 | 2.74 | 99.12 | 0.63 | 71 | 75 |
| (B) | Dirt | 0.74 | 0.88 | 2.02 | 0.14 | 1.36 | 3.34 | 99.11 | 0.80 | 95 | 100 |
| (C) | Gravel | 1.22 | 1.14 | 2.88 | 0.55 | 1.62 | 3.16 | 99.09 | 0.70 | 60 | 75 |
| (D) | Salt flats | 1.09 | 1.36 | 3.75 | 0.14 | 3.81 | 5.39 | 99.02 | 1.01 | 25 | 30 |
| (E) | Farm field | 6.24 | 1.54 | 8.73 | 4.21 | 4.24 | 2.53 | 98.39 | 0.72 | 60 | 72 |
| (F) | Grass + dirt | 3.33 | 3.94 | 10.88 | 0.39 | 5.59 | 7.72 | 97.79 | 6.48 | 77 | 100 |
| (G) | Grass + dirt (2) | 3.41 | 3.89 | 11.22 | 0.41 | 7.83 | 9.89 | 96.72 | 8.92 | 83 | 99 |
| (H) | Flower field | 13.25 | 8.87 | 32.21 | 5.01 | 20.04 | 12.89 | 81.02 | 23.36 | 65 | 100 |
| (I) | High vegetation | 12.38 | 9.51 | 31.19 | 2.29 | 19.6 | 17.64 | 77.91 | 29.9 | 55 | 100 |
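The modeled detectability in each row can be sanity-checked against the raw detection counts from the same row. The short Python sketch below computes empirical recall as detected UXO over total UXO (a reasonable reading of the last two columns, though the paper's exact evaluation protocol is not restated here) and prints it next to the modeled value; the numbers are copied from the table above.

```python
def empirical_recall(detected, total):
    # Detection rate (recall) as a percentage.
    return 100.0 * detected / total

# (environment, UXO detected, total UXO, modeled detectability %) per table row
rows = [
    ("Tarp", 71, 75, 99.12),
    ("Dirt", 95, 100, 99.11),
    ("Gravel", 60, 75, 99.09),
    ("Salt flats", 25, 30, 99.02),
    ("Farm field", 60, 72, 98.39),
    ("Grass + dirt", 77, 100, 97.79),
    ("Grass + dirt (2)", 83, 99, 96.72),
    ("Flower field", 65, 100, 81.02),
    ("High vegetation", 55, 100, 77.91),
]

for env, hit, total, modeled in rows:
    print(f"{env:18s} empirical recall {empirical_recall(hit, total):5.1f}%  "
          f"modeled detectability {modeled:5.1f}%")
```

Both columns fall together as vegetation height and cover increase, consistent with the occlusion-based uncertainty model, though the modeled detectability sits above the empirical recall in the sparse-vegetation environments.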
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Baur, J.; Dewey, K.; Steinberg, G.; Nitsche, F.O. Modeling the Effect of Vegetation Coverage on Unmanned Aerial Vehicles-Based Object Detection: A Study in the Minefield Environment. Remote Sens. 2024, 16, 2046. https://doi.org/10.3390/rs16122046