Comparison of Different Analytical Strategies for Classifying Invasive Wetland Vegetation in Imagery from Unpiloted Aerial Systems (UAS)
Abstract
1. Introduction
2. Materials and Methods
2.1. UAS, Study Site, & Data Collection
2.2. Image Processing
2.3. Image Subsetting
2.4. CHM vs. DSM for Vegetation Structure
2.5. Preliminary Classification, Image Masking, & Layer Stacking
2.6. Reference & Data Extraction
2.7. Pixel-Based Classification Approach
2.8. Object-Based Classification Approach
2.9. Accuracy Assessment and Model Performance
3. Results
3.1. Very-High (3 cm) vs. High-Resolution (11 cm) Pixels
3.2. Pixel- vs. Object-Based Classification Approach
3.3. Multispectral Only vs. Multispectral plus Structural Layers
3.4. Predicted Vegetation Classes Sampled to Quadrat Data
4. Discussion
4.1. Very-High (3 cm) vs. High-Resolution (11 cm) Pixels
4.2. Pixel- vs. Object-Based Classification Approach
4.3. Multispectral Only vs. Multispectral plus Structural Layers
4.4. Importance of UAS Flight Parameterization and Spectral Resolution of UAS Sensor
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Hu, S.; Niu, Z.; Chen, Y.; Li, L.; Zhang, H. Global wetlands: Potential distribution, wetland loss, and status. Sci. Total Environ. 2017, 586, 319–327.
- Houlahan, J.E.; Findlay, C.S. Effect of Invasive Plant Species on Temperate Wetland Plant Diversity. Conserv. Biol. 2004, 18, 1132–1138.
- Zhu, B.; Ottaviani, C.C.; Naddafi, R.; Dai, Z.; Du, D. Invasive European frogbit (Hydrocharis morsus-ranae L.) in North America: An updated review 2003–16. J. Plant Ecol. 2018, 11, 17–25.
- Lishawa, S.C.; Carson, B.D.; Brandt, J.S.; Tallant, J.M.; Reo, N.J.; Albert, D.A.; Monks, A.M.; Lautenbach, J.M.; Clark, E. Mechanical Harvesting Effectively Controls Young Typha spp. Invasion and Unmanned Aerial Vehicle Data Enhance Post-Treatment Monitoring. Front. Plant Sci. 2017, 8, 619.
- Michez, A.; Piégay, H.; Jonathan, L.; Claessens, H.; Lejeune, P. Mapping of riparian invasive species with supervised classification of Unmanned Aerial System (UAS) imagery. Int. J. Appl. Earth Obs. Geoinf. 2016, 44, 88–94.
- Müllerová, J.; Brůna, J.; Bartaloš, T.; Dvořák, P.; Vítková, M.; Pyšek, P. Timing Is Important: Unmanned Aircraft vs. Satellite Imagery in Plant Invasion Monitoring. Front. Plant Sci. 2017, 8, 887.
- Abeysinghe, T.; Simic Milas, A.; Arend, K.; Hohman, B.; Reil, P.; Gregory, A.; Vázquez-Ortega, A. Mapping Invasive Phragmites australis in the Old Woman Creek Estuary Using UAV Remote Sensing and Machine Learning Classifiers. Remote Sens. 2019, 11, 1380.
- Martin, F.-M.; Müllerová, J.; Borgniet, L.; Dommanget, F.; Breton, V.; Evette, A. Using Single- and Multi-Date UAV and Satellite Imagery to Accurately Monitor Invasive Knotweed Species. Remote Sens. 2018, 10, 1662.
- Lippitt, C.D.; Zhang, S. The impact of small unmanned airborne platforms on passive optical remote sensing: A conceptual perspective. Int. J. Remote Sens. 2018, 39, 4852–4868.
- Sandino, J.; Gonzalez, F.; Mengersen, K.; Gaston, K.J. UAVs and Machine Learning Revolutionising Invasive Grass and Vegetation Surveys in Remote Arid Lands. Sensors 2018, 18, 605.
- Al-Ali, Z.M.; Abdullah, M.M.; Asadalla, N.B.; Gholoum, M. A comparative study of remote sensing classification methods for monitoring and assessing desert vegetation using a UAV-based multispectral sensor. Environ. Monit. Assess. 2020, 192, 389.
- Huang, C.; Asner, G.P. Applications of Remote Sensing to Alien Invasive Plant Studies. Sensors 2009, 9, 4869–4889.
- Hruska, R.; Mitchell, J.; Anderson, M.; Glenn, N.F. Radiometric and Geometric Analysis of Hyperspectral Imagery Acquired from an Unmanned Aerial Vehicle. Remote Sens. 2012, 4, 2736–2752.
- Van Iersel, W.; Straatsma, M.; Middelkoop, H.; Addink, E. Multitemporal Classification of River Floodplain Vegetation Using Time Series of UAV Images. Remote Sens. 2018, 10, 1144.
- Blaschke, T. Object based image analysis for remote sensing. ISPRS J. Photogramm. Remote Sens. 2010, 65, 2–16.
- Ghosh, A.; Joshi, P.K. A comparison of selected classification algorithms for mapping bamboo patches in lower Gangetic plains using very high resolution WorldView 2 imagery. Int. J. Appl. Earth Obs. Geoinf. 2014, 26, 298–311.
- Liu, D.; Xia, F. Assessing object-based classification: Advantages and limitations. Remote Sens. Lett. 2010, 1, 187–194.
- Alvarez-Taboada, F.; Paredes, C.; Julián-Pelaz, J. Mapping of the Invasive Species Hakea sericea Using Unmanned Aerial Vehicle (UAV) and WorldView-2 Imagery and an Object-Oriented Approach. Remote Sens. 2017, 9, 913.
- Bolch, E.A.; Hestir, E.L. Using Hyperspectral UAS Imagery to Monitor Invasive Plant Phenology. In Proceedings of the Optical Sensors and Sensing Congress (ES, FTS, HISE, Sensors), Washington, DC, USA, 22–26 June 2019; p. HTu4C.3.
- Dronova, I. Object-Based Image Analysis in Wetland Research: A Review. Remote Sens. 2015, 7, 6380–6413.
- Warner, T. Kernel-Based Texture in Remote Sensing Image Classification. Geogr. Compass 2011, 5, 781–798.
- Prošek, J.; Šímová, P. UAV for mapping shrubland vegetation: Does fusion of spectral and vertical information derived from a single sensor increase the classification accuracy? Int. J. Appl. Earth Obs. Geoinf. 2019, 75, 151–162.
- Storey, E.A. Mapping plant growth forms using structure-from-motion data combined with spectral image derivatives. Remote Sens. Lett. 2020, 11, 426–435.
- Hestir, E.L.; Khanna, S.; Andrew, M.E.; Santos, M.J.; Viers, J.H.; Greenberg, J.A.; Rajapakse, S.S.; Ustin, S.L. Identification of invasive vegetation using hyperspectral remote sensing in the California Delta ecosystem. Remote Sens. Environ. 2008, 112, 4034–4047.
- Rebelo, A.J.; Somers, B.; Esler, K.J.; Meire, P. Can wetland plant functional groups be spectrally discriminated? Remote Sens. Environ. 2018, 210, 25–34.
- Matese, A.; Gennaro, S.F.D.; Berton, A. Assessment of a canopy height model (CHM) in a vineyard using UAV-based multispectral imaging. Int. J. Remote Sens. 2017, 38, 2150–2160.
- Viljanen, N.; Honkavaara, E.; Näsi, R.; Hakala, T.; Niemeläinen, O.; Kaivosoja, J. A Novel Machine Learning Method for Estimating Biomass of Grass Swards Using a Photogrammetric Canopy Height Model, Images and Vegetation Indices Captured by a Drone. Agriculture 2018, 8, 70.
- Martínez-Carricondo, P.; Agüera-Vega, F.; Carvajal-Ramírez, F.; Mesas-Carrascosa, F.-J.; García-Ferrer, A.; Pérez-Porras, F.-J. Assessment of UAV-photogrammetric mapping accuracy based on variation of ground control points. Int. J. Appl. Earth Obs. Geoinf. 2018, 72, 1–10.
- Seifert, E.; Seifert, S.; Vogt, H.; Drew, D.; van Aardt, J.; Kunneke, A.; Seifert, T. Influence of Drone Altitude, Image Overlap, and Optical Sensor Resolution on Multi-View Reconstruction of Forest Images. Remote Sens. 2019, 11, 1252.
- Flores-de-Santiago, F.; Valderrama-Landeros, L.; Rodríguez-Sobreyra, R.; Flores-Verdugo, F. Assessing the effect of flight altitude and overlap on orthoimage generation for UAV estimates of coastal wetlands. J. Coast. Conserv. 2020, 24, 35.
- Dong, Y.; Yan, H.; Wang, N.; Huang, M.; Hu, Y. Automatic Identification of Shrub-Encroached Grassland in the Mongolian Plateau Based on UAS Remote Sensing. Remote Sens. 2019, 11, 1623.
- Monks, A.M.; Lishawa, S.C.; Wellons, K.C.; Albert, D.A.; Mudrzynski, B.; Wilcox, D.A. European frogbit (Hydrocharis morsus-ranae) invasion facilitated by non-native cattails (Typha) in the Laurentian Great Lakes. J. Great Lakes Res. 2019, 45, 912–920.
- Lishawa, S.C.; Albert, D.A.; Tuchman, N.C. Water level decline promotes Typha × glauca establishment and vegetation change in Great Lakes coastal wetlands. Wetlands 2010, 30, 1085–1096.
- Bansal, S.; Lishawa, S.C.; Newman, S.; Tangen, B.A.; Wilcox, D.; Albert, D.; Anteau, M.J.; Chimney, M.J.; Cressey, R.L.; DeKeyser, E.; et al. Typha (Cattail) invasion in North American wetlands: Biology, regional problems, impacts, ecosystem services, and management. Wetlands 2019, 39, 645–684.
- Singh, K.K.; Frazier, A.E. A meta-analysis and review of unmanned aircraft system (UAS) imagery for terrestrial applications. Int. J. Remote Sens. 2018, 39, 5078–5098.
- Dandois, J.P.; Ellis, E.C. Remote Sensing of Vegetation Structure Using Computer Vision. Remote Sens. 2010, 2, 1157–1176.
- Dunnington, D.; Harvey, P. Exifr: EXIF Image Data in R. R Package Version 0.3.1. 2019. Available online: https://CRAN.R-project.org/package=exifr (accessed on 1 June 2020).
- Kolarik, N.; Gaughan, A.E.; Stevens, F.R.; Pricope, N.G.; Woodward, K.; Cassidy, L.; Salerno, J.; Hartter, J. A multi-plot assessment of vegetation structure using a micro-unmanned aerial system (UAS) in a semi-arid savanna environment. ISPRS J. Photogramm. Remote Sens. 2020, 164, 84–96.
- LaRue, E.A.; Atkins, J.W.; Dahlin, K.; Fahey, R.; Fei, S.; Gough, C.; Hardiman, B.S. Linking Landsat to terrestrial LiDAR: Vegetation metrics of forest greenness are correlated with canopy structural complexity. Int. J. Appl. Earth Obs. Geoinf. 2018, 73, 420–427.
- Hardiman, B.S.; LaRue, E.A.; Atkins, J.W.; Fahey, R.T.; Wagner, F.W.; Gough, C.M. Spatial Variation in Canopy Structure across Forest Landscapes. Forests 2018, 9, 474.
- Congalton, R.G. A review of assessing the accuracy of classifications of remotely sensed data. Remote Sens. Environ. 1991, 37, 35–46.
- Kuhn, M. Caret: Classification and Regression Training. R Package Version 6.0-86. 2020. Available online: https://CRAN.R-project.org/package=caret (accessed on 1 June 2020).
- Gonçalves, J.; Pôças, I.; Marcos, B.; Mücher, C.A.; Honrado, J.P. SegOptim—A new R package for optimizing object-based image analyses of high-spatial resolution remotely-sensed data. Int. J. Appl. Earth Obs. Geoinf. 2019, 76, 218–230.
- Mohammadi, R.; Oshowski, B.; Monks, A.; Lishawa, S. Constructing a Habitat Suitability Model for Hydrocharis morsus-ranae. 2021; manuscript in preparation.
- Sun, J.; Li, H.; Fujita, H.; Fu, B.; Ai, W. Class-imbalanced dynamic financial distress prediction based on Adaboost-SVM ensemble combined with SMOTE and time weighting. Inf. Fusion 2020, 54, 128–144.
- Miraki, M.; Sohrabi, H.; Fatehi, P.; Kneubuehler, M. Individual tree crown delineation from high-resolution UAV images in broadleaf forest. Ecol. Inform. 2021, 61, 101207.
- Wicaksono, P.; Aryaguna, P.A. Analyses of inter-class spectral separability and classification accuracy of benthic habitat mapping using multispectral image. Remote Sens. Appl. Soc. Environ. 2020, 19, 100335.
- Boon, M.A.; Greenfield, R.; Tesfamichael, S. Wetland Assessment using unmanned aerial vehicle (UAV) photogrammetry. In Proceedings of the ISPRS—International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Prague, Czech Republic, 12–19 July 2016; Volume XLI-B1, pp. 781–788.
- Santos, M.J.; Khanna, S.; Hestir, E.L.; Greenberg, J.A.; Ustin, S.L. Measuring landscape-scale spread and persistence of an invaded submerged plant community from airborne remote sensing. Ecol. Appl. 2016, 26, 1733–1744.
| Parrot Sequoia/Pix4D-Generated Band or Layer | Spectral Band Range (nm)/Structural or Textural Metric | Spatial Resolution (cm) | Vertical Range (m) |
|---|---|---|---|
| Green | 530–570 | 3, 11 | NA |
| Red | 640–680 | 3, 11 | NA |
| Red Edge | 730–740 | 3, 11 | NA |
| NIR | 770–810 | 3, 11 | NA |
| NDVI (included in the Multispectral category below) | Index ranging from 0 to 1 | 3, 11 | NA |
| Digital Surface Model | Surface height (m) | 3, 11 | 178.50–192.30 (3 cm dataset); 177.97–194.21 (11 cm dataset) |
| Digital Surface Model Rugosity | SD of surface height of the 8 neighboring pixels | 3, 11 | NA |
| Canopy Height Model | Canopy height (m) | 15, 55 | −1.15–11.90 (15 cm dataset); −2.33–24.30 (55 cm dataset) |
| Canopy Height Model Rugosity | SD of canopy height of the 8 neighboring pixels | 15, 55 | NA |
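The structural and textural layers listed above can be derived from the photogrammetric surface products in a few lines of R. The sketch below is a minimal illustration, not the authors' exact workflow: it assumes the raster package, hypothetical file names (dsm.tif, dtm.tif), that the canopy height model is the difference between the surface and terrain models, and approximates the "SD of the 8 neighboring pixels" rugosity with a 3 × 3 focal standard deviation.

```r
# Minimal sketch (assumptions: R 'raster' package, hypothetical file names).
library(raster)

dsm <- raster("dsm.tif")   # Pix4D digital surface model
dtm <- raster("dtm.tif")   # terrain/water-surface model (assumed available)

# Canopy height model: surface height minus terrain height (can be negative over water)
chm <- dsm - dtm

# Rugosity: standard deviation of height within a 3 x 3 neighborhood,
# approximating the 8-neighbor SD described in the table above
rugosity_dsm <- focal(dsm, w = matrix(1, 3, 3), fun = sd)
rugosity_chm <- focal(chm, w = matrix(1, 3, 3), fun = sd)
```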
| Classification Approach | Pixel Resolution | Dataset (with Pixel Resolution in cm) |
|---|---|---|
| Pixel-Based | High-Resolution | Multispectral Only (11 cm) |
| Pixel-Based | High-Resolution | Multispectral + CHM + Rugosity (55 cm) |
| Pixel-Based | High-Resolution | Multispectral + DSM + Rugosity (11 cm) |
| Pixel-Based | Very-High Resolution | Multispectral Only (3 cm) |
| Pixel-Based | Very-High Resolution | Multispectral + CHM + Rugosity (15 cm) |
| Pixel-Based | Very-High Resolution | Multispectral + DSM + Rugosity (3 cm) |
| Object-Based | High-Resolution | Multispectral Only (11 cm) |
| Object-Based | High-Resolution | Multispectral + CHM + Rugosity (55 cm) |
| Object-Based | High-Resolution | Multispectral + DSM + Rugosity (11 cm) |
| Object-Based | Very-High Resolution | Multispectral Only (3 cm) |
| Object-Based | Very-High Resolution | Multispectral + CHM + Rugosity (15 cm) |
| Object-Based | Very-High Resolution | Multispectral + DSM + Rugosity (3 cm) |
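To make the dataset combinations above concrete, the sketch below assembles one of them (Multispectral + DSM + Rugosity) as a single layer stack. It is a minimal illustration under assumed file names and bilinear resampling, using the R raster package, rather than the authors' exact processing chain.

```r
# Minimal sketch of assembling the "Multispectral + DSM + Rugosity" dataset
# (assumptions: R 'raster' package, hypothetical file names, bilinear resampling).
library(raster)

ms  <- stack("green.tif", "red.tif", "rededge.tif", "nir.tif", "ndvi.tif")
dsm <- raster("dsm.tif")
rug <- raster("dsm_rugosity.tif")

# Align the structural layers to the multispectral grid before stacking
dsm_al <- resample(dsm, ms, method = "bilinear")
rug_al <- resample(rug, ms, method = "bilinear")

input_stack <- stack(ms, dsm_al, rug_al)
names(input_stack) <- c("green", "red", "rededge", "nir", "ndvi", "dsm", "dsm_rugosity")
```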
| Vegetation Class | Field Collected | Collected from Image Interpretation | Total Reference Points for Classification | Dataset |
|---|---|---|---|---|
| Emergent | 4 | 79 | 83 | 11 cm (55 cm for CHM models) |
| Floating | 4 | 96 | 100 | 11 cm (55 cm for CHM models) |
| Submergent | 5 | 61 | 66 | 11 cm (55 cm for CHM models) |
| Total | 13 | 236 | 249 | |
| Emergent | 4 | 79 | 83 | 3 cm (15 cm for CHM models) |
| Floating | 4 | 90 | 94 | 3 cm (15 cm for CHM models) |
| Submergent | 5 | 49 | 53 | 3 cm (15 cm for CHM models) |
| Total | 13 | 216 | 229 | |
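The reference points tallied above feed the classification models. The following hedged sketch extracts the stacked layer values at those points and fits a random-forest classifier with caret (cited in the references); the point file name, the 70/30 split, and 10-fold cross-validation are illustrative assumptions, not the authors' reported settings.

```r
# Minimal sketch: extract layer values at reference points and fit a random forest.
# Assumptions: 'input_stack' from the previous sketch, hypothetical 'ref_points.shp'
# with a 'class' attribute, 70/30 split and 10-fold CV chosen for illustration only.
library(raster)
library(caret)

ref  <- shapefile("ref_points.shp")            # reference points with a 'class' field
vals <- raster::extract(input_stack, ref)      # band/structural values at each point
dat  <- data.frame(class = factor(ref$class), vals)

set.seed(42)
idx      <- createDataPartition(dat$class, p = 0.7, list = FALSE)
train_df <- dat[idx, ]
test_df  <- dat[-idx, ]

rf_fit <- train(class ~ ., data = train_df, method = "rf",
                trControl = trainControl(method = "cv", number = 10))

# Classify the full scene and evaluate on the held-out points
pred_map  <- raster::predict(input_stack, rf_fit)
test_pred <- predict(rf_fit, newdata = test_df)
confusionMatrix(test_pred, test_df$class)
```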
| Classification Approach | Spatial Resolution | Data | OA (%) | Floating PA (%) | Floating UA (%) | Emergent PA (%) | Emergent UA (%) | Submergent PA (%) | Submergent UA (%) | Mean AUROC | Mean F1 Score |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Pixel-based | 11 cm | Multispectral + CHM + Rugosity (resampled to 55 cm) | 76.9 (95% CI 0.761–0.769) | 81.8 | 75.7 | 88.9 | 78.7 | 48.1 | 75.7 | 86.0 | 78.6 |
| Object-based | 11 cm | Multispectral + CHM + Rugosity (resampled to 55 cm) | 73.4 (95% CI 0.725–0.743) | 83.7 | 70.0 | 84.8 | 80.5 | 34.6 | 65.1 | 82.4 | 76.3 |
| Pixel-based | 11 cm | Multispectral + DSM + Rugosity | 81.4 (95% CI 0.807–0.821) | 86.3 | 76.7 | 90.0 | 84.6 | 63.3 | 86.7 | 90.0 | 80.6 |
| Object-based | 11 cm | Multispectral + DSM + Rugosity | 77.0 (95% CI 0.762–0.776) | 81.9 | 72.9 | 86.6 | 84.3 | 54.6 | 73.0 | 86.9 | 77.1 |
| Pixel-based | 11 cm | Multispectral Only | 80.8 (95% CI 0.801–0.815) | 84.9 | 76.8 | 89.6 | 85.7 | 63.6 | 81.3 | 87.6 | 80.7 |
| Object-based | 11 cm | Multispectral Only | 77.3 (95% CI 0.765–0.780) | 83.2 | 73.1 | 88.9 | 83.5 | 50.1 | 75.2 | 86.8 | 77.8 |
| Pixel-based | 3 cm | Multispectral + CHM + Rugosity (resampled to 15 cm) | 58.3 (95% CI 0.574–0.592) | 57.2 | 53.2 | 79.5 | 73.2 | 29.4 | 38.7 | 74.8 | 55.1 |
| Object-based | 3 cm | Multispectral + CHM + Rugosity (resampled to 15 cm) | 76.7 (95% CI 0.759–0.774) | 71.8 | 70.4 | 83.1 | 82.9 | 75.0 | 77.8 | 88.8 | 71.1 |
| Pixel-based | 3 cm | Multispectral + DSM + Rugosity | 78.9 (95% CI 0.774–0.790) | 74.3 | 73.8 | 86.5 | 82.8 | 72.0 | 78.4 | 74.8 | 74.1 |
| Object-based | 3 cm | Multispectral + DSM + Rugosity | 77.0 (95% CI 0.763–0.778) | 72.0 | 72.9 | 86.6 | 81.5 | 70.8 | 76.9 | 89.5 | 72.5 |
| Pixel-based | 3 cm | Multispectral Only | 72.6 (95% CI 0.717–0.734) | 66.3 | 68.3 | 82.9 | 76.1 | 67.6 | 74.2 | 85.8 | 67.3 |
| Object-based | 3 cm | Multispectral Only | 76.6 (95% CI 0.758–0.773) | 70.7 | 74.2 | 86.9 | 77.7 | 70.5 | 78.7 | 87.6 | 72.4 |
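For reference, the metrics in the accuracy table above follow the standard confusion-matrix definitions, with n_ij the number of samples predicted as class i whose reference label is class j, n_i+ and n_+j the row and column totals, and N the grand total; the mean F1 score is presumably the average of the per-class values:

```latex
\mathrm{OA} = \frac{\sum_i n_{ii}}{N}, \qquad
\mathrm{PA}_j = \frac{n_{jj}}{n_{+j}}, \qquad
\mathrm{UA}_i = \frac{n_{ii}}{n_{i+}}, \qquad
F1_i = \frac{2\,\mathrm{UA}_i\,\mathrm{PA}_i}{\mathrm{UA}_i + \mathrm{PA}_i}
```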
| Predicted Class | Reference: Floating | Reference: Submergent | Reference: Emergent | Total | UA (%) |
|---|---|---|---|---|---|
| Floating | 4317 | 947 | 366 | 5630 | 76.68 |
| Submergent | 276 | 2090 | 45 | 2411 | 86.69 |
| Emergent | 407 | 263 | 3689 | 4359 | 84.63 |
| Total | 5000 | 3300 | 4100 | 12,400 | |
| PA (%) | 86.34 | 63.33 | 89.98 | | OA = 81.42 |
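The producer's and user's accuracies follow directly from the matrix's column and row totals. The short base-R check below reproduces the reported values, including the per-class PA row and the overall accuracy of 81.42%:

```r
# Worked check of UA, PA, and OA for the confusion matrix above
# (rows = predicted class, columns = reference class).
cm <- matrix(c(4317,  947,  366,
                276, 2090,   45,
                407,  263, 3689),
             nrow = 3, byrow = TRUE,
             dimnames = list(predicted = c("Floating", "Submergent", "Emergent"),
                             reference = c("Floating", "Submergent", "Emergent")))

ua <- 100 * diag(cm) / rowSums(cm)    # 76.68, 86.69, 84.63
pa <- 100 * diag(cm) / colSums(cm)    # 86.34, 63.33, 89.98
oa <- 100 * sum(diag(cm)) / sum(cm)   # 81.42
```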
| Designated Vegetation Zone (Mohammadi et al., in prep) | RF Predicted Vegetation Class (Total Number of Points) * | Species | Relative Abundance (% Cover/Total Cover) | Percent Cover (%) | EFB Present? | Cattail Present? | Non-Native and/or Invasive? (Y or N) |
|---|---|---|---|---|---|---|---|
| Emergent (90% agreement) | Emergent (n = 11) | Hydrocharis morsus-ranae | 33.7 | 28.9 | 100% of points | 90.9% of points | Y |
| | | Spirodela polyrhiza | 24.4 | 21.0 | | | N |
| | | Typha × glauca | 21.7 | 18.6 | | | Y |
| Floating (90% agreement) | Floating (n = 12) | Nymphaea odorata | 14.4 | 14.3 | 91.6% of points | 16.6% of points | N |
| | | Ceratophyllum demersum | 10.4 | 10.3 | | | N |
| | | Hydrocharis morsus-ranae | 7.8 | 7.8 | | | Y |
| Submergent (33% agreement) | Submergent (n = 3) | Nymphaea odorata | 72.2 | 4.3 | 33.3% of points | 0% of points | N |
| | | Spirodela polyrhiza | 16.7 | 1.0 | | | N |
| | | Hydrocharis morsus-ranae | 5.6 | 0.03 | | | Y |
| Open Water (47% agreement) | Open Water (n = 22) ** | Ceratophyllum demersum | 22.4 | 1.0 | 31.8% of points | 4.5% of points | N |
| | | Spirodela polyrhiza | 14.3 | 5.0 | | | N |
| | | Nuphar variegatum | 12.4 | 4.3 | | | N |
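For clarity on the two abundance columns: relative abundance is a species' percent cover divided by the summed cover of all species recorded at the points of that class. The total cover is not reported directly; for the Emergent class it works out to roughly 86% when inferred from the listed values (e.g., 28.9/0.337):

```latex
\text{Relative abundance}_i \;=\; \frac{\text{percent cover}_i}{\sum_k \text{percent cover}_k}\times 100,
\qquad \text{e.g.}\quad \frac{28.9}{\approx 86}\times 100 \approx 33.7\%
```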
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).