A Comparative Study of Deep Semantic Segmentation and UAV-Based Multispectral Imaging for Enhanced Roadside Vegetation Composition Assessment
Abstract
1. Introduction
- Curate an SS dataset of proximal RGB images capturing diverse roadside vegetation species;
- Train and evaluate four DL-based SS models (UNet++, PSPNet, DeepLabV3+, and MANet) on a custom roadside vegetation dataset for vegetation composition assessment;
- Compare the performance of SS models and VI-based methods in estimating vegetation density, focusing on prediction accuracy and generalizability.
2. Materials and Methods
2.1. Data Collection and Sources
2.1.1. Image Annotations
2.1.2. Image Pre-Processing
2.2. Deep Learning Model Training
2.3. Vegetation Detection Using Vegetation Index-Based Methods
3. Results
3.1. Deep Learning Model Performance
3.2. Vegetation Detection Using Vegetation Index-Based Methods
3.2.1. Vegetation Detection at Prattville Construction Site
3.2.2. Determining Optimal Thresholds for Vegetation Detection Using Vegetation Index-Based Methods
3.3. Performance Assessment: Comparison of Vegetation Detection Using Vegetation Indices and Deep Learning Approaches
4. Discussion
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Definition
---|---
VIs | Vegetation Indices
DL | Deep Learning
ML | Machine Learning
UAV | Unmanned Aerial Vehicle
NIR | Near-Infrared
SS | Semantic Segmentation
AGL | Above Ground Level
GSD | Ground Sampling Distance
IoU | Intersection over Union
mIoU | Mean Intersection over Union
NDVI | Normalized Difference Vegetation Index
NDRE | Normalized Difference Red Edge
SAVI | Soil-Adjusted Vegetation Index
MAE | Mean Absolute Error
RMSE | Root Mean Squared Error
MAPE | Mean Absolute Percentage Error
B–A | Bland–Altman
R² | Coefficient of Determination
LoA | Limits of Agreement
MiT | Mix Vision Transformer
Location | Date | Vegetation Species | Images Collected
---|---|---|---
Lee County: Auburn | 10 March 2023 | Annual ryegrass | Skydio: 10
Lee County: Auburn | 16 March 2023 | Annual ryegrass and Lespedeza | Skydio: 39
Lee County: Auburn | 23 March 2023 | Annual ryegrass, Bermuda, Lespedeza | Canon: 25
Lee County: Opelika | 08 July 2022 | Bermuda, Crabgrass, Johnsongrass, Lespedeza | Canon: 38, Skydio: 40
Lee County: Opelika | 15 July 2022 | Bermuda, Browntop millet, Crabgrass, Johnsongrass, Lespedeza | Canon: 40, Skydio: 40
Lee County: Opelika | 22 July 2022 | Bahia, Bermuda, Browntop millet, Crabgrass, Johnsongrass, Lespedeza | Canon: 40, Skydio: 39
Lee County: Opelika | 29 July 2022 | Bahia, Bermuda, Browntop millet, Lespedeza | Canon: 15
Lee County: Opelika | 03 August 2022 | Browntop millet, Crabgrass, Johnsongrass, Lespedeza | Canon: 40, Skydio: 33
Lee County: Opelika | 11 August 2022 | Browntop millet, Crabgrass, Johnsongrass, Lespedeza | Canon: 40
Lee County: Opelika | 18 August 2022 | Browntop millet, Crabgrass, Johnsongrass, Lespedeza | Skydio: 42
Lee County: Opelika | 02 September 2022 | Crabgrass, Lespedeza | Skydio: 30
Lee County: Opelika | 26 May 2023 | Bermuda | Canon: 10
Lee County: Opelika | 02 June 2023 | Bermuda | Canon: 20
Lee County: Opelika | 18 July 2023 | Bahia, Lespedeza | Canon: 20
Linden | 28 July 2023 | Bahia, Browntop millet, Crabgrass, Johnsongrass, Lespedeza | Canon: 16
Montgomery | 09 June 2023 | Annual ryegrass, Johnsongrass, Lespedeza | Canon: 20
Prattville | 27 July 2023 | Bermuda, Crabgrass, Johnsongrass, Lespedeza | Canon: 20
Tuscaloosa | 28 July 2023 | Crabgrass, Lespedeza | Canon: 18
Hyperparameters | Values |
---|---|
Optimizer | Adam |
Loss function | Focal loss (γ = 3.0)
Learning rate scheduler | CosineAnnealingLR |
Epochs | 50 |
Augmentations | Random Crop, Horizontal/Vertical flip, Shift scale rotate |
Batch size | 32 |
Image size | 256 × 256 |
Evaluation metrics | Loss, Precision, Recall, IoU |
Weight initialization | Fine-tuning of ImageNet-pretrained weights
Architecture backbones | mit_b5, mit_b4, ResNet101, ResNet50, Xception |
Dropout rate | 0.2 |
Learning rate | 2 × 10⁻⁴
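The focal loss listed above (γ = 3.0) down-weights well-classified pixels so training concentrates on hard or rare vegetation classes. A minimal NumPy sketch of the multi-class form, assuming per-pixel softmax probabilities; the function name and array shapes are illustrative, not taken from the study's codebase:

```python
import numpy as np

def focal_loss(probs, targets, gamma=3.0, eps=1e-7):
    """Multi-class focal loss for semantic segmentation.

    probs   : (H, W, C) softmax probabilities per pixel
    targets : (H, W) integer class labels
    Returns the mean focal loss over all pixels.
    """
    h, w, c = probs.shape
    # Probability assigned to the true class of each pixel
    pt = probs.reshape(-1, c)[np.arange(h * w), targets.ravel()]
    pt = np.clip(pt, eps, 1.0)
    # (1 - pt)^gamma shrinks the contribution of confident, correct pixels
    return float(np.mean(-((1.0 - pt) ** gamma) * np.log(pt)))
```

With γ = 0 the expression reduces to ordinary cross-entropy; larger γ values, such as the 3.0 used here, suppress easy pixels more aggressively.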
DL Model | Encoder | Loss | Precision | Recall | mIoU | Trainable Parameters |
---|---|---|---|---|---|---|
UNet++ | ResNet50 | 0.002 | 0.608 | 0.661 | 0.688 | 48.98 M |
PSPNet | mit_b5 | 0.001 | 0.664 | 0.712 | 0.751 | 81.63 M |
DeepLabV3+ | ResNet101 | 0.001 | 0.623 | 0.700 | 0.779 | 45.67 M |
MANet | mit_b4 | 0.001 | 0.681 | 0.737 | 0.791 | 54.84 M |
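The IoU and mIoU scores reported above can be computed directly from predicted and ground-truth label masks. A minimal NumPy sketch, with hypothetical function and variable names:

```python
import numpy as np

def per_class_iou(pred, gt, num_classes):
    """Per-class Intersection over Union between two label masks.

    pred, gt : (H, W) integer class masks
    Returns a list of IoU values (np.nan for classes absent from both masks).
    """
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, gt == c).sum()
        union = np.logical_or(pred == c, gt == c).sum()
        ious.append(inter / union if union > 0 else np.nan)
    return ious

def mean_iou(pred, gt, num_classes):
    """mIoU: mean of per-class IoU, ignoring classes absent from both masks."""
    return float(np.nanmean(per_class_iou(pred, gt, num_classes)))
```

Ignoring absent classes (rather than scoring them as zero) prevents frames lacking a species from dragging down that species' average.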
DL Model | Backbone | Annual Ryegrass | Bahia | Bermuda | Crabgrass | Browntop Millet | Lespedeza | Johnsongrass | Fescue
---|---|---|---|---|---|---|---|---|---
UNet++ | ResNet50 | 0.71 | 0.78 | 0.74 | 0.83 | 0.70 | 0.80 | 0.71 | 0.78
PSPNet | mit_b5 | 0.93 | 0.87 | 0.96 | 0.75 | 0.90 | 0.85 | 0.83 | 0.87
DeepLabV3+ | ResNet101 | 0.94 | 0.94 | 0.96 | 0.75 | 0.84 | 0.87 | 0.87 | 0.93
MANet | mit_b4 | 0.97 | 0.96 | 0.95 | 0.75 | 0.90 | 0.84 | 0.86 | 0.91
Frame ID | Actual Vegetation Cover | Predicted Cover: Deep Learning | Predicted Cover: NDVI (0.5) | Predicted Cover: NDRE (0.25) | Predicted Cover: SAVI (0.40)
---|---|---|---|---|---
1 | 0.357 | 0.345 (Testing) | 0.667 | 0.713 | 0.646 |
2 | 0.242 | 0.239 (Training) | 0.480 | 0.480 | 0.423 |
3 | 0.009 | 0.008 (Training) | 0.008 | 0.104 | 0.008 |
4 | 0.257 | 0.264 (Training) | 0.545 | 0.500 | 0.519 |
5 | 0.229 | 0.220 (Testing) | 0.423 | 0.531 | 0.379 |
6 | 0.495 | 0.500 (Training) | 0.930 | 0.885 | 0.886 |
7 | 0.332 | 0.331 (Training) | 0.891 | 0.798 | 0.813 |
8 | 0.398 | 0.391 (Training) | 0.952 | 0.871 | 0.949 |
9 | 0.357 | 0.363 (Training) | 0.768 | 0.685 | 0.711 |
10 | 0.136 | 0.134 (Testing) | 0.286 | 0.472 | 0.196 |
11 | 0.186 | 0.192 (Training) | 0.478 | 0.450 | 0.282 |
12 | 0.291 | 0.290 (Training) | 0.767 | 0.710 | 0.694 |
13 | 0.354 | 0.355 (Training) | 0.766 | 0.743 | 0.715 |
14 | 0.338 | 0.335 (Training) | 0.753 | 0.711 | 0.695 |
15 | 0.261 | 0.278 (Testing) | 0.533 | 0.582 | 0.417 |
16 | 0.413 | 0.392 (Training) | 0.870 | 0.878 | 0.870 |
17 | 0.251 | 0.235 (Training) | 0.698 | 0.616 | 0.654 |
18 | 0.368 | 0.354 (Training) | 0.928 | 0.880 | 0.925 |
19 | 0.304 | 0.307 (Training) | 0.595 | 0.639 | 0.505 |
20 | 0.218 | 0.217 (Testing) | 0.286 | 0.296 | 0.269 |
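The vegetation-index columns above are obtained by thresholding per-pixel index maps (NDVI at 0.5, NDRE at 0.25, SAVI at 0.40) and taking the fraction of pixels that exceed the threshold. A minimal NumPy sketch under the assumption that the bands are calibrated reflectance arrays; the band names and the small epsilon guarding against division by zero are illustrative:

```python
import numpy as np

def ndvi(nir, red):
    # Normalized Difference Vegetation Index
    return (nir - red) / (nir + red + 1e-9)

def ndre(nir, red_edge):
    # Normalized Difference Red Edge index
    return (nir - red_edge) / (nir + red_edge + 1e-9)

def savi(nir, red, L=0.5):
    # Soil-Adjusted Vegetation Index; L is the soil-brightness
    # correction factor (0.5 is a common default)
    return (1 + L) * (nir - red) / (nir + red + L + 1e-9)

def vegetation_cover(index, threshold):
    """Fraction of pixels whose index value exceeds the threshold."""
    return float(np.mean(index > threshold))
```

For example, `vegetation_cover(ndvi(nir, red), 0.5)` yields the NDVI-based cover estimate for one frame, analogous to the NDVI (0.5) column in the table.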
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Singh, P.; Perez, M.A.; Donald, W.N.; Bao, Y. A Comparative Study of Deep Semantic Segmentation and UAV-Based Multispectral Imaging for Enhanced Roadside Vegetation Composition Assessment. Remote Sens. 2025, 17, 1991. https://doi.org/10.3390/rs17121991