Multi-Camera Machine Learning for Salt Marsh Species Classification and Mapping
Abstract
1. Introduction
Multi-Sensor Fusion Approaches in Vegetation Classification
2. Materials and Methods
2.1. Study Area
2.1.1. Survey Timing and Conditions
2.1.2. Scale of Analysis
2.1.3. Drone Operations and Safety Regulations
2.2. Equipment
2.3. Data Collection
2.4. Data Preprocessing for HSI, MSI, and RGB Datasets
2.5. Data Preprocessing for the Multi-Camera Dataset
2.6. Machine Learning Classification
2.6.1. Data Preparation and Feature Extraction
2.6.2. Model Selection and Training
2.6.3. Validation and Accuracy Assessment
3. Results
3.1. Data Overview
3.2. Classification Results
3.2.1. RGB Classification Results
3.2.2. Multispectral Classification Results
3.2.3. Hyperspectral Classification Results
3.2.4. Multi-Camera Classification Results
3.2.5. Model Comparison
3.3. Species Distribution
4. Discussion
4.1. Evaluation of Classification Accuracies
Comparison with Other UAV-Based Vegetation Classification Studies
4.2. Methodological Limitations and Future Directions
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| UAV | Unmanned Aerial Vehicle |
| RGB | Red, Green, Blue |
| MSI | Multispectral Imaging |
| HSI | Hyperspectral Imaging |
| LiDAR | Light Detection and Ranging |
| RF | Random Forest |
| SAM | Spectral Angle Mapper |
| SVM | Support Vector Machine |
| PCA | Principal Component Analysis |
| MNF | Minimum Noise Fraction |
| OA | Overall Accuracy |
| PA | Producer’s Accuracy |
| UA | User’s Accuracy |
| GT | Ground Truth |
| GSD | Ground Sample Distance |
| FOV | Field of View |
Appendix A. Confusion Matrix Tables for RGB Classifications
Random Forest (RF) classification:

| Class | Class 1 Festuca rubra | Class 2 Juncus maritimus | Class 3 Senescent Vegetation | Total |
|---|---|---|---|---|
| Unclassified | 13.09% | 14.13% | 3.57% | 11.00% |
| Class 1 | 84.42% | 9.28% | 5.30% | 34.64% |
| Class 2 | 2.48% | 74.51% | 80.48% | 50.77% |
| Class 3 | 0.01% | 2.08% | 10.65% | 3.60% |
| Total | 100.00% | 100.00% | 100.00% | 100.00% |

Spectral Angle Mapper (SAM) classification:

| Class | Class 1 Festuca rubra | Class 2 Juncus maritimus | Class 3 Senescent Vegetation | Total |
|---|---|---|---|---|
| Unclassified | 0.00% | 0.00% | 0.00% | 0.00% |
| Class 1 | 80.83% | 77.32% | 78.29% | 78.82% |
| Class 2 | 15.50% | 11.88% | 8.99% | 12.22% |
| Class 3 | 3.68% | 10.80% | 12.73% | 8.95% |
| Total | 100.00% | 100.00% | 100.00% | 100.00% |

Support Vector Machine (SVM) classification:

| Class | Class 1 Festuca rubra | Class 2 Juncus maritimus | Class 3 Senescent Vegetation | Total |
|---|---|---|---|---|
| Unclassified | 13.08% | 13.83% | 3.40% | 10.76% |
| Class 1 | 84.62% | 17.70% | 6.97% | 37.82% |
| Class 2 | 2.30% | 68.47% | 89.63% | 51.42% |
| Class 3 | 0.00% | 0.00% | 0.00% | 0.00% |
| Total | 100.00% | 100.00% | 100.00% | 100.00% |
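Each column in these matrices is normalized by its ground-truth pixel total, so every column sums to 100% and the diagonal entries correspond to the producer’s accuracies reported in Section 3.2. The sketch below illustrates this normalization; the pixel counts are hypothetical values chosen only to reproduce the percentages of the Random Forest matrix above, and the real per-class totals differ.

```python
# Minimal sketch with hypothetical counts (not the study's actual pixel totals):
# column-normalizing a raw confusion matrix that includes an "Unclassified" row,
# producing the percentage layout used in Appendices A-D.
import numpy as np

# Rows: Unclassified, Class 1, Class 2, Class 3.
# Columns: ground-truth Class 1, Class 2, Class 3.
raw_counts = np.array([
    [1309, 1413,  357],
    [8442,  928,  530],
    [ 248, 7451, 8048],
    [   1,  208, 1065],
])

# Divide each column by its ground-truth total and express as a percentage.
percent = 100.0 * raw_counts / raw_counts.sum(axis=0, keepdims=True)
print(np.round(percent, 2))  # matches the Random Forest matrix above
```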
Appendix B. Confusion Matrix Tables for Multispectral Classifications
Random Forest (RF) classification:

| Class | Class 1 Festuca rubra | Class 2 Juncus maritimus | Class 3 Senescent Vegetation | Total |
|---|---|---|---|---|
| Unclassified | 32.20% | 1.43% | 0.01% | 11.41% |
| Class 1 | 67.83% | 0.01% | 0.23% | 22.89% |
| Class 2 | 0.00% | 98.56% | 2.57% | 39.90% |
| Class 3 | 0.00% | 0.00% | 97.18% | 25.79% |
| Total | 100.00% | 100.00% | 100.00% | 100.00% |

Spectral Angle Mapper (SAM) classification:

| Class | Class 1 Festuca rubra | Class 2 Juncus maritimus | Class 3 Senescent Vegetation | Total |
|---|---|---|---|---|
| Unclassified | 0.00% | 0.00% | 0.00% | 0.00% |
| Class 1 | 97.17% | 17.90% | 2.06% | 40.56% |
| Class 2 | 1.34% | 79.81% | 1.17% | 32.18% |
| Class 3 | 1.49% | 2.30% | 96.77% | 27.26% |
| Total | 100.00% | 100.00% | 100.00% | 100.00% |

Support Vector Machine (SVM) classification:

| Class | Class 1 Festuca rubra | Class 2 Juncus maritimus | Class 3 Senescent Vegetation | Total |
|---|---|---|---|---|
| Unclassified | 32.20% | 1.43% | 0.01% | 11.41% |
| Class 1 | 67.80% | 0.00% | 0.09% | 22.85% |
| Class 2 | 0.00% | 98.57% | 0.33% | 39.31% |
| Class 3 | 0.00% | 0.00% | 99.57% | 26.43% |
| Total | 100.00% | 100.00% | 100.00% | 100.00% |
Appendix C. Confusion Matrix Tables for Hyperspectral Classifications
Random Forest (RF) classification:

| Class | Class 1 Festuca rubra | Class 2 Juncus maritimus | Class 3 Senescent Vegetation | Total |
|---|---|---|---|---|
| Unclassified | 1.10% | 17.08% | 0.00% | 7.50% |
| Class 1 | 96.60% | 0.09% | 8.22% | 36.28% |
| Class 2 | 2.10% | 82.83% | 2.28% | 35.71% |
| Class 3 | 0.21% | 0.00% | 89.49% | 20.51% |
| Total | 100.00% | 100.00% | 100.00% | 100.00% |

Spectral Angle Mapper (SAM) classification:

| Class | Class 1 Festuca rubra | Class 2 Juncus maritimus | Class 3 Senescent Vegetation | Total |
|---|---|---|---|---|
| Unclassified | 0.00% | 0.00% | 0.00% | 0.00% |
| Class 1 | 95.29% | 12.49% | 0.00% | 37.31% |
| Class 2 | 4.57% | 83.67% | 0.74% | 34.95% |
| Class 3 | 0.13% | 3.84% | 99.26% | 27.74% |
| Total | 100.00% | 100.00% | 100.00% | 100.00% |

Support Vector Machine (SVM) classification:

| Class | Class 1 Festuca rubra | Class 2 Juncus maritimus | Class 3 Senescent Vegetation | Total |
|---|---|---|---|---|
| Unclassified | 1.10% | 17.08% | 0.00% | 7.50% |
| Class 1 | 98.65% | 0.14% | 0.23% | 35.22% |
| Class 2 | 0.25% | 82.78% | 3.75% | 35.39% |
| Class 3 | 0.00% | 0.00% | 96.02% | 21.89% |
| Total | 100.00% | 100.00% | 100.00% | 100.00% |
Appendix D. Confusion Matrix Tables for Multi-Camera Classifications
Random Forest (RF) classification:

| Class | Class 1 Festuca rubra | Class 2 Juncus maritimus | Class 3 Senescent Vegetation | Total |
|---|---|---|---|---|
| Unclassified | 32.29% | 2.75% | 0.05% | 11.99% |
| Class 1 | 67.64% | 0.02% | 0.07% | 24.94% |
| Class 2 | 0.04% | 97.23% | 1.12% | 38.95% |
| Class 3 | 0.02% | 0.00% | 90.76% | 24.12% |
| Total | 100.00% | 100.00% | 100.00% | 100.00% |

Spectral Angle Mapper (SAM) classification:

| Class | Class 1 Festuca rubra | Class 2 Juncus maritimus | Class 3 Senescent Vegetation | Total |
|---|---|---|---|---|
| Unclassified | 0.00% | 0.00% | 0.00% | 0.00% |
| Class 1 | 99.13% | 2.89% | 0.00% | 35.27% |
| Class 2 | 0.63% | 95.90% | 0.34% | 39.08% |
| Class 3 | 0.24% | 1.22% | 99.65% | 25.65% |
| Total | 100.00% | 100.00% | 100.00% | 100.00% |

Support Vector Machine (SVM) classification:

| Class | Class 1 Festuca rubra | Class 2 Juncus maritimus | Class 3 Senescent Vegetation | Total |
|---|---|---|---|---|
| Unclassified | 32.29% | 2.75% | 0.05% | 11.99% |
| Class 1 | 67.71% | 0.01% | 0.54% | 22.97% |
| Class 2 | 0.00% | 97.24% | 0.54% | 38.78% |
| Class 3 | 0.00% | 0.00% | 98.87% | 26.26% |
| Total | 100.00% | 100.00% | 100.00% | 100.00% |
References
- Silliman, B.R. Salt marshes. Curr. Biol. 2014, 24, R348–R350. [Google Scholar] [CrossRef] [PubMed]
- Mcowen, C.J.; Weatherdon, L.V.; Van Bochove, J.W.; Sullivan, E.; Blyth, S.; Zockler, C.; Stanwell-Smith, D.; Kingston, N.; Martin, C.S.; Spalding, M.; et al. A global map of saltmarshes. Biodivers. Data J. 2017, 5, 11764. [Google Scholar] [CrossRef]
- Huang, R.; He, J.; Wang, N.; Christakos, G.; Gu, J.; Song, L.; Luo, J.; Agusti, S.; Duarte, C.M.; Wu, J. Carbon sequestration potential of transplanted mangroves and exotic saltmarsh plants in the sediments of subtropical wetlands. Sci. Total Environ. 2023, 904, 166185. [Google Scholar] [CrossRef] [PubMed]
- Coverdale, T.C.; Brisson, C.P.; Young, E.W.; Yin, S.F.; Donnelly, J.P.; Bertness, M.D. Indirect human impacts reverse centuries of carbon sequestration and salt marsh accretion. PLoS ONE 2014, 9, e93296. [Google Scholar] [CrossRef]
- Burden, A.; Garbutt, R.; Evans, C.; Jones, D.; Cooper, D. Carbon sequestration and biogeochemical cycling in a saltmarsh subject to coastal managed realignment. Estuar. Coast. Shelf Sci. 2013, 120, 12–20. [Google Scholar] [CrossRef]
- Zhu, X.; Meng, L.; Zhang, Y.; Weng, Q.; Morris, J. Tidal and meteorological influences on the growth of invasive Spartina alterniflora: Evidence from UAV remote sensing. Remote Sens. 2019, 11, 1208. [Google Scholar] [CrossRef]
- Nardin, W.; Taddia, Y.; Quitadamo, M.; Vona, I.; Corbau, C.; Franchi, G.; Staver, L.W.; Pellegrinelli, A. Seasonality and characterization mapping of restored tidal marsh by NDVI imageries coupling UAVs and multispectral camera. Remote Sens. 2021, 13, 4207. [Google Scholar] [CrossRef]
- Shepard, C.C.; Crain, C.M.; Beck, M.W. The protective role of coastal marshes: A systematic review and meta-analysis. PLoS ONE 2011, 6, e27374. [Google Scholar] [CrossRef]
- Townend, I.; Fletcher, C.; Knappen, M.; Rossington, K. A review of salt marsh dynamics. Water Environ. J. 2011, 25, 477–488. [Google Scholar] [CrossRef]
- Pendleton, L.; Donato, D.C.; Murray, B.C.; Crooks, S.; Jenkins, W.A.; Sifleet, S.; Craft, C.; Fourqurean, J.W.; Kauffman, J.B.; Marbà, N.; et al. Estimating global “blue carbon” emissions from conversion and degradation of vegetated coastal ecosystems. PLoS ONE 2012, 7, e43542. [Google Scholar] [CrossRef]
- Curcio, A.C.; Barbero, L.; Peralta, G. Enhancing salt marshes monitoring: Estimating biomass with drone-derived habitat-specific models. Remote Sens. Appl. Soc. Environ. 2024, 35, 101216. [Google Scholar] [CrossRef]
- Drake, K.; Halifax, H.; Adamowicz, S.C.; Craft, C. Carbon sequestration in tidal salt marshes of the Northeast United States. Environ. Manag. 2015, 56, 998–1008. [Google Scholar] [CrossRef]
- Lopes, C.L.; Mendes, R.; Caçador, I.; Dias, J.M. Assessing salt marsh extent and condition changes with 35 years of Landsat imagery: Tagus Estuary case study. Remote Sens. Environ. 2020, 247, 111939. [Google Scholar] [CrossRef]
- Blount, T.R.; Carrasco, A.R.; Cristina, S.; Silvestri, S. Exploring open-source multispectral satellite remote sensing as a tool to map long-term evolution of salt marsh shorelines. Estuar. Coast. Shelf Sci. 2022, 266, 107664. [Google Scholar] [CrossRef]
- Hu, Y.; Tian, B.; Yuan, L.; Li, X.; Huang, Y.; Shi, R.; Jiang, X.; Wang, L.; Sun, C. Mapping coastal salt marshes in China using time series of Sentinel-1 SAR. ISPRS J. Photogramm. Remote Sens. 2021, 173, 122–134. [Google Scholar] [CrossRef]
- Campbell, A.; Wang, Y. High spatial resolution remote sensing for salt marsh mapping and change analysis at Fire Island National Seashore. Remote Sens. 2019, 11, 1107. [Google Scholar] [CrossRef]
- Brunier, G.; Oiry, S.; Lachaussée, N.; Barillé, L.; Le Fouest, V.; Méléder, V. A machine-learning approach to intertidal mudflat mapping combining multispectral reflectance and geomorphology from UAV-based monitoring. Remote Sens. 2022, 14, 5857. [Google Scholar] [CrossRef]
- Bartlett, B.; Santos, M.; Dorian, T.; Moreno, M.; Trslic, P.; Dooly, G. Real-Time UAV Surveys with the Modular Detection and Targeting System: Balancing Wide-Area Coverage and High-Resolution Precision in Wildlife Monitoring. Remote Sens. 2025, 17, 879. [Google Scholar] [CrossRef]
- Curcio, A.C.; Peralta, G.; Aranda, M.; Barbero, L. Evaluating the performance of high spatial resolution UAV-photogrammetry and UAV-LiDAR for salt marshes: The Cádiz Bay study case. Remote Sens. 2022, 14, 3582. [Google Scholar] [CrossRef]
- Doughty, C.L.; Cavanaugh, K.C. Mapping coastal wetland biomass from high resolution unmanned aerial vehicle (UAV) imagery. Remote Sens. 2019, 11, 540. [Google Scholar] [CrossRef]
- Curcio, A.C.; Barbero, L.; Peralta, G. UAV-hyperspectral imaging to estimate species distribution in salt marshes: A case study in the Cadiz Bay (SW Spain). Remote Sens. 2023, 15, 1419. [Google Scholar] [CrossRef]
- Routhier, M.; Moore, G.; Rock, B. Assessing Spectral Band, Elevation, and Collection Date Combinations for Classifying Salt Marsh Vegetation with Unoccupied Aerial Vehicle (UAV)-Acquired Imagery. Remote Sens. 2023, 15, 5076. [Google Scholar] [CrossRef]
- Ge, X.; Wang, J.; Ding, J.; Cao, X.; Zhang, Z.; Liu, J.; Li, X. Combining UAV-based hyperspectral imagery and machine learning algorithms for soil moisture content monitoring. PeerJ 2019, 7, e6926. [Google Scholar] [CrossRef]
- Hamylton, S.M.; Morris, R.H.; Carvalho, R.C.; Roder, N.; Barlow, P.; Mills, K.; Wang, L. Evaluating techniques for mapping island vegetation from unmanned aerial vehicle (UAV) images: Pixel classification, visual interpretation and machine learning approaches. Int. J. Appl. Earth Obs. Geoinf. 2020, 89, 102085. [Google Scholar] [CrossRef]
- Ishida, T.; Kurihara, J.; Viray, F.A.; Namuco, S.B.; Paringit, E.C.; Perez, G.J.; Takahashi, Y.; Marciano, J.J., Jr. A novel approach for vegetation classification using UAV-based hyperspectral imaging. Comput. Electron. Agric. 2018, 144, 80–85. [Google Scholar] [CrossRef]
- Qian, S.E.; Chen, G. Enhancing spatial resolution of hyperspectral imagery using sensor’s intrinsic keystone distortion. IEEE Trans. Geosci. Remote Sens. 2012, 50, 5033–5048. [Google Scholar] [CrossRef]
- Zhang, Z.; Huang, L.; Wang, Q.; Jiang, L.; Qi, Y.; Wang, S.; Shen, T.; Tang, B.H.; Gu, Y. UAV Hyperspectral Remote Sensing Image Classification: A Systematic Review. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024. [Google Scholar] [CrossRef]
- Parsons, M.; Bratanov, D.; Gaston, K.J.; Gonzalez, F. UAVs, hyperspectral remote sensing, and machine learning revolutionizing reef monitoring. Sensors 2018, 18, 2026. [Google Scholar] [CrossRef]
- Proença, B.; Frappart, F.; Lubac, B.; Marieu, V.; Ygorra, B.; Bombrun, L.; Michalet, R.; Sottolichio, A. Potential of high-resolution Pléiades imagery to monitor salt marsh evolution after Spartina invasion. Remote Sens. 2019, 11, 968. [Google Scholar] [CrossRef]
- Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J.J. Hyperspectral imaging: A review on UAV-based sensors, data processing and applications for agriculture and forestry. Remote Sens. 2017, 9, 1110. [Google Scholar] [CrossRef]
- Farris, A.S.; Defne, Z.; Ganju, N.K. Identifying salt marsh shorelines from remotely sensed elevation data and imagery. Remote Sens. 2019, 11, 1795. [Google Scholar] [CrossRef]
- Wu, X.; Tan, K.; Liu, S.; Wang, F.; Tao, P.; Wang, Y.; Cheng, X. Drone multiline light detection and ranging data filtering in coastal salt marshes using extreme gradient boosting model. Drones 2024, 8, 13. [Google Scholar] [CrossRef]
- Hong, Q.; Ge, Z.; Wang, X.; Li, Y.; Xia, X.; Chen, Y. Measuring canopy morphology of saltmarsh plant patches using UAV-based LiDAR data. Front. Mar. Sci. 2024, 11, 1378687. [Google Scholar] [CrossRef]
- Pinton, D.; Canestrelli, A.; Wilkinson, B.; Ifju, P.; Ortega, A. A new algorithm for estimating ground elevation and vegetation characteristics in coastal salt marshes from high-resolution UAV-based LiDAR point clouds. Earth Surf. Process. Landforms 2020, 45, 3687–3701. [Google Scholar] [CrossRef]
- Diruit, W.; Le Bris, A.; Bajjouk, T.; Richier, S.; Helias, M.; Burel, T.; Lennon, M.; Guyot, A.; Ar Gall, E. Seaweed habitats on the shore: Characterization through hyperspectral UAV imagery and field sampling. Remote Sens. 2022, 14, 3124. [Google Scholar] [CrossRef]
- National Parks and Wildlife Service (NPWS). Derrymore Island. Available online: https://www.npws.ie/nature-reserves/kerry/derrymore-island (accessed on 3 March 2025).
- National Parks and Wildlife Service (NPWS). Derrymore Island SAC—Site Synopsis. Available online: https://www.npws.ie/sites/default/files/protected-sites/synopsis/SY002070.pdf (accessed on 3 June 2025).
- Akhmetkaliyeva, S.; Fairchild, E.; Cott, G. Carbon provenance study of Irish saltmarshes using bacteriohopanepolyol biomarkers. In Proceedings of the EGU General Assembly Conference Abstracts, Vienna, Austria, 14–19 April 2024. [Google Scholar] [CrossRef]
- Seawright, J. Red Fescue—Festuca Rubra. Available online: https://www.irishwildflowers.ie/pages-grasses/g-25.html (accessed on 28 May 2025).
- Seawright, J. Sea Rush—Juncus Maritimus. Available online: https://www.irishwildflowers.ie/pages-rushes/r-15.html (accessed on 28 May 2025).
- Irish Aviation Authority. Drones: Safety & Regulations. Available online: https://www.iaa.ie/general-aviation/drones/faqs/safety-regulations (accessed on 3 March 2025).
- DJI. Support of Matrice 300 RTK. Available online: https://www.dji.com/ie/support/product/matrice-300 (accessed on 3 March 2025).
- RESONON. Resonon Pika L Hyperspectral Camera. Available online: https://resonon.com/Pika-L (accessed on 3 March 2025).
- Agrowing. ALPHA 7Rxx Sextuple. Available online: https://agrowing.com/products/alpha-7riv-sextuple/ (accessed on 3 March 2025).
- DJI. Zenmuse P1. Available online: https://enterprise.dji.com/zenmuse-p1 (accessed on 3 March 2025).
- Jakob, S.; Zimmermann, R.; Gloaguen, R. The need for accurate geometric and radiometric corrections of drone-borne hyperspectral data for mineral exploration: Mephysto—A toolbox for pre-processing drone-borne hyperspectral data. Remote Sens. 2017, 9, 88. [Google Scholar] [CrossRef]
- Proctor, C.; He, Y. Workflow for building a hyperspectral UAV: Challenges and opportunities. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 40, 415–419. [Google Scholar] [CrossRef]
- Emlid. Reach RS2: Multi-Band RTK GNSS Receiver with Centimeter Precision. Available online: https://emlid.com/reachrs2plus/ (accessed on 3 March 2025).
- DJI. D-RTK 2. Available online: https://www.dji.com/ie/d-rtk-2 (accessed on 3 March 2025).
- Resonon. Spectronon: Hyperspectral Software, version 3.5.6; Resonon Inc.: Bozeman, MT, USA, 2025. Available online: https://resonon.com/software (accessed on 3 March 2025).
- Rodarmel, C.; Shan, J. Principal component analysis for hyperspectral image classification. Surv. Land Inf. Sci. 2002, 62, 115–123. [Google Scholar]
- Wolfe, J.D.; Black, S.R. Hyperspectral Analytics in ENVI; L3Harris: Melbourne, FL, USA, 2018. [Google Scholar]
- Kang, X.; Xiang, X.; Li, S.; Benediktsson, J.A. PCA-based edge-preserving features for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2017, 55, 7140–7151. [Google Scholar] [CrossRef]
- Alaibakhsh, M.; Emelyanova, I.; Barron, O.; Sims, N.; Khiadani, M.; Mohyeddin, A. Delineation of riparian vegetation from Landsat multi-temporal imagery using PCA. Hydrol. Process. 2017, 31, 800–810. [Google Scholar] [CrossRef]
- Hobbs, R.; Grace, J. A study of pattern and process in coastal vegetation using principal components analysis. Vegetatio 1981, 44, 137–153. [Google Scholar] [CrossRef]
- Mcieem, P.M.P.C.; FitzPatrick, Ú.; Lynn, D. The Irish Vegetation Classification–An Overview of Concepts, Structure and Tools. Pract. Bull. Chart. Inst. Ecol. Environ. Manag. 2018, 102, 14–19. [Google Scholar]
- Mihu-Pintilie, A.; Nicu, I.C. GIS-based landform classification of eneolithic archaeological sites in the plateau-plain transition zone (NE Romania): Habitation practices vs. flood hazard perception. Remote Sens. 2019, 11, 915. [Google Scholar] [CrossRef]
- James, D.; Collin, A.; Bouet, A.; Perette, M.; Dimeglio, T.; Hervouet, G.; Durozier, T.; Duthion, G.; Lebas, J.F. Multi-Temporal Drone Mapping of Coastal Ecosystems in Restoration: Seagrass, Salt Marsh, and Dune. J. Coast. Res. 2025, 113, 524–528. [Google Scholar] [CrossRef]
- Douay, F.; Verpoorter, C.; Duong, G.; Spilmont, N.; Gevaert, F. New hyperspectral procedure to discriminate intertidal macroalgae. Remote Sens. 2022, 14, 346. [Google Scholar] [CrossRef]
- Paoletti, M.E.; Haut, J.M.; Plaza, J.; Plaza, A. Deep learning classifiers for hyperspectral imaging: A review. ISPRS J. Photogramm. Remote Sens. 2019, 158, 279–317. [Google Scholar] [CrossRef]
- Li, S.; Song, W.; Fang, L.; Chen, Y.; Ghamisi, P.; Benediktsson, J.A. Deep learning for hyperspectral image classification: An overview. IEEE Trans. Geosci. Remote Sens. 2019, 57, 6690–6709. [Google Scholar] [CrossRef]
- Chen, Y.; Lin, Z.; Zhao, X.; Wang, G.; Gu, Y. Deep learning-based classification of hyperspectral data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 2094–2107. [Google Scholar] [CrossRef]
- Audebert, N.; Le Saux, B.; Lefèvre, S. Deep learning for classification of hyperspectral data: A comparative review. IEEE Geosci. Remote Sens. Mag. 2019, 7, 159–173. [Google Scholar] [CrossRef]
| Camera | Spectral Range (nm) | Number of Bands | Field of View (°) |
|---|---|---|---|
| Pika L (Hyperspectral) | 400–1000 | 281 | 17.6 |
| Agrowing 7R IV (Multispectral) | 405–850 | 14 | 35.0 |
| DJI Zenmuse P1 (RGB) | ∼400–700 | 3 | 63.5 |

| Camera | Flight Altitude (m) | Speed (m/s) | GSD (cm/pixel) | Survey Time (min) |
|---|---|---|---|---|
| Pika L (Hyperspectral) | 30 | 1.2 | 1.03 | 23 |
| Agrowing 7R IV (Multispectral) | 30 | 1.5 | 0.45 | 19 |
| DJI Zenmuse P1 (RGB) | 25 | 6.8 | 0.31 | 16 |
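The ground sample distances above follow from the usual pinhole relation GSD = H · p / f, where H is the flying height, p the detector pixel pitch, and f the focal length. The sketch below is a minimal illustration only; the sensor width, pixel count, and focal length used for the Zenmuse P1 are assumed values (typical full-frame specifications with a 35 mm lens) that are not stated in the table.

```python
# Minimal sketch (assumed sensor parameters, not taken from the paper):
# ground sample distance from flying height, focal length and pixel pitch.
def gsd_cm(altitude_m: float, focal_length_mm: float,
           sensor_width_mm: float, image_width_px: int) -> float:
    pixel_pitch_mm = sensor_width_mm / image_width_px   # size of one detector pixel
    gsd_m = altitude_m * pixel_pitch_mm / focal_length_mm
    return gsd_m * 100.0                                 # metres -> centimetres

# Assumed Zenmuse P1 configuration: full-frame sensor (35.9 mm wide,
# 8192 px across) with a 35 mm lens, flown at 25 m as in the table.
print(round(gsd_cm(25.0, 35.0, 35.9, 8192), 2))  # ≈ 0.31 cm/pixel
```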
| GT/P | Class 1 Festuca rubra | Class 2 Juncus maritimus | Class 3 Senescent Vegetation | Total |
|---|---|---|---|---|
| Unclassified | UP1 | UP2 | UP3 | Total |
| Class 1 | TP1 | FP12 | FP13 | GT1 |
| Class 2 | FP21 | TP2 | FP23 | GT2 |
| Class 3 | FP31 | FP32 | TP3 | GT3 |
| Total | P1 | P2 | P3 | Total Pixels |
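For reference, the accuracy metrics reported in Section 3.2 can be written directly in terms of this layout. The formulation below is the standard one, assuming columns correspond to ground-truth classes and rows to predicted classes (as in the appendix matrices, whose columns each sum to 100%); n_ij denotes the number of pixels of ground-truth class j assigned to predicted class i, and N the total number of validation pixels.

```latex
\mathrm{PA}_j = \frac{n_{jj}}{\sum_i n_{ij}}, \qquad
\mathrm{UA}_i = \frac{n_{ii}}{\sum_j n_{ij}}, \qquad
\mathrm{OA} = \frac{1}{N}\sum_k n_{kk},

\kappa = \frac{\mathrm{OA} - p_e}{1 - p_e}, \qquad
p_e = \frac{1}{N^{2}} \sum_k \left(\sum_j n_{kj}\right)\left(\sum_i n_{ik}\right).
```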
| Camera | Training Pixels | Ground Truth Pixels | Full Dataset Pixels |
|---|---|---|---|
| RGB (Zenmuse P1) | 100,336 | 161,378 | 327,426,624 |
| Multispectral (Agrowing 7R IV) | 50,680 | 81,332 | 164,436,442 |
| Hyperspectral (Pika L) | 30,308 | 48,413 | 98,227,987 |
| Multi-camera | 100,336 | 161,378 | 327,426,624 |
| Sensor | Accuracy Metric | Random Forest | Spectral Angle Mapper | Support Vector Machine |
|---|---|---|---|---|
| RGB | Overall Accuracy | 0.61 | 0.36 | 0.56 |
| | Producer’s Accuracy | 0.84, 0.74, 0.10 | 0.80, 0.11, 0.12 | 0.84, 0.68, 0.00 |
| | User’s Accuracy | 0.85, 0.56, 0.77 | 0.35, 0.33, 0.44 | 0.76, 0.51, 0.00 |
| | Kappa Coefficient | 0.42 | 0.02 | 0.34 |
| Multispectral | Overall Accuracy | 0.88 | 0.90 | 0.88 |
| | Producer’s Accuracy | 0.67, 0.98, 0.97 | 0.97, 0.79, 0.97 | 0.67, 0.98, 0.99 |
| | User’s Accuracy | 0.99, 0.98, 0.99 | 0.81, 0.97, 0.94 | 0.99, 0.99, 1.00 |
| | Kappa Coefficient | 0.83 | 0.85 | 0.83 |
| Hyperspectral | Overall Accuracy | 0.89 | 0.92 | 0.91 |
| | Producer’s Accuracy | 0.96, 0.82, 0.89 | 0.95, 0.83, 0.99 | 0.98, 0.82, 0.96 |
| | User’s Accuracy | 0.94, 0.96, 0.99 | 0.86, 0.95, 0.94 | 0.99, 0.97, 1.00 |
| | Kappa Coefficient | 0.79 | 0.87 | 0.87 |
| Multi-Camera | Overall Accuracy | 0.86 | 0.97 | 0.88 |
| | Producer’s Accuracy | 0.67, 0.97, 0.90 | 0.99, 0.95, 0.99 | 0.67, 0.97, 0.98 |
| | User’s Accuracy | 0.91, 0.99, 0.99 | 0.96, 0.99, 0.97 | 0.99, 0.99, 1.00 |
| | Kappa Coefficient | 0.79 | 0.97 | 0.82 |
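As a quick cross-check of how these metrics relate to the confusion matrices in the appendices, the snippet below computes OA, PA, UA, and the kappa coefficient from predicted and ground-truth label arrays. It is a generic illustration using scikit-learn, not the authors’ processing chain, and the label arrays here are synthetic; note that scikit-learn places ground truth on the rows, the transpose of the appendix layout.

```python
# Generic sketch (not the authors' pipeline): overall, producer's and user's
# accuracy plus Cohen's kappa for a three-class problem like the one above.
import numpy as np
from sklearn.metrics import confusion_matrix, cohen_kappa_score

def accuracy_report(y_true, y_pred, labels=(1, 2, 3)):
    # scikit-learn convention: rows = ground truth, columns = predictions.
    cm = confusion_matrix(y_true, y_pred, labels=list(labels))
    oa = np.trace(cm) / cm.sum()          # overall accuracy
    pa = np.diag(cm) / cm.sum(axis=1)     # producer's accuracy, per GT class
    ua = np.diag(cm) / cm.sum(axis=0)     # user's accuracy, per predicted class
    kappa = cohen_kappa_score(y_true, y_pred, labels=list(labels))
    return oa, pa, ua, kappa

# Synthetic validation labels standing in for ground-truth and classified pixels.
rng = np.random.default_rng(0)
y_true = rng.integers(1, 4, size=10_000)
y_pred = np.where(rng.random(10_000) < 0.9, y_true, rng.integers(1, 4, size=10_000))

oa, pa, ua, kappa = accuracy_report(y_true, y_pred)
print(f"OA={oa:.2f}  PA={np.round(pa, 2)}  UA={np.round(ua, 2)}  kappa={kappa:.2f}")
```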
| Study | Sensor Fusion | Ecosystem Type | Fusion Level | Reported Accuracy |
|---|---|---|---|---|
| Curcio et al. [11] | LiDAR + MSI | Salt marsh | Feature-level | up to 0.99 |
| Diruit et al. [35] | HSI | Macroalgae | Pixel-level | SAM: 87.9% |
| Brunier et al. [17] | MSI | Intertidal mudflats | Feature-level | RF: 93.1% |
| Curcio et al. [19] | LiDAR + MSI | Salt marsh | Feature-level | up to 0.94 |
| Ishida et al. [25] | HSI | Vegetation | Pixel-level | SVM: 94.5% |
| Campbell and Wang [16] | MSI + LiDAR | Salt marsh | Object-based analysis | OA: 92.8% |
| Wu et al. [32] | LiDAR | Salt marsh | Feature-level | AUC: 0.91 |
| This study | RGB + MSI + HSI | Salt marsh | Pixel-level | SAM: 97% |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).