Article

Fine-Scale Mangrove Species Classification Based on UAV Multispectral and Hyperspectral Remote Sensing Using Machine Learning

1 Key Laboratory of Environment Change and Resources Use in Beibu Gulf, Nanning Normal University, Nanning 530001, China
2 Guangxi Beihai Wetland Ecosystem National Observation and Research Station, Beihai 536001, China
3 Department of Forestry and Natural Resources, TP Cooper Building, University of Kentucky, Lexington, KY 40546, USA
* Author to whom correspondence should be addressed.
Remote Sens. 2024, 16(16), 3093; https://doi.org/10.3390/rs16163093
Submission received: 16 July 2024 / Revised: 19 August 2024 / Accepted: 19 August 2024 / Published: 22 August 2024

Abstract

Mangrove ecosystems play an irreplaceable role in coastal environments by providing essential ecosystem services. Diverse mangrove species perform different functions owing to their morphological and physiological characteristics. A precise spatial distribution map of mangrove species is therefore crucial for biodiversity maintenance and environmental conservation of coastal ecosystems. Traditional satellite data are of limited use for fine-scale mangrove species classification because of their coarse spatial resolution and limited spectral information. This study employed unmanned aerial vehicle (UAV) technology to acquire high-resolution multispectral and hyperspectral imagery of mangrove forests in Guangxi, China. We leveraged advanced algorithms, including RFE-RF for feature selection and four machine learning models (Adaptive Boosting (AdaBoost), eXtreme Gradient Boosting (XGBoost), Random Forest (RF), and Light Gradient Boosting Machine (LightGBM)), to map mangrove species with high classification accuracy. The study assessed the classification performance of these four models on both UAV multispectral and hyperspectral imagery. The results demonstrated that hyperspectral imagery outperformed multispectral data, offering reduced noise and better classification performance. Hyperspectral imagery produced mangrove species classifications with overall accuracy (OA) higher than 91% across all four machine learning models. LightGBM achieved the highest OA of 97.15% and kappa coefficient (Kappa) of 0.97 based on hyperspectral imagery. Dimensionality reduction and feature extraction techniques were effectively applied to the UAV data, with vegetation indices proving particularly valuable for species classification. The present research underscores the effectiveness of UAV hyperspectral imagery combined with machine learning models for fine-scale mangrove species classification.
This approach has the potential to significantly improve ecological management and conservation strategies, providing a robust framework for monitoring and safeguarding these essential coastal habitats.

1. Introduction

Mangrove ecosystems are vital salt-tolerant communities thriving in tropical and subtropical intertidal regions along coastlines and estuaries [1,2]. They provide a wealth of critical ecosystem services such as shoreline protection, biodiversity maintenance, and substantial carbon sequestration [3,4]. However, these ecosystems are severely threatened: 35% of their global area was lost during the last two decades of the 20th century [5,6]. Mangrove forests have also experienced a rapid decline in China [7,8]. Considering the reduction and degradation of mangrove forests worldwide, there is an urgent need for effective conservation and restoration of mangroves, and accurate mapping of mangrove forests can assist such efforts [2,9]. Moreover, since diverse mangrove species perform different functions owing to their morphological and physiological characteristics [10,11], there is a growing need for precise spatial distribution maps of mangrove forests at the species level.
Remote sensing offers a powerful tool for mangrove species mapping [12,13,14]. Initial efforts relied on medium-resolution multispectral satellite imagery, such as Landsat and SPOT [15,16,17]. However, these encountered challenges in distinguishing species with small patches, irregular spatial distribution, or similar spectral signatures, owing to low spatial and spectral resolution [9,18]. The enhanced spatial detail provided by high-resolution satellites is counterbalanced by limited spectral resolution, which continues to impede nuanced species classification [19,20]. Spaceborne hyperspectral imagery, which provides abundant spectral information, holds great potential for precise species classification [21,22,23], albeit at limited spatial resolutions. Data fusion of spaceborne hyperspectral imagery with high-resolution data from other platforms is a viable approach to achieve superior classification accuracy [24,25]. However, practical challenges remain for fusion approaches that rely on satellite imagery. Mangroves predominantly thrive in cloudy and rainy areas where tidal conditions vary over time. Given that satellite transit times are typically fixed, obtaining low-tide images of specific areas becomes challenging. Consequently, the classification of mangrove communities is constrained by the availability of high-quality optical remote sensing imagery.
UAV remote sensing technology emerges as a promising alternative due to its flexibility, affordability, and data quality [26,27]. UAV imagery delivers high spatial resolution, enabling fine-scale classification of mangrove species at the landscape scale [28,29]. This bridges the gap between large-scale satellite monitoring and field surveys, providing a crucial link for advancing species classification and mapping research [30,31,32]. Most UAV remote sensing applications for mangrove monitoring have utilized multispectral data, achieving promising overall classification accuracies between 85% and 90% [14,33,34]. Conversely, studies employing UAV hyperspectral data for mangrove species classification are scarce, and the reported accuracies often range between 80% and 90%, similar to multispectral data [35,36]. One reason for this suboptimal classification performance is that high-dimensional hyperspectral data are often susceptible to the “Hughes phenomenon”, whereby classification accuracy suffers from the inclusion of redundant features [37,38]. Dimensionality reduction is therefore often necessary to overcome the “Hughes phenomenon” and fully tap the potential of UAV-based hyperspectral data.
Various remote sensing-based classification methods have been developed, each with its advantages and limitations [1,21,30,39]. Traditional methods like visual interpretation, pixel-based, and object-oriented classification have drawbacks such as subjectivity, inefficiency, or neglecting spatial information [40,41]. Machine learning algorithms have emerged as powerful tools due to their ability to handle complex data and extract nonlinear relationships [42,43]. Ensemble learning methods like Random Forest (RF) and Gradient Boosting have shown promise by combining multiple classifiers for higher accuracy and robustness [5,24,44]. Deep learning, another branch of machine learning, holds great potential for mangrove species classification but has been limited by data availability and computational complexity [1,13]. Although machine learning algorithms such as RF and Light Gradient Boosting Machine (LightGBM) have demonstrated satisfactory classification accuracy with UAV multispectral data, their generalizability to UAV hyperspectral remote sensing has received scant attention in the literature [44]. Further research is necessary to address existing challenges and fully harness the potential of machine learning for effective mangrove ecosystem management and conservation.
This study investigates the potential of UAV-based hyperspectral and multispectral imagery for mangrove species classification. Yingluo Bay and Pearl Bay were chosen as our study areas due to their contrasting community structures in Guangxi, China. This variation allows us to evaluate the robustness and generalizability of the model for mangrove species identification across diverse ecological settings. The specific objectives of this study encompass (1) comparing the performance of four machine learning algorithms (Adaptive Boosting (AdaBoost), eXtreme Gradient Boosting (XGBoost), RF, and LightGBM) in classifying mangrove species; (2) identifying the key features of multi- and hyper-spectral data derived from UAV imagery for mangrove species classification; and (3) providing recommendations on the suitability of UAV imagery data and classification methods for mapping mangroves accurately and efficiently. The findings will provide valuable insights into the effectiveness of different data sources and classification methods for accurate mangrove species identification. The results will also inform the development of robust and transferable UAV remote sensing protocols for mangrove species classification in diverse geographical and environmental settings.

2. Materials and Methods

2.1. Study Area

Yingluo Bay is a part of Shankou National Mangrove Ecological Nature Reserve (21°28′~21°37′N, 109°37′~109°47′E) in Guangxi (Figure 1A). The bay has a tropical marine monsoon climate with an average annual temperature of 22.9 °C and rainfall of 1573 mm (http://data.cma.cn/data/ (accessed on 1 June 2020)). Both irregular diurnal and semidiurnal tides coexist, with an average tidal range of 2.5 m and a maximum of 6.3 m [45]. This area exhibits denser vegetation with a wider variety of species, including Bruguiera gymnorrhiza, Rhizophora stylosa, Kandelia candel, Aegiceras corniculatum, Avicennia marina, Excoecaria agallocha, Hibiscus tiliaceus, and Spartina alterniflora [46]. Designated as a core conservation area, Yingluo Bay boasts well-preserved and representative mangrove vegetation, making it ideal for studying and classifying mangrove species.
Pearl Bay is within the Guangxi Beilun Estuary National Nature Reserve, which is situated at the mouth of the Beilun River in Guangxi, Southwest China (21°28′~21°37′N, 109°37′~109°47′E) (Figure 1B). Pearl Bay is approximately 3.5 km wide at its mouth and extends 46 km along the coastline. The bay experiences a subtropical marine monsoon climate with an average temperature of 22.5 °C and annual rainfall of 2220 mm (http://data.cma.cn/data/ (accessed on 1 July 2020)). Regular diurnal tides dominate the area, with an average tidal range of 2.2 m and a maximum of 5.1 m [47]. Mangroves cover 939.97 hectares, with dominant species including Bruguiera gymnorrhiza, Kandelia candel, Aegiceras corniculatum, and Excoecaria agallocha [47]. Pearl Bay is also a core conservation area, with well-preserved and representative mangrove vegetation suitable for research.

2.2. Method

2.2.1. Overview

Our mangrove species identification process follows a five-step workflow (Figure 2): (1) Data acquisition: UAV image data are acquired and sample plots are inventoried. (2) Data preparation: UAV hyperspectral and multispectral data are assembled, pre-processed, and combined with vegetation index datasets and field survey data. (3) Feature extraction and selection: the most informative features are identified from vegetation indices, texture features, and original spectral bands using the Recursive Feature Elimination (RFE) method. (4) Classifier optimization: the performance of different combinations of data sources and machine learning methods in mangrove species classification is evaluated. (5) Model assessment: the chosen models are assessed for robustness and generalizability across different regions.

2.2.2. Data Acquisition

We employed a DJI M300 RTK (M300, DJI, Shenzhen, China) quadcopter equipped with a ULTRIS X20 Plus sensor for hyperspectral image capture. A DJI Phantom 4 Multi RTK (P4M RTK, DJI, Shenzhen, China) drone with an integrated multispectral imaging system (visible, red, green, blue, red edge, near-infrared) was used to acquire multispectral data. High-resolution visible light data were obtained using a DJI Phantom 4 RTK (P4 RTK, DJI, Shenzhen, China) drone equipped with a 1-inch, 20-megapixel CMOS sensor. This provides a Ground Sampling Distance (GSD) of 2.74 cm at a flight altitude of 100 m, ideal for acquiring reference data.
Hyperspectral, multispectral, and visible light data were collected between 2022 and 2023 under clear skies, ample sunlight, and low tidal levels (below 3 m) to ensure exposed mangrove areas. Data acquisition during summer was restricted to morning and afternoon to avoid high solar noon conditions that caused water surface glare. Flight missions maintained an 80% forward overlap and 70% side overlap for comprehensive data capture. The flight path adopted a strip pattern extending from inland to coastal areas to ensure complete coverage of mangrove distribution.
Field surveys were performed at Yingluo Bay and Pearl Bay on 22–24 June 2023 and 25–27 November 2023, respectively. A Qianxun SR6 Plus RTK (SP6 Plus, Qianxun Spatial Intelligence Inc., Shanghai, China) instrument was used to collect ground truth data based on the China Geodetic Coordinate System 2000 (CGCS2000), including geographic location, species names, images, and other relevant vegetation information for sample selection. However, marshy terrain and dense canopy cover within the study area hampered RTK signal reception, impacting accuracy. To address this challenge, tree species surveys primarily followed tidal creek directions to improve accessibility and locational accuracy.

2.2.3. Data Preparation

The acquired hyperspectral images were first fused using Cubert-Pilot 3.0 software (Cubert, Ulm, Germany), then processed using Agisoft PhotoScan Professional 1.7.3 software (Agisoft LLC, St. Petersburg, Russia) for photo alignment, generating dense point clouds, creating meshes, texturing, and producing orthoimages. Subsequently, geometric correction and image cropping were conducted in ENVI 5.5 software, maintaining a Root Mean Square Error (RMSE) below 0.5 pixels. Visible and multispectral images were stitched and processed into RGB images and Digital Orthophoto Maps (DOMs) for each band using DJI Terra 4.2.3 software (DJI, Shenzhen, China). These five single-band DOMs were composited into a multispectral image using ArcGIS 10.7 software (ESRI, Redlands, CA, USA). All imagery was resampled to 0.06 m using the nearest neighbor method to maintain consistent spatial resolution. Detailed flight parameters are provided in Table 1.
Given the slow successional cycles of mature mangrove trees, the spatial distribution patterns were assumed to be stable within a year. A total of 434 and 692 samples were collected in the Yingluo Bay and Pearl Bay areas, respectively (Figure 1A,B). In areas with limited ground access, a DJI P4 RTK drone captured high-resolution RGB images at 50 m flight altitude (parameters in Table 1). These drone images, combined with the collected ground survey data, were used to establish interpretation signs (Table 2) for identifying 11 land cover types. These categories included mangrove species Rhizophora stylosa (RS), Bruguiera gymnorrhiza (BG), Avicennia marina (AM), Aegiceras corniculatum (AC), Kandelia candel (KC), Excoecaria agallocha (EA), and Hibiscus tiliaceus (HT), other vegetation Spartina alterniflora (SA), and non-vegetation water bodies (WB), mudflats (MF), and roads (RD). The sample library was enhanced through the incorporation of visual interpretation of visible images for feature selection and model optimization.

2.2.4. Feature Extraction and Selection

Mangrove species exhibit unique spectral characteristics, with variations in their biochemical and physiological properties reflected in their spectral signatures. Preprocessing of the acquired drone-based hyperspectral data revealed significant noise in the first 20 bands (ranging from 400 to 432 nm). Consequently, these bands were removed, resulting in a refined dataset of 144 bands spanning 433 to 1000 nm. Conversely, multispectral data exhibited less noise and retained the original red, green, blue, red edge, and near-infrared bands. Sole reliance on spectral reflectance for mangrove species classification has limitations. To improve species identification accuracy, additional features are crucial. This study incorporated two key feature types: vegetation indices (VIs) and texture features.
Vegetation indices are mathematical transformations of spectral data used to highlight vegetation features and distinguish between plant types [48,49,50,51]. In this study, 49 candidate VIs were selected for analysis. Texture features capture spatial variations in the arrangement of pixels within an image, providing valuable information for species classification. This study utilized high-resolution UAV imagery and the Gray-Level Co-occurrence Matrix (GLCM) method to extract texture information. GLCM statistically describes the spatial distribution of grayscale values between pixels at specified distances and directions [50,52,53].
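As an illustration of how such indices combine bands, the sketch below computes two common indices from hypothetical red and near-infrared reflectance values; the band values are made up and these two indices merely illustrate the form of the 49 candidates:

```python
import numpy as np

# Hypothetical per-pixel reflectance values for two UAV bands.
red = np.array([0.08, 0.10, 0.12])   # red-band reflectance
nir = np.array([0.45, 0.50, 0.40])   # near-infrared reflectance

# Normalized Difference Vegetation Index: ratio form, bounded in [-1, 1].
ndvi = (nir - red) / (nir + red)

# Difference Vegetation Index: simple band difference.
dvi = nir - red

print(np.round(ndvi, 3), np.round(dvi, 3))
```

Ratio-based indices such as NDVI partially cancel illumination differences between pixels, which is one reason index features can separate species better than raw single-band reflectance.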
To reduce redundancy in spectral information, principal component analysis (PCA) was employed [54]. Research suggests that using the first principal component image from PCA for GLCM calculations yields better texture feature descriptions than single-band approaches. Therefore, eight texture operators (Entropy, Correlation, Dissimilarity, Homogeneity, Contrast, Mean, Angular Second Moment, and Variance) were computed on the first principal component image derived from the UAV multispectral and hyperspectral data. Calculations were performed in eight directions (0°, 45°, 90°, 135°, 180°, 225°, 270°, 315°), with a step size of 1 and window sizes of 3 × 3, 5 × 5, and 7 × 7. Ultimately, a total of 192 texture features (1 PCA band × 3 window types × 8 texture operators × 8 directions) were computed for both multispectral and hyperspectral datasets.
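The PCA-then-GLCM pipeline can be sketched as follows. The synthetic band stack, the quantization to 8 gray levels, and the hand-rolled co-occurrence routine are illustrative assumptions, not the study's actual software settings; only two of the eight texture operators are shown:

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic stand-in for a 5-band UAV image patch (5 bands, 32x32 pixels).
rng = np.random.default_rng(0)
bands = rng.random((5, 32, 32))

# PCA across bands: reshape to (pixels, bands), keep the first component.
flat = bands.reshape(5, -1).T
pc1 = PCA(n_components=1).fit_transform(flat).reshape(32, 32)

# Quantize PC1 to a small number of gray levels before co-occurrence counting.
levels = 8
edges = np.quantile(pc1, np.linspace(0, 1, levels + 1)[1:-1])
q = np.digitize(pc1, edges)          # integer labels in 0..levels-1

def glcm(img, dx, dy, levels):
    """Normalized co-occurrence matrix for pixel pairs offset by (dy, dx)."""
    m = np.zeros((levels, levels))
    h, w = img.shape
    src = img[max(0, -dy):h - max(0, dy), max(0, -dx):w - max(0, dx)]
    dst = img[max(0, dy):h + min(0, dy), max(0, dx):w + min(0, dx)]
    np.add.at(m, (src.ravel(), dst.ravel()), 1)
    return m / m.sum()

p = glcm(q, dx=1, dy=0, levels=levels)   # 0-degree direction, step size 1

# Two of the eight texture operators named above.
i, j = np.indices(p.shape)
contrast = np.sum(p * (i - j) ** 2)
homogeneity = np.sum(p / (1 + np.abs(i - j)))
```

In practice, a library such as scikit-image's `graycomatrix`/`graycoprops` computes these statistics per direction and window; the point here is only that each texture feature is a weighted sum over the co-occurrence matrix.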
When using a large number of features for modeling, the Hughes phenomenon can occur, reducing model efficiency and classification accuracy. To address this issue, the Recursive Feature Elimination (RFE) algorithm was employed. RFE iteratively trains a model and eliminates the least important features, ultimately selecting an optimal feature subset. The Random Forest model was chosen within the RFE algorithm due to its robustness to overfitting, reduced sensitivity to parameters, and fast training speed [33].
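A minimal sketch of RFE with a Random Forest ranker, using synthetic data in place of the UAV feature stack; the sample size, `step`, and target subset size (17, echoing the multispectral case reported later) are illustrative:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE

# Synthetic stand-in: 300 samples, 45 candidate features, 10 informative.
X, y = make_classification(n_samples=300, n_features=45, n_informative=10,
                           random_state=0)

# RFE repeatedly fits the forest and drops the least important features
# (by feature_importances_) until the target subset size is reached.
selector = RFE(
    estimator=RandomForestClassifier(n_estimators=100, random_state=0),
    n_features_to_select=17,
    step=2,                      # features eliminated per iteration
).fit(X, y)

selected = np.flatnonzero(selector.support_)   # indices of kept features
print(len(selected))
```

`selector.transform(X)` then yields the reduced feature matrix passed on to the classifiers.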

2.2.5. Selection of Machine Learning Classification Algorithms

This study evaluated the performance of four machine learning algorithms (AdaBoost, XGBoost, RF, and LightGBM) for mangrove species classification. AdaBoost, an ensemble method, enhances weak classifiers by focusing on misclassified instances, proving particularly effective for tasks with subtle class distinctions like mangrove species identification [55]. XGBoost, a gradient boosting algorithm with regularization, excels at handling complex, high-dimensional remote sensing datasets [56,57]. RF, a collection of decision trees, is robust for large datasets and offers valuable feature importance estimates [58]. LightGBM, optimized for speed and efficiency, employs a leaf-wise tree growth strategy, often outperforming level-wise algorithms like XGBoost [59,60].
These four machine learning algorithms exhibit certain distinctions [55,61,62,63]. AdaBoost and XGBoost are both boosting techniques, with XGBoost incorporating regularization for improved performance [62]. RF, based on bagging methodology, offers stability but may be less powerful in capturing complex relationships compared to boosting methods. XGBoost and LightGBM, both gradient boosting algorithms, differ in tree growth strategies, with LightGBM potentially achieving higher accuracy but at the risk of overfitting if not carefully tuned [63]. RF excels in feature importance assessment, while XGBoost and LightGBM prioritize error minimization. Both XGBoost and LightGBM employ regularization to prevent overfitting. Ultimately, these algorithms were selected due to their complementary strengths in addressing the challenges of mangrove species classification, including handling high-dimensional data, achieving robust performance, and providing interpretable results.
To optimize model performance, a comprehensive hyperparameter tuning process was conducted using grid search with cross-validation from the scikit-learn library. Hyperparameter tuning is critical in machine learning, aiming to identify the best combination of settings to enhance model performance, generalization ability, and training efficiency. Grid search is a popular method for this task due to its simplicity, ease of use, and ability to leverage parallel processing. The grid search automatically selects the hyperparameter combination that produces the best performance, defining the final model’s configuration.
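The tuning procedure can be sketched with scikit-learn's `GridSearchCV`; the estimator and grid values below are illustrative, not the study's actual search space:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Synthetic stand-in for the selected feature matrix.
X, y = make_classification(n_samples=200, n_features=17, n_informative=8,
                           random_state=0)

# Exhaustive search over every parameter combination, each scored
# by cross-validation; candidates are evaluated in parallel.
grid = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [100, 200], "max_depth": [None, 10]},
    cv=3,
    n_jobs=-1,
).fit(X, y)

print(grid.best_params_)   # combination with the best mean CV score
```

After fitting, `grid.best_estimator_` is retrained on the full training set with the winning configuration, which is what defines the final model.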
Open-source Python libraries were used for model implementation: scikit-learn for AdaBoost (https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.AdaBoostClassifier.html (accessed on 1 June 2023)), XGBoost (https://readthedocs.org/projects/xgboost/ (accessed on 1 June 2023)), RF (https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.RandomForestClassifier.html (accessed on 1 June 2023)), and LightGBM (https://lightgbm.readthedocs.io/en/latest/Python-Intro.html (accessed on 1 June 2023)).

2.2.6. Classification Accuracy Evaluation

This study employed the confusion matrix to evaluate the classification accuracy of mangrove species. The confusion matrix compares the agreement between actual ground truth categories and the model’s classified results, allowing the calculation of several key metrics: Producer’s Accuracy (PA), User’s Accuracy (UA), Kappa Coefficient (Kappa), and Overall Accuracy (OA). Together, these metrics provide a comprehensive evaluation of mapping precision and classification effectiveness for mangrove species identification.
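All four metrics follow directly from the confusion matrix, as the sketch below shows; the small label vectors are hypothetical class codes, not the study's validation data:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, confusion_matrix

# Hypothetical ground truth and predicted labels for three classes.
y_true = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2, 2])
y_pred = np.array([0, 0, 1, 1, 1, 1, 2, 2, 2, 0])

cm = confusion_matrix(y_true, y_pred)     # rows: truth, columns: prediction

oa = np.trace(cm) / cm.sum()              # Overall Accuracy
pa = np.diag(cm) / cm.sum(axis=1)         # Producer's Accuracy (per-class recall)
ua = np.diag(cm) / cm.sum(axis=0)         # User's Accuracy (per-class precision)
kappa = cohen_kappa_score(y_true, y_pred) # agreement corrected for chance

print(round(oa, 2), np.round(pa, 2), np.round(ua, 2), round(kappa, 2))
```

PA answers "how much of each true class was found" and UA "how much of each mapped class is correct", which is why both are reported per species alongside the single-number OA and Kappa.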

3. Results

3.1. Feature Selection Based on RFE-RF

An initial set of 237 features was extracted for multispectral imagery in Yingluo Bay. Correlation analysis reduced this to 45 features, and RFE-RF further refined the selection to 17 optimal features. These included nine vegetation indices, six textural features, and two spectral features (Table 3, Appendix A Table A1). The cross-validation score was 0.81 for multispectral imagery. An initial set of 385 features was extracted for hyperspectral imagery in Yingluo Bay. Correlation analysis reduced this to 49 features, and RFE-RF further refined the selection to 25 optimal features. These included 15 vegetation indices, 9 spectral features, and 1 textural feature (Table 3, Appendix A Table A1). The cross-validation score was 0.94 for hyperspectral imagery.
Similar to Yingluo Bay, correlation analysis and RFE-RF were also applied to the Pearl Bay imagery. In Pearl Bay, the optimal numbers of features selected for multispectral and hyperspectral data were 12 and 16, respectively (Table 3, Appendix A Table A1). The cross-validation scores were 0.75 and 0.91, with the hyperspectral imagery achieving the higher score owing to its enhanced spectral information.

3.2. Mangroves Species Classification and Accuracy Evaluation

This study evaluated the performance of four machine learning algorithms (AdaBoost, XGBoost, RF, and LightGBM) for mangrove species classification using multispectral and hyperspectral data sources. In Yingluo Bay, across both data sources, a consistent ranking of algorithm performance emerged. The ranking from highest to lowest accuracy was as follows: LightGBM > RF > XGBoost > AdaBoost (Table 4). LightGBM achieved the highest classification accuracy (OA: 80.96%, 97.15%, Kappa: 0.78, 0.97) in multi- and hyper-spectral images, respectively. Conversely, AdaBoost exhibited the lowest accuracy (OA: 63.05%, 82.96%, Kappa: 0.56, 0.79) in both scenarios. Although LightGBM consistently outperformed other algorithms, the margin of improvement varied based on the data source. Using multispectral data, the difference between LightGBM, RF, and XGBoost was minimal, with all achieving OA values around 80% and Kappa around 0.77. In contrast, LightGBM displayed a significant advantage in hyperspectral data. Compared to RF and XGBoost, LightGBM yielded improvements of 1.42% and 2.89% in overall accuracy, respectively. This highlights the ability of LightGBM to leverage the richer spectral information in hyperspectral data for more accurate classification.
This study also evaluated the performance of each classification algorithm by calculating the PA and UA for each mangrove species class based on the overall confusion matrix (Figure 3). LightGBM achieved the highest or near-highest PA and UA values across most classes, particularly for species with limited distribution areas like HT, EA, and SA. These species posed a challenge in mapping due to their smaller sample sizes. In contrast, the AdaBoost algorithm yielded significantly lower PA and UA for all classes. The AdaBoost algorithm showed significant discrepancies between the two data sources. It achieved zero accuracy for BG and HT when used with multispectral images while demonstrating poor performance for AC and AM with hyperspectral images.
To assess the generalizability of the LightGBM model, we applied it to separate UAV datasets acquired in Pearl Bay. LightGBM based on multispectral data failed to identify EA entirely, highlighting its limitations for certain species (Figure 4). Hyperspectral data displayed significant variation in PA and UA across different species. This suggests that while hyperspectral data generally outperform multispectral data, their effectiveness can vary depending on the specific species. Unlike Yingluo Bay, Pearl Bay exhibited high PA and UA accuracy for HT (>80%), indicating successful identification.

3.3. Mapping the Distribution of Mangrove Species

After performing feature selection and model building, four algorithms were applied to the UAV-acquired multispectral and hyperspectral images of Yingluo Bay (Figure 5). The classification results revealed the advantages of hyperspectral data in effectively mitigating the “salt-and-pepper” noise, a common issue encountered in multispectral data that often leads to misclassification (Figure 5). The utilization of hyperspectral data further facilitated the more precise delineation of vegetation boundaries, thereby enhancing the visual quality of classification maps (Figure 6).
Although visual differences between algorithms using hyperspectral data were subtle (Figure 5), LightGBM outperformed the others, producing a classification map closely matching the actual spatial distribution of species (Figure 6). Notably, the scattered distribution of BG within the large contiguous area of RS is clearly visible. However, in the coastal area, a small portion of AC was erroneously classified as BG, and some MF was mistakenly categorized as SA. Non-vegetated features such as RD and MF could still be accurately distinguished. The AdaBoost algorithm exhibited the poorest classification performance, with HT being indistinguishable and only a small portion of EA correctly identified, while most of the SA was erroneously classified as bare beach (Figure 5). These factors primarily contribute to the low accuracy (OA: 82.96%, Kappa: 0.79) observed for the AdaBoost model (Table 4).
The classification results provided insights into the spatial distribution of mangrove species in Yingluo Bay (Figure 5 and Figure 6). HT and EA were often found intermixed near inland roads. RS exhibited a more widespread distribution across the area. BG was primarily scattered along inner beaches. AM was concentrated in outer beaches, while AC was found interspersed with AM in low-tide zones and outer beaches.
The classification maps derived from UAV multispectral and hyperspectral imagery using the LightGBM model in Pearl Bay highlight the advantages of hyperspectral data (Figure 7 and Figure 8). Hyperspectral imagery significantly mitigates the “salt-and-pepper” effect observed in multispectral data. The presence of this noise can result in misclassifications, such as confusion between AC and AM (Figure 7). Hyperspectral data facilitate clearer delineation of species boundaries compared to multispectral data (Figure 7 and Figure 8). This was evident in the improved distinction between bare beaches and vegetation. Notably, LightGBM using multispectral data failed to detect EA entirely. Hyperspectral data, however, offered improved species identification (Figure 8).
The classification maps provided valuable insights into the spatial distribution of mangrove species in Pearl Bay. EA and HT were distributed along the landward edge in strip-like patterns. BG exhibits sparse distribution in the central region. AM was primarily concentrated in the central area and along the coast. KC and AC were intermixed throughout the study area.

4. Discussion

4.1. Feature Selection

The RFE-RF and correlation analysis techniques were employed to investigate the impact of feature selection on mangrove species classification using UAV imagery in Yingluo Bay and Pearl Bay. Feature selection substantially reduced the initial feature sets, confirming stable model performance even with fewer than 20 features. Across both multispectral and hyperspectral data, vegetation indices consistently ranked as the most influential features for distinguishing mangrove species, highlighting their value for species discrimination. Vegetation indices are mathematical combinations of reflectance values from two or more spectral bands. Compared to single spectral bands, vegetation indices exhibit a closer relationship with key plant characteristics, such as growth stage, biomass, and leaf properties, enabling a more accurate representation of the health and structure of mangrove species [64]. Textural features played a less prominent role in hyperspectral data: the richer spectral information allows individual spectral bands to provide valuable discriminatory power, reducing the reliance on texture. Conversely, multispectral data, with lower spectral resolution, require texture analysis to extract internal canopy details (shadows, leaf attributes, and branch thickness). Textural features thus contributed more to classification than the original spectral features for this data type.
The selected features also differed between the two study areas due to variations in mangrove species composition; the same feature can vary in importance depending on the specific species present. Vegetation indices reflect various attributes such as structure (biomass, cover, and LAI), biochemistry (water and pigments), and physiology (fluorescence), with different indices targeting different attributes. Similar to other studies, this research found that widely used traditional indices like NDVI and DVI were rarely selected [65]. This may be because these indices are not sensitive enough to capture the subtle spectral differences between mangrove species. Most of the selected spectral features were concentrated in the red edge and near-infrared regions, suggesting these ranges hold key information for species differentiation [66].

4.2. Advantage of Hyperspectral Data in Mangrove Species Identification

This study investigated the influence of data source (multispectral vs. hyperspectral) and machine learning algorithm on mangrove species classification accuracy. UAV-derived imagery from both sensors was subjected to variable selection and modeling, yielding distinct classification results. Multispectral results exhibited a significant amount of salt-and-pepper noise. In contrast, hyperspectral imagery demonstrated superior performance: its richer spectral information delineated vegetation boundaries more clearly, effectively mitigating salt-and-pepper noise and enhancing overall visual quality. This highlights the substantial advantage of hyperspectral data for accurate species identification. For instance, AdaBoost consistently underperformed, particularly in distinguishing sparsely distributed species such as BG and AC. Notably, hyperspectral data enabled AdaBoost to reach accuracies of 0.93 and 0.67 for these two species based on the confusion matrix (Appendix B Figure A1), whereas multispectral data struggled with these sparsely distributed species.
Hyperspectral data, which contain abundant spectral information, allowed for more precise species differentiation. The LightGBM model achieved high classification accuracies for key species such as BG (0.98) and RS (0.98), as evidenced by the confusion matrix (Appendix B Figure A1). However, species with spectral similarities and mixed distributions, such as EA and HT, yielded less optimal results. Overall, hyperspectral data exhibited a lower probability of misclassification than multispectral data, solidifying their superiority for mangrove species classification.
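The per-species accuracies quoted from Figure A1 are the diagonal of a row-normalized confusion matrix. A minimal sketch with toy labels (the four integer classes merely stand in for species codes; none of the numbers reproduce the study's results):

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Toy reference and predicted labels for four hypothetical classes.
y_true = np.array([0] * 10 + [1] * 10 + [2] * 10 + [3] * 10)
y_pred = y_true.copy()
y_pred[[3, 12, 13, 25]] = [1, 0, 0, 3]  # inject a few misclassifications

# Row-normalized confusion matrix: each row sums to 1, and the diagonal
# gives the per-class (producer's) accuracy shown in Figure A1.
cm = confusion_matrix(y_true, y_pred, normalize="true")
per_class_acc = np.diag(cm)
```

Off-diagonal entries in a given row show which classes a species is confused with, which is how spectrally similar, mixed-distribution species such as EA and HT reveal themselves.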

4.3. Model Robustness Evaluation

This study investigated the influence of the study area on the classification accuracy of four machine learning algorithms. All four algorithms consistently achieved high OA and Kappa rankings across both data sources, in the following order from highest to lowest: LightGBM > RF > XGBoost > AdaBoost. LightGBM demonstrated the highest classification accuracy across data sources, with an optimal OA of 97.15% and Kappa of 0.97. AdaBoost exhibited the lowest accuracy, with an OA of merely 63.05% and a Kappa as low as 0.56, a significant deviation from the performance of the other three algorithms. Although LightGBM showed numerous advantages in classifying mangrove species, its advantage was not statistically significant when applied to multispectral data.
Beyond its superior accuracy, LightGBM addresses limitations of RF and XGBoost, which can be computationally expensive, requiring high memory consumption and lengthy parameter optimization. LightGBM overcomes these drawbacks, offering faster training, lower memory usage, and the ability to handle large datasets. This combination of speed, efficiency, and accuracy makes LightGBM a compelling choice for analyzing high-dimensional hyperspectral data.
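OA and Kappa, the two ranking metrics used above, can be computed with scikit-learn. The label vectors here are toy data, not the study's samples:

```python
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Toy reference labels and classifier predictions (three classes).
y_true = [0, 0, 0, 1, 1, 1, 2, 2, 2, 2]
y_pred = [0, 0, 1, 1, 1, 1, 2, 2, 2, 0]

oa = accuracy_score(y_true, y_pred)        # overall accuracy: correct / total
kappa = cohen_kappa_score(y_true, y_pred)  # agreement corrected for chance
```

Kappa is lower than OA because it discounts the agreement expected from the class marginals alone, which is why it complements OA when class frequencies are unbalanced, as with sparsely distributed mangrove species.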

4.4. Limitations and Implications for Mangrove Species Mapping

This study confirms the feasibility of UAV multispectral and hyperspectral data for mangrove species classification using machine learning approaches. However, several challenges remain. Firstly, weather variations, particularly over large study areas, can significantly degrade image quality through changes in light intensity and solar elevation during data acquisition. Although UAVs offer flexible operation times, rapid weather fluctuations can degrade data quality relative to the fixed acquisition schedules of satellite platforms. This limits UAV remote sensing in scenarios where weather conditions are unpredictable.
Secondly, dense mangrove vegetation presents challenges for image mosaicking and orthorectification. Establishing precise tie points between flight strips is difficult, hindering accurate image stitching. Additionally, storing and processing high-resolution hyperspectral data, characterized by large data volumes, is computationally demanding, limiting its application in large-scale studies.
Furthermore, mangrove classification is inherently more complex than that of plantation forests or urban trees. Dense stands with significant species intermixing make canopy boundary delineation difficult, producing the phenomena of “same object, different spectra” and “different objects, same spectra” within the imagery. Current classifications typically encompass fewer than five species, and accuracy often decreases as the number of classified species increases. Laser scanning technology offers promise: its detailed three-dimensional structural information can complement optical imagery and potentially enhance classification accuracy.
Studies indicate that spatial resolution also influences classification effectiveness, and higher resolution does not necessarily lead to better classification results [22,24,32]. This study used a single spatial resolution; future research should investigate the impact of different resolutions on feature selection and classification performance, and identify an optimal resolution for each study area.
The present study showcases the efficacy of machine learning ensemble algorithms in handling hyperspectral data, while deep learning holds promise for future advancements. However, challenges such as long training times, significant computational requirements, limited interpretability, and sensitivity to data quality hinder its widespread adoption. Future research can explore the differential effectiveness of deep learning and machine learning ensemble approaches in mangrove species classification.

5. Conclusions

This study investigated the feasibility and limitations of UAV multispectral and hyperspectral imagery, coupled with machine learning algorithms, for fine-scale mangrove species identification. The key findings are summarized below:
  • Dimensionality reduction: UAV remote sensing data can contain redundant features that degrade classification accuracy. To address this, we employed correlation analysis and an RF algorithm with recursive feature elimination (RFE-RF) for feature selection. This approach effectively reduced overfitting and identified vegetation indices as the most informative features for inter-species classification.
  • Data source and algorithm performance: Hyperspectral data consistently yielded superior classification results compared to multispectral data. Among the evaluated algorithms, the LightGBM algorithm achieved the highest classification accuracy. RF and XGBoost also demonstrated promising performance for hyperspectral mangrove species classification.
  • Spatial variability and classification performance: Comparative analysis between study areas revealed higher overall classification accuracy in Yingluo Bay compared to Pearl Bay, despite using the same data source. This difference likely stems from variations in species characteristics between the two locations. This finding underscores the importance of considering spatial variability in species distribution when optimizing classification strategies.
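The correlation-analysis step in the first bullet can be sketched as a pairwise-correlation filter. The feature names, synthetic data, and 0.9 threshold below are illustrative assumptions, not the study's settings:

```python
import numpy as np
import pandas as pd

# Synthetic feature table with one near-duplicate (highly correlated) column.
rng = np.random.default_rng(0)
base = rng.normal(size=200)
df = pd.DataFrame({
    "ndvi": base,
    "dvi": base * 0.98 + rng.normal(scale=0.05, size=200),  # near-duplicate
    "texture_mean": rng.normal(size=200),
})

# Drop one feature from every pair whose absolute Pearson correlation
# exceeds the threshold, keeping the first occurrence.
corr = df.corr().abs()
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
to_drop = [c for c in upper.columns if (upper[c] > 0.9).any()]
reduced = df.drop(columns=to_drop)
```

A filter like this removes redundancy cheaply before the more expensive RFE-RF ranking is applied to the surviving features.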

Author Contributions

Conceptualization, Y.Y. and J.Y.; methodology, Y.Y. and Z.M.; validation, Y.Y., W.C. and J.Y.; investigation, Y.Y., Z.M., J.Z., J.W. and H.S.; resources, Y.Y. and J.Y.; writing—original draft preparation, Y.Y.; writing—review and editing, J.Y. and W.C.; visualization, Y.Y., Z.M. and J.Z.; supervision, J.Y.; project administration, Y.Y.; funding acquisition, Y.Y. and J.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China, grant number 41961037, the Guangxi Science and Technology Project, grant number AD20297066, the Key Laboratory of Environment Change and Resources Use in Beibu Gulf (Nanning Normal University), Ministry of Education, grant number GTEU-KLOP-X1718, and the BaGui Scholars Program of Guangxi Zhuang Autonomous Region.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Acknowledgments

We thank the Shankou National Mangrove Ecological Nature Reserve and the Guangxi Beilun Estuary National Nature Reserve for their help with the field survey. We also express our gratitude to everyone who helped us to successfully complete this research.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

The formulas for the remote sensing vegetation indices employed in this study are given below.
Table A1. Equations of the vegetation indices used in this study.
Spectral Indices | Formula | References
Anthocyanin Reflectance Index 1 (ARI1) | ARI1 = 1/ρ550 − 1/ρ700 | [67]
Anthocyanin Reflectance Index 2 (ARI2) | ARI2 = ρ800 × (1/ρ550 − 1/ρ700) | [68]
Carotenoid Reflectance Index 2 (CRI2) | CRI2 = 1/ρ510 − 1/ρ700 | [69]
Enhanced Vegetation Index (EVI) | EVI = 2.5 × (NIR − Red) / (NIR + 6 × Red − 7.5 × Blue + 1) | [70]
Global Environmental Monitoring Index (GEMI) | GEMI = η × (1 − 0.25 × η) − (Red − 0.125) / (1 − Red), where η = [2 × (NIR² − Red²) + 1.5 × NIR + 0.5 × Red] / (NIR + Red + 0.5) | [71]
Green Atmospherically Resistant Index (GARI) | GARI = [NIR − (Green − γ × (Blue − Red))] / [NIR + (Green − γ × (Blue − Red))], γ = 1.7 | [72]
Green Leaf Index (GLI) | GLI = (2 × Green − Red − Blue) / (2 × Green + Red + Blue) | [73]
Leaf Area Index (LAI) | LAI = 3.618 × EVI − 0.118 | [74]
Modified Chlorophyll Absorption Ratio Index (MCARI) | MCARI = [(ρ700 − ρ670) − 0.2 × (ρ700 − ρ550)] × (ρ700 / ρ670) | [75]
Modified Red Edge Normalized Difference Vegetation Index (MRENDVI) | MRENDVI = (ρ750 − ρ705) / (ρ750 + ρ705 − 2 × ρ445) | [76]
Modified Red Edge Simple Ratio (MRESR) | MRESR = (ρ750 − ρ445) / (ρ705 − ρ445) | [77]
Normalized Difference Mud Index (NDMI) | NDMI = (ρ795 − ρ990) / (ρ795 + ρ990) | [78]
Photochemical Reflectance Index (PRI) | PRI = (ρ531 − ρ570) / (ρ531 + ρ570) | [79,80]
Plant Senescence Reflectance Index (PSRI) | PSRI = (ρ680 − ρ500) / ρ750 | [81]
Red Edge Position Index (REPI) | The wavelength (nm) of the maximum derivative of reflectance in the vegetation red-edge region of the spectrum, 690–740 nm. | [82]
Red Green Ratio Index (RGRI) | RGRI = Σ(i = 600–699) Ri / Σ(j = 500–599) Rj | [83]
Simple Ratio Index (SRI) | SRI = NIR / Red | [84]
Structure Insensitive Pigment Index (SIPI) | SIPI = (ρ800 − ρ445) / (ρ800 − ρ680) | [85]
Sum Green Index (SGI) | SGI = mean reflectance over ρ500–ρ600 | [86]
Transformed Chlorophyll Absorption Reflectance Index (TCARI) | TCARI = 3 × [(ρ700 − ρ670) − 0.2 × (ρ700 − ρ550) × (ρ700 / ρ670)] | [87]
Transformed Difference Vegetation Index (TDVI) | TDVI = √(0.5 + (NIR − Red) / (NIR + Red)) | [88]
Triangular Greenness Index (TGI) | TGI = [(λRed − λBlue) × (Red − Green) − (λRed − λGreen) × (Red − Blue)] / 2, where λ denotes the band's central wavelength | [89]
Visible Atmospherically Resistant Index (VARI) | VARI = (Green − Red) / (Green + Red − Blue) | [90]
Vogelmann Red Edge Index 1 (VREI1) | VREI1 = ρ740 / ρ720 | [91]
Vogelmann Red Edge Index 2 (VREI2) | VREI2 = (ρ734 − ρ747) / (ρ715 + ρ726) | [91]
Water Band Index (WBI) | WBI = ρ970 / ρ900 | [92]
The Red, Green, Blue, and NIR bands were selected from within their respective wavelength ranges; for example, the selected red band must fall between 600 and 700 nm. When multiple bands fell within a range simultaneously, the band with the closest central wavelength was chosen. Here, ρnum denotes the reflectance of the band whose central wavelength is num (in nm); the same band-selection principle applies throughout.
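A few of the Table A1 formulas, computed directly from toy reflectance values (the numbers below are illustrative only, not measurements from this study):

```python
# Toy surface reflectances (0-1): broadband values and narrowband values
# keyed by central wavelength in nm.
nir, red, green, blue = 0.45, 0.06, 0.10, 0.04
rho = {531: 0.095, 550: 0.10, 570: 0.09, 700: 0.12}

# Formulas as given in Table A1.
evi = 2.5 * (nir - red) / (nir + 6 * red - 7.5 * blue + 1)  # EVI
ari1 = 1 / rho[550] - 1 / rho[700]                          # ARI1
pri = (rho[531] - rho[570]) / (rho[531] + rho[570])         # PRI
lai = 3.618 * evi - 0.118                                   # LAI from EVI
```

Applied per pixel across an image, such index layers form the feature set from which RFE-RF selected the most discriminative variables.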

Appendix B

Normalized confusion matrices were employed to evaluate the classification accuracy of mangrove species.
Figure A1. Normalized confusion matrices of mangrove species classification using four machine learning models (AdaBoost, XGBoost, RF, and LightGBM) based on UAV multispectral and hyperspectral images in Yingluo Bay.

References

  1. Lassalle, G.; Ferreira, M.P.; La Rosa, L.E.C.; de Souza Filho, C.R. Deep learning-based individual tree crown delineation in mangrove forests using very-high-resolution satellite imagery. ISPRS J. Photogramm. Remote Sens. 2022, 189, 220–235. [Google Scholar] [CrossRef]
  2. Alongi, D.M. Impacts of Climate Change on Blue Carbon Stocks and Fluxes in Mangrove Forests. Forests 2022, 13, 149. [Google Scholar] [CrossRef]
  3. Zhao, C.; Jia, M.; Wang, Z.; Mao, D.; Wang, Y. Identifying mangroves through knowledge extracted from trained random forest models: An interpretable mangrove mapping approach (IMMA). ISPRS J. Photogramm. Remote Sens. 2023, 201, 209–225. [Google Scholar] [CrossRef]
  4. Jia, M.; Wang, Z.; Mao, D.; Ren, C.; Song, K.; Zhao, C.; Wang, C.; Xiao, X.; Wang, Y. Mapping global distribution of mangrove forests at 10-m resolution. Sci. Bull. 2023, 68, 1306–1316. [Google Scholar] [CrossRef]
  5. Cao, J.; Liu, K.; Zhuo, L.; Liu, L.; Zhu, Y.; Peng, L. Combining UAV-based hyperspectral and LiDAR data for mangrove species classification using the rotation forest algorithm. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102414. [Google Scholar] [CrossRef]
  6. Wang, L.; Jia, M.; Yin, D.; Tian, J. A review of remote sensing for mangrove forests: 1956–2018. Remote Sens. Environ. 2019, 231, 111223. [Google Scholar] [CrossRef]
  7. Wang, Z.; Liu, K.; Cao, J.; Peng, L.; Wen, X. Annual Change Analysis of Mangrove Forests in China during 1986–2021 Based on Google Earth Engine. Forests 2022, 13, 1489. [Google Scholar] [CrossRef]
  8. Zhang, R.; Jia, M.; Wang, Z.; Zhou, Y.; Mao, D.; Ren, C.; Zhao, C.; Liu, X. Tracking annual dynamics of mangrove forests in mangrove National Nature Reserves of China based on time series Sentinel-2 imagery during 2016–2020. Int. J. Appl. Earth Obs. Geoinf. 2022, 112, 102918. [Google Scholar] [CrossRef]
  9. Zhao, C.; Jia, M.; Zhang, R.; Wang, Z.; Ren, C.; Mao, D.; Wang, Y. Mangrove species mapping in coastal China using synthesized Sentinel-2 high-separability images. Remote Sens. Environ. 2024, 307, 114151. [Google Scholar] [CrossRef]
  10. Su, J.; Friess, D.A.; Gasparatos, A. A meta-analysis of the ecological and economic outcomes of mangrove restoration. Nat. Commun. 2021, 12, 5050. [Google Scholar] [CrossRef]
  11. Bai, J.; Meng, Y.; Gou, R.; Dai, Z.; Zhu, X.; Lin, G. The linkages between stomatal physiological traits and rapid expansion of exotic mangrove species (Laguncularia racemosa) in new territories. Front. Mar. Sci. 2023, 10, 1136443. [Google Scholar] [CrossRef]
  12. Liu, T.; Zhou, B.J.; Jiang, H.; Yao, L. Mapping the number of mangrove trees in the Guangdong-Hong Kong-Macao Greater Bay Area. Mar. Pollut. Bull. 2023, 196, 115658. [Google Scholar] [CrossRef] [PubMed]
  13. Lassalle, G.; Ferreira, M.P.; Cué La Rosa, L.E.; Del’Papa Moreira Scafutto, R.; de Souza Filho, C.R. Advances in multi- and hyperspectral remote sensing of mangrove species: A synthesis and study case on airborne and multisource spaceborne imagery. ISPRS J. Photogramm. Remote Sens. 2023, 195, 298–312. [Google Scholar] [CrossRef]
  14. Fu, B.; He, X.; Liang, Y.; Deng, T.; Li, H.; He, H.; Jia, M.; Fan, D.; Wang, F. Examination of the performance of ASEL and MPViT algorithms for classifying mangrove species of multiple natural reserves of Beibu Gulf, south China. Ecol. Indic. 2023, 154, 110870. [Google Scholar] [CrossRef]
  15. Valderrama-Landeros, L.; Flores-de-Santiago, F.; Kovacs, J.M.; Flores-Verdugo, F. An assessment of commonly employed satellite-based remote sensors for mapping mangrove species in Mexico using an NDVI-based classification scheme. Environ. Monit. Assess. 2017, 190, 23. [Google Scholar] [CrossRef] [PubMed]
  16. Wang, D.; Wan, B.; Qiu, P.; Su, Y.; Guo, Q.; Wang, R.; Sun, F.; Wu, X. Evaluating the Performance of Sentinel-2, Landsat 8 and Pléiades-1 in Mapping Mangrove Extent and Species. Remote Sens. 2018, 10, 1468. [Google Scholar] [CrossRef]
  17. Bullock, E.L.; Fagherazzi, S.; Nardin, W.; Vo-Luong, P.; Nguyen, P.; Woodcock, C.E. Temporal patterns in species zonation in a mangrove forest in the Mekong Delta, Vietnam, using a time series of Landsat imagery. Cont. Shelf Res. 2017, 147, 144–154. [Google Scholar] [CrossRef]
  18. Zulfa, A.W.; Norizah, K.; Hamdan, O.; Faridah-Hanum, I.; Rhyma, P.P.; Fitrianto, A. Spectral signature analysis to determine mangrove species delineation structured by anthropogenic effects. Ecol. Indic. 2021, 130, 108148. [Google Scholar] [CrossRef]
  19. Peng, L.; Liu, K.; Cao, J.; Zhu, Y.; Li, F.; Liu, L. Combining GF-2 and RapidEye satellite data for mapping mangrove species using ensemble machine-learning methods. Int. J. Remote Sens. 2019, 41, 813–838. [Google Scholar] [CrossRef]
  20. Sun, Y.; Ye, M.; Jian, Z.; Ai, B.; Zhao, J.; Chen, Q. Species Classification and Carbon Stock Assessment of Mangroves in Qi’ao Island with Worldview-3 Imagery. Forests 2023, 14, 2356. [Google Scholar] [CrossRef]
  21. Wan, L.; Lin, Y.; Zhang, H.; Wang, F.; Liu, M.; Lin, H. GF-5 Hyperspectral Data for Species Mapping of Mangrove in Mai Po, Hong Kong. Remote Sens. 2020, 12, 656. [Google Scholar] [CrossRef]
  22. Prakash Hati, J.; Samanta, S.; Rani Chaube, N.; Misra, A.; Giri, S.; Pramanick, N.; Gupta, K.; Datta Majumdar, S.; Chanda, A.; Mukhopadhyay, A.; et al. Mangrove classification using airborne hyperspectral AVIRIS-NG and comparing with other spaceborne hyperspectral and multispectral data. Egypt. J. Remote Sens. Space Sci. 2021, 24, 273–281. [Google Scholar] [CrossRef]
  23. Osei Darko, P.; Kalacska, M.; Arroyo-Mora, J.P.; Fagan, M.E. Spectral Complexity of Hyperspectral Images: A New Approach for Mangrove Classification. Remote Sens. 2021, 13, 2604. [Google Scholar] [CrossRef]
  24. Jiang, Y.; Zhang, L.; Yan, M.; Qi, J.; Fu, T.; Fan, S.; Chen, B. High-Resolution Mangrove Forests Classification with Machine Learning Using Worldview and UAV Hyperspectral Data. Remote Sens. 2021, 13, 1529. [Google Scholar] [CrossRef]
  25. Jia, M.; Zhang, Y.; Wang, Z.; Song, K.; Ren, C. Mapping the distribution of mangrove species in the Core Zone of Mai Po Marshes Nature Reserve, Hong Kong, using hyperspectral data and high-resolution data. Int. J. Appl. Earth Obs. Geoinf. 2014, 33, 226–231. [Google Scholar] [CrossRef]
  26. Li, Z.; Zan, Q.; Yang, Q.; Zhu, D.; Chen, Y.; Yu, S. Remote Estimation of Mangrove Aboveground Carbon Stock at the Species Level Using a Low-Cost Unmanned Aerial Vehicle System. Remote Sens. 2019, 11, 1018. [Google Scholar] [CrossRef]
  27. Zimudzi, E.; Sanders, I.; Rollings, N.; Omlin, C.W. Remote sensing of mangroves using unmanned aerial vehicles: Current state and future directions. J. Spat. Sci. 2019, 66, 195–212. [Google Scholar] [CrossRef]
  28. Cao, J.; Leng, W.; Liu, K.; Liu, L.; He, Z.; Zhu, Y. Object-Based Mangrove Species Classification Using Unmanned Aerial Vehicle Hyperspectral Images and Digital Surface Models. Remote Sens. 2018, 10, 89. [Google Scholar] [CrossRef]
  29. Medellin, A.; Bhamri, A.; Langari, R.; Gopalswamy, S. Real-Time Semantic Segmentation using Hyperspectral Images for Mapping Unstructured and Unknown Environments. arXiv 2023, arXiv:2303.15623. [Google Scholar]
  30. Pham, T.; Yokoya, N.; Bui, D.; Yoshino, K.; Friess, D. Remote Sensing Approaches for Monitoring Mangrove Species, Structure, and Biomass: Opportunities and Challenges. Remote Sens. 2019, 11, 230. [Google Scholar] [CrossRef]
  31. Chen, R.; Zhang, R.; Zhao, C.; Wang, Z.; Jia, M. High-Resolution Mapping of Mangrove Species Height in Fujian Zhangjiangkou National Mangrove Nature Reserve Combined GF-2, GF-3, and UAV-LiDAR. Remote Sens. 2023, 15, 5645. [Google Scholar] [CrossRef]
  32. Deng, L.; Chen, B.; Yan, M.; Fu, B.; Yang, Z.; Zhang, B.; Zhang, L. Estimation of Species-Scale Canopy Chlorophyll Content in Mangroves from UAV and GF-6 Data. Forests 2023, 14, 1417. [Google Scholar] [CrossRef]
  33. Fu, B.; He, X.; Yao, H.; Liang, Y.; Deng, T.; He, H.; Fan, D.; Lan, G.; He, W. Comparison of RFE-DL and stacking ensemble learning algorithms for classifying mangrove species on UAV multispectral images. Int. J. Appl. Earth Obs. Geoinf. 2022, 112, 102890. [Google Scholar] [CrossRef]
  34. Wen, X.; Jia, M.; Li, X.; Wang, Z.; Zhong, C.; Feng, E. Identification of mangrove canopy species based on visible unmanned aerial vehicle images. J. For. Environ. 2020, 40, 486–496. [Google Scholar] [CrossRef]
  35. Zaiming, Z.; Benqing, C.; Ran, X.; Wei, F. Identification of the mangrove species using UAV hyperspectral images: A case study of Zhangjiangkou mangrove national nature reserve. Haiyang Xuebao 2021, 43, 137–145. [Google Scholar] [CrossRef]
  36. Lina, Y.; Guifeng, Z.; Zheng, W.; Mianqing, W.; Jinke, L.; Liujing, W. Mangrove forest species classification based on the UAV hyperspectral images. Bull. Surv. Mapp. 2022, 26, 26–31. [Google Scholar] [CrossRef]
  37. Liu, Y.; Li, X.; Hua, Z.; Xia, C.; Zhao, L. A Band Selection Method with Masked Convolutional Autoencoder for Hyperspectral Image. IEEE Geosci. Remote Sens. Lett. 2022, 19, 6010005. [Google Scholar] [CrossRef]
  38. Chen, H.; Miao, F.; Chen, Y.; Xiong, Y.; Chen, T. A Hyperspectral Image Classification Method Using Multifeature Vectors and Optimized KELM. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 2781–2795. [Google Scholar] [CrossRef]
  39. Behera, M.D.; Barnwal, S.; Paramanik, S.; Das, P.; Bhattyacharya, B.K.; Jagadish, B.; Roy, P.S.; Ghosh, S.M.; Behera, S.K. Species-Level Classification and Mapping of a Mangrove Forest Using Random Forest—Utilisation of AVIRIS-NG and Sentinel Data. Remote Sens. 2021, 13, 2027. [Google Scholar] [CrossRef]
  40. Heenkenda, M.; Joyce, K.; Maier, S.; Bartolo, R. Mangrove Species Identification: Comparing WorldView-2 with Aerial Photographs. Remote Sens. 2014, 6, 6064–6088. [Google Scholar] [CrossRef]
  41. Kamal, M.; Phinn, S. Hyperspectral Data for Mangrove Species Mapping: A Comparison of Pixel-Based and Object-Based Approach. Remote Sens. 2011, 3, 2222–2242. [Google Scholar] [CrossRef]
  42. Pu, R.; Landry, S. A comparative analysis of high spatial resolution IKONOS and WorldView-2 imagery for mapping urban tree species. Remote Sens. Environ. 2012, 124, 516–533. [Google Scholar] [CrossRef]
  43. Wang, D.; Wan, B.; Qiu, P.; Su, Y.; Guo, Q.; Wu, X. Artificial Mangrove Species Mapping Using Pléiades-1: An Evaluation of Pixel-Based and Object-Based Classifications with Selected Machine Learning Algorithms. Remote Sens. 2018, 10, 294. [Google Scholar] [CrossRef]
  44. Fu, B.; Zuo, P.; Liu, M.; Lan, G.; He, H.; Lao, Z.; Zhang, Y.; Fan, D.; Gao, E. Classifying vegetation communities karst wetland synergistic use of image fusion and object-based machine learning algorithm with Jilin-1 and UAV multispectral images. Ecol. Indic. 2022, 140, 108989. [Google Scholar] [CrossRef]
  45. Lin, J.; Liu, X.; Lan, W.; Huang, Z. Conservation effectiveness of Hepu Dugong dugon National Nature Reserve of Guangxi Zhuang Autonomous Region. Wetl. Sci. 2020, 18, 461–467. [Google Scholar] [CrossRef]
  46. Shichu, L. Studies on the mangrove communities in Yingluo Bay of Guangxi [China]. Acta Phytoecol. Sin. 1996, 20, 310–320. [Google Scholar]
  47. Ning, Q.Y.; Lai, T.H.; Cao, Q.X.; Mo, Z.N.; Li, Y.H.; He, B.Y. Structures and dynamics of mangrove populations in Zhenzhu Bay, Guangxi. J. Appl. Oceanogr. 2022, 41, 42–52. [Google Scholar] [CrossRef]
  48. Catalão, J.; Navarro, A.; Calvão, J. Mapping Cork Oak Mortality Using Multitemporal High-Resolution Satellite Imagery. Remote Sens. 2022, 14, 2750. [Google Scholar] [CrossRef]
  49. Ogungbuyi, M.G.; Mohammed, C.; Ara, I.; Fischer, A.M.; Harrison, M.T. Advancing Skyborne Technologies and High-Resolution Satellites for Pasture Monitoring and Improved Management: A Review. Remote Sens. 2023, 15, 4866. [Google Scholar] [CrossRef]
  50. Silva, A.G.P.; Galvão, L.S.; Ferreira Júnior, L.G.; Teles, N.M.; Mesquita, V.V.; Haddad, I. Discrimination of Degraded Pastures in the Brazilian Cerrado Using the PlanetScope SuperDove Satellite Constellation. Remote Sens. 2024, 16, 2256. [Google Scholar] [CrossRef]
  51. Souza, A.A.d.; Galvão, L.S.; Korting, T.S.; Almeida, C.A. On a Data-Driven Approach for Detecting Disturbance in the Brazilian Savannas Using Time Series of Vegetation Indices. Remote Sens. 2021, 13, 4959. [Google Scholar] [CrossRef]
  52. Huang, X.; Zhang, L. A comparative study of spatial approaches for urban mapping using hyperspectral ROSIS images over Pavia City, northern Italy. Int. J. Remote Sens. 2009, 30, 3205–3221. [Google Scholar] [CrossRef]
  53. Crabbe, R.A.; Lamb, D.W.; Edwards, C. Discriminating between C3, C4, and Mixed C3/C4 Pasture Grasses of a Grazed Landscape Using Multi-Temporal Sentinel-1a Data. Remote Sens. 2019, 11, 253. [Google Scholar] [CrossRef]
  54. Luo, G.; Chen, G.; Tian, L.; Qin, K.; Qian, S.-E. Minimum noise fraction versus principal component analysis as a preprocessing step for hyperspectral imagery denoising. Can. J. Remote Sens. 2016, 42, 106–116. [Google Scholar] [CrossRef]
  55. Saini, R. Integrating vegetation indices and spectral features for vegetation mapping from multispectral satellite imagery using AdaBoost and random forest machine learning classifiers. Geomat. Environ. Eng. 2023, 17, 57–74. [Google Scholar] [CrossRef]
  56. Pham, T.D.; Le, N.N.; Ha, N.T.; Nguyen, L.V.; Xia, J.; Yokoya, N.; To, T.T.; Trinh, H.X.; Kieu, L.Q.; Takeuchi, W. Estimating mangrove above-ground biomass using extreme gradient boosting decision trees algorithm with fused sentinel-2 and ALOS-2 PALSAR-2 data in can Gio biosphere reserve, Vietnam. Remote Sens. 2020, 12, 777. [Google Scholar] [CrossRef]
  57. Huber, F.; Yushchenko, A.; Stratmann, B.; Steinhage, V. Extreme Gradient Boosting for yield estimation compared with Deep Learning approaches. Comput. Electron. Agric. 2022, 202, 107346. [Google Scholar] [CrossRef]
  58. Guo, Q.; Zhang, J.; Guo, S.; Ye, Z.; Deng, H.; Hou, X.; Zhang, H. Urban tree classification based on object-oriented approach and random forest algorithm using unmanned aerial vehicle (UAV) multispectral imagery. Remote Sens. 2022, 14, 3885. [Google Scholar] [CrossRef]
  59. Candido, C.; Blanco, A.; Medina, J.; Gubatanga, E.; Santos, A.; Ana, R.S.; Reyes, R. Improving the consistency of multi-temporal land cover mapping of Laguna lake watershed using light gradient boosting machine (LightGBM) approach, change detection analysis, and Markov chain. Remote Sens. Appl. Soc. Environ. 2021, 23, 100565. [Google Scholar] [CrossRef]
  60. Sang, M.; Xiao, H.; Jin, Z.; He, J.; Wang, N.; Wang, W. Improved Mapping of Regional Forest Heights by Combining Denoise and LightGBM Method. Remote Sens. 2023, 15, 5436. [Google Scholar] [CrossRef]
  61. Kavzoglu, T.; Teke, A. Predictive Performances of ensemble machine learning algorithms in landslide susceptibility mapping using random forest, extreme gradient boosting (XGBoost) and natural gradient boosting (NGBoost). Arab. J. Sci. Eng. 2022, 47, 7367–7385. [Google Scholar] [CrossRef]
  62. Natras, R.; Soja, B.; Schmidt, M. Ensemble machine learning of random forest, AdaBoost and XGBoost for vertical total electron content forecasting. Remote Sens. 2022, 14, 3547. [Google Scholar] [CrossRef]
  63. Shen, Z.; Miao, J.; Wang, J.; Zhao, D.; Tang, A.; Zhen, J. Evaluating Feature Selection Methods and Machine Learning Algorithms for Mapping Mangrove Forests Using Optical and Synthetic Aperture Radar Data. Remote Sens. 2023, 15, 5621. [Google Scholar] [CrossRef]
  64. Zeng, Y.; Hao, D.; Huete, A.; Dechant, B.; Berry, J.; Chen, J.M.; Joiner, J.; Frankenberg, C.; Bond-Lamberty, B.; Ryu, Y.; et al. Optical vegetation indices for monitoring terrestrial ecosystems globally. Nat. Rev. Earth Environ. 2022, 3, 477–493. [Google Scholar] [CrossRef]
  65. Jiang, Y.F. Classification of Mangrove Species Using High-Resolution Multi-Sourse Remote Sensing Images. Master’s Thesis, Shandong Agricultural University, Tai’an, China, 2021. [Google Scholar]
  66. Xu, Y.; Zhen, J.; Jiang, X.; Wang, J. Mangrove species classification with UAV-based remote sensing data and XGBoost. Natl. Remote Sens. Bull 2021, 25, 737–752. [Google Scholar] [CrossRef]
  67. Gitelson, A.A.; Merzlyak, M.N.; Chivkunova, O.B. Optical properties and nondestructive estimation of anthocyanin content in plant leaves. Photochem. Photobiol. 2001, 74, 38–45. [Google Scholar] [CrossRef]
  68. Gitelson, A.A.; Zur, Y.; Chivkunova, O.B.; Merzlyak, M.N. Assessing carotenoid content in plant leaves with reflectance spectroscopy. Photochem. Photobiol. 2002, 75, 272–281. [Google Scholar] [CrossRef] [PubMed]
  69. Huete, A.; Didan, K.; Miura, T.; Rodriguez, E.P.; Gao, X.; Ferreira, L.G. Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sens. Environ. 2002, 83, 195–213. [Google Scholar] [CrossRef]
  70. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
  71. Gitelson, A.A.; Gritz, Y.; Merzlyak, M.N. Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves. J. Plant Physiol. 2003, 160, 271–282. [Google Scholar] [CrossRef]
  72. Sripada, R.P. Determining In-Season Nitrogen Requirements for Corn Using Aerial Color-Infrared Photography; North Carolina State University: Raleigh, NC, USA, 2005. [Google Scholar]
  73. Sripada, R.P.; Heiniger, R.W.; White, J.G.; Meijer, A.D. Aerial color infrared photography for determining early in-season nitrogen requirements in corn. Agron. J. 2006, 98, 968–977. [Google Scholar] [CrossRef]
74. Boegh, E.; Soegaard, H.; Broge, N.; Hasager, C.; Jensen, N.; Schelde, K.; Thomsen, A. Airborne multispectral data for quantifying leaf area index, nitrogen concentration, and photosynthetic efficiency in agriculture. Remote Sens. Environ. 2002, 81, 179–193.
75. Daughtry, C.S.; Walthall, C.; Kim, M.; De Colstoun, E.B.; McMurtrey III, J. Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ. 2000, 74, 229–239.
76. Datt, B. A new reflectance index for remote sensing of chlorophyll content in higher plants: Tests using Eucalyptus leaves. J. Plant Physiol. 1999, 154, 30–36.
77. Sims, D.A.; Gamon, J.A. Relationships between leaf pigment content and spectral reflectance across a wide range of species, leaf structures and developmental stages. Remote Sens. Environ. 2002, 81, 337–354.
78. Bernstein, L.S.; Jin, X.; Gregor, B.; Adler-Golden, S.M. Quick atmospheric correction code: Algorithm description and recent upgrades. Opt. Eng. 2012, 51, 111719.
79. Peñuelas, J.; Filella, I.; Gamon, J.A. Assessment of photosynthetic radiation-use efficiency with spectral reflectance. New Phytol. 1995, 131, 291–296.
80. Gamon, J.; Serrano, L.; Surfus, J. The photochemical reflectance index: An optical indicator of photosynthetic radiation use efficiency across species, functional types, and nutrient levels. Oecologia 1997, 112, 492–501.
81. Merzlyak, M.N.; Gitelson, A.A.; Chivkunova, O.B.; Rakitin, V.Y. Non-destructive optical detection of pigment changes during leaf senescence and fruit ripening. Physiol. Plant. 1999, 106, 135–141.
82. Curran, P.J.; Dungan, J.L.; Gholz, H.L. Exploring the relationship between reflectance red edge and chlorophyll content in slash pine. Tree Physiol. 1990, 7, 33–48.
83. Gamon, J.; Surfus, J. Assessing leaf pigment content and activity with a reflectometer. New Phytol. 1999, 143, 105–117.
84. Birth, G.S.; McVey, G.R. Measuring the color of growing turf with a reflectance spectrophotometer. Agron. J. 1968, 60, 640–643.
85. Peñuelas, J.; Baret, F.; Filella, I. Semi-empirical indices to assess carotenoids/chlorophyll a ratio from leaf spectral reflectance. Photosynthetica 1995, 31, 221–230.
86. Lobell, D.B.; Asner, G.P. Hyperion studies of crop stress in Mexico. In Proceedings of the 12th JPL Airborne Earth Science Workshop, Pasadena, CA, USA, 4–8 March 1996.
87. Haboudane, D.; Miller, J.R.; Pattey, E.; Zarco-Tejada, P.J.; Strachan, I.B. Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sens. Environ. 2004, 90, 337–352.
88. Bannari, A.; Asalhi, H.; Teillet, P.M. Transformed difference vegetation index (TDVI) for vegetation cover mapping. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium, Toronto, ON, Canada, 24–28 June 2002; pp. 3053–3055.
89. Hunt, E.R., Jr.; Daughtry, C.; Eitel, J.U.; Long, D.S. Remote sensing leaf chlorophyll content using a visible band index. Agron. J. 2011, 103, 1090–1099.
90. Gitelson, A.; Stark, R.; Grits, U.; Rundquist, D.; Kaufman, Y.; Derry, D. Vegetation and soil lines in visible spectral space: A concept and technique for remote estimation of vegetation fraction. Int. J. Remote Sens. 2002, 23, 2537–2562.
91. Vogelmann, J.; Rock, B.; Moss, D. Red edge spectral measurements from sugar maple leaves. Int. J. Remote Sens. 1993, 14, 1563–1575.
92. Peñuelas, J.; Filella, I.; Biel, C.; Serrano, L.; Save, R. The reflectance at the 950–970 nm region as an indicator of plant water status. Int. J. Remote Sens. 1993, 14, 1887–1905.
Figure 1. Study area and UAV-based visible image ((A): Yingluo Bay, (B): Pearl Bay).
Figure 2. Workflow diagram illustrating the methodology of this study.
Figure 3. Comparison of user’s and producer’s accuracies for mangrove species classification obtained by four machine learning models based on multispectral and hyperspectral images in Yingluo Bay.
Figure 4. Comparison of user’s and producer’s accuracies for mangrove species classification obtained by the LightGBM model based on multispectral and hyperspectral images in Pearl Bay.
Figure 5. The mangrove species classification maps using four learning models (LightGBM, RF, XGBoost, and AdaBoost) based on the UAV multispectral image (a–d) and hyperspectral image (e–h), respectively, in Yingluo Bay.
Figure 6. The UAV visible image covering Yingluo Bay and three subsets (A–C) of the UAV multispectral and hyperspectral image classification results based on the LightGBM learning model.
Figure 7. The mangrove species classification maps using the LightGBM learning model based on UAV multispectral image (a) and hyperspectral image (b) in Pearl Bay.
Figure 8. The UAV visible image covering Pearl Bay and three subsets (A–C) of the UAV multispectral and hyperspectral image classification results using the LightGBM learning model.
Table 1. UAV equipment parameters and image data acquisition information.
| Parameters | Visible Image | Multispectral Image | Hyperspectral Image |
| --- | --- | --- | --- |
| UAV model | DJI P4 RTK | DJI P4M RTK | DJI M300 RTK |
| Sensor | RGB camera | Multispectral camera | ULTRIS X20 PLUS |
| Acquisition time | 7 November 2022 / 24 April 2023 | 20 June 2023 / 24 April 2023 | 9 November 2022 / 25 July 2023 |
| Flight altitude (m) | 50 | 80/100 | 120/150 |
| FOV/Maximum field angle (°) | 84 | 62.7 | 35 |
| Band range (nm) | 400~700 | 450~840 | 350~1000 |
| Spectral resolution (nm) | – | – | 4 |
| Spatial resolution (m) | 0.02 | 0.03/0.05 | 0.03/0.06 |
| Number of bands | 3 | 5 | 164 |
Table 2. Field survey photos of mangrove plant species and other cover types in the study area and establishment of interpretation signs from UAV visible image.
| Species and Cover Type | Interpretation Signs |
| --- | --- |
| RS 1 | Dark green in color, with a regular texture resembling dense points, occurring in continuous patches. |
| BG 2 | Light green in color, irregular in texture, with blurred boundaries, clustered in patches, and dispersed distribution. |
| AM 3 | Light and tender green in color, with a rough texture, predominantly distributed around river channels. |
| AC 4 | Yellow-green in color, featuring a relatively smooth texture with frequent small gaps, primarily distributed around river channels. |
| KC 5 | Green in color, appearing clustered with a uniform hue. |
| EA 6 | Bright green in color, characterized by rough texture, mostly growing along the coastline in scattered distribution. |
| HT 7 | Yellow-green in color, displaying a mixed hue, with visibly rough texture, mostly growing along the coastline in a patchy distribution. |
| SA 8 | Light green in color, uniform in hue, with smooth texture, clustered in patches, predominantly located near the seaside. |
| WB 9 | Mainly gray and light green in color, with a smooth and delicate texture. |
| MF 10 | Predominantly gray-brown, with a smooth texture and uniform tone. |
| RD 11 | Bright white and light gray in color, displaying a striped distribution. |

1 Rhizophora stylosa, 2 Bruguiera gymnorrhiza, 3 Avicennia marina, 4 Aegiceras corniculatum, 5 Kandelia candel, 6 Excoecaria agallocha, 7 Hibiscus tiliaceus, 8 Spartina alterniflora, 9 water bodies, 10 mudflats, and 11 roads.
Table 3. The optimal feature sets derived from the multispectral and hyperspectral images at the two study sites.

| Study Sites | Data Sources | Dimension | Feature Types | Optimal Features |
| --- | --- | --- | --- | --- |
| Yingluo Bay | Multi | 17 | Vegetation indices | ARI1, GEMI, LAI, MCARI, NDMI, SRI, SGI, TDVI, TGI |
| | | | Texture features 1 | m_5_0_Con, m_7_180_Con, m_7_270_Mea, m_7_270_Con, m_7_45_Con, m_7_45_ASM |
| | | | Spectral features | Red, RedEdge |
| | Hyper | 25 | Vegetation indices | CRI2, SRI, TGI, MRENDVI, VARI, MRESR, SIPI, NDMI, RGRI, PSRI, ARI1, ARI2, REPI, PRI, WBI |
| | | | Texture features 2 | h_3_90_Mea |
| | | | Spectral features 3 | h_band52, h_band89, h_band92, h_band98, h_band147, h_band150, h_band153, h_band155, h_band157 |
| Pearl Bay | Multi | 12 | Vegetation indices | ARI1, GARI, MCARI, NDMI, RGRI, TGI |
| | | | Texture features 4 | m_5_45_Mea, m_7_0_Con, m_7_180_Con, m_7_225_Con, m_7_270_Con |
| | | | Spectral features | Blue |
| | Hyper | 16 | Vegetation indices | ARI1, ARI2, CRI2, GLI, MRESR, NDMI, SRI, SIPI, TCARI, VREI1, VREI2 |
| | | | Texture features 5 | h_5_180_Var |
| | | | Spectral features 6 | h_band52, h_band91, h_band101, h_band162 |

1,4 The prefix “m” denotes multispectral data; the numbers 3, 5, and 7 denote the size of the sliding window, and the angles 0, 45, 90, 135, 180, 225, and 270 denote the direction of the sliding-window movement. Con = Contrast, Mea = Mean, ASM = Angular Second Moment, Var = Variance. 2,5 The prefix “h” denotes hyperspectral data; the remaining symbols are interpreted as for the multispectral data. 3,6 “band + number” denotes the spectral reflectance value of the corresponding band in the hyperspectral data. Abbreviations of the vegetation indices are listed in Table A1 of Appendix A.
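The texture features above follow a grey-level co-occurrence matrix (GLCM) naming scheme: data source, sliding-window size, offset angle, and statistic. As an illustration of how one such feature is obtained, here is a minimal NumPy sketch of GLCM contrast for a single quantized window and offset; the function name, toy window, and grey-level quantization are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def glcm_contrast(window, dy, dx):
    """GLCM contrast for a quantized grey-level window and one pixel
    offset (dy, dx), e.g. (0, 1) for the 0-degree direction."""
    levels = int(window.max()) + 1
    glcm = np.zeros((levels, levels), dtype=float)
    h, w = window.shape
    for i in range(h):
        for j in range(w):
            i2, j2 = i + dy, j + dx
            if 0 <= i2 < h and 0 <= j2 < w:
                glcm[window[i, j], window[i2, j2]] += 1.0
    glcm /= glcm.sum()  # normalize counts to co-occurrence probabilities
    a, b = np.indices(glcm.shape)
    return float(((a - b) ** 2 * glcm).sum())  # contrast statistic

# A feature named like "m_7_0_Con" would then be the contrast of a
# 7x7 multispectral-band window at the 0-degree offset:
window = np.indices((7, 7)).sum(axis=0) % 2   # toy checkerboard window
feature = {"m_7_0_Con": glcm_contrast(window, 0, 1)}  # -> 1.0 here
```

In practice the window is slid over the whole image band, yielding one texture layer per (window size, angle, statistic) combination.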
Table 4. The mangrove species classification accuracy obtained from four machine learning models based on UAV multi- and hyper-spectral images in Yingluo Bay.
| Models | Multispectral OA (%) | Multispectral Kappa | Hyperspectral OA (%) | Hyperspectral Kappa |
| --- | --- | --- | --- | --- |
| AdaBoost | 63.05 | 0.56 | 82.96 | 0.79 |
| XGBoost | 80.37 | 0.77 | 94.26 | 0.93 |
| RF | 80.50 | 0.77 | 95.73 | 0.95 |
| LightGBM | 80.96 | 0.78 | 97.15 | 0.97 |
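The OA and Kappa values reported above follow the standard definitions over a validation confusion matrix: OA is the proportion of diagonal (correctly classified) samples, and Cohen's kappa discounts agreement expected by chance. A minimal sketch, where the function name and toy two-class matrix are illustrative rather than from the paper:

```python
import numpy as np

def oa_and_kappa(cm):
    """Overall accuracy (OA) and Cohen's kappa from a confusion
    matrix with reference classes as rows, predictions as columns."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    oa = np.trace(cm) / n                                # observed agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2  # chance agreement
    return oa, (oa - pe) / (1.0 - pe)

# Toy example: 85 of 100 validation samples lie on the diagonal.
oa, kappa = oa_and_kappa([[50, 10], [5, 35]])  # oa = 0.85
```

With many classes and a balanced validation set, Kappa tracks OA closely, which is why the table shows Kappa ≈ OA/100 for all four models.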
Share and Cite

Yang, Y.; Meng, Z.; Zu, J.; Cai, W.; Wang, J.; Su, H.; Yang, J. Fine-Scale Mangrove Species Classification Based on UAV Multispectral and Hyperspectral Remote Sensing Using Machine Learning. Remote Sens. 2024, 16, 3093. https://doi.org/10.3390/rs16163093
