Article

Spectral Characterization of Nine Urban Tree Species in Southern Wisconsin

1 Department of Geography, Geology, and Environmental Science, University of Wisconsin–Whitewater, Whitewater, WI 53190, USA
2 Wisconsin Emergency Management, Madison, WI 53704, USA
3 Fehr Graham Engineering & Environmental, Rockford, IL 61107, USA
* Author to whom correspondence should be addressed.
Remote Sens. 2026, 18(1), 99; https://doi.org/10.3390/rs18010099
Submission received: 18 November 2025 / Revised: 14 December 2025 / Accepted: 19 December 2025 / Published: 27 December 2025
(This article belongs to the Section Remote Sensing in Agriculture and Vegetation)

Highlights

What are the main findings?
  • Incorporating first derivatives of hyperspectral data and vegetation indices in a random forest model achieved the highest predictive performance (80.4%).
  • The red-edge and shortwave infrared 1 (SWIR 1) regions provided the most influential variables for the random forest classification.
What are the implications of the main findings?
  • While hyperspectral reflectance features aid species discrimination, they do not capture all variability, and adding non-spectral variables may improve accuracy.
  • Targeted reductions—such as excluding spectral reflectance when first derivatives and vegetation indices already capture key variation—may improve accuracy and reduce computational cost.

Abstract

Urban trees provide essential environmental, health, social, and economic benefits. Consequently, researchers and stakeholders devote considerable effort to characterizing, mapping, and monitoring urban tree species. Traditional identification methods that rely on field surveys are labor-intensive and time-consuming. This study evaluated the potential of field hyperspectral spectroscopy to classify nine common urban tree species at the leaf level. Seven random forest classifiers, each using different combinations of spectral features, were compared for classification accuracy. The model that incorporated both first derivatives of spectral reflectance and vegetation indices achieved the highest overall accuracy (80.4%), whereas the model combining spectral reflectance and vegetation indices had the lowest predictive performance (70.1%). The most influential predictors were spectral bands and first derivatives in the red-edge and SWIR 1 regions, together with the vegetation indices Red-edge Vegetation Stress Index (RVSI), Plant Senescence Reflectance Index (PSRI), and Blue Ratio (BR). These results support the use of hyperspectral remote sensing for identifying and classifying urban tree species.

1. Introduction

As urbanization accelerates, with an estimated 68% of the global population projected to live in cities by 2050, environmental degradation is expected to intensify. Green infrastructure, such as trees, can mitigate many of the adverse effects of urbanization [1]. Urban trees provide a wide range of ecosystem, health, social, and economic benefits. Among their ecosystem services, they improve air quality by removing pollutants such as ozone, sulfur dioxide, nitrogen dioxide, smog, and particulate matter [2]. They also capture and store carbon dioxide, helping to mitigate greenhouse gas concentrations [3]. Trees filter, absorb, and evapotranspire stormwater, reducing surface water flows, lowering flood risk, easing the burden on sewer systems, and improving water quality [4]. Additionally, urban trees provide habitats for wildlife and help mitigate noise pollution. By reducing solar radiation and modifying local microclimates, they alleviate the urban heat island effect, which in turn decreases annual energy consumption [5].
Urban trees also provide numerous health and social benefits. By reducing air pollutants, they are associated with lower incidences of childhood asthma and a reduced prevalence of lung cancer [6,7]. Although trees release pollen that can trigger allergies in sensitive individuals and emit volatile organic compounds, these drawbacks are outweighed by their broader environmental and public health benefits. The presence of urban trees has been linked to fewer heat-related ambulance calls during extreme heat events [8]. Exposure to street trees has also been shown to enhance attention and perceived restorativeness [9]. Greater tree canopy cover is associated with lower levels of depression, anxiety, and stress [10], while even modest increases in canopy within 50 m of a residence can reduce the likelihood of small-for-gestational-age births [11]. Economically, urban trees have a positive influence on property and land values, reduce public expenditure on air pollution mitigation and stormwater infrastructure, and contribute significant annual savings in heating and cooling costs [12].
Assessing and monitoring urban tree resources is essential for sustaining community well-being. Traditional tree surveys are often time-consuming and labor-intensive. Remote sensing technologies offer an efficient alternative, enabling large-scale data collection in a short period of time. In recent years, various instruments and classification techniques have been employed to classify urban tree species with varying degrees of success. For multispectral data, for example, Pu & Landry [13] evaluated different imagery sources and classification algorithms to discriminate among seven urban tree species. They reported that using WorldView-2 imagery increased classification accuracy by approximately 16–18% compared to IKONOS, likely due to its superior spectral and spatial resolution. Accuracy also varied according to the classification algorithm: WorldView-2 imagery with linear discriminant analysis (LDA) achieved 56%, whereas classification and regression trees (CART) achieved 54%. In contrast, Immitzer et al. [14] reported that coupling WorldView-2 data with a random forest classifier improved species discrimination accuracy to 82% for ten urban tree species.
Hyperspectral data are characterized by numerous narrow spectral bands (about 5–10 nm wide) spanning the ultraviolet, visible, and infrared regions of the electromagnetic spectrum. Hyperspectral sensors can be mounted on various platforms, including satellites, aircraft, unmanned aerial vehicles (UAVs), and field spectroradiometers. Most tree classification studies have utilized hyperspectral sensors mounted on above-ground platforms (e.g., satellites, airborne platforms, or UAVs), allowing for the observation of the entire tree crown from above. However, relatively few studies have examined hyperspectral signatures at the leaf level.
Hyperspectral data generally yield higher classification accuracy than multispectral data. Hyperspectral satellites such as PRISMA (PRecursore IperSpettrale della Missione Applicativa) and EO-1 (Earth Observing-1) Hyperion have been successfully employed to discriminate among forest types. Compared with the Sentinel-2 Multispectral Instrument (MSI), PRISMA demonstrated greater effectiveness in distinguishing coniferous from broadleaf forests [15]. Similarly, EO-1 Hyperion exhibited superior capability to discriminate between these forest types compared with Landsat [16].
Airborne hyperspectral data, which combine numerous spectral bands with high spatial resolution, have proven highly effective for classifying urban tree species. The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and its successor, AVIRIS-Next Generation (AVIRIS-NG), are two prominent examples of airborne hyperspectral instruments. Hati et al. [17] reported that AVIRIS-NG outperformed the EO-1 Hyperion sensor in classifying mangrove species, likely due to its higher spatial resolution. Similarly, Paramanik et al. [18] achieved an 88% accuracy in discriminating among three mangrove species using AVIRIS-NG data. Alonzo et al. [19] used AVIRIS data to classify 15 common urban trees in California with an accuracy of 86%.
Hyperspectral UAV-borne data have also demonstrated strong potential for tree classification. Compared with airborne hyperspectral sensors, UAV-borne sensors offer lower operational costs and customizable spectral configurations [20]. La Rosa et al. [21] mapped tree species in dense tropical forests using a CMOSIS CMV400 sensor camera, achieving an accuracy of approximately 88%. Similarly, Abbas et al. [22] classified 19 tree species in Hong Kong using a hyperspectral camera coupled with a deep neural network algorithm, attaining overall accuracies ranging from 85% to 96% depending on the season.
The fusion of high spatial resolution hyperspectral imagery with LiDAR data has been widely explored, as the inclusion of LiDAR information typically enhances classification accuracy. For example, when hyperspectral data were used in a canonical discriminant analysis to classify 29 tree species, the overall accuracy reached 79%; however, integrating LiDAR data improved the overall accuracy by an additional 4% [23]. Similarly, when visible and near-infrared hyperspectral data were applied to classify seven tree species, the overall accuracy was 57% but increased by 19% with the inclusion of LiDAR [24]. Likewise, when hyperspectral-derived spectral indices were used in a random forest classifier to discriminate 15 urban tree species, the overall accuracy improved from 51% to 70% upon the addition of LiDAR data [25].
A variety of algorithms have been employed to classify tree species from hyperspectral data. Broadly, these approaches can be grouped into traditional machine learning methods and deep learning techniques [26]. Commonly used machine learning algorithms include canonical discriminant analysis (CDA), support vector machine (SVM), K-nearest neighbor (KNN), and random forest (RF). Zhong et al. [27] reported that RF and SVM are the two most widely used classifiers among traditional machine learning approaches for tree species discrimination, with SVM being the more computationally intensive. Deep learning, often considered a subset of machine learning, offers the advantage of automatically learning complex spatial and spectral patterns from hyperspectral data. Overall, deep learning algorithms tend to yield higher classification accuracies.
CDA is effective in separating highly overlapping classes, as is the case with tree species [23]. It operates by maximizing the between-group variance to achieve class separation. One of its advantages is the ability to handle situations where the number of independent variables exceeds the number of observations, as is often the case with hyperspectral data. SVM is a supervised learning algorithm that performs robustly with high-dimensional data, such as hyperspectral imagery, and typically yields high classification accuracies [28]. The KNN algorithm assigns an object to a class based on the classes of its k nearest neighbors. The choice of k is critical, as values that are too small or too large can lead to misclassification [26]. In the RF algorithm, multiple decision trees are generated, and class assignment is determined through a majority voting system. RF offers several advantages, including its ability to rank variable importance and its strong performance with high-dimensional datasets [29].
Several comparative studies have evaluated the effectiveness of different classification methods, yielding mixed results. Xie et al. [29] compared six classification algorithms, including RF, KNN, and SVM, and found that no single algorithm consistently outperformed the others across all tree species. Instead, specific algorithms were better suited for particular groups of species. Similarly, Nevalainen et al. [30] compared five classification algorithms, including RF and KNN, and reported that RF achieved the highest overall accuracy (95%). In another study, Sabat-Tomala et al. [31] compared SVM and RF algorithms for discriminating invasive species and found that both methods achieved similarly high accuracies. However, SVM performed better for some species (e.g., blackberry and wood small-reed), whereas RF was more effective for others (e.g., goldenrod and background vegetation).
Despite extensive research on multispectral and hyperspectral classification of tree species, most studies have focused on canopy-level data from airborne or satellite platforms, and relatively few have characterized urban species at the leaf level, where many biochemical and structural signals originate. As a result, the spectral behavior of common urban trees in regions such as southern Wisconsin remains poorly documented. This study addresses that gap by developing a localized leaf-level spectral library for nine widely planted urban species using full-range field spectroscopy. In addition, we systematically evaluated three major hyperspectral feature groups—spectral reflectance (SR), first derivatives (FD), and vegetation indices (VI)—both individually and in combination. Our results show that a reduced feature set (FD–VI) can outperform the full feature set (SR–FD–VI), suggesting that more features do not necessarily improve classification accuracy and highlighting the value of targeted feature selection. The analysis also revealed that variables in the red-edge and SWIR 1 regions were consistently the most important for species discrimination, underscoring the role of physiological and structural traits that influence reflectance in these portions of the spectrum. Collectively, these contributions provide baseline spectral information that can support future modeling, species mapping, and validation efforts in regional and urban contexts and offer guidance on how specific spectral feature types can be prioritized to improve model performance.
In this paper, the term ‘urban trees’ encompasses both individual trees and forest stands, whether publicly or privately owned, located along streets, in yards, or within protected areas of cities and towns. It includes all woody perennial plants with a single trunk and a distinct crown. The aim of this study is to identify specific spectral features that can aid in discriminating nine common urban tree species at the leaf level in southern Wisconsin, United States. The specific objectives are: (1) to classify the tree species using the random forest algorithm, (2) to identify the most important features for species discrimination, and (3) to validate the accuracy of the resulting models.

2. Materials and Methods

2.1. Study Area

Seven cities located in the eastern ridges and lowlands of the state of Wisconsin were selected for this study (Figure 1). The cities of Fitchburg, Verona, McFarland, and Stoughton are part of the Madison Metropolitan Statistical Area in Dane County. The city of Whitewater spans Walworth and Jefferson counties, while the cities of Waukesha and Oconomowoc are situated in Waukesha County.
Among these cities, Waukesha has the highest population density, with 1095 people per square kilometer, whereas Fitchburg has the lowest, at 324 people per square kilometer [32]. July is the hottest month in this region, with an average high temperature of 27.4 °C and a record maximum of 43 °C. June is typically the wettest month, receiving an average precipitation of 124 mm [33]. The landscape is a mix of urban, suburban, and peri-urban areas, featuring residential neighborhoods, commercial districts, parks, and riparian corridors. Urban tree cover varies across cities, ranging from densely planted street trees in older neighborhoods to larger forested stands in parks and protected areas.

2.2. Field Data Collection and Preprocessing

Field data were collected during the peak of the growing season, from June to August. Nine of the most common deciduous tree species in southern Wisconsin were identified using the Wisconsin Community Tree Map, a comprehensive application that consolidates urban tree inventories from over 170 organizations. A total of 661 trees, distributed across 33 sampling sites in the seven cities, were surveyed (Table 1). All sample trees were in public areas, including parks and sidewalks. Tree species were confirmed in the field using the PlantNet app and a taxonomic guide, and only mature, healthy specimens were included. Field measurements consisted of diameter at breast height (DBH) and spectral signatures from five leaves per tree.
Spectral data were collected on-site with the HR-1024 spectroradiometer coupled with the Leaf Clip and Reflectance Probe (LC-RP Pro), both from Spectra Vista Corporation (Poughkeepsie, NY, USA). The HR-1024 is a field-portable instrument that measures spectral reflectance from 342 to 2516 nm, producing 974 spectral bands with widths ranging from 1 to 4 nm. The LC-RP Pro provides a controlled source of artificial illumination, allowing spectral measurements under any lighting condition.
Using the SVC HR-1024 PC Data Acquisition Software (version 1.14.0), the five leaf spectra collected per tree were averaged to account for natural variability among leaves of the same tree. The averaged spectra were resampled to a uniform 2 nm interval over the range of 400 to 2400 nm to standardize the bandwidth. Resampled bands between 950 and 1000 nm were removed to minimize noise, resulting in a final spectral signature for each tree composed of 977 bands, each 2 nm wide, covering 400 to 950 nm and 1000 to 2400 nm. For simplicity, these processed spectral signatures are hereafter referred to as spectral reflectance bands.
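As an illustrative sketch of the resampling and band-removal steps (the actual preprocessing was performed in the SVC acquisition software; the function name and the choice of linear interpolation are assumptions):

```python
import numpy as np

def resample_spectrum(wavelengths, reflectance):
    """Resample a leaf spectrum onto a uniform 2 nm grid (400-2400 nm)
    and drop the noisy 952-998 nm bands, leaving 977 bands covering
    400-950 nm and 1000-2400 nm, as described in Section 2.2."""
    grid = np.arange(400, 2402, 2)                 # 2 nm spacing, 1001 bands
    resampled = np.interp(grid, wavelengths, reflectance)
    keep = (grid <= 950) | (grid >= 1000)          # remove the noisy interval
    return grid[keep], resampled[keep]
```

With 2 nm spacing, 400–950 nm contributes 276 bands and 1000–2400 nm contributes 701, giving the 977 bands reported above.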

2.3. Spectral Features Extraction and Selection

A total of 1988 spectral features were extracted as predictor variables for the random forest model. These included the 977 spectral reflectance bands obtained for each surveyed tree, along with two additional feature groups. The first group consisted of 976 features corresponding to the first derivatives of the spectral reflectance values, calculated using the finite backward divided difference (BDD) equation (Equation (1)):
ρ′ᵢ = (ρᵢ − ρᵢ₋₁) / (λᵢ − λᵢ₋₁)    (1)
where λᵢ is the i-th wavelength, ρᵢ is the reflectance at λᵢ, and ρ′ᵢ is the first derivative of ρᵢ. First derivatives enhance spectral feature discrimination by emphasizing subtle changes in reflectance across wavelengths. The second group comprised 35 spectral indices derived from combinations of bands in the visible and near-infrared regions to ensure comprehensive coverage of physiologically meaningful spectral features (Table 2). The selection of these vegetation indices was informed by prior hyperspectral and plant-physiology literature; they capture key vegetation traits, including chlorophyll and carotenoid content, photosynthetic activity, canopy structure, stress- or senescence-related spectral responses, and leaf pigmentation.
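The backward divided difference of Equation (1) can be sketched as follows (an illustrative Python version, not the authors' implementation); from 977 input bands it yields the 976 derivative features described above:

```python
import numpy as np

def first_derivative(wavelengths, reflectance):
    """Backward divided difference: rho'_i = (rho_i - rho_{i-1}) /
    (lambda_i - lambda_{i-1}). Returns one fewer value than the input,
    since no derivative is defined at the first band."""
    return np.diff(reflectance) / np.diff(wavelengths)
```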

2.4. Random Forest and Accuracy Assessment

Random forest (RF) is a machine learning algorithm for classification and regression, initially developed by Breiman [58]. The method constructs an ensemble of decision trees by combining bootstrap aggregation (bagging) with random feature selection. Bagging involves sampling with replacement from the training dataset, and each sample is used to build a decision tree in the forest. During the construction of each tree, a random subset of features, defined by a model parameter, is considered at each node, and the feature that minimizes Gini impurity is chosen to perform the split. Out-of-bag (OOB) samples are used to internally estimate model accuracy and generalization error. Finally, each tree in the ensemble casts one vote, and the class label receiving the majority of votes is assigned as the final prediction.
The RF algorithm offers many advantages. It ranks variables according to their relative importance, enabling users to identify predictor variables that contribute little to model performance. The random feature selection strategy reduces correlation among trees and improves computational efficiency compared with other ensemble methods. RF is also robust to noise and outliers, and its predictive accuracy is often comparable to, or higher than, that of algorithms such as AdaBoost. Furthermore, it performs well with high-dimensional datasets and can effectively manage multicollinearity, making it particularly suitable for applications involving hyperspectral data.
Seven iterations of the RF model were performed using different combinations of features (Table 3). For each iteration, the dataset was randomly divided into training (70%) and validation (30%) subsets using stratified sampling by tree species. Two parameters were tuned in the RF models: ntree and mtry. The ntree parameter defines the number of trees grown in the forest and was initially set to 1000, higher than the default value of 500, to enhance model stability. The mtry parameter defines the number of features randomly selected as candidates at each split, with a default value of sqrt(p), where p represents the total number of features.
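The stratified 70/30 split by species can be sketched as follows (illustrative Python; the study was conducted in R, and the function name is an assumption):

```python
import numpy as np

def stratified_split(labels, train_frac=0.7, seed=42):
    """Stratified train/validation split: sample train_frac of the
    indices within each class so species proportions are preserved
    in both subsets, as in the design of Section 2.4."""
    rng = np.random.default_rng(seed)
    train_idx, valid_idx = [], []
    for cls in np.unique(labels):
        idx = np.flatnonzero(labels == cls)
        rng.shuffle(idx)
        n_train = int(round(train_frac * len(idx)))
        train_idx.extend(idx[:n_train])
        valid_idx.extend(idx[n_train:])
    return np.array(train_idx), np.array(valid_idx)
```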
The ntree parameter was optimized by examining the OOB error across all the trees grown during model construction. The minimum OOB error was identified, and all trees with an OOB error within 2% of this minimum were considered part of the stability plateau. The last tree within this range, plus a buffer of five trees, was selected as the optimized ntree value to ensure stability and computational efficiency.
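A minimal sketch of this ntree selection rule, assuming "within 2% of the minimum" is read as an absolute OOB-error margin (a relative tolerance would be an equally plausible reading):

```python
import numpy as np

def choose_ntree(oob_error, tol=0.02, buffer=5):
    """Pick ntree from a per-tree OOB error curve: find the minimum
    error, treat every tree within `tol` of it as part of the
    stability plateau, and return the last plateau tree plus a
    small buffer (trees are counted from 1)."""
    oob_error = np.asarray(oob_error, dtype=float)
    plateau = np.flatnonzero(oob_error <= oob_error.min() + tol)
    return int(plateau[-1] + 1 + buffer)
```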
The mtry parameter was then tuned using the tuneRF function in the randomForest package in R. Starting from the default value of sqrt(p), nearby mtry values were iteratively evaluated using a step factor of 1.2. At each step, a forest was fitted using the optimized ntree, and the OOB error was computed. The search continued until improvements in OOB error fell below 1% and the mtry value yielding the lowest OOB error was selected. This two-stage approach provided data-driven hyperparameter settings tailored to each model configuration. The optimized settings slightly improved the predictive performance of the models compared with the default configurations.
Variable importance was evaluated using the mean decrease in accuracy (MDA), which is calculated by permuting each predictor in the OOB samples and measuring the resulting reduction in classification accuracy. For each variable, the OOB accuracy obtained with the original data is first calculated. The values of that variable are then randomly permuted while all other predictors remain unchanged, and the OOB accuracy is recalculated. The difference between the original and permuted accuracies represents the MDA score for that variable. Variables associated with larger decreases in accuracy are interpreted as more important for model prediction [58]. The MDA plot displays these scores, ranked from the most to the least influential predictors.
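The permutation logic behind MDA can be sketched as follows; note this simplified Python version permutes predictors on a single held-out set, whereas Breiman's MDA permutes within each tree's OOB samples:

```python
import numpy as np

def mean_decrease_accuracy(model, X, y, rng=None):
    """For each predictor, shuffle its values while holding the others
    fixed and record the drop in accuracy relative to the unpermuted
    baseline. Larger drops indicate more important variables."""
    rng = rng or np.random.default_rng(0)
    base = np.mean(model.predict(X) == y)
    scores = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        X_perm = X.copy()
        rng.shuffle(X_perm[:, j])          # permute predictor j in place
        scores[j] = base - np.mean(model.predict(X_perm) == y)
    return scores
```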
The analysis was performed in RStudio (v 2022.07.1) using the ggplot2, cowplot, caret, and randomForest libraries. The predictive performance of each model was evaluated on the validation dataset using a confusion matrix, from which the user’s, producer’s, and overall accuracies were computed.
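The accuracy measures derived from the confusion matrix can be sketched as follows (rows taken as reference labels and columns as predictions, an assumption about orientation):

```python
import numpy as np

def accuracies(cm):
    """Overall accuracy (trace over total), producer's accuracy
    (per-class recall: diagonal over row sums), and user's accuracy
    (per-class precision: diagonal over column sums)."""
    cm = np.asarray(cm, dtype=float)
    diag = np.diag(cm)
    return diag.sum() / cm.sum(), diag / cm.sum(axis=1), diag / cm.sum(axis=0)
```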

3. Results

3.1. Spectral Reflectance

The spectral signatures of the nine tree species followed the typical pattern for vegetation (Figure 2). There was low reflectance in the visible portion of the spectrum (400–700 nm), a sharp increase in reflectance in the red-edge region, high reflectance in the near-infrared (NIR) portion of the spectrum, and then a decrease in reflectance in the shortwave infrared (SWIR) region (1300–2400 nm). There are two dips in the SWIR at around 1400 and 1900 nm, which correspond to the water absorption bands.
American linden, littleleaf linden, and Callery pear had the lowest peak in the green portion of the spectrum, whereas red maple and common hackberry had the most pronounced peaks. In the 700–900 nm region, Japanese lilac reflected between 50% and 60% of the radiation, while American linden, littleleaf linden, Norway maple, and common hackberry reflected between 40% and 50%. Callery pear had a deeper water absorption trough around 1400 nm compared to the other species. Kentucky coffeetree had a wider range of values in the NIR compared to the narrower range observed in the other species. This may indicate a higher within-species variability.

3.2. Vegetation Indices

Vegetation indices that evaluate the difference between the near-infrared and the visible portion of the spectrum, such as the normalized difference vegetation index (NDVI), tend to indicate photosynthetic activity and vegetation health. Many indices fall into this category and, depending on the spectral bands used, can also assist in discriminating among species. The distribution of vegetation indices in the boxplot shows occasional outliers, which are interpreted as natural biological variability among individual trees, such as differences in pigment concentration, leaf structure, or transient physiological stress, rather than measurement error (Figure 3). This interpretation is supported by the fact that each reflectance spectrum represents the average of five leaves per tree, which minimizes leaf-level differences, and that the measurements were collected under controlled illumination.
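As a minimal example of the NIR-visible contrast these indices exploit, NDVI can be computed directly from band reflectances:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: the normalized
    contrast between near-infrared and red reflectance."""
    return (nir - red) / (nir + red)
```

Healthy foliage, with high NIR and low red reflectance, yields values approaching 1.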
The Blue Ratio (BR) index computes and multiplies the ratios of green, red, red-edge, and near-infrared reflectance to blue reflectance. Kentucky coffeetree, American linden, littleleaf linden, and red maple obtained particularly low values compared to the rest of the species, though there were some outliers (Figure 3a). Japanese lilac had the highest 2-band Enhanced Vegetation Index (EVI2) values, while common hackberry had the lowest (Figure 3b). EVI2 overcomes limitations of NDVI, in that it is less prone to saturation and corrects for atmospheric effects.
Common hackberry, American linden, littleleaf linden, and Norway maple had the lowest Double Difference Vegetation Index (DD) values compared to the other species (Figure 3c). A similar pattern was observed in the Range graph (Figure 3f), where the same species held the lowest values. DD compares the near-infrared to the visible region. High values indicate very strong near-infrared reflectance and relatively low red reflectance, which are associated with high photosynthetic activity and a dense canopy. Kentucky coffeetree, Japanese lilac, and Callery pear had the lowest values in the Red-edge Vegetation Stress Index (RVSI) (Figure 3d). RVSI identifies plant stress trends by investigating subtle changes in the red-edge region. Negative RVSI values indicate healthy or unstressed vegetation.
The Plant Senescence Reflectance Index (PSRI) is a proxy of fruit ripening and leaf senescence. Across all species, PSRI values were centered around zero, with Kentucky coffeetree exhibiting the lowest median values (Figure 3e). This pattern indicates low levels of senescence, which is consistent with the timing of sample collection during the active growing season.

3.3. Random Forest Accuracy

After tuning the seven random forest models, the overall accuracy ranged between 70.1% and 80.4% (Figure 4). The models involving the first derivative (i.e., FD, SR-FD-VI, SR-FD, and FD-VI) all had an overall accuracy above 77.5%. This suggests that incorporating first derivative features enhances classification performance. On the other hand, spectral reflectance alone or combined with the vegetation indices yielded the lowest accuracies, at 70.6% and 70.1%, respectively.
The first model, which used only spectral reflectance as predictors, correctly labeled most instances of Japanese lilac and Callery pear, with producer’s accuracies of 95.8% and 81%, respectively (Table 4). The classified labels for Kentucky coffeetree and Japanese lilac were also very reliable, with user’s accuracies of 85.7% and 82.1%, respectively. Although nearly half of the red maple reference samples were misclassified (PA 52.4%), a very high percentage of samples labeled as red maple were correct (UA 91.7%). In this model, five out of the nine species—namely common hackberry, American linden, littleleaf linden, Norway maple, and red maple—had either a producer’s or user’s accuracy below 65%, indicating less satisfactory classification performance for these classes.
The second model, which used only the first derivative of spectral reflectance as predictor variables, correctly labeled all actual instances of red maple (PA = 100%) and most instances of Japanese lilac (95.8%), Norway maple (81.5%), and Callery pear (90.5%) (Table 5). The predicted labels for Kentucky coffeetree, Japanese lilac, red maple, and sugar maple were also highly reliable, with user’s accuracies of 100%, 82.1%, 95.5%, and 85%, respectively. Eight out of the nine species had both a producer’s and user’s accuracy above 65%, with littleleaf linden being the only exception (PA = 52.4%). These results suggest that using the first derivative as input features leads to more accurate classification than using spectral reflectance alone.
The third model used vegetation indices only as predictors to classify the tree species (Table 6). Four species—Kentucky coffeetree, common hackberry, Japanese lilac, and Norway maple—were mainly correctly labeled, with producer’s accuracies above 80%. American linden had the highest omission and commission errors, with a producer’s accuracy of 60.0% and a user’s accuracy of 65.2%, meaning that many actual American linden samples were misclassified as other species and that a substantial share of samples labeled as American linden belonged to other classes. Overall, six out of the nine species had both a producer’s and user’s accuracy above 65%.
The combined SR-FD-VI model incorporated all features—spectral reflectance, first derivative, and vegetation indices—as predictors (Table 7). Although this model achieved a similar overall accuracy (78.35%) to the FD-only model, the class-level distribution of the producer’s and user’s accuracies differed slightly. In both models, four species—red maple, Japanese lilac, Norway maple, and Callery pear—had a producer’s accuracy above 80%. Regarding the user’s accuracy, Kentucky coffeetree, Japanese lilac, red maple, and sugar maple exceeded 80% in the FD-only model, while sugar maple fell below this threshold and Callery pear exceeded it in the SR-FD-VI model. These results suggest that incorporating all feature types may improve the classification performance for particular species, even when the overall accuracy remains stable.
In the SR-FD model, five species—Kentucky coffeetree, Japanese lilac, Norway maple, red maple, and Callery pear—maintained both a producer’s and user’s accuracy above 80% (Table 8), similar to the SR-FD-VI model (Table 7). Although the overall accuracy in the SR-FD model (77.8%) was slightly lower than in the SR-FD-VI model (79.4%), the difference was minimal. This suggests that removing vegetation indices as predictor variables had only a modest impact on classification performance.
The SR-VI model achieved the lowest overall accuracy among the models, with generally lower producer’s and user’s accuracies across species (Table 9). Only three species—Japanese lilac, Norway maple, and Callery pear—had a producer’s accuracy above 80%, indicating relatively strong classification performance for these classes. Red maple exhibited a notable imbalance, with a low producer’s accuracy of 47.6% and a high user’s accuracy of 83.3%. This pattern suggests that although many actual red maple samples were misclassified (high omission error), most of the samples labeled as red maple were correctly identified (low commission error).
Similar to the FD-only and SR-FD-VI models, the FD-VI model showed that littleleaf linden had the lowest producer’s accuracy (52.4%) among all nine species (Table 10). As in the FD-only model, all samples labeled as Kentucky coffeetree were correctly classified (UA = 100%) and all red maple reference samples were correctly identified (PA = 100%). While many producer’s and user’s accuracies in both the FD-only and the FD-VI models remained the same, those that changed generally improved in the FD-VI model, consistent with its slightly higher overall accuracy. This suggests that incorporating vegetation indices enhanced the model’s predictive performance.

3.4. Variables of Importance

Each panel in Figure 5 illustrates the top 15 predictor variables ranked by their MDA values, which were obtained by permuting each predictor within the OOB samples and calculating the resulting reduction in classification accuracy. The x-axis represents the magnitude of this decrease, with larger MDA values indicating a greater contribution of that variable to model performance. The y-axis lists the most influential predictors for each model.
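The permutation logic behind MDA can be sketched in a few lines. The version below is a simplified stand-in (it permutes predictor columns against a fixed prediction function, rather than within each tree’s OOB samples as the random forest implementation actually does), but it illustrates why permuting an informative variable lowers accuracy while permuting an irrelevant one does not:

```python
import numpy as np

def mda_like_importance(predict, X, y, seed=0):
    """Permutation importance: shuffle one predictor column at a time
    and record the resulting drop in classification accuracy."""
    rng = np.random.default_rng(seed)
    baseline = np.mean(predict(X) == y)
    drops = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])   # break the column's link to y
        drops.append(baseline - np.mean(predict(Xp) == y))
    return np.array(drops)

# Toy setup: the label depends only on column 0, so permuting it should
# cost accuracy, while permuting column 1 should cost nothing.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 2))
y = (X[:, 0] > 0).astype(int)
model = lambda A: (A[:, 0] > 0).astype(int)   # stand-in "fitted" classifier
mda = mda_like_importance(model, X, y)
```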
The SR-only model’s MDA plot revealed that spectral bands within the red-edge region (724–736 nm) were the most influential predictors (Figure 5a). Bands in the blue (438–460 nm) and green (528–538 nm) portions of the spectrum also contributed to the model’s performance, likely reflecting sensitivity to pigment absorption. Spectral band X2002 was the only SWIR band that played a notable role in improving the model’s predictive capability.
The FD-only model’s MDA plot indicated that fourteen of the fifteen most important variables were located in the red-edge (704–718 nm) and shortwave infrared 1 (1396–1652 nm) regions of the spectrum (Figure 5b). The highest-ranked predictors were concentrated around 708–712 nm and 1646–1650 nm, corresponding to wavelengths sensitive to chlorophyll content and water absorption in vegetation, respectively. Only one variable (FD566) fell within the green region and exhibited a comparatively lower contribution to the model.
The VI-only model’s MDA plot shows that four vegetation indices were the most important predictors: RVSI, PSRI, BR, and PRI (Figure 5c). The RVSI uses red-edge bands and near infrared reflectance to capture variations in vegetation chlorophyll and stress. The PSRI integrates red-edge, green, and near-infrared bands, effectively capturing signals related to vegetation senescence. BR and PRI, both of which involve the blue region of the spectrum, are sensitive to pigment absorption and photosynthetic activity. The remaining vegetation indices in the MDA plot exhibited moderate contributions to model performance.
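For readers unfamiliar with these indices, the underlying band arithmetic is simple. The sketch below implements RVSI, PSRI, and PRI using band centers commonly cited in the literature they originate from; the exact wavelengths and the synthetic logistic red-edge spectrum are illustrative assumptions, not the study’s measured data (BR is omitted because its band definition varies across studies):

```python
import numpy as np

def band(wl, refl, nm):
    """Reflectance at the sampled wavelength closest to `nm`."""
    return refl[np.argmin(np.abs(wl - nm))]

def rvsi(wl, r):
    # Red-edge vegetation stress index: mean of shoulders minus center
    return (band(wl, r, 714) + band(wl, r, 752)) / 2 - band(wl, r, 733)

def psri(wl, r):
    # Plant senescence reflectance index
    return (band(wl, r, 678) - band(wl, r, 500)) / band(wl, r, 750)

def pri(wl, r):
    # Photochemical reflectance index
    return (band(wl, r, 531) - band(wl, r, 570)) / (band(wl, r, 531) + band(wl, r, 570))

# Synthetic "healthy leaf" spectrum: low visible reflectance rising
# through a logistic red edge centered near 715 nm.
wl = np.arange(400.0, 801.0)
r = 0.05 + 0.45 / (1 + np.exp(-(wl - 715) / 15))
indices = {"RVSI": rvsi(wl, r), "PSRI": psri(wl, r), "PRI": pri(wl, r)}
```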
The SR-FD-VI model’s MDA plot shows that first derivative predictors were the most influential variables, comprising 13 of the 15 most important predictors (Figure 5d). Among them, ten corresponded to wavelengths within the SWIR 1 region and three fell within the red-edge region. One vegetation index (RVSI) was ranked highly, indicating a strong contribution to model performance. In contrast, the PSRI index exhibited lower importance values. Predictors FD1646, FD1648, FD1650, and FD1652 were among the top contributors and were consistently important across multiple models (i.e., FD-only, FD-VI, and SR-FD-VI), while the vegetation indices RVSI and PSRI were shared predictors between the VI-only, FD-VI, and SR-FD-VI models.
The SR-FD model’s MDA plot showed that 14 of the 15 most important variables were first derivative predictors (Figure 5e). Nine FD predictors were within the SWIR 1 region (FD1640, FD1644, FD1646, FD1648, FD1650, FD1652, FD1730, FD1732, FD1734), and five FD predictors were within the red-edge region (FD708, FD710, FD712, FD714, FD718). The four variables with the highest MDA values—FD1646, FD1648, FD1650, and FD1652—were also among the top contributors in the FD-only and SR-FD-VI models.
In the SR-VI model’s MDA plot, nine vegetation indices and six spectral reflectance bands were among the top 15 most important variables (Figure 5f). The three most influential variables were the vegetation indices RVSI, BR, and ACI2, each of which exhibited high MDA values. Following these, the MDA values decreased more gradually among the remaining variables. Interestingly, ACI2 was not among the top 15 variables in the VI-only model, and neither BR nor ACI2 was among the top variables in the SR-FD-VI model.
In the FD-VI model’s MDA plot, variables were ranked so that MDA values gradually increased toward the most important predictors (Figure 5g). Eleven first derivatives and four vegetation indices comprised the top 15 most influential variables. The top two predictors, RVSI and FD1650, were also the top two contributors in the SR-FD-VI model. Among the vegetation indices, RVSI consistently ranked as the top predictor in all models that included VIs.

4. Discussion

4.1. On the Spectral Curves

The spectral curves obtained for the nine species in this study followed the typical reflectance pattern of vegetation. Although several authors [25,59,60,61] examined different tree species, their reported spectral curves exhibited similar characteristics. Reflectance was low (<20%) in the visible region (400–680 nm), with a small peak in the green portion, primarily driven by photosynthetic pigments such as chlorophyll, which strongly absorb blue and red light [62]. The slightly higher reflectance in the green region gives healthy vegetation its characteristic green color. A sharp increase in reflectance between the red and near-infrared (NIR) regions, known as the red-edge (680–750 nm), represents the transition from strong pigment absorption to intense scattering. The NIR region (750–1300 nm) exhibited a broad plateau with high reflectance values (approximately 40–60%), primarily due to scattering within the mesophyll structure. Two distinct reflectance peaks, one near 1700 nm and another around 2200 nm, were observed in the shortwave infrared (SWIR 1 and SWIR 2) regions. Reflectance in the SWIR region was primarily influenced by leaf water content as well as the presence of biochemical constituents such as lignin, nitrogen, and cellulose [63].

4.2. On the Features of Random Forest Models

Hyperspectral datasets are characterized by a high degree of correlation among adjacent spectral bands, and several authors have applied dimensionality reduction techniques to mitigate this redundancy. For example, Liu et al. [25] used Pearson coefficients to remove highly correlated variables, while Maschler et al. [60], Mozgeris et al. [61], and Brabant et al. [64] applied Principal Component Analysis (PCA) to derive new input features. Other studies have used the minimum noise fraction (MNF) transformation [28,64]. However, the effectiveness of these techniques has shown mixed results. Sothe et al. [28] reported that replacing VNIR bands with MNF components resulted in slightly worse accuracy, whereas Brabant et al. [64] found improved predictive performance. Similarly, the efficacy of PCA has varied across studies. Maschler et al. [60] observed higher overall accuracy when PCA components and VNIR bands were combined compared to using VNIR bands alone, while Brabant et al. [64] reported mixed outcomes, and Mozgeris et al. [61] found that neither principal components nor correlation-based feature selection, when used by themselves, improved classification results. Additional comparisons of full versus reduced feature sets showed similarly inconsistent trends; for instance, Clark and Roberts [59] found no statistically significant difference in accuracy between the two approaches.
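As a minimal sketch of the PCA reduction these studies applied (pure NumPy via SVD, not any cited author’s pipeline; the toy matrix simulates the strong inter-band correlation typical of hyperspectral data), the projection onto the leading components looks like this:

```python
import numpy as np

def pca_scores(X, k):
    """Project a samples-by-bands matrix onto its first k principal
    components via SVD of the mean-centered data."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T   # component scores, shape (n_samples, k)

# Toy hyperspectral-like data: 50 samples x 100 bands, where adjacent
# bands are nearly identical copies of one underlying signal.
rng = np.random.default_rng(0)
signal = rng.normal(size=(50, 1))
X = signal + 0.01 * rng.normal(size=(50, 100))
scores = pca_scores(X, 2)
# With such redundancy, the first component captures nearly all variance.
```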
In this study, an initial run of the random forest model using only PCA-derived components yielded very low predictive performance. Given these mixed findings in the literature and our own preliminary results, we retained all reflectance variables, acknowledging that the random forest algorithm is generally robust to multicollinearity and can handle correlated predictors without substantial risk of overfitting.
Models incorporating first derivative features achieved the highest overall accuracy in this study. This outcome likely reflects the ability of spectral derivatives to exploit band contiguity, emphasizing subtle variations in absorption features related to plant chemistry and structure that are less apparent in raw reflectance spectra [59]. Interestingly, the best-performing model was FD-VI rather than the all-features model SR-FD-VI, indicating that adding raw spectral reflectance (SR) features slightly reduced classification performance. Likewise, when comparing the FD with SR-FD, VI with SR-VI, and FD-VI with SR-FD-VI models, the inclusion of SR features consistently produced a slight reduction in overall accuracy. This pattern suggests that raw reflectance bands may introduce redundancy, thereby marginally degrading classification performance. These results highlight that while dimensionality reduction is not strictly necessary for model stability in the random forest algorithm, not all feature groups contribute equally to predictive performance. Targeted reductions, such as excluding SR when FD and VI already capture the most relevant spectral variation, may yield small accuracy gains. Thus, model performance depends more on the relative informativeness of feature types than on the total number of variables.
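The first-derivative transform itself is a per-band finite difference along the wavelength axis. A brief illustration (synthetic spectrum; `np.gradient` is one common choice of difference scheme, not necessarily the one used in this study) shows why FD features concentrate in steep spectral regions such as the red edge:

```python
import numpy as np

# Synthetic spectrum with a logistic red edge centered near 715 nm,
# sampled every 2 nm (an assumed sampling interval).
wl = np.arange(400.0, 801.0, 2.0)
refl = 0.05 + 0.45 / (1 + np.exp(-(wl - 715) / 15))

# First derivative dR/dlambda: flat regions -> ~0, steep regions -> large.
fd = np.gradient(refl, wl)
peak_nm = wl[np.argmax(fd)]   # should land near the red-edge inflection
```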

4.3. On the Accuracy of Leaf Level Classification

Leaf level hyperspectral measurements isolate the biochemical and structural properties of individual leaves and therefore avoid many of the confounding factors present in canopy level remote sensing, such as background reflectance, shadowing, and atmospheric variability. For this reason, classification performance is most appropriately interpreted relative to studies that have evaluated tree species discrimination at the leaf scale. Clark and Roberts [59], for example, reported classification accuracies ranging from 73% to 90% using leaf spectra collected from seven tropical rainforest trees. The overall accuracy of the SR and SR-VI models in this study (70.6% and 70.1%, respectively) fell just below the lower end of the range reported by Clark and Roberts, whereas the derivative-based models (FD, SR-FD, FD-VI, and SR-FD-VI), as well as the VI model, achieved accuracies within the mid-to-upper portion of that range. Such slight differences in accuracy across leaf level studies are common, as performance is strongly influenced by species composition, spectral preprocessing steps, and the classification algorithms employed. These comparisons therefore indicate that the framework applied here performs competitively with existing leaf level hyperspectral classification approaches and underscore the value of derivative-based features for enhancing species separability.

4.4. On Species-Specific Classification Performance

Across the seven random forest models, clear differences in species-specific performance were observed. Japanese lilac, red maple, Callery pear, and Kentucky coffeetree consistently achieved the highest accuracies, indicating strong spectral separability at the leaf level. These species likely possess distinctive combinations of pigment content, leaf structure, or biochemical traits that generate unique signatures in the visible, red-edge, and SWIR regions, allowing for reliable discrimination across all feature sets [59,63]. Similar findings have been reported in leaf level studies where species-specific chemical and anatomical differences produced strong separability across hyperspectral wavelengths [59].
A contrasting pattern was observed when comparing species belonging to the same genus. Littleleaf linden and American linden, both members of the genus Tilia, were consistently difficult to distinguish, reflecting their similar leaf morphology, venation patterns, and internal structure, which likely produced highly overlapping spectral signatures. In contrast, the three Acer species, Norway maple, red maple, and sugar maple, were readily separable across most model configurations. This suggests that, despite their shared genus, these maple species possess sufficiently distinct biochemical or structural properties at the leaf level to generate discriminable spectral profiles. Together, these results indicate that taxonomic relatedness does not necessarily predict spectral similarity, and that genus-level separability is strongly influenced by the degree of physiological divergence among the constituent species.

4.5. On the Variables of Importance

In this study, features within the red-edge and SWIR regions contributed most to model performance. The red-edge region (680–750 nm) is sensitive to variations in chlorophyll content and leaf internal structure, and shifts in the red-edge position are closely related to chlorophyll concentration [65]. These subtle changes, captured by hyperspectral sensors, enhance the RF classifier’s ability to discriminate among tree species. Furthermore, while Clark and Roberts [59] reported that both SWIR 1 and SWIR 2 features were important for distinguishing tree species at the pixel and crown levels, the present study found that SWIR 1 features were most influential at the leaf level. The SWIR 1 region is primarily associated with absorption features of lignin, a structural polymer in plant cell walls. Among the vegetation indices, RVSI, PSRI, and BR were the most influential in the VI model (Figure 5). These three indices were also among the most important variables reported by Maschler et al. [60] and Liu et al. [25]; they are derived primarily from the visible and red-edge portions of the spectrum, capturing pigment-related variations associated with photosynthetic activity, stress, and senescence.

4.6. Generality, Limitations, and Recommendations for Future Research

Although this study was based on leaf level spectra from nine urban species in southern Wisconsin, several aspects of the results are expected to extend beyond the immediate study context. The spectral regions identified as the most important for species discrimination, particularly the red-edge and SWIR 1 regions, aligned with well-established relationships between reflectance, pigments, leaf structure, and water content [59,65,66]. Because these physiological and structural traits vary consistently across species, the patterns observed here are likely to hold in other settings. Likewise, the improved performance of first derivative transformations and vegetation indices reflects general principles of spectral enhancement that have been demonstrated across various vegetation studies [60].
At the same time, several limitations constrain the direct transferability of the results. Leaf level spectra collected under controlled conditions differ from canopy-level reflectance in ways that affect classification performance, including background reflectance, shadowing, atmospheric effects, and structural heterogeneity. In addition, species composition, local environmental conditions, and sensor characteristics may influence the magnitude of classification accuracy. For these reasons, while the patterns identified, such as the diagnostic value of specific spectral regions and the consistent utility of FD and VI features, are broadly applicable, the absolute accuracies reported here should not be generalized without caution.
Although hyperspectral data provide valuable spectral information for tree classification, they primarily capture subtle variations in reflectance and absorption related to vegetation properties. Integrating additional datasets, such as canopy structure information derived from LiDAR, may provide additional insights into tree form and architecture, further enhancing model performance. For instance, Liu et al. [25] reported an 18.9% increase in overall accuracy when combining hyperspectral VNIR and LiDAR features compared to hyperspectral data alone. Similarly, Shi et al. [67] observed a 7.4% improvement when integrating VNIR-SWIR and LiDAR, and Hartling et al. [68] reported a 5.2% increase in overall accuracy when combining VNIR and LiDAR compared to VNIR alone.
Based on these findings, future research on urban tree classification should evaluate whether the spectral feature-importance patterns identified at the leaf level persist when measured at the canopy scale, under varying illumination and background conditions. Expanding the dataset to include additional species, multiple urban forest types, and diverse geographic regions will help determine the robustness of the spectral relationships highlighted here. Finally, future work should prioritize integrating hyperspectral data with complementary structural information from LiDAR, as spectral data alone primarily represent surface reflectance characteristics. Such extensions will support the development of operational remote sensing approaches for urban tree species identification.

5. Conclusions

This study identified key spectral features spanning the visible to shortwave infrared range that effectively discriminate nine common urban tree species using the random forest classifier at the leaf level. Among the seven model configurations tested, the combination of first derivatives and vegetation indices achieved the highest overall accuracy. The most important predictors were first derivative features within the red-edge and SWIR 1 regions, followed by spectral reflectance in the red-edge region, and the vegetation indices RVSI, PSRI, and BR. These findings highlight the importance of derivative-based spectral features and selected vegetation indices in enhancing separability in hyperspectral analyses.

Author Contributions

Conceptualization, R.R.D.; data curation, R.R.D., A.K., and M.S.; investigation, A.K. and M.S.; data analysis, R.R.D., A.K., and M.S.; writing and review, R.R.D. All authors have read and agreed to the published version of the manuscript.

Funding

This material is based upon work supported by the National Aeronautics and Space Administration (NASA) under Award No. RIP21_2.1, issued through the Wisconsin Space Grant Consortium. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of NASA.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors on request.

Acknowledgments

We thank the University of Wisconsin-Whitewater and its Office of Research & Sponsored Programs (ORSP) for their support in managing the grant.

Conflicts of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ACI1: Anthocyanin content index I
ACI2: Anthocyanin content index II
AL: American linden
ARVI: Atmospherically resistant vegetation index
AVIRIS: Airborne visible/infrared imaging spectrometer
AVIRIS-NG: AVIRIS-next generation
BDD: Backward divided difference
BR: Blue ratio
CART: Classification and regression trees
CDA: Canonical discriminant analysis
CH: Common hackberry
CI1: Chlorophyll index I
CI2: Chlorophyll index II
CP: Callery pear
DBH: Diameter at breast height
DD: Double difference vegetation index
EO: Earth observing
EVI: Enhanced vegetation index
EVI2: 2-band enhanced vegetation index
FD: First derivative
GARI: Green atmospherically resistant vegetation index
GNDVI: Green normalized difference vegetation index
GR: Green ratio
GRR: Green-red difference index
IPVI: Infrared percentage vegetation index
JL: Japanese lilac
KC: Kentucky coffeetree
KNN: K-nearest neighbor
LC-RP Pro: Leaf clip and reflectance probe
LDA: Linear discriminant analysis
LL: Littleleaf linden
MDA: Mean decrease in accuracy
MEAN: Average reflectance between 690 nm and 740 nm
MEDI: Median reflectance between 690 nm and 740 nm
mND705: Modified normalized difference index
MNF: Minimum noise fraction
MSI: Multispectral instrument
mSR705: Modified simple ratio
NDRE: Normalized difference red-edge index
NDVI: Normalized difference vegetation index
NIR: Near-infrared
NIR-R: Infrared–Red Difference index
NM: Norway maple
OA: Overall accuracy
OOB: Out-of-bag
PA: Producer’s accuracy
PCA: Principal component analysis
PRI: Photochemical reflectance index
PRISMA: PRecursore IperSpettrale della Missione Applicativa
PSI: Plant stress index
PSRI: Plant senescence reflectance index
PSSR1: Pigment-specific simple ratio I
PSSR2: Pigment-specific simple ratio II
R1: Ratio vegetation stress index I
R2: Ratio vegetation stress index II
R3: Ratio vegetation stress index III
RENVI: Red-edge normalized difference vegetation index
RF: Random forest
RM: Red maple
RR: Red ratio
RVI: Ratio vegetation index
RVSI: Red-edge vegetation stress index
SM: Sugar maple
SR: Spectral reflectance
SVM: Support vector machine
SWIR: Shortwave infrared
UA: User’s accuracy
UAVs: Unmanned aerial vehicles
VARI: Visible atmospherically resistant index
VI: Vegetation indices

References

  1. United Nations, Department of Economic and Social Affairs, Population Division. World Urbanization Prospects: The 2018 Revision; (ST/ESA/SER.A/420); United Nations: New York, NY, USA, 2019; Available online: https://population.un.org/wup/assets/WUP2018-Report.pdf (accessed on 18 June 2025).
  2. Rasoolzadeh, R.; Mobarghaee Dinan, N.; Esmaeilzadeh, H.; Rashidi, Y.; Sadeghi, S.M.M. Assessment of air pollution removal by urban trees based on the i-Tree Eco Model: The case of Tehran, Iran. Integr. Environ. Assess. Manag. 2024, 20, 2142–2152. [Google Scholar] [CrossRef]
  3. Király, É.; Illés, G.; Borovics, A. Green infrastructure for climate change mitigation: Assessment of carbon sequestration and storage in the urban forests of Budapest, Hungary. Urban Sci. 2025, 9, 137. [Google Scholar] [CrossRef]
  4. Selbig, W.; Loheid, S.P.; Schuster, W.; Scharenbroch, B.; Coville, R.; Kruegler, J.; Avery, W.; Haefner, R.; Nowak, D. Quantifying the stormwater runoff volume reduction benefits of urban street tree canopy. Sci. Total Environ. 2022, 806. [Google Scholar] [CrossRef]
  5. Yin, Y.; Li, S.; Xing, X.; Zhou, X.; Kang, Y.; Hu, Q.; Li, Y. Cooling benefits of urban tree canopy: A systematic review. Sustainability 2024, 16, 4955. [Google Scholar] [CrossRef]
  6. Lovasi, G.S.; Quinn, J.W.; Neckerman, K.M.; Perzanowski, M.S.; Rundle, A. Children living in areas with more street trees have lower prevalence of asthma. J. Epidemiol. Community Health 2008, 62, 647–649. [Google Scholar] [CrossRef] [PubMed]
  7. Wang, L.; Zhao, X.; Xu, W.; Tang, J.; Jiang, X. Correlation analysis of lung cancer and urban spatial factor: Based on survey in Shanghai. J. Thorac. Dis. 2016, 8, 2626–2637. [Google Scholar] [CrossRef] [PubMed]
  8. Graham, D.A.; Vanos, J.K.; Kenny, N.A.; Brown, R.D. The relationship between neighbourhood tree canopy cover and heat-related ambulance calls during extreme heat events in Toronto, Canada. Urban For. Urban Green. 2016, 20, 180–186. [Google Scholar] [CrossRef]
  9. Lin, Y.H.; Tsai, C.C.; Sullivan, W.C.; Chang, P.J.; Chang, C.Y. Does awareness affect the restorative function and perception of street trees? Front. Psychol. 2014, 5, 906. [Google Scholar] [CrossRef]
  10. Beyer, K.M.M.; Kaltenbach, A.; Szabo, A.; Bogar, S.; Nieto, F.J.; Malecki, K.M. Exposure to neighborhood green space and mental health: Evidence from the survey of the health of Wisconsin. Int. J. Environ. Res. Public Health 2014, 11, 3453–3472. [Google Scholar] [CrossRef]
  11. Donovan, G.H.; Michael, Y.L.; Butry, D.T.; Sullivan, A.D.; Chase, J.M. Urban trees and the risk of poor birth outcomes. Health Place 2011, 17, 390–393. [Google Scholar] [CrossRef]
  12. Carver, A.D.; Unger, D.R.; Parks, C.L. Modeling energy savings from urban shade trees: An assessment of the CITYgreen® energy conservation module. Environ. Manag. 2004, 34, 650–655. [Google Scholar] [CrossRef]
  13. Pu, R.; Landry, S. A comparative analysis of high spatial resolution IKONOS and WorldView-2 imagery for mapping urban tree species. Remote Sens. Environ. 2012, 124, 516–533. [Google Scholar] [CrossRef]
  14. Immitzer, M.; Atzberger, C.; Koukal, T. Tree species classification with random forest using very high spatial resolution 8-band WorldView-2 satellite data. Remote Sens. 2012, 4, 2661–2693. [Google Scholar] [CrossRef]
  15. Vangi, E.; D’Amico, G.; Francini, S.; Giannetti, F.; Lasserre, B.; Marchetti, M.; Chirici, G. The new hyperspectral satellite PRISMA: Imagery for Forest Types Discrimination. Sensors 2021, 21, 1182. [Google Scholar] [CrossRef]
  16. Puletti, N.; Camarretta, N.; Corona, P. Evaluating EO1-Hyperion capability for mapping conifer and broadleaved forests. Eur. J. Remote Sens. 2016, 49, 157–169. [Google Scholar] [CrossRef]
  17. Hati, J.P.; Samanta, S.; Chaube, N.R.; Misra, A.; Giri, S.; Pramanick, N.; Gupta, K.; Majumdar, S.D.; Chanda, A.; Mukhopadhyay, A.; et al. Mangrove classification using airborne hyperspectral AVIRIS-NG and comparing with other spaceborne hyperspectral and multispectral data. Egypt. J. Remote Sens. Space Sci. 2021, 24, 273–281. [Google Scholar] [CrossRef]
  18. Paramanik, S.; Deep, N.R.; Behera, M.D.; Bhattacharya, B.K.; Dash, J. Species-level classification of mangrove forest using AVIRIS-NG hyperspectral imagery. Remote Sens. Lett. 2023, 14, 522–533. [Google Scholar] [CrossRef]
  19. Alonzo, M.; Roth, K.; Roberts, D. Identifying Santa Barbara’s urban tree species from AVIRIS imagery using canonical discriminant analysis. Remote Sens. Lett. 2013, 4, 513–521. [Google Scholar] [CrossRef]
  20. Zhang, Z.; Huang, L.; Wang, Q.; Jiang, L.; Qi, Y.; Wang, S. UAV hyperspectral remote sensing image classification: A systematic review. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2025, 18, 3099–3124. [Google Scholar] [CrossRef]
  21. La Rosa, L.E.C.; Sothe, C.; Feitosa, R.Q.; De Almeida, C.M.; Schimalski, M.B.; Borges Oliveira, D.A. Multi-task fully convolutional network for tree species mapping in dense forests using small training hyperspectral data. ISPRS J. Photogramm. Remote Sens. 2021, 179, 35–49. [Google Scholar] [CrossRef]
  22. Abbas, S.; Peng, Q.; Wong, M.S.; Li, Z.; Wang, J.; Ng, K.T.K.; Kwok, C.Y.T.; Hui, K.K.W. Characterizing and classifying urban tree species using bi-monthly terrestrial hyperspectral images in Hong Kong. ISPRS J. Photogramm. Remote Sens. 2021, 177, 204–216. [Google Scholar] [CrossRef]
  23. Alonzo, M.; Bookhagen, B.; Roberts, D.A. Urban tree species mapping using hyperspectral and lidar data fusion. Remote Sens. Environ. 2014, 14, 70–83. [Google Scholar] [CrossRef]
  24. Voss, M.; Sugumaran, R. Seasonal effect of tree species classification in an urban environment using hyperspectral data, LiDAR, and an object-oriented approach. Sensors 2008, 8, 3020–3036. [Google Scholar] [CrossRef]
  25. Liu, L.; Coops, N.C.; Aven, N.W.; Pang, Y. Mapping urban tree species using integrated airborne hyperspectral and LiDAR remote sensing data. Remote Sens. Environ. 2017, 200, 170–182. [Google Scholar] [CrossRef]
  26. Yel, S.G.; Tunc Gormus, E. Exploiting hyperspectral and multispectral images in the detection of tree species: A review. Front. Remote Sens. 2023, 4, 1136289. [Google Scholar] [CrossRef]
  27. Zhong, L.; Dai, Z.; Fang, P.; Cao, Y.; Wang, L. A Review: Tree species classification based on remote sensing data and classic deep learning-based methods. Forests 2024, 15, 852. [Google Scholar] [CrossRef]
  28. Sothe, C.; Dalponte, M.; Almeida, C.M.d.; Schimalski, M.B.; Lima, C.L.; Liesenberg, V.; Miyoshi, G.T.; Tommaselli, A.M.G. Tree species classification in a highly diverse subtropical forest integrating UAV-based photogrammetric point cloud and hyperspectral data. Remote Sens. 2019, 11, 1338. [Google Scholar] [CrossRef]
  29. Xie, Z.; Chen, Y.; Lu, D.; Li, G.; Chen, E. Classification of land cover, forest, and tree species classes with ZiYuan-3 multispectral and stereo data. Remote Sens. 2019, 11, 164. [Google Scholar] [CrossRef]
  30. Nevalainen, O.; Honkavaara, E.; Tuominen, S.; Viljanen, N.; Hakala, T.; Yu, X.; Hyyppä, J.; Saari, H.; Pölönen, I.; Imai, N.N.; et al. Individual tree detection and classification with UAV-based photogrammetric point clouds and hyperspectral imaging. Remote Sens. 2017, 9, 185. [Google Scholar] [CrossRef]
  31. Sabat-Tomala, A.; Raczko, E.; Zagajewski, B. Comparison of support vector machine and random forest algorithms for invasive and expansive species classification using airborne hyperspectral data. Remote Sens. 2020, 12, 516. [Google Scholar] [CrossRef]
  32. United States Census Bureau. QuickFacts. Available online: https://www.census.gov/quickfacts/fact/table/whitewatercitywisconsin,waukeshacitywisconsin,mcfarlandvillagewisconsin,veronacitywisconsin,fitchburgcitywisconsin,WI/PST045221 (accessed on 18 June 2025).
  33. National Oceanic and Atmospheric Administration (NOAA). Summary of Monthly Normals (1991–2020). Station: Waukesha WWTP, WI. Available online: https://www.ncei.noaa.gov/access/services/data/v1?dataset=normals-monthly-1991-2020&startDate=0001-01-01&endDate=9996-12-31&stations=USC00478937&format=pdf (accessed on 18 June 2025).
  34. Jiang, Z.; Huete, A.R.; Didan, K.; Miura, T. Development of a two-band enhanced vegetation index without a blue band. Remote Sens. Environ. 2018, 112, 3833–3845. [Google Scholar] [CrossRef]
  35. Gamon, J.A.; Surfus, J.S. Assessing leaf pigment content and activity with a reflectometer. New Phytol. 1999, 143, 105–117. [Google Scholar] [CrossRef]
  36. Kaufman, Y.J.; Tanre, D. Atmospherically resistant vegetation index (ARVI) for EOS-MODIS. IEEE Trans. Geosci. Remote Sens. 1992, 30, 261–270. [Google Scholar] [CrossRef]
  37. Merton, R.N. Multi-Temporal Analysis of Community-Scale Vegetation Stress with Imaging Spectroscopy. Ph.D. Thesis, University of Auckland, Auckland, New Zealand, 1999. [Google Scholar]
  38. Waser, L.T.; Küchler, M.; Jütte, K.; Stampfer, T. Evaluating the potential of WorldView-2 data to classify tree species and different levels of Ash mortality. Remote Sens. 2014, 6, 4515–4545. [Google Scholar] [CrossRef]
  39. Datt, B. Visible/near-infrared reflectance and chlorophyll content in Eucalyptus leaves. Int. J. Remote Sens. 1999, 20, 2741–2759. [Google Scholar] [CrossRef]
  40. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150. [Google Scholar] [CrossRef]
  41. Jackson, R.D.; Slater, P.N.; Pinter, P.J. Discrimination of growth and water stress in wheat by various vegetation indices through clear and turbid atmospheres. Remote Sens. Environ. 1983, 13, 187–208. [Google Scholar] [CrossRef]
42. Liu, H.Q.; Huete, A. A feedback-based modification of the NDVI to minimize canopy background and atmospheric noise. IEEE Trans. Geosci. Remote Sens. 1995, 33, 457–465.
43. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298.
44. Gitelson, A.A.; Merzlyak, M.N. Signature analysis of leaf reflectance spectra: Algorithm development for remote sensing of chlorophyll. J. Plant Physiol. 1996, 148, 494–500.
45. Courel, M.F.; Chamard, P.C.; Guenegou, M.; Lerhun, J.; Levasseur, J.; Togola, M. Utilisation des bandes spectrales du vert et du rouge pour une meilleure évaluation des formations végétales actives. In Télédétection et Cartographie; AUPELF-UREF: Sherbrooke, QC, Canada, 1991; pp. 203–209.
46. Crippen, R. Calculating the vegetation index faster. Remote Sens. Environ. 1990, 34, 71–73.
47. Sims, D.A.; Gamon, J.A. Relationships between leaf pigment content and spectral reflectance across a wide range of species, leaf structures and developmental stages. Remote Sens. Environ. 2002, 81, 337–354.
48. Barnes, E.M.; Clarke, T.R.; Richards, S.E.; Colaizzi, P.D.; Haberland, J.; Kostrzewski, M.; Waller, P.; Choi, C.; Riley, E.; Thompson, T.; et al. Coincident detection of crop water stress, nitrogen status, and canopy density using ground-based multispectral data. In Proceedings of the Fifth International Conference on Precision Agriculture, Bloomington, MN, USA, 16–19 July 2000; Available online: https://www.tucson.ars.ag.gov/unit/publications/PDFfiles/1356.pdf (accessed on 18 June 2025).
49. Rouse, J.; Haas, R.; Schell, J.; Deering, D. Monitoring Vegetation Systems in the Great Plains with ERTS. In Proceedings of the Goddard Space Flight Center 3d ERTS-1 Symposium, Washington, DC, USA, 10–14 December 1973; NASA: Greenbelt, MD, USA, 1974; Volume 1, pp. 309–317. Available online: https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/19740022614.pdf (accessed on 18 June 2025).
50. Gamon, J.A.; Serrano, L.; Surfus, J.S. The photochemical reflectance index: An optical indicator of photosynthetic radiation use efficiency across species, functional types, and nutrient levels. Oecologia 1997, 112, 492–501.
51. Gamon, J.A.; Peñuelas, J.; Field, C.B. A narrow-waveband spectral index that tracks diurnal changes in photosynthetic efficiency. Remote Sens. Environ. 1992, 41, 35–44.
52. Blackburn, G.A. Quantifying chlorophylls and carotenoids at leaf and canopy scales: An evaluation of some hyperspectral approaches. Remote Sens. Environ. 1998, 66, 273–285.
53. Merzlyak, M.N.; Gitelson, A.A.; Chivkunova, O.B.; Rakitin, V.Y. Non-destructive optical detection of pigment changes during leaf senescence and fruit ripening. Physiol. Plant. 1999, 106, 135–141.
54. Carter, G.A.; Miller, R.L. Early detection of plant stress by digital imaging within narrow stress-sensitive wavebands. Remote Sens. Environ. 1994, 50, 295–302.
55. Jordan, C.F. Derivation of leaf area index from quality of light on the forest floor. Ecology 1969, 50, 663–666.
56. Gitelson, A.; Merzlyak, M.N. Spectral reflectance changes associated with autumn senescence of Aesculus hippocastanum L. and Acer platanoides L. leaves. Spectral features and relation to chlorophyll estimation. J. Plant Physiol. 1994, 143, 286–292.
57. Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. 2002, 80, 76–87.
58. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32.
59. Clark, M.L.; Roberts, D.A. Species-level differences in hyperspectral metrics among tropical rainforest trees as determined by a tree-based classifier. Remote Sens. 2012, 4, 1820–1855.
60. Maschler, J.; Atzberger, C.; Immitzer, M. Individual tree crown segmentation and classification of 13 tree species using airborne hyperspectral data. Remote Sens. 2018, 10, 1218.
61. Mozgeris, G.; Juodkienė, V.; Jonikavičius, D.; Straigytė, L.; Gadal, S.; Ouerghemmi, W. Ultra-light aircraft-based hyperspectral and colour-infrared imaging to identify deciduous tree species in an urban environment. Remote Sens. 2018, 10, 1668.
62. Cao, J.; Leng, W.; Liu, K.; Liu, L.; He, Z.; Zhu, Y. Object-based mangrove species classification using unmanned aerial vehicle hyperspectral images and digital surface models. Remote Sens. 2018, 10, 89.
63. Jacquemoud, S.; Ustin, S.L. Leaf Optical Properties: A Handbook of Practical and Theoretical Information; Cambridge University Press: Cambridge, UK, 2019.
64. Brabant, C.; Alvarez-Vanhard, E.; Laribi, A.; Morin, G.; Thanh Nguyen, K.; Thomas, A.; Houet, T. Comparison of Hyperspectral Techniques for Urban Tree Diversity Classification. Remote Sens. 2019, 11, 1269.
65. Mutanga, O.; Skidmore, A.K. Red-edge shift and biochemical content in grass canopies. ISPRS J. Photogramm. Remote Sens. 2007, 62, 34–42.
66. Dalponte, M.; Ørka, H.O.; Gobakken, T.; Gianelle, D.; Næsset, E. Tree Species Classification in Boreal Forests with Hyperspectral Data. IEEE Trans. Geosci. Remote Sens. 2013, 51, 2632–2645.
67. Shi, Y.; Skidmore, A.K.; Wang, T.; Holzwarth, S.; Heiden, U.; Pinnel, N.; Zhu, X.; Heurich, M. Tree species classification using plant functional traits from LiDAR and hyperspectral data. Int. J. Appl. Earth Obs. Geoinf. 2018, 73, 207–219.
68. Hartling, S.; Sagan, V.; Maimaitijiang, M. Urban tree species classification using UAV-based multi-sensor data fusion and machine learning. GIScience Remote Sens. 2021, 58, 1250–1275.
Figure 1. Study site. (a) Cities surveyed (red dots), (b) Example of a sampling location: Starin Park in the city of Whitewater.
Figure 2. Spectral reflectance curves of nine tree species: (a) Kentucky coffeetree, (b) Common hackberry, (c) Japanese lilac, (d) American linden, (e) Littleleaf linden, (f) Norway maple, (g) Red maple, (h) Sugar maple, and (i) Callery pear. The gray area represents the range of values, while the red line represents average reflectance.
Figure 3. Boxplot of selected vegetation indices across nine tree species: (a) BR, (b) EVI2, (c) DD, (d) RVSI, (e) PSRI, and (f) range. All indices are unitless as they are derived from ratios or differences of reflectance values. Dots represent occasional outliers. For abbreviations, refer to Table 1 and Table 2.
Figure 4. Overall accuracy of the seven random forest models. Abbreviations are as follows: SR = spectral reflectance, FD = first derivative, VI = vegetation index, and OA = overall accuracy.
Figure 5. Top 15 variables of importance according to the MDA scores for all seven models: (a) SR, (b) FD, (c) VI, (d) SR-FD-VI, (e) SR-FD, (f) SR-VI, and (g) FD-VI.
Table 1. Common deciduous tree species in southern Wisconsin.
| Common Name | Abbreviation | Scientific Name | Trees Sampled |
|---|---|---|---|
| Kentucky Coffeetree | KC | Gymnocladus dioica | 57 |
| Common Hackberry | CH | Celtis occidentalis | 55 |
| Japanese Lilac | JL | Syringa reticulata | 82 |
| American Linden | AL | Tilia americana | 85 |
| Littleleaf Linden | LL | Tilia cordata | 73 |
| Norway Maple | NM | Acer platanoides | 91 |
| Red Maple | RM | Acer rubrum | 70 |
| Sugar Maple | SM | Acer saccharum | 75 |
| Callery Pear | CP | Pyrus calleryana | 73 |
Table 2. Spectral indices derived from the hyperspectral data.
| Spectral Index | Formula | Reference |
|---|---|---|
| 2-Band Enhanced Vegetation Index | $EVI2 = 2.5 \times \frac{\rho_{868} - \rho_{648}}{\rho_{868} + 2.4 \times \rho_{648} + 1}$ | [34] |
| Anthocyanin Content Index I | $ACI_1 = \frac{\sum_{i=600}^{700} \rho_i}{\sum_{i=500}^{600} \rho_i}$ | [35] |
| Anthocyanin Content Index II | $ACI_2 = \frac{\rho_{650}}{\rho_{550}}$ | [35] |
| Atmospherically Resistant Vegetation Index | $ARVI = \frac{\rho_{868} - (2 \times \rho_{664} - \rho_{466})}{\rho_{868} + (2 \times \rho_{664} - \rho_{466})}$ | [36] |
| Average Reflectance Between 690 nm and 740 nm | $MEAN = \frac{\sum_{i=690}^{740} \rho_i}{N}$ | [37] |
| Blue Ratio | $BR = \frac{\rho_{664}}{\rho_{482}} \times \frac{\rho_{546}}{\rho_{482}} \times \frac{\rho_{722}}{\rho_{482}} \times \frac{\rho_{832}}{\rho_{482}}$ | [38] |
| Chlorophyll Index I | $CI_1 = \frac{\rho_{850} - \rho_{710}}{\rho_{850} + \rho_{680}}$ | [39] |
| Chlorophyll Index II | $CI_2 = \frac{\rho_{750}}{\rho_{700}}$ | [39] |
| Infrared–Red Difference Index | $NIRR = \rho_{772} - \rho_{664}$ | [40] |
| Double Difference Vegetation Index | $DD = (2 \times \rho_{948} - \rho_{750}) - (\rho_{648} - \rho_{546})$ | [41] |
| Enhanced Vegetation Index | $EVI = 2.5 \times \frac{\rho_{868} - \rho_{648}}{\rho_{868} + 6 \times \rho_{648} - 7.5 \times \rho_{466} + 1}$ | [42] |
| Green Atmospherically Resistant Vegetation Index | $GARI = \frac{\rho_{750} - (\rho_{546} - (\rho_{466} - \rho_{670}))}{\rho_{750} + (\rho_{546} - (\rho_{466} - \rho_{670}))}$ | [43] |
| Green Normalized Difference Vegetation Index | $GNDVI = \frac{\rho_{750} - \rho_{546}}{\rho_{750} + \rho_{546}}$ | [44] |
| Green Ratio | $GR = \frac{\rho_{546}}{\rho_{664}}$ | [38] |
| Green–Red Difference Index | $GRR = \frac{\rho_{562} - \rho_{664}}{\rho_{562} + \rho_{664}}$ | [45] |
| Infrared Percentage Vegetation Index | $IPVI = \frac{\rho_{802}}{\rho_{802} + \rho_{678}}$ | [46] |
| Median Reflectance Between 690 nm and 740 nm | $MEDI = \mathrm{median}(\rho_{690}, \ldots, \rho_{740})$ | [37] |
| Modified Normalized Difference Index | $mND_{705} = \frac{\rho_{750} - \rho_{705}}{\rho_{750} + \rho_{705} - 2 \times \rho_{445}}$ | [47] |
| Modified Simple Ratio | $mSR_{705} = \frac{\rho_{750} - \rho_{445}}{\rho_{705} - \rho_{445}}$ | [47] |
| Normalized Difference Red-Edge Index | $NDRE = \frac{\rho_{788} - \rho_{722}}{\rho_{788} + \rho_{722}}$ | [48] |
| Normalized Difference Vegetation Index | $NDVI = \frac{\rho_{802} - \rho_{678}}{\rho_{802} + \rho_{678}}$ | [49] |
| Photochemical Reflectance Index | $PRI = \frac{\rho_{532} - \rho_{568}}{\rho_{532} + \rho_{568}}$ | [50,51] |
| Pigment Specific Simple Ratio I | $PSSR_1 = \frac{\rho_{800}}{\rho_{680}}$ | [52] |
| Pigment Specific Simple Ratio II | $PSSR_2 = \frac{\rho_{800}}{\rho_{635}}$ | [52] |
| Plant Senescence Reflectance Index | $PSRI = \frac{\rho_{678} - \rho_{500}}{\rho_{750}}$ | [53] |
| Plant Stress Index | $PSI = \frac{\rho_{695}}{\rho_{760}}$ | [54] |
| Ratio Vegetation Index | $RVI = \frac{\rho_{802}}{\rho_{678}}$ | [55] |
| Ratio Vegetation Stress Index I | $R_1 = \frac{\rho_{694}}{\rho_{760}}$ | [54] |
| Ratio Vegetation Stress Index II | $R_2 = \frac{\rho_{600}}{\rho_{760}}$ | [54] |
| Ratio Vegetation Stress Index III | $R_3 = \frac{\rho_{710}}{\rho_{760}}$ | [54] |
| Red-Edge Normalized Difference Vegetation Index | $RENDVI = \frac{\rho_{750} - \rho_{708}}{\rho_{750} + \rho_{708}}$ | [56] |
| Red-edge Vegetation Stress Index | $RVSI = \frac{\rho_{714} + \rho_{752}}{2} - \rho_{733}$ | [37] |
| Red Ratio | $RR = \frac{\rho_{832}}{\rho_{664}} \times \frac{\rho_{546}}{\rho_{664}} \times \frac{\rho_{832}}{\rho_{722}}$ | [38] |
| Reflectance Range Between 690 nm and 740 nm | $Range = \max(\rho_{690}, \ldots, \rho_{740}) - \min(\rho_{690}, \ldots, \rho_{740})$ | [37] |
| Visible Atmospherically Resistant Index | $VARI = \frac{\rho_{562} - \rho_{664}}{\rho_{562} + \rho_{664} - \rho_{488}}$ | [57] |

$\rho_i$ is the reflectance at wavelength $i$, and $N$ is the number of bands.
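Each index in Table 2 is simple arithmetic on reflectance values at fixed wavelengths. The sketch below illustrates this for three of the indices, using a hypothetical synthetic spectrum and an assumed `rho` helper; it is not the authors' processing pipeline, only a minimal demonstration of how such indices are computed from a 1 nm sampled spectrum.

```python
import numpy as np

# Hypothetical leaf spectrum: reflectance sampled every 1 nm from 350 to 1350 nm.
wavelengths = np.arange(350, 1351)
reflectance = np.random.default_rng(0).uniform(0.05, 0.5, wavelengths.size)

def rho(nm):
    """Reflectance at the requested wavelength (exact 1 nm sample here)."""
    return reflectance[np.searchsorted(wavelengths, nm)]

# Three indices from Table 2.
ndvi = (rho(802) - rho(678)) / (rho(802) + rho(678))  # NDVI [49]
psri = (rho(678) - rho(500)) / rho(750)               # PSRI [53]
rvsi = (rho(714) + rho(752)) / 2 - rho(733)           # RVSI [37]
```

With real spectrometer data, the lookup would typically interpolate to the instrument's actual band centers rather than assume an exact 1 nm grid.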
Table 3. Summary of the predictors used in the seven random forest models.
| Model Features | Model Designation | Number of Features |
|---|---|---|
| Spectral reflectance | SR | 977 |
| First derivative | FD | 976 |
| Vegetation indices | VI | 35 |
| Spectral reflectance, first derivative, vegetation indices | SR-FD-VI | 1988 |
| Spectral reflectance, first derivative | SR-FD | 1953 |
| Spectral reflectance, vegetation indices | SR-VI | 1012 |
| First derivative, vegetation indices | FD-VI | 1011 |
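The feature counts in Table 3 follow directly from stacking the three blocks of predictors. A minimal sketch with synthetic data is shown below; `np.diff` stands in for whatever derivative estimator the authors actually used (which is consistent with FD having one fewer column than SR), and the 661 samples correspond to the total trees in Table 1.

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples, n_bands = 661, 977            # 661 trees sampled (Table 1), 977 bands
sr = rng.random((n_samples, n_bands))    # spectral reflectance (SR)
fd = np.diff(sr, axis=1)                 # first-derivative proxy (FD): 976 columns
vi = rng.random((n_samples, 35))         # 35 vegetation indices (VI), placeholder values

# The seven predictor sets of Table 3, built by horizontal stacking.
feature_sets = {
    "SR": sr,
    "FD": fd,
    "VI": vi,
    "SR-FD-VI": np.hstack([sr, fd, vi]),
    "SR-FD": np.hstack([sr, fd]),
    "SR-VI": np.hstack([sr, vi]),
    "FD-VI": np.hstack([fd, vi]),
}
counts = {name: x.shape[1] for name, x in feature_sets.items()}
# counts == {'SR': 977, 'FD': 976, 'VI': 35, 'SR-FD-VI': 1988,
#            'SR-FD': 1953, 'SR-VI': 1012, 'FD-VI': 1011}
```

Each matrix could then be passed to a classifier such as scikit-learn's `RandomForestClassifier` to reproduce the seven-model comparison.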
Table 4. Confusion matrix for the spectral reflectance (SR) model. For tree species abbreviations, refer to Table 1. Other abbreviations are as follows: UA = user’s accuracy; PA = producer’s accuracy. Correctly classified observations are highlighted in gray. Columns represent reference data, and rows represent classified data.
| | KC | CH | AL | JL | LL | NM | RM | SM | CP | Total | UA% |
|---|---|---|---|---|---|---|---|---|---|---|---|
| KC | 12 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 14 | 85.7 |
| CH | 0 | 10 | 2 | 0 | 1 | 0 | 1 | 0 | 0 | 14 | 71.4 |
| AL | 1 | 1 | 15 | 0 | 5 | 3 | 0 | 1 | 1 | 27 | 55.6 |
| JL | 2 | 0 | 0 | 23 | 0 | 0 | 1 | 0 | 2 | 28 | 82.1 |
| LL | 0 | 0 | 4 | 0 | 13 | 2 | 0 | 0 | 0 | 19 | 68.4 |
| NM | 0 | 5 | 4 | 0 | 0 | 21 | 0 | 4 | 0 | 34 | 61.8 |
| RM | 0 | 0 | 0 | 0 | 0 | 0 | 11 | 1 | 0 | 12 | 91.7 |
| SM | 2 | 0 | 0 | 0 | 0 | 1 | 4 | 15 | 1 | 23 | 65.2 |
| CP | 0 | 0 | 0 | 1 | 2 | 0 | 3 | 0 | 17 | 23 | 73.9 |
| Total | 17 | 16 | 25 | 24 | 21 | 27 | 21 | 22 | 21 | 194 | |
| PA% | 70.6 | 62.5 | 60.0 | 95.8 | 61.9 | 77.8 | 52.4 | 68.2 | 81.0 | | |
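The accuracy figures reported in these confusion matrices can be recomputed directly from the raw counts. The NumPy sketch below checks the SR model's matrix (Table 4); rows are classified labels and columns are reference labels, in the species order KC, CH, AL, JL, LL, NM, RM, SM, CP.

```python
import numpy as np

# Confusion matrix from Table 4 (rows = classified, columns = reference).
cm = np.array([
    [12,  0,  0,  0,  0,  0,  1,  1,  0],
    [ 0, 10,  2,  0,  1,  0,  1,  0,  0],
    [ 1,  1, 15,  0,  5,  3,  0,  1,  1],
    [ 2,  0,  0, 23,  0,  0,  1,  0,  2],
    [ 0,  0,  4,  0, 13,  2,  0,  0,  0],
    [ 0,  5,  4,  0,  0, 21,  0,  4,  0],
    [ 0,  0,  0,  0,  0,  0, 11,  1,  0],
    [ 2,  0,  0,  0,  0,  1,  4, 15,  1],
    [ 0,  0,  0,  1,  2,  0,  3,  0, 17],
])

ua = 100 * np.diag(cm) / cm.sum(axis=1)  # user's accuracy: correct / row total
pa = 100 * np.diag(cm) / cm.sum(axis=0)  # producer's accuracy: correct / column total
oa = 100 * np.diag(cm).sum() / cm.sum()  # overall accuracy
# round(oa, 1) -> 70.6
```

The same three lines reproduce the UA%, PA%, and overall accuracy for any of the seven models' matrices.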
Table 5. Confusion matrix for the first derivative (FD) model. For tree species abbreviations, refer to Table 1. Other abbreviations are as follows: UA = user’s accuracy; PA = producer’s accuracy. Correctly classified observations are highlighted in gray. Columns represent reference data, and rows represent classified data.
| | KC | CH | AL | JL | LL | NM | RM | SM | CP | Total | UA% |
|---|---|---|---|---|---|---|---|---|---|---|---|
| KC | 12 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 12 | 100.0 |
| CH | 0 | 11 | 1 | 0 | 2 | 0 | 0 | 0 | 0 | 14 | 78.6 |
| AL | 0 | 2 | 18 | 0 | 4 | 1 | 0 | 1 | 1 | 27 | 66.7 |
| JL | 4 | 0 | 0 | 23 | 0 | 0 | 0 | 0 | 1 | 28 | 82.1 |
| LL | 1 | 0 | 1 | 0 | 11 | 2 | 0 | 0 | 0 | 15 | 73.3 |
| NM | 0 | 2 | 4 | 0 | 1 | 22 | 0 | 3 | 0 | 32 | 68.8 |
| RM | 0 | 0 | 0 | 0 | 0 | 0 | 21 | 1 | 0 | 22 | 95.5 |
| SM | 0 | 1 | 0 | 0 | 0 | 2 | 0 | 17 | 0 | 20 | 85.0 |
| CP | 0 | 0 | 1 | 1 | 3 | 0 | 0 | 0 | 19 | 24 | 79.2 |
| Total | 17 | 16 | 25 | 24 | 21 | 27 | 21 | 22 | 21 | 194 | |
| PA% | 70.6 | 68.8 | 72.0 | 95.8 | 52.4 | 81.5 | 100.0 | 77.3 | 90.5 | | |
Table 6. Confusion matrix for the vegetation indices (VI) model. For tree species abbreviations, refer to Table 1. Other abbreviations are as follows: UA = user’s accuracy; PA = producer’s accuracy. Correctly classified observations are highlighted in gray. Columns represent reference data, and rows represent classified data.
| | KC | CH | AL | JL | LL | NM | RM | SM | CP | Total | UA% |
|---|---|---|---|---|---|---|---|---|---|---|---|
| KC | 16 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 17 | 94.1 |
| CH | 0 | 13 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 16 | 81.3 |
| AL | 0 | 1 | 15 | 0 | 5 | 0 | 0 | 0 | 2 | 23 | 65.2 |
| JL | 0 | 0 | 0 | 22 | 0 | 0 | 2 | 0 | 2 | 26 | 84.6 |
| LL | 0 | 0 | 6 | 0 | 13 | 0 | 0 | 0 | 0 | 19 | 68.4 |
| NM | 0 | 2 | 3 | 0 | 0 | 25 | 0 | 3 | 1 | 34 | 73.5 |
| RM | 0 | 0 | 0 | 1 | 0 | 0 | 13 | 2 | 0 | 16 | 81.3 |
| SM | 1 | 0 | 0 | 0 | 2 | 1 | 3 | 17 | 0 | 24 | 70.8 |
| CP | 0 | 0 | 0 | 0 | 1 | 0 | 2 | 0 | 16 | 19 | 84.2 |
| Total | 17 | 16 | 25 | 24 | 21 | 27 | 21 | 22 | 21 | 194 | |
| PA% | 94.1 | 81.3 | 60.0 | 91.7 | 61.9 | 92.6 | 61.9 | 77.3 | 76.2 | | |
Table 7. Confusion matrix for the spectral reflectance, first derivative, and vegetation indices (SR-FD-VI) model. For tree species abbreviations, refer to Table 1. Other abbreviations are as follows: UA = user’s accuracy; PA = producer’s accuracy. Correctly classified observations are highlighted in gray. Columns represent reference data, and rows represent classified data.
| | KC | CH | AL | JL | LL | NM | RM | SM | CP | Total | UA% |
|---|---|---|---|---|---|---|---|---|---|---|---|
| KC | 13 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 15 | 86.7 |
| CH | 0 | 10 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 13 | 76.9 |
| AL | 0 | 2 | 17 | 0 | 4 | 1 | 0 | 1 | 1 | 26 | 65.4 |
| JL | 2 | 0 | 0 | 22 | 0 | 0 | 0 | 0 | 1 | 25 | 88.0 |
| LL | 0 | 0 | 3 | 0 | 13 | 1 | 0 | 0 | 0 | 17 | 76.5 |
| NM | 0 | 4 | 4 | 0 | 1 | 23 | 0 | 3 | 0 | 35 | 65.7 |
| RM | 0 | 0 | 0 | 0 | 0 | 0 | 19 | 1 | 0 | 20 | 95.0 |
| SM | 2 | 0 | 0 | 0 | 0 | 2 | 1 | 16 | 0 | 21 | 76.2 |
| CP | 0 | 0 | 0 | 1 | 2 | 0 | 0 | 0 | 19 | 22 | 86.4 |
| Total | 17 | 16 | 25 | 24 | 21 | 27 | 21 | 22 | 21 | 194 | |
| PA% | 76.5 | 62.5 | 68.0 | 91.7 | 61.9 | 85.2 | 90.5 | 72.7 | 90.5 | | |
Table 8. Confusion matrix for the spectral reflectance and first derivative (SR-FD) model. For tree species abbreviations, refer to Table 1. Other abbreviations are as follows: UA = user’s accuracy; PA = producer’s accuracy. Correctly classified observations are highlighted in gray. Columns represent reference data, and rows represent classified data.
| | KC | CH | AL | JL | LL | NM | RM | SM | CP | Total | UA% |
|---|---|---|---|---|---|---|---|---|---|---|---|
| KC | 14 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 16 | 87.5 |
| CH | 0 | 10 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 13 | 76.9 |
| AL | 0 | 2 | 15 | 0 | 4 | 1 | 0 | 1 | 1 | 24 | 62.5 |
| JL | 2 | 0 | 0 | 22 | 0 | 0 | 0 | 0 | 1 | 25 | 88.0 |
| LL | 1 | 0 | 4 | 0 | 13 | 1 | 0 | 0 | 0 | 19 | 68.4 |
| NM | 0 | 4 | 5 | 0 | 1 | 23 | 0 | 3 | 0 | 36 | 63.9 |
| RM | 0 | 0 | 0 | 0 | 0 | 0 | 19 | 1 | 0 | 20 | 95.0 |
| SM | 0 | 0 | 0 | 0 | 0 | 2 | 1 | 16 | 0 | 19 | 84.2 |
| CP | 0 | 0 | 0 | 1 | 2 | 0 | 0 | 0 | 19 | 22 | 86.4 |
| Total | 17 | 16 | 25 | 24 | 21 | 27 | 21 | 22 | 21 | 194 | |
| PA% | 82.4 | 62.5 | 60.0 | 91.7 | 61.9 | 85.2 | 90.5 | 72.7 | 90.5 | | |
Table 9. Confusion matrix for the spectral reflectance and vegetation indices (SR-VI) model. For tree species abbreviations, refer to Table 1. Other abbreviations are as follows: UA = user’s accuracy; PA = producer’s accuracy. Correctly classified observations are highlighted in gray. Columns represent reference data, and rows represent classified data.
| | KC | CH | AL | JL | LL | NM | RM | SM | CP | Total | UA% |
|---|---|---|---|---|---|---|---|---|---|---|---|
| KC | 11 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 13 | 84.6 |
| CH | 0 | 11 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 14 | 78.6 |
| AL | 1 | 1 | 15 | 0 | 5 | 2 | 0 | 1 | 2 | 27 | 55.6 |
| JL | 2 | 0 | 0 | 22 | 0 | 0 | 2 | 0 | 2 | 28 | 78.6 |
| LL | 0 | 0 | 5 | 0 | 13 | 2 | 0 | 0 | 0 | 20 | 65.0 |
| NM | 0 | 4 | 3 | 0 | 0 | 22 | 0 | 4 | 0 | 33 | 66.7 |
| RM | 1 | 0 | 0 | 0 | 0 | 0 | 10 | 1 | 0 | 12 | 83.3 |
| SM | 2 | 0 | 1 | 0 | 0 | 1 | 5 | 15 | 0 | 24 | 62.5 |
| CP | 0 | 0 | 0 | 1 | 2 | 0 | 3 | 0 | 17 | 23 | 73.9 |
| Total | 17 | 16 | 25 | 24 | 21 | 27 | 21 | 22 | 21 | 194 | |
| PA% | 64.7 | 68.8 | 60.0 | 91.7 | 61.9 | 81.5 | 47.6 | 68.2 | 81.0 | | |
Table 10. Confusion matrix for the first derivative and vegetation indices (FD-VI) model. For tree species abbreviations, refer to Table 1. Other abbreviations are as follows: UA = user’s accuracy; PA = producer’s accuracy. Correctly classified observations are highlighted in gray. Columns represent reference data, and rows represent classified data.
| | KC | CH | AL | JL | LL | NM | RM | SM | CP | Total | UA% |
|---|---|---|---|---|---|---|---|---|---|---|---|
| KC | 12 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 12 | 100.0 |
| CH | 0 | 11 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 13 | 84.6 |
| AL | 0 | 2 | 18 | 0 | 5 | 1 | 0 | 1 | 1 | 28 | 64.3 |
| JL | 4 | 0 | 0 | 23 | 0 | 0 | 0 | 0 | 1 | 28 | 82.1 |
| LL | 1 | 0 | 1 | 0 | 11 | 0 | 0 | 0 | 0 | 13 | 84.6 |
| NM | 0 | 3 | 4 | 0 | 1 | 24 | 0 | 3 | 0 | 35 | 68.6 |
| RM | 0 | 0 | 0 | 0 | 0 | 0 | 21 | 1 | 0 | 22 | 95.5 |
| SM | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 17 | 0 | 19 | 89.5 |
| CP | 0 | 0 | 1 | 1 | 3 | 0 | 0 | 0 | 19 | 24 | 79.2 |
| Total | 17 | 16 | 25 | 24 | 21 | 27 | 21 | 22 | 21 | 194 | |
| PA% | 70.6 | 68.8 | 72.0 | 95.8 | 52.4 | 88.9 | 100.0 | 77.3 | 90.5 | | |
Share and Cite

Duchesne, R.R.; Krebs, A.; Seuser, M. Spectral Characterization of Nine Urban Tree Species in Southern Wisconsin. Remote Sens. 2026, 18, 99. https://doi.org/10.3390/rs18010099
