Estimation of Boreal Forest Attributes from Very High Resolution Pléiades Data
Abstract: In this study, the potential of using very high resolution Pléiades imagery to estimate a number of common forest attributes for 10-m plots in boreal forest was examined, given that a high-resolution terrain model was available. The explanatory variables were derived from three processing alternatives: height metrics were extracted by image matching of images acquired from different incidence angles; spectral derivatives were obtained through principal component analysis of the spectral bands; and second-order textural metrics were extracted from a gray-level co-occurrence matrix computed with an 11 × 11 pixel moving window. The analysis took place at two Swedish test sites, Krycklan and Remningstorp, containing boreal and hemi-boreal forest. The lowest RMSEs were 1.4 m (7.7%) for Lorey’s mean height, 1.7 m (10%) for airborne laser scanning height percentile 90, 5.1 m²·ha⁻¹ (22%) for basal area, 66 m³·ha⁻¹ (27%) for stem volume, and 26 tons·ha⁻¹ (26%) for above-ground biomass. The image-matched height metrics were the most important variables in all models, and the spectral and textural metrics contained similar information. Nevertheless, the best estimates were obtained when all three explanatory sources were used. To conclude, image-matched height metrics should be prioritised over spectral metrics when estimating forest attributes.
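The textural metrics mentioned above can be illustrated with a minimal sketch of a gray-level co-occurrence matrix (GLCM) computed over an 11 × 11 window. The window size comes from the abstract; the quantization to 16 gray levels, the 1-pixel horizontal offset, and the choice of contrast and homogeneity as example metrics are assumptions for illustration, not the paper's exact configuration.

```python
import numpy as np

def glcm_metrics(window, levels=16, offset=(0, 1)):
    """Second-order texture metrics (contrast, homogeneity) for one window.

    The quantization level (16) and the pixel offset (one step to the
    right) are illustrative assumptions, not taken from the paper.
    """
    # Quantize the window intensities to a small number of gray levels.
    q = np.floor(window / window.max() * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    dr, dc = offset
    rows, cols = q.shape
    # Count co-occurring gray-level pairs at the given offset.
    for r in range(rows - dr):
        for c in range(cols - dc):
            glcm[q[r, c], q[r + dr, c + dc]] += 1
    glcm /= glcm.sum()  # normalize to joint probabilities
    i, j = np.indices(glcm.shape)
    contrast = np.sum(glcm * (i - j) ** 2)
    homogeneity = np.sum(glcm / (1.0 + (i - j) ** 2))
    return contrast, homogeneity

# Example: texture metrics for one synthetic 11 x 11 pixel window.
rng = np.random.default_rng(0)
window = rng.random((11, 11))
contrast, homogeneity = glcm_metrics(window)
```

In practice, such a function would be slid across the whole image and the resulting per-window metrics used as explanatory variables alongside the height and spectral metrics.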
Cite This Article
Persson, H.J. Estimation of Boreal Forest Attributes from Very High Resolution Pléiades Data. Remote Sens. 2016, 8, 736.
Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.