Canopy height is a fundamental biophysical and structural parameter, crucial for biodiversity monitoring, forest inventory and management, and numerous ecological and environmental studies and applications. It is also a determinant for linking land cover classes to habitat categories, towards building one-to-one relationships between them. Light detection and ranging (LiDAR) and 3D stereoscopy are the most commonly used and most accurate remote sensing approaches for measuring canopy height; however, both require significant time and budget resources. This study proposes a cost-effective methodology for canopy height approximation using texture analysis on a single 2D image. An object-oriented approach is followed, using a land cover (LC) map as the segmentation vector layer to delineate landscape objects. Global texture feature descriptors are calculated for each land cover object and used as variables in a number of classifiers, including single and ensemble trees and support vector machines. The aim of the analysis is discrimination among height classes spanning the wide range of values used for habitat mapping (from less than 5 cm to 40 m). For that task, different spatial resolutions, representing qualities ranging from airborne to spaceborne sensors, are tested both individually and in combination, forming a multiresolution training set. Multiple alternative datasets are formed depending on the missing-data handling, outlier removal, and data normalization techniques applied. The approach was applied to orthomosaics from DMC II airborne images and evaluated against a reference LiDAR-derived canopy height model (CHM). Results reached overall object-based accuracies of 67%, with the percentage of total area correctly classified exceeding 88%. Sentinel-2 simulation and multiresolution analysis (MRA) experiments achieved even higher accuracies, up to 85% and 91%, respectively, at reduced computational cost, showing the potential for transferring the framework to large spatial scales.
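The core of the pipeline, extracting global texture descriptors per land cover object and feeding them to a classifier, can be sketched as follows. This is a minimal illustration, not the paper's exact implementation: the specific descriptor set (Haralick-style contrast, homogeneity, and energy from a gray-level co-occurrence matrix), the quantization to 8 gray levels, and the single pixel offset are all assumptions made for brevity.

```python
import numpy as np

def glcm_features(patch, levels=8, offset=(0, 1)):
    """Global texture descriptors for one land cover object (image patch).

    Illustrative sketch: quantizes the patch, builds a gray-level
    co-occurrence matrix (GLCM) for one pixel offset, and derives
    three standard Haralick-style features.
    """
    patch = np.asarray(patch, dtype=float)
    # Quantize pixel values to a small number of gray levels.
    if patch.max() > patch.min():
        q = np.floor((patch - patch.min()) / (patch.max() - patch.min())
                     * (levels - 1)).astype(int)
    else:
        q = np.zeros(patch.shape, dtype=int)
    dr, dc = offset
    glcm = np.zeros((levels, levels))
    rows, cols = q.shape
    # Count co-occurrences of gray-level pairs at the given offset.
    for r in range(rows - dr):
        for c in range(cols - dc):
            glcm[q[r, c], q[r + dr, c + dc]] += 1
    glcm /= glcm.sum()  # normalize counts to joint probabilities
    i, j = np.indices(glcm.shape)
    return {
        "contrast": float(np.sum(glcm * (i - j) ** 2)),
        "homogeneity": float(np.sum(glcm / (1.0 + (i - j) ** 2))),
        "energy": float(np.sum(glcm ** 2)),
    }

# A flat (textureless) object has zero contrast and maximal energy;
# a checkerboard pattern has high contrast and low energy.
flat = glcm_features(np.full((8, 8), 100))
checker = glcm_features(np.indices((8, 8)).sum(axis=0) % 2 * 255)
```

In the full workflow, one such feature vector per land cover object would be stacked into a design matrix and passed to, e.g., a random forest or support vector machine to predict the height class.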
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.