Multispectral LiDAR Data for Land Cover Classification of Urban Areas
Abstract
Airborne Light Detection And Ranging (LiDAR) systems usually operate at a single (monochromatic) wavelength, measuring the range and the strength of the reflected energy (intensity) from objects. Recently, multispectral LiDAR sensors, which acquire data at several wavelengths, have emerged, allowing a diversity of spectral reflectance to be recorded from objects. In this context, we investigate the use of multispectral LiDAR data for land cover classification using two different techniques. The first is image-based classification, in which intensity and height images are created from the LiDAR points and a maximum likelihood classifier is applied. The second is point-based classification, in which ground filtering and Normalized Difference Vegetation Index (NDVI) computation are conducted. A dataset of an urban area located in Oshawa, Ontario, Canada, is classified into four classes: buildings, trees, roads, and grass. Overall accuracies of up to 89.9% and 92.7% are achieved from image classification and 3D point classification, respectively. A radiometric correction model is also applied to the intensity data in order to remove the attenuation due to system distortion and terrain height variation. The classification process is then repeated, and the results show no significant improvement in overall accuracy.
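The NDVI mentioned in the abstract contrasts near-infrared (NIR) reflectance, which vegetation returns strongly, with reflectance in a visible band. As a minimal sketch (not the authors' implementation), assuming per-point intensity values are available from a NIR channel and a visible channel of a multispectral LiDAR sensor, the index could be computed as:

```python
import numpy as np

def ndvi(nir_intensity, vis_intensity):
    """Normalized Difference Vegetation Index from two intensity channels.

    Hypothetical helper: `nir_intensity` and `vis_intensity` are assumed to be
    per-point intensities from a NIR and a visible wavelength, respectively.
    """
    nir = np.asarray(nir_intensity, dtype=float)
    vis = np.asarray(vis_intensity, dtype=float)
    # NDVI = (NIR - VIS) / (NIR + VIS); vegetation tends toward high values
    return (nir - vis) / (nir + vis)

# Illustrative points: a vegetated return (strong NIR) vs. an asphalt return
points_nir = np.array([180.0, 40.0])
points_vis = np.array([60.0, 50.0])
print(ndvi(points_nir, points_vis))  # [ 0.5        -0.11111111]
```

A per-point threshold on this index is one way such values could separate vegetation (trees, grass) from non-vegetation (buildings, roads) in a point-based workflow.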
Share & Cite This Article
Morsy, S.; Shaker, A.; El-Rabbany, A. Multispectral LiDAR Data for Land Cover Classification of Urban Areas. Sensors 2017, 17, 958.
Morsy S, Shaker A, El-Rabbany A. Multispectral LiDAR Data for Land Cover Classification of Urban Areas. Sensors. 2017; 17(5):958.

Chicago/Turabian Style
Morsy, Salem, Ahmed Shaker, and Ahmed El-Rabbany. 2017. "Multispectral LiDAR Data for Land Cover Classification of Urban Areas." Sensors 17, no. 5: 958.
Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.