Automatic Segmentation of Raw LIDAR Data for Extraction of Building Roofs
Abstract
Automatic extraction of building roofs from remote sensing data is important for many applications, including 3D city modeling. This paper proposes a new method for automatic segmentation of raw LIDAR (light detection and ranging) data. Using the ground height from a DEM (digital elevation model), the raw LIDAR points are separated into two groups. The first group contains the ground points that form a “building mask”. The second group contains non-ground points that are clustered using the building mask. A cluster of points usually represents an individual building or tree. During segmentation, the planar roof segments are extracted from each cluster of points and refined using rules such as the coplanarity of points and their locality. Planes on trees are removed using information such as area and point height difference. Experimental results on nine areas from six different data sets show that the proposed method successfully removes vegetation and thus offers a high success rate for building detection (about 90% correctness and completeness) and roof plane extraction (about 80% correctness and completeness), even when the LIDAR point density is as low as four points/m². The proposed method can therefore be exploited in a variety of applications.
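The first step described above, separating raw LIDAR points into ground and non-ground groups using DEM ground heights, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the point format, the `dem_height` lookup, and the height threshold are all assumptions introduced here.

```python
# Hypothetical sketch of the ground/non-ground separation step.
# The threshold value and DEM interface are assumptions for illustration.

GROUND_THRESHOLD = 1.0  # metres above DEM ground height (assumed value)

def separate_points(points, dem_height):
    """Split raw LIDAR points into ground and non-ground groups.

    points     -- iterable of (x, y, z) tuples
    dem_height -- callable mapping (x, y) to the DEM ground elevation
    """
    ground, non_ground = [], []
    for x, y, z in points:
        if z - dem_height(x, y) <= GROUND_THRESHOLD:
            ground.append((x, y, z))      # contributes to the "building mask"
        else:
            non_ground.append((x, y, z))  # later clustered into buildings/trees
    return ground, non_ground

# Toy usage: a flat DEM at elevation 10 m
flat_dem = lambda x, y: 10.0
pts = [(0, 0, 10.2), (1, 0, 14.5), (2, 0, 10.8), (3, 0, 16.0)]
ground, non_ground = separate_points(pts, flat_dem)
print(len(ground), len(non_ground))  # prints "2 2"
```

In practice the DEM lookup would interpolate a gridded elevation model rather than return a constant, and the non-ground group would then be clustered via the building mask as the abstract describes.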
Awrangjeb, M.; Fraser, C.S. Automatic Segmentation of Raw LIDAR Data for Extraction of Building Roofs. Remote Sens. 2014, 6, 3716-3751.