# Automatic Filtering of Lidar Building Point Cloud in Case of Trees Associated to Building Roof

## Abstract


## 1. Introduction

## 2. Related Work

## 3. Filtering of Building Point Cloud Which Does Not Contain High Trees

```
h_i (i = 1..n) = histogram(Z_pointcloud)
h_max = max(h_i); PSTF = 0
For i = 1 To 3
    If h_i − h_(i+1) > 0.5 × h_max Or h_i − h_(i+2) > 0.6 × h_max Then PSTF = i End If
Next i
If PSTF = 0 Then PSTF = 4 End If
h_mrb = max(h_i), i = PSTF To length(h)
h_rb = zeros(n); S = 0; S1 = 0   (S1 is the first roof bar, S is the last roof bar)
For i = PSTF To n
    If h_i ≥ (1/3) × h_mrb Then
        h_rb(i) = h_i; S = i
        If S1 = 0 Then S1 = i End If
    ElseIf h_i ≤ 0.1 × h_mrb Then h_fb(i) = h_i
    Else h_fuz(i) = h_i End If
Next i
If S < n Then For i = S To n: h_rb(i) = h_i: Next i End If
If h_(S1−1) ≤ h_mrb / 10 Then Z_(S1−0.5 To S1−1) ∈ h_rb End If
```

where h_{rb}: roof bars; h_{fb}: façade bars; h_{fuz}: fuzzy bars.
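The bar analysis above can be sketched in Python as follows. Variable names mirror the pseudocode; the 0.5 m bin width and the synthetic cloud layout in the usage example are illustrative assumptions, not values from the paper.

```python
import numpy as np

def classify_histogram_bars(z, bin_width=0.5):
    """Label the Z-histogram bars of a building point cloud as roof,
    facade or fuzzy bars, following the pseudocode above (0-based
    indices; the 0.5 m bin width is an assumption)."""
    edges = np.arange(z.min(), z.max() + bin_width, bin_width)
    h, _ = np.histogram(z, bins=edges)
    n, h_max = len(h), h.max()

    # PSTF: position separating the lowest bars from the roof bars,
    # detected as a sharp drop among the first three bars; defaults to
    # the 4th bar when no sharp drop is found.
    pstf = -1
    for i in range(min(3, n - 2)):
        if h[i] - h[i + 1] > 0.5 * h_max or h[i] - h[i + 2] > 0.6 * h_max:
            pstf = i
    if pstf == -1:
        pstf = 3

    h_mrb = h[pstf:].max()          # most represented (roof) bar
    labels = np.full(n, "below", dtype=object)
    s1 = s = -1                     # first and last roof bars
    for i in range(pstf, n):
        if h[i] >= h_mrb / 3:
            labels[i] = "roof"
            s = i
            if s1 == -1:
                s1 = i
        elif h[i] <= 0.1 * h_mrb:
            labels[i] = "facade"
        else:
            labels[i] = "fuzzy"
    if s != -1 and s < n - 1:       # bars above the last roof bar stay roof
        labels[s + 1:] = "roof"
    return h, labels, pstf
```

For a cloud with a dominant roof level and sparse façade points below it, the dominant bar is labelled roof and the sparse lower bars façade.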

## 4. Results of Filtering Algorithm Application

- The analysis of fuzzy bars in the building point cloud histogram has to be postponed until after the tree points have been eliminated. This choice is supported by the fact that the roof planes that may be represented by fuzzy bars are low in comparison to the main building roof level and have smaller areas. If these planes were extracted before eliminating the tree points, they might be eliminated during the tree point recognition step.
- Applying the first seven rules to the building point cloud creates a new point cloud that represents the building roof (without the planes of fuzzy bars) in addition to the tree crowns. This cloud will be named the noisy roof cloud.

## 5. Extended Filtering Algorithm

The extended filtering algorithm starts with the calculation of the neighborhood matrix (N_{m}) of the noisy roof point cloud.

#### 5.1. Selection of Neighbouring Points

The selection of neighboring points allows for constructing the neighborhood matrix (N_{m}). This matrix is determined from the neighboring points of each cloud point. It consists of n rows (n is the number of points of the noisy roof cloud) and N columns, where N is the maximum possible number of neighboring points (see Dey et al. [27]). In the point cloud list, the order of each point is the point number; e.g., if point number i has 45 neighboring points, then the first 45 entries of row i will contain the neighboring point numbers, and the rest of the cells of this row will contain zeros. This matrix will be employed for calculating the normal vector in addition to the roof feature values. Moreover, it will be used later (see Section 5.4) to recognize the roof points.
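A minimal construction of N_{m} can be sketched as below. The paper selects a variable point neighborhood following Dey et al. [27]; for clarity this sketch substitutes a plain fixed-radius brute-force search, so `radius` and `n_max` are illustrative assumptions.

```python
import numpy as np

def neighborhood_matrix(points, radius, n_max):
    """Build N_m: row i holds the 1-based numbers of the points lying
    within `radius` of point i, padded with zeros up to n_max columns.
    Brute-force fixed-radius search (a simplification of [27])."""
    n = len(points)
    n_m = np.zeros((n, n_max), dtype=int)
    for i, p in enumerate(points):
        d = np.linalg.norm(points - p, axis=1)
        # 1-based point numbers, the point itself excluded
        idx = np.flatnonzero((d <= radius) & (d > 0)) + 1
        n_m[i, :min(len(idx), n_max)] = idx[:n_max]
    return n_m
```

Rows with fewer than `n_max` neighbors keep trailing zeros, matching the padding convention described above.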

Once the matrix N_{m} is calculated, the normal vectors of the noisy roof cloud can be determined.

#### 5.2. Calculation of Normal Vectors and Change of Curvature Factor

Starting from the neighborhood matrix N_{m}, the best plane passing through each point can be fitted, and the normal vector can then be calculated. To achieve this objective, the plane equation is fitted using the eigenvectors of the covariance matrix, as employed by Sanchez et al. [28]. In this paper, three criteria are employed for distinguishing the roof points from the tree points: the standard deviation (σ) of the fitted plane (in meters), the angle (φ°) of the normal vector with the horizontal plane, and the change of curvature factor (C_{cf}) (unitless), which is calculated from the eigenvalues according to Equation (1) [29].
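The three per-point criteria can be sketched as follows. Equation (1) did not survive extraction, so the usual surface-variation form C_cf = λ_min / (λ1 + λ2 + λ3) is assumed here as its likely content [29]; this is an assumption, not the paper's verbatim formula.

```python
import numpy as np

def point_features(neighbors):
    """Fit a plane to a point's neighborhood via the eigen-decomposition
    of the covariance matrix [28] and derive the three filtering criteria:
    sigma (m), phi (degrees) and C_cf (unitless).
    Assumed form: C_cf = l_min / (l_1 + l_2 + l_3)."""
    centered = neighbors - neighbors.mean(axis=0)
    cov = np.cov(centered.T)
    eigval, eigvec = np.linalg.eigh(cov)         # ascending eigenvalues
    normal = eigvec[:, 0]                        # direction of least variance
    sigma = np.sqrt(max(eigval[0], 0.0))         # std of distances to the plane
    # angle of the normal with the horizontal plane (90 deg = flat roof)
    phi = np.degrees(np.arcsin(np.clip(abs(normal[2]), 0.0, 1.0)))
    c_cf = eigval[0] / eigval.sum()              # change of curvature factor
    return sigma, phi, c_cf
```

For a perfectly planar horizontal neighborhood, sigma and C_cf vanish and phi is 90°.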

#### 5.3. Filtering of Noisy Roof Point Cloud

For each point i, σ_{i}, φ_{i} and C_{cf,i} are, respectively, the standard deviation, the angle (φ°) and the change of curvature factor of point number i. Furthermore, T_{hσ}, T_{hφ} and T_{hccf} are the assigned thresholds used for detecting the roof points. In this context, the employed threshold values change from one building to another, and a stochastic method is adopted for calculating the smart threshold values. This method relies on histogram analysis, as shown in the next section.
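A minimal sketch of the per-point decision is given below. The direction of each comparison (tight plane fit, near-vertical normal, smooth surface) is inferred from the threshold construction in Section 5.4 and is therefore an assumption.

```python
import numpy as np

def is_roof_point(sigma, phi, c_cf, th_sigma, th_phi, th_ccf):
    """Per-point roof test: plane fit tight (sigma_i <= T_hsigma),
    normal near-vertical (phi_i >= T_hphi) and surface smooth
    (C_cf,i <= T_hccf). Comparison directions are assumed."""
    sigma, phi, c_cf = (np.asarray(a, dtype=float) for a in (sigma, phi, c_cf))
    return (sigma <= th_sigma) & (phi >= th_phi) & (c_cf <= th_ccf)
```

Applied to whole arrays of features, it returns a boolean roof mask for the noisy roof cloud.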

#### 5.4. Calculation of Smart Threshold Values

Three histograms are built for σ, φ° and C_{cf}, respectively. The bin widths of these histograms are, respectively, 0.2 m, 10° and 0.01 (unitless). These values were selected empirically depending on the range of each parameter and the sensitivity of the filtering results to the parameter's variation; e.g., in Figure 5 and Figure 6, the variation of φ° values has less influence on the filtering result than the variation of the C_{cf} parameter, which is why the φ° histogram has fewer bins than the C_{cf} histogram. Though the high-quality results shown in Section 6 demonstrate that the selected values of the three histogram steps are successful, more investigation is needed to prove that they are optimal and applicable to different point densities and urban typologies.

The smart threshold values are derived from the feature histograms, including the histogram of C_{cf} presented in Figure 6. The threshold T_{hφ} is shifted two bins to the left of the most frequent angle bin. The same procedure is applied to the T_{hσ} threshold, except that T_{hσ} is shifted only one bar to the right of the most frequent standard deviation bar. Finally, T_{hccf} is considered to be the most frequent value in the C_{cf} histogram (Figure 6) without shifting.
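The threshold derivation above can be sketched as follows; the bin widths come from Section 5.4, while returning the left edge of the shifted bin as the threshold value is an assumption for illustration.

```python
import numpy as np

def smart_thresholds(sigma, phi, c_cf):
    """Derive T_hsigma, T_hphi and T_hccf from histograms of the point
    features, applying the bin shifts described above."""
    def mode_bin(values, width):
        # histogram with the bin widths of Section 5.4
        edges = np.arange(min(values), max(values) + 2 * width, width)
        counts, _ = np.histogram(values, bins=edges)
        return edges, int(np.argmax(counts))

    e_phi, i_phi = mode_bin(phi, 10.0)                 # 10 deg bins
    th_phi = e_phi[max(i_phi - 2, 0)]                  # two bins left of mode
    e_sig, i_sig = mode_bin(sigma, 0.2)                # 0.2 m bins
    th_sigma = e_sig[min(i_sig + 1, len(e_sig) - 1)]   # one bin right of mode
    e_cf, i_cf = mode_bin(c_cf, 0.01)                  # 0.01 bins
    th_ccf = e_cf[i_cf]                                # mode, no shift
    return th_sigma, th_phi, th_ccf
```

Because the thresholds are read off each building's own histograms, they adapt automatically from one building to another, as the paper requires.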

Table 1 shows that the T_{hσ} value is constant in eight buildings (T_{hσ} = 0.8 m). In fact, the nine buildings belong to the same point cloud (Vaihingen city) and have the same urban typology and similar texture. On the other hand, the values of the thresholds T_{hφ} and T_{hccf} are related to the architectural complexity level of the roof; e.g., Building 2 has the most architecturally complex roof, which is why its T_{hφ} value is the smallest (30°) and its T_{hccf} value is the largest (0.04).

To recover all roof points, the N_{m} matrix is employed. For each detected point in Figure 8a, all of its neighboring points have to be added to the roof class according to the matrix N_{m}. Figure 8b illustrates the result of adding the neighboring points. At this stage, all remaining points in the noisy point cloud represent the tree class (red color in Figure 8c). Inside the black circles in Figure 8c, some roof points that were misclassified as vegetation points can be seen. The process for improving this result is detailed in the next section.
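The neighbor-expansion step can be sketched with the N_{m} convention defined in Section 5.1 (1-based point numbers, zero padding); a single expansion pass is assumed here.

```python
import numpy as np

def expand_roof_class(seed_mask, n_m):
    """Add the neighbors of every detected roof point to the roof class,
    using the neighborhood matrix N_m (row i holds the 1-based numbers
    of point i's neighbors, padded with zeros)."""
    roof = np.asarray(seed_mask, dtype=bool).copy()
    for i in np.flatnonzero(roof):          # iterate over the seed points
        neighbors = n_m[i]
        neighbors = neighbors[neighbors > 0] - 1   # to 0-based indices
        roof[neighbors] = True
    return roof
```

Points left `False` after the expansion form the tree class shown in red in Figure 8c.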

#### 5.5. Improvement of Filtering Results

## 6. Results

#### 6.1. Datasets

The dataset covers an area of 214 m × 159 m and contains residential buildings with tree coverage that partially covers the buildings. In terms of topography, it is a flat area.

#### 6.2. Accuracy Estimation

- In the case of high buildings with a high point density, small roof planes that are not located at the main roof level may not be detected.
- In the case of buildings that consist of several masses with different heights, the portions of façade points at the same altitudes as the low roof planes will not be cancelled.
- Some high tree points will not be eliminated.
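The correctness (Corr), completeness (Comp) and quality (Q) percentages reported in the tables can be computed from a per-point confusion matrix. The paper's formulas did not survive extraction, so the standard definitions are assumed below.

```python
def accuracy_metrics(tp, fp, fn):
    """Correctness, completeness and quality (in %) from true positives,
    false positives and false negatives of the roof class; the standard
    definitions Corr = TP/(TP+FP), Comp = TP/(TP+FN) and
    Q = TP/(TP+FP+FN) are assumed."""
    corr = 100.0 * tp / (tp + fp)
    comp = 100.0 * tp / (tp + fn)
    q = 100.0 * tp / (tp + fp + fn)
    return corr, comp, q
```

Q penalizes both commission (FP) and omission (FN) errors, which is why it is always the lowest of the three values in the result tables.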

## 7. Discussion

#### 7.1. Filtering Accuracy Discussion

In the Toronto dataset, where the mean point density = 6 points/m², the threshold on the number of façade points according to Rule 6 is 300 points. Consequently, the detection probability of a small and low roof plane (for example, area = 5 m² and number of points = 30) will be low, because the number of the plane points in addition to the façade points belonging to the same bar may be smaller than the selected threshold. In the same context, when the building is high, this probability decreases further because the point density of the roof surface will be greater than the mean point density. Figure 13c shows that one low and small roof plane is missed by the filtering algorithm (the red arrow points to the missed roof plane).

High tree points whose feature values (σ, φ° and C_{cf}) resemble those of roof points may not be eliminated (see blue arrow in Figure 10b). In fact, the percentage of these points in all tested buildings, in comparison to the number of tree points, was negligible (less than 1%), which is why their influence on the modelling step was neglected.

#### 7.2. Accuracy Comparison

First, the building classification produces a building mask named m_{1}. According to the classification accuracy, a certain amount of nonbuilding points and noisy points may be present in this mask. Second, a region-growing algorithm is applied to this mask to detect each building independently; each building is then filtered by the algorithm suggested in this paper. This procedure generates a filtered building mask named m_{2}. Third, a 2D outline modelling algorithm is applied to m_{1}, producing a new model named m_{3}. Finally, the three calculated masks can be compared with a reference model to estimate their accuracy through the confusion matrix. If the deformations generated by the modelling algorithm in m_{3} are neglected, the accuracies of the three models can be compared to judge which model is most faithful to the reference model.

#### 7.3. Comparison with Deep-Learning-Based Methods

#### 7.4. Ablation Study

## 8. Conclusions and Perspective

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## References

1. Shan, J.; Yan, J.; Jiang, W. Global Solutions to Building Segmentation and Reconstruction. In Topographic Laser Ranging and Scanning: Principles and Processing, 2nd ed.; Shan, J., Toth, C.K., Eds.; CRC Press: Boca Raton, FL, USA, 2018; pp. 459–484.
2. Tarsha Kurdi, F.; Landes, T.; Grussenmeyer, P.; Koehl, M. Model-driven and data-driven approaches using Lidar data: Analysis and comparison. In Proceedings of the ISPRS Workshop, Photogrammetric Image Analysis (PIA07), Munich, Germany, 19–21 September 2007; Volume XXXVI, Part 3/W49A, pp. 87–92.
3. Wehr, A.; Lohr, U. Airborne laser scanning—An introduction and overview. ISPRS J. Photogramm. Remote Sens. 1999, 54, 68–82.
4. He, Y. Automated 3D Building Modelling from Airborne Lidar Data. Ph.D. Thesis, University of Melbourne, Melbourne, Australia, 2015.
5. Tarsha Kurdi, F.; Awrangjeb, M. Comparison of Lidar building point cloud with reference model for deep comprehension of cloud structure. Can. J. Remote Sens. 2020, 46, 603–621.
6. Shao, J.; Zhang, W.; Shen, A.; Mellado, N.; Cai, S.; Luo, L.; Wang, N.; Yan, G.; Zhou, G. Seed point set-based building roof extraction from airborne LiDAR point clouds using a top-down strategy. Autom. Constr. 2021, 126, 103660.
7. Wen, C.; Li, X.; Yao, X.; Peng, L.; Chi, T. Airborne LiDAR point cloud classification with global-local graph attention convolution neural network. ISPRS J. Photogramm. Remote Sens. 2021, 173, 181–194.
8. Varlik, A.; Uray, F. Filtering airborne LIDAR data by using fully convolutional networks. Surv. Rev. 2021, 1–11.
9. Hui, Z.; Li, Z.; Cheng, P.; Ziggah, Y.Y.; Fan, J. Building extraction from airborne LiDAR data based on multi-constraints graph segmentation. Remote Sens. 2021, 13, 3766.
10. Tarsha Kurdi, F.; Awrangjeb, M.; Munir, N. Automatic filtering and 2D modelling of Lidar building point cloud. Trans. GIS 2021, 25, 164–188.
11. Maltezos, E.; Doulamis, A.; Doulamis, N.; Ioannidis, C. Building extraction from LiDAR data applying deep convolutional neural networks. IEEE Geosci. Remote Sens. Lett. 2019, 16, 155–159.
12. Liu, M.; Shao, Y.; Li, R.; Wang, Y.; Sun, X.; Wang, J.; You, Y. Method for extraction of airborne LiDAR point cloud buildings based on segmentation. PLoS ONE 2020, 15, e0232778.
13. Demir, N. Automated detection of 3D roof planes from Lidar data. J. Indian Soc. Remote Sens. 2018, 46, 1265–1272.
14. Tarsha Kurdi, F.; Awrangjeb, M.; Munir, N. Automatic 2D modelling of inner roof planes boundaries starting from Lidar data. In Proceedings of the 14th 3D GeoInfo 2019, Singapore, 26–27 September 2019; pp. 107–114.
15. Park, S.Y.; Lee, D.G.; Yoo, E.J.; Lee, D.C. Segmentation of Lidar data using multilevel cube code. J. Sens. 2019, 2019, 4098413.
16. Zhang, K.; Yan, J.; Chen, S.C. A framework for automated construction of building models from airborne Lidar measurements. In Topographic Laser Ranging and Scanning: Principles and Processing, 2nd ed.; Shan, J., Toth, C.K., Eds.; CRC Press: Boca Raton, FL, USA, 2018; pp. 563–586.
17. Li, M.; Rottensteiner, F.; Heipke, C. Modelling of buildings from aerial Lidar point clouds using TINs and label maps. ISPRS J. Photogramm. Remote Sens. 2019, 154, 127–138.
18. Vosselman, G.; Dijkman, S. 3D building model reconstruction from point clouds and ground plans. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2001, 34, 37–44.
19. Hu, P.; Miao, Y.; Hou, M. Reconstruction of Complex Roof Semantic Structures from 3D Point Clouds Using Local Convexity and Consistency. Remote Sens. 2021, 13, 1946.
20. Nguyen, T.H.; Daniel, S.; Guériot, D.; Sintès, C.; Le Caillec, J.-M. Super-Resolution-Based Snake Model—An Unsupervised Method for Large-Scale Building Extraction Using Airborne LiDAR Data and Optical Image. Remote Sens. 2020, 12, 1702.
21. Zhang, W.; Wang, H.; Chen, Y.; Yan, K.; Chen, M. 3D building roof modelling by optimizing primitive's parameters using constraints from Lidar data and aerial imagery. Remote Sens. 2014, 6, 8107–8133.
22. Jung, J.; Sohn, G. Progressive modelling of 3D building rooftops from airborne Lidar and imagery. In Topographic Laser Ranging and Scanning: Principles and Processing, 2nd ed.; Shan, J., Toth, C.K., Eds.; CRC Press: Boca Raton, FL, USA, 2018; pp. 523–562.
23. Awrangjeb, M.; Gilani, S.A.N.; Siddiqui, F.U. An effective data-driven method for 3D building roof reconstruction and robust change detection. Remote Sens. 2018, 10, 1512.
24. Pirotti, F.; Zanchetta, C.; Previtali, M.; Della Torre, S. Detection of building roofs and façade from aerial laser scanning data using deep learning. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 4211, 975–980.
25. Martin-Jimenez, J.; Del Pozo, S.; Sanchez-Aparicio, M.; Laguela, S. Multi-scale roof characterization from Lidar data and aerial orthoimagery: Automatic computation of building photovoltaic capacity. Autom. Constr. 2020, 109, 102965.
26. Sibiya, M.; Sumbwanyambe, M. An algorithm for severity estimation of plant leaf diseases by the use of colour threshold image segmentation and fuzzy logic inference: A proposed algorithm to update a "Leaf Doctor" application. AgriEngineering 2019, 1, 205–219.
27. Dey, E.K.; Tarsha Kurdi, F.; Awrangjeb, M.; Stantic, B. Effective Selection of Variable Point Neighbourhood for Feature Point Extraction from Aerial Building Point Cloud Data. Remote Sens. 2021, 13, 1520.
28. Sanchez, J.; Denis, F.; Coeurjolly, D.; Dupont, F.; Trassoudaine, L.; Checchin, P. Robust normal vector estimation in 3D point clouds through iterative principal component analysis. ISPRS J. Photogramm. Remote Sens. 2020, 163, 18–35.
29. Thomas, H.; Goulette, F.; Deschaud, J.; Marcotegui, B.; LeGall, Y. Semantic Classification of 3D Point Clouds with Multiscale Spherical Neighborhoods. In Proceedings of the International Conference on 3D Vision (3DV), Verona, Italy, 5–8 September 2018; pp. 390–398.
30. Tarsha Kurdi, F.; Landes, T.; Grussenmeyer, P. Joint combination of point cloud and DSM for 3D building reconstruction using airborne laser scanner data. In Proceedings of the 4th IEEE GRSS / WG III/2+5, VIII/1, VII/4 Joint Workshop on Remote Sensing & Data Fusion over Urban Areas and 6th International Symposium on Remote Sensing of Urban Areas, Télécom Paris, Paris, France, 11–13 April 2007; p. 7.
31. Eurosdr. Available online: www.eurosdr.net (accessed on 23 November 2021).
32. Cramer, M. The DGPF test on digital aerial camera evaluation—Overview and test design. Photogramm.-Fernerkund.-Geoinf. 2010, 2, 73–82.
33. Awrangjeb, M.; Zhang, C.; Fraser, C.S. Automatic extraction of building roofs using LiDAR data and multispectral imagery. ISPRS J. Photogramm. Remote Sens. 2013, 83, 1–18.
34. Huang, R.; Yang, B.; Liang, F.; Dai, W.; Li, J.; Tian, M.; Xu, W. A top-down strategy for buildings extraction from complex urban scenes using airborne LiDAR point clouds. Infrared Phys. Technol. 2018, 92, 203–218.
35. Widyaningrum, E.; Gorte, B.; Lindenbergh, R. Automatic Building Outline Extraction from ALS Point Clouds by Ordered Points Aided Hough Transform. Remote Sens. 2019, 11, 1727.
36. Zhao, Z.; Duan, Y.; Zhang, Y.; Cao, R. Extracting buildings from and regularizing boundaries in airborne LiDAR data using connected operators. Int. J. Remote Sens. 2016, 37, 889–912.
37. Griffiths, D.; Boehm, J. Improving public data for building segmentation from Convolutional Neural Networks (CNNs) for fused airborne lidar and image data using active contours. ISPRS J. Photogramm. Remote Sens. 2019, 154, 70–83.
38. Ayazi, S.M.; Saadat Seresht, M. Comparison of traditional and machine learning base methods for ground point cloud labeling. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, XLII-4/W18, 141–145.

**Figure 1.** (**a**,**d**) Building point clouds in the Vaihingen and Hermanni datasets; (**b**,**e**) histograms of the Z coordinate of the building point clouds, respectively, of the last two buildings; (**c**,**f**) final results of building point cloud filtering. The red dot in (**b**,**e**) is the PSTF; "1,2,3" in circles in (**b**,**e**) are building section numbers.

**Figure 2.** (**a**,**d**) Building point clouds, respectively, in the Toronto and Strasbourg datasets; (**b**,**e**) histograms of the Z coordinate of the building point clouds, respectively, of the last two buildings; (**c**,**f**) final results of building point cloud filtering. The red dots in (**b**,**e**) are the PSTF; "1,2,3" in circles in (**b**,**e**) are building section numbers.

**Figure 3.** Results of filtering algorithm application in the case of buildings which contain high trees that occlude the roof. (**a**,**d**,**g**) The original building point clouds; (**b**,**e**,**h**) histograms of building point clouds; (**c**,**f**,**i**) noisy roof point clouds; (**a**–**c**) building number 4; (**d**–**f**) building number 5; (**g**–**i**) building number 6. The red dots in (**b**,**e**,**h**) are the PSTF; "1,2,3" in circles in (**b**,**e**,**h**) are building section numbers.

**Figure 4.** Workflow of the extended filtering algorithm, where σ is the standard deviation of the fitted plane, φ° is the angle of the normal vector with the horizontal plane, C_{cf} is the change of curvature factor, and T_{hσ}, T_{hφ} and T_{hccf} are their respective thresholds.

**Figure 5.** (**a**,**b**) Histograms of the standard deviation (σ) and the angle (φ°) of the normal vector with the horizontal plane of the noisy point cloud.

**Figure 7.** Algorithm flowchart for smart calculation of the employed thresholds; Th_{i} (Th_Fy, Th_Sd, Th_Cf) are the used thresholds; ε_{i} are the last shifts applied on the thresholds; e_{i} are the differences in the number of points detected by the thresholds before applying the shifts ε_{i}.

**Figure 8.** Separation of roof points from high tree points. (**a**) Result of threshold employment; (**b**) adding neighboring points to the result; (**c**) building points in blue and tree points in red; black circles refer to misclassified points.

**Figure 9.** Visualization of the normalized Digital Surface Models (nDSM) before improvement; (**a**) vegetation class; (**b**) roof class.

**Figure 10.** Final filtering results of the three building point clouds illustrated in Figure 3; (**a**) Building 1; (**b**) Building 2; (**c**) Building 3. Green color represents vegetation and red color represents roof.

**Figure 11.** Locations of the study areas: (**a**) Hermanni; (**b**) Toronto; (**c**) Aitkenvale; (**d**) Vaihingen; (**e**) Strasbourg.

**Figure 12.**Comparison between number of points of building point clouds, reference roof clouds and filtered roof clouds of Vaihingen dataset; the first 51 buildings have trees associated with roofs and the last 17 buildings have no trees associated with roofs.

**Figure 13.** (**a**) Building point cloud in the Toronto dataset; (**b**) histogram of the Z coordinate of the building point cloud; (**c**) final result of building point cloud filtering. The red dot in (**b**) is the PSTF; the blue arrow in (**c**) points to façade points that are saved; the red arrow in (**c**) points to the missed small and low plane; "1,2,3" in circles in (**b**) are building section numbers.

**Table 1.**Values of calculated thresholds for the three buildings represented in Figure 3 in addition to five other buildings from Vaihingen point cloud.

| | T_{hφ} (Degree) | T_{hσ} (m) | T_{hccf} (Unitless) |
|---|---|---|---|
| Building 1 | 40 | 0.8 | 0.01 |
| Building 2 | 30 | 0.8 | 0.04 |
| Building 3 | 50 | 0.8 | 0.02 |
| Building 4 | 30 | 0.9 | 0.03 |
| Building 5 | 30 | 0.8 | 0.02 |
| Building 6 | 40 | 0.8 | 0.02 |
| Building 7 | 30 | 0.8 | 0.01 |
| Building 8 | 45 | 0.8 | 0.02 |
| Building 9 | 30 | 0.8 | 0.02 |

| | Hermanni | Strasbourg | Toronto | Vaihingen | Aitkenvale |
|---|---|---|---|---|---|
| Acquisition | June 2002 | September 2004 | February 2009 | August 2008 | -- |
| Sensor | TopoEye | TopScan (Optech ALTM 1225) | Optech ALTM-ORION M | Leica Geosystems (Leica ALS50) | -- |
| Point density (points/m²) | 7–9 | 1.3 | 6 | 4–6.7 | 29–40 |
| Flight height (m) | 200 | 1440 | 650 | 500 | >100 |

| Dataset | Number of Buildings | Number of Points | Undesirable Points (%) | Corr (%) | Comp (%) | Q (%) |
|---|---|---|---|---|---|---|
| *Average values* | | | | | | |
| Hermanni | 12 | 7380 | 13.37 | 99.69 | 99.76 | 99.45 |
| Strasbourg | 56 | 1433 | 18.05 | 98.44 | 95.63 | 94.24 |
| Toronto | 72 | 14,700 | 23.25 | 98.56 | 95.57 | 94.23 |
| Vaihingen | 68 | 2542 | 41.65 | 94.85 | 98.39 | 93.37 |
| Aitkenvale | 28 | 9822 | 37.1 | 98.1 | 98.75 | 96.93 |
| Average | | 7175 | 26.7 | 97.9 | 97.6 | 95.6 |

| | Corr (%) | Comp (%) | Q (%) |
|---|---|---|---|
| Maltezos et al. [11] | 85.3 | 93.8 | 80.8 |
| Widyaningrum et al. [35] | 90.1 | 99.4 | 89.6 |
| Huang et al. [34] | 96.8 | 96.2 | 93.2 |
| Zhao et al. [36] | 91.0 | 95.0 | 86.8 |
| Hui et al. [9] | 91.61 | 93.61 | 85.74 |
| Wen et al. [7] | 95.1 | 91.2 | 86.7 |
| Suggested approach | 97.9 | 97.6 | 95.6 |

| Total Build | Z_His | φ° | σ | ccf | φ° + σ + ccf | N_{m} | Imp | Fuz | Roof | Tree | Und |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 2936 | 2503 | 1456 | 1329 | 1368 | 1065 | 597 | 14 | 55 | 1727 | 776 | 433 |

N_{m}: neighborhood matrix; Imp: improvement of filtering algorithm; Fuz: fuzzy bars analysis; Und: undesirable points; +: and.

Publisher's Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Tarsha Kurdi, F.; Gharineiat, Z.; Campbell, G.; Awrangjeb, M.; Dey, E.K.
Automatic Filtering of Lidar Building Point Cloud in Case of Trees Associated to Building Roof. *Remote Sens.* **2022**, *14*, 430.
https://doi.org/10.3390/rs14020430
