Multidirectional Shift Rasterization (MDSR) Algorithm for Effective Identification of Ground in Dense Point Clouds
Abstract
1. Introduction
Current Ground Filtering Approaches
2. Materials and Methods
2.1. Algorithm Description
 Moving the raster as a whole (i.e., shifting the cells in which the lowest point is identified) in steps much smaller than the cell size along both horizontal axes (X, Y);
 Tilting (more precisely, rotating) the raster around all three axes (X, Y, Z), which changes the perspective in which the elevations of individual points are evaluated in the given projection. After each such rotation, the raster is again shifted step by step as above, and the point with the lowest elevation in each cell is selected after each shift.
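The shifting step alone can be illustrated with a short Python sketch (a minimal illustration of the principle under our own naming, not the authors' implementation; the rotation step is omitted here and appears in Appendix A):

```python
import math

def lowest_per_cell(points, cell, dx=0.0, dy=0.0):
    """Index of the lowest-z point in each raster cell, with the raster
    origin shifted by (dx, dy). `points` is a list of (x, y, z) tuples."""
    lowest = {}
    for i, (x, y, z) in enumerate(points):
        key = (math.floor((x + dx) / cell), math.floor((y + dy) / cell))
        if key not in lowest or z < points[lowest[key]][2]:
            lowest[key] = i
    return set(lowest.values())

def mdsr_single_rotation(points, cell, n_shifts):
    """Union of lowest points over an n_shifts x n_shifts grid of
    sub-cell raster shifts (one raster orientation only)."""
    step = cell / n_shifts
    ground = set()
    for ix in range(n_shifts):
        for iy in range(n_shifts):
            ground |= lowest_per_cell(points, cell, ix * step, iy * step)
    return ground
```

With, e.g., 4 × 4 sub-cell shifts, a low point near a cell border that a single fixed raster would discard gets a cell of its own in some shifted pass and is retained.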
2.2. Illustration of the Algorithm Principle
2.3. Data for Testing
2.4. Data Processing and Evaluation
2.5. Filters Selected for Comparison
2.6. Visual Evaluation
2.7. Evaluation of the Differences from the MDSR-Based Surface
3. Results
3.1. Visual Evaluation
3.2. Comparison of Point Clouds with an MDSR-Based Surface
4. Discussion
 No approximations, simplifications, or assumptions about the terrain are made;
 The filtering step also thins out the point cloud;
 Compared with common geometric filters, this one can be used for much more complex terrains and is, therefore, much more versatile;
 The computational demands increase approximately quadratically with the required level of detail (through the number of raster shifts and the raster size);
 Like all filtering methods, this one also requires verification by an operator; typically, a thin layer of filtering artifacts at the edges of the area must be removed, either manually or simply by cropping the edges by (typically) several decimeters to a few meters;
 Where a dense ground point cloud is needed, the MDSR method could also be used as the first step of an advanced multistep algorithm: it provides the first terrain approximation, after which the remaining ground points can be identified by a threshold, as in standard filters.
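The multistep idea in the last point could look roughly as follows (a hypothetical sketch: the function name `densify`, the brute-force nearest-neighbour search, and the one-sided height threshold are our assumptions, not the paper's method):

```python
def densify(all_points, ground_ids, threshold):
    """Re-admit points whose height above the nearest (in XY) point of
    the first-pass ground set is within `threshold`. Brute force; fine
    for small clouds, a spatial index would be needed at scale."""
    ground = [all_points[i] for i in ground_ids]
    kept = set(ground_ids)
    for i, (x, y, z) in enumerate(all_points):
        # nearest first-pass ground point in the horizontal plane
        gx, gy, gz = min(ground, key=lambda g: (g[0] - x) ** 2 + (g[1] - y) ** 2)
        if z - gz <= threshold:
            kept.add(i)
    return kept
```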
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A. MDSR Algorithm Code
Points = [ x1 y1 z1 R G B ...     % point 1
           x2 y2 z2 R G B ...     % point 2
           ...
           xp yp zp R G B ... ]   % point p
 0. Calculation parameters

Raster cell size: RasterSize = R
Number of shifts: NumberofShifts = N
Shift size: Shift = R/N
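The relation between the parameters can be checked against the parameter table in this article (Site 1: raster size 1 m, 10 shifts):

```python
# Shift size derived from the raster cell size and the number of shifts,
# using the Site 1 settings from the parameter table (R = 1 m, N = 10).
RasterSize = 1.0       # raster cell size R (m)
NumberOfShifts = 10    # N
Shift = RasterSize / NumberOfShifts  # 0.1 m, matching the table
```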
 1. Reduction of the coordinates of the entire point cloud so that the minimum X, Y, and Z coordinates equal zero

function Result = ReduceCoords(Points)
    Xm = min(Points(:,1));
    Ym = min(Points(:,2));
    Zm = min(Points(:,3));
    for i = 1:p
        Result(i,:) = Points(i,:) - [Xm Ym Zm];
    end;
end
 2. Rotation of the entire point cloud by defined angles about the individual axes

function Result = RotatePoints(alfa, beta, gama, Points)
    RotX = [1 0 0; 0 cos(alfa) -sin(alfa); 0 sin(alfa) cos(alfa)];
    RotY = [cos(beta) 0 sin(beta); 0 1 0; -sin(beta) 0 cos(beta)];
    RotZ = [cos(gama) -sin(gama) 0; sin(gama) cos(gama) 0; 0 0 1];
    Rot = RotZ*RotX*RotY;
    for i = 1:p
        Result(i,:) = (Rot*Points(i,:)')';
    end;
end
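For reference, the same combined rotation Rz·Rx·Ry in Python, with the minus signs of the standard right-handed rotation matrices written out (a sketch; the helper names are ours, angles in radians):

```python
import math

def rotation_matrix(alfa, beta, gama):
    """Combined rotation Rz*Rx*Ry, mirroring the appendix pseudocode."""
    ca, sa = math.cos(alfa), math.sin(alfa)
    cb, sb = math.cos(beta), math.sin(beta)
    cg, sg = math.cos(gama), math.sin(gama)
    rx = [[1, 0, 0], [0, ca, -sa], [0, sa, ca]]
    ry = [[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]]
    rz = [[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]]

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    return matmul(matmul(rz, rx), ry)
```

Since each factor is orthonormal, the product must be orthonormal as well, which gives a quick sanity check on the signs.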
 3. Shift of the X and Y coordinates by a predefined value

function Result = ShiftPoints(Sx, Sy, Points)
    for i = 1:p
        Result(i,:) = Points(i,:) + [Sx Sy 0];
    end;
end
 4. Search for the point with the lowest elevation in each raster cell

function rastr = Rasterize(RasterSize, Points)
    max_x = max(Points(:,1));
    max_y = max(Points(:,2));
    limit_x = ceil(max_x/RasterSize);
    limit_y = ceil(max_y/RasterSize);
    rastr = zeros(limit_x, limit_y);
    for i = 1:p
        x = floor(Points(i,1)/RasterSize) + 1;
        y = floor(Points(i,2)/RasterSize) + 1;
        if rastr(x,y) == 0
            rastr(x,y) = i;
        elseif Points(i,3) < Points(rastr(x,y),3)
            rastr(x,y) = i;
        end;
    end;
end
 5. Saving the identified data (a routine data-management task)

function result = SaveLowestPoints(LPointsID, Points)
    % storage paths and output format are to be implemented here
end
 6. The program itself

PointsC = ReduceCoords(Points)
for alfa = [alfa1, alfa2, ..., alfam]
    for beta = [beta1, beta2, ..., betan]
        for gama = [gama1, gama2, ..., gamao]
            PointsR = RotatePoints(alfa, beta, gama, PointsC);
            PointsR = ReduceCoords(PointsR);
            for SX = 0:N
                for SY = 0:N
                    PointsS = ShiftPoints(SX*Shift, SY*Shift, PointsR);
                    LowestPointsID = Rasterize(RasterSize, PointsS);
                    SaveLowestPoints(LowestPointsID, Points);
                end;
            end;
        end;
    end;
end;
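Assembled end to end, the appendix steps can be sketched in Python as follows (a compact illustration under our own naming; it uses a single angle list for all three axes and N shifts per axis, since a full-cell shift would repeat the unshifted raster):

```python
import math

def reduce_coords(points):
    # Step 1: translate so the minimum X, Y, and Z coordinates are zero.
    mx = min(p[0] for p in points)
    my = min(p[1] for p in points)
    mz = min(p[2] for p in points)
    return [(x - mx, y - my, z - mz) for x, y, z in points]

def rotate(points, alfa, beta, gama):
    # Step 2: apply Ry, then Rx, then Rz (i.e., Rot = Rz*Rx*Ry), radians.
    ca, sa = math.cos(alfa), math.sin(alfa)
    cb, sb = math.cos(beta), math.sin(beta)
    cg, sg = math.cos(gama), math.sin(gama)
    out = []
    for x, y, z in points:
        x, z = cb * x + sb * z, -sb * x + cb * z   # Ry
        y, z = ca * y - sa * z, sa * y + ca * z    # Rx
        x, y = cg * x - sg * y, sg * x + cg * y    # Rz
        out.append((x, y, z))
    return out

def rasterize_lowest(points, cell):
    # Step 4: index of the lowest point in each raster cell.
    lowest = {}
    for i, (x, y, z) in enumerate(points):
        key = (math.floor(x / cell), math.floor(y / cell))
        if key not in lowest or z < points[lowest[key]][2]:
            lowest[key] = i
    return set(lowest.values())

def mdsr(points, cell, n_shifts, angles):
    # Steps 3 and 6: union of lowest-point indices over all rotations
    # and sub-cell raster shifts (steps 3 and 5 folded in).
    step = cell / n_shifts
    ground = set()
    for alfa in angles:
        for beta in angles:
            for gama in angles:
                pr = reduce_coords(rotate(points, alfa, beta, gama))
                for sx in range(n_shifts):
                    for sy in range(n_shifts):
                        ps = [(x + sx * step, y + sy * step, z)
                              for x, y, z in pr]
                        ground |= rasterize_lowest(ps, cell)
    return ground
```

On a toy cloud (four corner points on flat ground plus one elevated "vegetation" point in the middle), only the ground points survive, since the elevated point is never the lowest in any cell under any shift.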
Appendix B. MDSR Algorithm Flowchart
Appendix C. Scanning Systems Used for Data Acquisition
 DJI Zenmuse L1 UAV Scanner
Dimensions  152 × 110 × 169 mm 
Weight  930 ± 10 g 
Maximum measurement distance  450 m at 80% reflectivity; 190 m at 10% reflectivity 
Recording speed  Single return: max. 240,000 points/s; multiple return: max. 480,000 points/s 
System accuracy (1σ)  Horizontal: 10 cm per 50 m; vertical: 5 cm per 50 m 
Distance measurement accuracy (1σ)  3 cm per 100 m 
Beam divergence  0.28° (vertical) × 0.03° (horizontal) 
Maximum registered reflections  
RGB camera sensor size  
RGB camera effective pixels  
 Carrier UAV
Weight  Approx. 6.3 kg (with one gimbal) 
Max. transmitting distance (Europe)  8 km 
Max. flight time  55 min 
Dimensions  810 × 670 × 430 mm 
Max. payload  2.7 kg 
Max. speed  82 km/h 
 Leica Pegasus Mobile Scanner
Weight  51 kg 
Dimensions  60 × 76 × 68 cm 
Typical horizontal accuracy (RMS)  0.020 m 
Typical vertical accuracy (RMS)  0.015 m 
Laser scanner  ZF 9012 
Scanner frequency  1 million points per second 
Other accessories and features  Cameras; IMU; wheel sensor; GNSS (GPS and GLONASS) 
 Trimble X7 Terrestrial Scanner
Weight  5.8 kg 
Dimensions  178 mm × 353 mm × 170 mm 
Laser wavelength  1550 nm 
Field of view  360° × 282° 
Scan speed  Up to 500 kHz 
Range measurement principle  Time-of-flight 
Range noise  <2.5 mm/30 m 
Range accuracy (1 sigma)  2 mm 
Angular accuracy (1 sigma)  21″ 
Other important features  Sensor auto-calibration; 3 coaxial calibrated 10 MPix cameras; automatic level compensation (range ±10°); inertial navigation system for auto-registration 
 Leica P40 Terrestrial Scanner
Weight  12.25 kg 
Dimensions  238 mm × 358 mm × 395 mm 
Laser wavelength  1550 nm/658 nm 
Field of view  360° × 290° 
Scan speed  Up to 1 million points/s 
Range measurement principle  Time-of-flight 
Range accuracy (1 sigma)  1.2 mm + 10 ppm 
Angular accuracy (1 sigma)  8″ 
Other important features  Dual-axis compensator (accuracy 1.5″); internal camera: 4 MP per 17° × 17° color image, 700 MP for panoramic image 
 DJI Phantom 4 RTK
Weight  1391 g 
Max. transmitting distance (Europe)  5 km 
Max. flight time  30 min 
Dimensions  250 × 250 × 200 mm (approx.) 
Max. speed  58 km/h 
Camera resolution  4864 × 3648 
Appendix D. Filtering Results for All Sites and Filters
Data  Raster Size (m)  Number of Shifts  Shift Size (m)  Alfa (gon)  Beta (gon)  Gamma (gon) 
Site 1  1  10  0.1  −25; 0; 25  −25; 0; 25  −25; 0; 25 
Site 2  5  10  0.5  −25; 0; 25  −25; 0; 25  0; 50 
Site 3  1  10  0.1  0; 25; 50; 75; 90; 120  −50; 0; 50  −50; 0; 50 
Site 4  7.5  25  0.3  −25; 0; 25  −25; 0; 25  −25; 0; 25 
Data  PMF  SMRF  CSF  ATIN 
Site 1  Cell size 1.0; initial distance 0.5; max distance 0.5; max window size 1; slope 1  Cell 0.5; scalar 1; slope 3; threshold 0.1; window 1  Cloth resolution 0.2 m; classif. threshold 0.1 m; scene: slope; slope processing: yes; iterations 500  
Site 2  Cell size 0.5; initial distance 1.0; max distance 1.0; max window size 1.0; slope 3  Cell 0.4; scalar 0.2; slope 3; threshold 0.05; window 10  Cloth resolution 0.1 m; classif. threshold 0.1 m; scene: slope; slope processing: yes; iterations 500  
Site 3  Cell size 0.3; initial distance 0.8; max distance 1.0; max window size 0.5; slope 3  Cell 0.1; scalar 0.1; slope 3; threshold 0.1; window 1  Cloth resolution 0.1 m; classif. threshold 0.3 m; scene: slope; slope processing: yes; iterations 500  
Site 4  Cell size 0.3; initial distance 0.5; max distance 1.0; max window size 12; slope 2  Cell 0.4; scalar 0.3; slope 0.7; threshold 0.2; window 10  Cloth resolution 0.1 m; classif. threshold 0.1 m; scene: slope; slope processing: no; iterations 500  

Data  Original  PMF  SMRF  CSF  ATIN  MDSR 

Site 1  1,408,072  1,244,526  1,126,268  1,128,881  959,390  1,028,481 
Site 2  10,016,451  2,009,885  1,407,364  1,251,470  701,946  280,087 
Site 3  2,785,912  592,083  439,181  502,067  420,902  316,611 
Site 4  830,836  190,795  173,228  153,196  166,794  66,791 
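For illustration, the share of points each filter retains follows directly from these counts, e.g., for Site 4:

```python
# Percentage of the original Site 4 cloud retained by each filter,
# computed from the counts in the table above.
original = 830_836
counts = {"PMF": 190_795, "SMRF": 173_228, "CSF": 153_196,
          "ATIN": 166_794, "MDSR": 66_791}
ratios = {name: round(n / original * 100, 1) for name, n in counts.items()}
# MDSR keeps about 8% of the points, the other filters 18-23%.
```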
Data  Unfiltered (m)  PMF (m)  SMRF (m)  CSF (m)  ATIN (m)  

Above  Below  Above  Below  Above  Below  Above  Below  Above  Below  
Site 1  2.434  0.007  0.072  0.015  0.043  0.014  0.099  0.007  0.048  0.060 
Site 2  10.339  0.025  0.631  0.026  0.950  0.028  0.395  0.025  0.045  0.027 
Site 3  7.764  0.006  0.270  0.003  0.211  0.003  0.316  0.006  0.074  0.002 
Site 4  20.238  0.017  0.145  0.010  0.098  0.010  0.099  0.017  0.080  0.017 
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Štroner, M.; Urban, R.; Línková, L. Multidirectional Shift Rasterization (MDSR) Algorithm for Effective Identification of Ground in Dense Point Clouds. Remote Sens. 2022, 14, 4916. https://doi.org/10.3390/rs14194916