# An Integrated Solution for 3D Heritage Modeling Based on Videogrammetry and V-SLAM Technology


## Abstract


## 1. Introduction

## 2. Materials and Methods

#### 2.1. Camera Calibration

#### 2.2. Data Acquisition and Preliminary Image Selection

#### 2.3. Image Selection by Filtering Process

#### 2.4. Photogrammetric Process

#### 2.4.1. Point Cloud Filtering

#### 2.4.2. Mesh and Texture Mapping Generation

## 3. Experimental Tests

- The first test compared the time spent in data acquisition and data processing.
- The second test comprised a series of calculations and experiments to compare the resolution and distribution of the point clouds produced by the two systems.
- The third test evaluated the precision of both systems through three sub-tests. In the first sub-test, 150 targets were measured as control points and a comparative accuracy assessment of both systems was carried out; in the second sub-test, precision and zonal deformations were evaluated, and the presence of systematic errors between the proposed system and the laser scanner was ruled out by means of circular statistical analysis; in the third sub-test, a visual comparison was made between cross-sections of the point clouds resulting from both systems.
- The last test evaluated the resulting textures of both systems through the analysis of different variables.
- These tests are described in detail below.

#### 3.1. Comparison of Data Acquisition and Processing Times

#### 3.2. Point Cloud Resolution and Distribution

For the individual scan test, points were captured over a flat 1 m² surface, following movement in a direction perpendicular to the optical axis of the main image (at a distance of 1 m). The outlier data were removed and the resulting points were projected onto the plane; with these points, we calculated the average resolution value, as in the first case. The resolution values obtained with each technique are shown in Table 3; the resolution obtained with the laser scanner was, in all cases, about 3.5 times higher than that obtained with the proposed system.
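The average resolution described here (mean spacing between neighbouring points) can be approximated as the mean nearest-neighbour distance of the cloud. The following sketch is an illustrative reconstruction with synthetic grid data, not the authors' implementation; the function name and parameters are hypothetical:

```python
import numpy as np
from scipy.spatial import cKDTree

def average_resolution(points: np.ndarray) -> float:
    """Mean nearest-neighbour distance of an N x 3 point cloud."""
    tree = cKDTree(points)
    # k=2: the closest point to each query point is the point itself (distance 0)
    dists, _ = tree.query(points, k=2)
    return float(dists[:, 1].mean())

# Synthetic check: a planar grid with 1.3 mm spacing should report ~1.3 mm
side = np.arange(0.0, 0.1, 0.0013)            # 0.1 m x 0.1 m patch
xx, yy = np.meshgrid(side, side)
cloud = np.column_stack([xx.ravel(), yy.ravel(), np.zeros(xx.size)])
print(f"average resolution: {average_resolution(cloud) * 1000:.2f} mm")
```

On real scan data, the outliers would be removed (e.g. with the SOR/ROR filters of Section 2.4.1) before this statistic is computed.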

#### 3.3. Accuracy Assessments

#### 3.3.1. Control-Point Accuracy Test

where ${\mathrm{a}}_{\mathrm{i}}$ is the i-th check point measured by the proposed system (or the laser scanner), ${\mathrm{b}}_{\mathrm{i}}$ is the corresponding check point acquired by the total station, and R and T are the rotation and translation parameters of the 3D Helmert transformation, respectively.
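As an illustration of this accuracy test, the rigid part of the transformation (rotation R and translation T) can be estimated by least squares with the Kabsch (SVD) algorithm, after which the check-point error vectors and their RMSE follow directly. This is a minimal sketch with synthetic data, not the authors' implementation, and it omits the scale factor that a full Helmert transformation may include:

```python
import numpy as np

def helmert_fit(a: np.ndarray, b: np.ndarray):
    """Least-squares rotation R and translation T with R @ a_i + T ≈ b_i,
    estimated with the Kabsch (SVD) algorithm from N x 3 point arrays."""
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    H = (a - ca).T @ (b - cb)                    # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    T = cb - R @ ca
    return R, T

def check_point_errors(a: np.ndarray, b: np.ndarray):
    """Error vectors b_i - (R a_i + T) and their RMSE after alignment."""
    R, T = helmert_fit(a, b)
    residuals = b - (a @ R.T + T)
    rmse = float(np.sqrt((residuals ** 2).sum(axis=1).mean()))
    return residuals, rmse

# Synthetic check: recover a known rotation + translation exactly
rng = np.random.default_rng(0)
a = rng.random((150, 3))                         # 150 synthetic "check points"
angle = np.deg2rad(30.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
b = a @ R_true.T + np.array([1.0, 2.0, 3.0])
_, rmse = check_point_errors(a, b)
print(f"RMSE after alignment: {rmse:.2e}")
```

For noise-free synthetic data the RMSE is numerically zero; on real control points the residual vectors correspond to the per-axis errors reported in Table 4.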

#### 3.3.2. Analysis of Systematic Errors Using Circular Statistics

- Average azimuth $\left(\overline{\theta}\right)$, obtained from the vector sum of all the vectors in the sample, as calculated by the following equations:$$\overline{\theta}=\arctan\frac{S}{C},\quad \mathrm{with}\quad S={{\displaystyle \sum}}_{i=1}^{n}\mathrm{sin}\,{\theta}_{i}\quad \mathrm{and}\quad C={{\displaystyle \sum}}_{i=1}^{n}\mathrm{cos}\,{\theta}_{i}.$$
- Modulus of the resultant vector ($R$), obtained by the following expression:$$R=\sqrt{{C}^{2}+{S}^{2}}.$$
- Average modulus $\left(\overline{R}\right)$, obtained by the following expression, where $n$ is the number of observations:$$\overline{R}=\frac{R}{n}.$$
- Circular variance of the sample $\left(V\right)$, which is calculated by$$V=1-\overline{R}.$$
- Sample circular standard deviation $\left(\upsilon\right)$, being$$\upsilon=\sqrt{-2\,\mathrm{ln}\left(1-V\right)}.$$
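The circular statistics listed above can be computed directly; the following sketch is an illustrative reconstruction, not the authors' code, and uses `arctan2` so that the average azimuth lands in the correct quadrant:

```python
import numpy as np

def circular_stats(theta_deg):
    """Circular statistics of a sample of azimuths given in degrees."""
    t = np.deg2rad(np.asarray(theta_deg, dtype=float))
    S, C = np.sin(t).sum(), np.cos(t).sum()
    n = t.size
    mean_azimuth = np.degrees(np.arctan2(S, C)) % 360.0    # theta_bar
    R = np.hypot(C, S)                                     # resultant modulus
    R_bar = R / n                                          # average modulus
    V = 1.0 - R_bar                                        # circular variance
    std_dev = np.degrees(np.sqrt(-2.0 * np.log(1.0 - V))) # circular std. deviation
    return mean_azimuth, R_bar, V, std_dev

# Tightly clustered azimuths: mean near 90° and a small circular variance
m, r_bar, v, s = circular_stats([85, 88, 90, 92, 95])
print(f"mean = {m:.1f}°, R_bar = {r_bar:.3f}, V = {v:.4f}, std = {s:.1f}°")
```

A sample drawn from all directions equally would instead give an average modulus near zero and a circular variance near one, which is what the Rayleigh test of Table 6 probes.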

#### 3.3.3. Cross-Sections

#### 3.4. Point Color Evaluation

## 4. Conclusions

## Author Contributions

## Funding

## Acknowledgments

## Conflicts of Interest

## References

- Zlot, R.; Bosse, M.; Greenop, K.; Jarzab, Z.; Juckes, E.; Roberts, J. Efficiently Capturing Large, Complex Cultural Heritage Sites with a Handheld Mobile 3D Laser Mapping System. J. Cult. Herit. **2014**, 15, 670–678.
- Wang, Y.; Chen, Q.; Zhu, Q.; Liu, L.; Li, C.; Zheng, D. A Survey of Mobile Laser Scanning Applications and Key Techniques over Urban Areas. Remote Sens. **2019**, 11, 1540.
- Mill, T.; Alt, A.; Liias, R. Combined 3D Building Surveying Techniques: Laser Scanning (TLS) and Total Station Surveying for BIM Data Management Purposes. J. Civ. Eng. Manag. **2013**, 19 (Suppl. 1), S23–S32.
- Omar, T.; Nehdi, M.L. Data Acquisition Technologies for Construction Progress Tracking. Autom. Constr. **2016**, 70, 143–155.
- Moruno, L.; Rodríguez Salgado, D.; Sánchez-Ríos, A.; González, A.G. An Ergonomic Customized-Tool Handle Design for Precision Tools Using Additive Manufacturing: A Case Study. Appl. Sci. **2018**, 8.
- Navarro, S.; Lerma, J.L. Accuracy Analysis of a Mobile Mapping System for Close Range Photogrammetric Projects. Measurement **2016**, 93, 148–156.
- Ortiz-Coder, P.; Sánchez-Rios, A. A Self-Assembly Portable Mobile Mapping System for Archeological Reconstruction Based on VSLAM-Photogrammetric Algorithm. Sensors **2019**, 19, 3952.
- Vanneschi, C.; Eyre, M.; Francioni, M.; Coggan, J. The Use of Remote Sensing Techniques for Monitoring and Characterization of Slope Instability. Procedia Eng. **2017**, 191, 150–157.
- Puente, I.; González-Jorge, H.; Martínez-Sánchez, J.; Arias, P. Review of Mobile Mapping and Surveying Technologies. Measurement **2013**, 46, 2127–2145.
- Boehler, W.; Marbs, A. 3D Scanning Instruments. In Proceedings of the CIPA WG, Corfu, Greece, 1–2 September 2002.
- Campi, M.; di Luggo, A.; Monaco, S.; Siconolfi, M.; Palomba, D. Indoor and Outdoor Mobile Mapping Systems for Architectural Surveys. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. **2018**, XLII-2, 201–208.
- Luhmann, T. Recent Developments in Close-Range Photogrammetry: A Measurement Technology in Transition; GIM International: Lemmer, The Netherlands, 14 February 2019.
- Cerrillo-Cuenca, E.; Ortiz-Coder, P.; Martínez-del-Pozo, J.-Á. Computer Vision Methods and Rock Art: Towards a Digital Detection of Pigments. Archaeol. Anthropol. Sci. **2014**, 6, 227–239.
- Martínez, S.; Ortiz, J.; Gil, M.L.; Rego, M.T. Recording Complex Structures Using Close Range Photogrammetry: The Cathedral of Santiago de Compostela. Photogramm. Rec. **2013**, 28, 375–395.
- Beretta, F.; Shibata, H.; Cordova, R.; Peroni, R.D.L.; Azambuja, J.; Costa, J.F.C.L. Topographic Modelling Using UAVs Compared with Traditional Survey Methods in Mining. REM Int. Eng. J. **2018**, 71, 463–470.
- Kršák, B.; Blišťan, P.; Pauliková, A.; Puškárová, P.; Kovanič, Ľ.; Palková, J.; Zelizňaková, V. Use of Low-Cost UAV Photogrammetry to Analyze the Accuracy of a Digital Elevation Model in a Case Study. Measurement **2016**, 91, 276–287.
- González-Aguilera, D.; Rodríguez-Gonzálvez, P.; Gómez-Lahoz, J. An Automatic Procedure for Co-Registration of Terrestrial Laser Scanners and Digital Cameras. ISPRS J. Photogramm. Remote Sens. **2009**, 64, 308–316.
- Liu, W.I. Novel Method for Sphere Target Detection and Center Estimation from Mobile Terrestrial Laser Scanner Data. Measurement **2019**, 137, 617–623.
- Faro Focus. Available online: https://www.faro.com/products/construction-bim/faro-focus/features/ (accessed on 27 November 2019).
- Leica RTC360. Available online: https://leica-geosystems.com/en-gb/products/laser-scanners/scanners/leica-rtc360 (accessed on 27 November 2019).
- Riegl VZ-400i. Available online: http://www.riegl.com/nc/products/terrestrial-scanning/produktdetail/product/scanner/48/ (accessed on 27 November 2019).
- Z+F Imager. Available online: https://www.zf-laser.com/z-f-imager-r-5016.184.0.html (accessed on 27 November 2019).
- Vexcel Imaging Ultracam Mustang. Available online: https://www.vexcel-imaging.com/ultracam-mustang/ (accessed on 27 November 2019).
- Leica Pegasus: Two. Available online: https://leica-geosystems.com/products/mobile-sensor-platforms/capture-platforms/leica-pegasus_two (accessed on 29 November 2019).
- Trimble MX9. Available online: https://geospatial.trimble.com/products-and-solutions/trimble-mx9#product-downloads (accessed on 29 November 2019).
- Leica Pegasus BackPack. Available online: https://leica-geosystems.com/products/mobile-sensor-platforms/capture-platforms/leica-pegasus-backpack (accessed on 29 November 2019).
- Viametris BMS3D LD5+. Available online: https://www.viametris.com/backpackmobilescannerbms3d (accessed on 29 November 2019).
- Trimble TIMMS Applanix Indoor MMS. Available online: https://www.applanix.com/products/timms-indoor-mapping.htm (accessed on 29 November 2019).
- Cura, R.; Perret, J.; Paparoditis, N. A Scalable and Multi-Purpose Point Cloud Server (PCS) for Easier and Faster Point Cloud Data Management and Processing. ISPRS J. Photogramm. Remote Sens. **2017**, 127, 39–56.
- Gruen, A.; Akca, D. Evaluation of the Metric Performance of Mobile Phone Cameras. In Proceedings of the International Calibration and Orientation Workshop EuroCOW 2008, Castelldefels, Spain, 30 January–1 February 2008.
- Roberts, J.; Koeser, A.; Abd-Elrahman, A.; Wilkinson, B.; Hansen, G.; Landry, S.; Perez, A. Mobile Terrestrial Photogrammetry for Street Tree Mapping and Measurements. Forests **2019**, 10, 701.
- Sirmacek, B.; Lindenbergh, R. Accuracy Assessment of Building Point Clouds Automatically Generated from iPhone Images. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. **2014**, XL-5, 547–552.
- Photogrammetry App; Linearis GmbH & Co. KG. Available online: http://www.linearis3d.com/ (accessed on 9 May 2020).
- ContextCapture Mobile; Bentley. Available online: www.bentley.com (accessed on 12 April 2020).
- Remondino, F.; Nocerino, E.; Toschi, I.; Menna, F. A Critical Review of Automated Photogrammetric Processing of Large Datasets. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. **2017**, XLII-2/W5, 591–599.
- Trimble MX7. Available online: https://geospatial.trimble.com/products-and-solutions/trimble-mx7#product-support (accessed on 27 November 2019).
- Imajing Imajbox. Available online: https://imajing.eu/mobile-mapping-technologies/sensors/ (accessed on 27 November 2019).
- Chiabrando, F.; Della Coletta, C.; Sammartano, G.; Spanò, A.; Spreafico, A. “Torino 1911” Project: A Contribution of a SLAM-Based Survey to Extensive 3D Heritage Modeling. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. **2018**, XLII-2, 225–234.
- Chiabrando, F.; Sammartano, G.; Spanò, A.; Spreafico, A. Hybrid 3D Models: When Geomatics Innovations Meet Extensive Built Heritage Complexes. IJGI **2019**, 8, 124.
- Torresani, A.; Remondino, F. Videogrammetry vs. Photogrammetry for Heritage 3D Reconstruction. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. **2019**, XLII-2/W15, 1157–1162.
- Gálvez-López, D.; Salas, M.; Tardós, J.D. Real-Time Monocular Object SLAM. Robot. Auton. Syst. **2016**, 75, 435–449.
- Scaramuzza, D.; Martinelli, A.; Siegwart, R. A Toolbox for Easily Calibrating Omnidirectional Cameras. In 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems; IEEE: Beijing, China, 2006; pp. 5695–5701.
- Zhang, Z. A Flexible New Technique for Camera Calibration. IEEE Trans. Pattern Anal. Mach. Intell. **2000**, 22, 1330–1334.
- Luhmann, T.; Robson, S.; Kyle, S.; Harley, I. Close Range Photogrammetry: Principles, Techniques and Applications; Whittles Publishing: Caithness, UK, 2006.
- Rublee, E.; Rabaud, V.; Konolige, K.; Bradski, G. ORB: An Efficient Alternative to SIFT or SURF. In 2011 International Conference on Computer Vision; IEEE: Barcelona, Spain, 2011; pp. 2564–2571.
- Mur-Artal, R.; Montiel, J.M.M.; Tardós, J.D. ORB-SLAM: A Versatile and Accurate Monocular SLAM System. IEEE Trans. Robot. **2017**, 31, 1255–1262.
- Lowe, D.G. Distinctive Image Features from Scale-Invariant Keypoints. Int. J. Comput. Vis. **2004**, 60, 91–110.
- Dung, L.-R.; Huang, C.-M.; Wu, Y.-Y. Implementation of RANSAC Algorithm for Feature-Based Image Registration. JCC **2013**, 1, 46–50.
- Pierrot-Deseilligny, M.; Clery, I. Apero, an Open Source Bundle Adjustment Software for Automatic Calibration and Orientation of Set of Images. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. **2011**, 38, 269–276.
- Triggs, B.; McLauchlan, P.; Hartley, R.; Fitzgibbon, A. Bundle Adjustment: A Modern Synthesis. In Proceedings of the International Workshop on Vision Algorithms: Theory and Practice (ICCV ’99), Corfu, Greece, 21–22 September 1999; Springer: Berlin/Heidelberg, Germany, 2000; pp. 298–372.
- Rusu, R.B.; Cousins, S. 3D Is Here: Point Cloud Library (PCL). In 2011 IEEE International Conference on Robotics and Automation; IEEE: Shanghai, China, 2011; pp. 1–4.
- PCL Point Cloud Library. Available online: http://pointclouds.org/ (accessed on 12 April 2020).
- Bradski, G.; Kaehler, A. Learning OpenCV: Computer Vision in C++ with the OpenCV Library, 2nd ed.; O’Reilly Media, Inc.: Newton, MA, USA, 2013.
- Hoppe, H. Poisson Surface Reconstruction and Its Applications. In Proceedings of the 2008 ACM Symposium on Solid and Physical Modeling (SPM ’08); ACM Press: Stony Brook, NY, USA, 2008; p. 10.
- Cignoni, P.; Callieri, M.; Corsini, M.; Dellepiane, M.; Ganovelli, F.; Ranzuglia, G. MeshLab: An Open-Source Mesh Processing Tool. Eurographics Ital. Chapter Conf. **2008**, 8.
- Ranzuglia, G.; Callieri, M.; Dellepiane, M.; Cignoni, P.; Scopigno, R. MeshLab as a Complete Tool for the Integration of Photos and Color with High Resolution 3D Geometry Data. In Archaeology in the Digital Era, Volume II: e-Papers from the 40th Conference on Computer Applications and Quantitative Methods in Archaeology, Southampton, UK, 26–30 March 2012; Amsterdam University Press: Amsterdam, The Netherlands, 2013; pp. 406–416.
- Hong, S.; Jung, J.; Kim, S.; Cho, H.; Lee, J.; Heo, J. Semi-Automated Approach to Indoor Mapping for 3D as-Built Building Information Modeling. Comput. Environ. Urban Syst. **2015**, 51, 34–46.
- Fisher, N.I.; Lewis, T.; Embleton, B.J.J. Statistical Analysis of Spherical Data, 1st ed.; Cambridge University Press: Cambridge, UK, 1987.
- Polo, M.-E.; Felicísimo, Á.M. Full Positional Accuracy Analysis of Spatial Data by Means of Circular Statistics: Analyzing the Positional Error in Spatial Data. Trans. GIS **2010**, 14, 421–434.
- Fisher, N.I. Statistical Analysis of Circular Data, 1st ed.; Cambridge University Press: Cambridge, UK, 1993.
- Oden, N. Circular Statistics in Biology. Edward Batschelet. Q. Rev. Biol. **1983**, 58, 312.
- James, M.R.; Robson, S.; Smith, M.W. 3-D Uncertainty-Based Topographic Change Detection with Structure-from-Motion Photogrammetry: Precision Maps for Ground Control and Directly Georeferenced Surveys. Earth Surf. Process. Landf. **2017**, 42, 1769–1788.

**Figure 2.** Initial prototype configuration P-1 (top) and the actual prototype P-2 (bottom), 14 cm long and lighter, with an extendable pole anchor point (red arrow) and a ball-joint system (yellow arrow) to facilitate data capture.

**Figure 4.** Effect of statistical outlier removal (SOR) and radius outlier removal (ROR) filters on a point cloud of the surface of a Roman column: (**a**) point cloud with outliers in the areas marked with yellow dashed lines; (**b**) point cloud after applying the SOR filter, with section AA marked with a red line; (**c**) point cloud of section AA after applying the SOR filter; and (**d**) point cloud of section AA after applying the SOR and ROR filters, with a better-defined contour of the column section.
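The SOR and ROR filters shown in Figure 4 can be sketched as follows. This is a minimal illustrative implementation with synthetic data and hypothetical parameter values, not the implementation used in the paper:

```python
import numpy as np
from scipy.spatial import cKDTree

def sor_filter(points, k=8, std_ratio=1.0):
    """Statistical outlier removal (SOR): discard points whose mean distance
    to their k nearest neighbours exceeds the global mean + std_ratio * std."""
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=k + 1)   # column 0 is the point itself
    mean_d = dists[:, 1:].mean(axis=1)
    threshold = mean_d.mean() + std_ratio * mean_d.std()
    return points[mean_d <= threshold]

def ror_filter(points, radius=0.05, min_neighbors=4):
    """Radius outlier removal (ROR): keep points with at least min_neighbors
    other points within the given radius."""
    tree = cKDTree(points)
    counts = np.array([len(tree.query_ball_point(p, radius)) - 1 for p in points])
    return points[counts >= min_neighbors]

# Synthetic cloud: a dense surface patch plus a few isolated outliers
rng = np.random.default_rng(1)
cloud = np.vstack([rng.normal(0.0, 0.01, (500, 3)),   # dense patch near origin
                   rng.uniform(1.0, 2.0, (5, 3))])    # 5 far-away outliers
print(len(sor_filter(cloud)), len(ror_filter(cloud)))
```

Both filters remove the five isolated points while keeping the dense patch; as the figure suggests, applying ROR after SOR can further sharpen contours where sparse residue survives the statistical test.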

**Figure 5.** Work areas in the “Casa del Mitreo”: (**a**) working area 1: pond and peristilium (left); (**b**) working area 2: underground rooms (center); and (**c**) working area 3: rooms with mosaic floors (right).

**Figure 6.** Total station (**a**) and target model (**b**) used to coordinate control points; Faro Focus3D X 330 laser scanner recording data in working areas 1 (**c**) and 3 (**d**); and data capture with the proposed system in working area 2 (**e**,**f**).

**Figure 7.** Trajectory followed (red lines) during data acquisition in the three work zones: working area 1 (**a**); working area 2 (**b**); and working area 3 (**c**).

**Figure 8.** Comparison of the point clouds resulting from the LS Faro Focus3D X 330 (**a**,**c**,**e**) and the proposed system (**b**,**d**,**f**) in working area 1 (**a**,**b**), working area 2 (**c**,**d**), and working area 3 (**e**,**f**). Visualization was carried out in MeshLab [52] with a homogeneous normal calculation for both systems, with the aim of eliminating textures and favoring a balanced geometric comparison.

**Figure 10.** Overlapping cross-sections generated from both point clouds, obtained using a horizontal plane at a height of 0.50 m above the ground. The section of the cloud captured by the laser scanner appears in red, and that of the proposed system in green: work zone 1 (**a**) at scale 1/235; and work zone 3 (**b**) at scale 1/180.

**Figure 11.** Detail of the point clouds of work zones 2 (**a**) and 3 (**b**) obtained with both systems. The two images in the upper part were obtained with the proposed system; those obtained with the Focus3D X 330 are shown in the lower part.

**Table 1.** Main technical characteristics of the cameras and associated lenses used in the two prototype versions P-1 and P-2 (from The Imaging Source Europe GmbH, FUJIFILM Corporation, Ailipu Technology Co., Ltd., and JAI Ltd.). * Focal lengths were computed using the Scaramuzza model [42].

| Prototype Version | Camera/Model | Resolution (pixels) | Focal Length (mm) | Angle of View (H° × V°) | Sensor Size (inch or mm) | Frame Rate (fps) |
|---|---|---|---|---|---|---|
| P-1 | A / DFK 42AUC03 | 1280 × 960 | 2.1 | 97° × 81.2° | 1/3″ | 25 |
| P-1 | B / DFK 33UX264 | 2448 × 2048 | 6 | 74.7° × 58.1° | 2/3″ | 38 |
| P-2 | A / ELP-USB500W05G | 640 × 480 | * | 170° × 126° | 3.9 × 2.9 mm | 30 |
| P-2 | B / GO-5000C-USB | 2560 × 2048 | 6.5 | 89° × 76.2° | 12.8 × 10.2 mm | 6 |

**Table 2.** Summary of total and partial times (working areas 1, 2, and 3) used in the acquisition and processing of data with the Faro Focus3D X 330 system and the proposed system.

*Data acquisition time (min)*

| System | Working Area 1 | Working Area 2 | Working Area 3 | Total |
|---|---|---|---|---|
| Faro Focus3D X 330 | 200 (10 scans) | 100 (5 scans) | 220 (11 scans) | 520 |
| Proposed system | 11 | 8 | 12 | 31 |

*Processing time (min)*

| System | WA 1 User | WA 1 CPU | WA 2 User | WA 2 CPU | WA 3 User | WA 3 CPU | Total User | Total CPU |
|---|---|---|---|---|---|---|---|---|
| Faro Focus3D X 330 | 35 | 61 | 22 | 40 | 40 | 75 | 97 | 176 |
| Proposed system | 0 | 605 | 0 | 421 | 0 | 690 | 0 | 1716 |

**Table 3.** Resolution of both systems in the work areas, and resolution of an individual scan on a flat 1 m² surface at a 1 m distance. * According to manufacturer data.

*Resolution in the study areas (mm)*

| System | Working Area 1 | Working Area 2 | Working Area 3 | Mean Resolution |
|---|---|---|---|---|
| Faro Focus3D X 330 | 0.4 | 0.4 | 0.5 | 0.4 |
| Proposed system | 1.6 | 1.2 | 1.4 | 1.4 |

*Individual scan*

| System | Number of Points | Resolution (mm) |
|---|---|---|
| Faro Focus3D X 330 | 11,108,889 | 0.3 * |
| Proposed system | 576,255 | 1.3 |

**Table 4.** Accuracy assessment results for the Faro Focus3D X 330 and the proposed system in working areas 1, 2, and 3 (unit: millimeters).

*Working Area 1*

| | Error Vector X | Error Vector Y | Error Vector Z | Error |
|---|---|---|---|---|
| Faro Focus3D X 330, ${\delta}_{avg}$ | | | | 10 |
| Faro Focus3D X 330, RMSE | 9 | 6 | 4 | 7 |
| Proposed system, ${\delta}_{avg}$ | | | | 7 |
| Proposed system, RMSE | 6 | 4 | 4 | 5 |

*Working Area 2*

| | Error Vector X | Error Vector Y | Error Vector Z | Error |
|---|---|---|---|---|
| Faro Focus3D X 330, ${\delta}_{avg}$ | | | | 8 |
| Faro Focus3D X 330, RMSE | 5 | 9 | 4 | 6 |
| Proposed system, ${\delta}_{avg}$ | | | | 8 |
| Proposed system, RMSE | 4 | 6 | 6 | 6 |

*Working Area 3*

| | Error Vector X | Error Vector Y | Error Vector Z | Error |
|---|---|---|---|---|
| Faro Focus3D X 330, ${\delta}_{avg}$ | | | | 11 |
| Faro Focus3D X 330, RMSE | 5 | 8 | 7 | 6 |
| Proposed system, ${\delta}_{avg}$ | | | | 8 |
| Proposed system, RMSE | 6 | 4 | 5 | 5 |

| Statistic | Sample M1 | Sample M2 |
|---|---|---|
| Average | 0.012 | 0.010 |
| Standard error | 0.001 | 0.001 |
| Median | 0.012 | 0.008 |
| Mode | 0.011 | 0.008 |
| Standard deviation | 0.006 | 0.007 |
| Variance | 0.000 | 0.000 |
| Minimum | 0.001 | 0.003 |
| Maximum | 0.025 | 0.032 |

**Table 6.** Results of the basic statistics and the tests of fit to the uniform distribution (Rayleigh test) for samples M1 and M2.

| Statistical Sample | Number of Observations | $\overline{\mathit{\theta}}$ | $\overline{\mathit{R}}$ | $\mathsf{\upsilon}$ | $\mathsf{\kappa}$ | Data Grouped? | Rayleigh Test |
|---|---|---|---|---|---|---|---|
| M1 | 42 | 297.3° | 0.13 | 115.3° | 0.27 | no | 0.48 |
| M2 | 28 | 133.5° | 0.56 | 61.9° | 1.35 | no | $8.7\times{10}^{-5}$ |

*Image data acquisition*

| | Laser Scanner Faro Focus X 330 | Proposed System |
|---|---|---|
| Format | .jpg | .jpg |
| Resolution (pixels) | 20,198 × 8534 | 2560 × 2048 |
| Image file size (MB) | 7.68 | 20 |
| Acquisition time (s) | 180 | 0.25 |
| GSD at 1.5 m distance | 0.46 mm | 0.57 mm |

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Ortiz-Coder, P.; Sánchez-Ríos, A.
An Integrated Solution for 3D Heritage Modeling Based on Videogrammetry and V-SLAM Technology. *Remote Sens.* **2020**, *12*, 1529.
https://doi.org/10.3390/rs12091529
