Article

Performance Evaluation of Two Indoor Mapping Systems: Low-Cost UWB-Aided Photogrammetry and Backpack Laser Scanning

1 Interdepartmental Research Center of Geomatics (CIRGEO), University of Padova, via dell’Università 16, 35020 Legnaro (PD), Italy
2 Dipartimento Politecnico di Ingegneria e Architettura, University of Udine, via delle Scienze 206, 33100 Udine, Italy
* Author to whom correspondence should be addressed.
Appl. Sci. 2018, 8(3), 416; https://doi.org/10.3390/app8030416
Submission received: 23 January 2018 / Revised: 28 February 2018 / Accepted: 6 March 2018 / Published: 11 March 2018
(This article belongs to the Special Issue Laser Scanning)

Abstract: During the past dozen years, several mobile mapping systems based on imaging and positioning sensors mounted on terrestrial (and aerial) vehicles have been developed. Recently, systems characterized by increased portability have been proposed in order to enable mobile mapping in environments that are difficult to access for vehicles, in particular indoor environments. In this work, the performance of a low-cost mobile mapping system is compared with that of: (i) a state-of-the-art terrestrial laser scanner (TLS), considered as the control; (ii) a mobile mapping backpack system (Leica Pegasus), which can be considered the state-of-the-art of commercial mobile mapping backpack systems. The aim of this paper is two-fold: first, assessing the reconstruction accuracy of the proposed low-cost mobile mapping system, based on photogrammetry and ultra-wide band (UWB) for relative positioning (and a GNSS receiver if georeferencing is needed), with respect to a TLS survey in an indoor environment, where the global navigation satellite system (GNSS) signal is not available; second, comparing such performance with that obtained with the Leica backpack. Both mobile mapping systems are designed to work without any control point, to enable an easy and quick survey (e.g., a few minutes) and to be easily portable (relatively low weight and small size). The case study deals with the 3D reconstruction of a medieval bastion in Padua, Italy. Reconstruction using the Leica Pegasus backpack yielded a smaller absolute error than the UWB-based photogrammetric system: in georeferenced coordinates, the root mean square (RMS) error was 16.1 cm and 50.3 cm, respectively; the relative error in local coordinates was more similar, 8.2 cm and 6.1 cm, respectively. Given its much lower cost (approximately $6k), the proposed photogrammetry-based system can be an interesting alternative when decimetric reconstruction accuracy in georeferenced coordinates is sufficient.

1. Introduction

Mobile mapping systems, based on terrestrial/aerial vehicles or on human-carried devices, have been used over the past dozen years to map and monitor areas of interest for a wide range of applications [1,2,3,4,5,6,7,8,9,10]. These systems are equipped with navigation and imaging sensors. The former are usually based on the integration of positioning with the global navigation satellite system (GNSS) and inertial navigation with an inertial measurement unit (IMU); the latter consist of laser scanners and cameras. Despite their high reliability, the use of mobile mapping systems in indoor environments is usually limited, due to vehicle size and the difficulty of maneuvering in such environments.
During the past decade, several portable mobile mapping systems have been developed in order to enable accurate surveys also in environments that are difficult to reach with (large terrestrial/aerial) vehicles [11,12,13]. A recent example of a commercial solution allowing accurate surveying also in difficult/indoor environments is the Leica Pegasus backpack [14]. It is a wearable pedestrian mapping system enabling almost ubiquitous 3D data capture, i.e., outdoors, indoors and underground. Given its limited size (73 × 27 × 31 cm), weight (11.9 kg) and nominal reconstruction accuracy (relative error 3 cm, in local coordinates; absolute error 5 cm, in georeferenced coordinates) [15], it can be considered a very interesting solution for quick and accurate surveys of small-/medium-sized sites.
The primary goal of this paper is to compare accuracies between the state-of-the-art portable laser scanner system described above and an alternative mobile mapping system. The proposed system is based on the use of photogrammetry [16,17,18,19] and ultra-wide band (UWB) sensors [20,21], and its principal characteristics are low cost and very high portability. A GNSS receiver can be optionally employed if data georeferencing is required [22,23]. The rationale is that of developing a low-cost system for indoor surveys, which requires only minimal human expertise and effort in its use. The imaging sensor is a standard consumer camera, i.e., a Canon G7X, whereas positioning in local coordinates is obtained by means of a set of Pozyx UWB devices [24] (Figure 1a). Metric reconstruction is achieved by combining a self-calibrated photogrammetric reconstruction [25,26,27,28,29] computed with Agisoft PhotoScan [30] with UWB measurements. Georeferencing is obtained by using a Topcon HiPer V GNSS receiver [31], positioned before the bastion entry. The considered system is lightweight, easy to use and cheaper than the Pegasus backpack. Indeed, most of the weight is given by the GNSS receiver, approximately 1 kg, so that all the required instrumentation can be carried in a quite small backpack. Data collection simply requires properly spreading the UWB devices inside the environment, taking photos with the camera (with a UWB sensor attached to it) and acquiring GNSS measurements outdoors, if needed. The overall cost of the photogrammetric system is relatively low, about $6k, of which $400 is for the camera, $150 per Pozyx device and $4k for the (geodetic-grade) GNSS receiver.
The case study considered for validating the proposed low-cost system is the survey of the Impossible bastion (a defensive post that was historically considered unconquerable, hence the name “impossible”), a medieval structure (16th century) located within the ancient city walls of Padua, Italy (Figure 2a).
This historical building (nowadays almost completely underground) was surveyed by the Interdepartmental Research Center of Geomatics (CIRGEO) of the University of Padua in 2016 with a terrestrial laser scanner (TLS), within the framework of a fellowship (Culturalmente 2015). The 3D model produced from the TLS survey is shown in Figure 2b.
The 3D model produced with the proposed low-cost mobile mapping system is compared with the TLS survey, which is used as reference dataset.
The survey with the Pegasus backpack was conducted on 21 December 2016 (Figure 1b). It is worth noticing that the GNSS signal is available only before entering the bastion; hence, most of the survey was done just by relying on Pegasus’ inertial navigation system (INS) [22,32,33] and on simultaneous localization and mapping (SLAM) technology [34,35].
The Pegasus backpack and our low-cost mobile mapping system are compared with the TLS survey taking into consideration both absolute and relative positioning accuracy. Absolute positioning accuracy aims at comparing two georeferenced models, i.e., models whose coordinates are related to a global coordinate reference system. Relative positioning accuracy aims at evaluating differences in the reconstructed 3D models, in which the georeferencing component has been removed. As such, relative positioning involves point coordinates related to a local reference frame. In the latter case, error is due to differences in the shape of the reconstructed models, whereas in the first case, georeferencing error can have a larger impact than shape differences between the models. These two cases will be referred to as absolute and relative error.
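For illustration, both error types reduce to nearest-neighbor point-to-point distance statistics, computed before (absolute case) and after (relative case) removing the registration offset between the two clouds. The following minimal Python sketch (not part of the processing pipeline used in this work; data and function names are ours) shows the idea using a k-d tree:

```python
import numpy as np
from scipy.spatial import cKDTree

def point_to_point_stats(reference, test):
    """Nearest-neighbor distances from each test point to the reference cloud."""
    d, _ = cKDTree(reference).query(test)
    return d.mean(), np.sqrt((d ** 2).mean()), d.max()

# Synthetic example: the "test" cloud is the reference shifted by a
# georeferencing offset, mimicking the absolute error case.
rng = np.random.default_rng(0)
ref = rng.uniform(0, 10, size=(2000, 3))
offset = np.array([0.20, -0.15, 0.0])      # 20 cm east, -15 cm north
test = ref + offset
avg_abs, rms_abs, _ = point_to_point_stats(ref, test)

# Relative error case: remove the rigid offset (here just by matching the
# centroids) before measuring distances.
test_centered = test - test.mean(axis=0) + ref.mean(axis=0)
avg_rel, rms_rel, _ = point_to_point_stats(ref, test_centered)
```

As expected, the absolute statistics are dominated by the georeferencing offset, while the relative ones reflect only shape differences (here, none).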
This paper extends a previous work presented in [36] providing a much more detailed analysis of both the Pegasus backpack and photogrammetric reconstruction errors. Furthermore, this work considers the use of a larger UWB network, which provided a better metric reconstruction (e.g., a better estimate of the photogrammetric reconstruction scale factor).
The rest of the paper is organized as follows: Section 2 describes the TLS survey. Section 3 provides details on the Pegasus survey and investigates its performance on the considered case study with respect to the TLS 3D model. Section 4 describes the low-cost mobile mapping system proposed in this paper and validates its results. Finally, the discussion and conclusions are drawn in Section 5 and Section 6, respectively.

2. 3D Reference Model-TLS Survey

In order to provide a 3D reference dataset for the assessment of the other two systems considered in this work, the interior of the Impossible bastion was fully surveyed with a Leica ScanStation C10 TLS. Based on the time-of-flight (TOF) measuring principle, this scanning system combines all-in-one portability with the ability to perform surveys similar to a total station (traverse and resection). The laser features a full 360° × 270° field-of-view thanks to the adoption of the Smart X-Mirror design, low beam divergence (<6 mm @ 50 m), high accuracy (6 mm @ 50 m), long range (300 m @ 90% reflectivity), a dual-axis compensator and a high scan speed (50k pts/s). In addition, the total-station-like interface and on-board graphic color touch-screen display allow on-site data viewing. Moreover, this scanner also has an internal memory storage of 80 gigabytes, which is ideal for surveying large areas.
The laser survey of the fortification was carried out by using the traversing method. This surveying procedure allows the user to take advantage of the Leica C10 built-in dual-axis compensator and conventional surveying techniques to perform a traverse, establishing relationships between successive scanner positions within a known or assumed coordinate system. Because the dual-axis compensator automatically corrects scan data for vertical displacements, the traverse procedure does not require as many targets as regular target-based registration. In this case study, an open traverse consisting of nine laser stations was set up to completely cover the study area.
On each station, the laser scanner was set on a tripod and then carefully leveled and plumbed directly over the top of the traverse mark on the ground. Leveling of the instrument is required in order to enable automatic registration of acquired scans and to minimize measurement errors. After setting out the TLS, the bearing and coordinates of the back station were set in order to define the fore station. The positions of back and fore stations were measured by the laser scanner using proper Leica target poles (Figure 2b), whose sizes are automatically stored in the laser firmware.
In order to meet the requirement of a clear line of sight between each target station (back and fore) and the laser station, scanner positions were carefully designed by taking into account the inner geometry of the fortification. From each laser station, a set of scans was acquired with an average spatial resolution of about 1 cm at a distance of 10 m. Following a few initial tests, this value was deemed the most suited to generate a 3D model with a good level of detail for subsequent analyses.
Furthermore, in order to georeference the TLS-based 3D model, the traverse was properly connected to four control points (CPs) located outside of the bastion. Due to the morphological characteristics of the ancient structure and of the surrounding environment, three points were set at the tunnel entry, while the fourth one was placed in front of the left wing, being visible from the interior of the bastion through a small aperture. These CPs were measured both with a double frequency GNSS receiver (Topcon HiPer V) and with the Leica C10. In the latter case, specific Leica retroreflective 6” circular targets were employed. At the four CPs, GNSS observations were collected in network real-time kinematic (NRTK) mode, i.e., receiving real-time corrections from a nation-wide network of permanent GNSS stations (Topcon NetGEO network [37]).
The raw data collected with the ScanStation C10 were then imported into the Leica Cyclone software for processing. First, the scan station sequence was assembled in the Cyclone Traverse Editor in order to build the traverse. To this aim, instrument and target heights measured in the field were set for each scanner position. After verification and validation of the entered parameters, the traverse could be carried out. At the end of this processing step, the laser stations and the associated point cloud coordinates were defined in a common reference frame. Basically, by traversing, a preliminary alignment (registration) of the scans could be obtained. However, since the generation of a laser traverse is always affected by residual errors, the registration results need to be refined further. A global alignment procedure, based on the well-known ICP (iterative closest point, [38,39]) algorithm, was therefore applied to all the pre-registered scans. Being an iterative process, the ICP algorithm requires a good initial guess about the “closeness” between two point clouds in order to converge to the correct solution. In this case, such an initial estimate was provided by the output of the traverse, at the end of which scans acquired from different locations were already spatially “close” to each other. Thus, by exploiting this information, the scan pre-alignment could be automatically refined with the ICP. After this step, a global point cloud of 27 million points was obtained with an average residual alignment error reduced to a few millimeters: 0.008 m, 0.003 m and 0.004 m are the maximum, minimum and average errors, respectively.
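The ICP refinement step can be illustrated with a minimal point-to-point ICP in Python (a didactic sketch, not the Leica Cyclone implementation; like the real algorithm, it assumes the clouds are already roughly pre-aligned, as the traverse output provides):

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst ≈ R·src + t (Kabsch)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # avoid an improper rotation
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, dst_c - R @ src_c

def icp(src, dst, iterations=20):
    """Point-to-point ICP: alternate closest-point matching and rigid fitting."""
    tree = cKDTree(dst)
    current = src.copy()
    for _ in range(iterations):
        _, idx = tree.query(current)       # closest-point correspondences
        R, t = best_rigid_transform(current, dst[idx])
        current = current @ R.T + t
    return current

# Synthetic check: recover a small rotation plus translation.
rng = np.random.default_rng(1)
dst = rng.uniform(0, 5, size=(500, 3))
a = np.deg2rad(1.0)
Rz = np.array([[np.cos(a), -np.sin(a), 0.0],
               [np.sin(a),  np.cos(a), 0.0],
               [0.0, 0.0, 1.0]])
src = (dst - dst.mean(axis=0)) @ Rz.T + dst.mean(axis=0) + [0.05, -0.03, 0.02]
initial = np.sqrt(((src - dst) ** 2).sum(axis=1)).mean()
aligned = icp(src, dst)
residual = np.sqrt(((aligned - dst) ** 2).sum(axis=1)).mean()
```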
The global point cloud was then georeferenced in the Leica Cyclone through the GNSS coordinates of the four CPs observed during TLS traversing. The residual georeferencing error was about 1.5 cm, which is the same order of magnitude of the precision of the real-time corrected GNSS coordinates of the CPs.

3. Leica Pegasus Backpack Survey

The Pegasus backpack is a terrestrial mobile mapping system recently developed by Leica: it features two laser profilers, five cameras, a GNSS receiver and a 200-Hz INS. The rationale of this system is that of allowing accurate and fast surveys of areas accessible by a human carrying a backpack. This is of clear interest especially in the case of places where it is difficult to carry other instruments: the weight of the Pegasus backpack, namely 11.9 kg, is surely reasonable for a backpack carried by an average person.
The SLAM technique, using the two laser profilers, which acquire 600k points per second at a maximum range of 50 m, together with a high-precision IMU, nominally ensures a position accuracy in indoor environments from 5 cm to 50 cm after walking for 10 min. However, as stated in the backpack specifications [14,15], several factors can affect this value.
Each camera acquires 4 Mpixel images at 2 Hz. Battery life nominally ensures 4 h of operating time, which should be enough for most use cases.
The Leica Pegasus backpack survey was done on 21 December 2016 and lasted less than an hour: approximately half an hour was spent on system setup and calibration, whereas data acquisition required only a few minutes.

3.1. Precision Assessment

Assessment of the system precision is done by comparing the point clouds obtained when entering and exiting the bastion. First, Figure 3 shows average point-to-point distances between the two georeferenced point clouds (i.e., absolute positioning error case): averages have been computed on slices orthogonal to the u line shown in Figure 3a (red line). The width of each slice is 10 cm. Figure 3b shows the bastion silhouette evaluated by projecting bastion model points on the vertical plane passing through the u line: coordinates on the horizontal axis correspond to those along the u line in Figure 3a. Figure 3c shows the point-to-point average distances varying the slice coordinate along the u line.
It is worth noticing that two peaks are clearly visible in correspondence with the ceiling openings (at approximately u = 6 m and u = 16 m). These peaks are clearly due to points missing in the two point clouds: the right parts of these openings were well reconstructed in the point cloud shown in Figure 3b, whereas the left parts were better reconstructed in the other point cloud, depending on the walking direction (and hence the sensor orientation with respect to the openings).
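The per-slice averages of Figure 3c can be reproduced by binning points by their coordinate along the u line. The sketch below (our own illustration with toy data; names are hypothetical) uses 10-cm slices as in the analysis above:

```python
import numpy as np

def slice_averages(points, distances, u_origin, u_dir, slice_width=0.10):
    """Average cloud-to-cloud distance per slice orthogonal to the u line.

    points: (N, 3) cloud; distances: (N,) per-point point-to-point distances;
    u_origin, u_dir (unit vector) define the u line; slices are slice_width wide.
    """
    u = (points - u_origin) @ u_dir                  # coordinate along u
    bins = np.floor(u / slice_width).astype(int)
    return {b * slice_width: distances[bins == b].mean() for b in np.unique(bins)}

# Toy corridor, 1 m long: constant 4.7 cm error except a simulated peak
# (e.g., points missing near a ceiling opening).
pts = np.column_stack([np.linspace(0.0, 1.0, 100), np.zeros(100), np.zeros(100)])
d = np.full(100, 0.047)
d[55:60] = 0.50
profile = slice_averages(pts, d, np.zeros(3), np.array([1.0, 0.0, 0.0]))
```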
As reported in Table 1, the average point-to-point distance between the two point clouds of the corridor is 4.7 cm; the root mean square (RMS) point-to-point distance is 5.8 cm; and the maximum distance is 110.3 cm. This large maximum distance is explained by the points missing in the ceiling openings, as described above.
Then, Figure 3d shows the average point-to-point distances along line u after refining the registration between the two point clouds with the ICP algorithm (relative error case). The ICP algorithm was applied to the two point clouds after centering them with respect to the center of the first one. This registration led to a translation of 19.7 cm, −14.4 cm, −1.4 cm in the three axes (east, north, vertical respectively) and a 0.8 degree rotation. Average and RMS distances are 2.2 cm and 3.1 cm, respectively.
The above-mentioned translation and rotation between the two point clouds explain the numerical difference between the absolute and relative error cases. Such a difference is clearly mostly due to the absence of the GNSS signal and hence to errors of the INS and SLAM algorithm. The obtained absolute positioning error is within the nominal value of the Pegasus backpack error in the absence of the GNSS signal (50 cm).

3.2. Accuracy Assessment: Relative Error with Respect to the TLS Model

After registration of the backpack and TLS 3D models, the average and RMS point-to-point distances are 4.3 cm and 8.2 cm (Table 2). The histogram of point-to-point distances between the two point clouds is shown in Figure 4a.
The largest errors are actually in the final part of the left wing of the bastion: this area is critical because of its quite complex structure, shown in Figure 4c. Given the complexity of this area, it would be interesting to compare the two models also discarding the final part of the left wing of the bastion. In this case, average and RMS point-to-point distances are 3.6 cm and 4.6 cm (Table 2), and the distance histogram can be seen in Figure 4b. After discarding the final part of the left wing of the bastion, the obtained RMS distance is close to the nominal relative accuracy (3 cm in indoor environments [15]). It is also worth noticing that discarding the final part of the bastion left wing significantly decreases the maximum error from 193.4 cm to 79.3 cm, as can be seen in Table 2 and comparing Figure 4a,b.

3.3. Accuracy Assessment: Absolute Error with Respect to the TLS Model

Figure 5a,b shows the map of the point-to-point distances (i.e., absolute error) and the corresponding histogram, respectively, for the two georeferenced point clouds. The obtained average, RMS and maximum distances, both considering and discarding the final part of the left wing, are reported in the last two rows of Table 3. The error in this case is clearly influenced by the absence of the GNSS signal inside of the bastion.
After centering the two point clouds with respect to the center of the TLS 3D model, refined registration between them was computed with the ICP algorithm: this registration led to a translation of −20.0 cm, −16.7 cm, −10.5 cm in the three axes (east, north, vertical respectively) and a 0.7 degree rotation.
Similarly to the relative error case, discarding the final part of the bastion left wing significantly reduces the maximum error in this case as well (Table 3).

4. Photogrammetric Reconstruction with the UWB Positioning System

This section considers an alternative 3D model of the bastion relying on the use of a very low-cost system with respect to those of Section 2 and Section 3. The proposed system is based on photogrammetric reconstruction, obtained with a standard consumer camera and Agisoft PhotoScan [30]. Then, metric reconstruction is achieved by integrating the photo-based 3D reconstruction with measurements of the Pozyx UWB positioning system [24]. Finally, a Topcon HiPer V GNSS receiver is used for georeferencing the 3D model. The rationale of this system is to produce a georeferenced 3D model avoiding the need for CPs inside of the bastion, where GNSS positioning is not reliable, and using low-cost and easily portable instruments.
Data acquisition in the bastion was carried out on 25 July 2017. Images have been taken by using a Canon G7X camera (20.2 MPix), with settings fixed at constant values (1/60-s shutter speed, f/1.8 aperture, 8.8-mm focal length, i.e., 35 mm equivalent: 24 mm). Five hundred and seven images have been collected in approximately one hour varying camera position and orientation. Portable spotlights were used in order to properly illuminate the bastion during image acquisitions.
Agisoft PhotoScan performed the reconstruction with camera self-calibration: images were aligned with the highest accuracy setting, leading to an RMS reprojection error of 1.24 pixels, whereas high quality and mild depth filtering were set in the dense point cloud generation, resulting in a model with 190 million points. The average number of images used to reconstruct a single point was 5.3. Clear outliers were manually deleted from the generated point cloud by means of CloudCompare, and the point cloud was downsampled to a resolution similar to the Pegasus point cloud, i.e., 27 million points. Let $\Sigma_{photo}$ be the coordinate system associated with the 3D model provided by PhotoScan. A proper transformation (i.e., scaling factor, translation and rotation) was estimated as described in the following in order to map $\Sigma_{photo}$ into the global reference system $\Sigma_{geo}$ (i.e., WGS84 UTM32).
Since the GNSS signal was not available/reliable inside of the bastion, local positioning was obtained by using an UWB positioning system. In such a system, each device is a radio transmitter and receiver that provides ranging measurements once connected to another UWB device. Ranging is usually obtained by the time of flight of the radio signal, even if this clearly causes a decrease of ranging accuracy when devices are not in a clear line of sight (i.e., when obstacles are along the line connecting the two devices). A set of UWB devices, named anchors, are fixed to constant positions. Then, the UWB rover position is obtained by trilateration of range measurements provided by UWB anchors. A more detailed description of the Pozyx system used can be found in [8,24].
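As an illustration of the trilateration step, the following sketch (ours, not the Pozyx firmware; the anchor layout and noise level are hypothetical) estimates a 2D rover position from noisy anchor ranges by nonlinear least squares:

```python
import numpy as np
from scipy.optimize import least_squares

def trilaterate(anchors, ranges, x0=None):
    """Least-squares 2D position from ranges to known anchors."""
    if x0 is None:
        x0 = anchors.mean(axis=0)          # crude initial guess
    residuals = lambda x: np.linalg.norm(anchors - x, axis=1) - ranges
    return least_squares(residuals, x0).x

# Synthetic check: four anchors, rover at (2, 3), 5 cm ranging noise.
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 8.0], [0.0, 8.0]])
true_pos = np.array([2.0, 3.0])
rng = np.random.default_rng(2)
ranges = np.linalg.norm(anchors - true_pos, axis=1) + rng.normal(0.0, 0.05, 4)
est = trilaterate(anchors, ranges)
```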
During the photogrammetric survey, a Pozyx rover was rigidly attached to the camera, whereas nine Pozyx anchors were distributed inside of the bastion (as shown in Figure 6) to enable assessment of rover positions during the indoor survey. Another Pozyx anchor was positioned outside of the bastion, in order to track the rover also in front of the tunnel entry, i.e., where GNSS is available.
Interestingly, Pozyx devices are provided with an auto-detecting procedure that allows them to automatically detect and communicate with each other. Once turned on, rovers and anchors start to communicate and to provide range measurements. In this case study, a rover was used to collect (anchor-rover) range measurements during the survey, whereas positioning was performed in post-processing. Nevertheless, it is worth noticing that the software provided with the Pozyx system also supports real-time positioning, if needed. However, this functionality was not used in our experiments in order to save computational power, hence allowing a higher data collection rate.
Given the ease of use of the Pozyx system, distributing anchors in the bastion and setting the system required just a few minutes. Ideal positioning of UWB anchors is usually in high positions (e.g., high on a wall) to increase the probability of the direct line-of-sight in measurements [24], but in this case study, they were placed on the ground due to the impossibility of attaching anything to the walls.
Calibration of the UWB systematic error has been previously considered in several works [40,41,42,43,44,45]; however, according to our experiments, systematic errors of low-cost Pozyx devices are significantly anisotropic and difficult to properly model [8], i.e., the dependence on the relative orientation between rover and anchors is typically not negligible. Motivated by this consideration, the Pozyx system is used without any ad hoc calibration procedure.
In order to make the system as simple to use as possible, anchor positions were obtained by means of a self-positioning procedure: a local coordinate system was established by considering range measurements provided by three UWB devices as described in [24], then trilateration (and nonlinear optimization) was used in order to estimate the positions of the remaining anchors (blue dot marks in Figure 6). Anchor self-positioning was performed assuming anchors placed on a planar surface (2D positioning). Their positions were measured also by means of a Leica TCR 702 Total Station (red dot marks in Figure 6). While using a total station requires some expertise, distributing UWB devices on the ground is a very easy task. Hence, the rationale of UWB self-positioning is that of providing a simple survey procedure for unspecialized personnel.
Blue and red points in Figure 6 clearly show the error of the self-positioning procedure due to UWB measurement errors. In order to ease the comparison, Figure 6 shows anchor positions overlapping on the bastion map (gray points): bastion coordinates in the $\Sigma_{uwb}$ system are obtained by means of the map from $\Sigma_{photo}$ to $\Sigma_{uwb}$, which was estimated as described in the following.
Rover altitude with respect to the ground varied across a relatively small interval. Hence, in order to ease the rover positioning problem, the rover altitude has been assumed to be fixed at a constant value, i.e., 1.5 m. Range measurements are combined by using an extended Kalman filter (EKF) in order to obtain estimates of the rover position in the $\Sigma_{uwb}$ coordinate system [46]. The rover dynamic model is as follows:
$$\begin{bmatrix} x_{t+1} \\ \dot{x}_{t+1} \end{bmatrix} = \begin{bmatrix} 1 & T \\ 0 & 1 \end{bmatrix} \begin{bmatrix} x_t \\ \dot{x}_t \end{bmatrix} + w_t \qquad (1)$$
where $x_t$ is the rover 2D position at time t on the planar surface at constant altitude, $T$ is the length of the time interval between the two estimates and $w_t$ is the model noise.
The measurement equation for anchor i is:
$$y_{t,i} = \sqrt{(x_{t,u} - q_{i,u})^2 + (x_{t,v} - q_{i,v})^2 + h^2} + z_t \qquad (2)$$
where $y_{t,i}$ is the measurement of anchor i at time t, $q_i$ is the position of anchor i ($u$ and $v$ are two orthogonal directions on the horizontal plane), $h$ is the constant rover altitude and $z_t$ is the measurement noise. Future investigations also foresee the integration with pedestrian positioning estimation methods [47,48,49].
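A minimal EKF implementing the constant-velocity model and range measurement equation above might look as follows (a didactic sketch with hypothetical noise parameters, not the authors' implementation; the rover altitude h is fixed, as described above):

```python
import numpy as np

def ekf_step(x, P, T, ranges, anchors, h, q=0.1, r=0.3):
    """One EKF predict/update for the constant-velocity rover model.

    State x = [u, v, u_dot, v_dot]; anchors: (M, 2) positions; ranges: (M,)
    UWB measurements; h: fixed rover altitude above the anchor plane.
    """
    # Prediction with the constant-velocity model (one block per axis).
    F = np.array([[1, 0, T, 0],
                  [0, 1, 0, T],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    Q = q * np.eye(4)
    x, P = F @ x, F @ P @ F.T + Q

    # Update: predicted ranges and their Jacobian w.r.t. the state.
    diff = x[:2] - anchors                      # (M, 2)
    pred = np.sqrt((diff ** 2).sum(axis=1) + h ** 2)
    H = np.zeros((len(anchors), 4))
    H[:, :2] = diff / pred[:, None]
    R = r ** 2 * np.eye(len(anchors))
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x = x + K @ (ranges - pred)
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Synthetic track: rover walking at 0.5 m/s along u, four ground anchors.
anchors = np.array([[0.0, 0.0], [8.0, 0.0], [8.0, 6.0], [0.0, 6.0]])
h, T = 1.5, 0.1
rng = np.random.default_rng(3)
x, P = np.array([0.0, 3.0, 0.0, 0.0]), np.eye(4)
errs = []
for t in range(100):
    true = np.array([0.5 * T * t, 3.0])
    ranges = np.sqrt(((true - anchors) ** 2).sum(axis=1) + h ** 2)
    ranges = ranges + rng.normal(0.0, 0.3, len(anchors))
    x, P = ekf_step(x, P, T, ranges, anchors, h)
    errs.append(np.linalg.norm(x[:2] - true))
mean_err = float(np.mean(errs[20:]))            # skip the initial transient
```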
The computed rover positions in the $\Sigma_{uwb}$ metric coordinate system are used to estimate the parameters (translation vector, rotation matrix, scale factor) for mapping the photogrammetric point cloud from $\Sigma_{photo}$ to $\Sigma_{uwb}$. Least squares fitting (minimizing the sum of squared residuals) provided the best values for the parameters.
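Estimating the scale factor, rotation and translation from corresponding camera positions is a similarity (Helmert) transform fit, for which Umeyama's method gives a closed-form least-squares solution. The sketch below (ours; the paper does not specify the exact solver used) recovers a known synthetic transformation:

```python
import numpy as np

def similarity_transform(src, dst):
    """Least-squares scale s, rotation R, translation t with dst ≈ s·R·src + t
    (Umeyama's closed-form solution)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    src0, dst0 = src - src_c, dst - dst_c
    cov = dst0.T @ src0 / len(src)              # cross-covariance matrix
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))          # reflection correction
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / (src0 ** 2).sum(axis=1).mean()
    t = dst_c - s * R @ src_c
    return s, R, t

# Synthetic check: camera centers in an arbitrary photo frame mapped to a
# metric UWB frame with a known scale of 2.5.
rng = np.random.default_rng(4)
src = rng.normal(size=(50, 3))
a = np.deg2rad(30)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
dst = 2.5 * src @ R_true.T + np.array([1.0, -2.0, 0.5])
s, R, t = similarity_transform(src, dst)
```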
In the scaling process, the same weights have been given to all camera positions estimated by the UWB system. However, positions far from UWB anchors are clearly affected by a higher estimation error with respect to those surrounded by anchors. Introducing different weights depending on the considered position might improve scale estimation performance and will be the object of our future investigation.
Similarly to [8], the optimization problem considered for scale estimation was also extended to solve the synchronization between camera and UWB sensor measurements.
A further improvement of the above estimates can be obtained by also taking into account misalignments due to the lever arm and boresight between the camera and the rover. The rover was fixed with a proper position and orientation in order to make the axes of its coordinate system have approximately the same orientation as those of the camera, with the translation between their centers mostly along a single direction (the vertical axis). Calibration of the camera-UWB rover system was performed similarly to the procedure described in [50]. The correction of camera positions can be applied once an estimate of the transformation from $\Sigma_{photo}$ to $\Sigma_{uwb}$ is available. Then, the transformation from $\Sigma_{photo}$ to $\Sigma_{uwb}$ is re-estimated with the updated coordinates, and the process can be iterated until convergence.
Georeferencing of the photogrammetric model is obtained as follows: first, the UWB rover is set outside of the bastion over the position of the GNSS CP closest to the tunnel entry. This allows estimating the translation term between the $\Sigma_{uwb}$ local frame and the global mapping frame (WGS84, UTM32). North and vertical directions are estimated by exploiting IMU measurements (the IMU was pre-calibrated as described in [40], i.e., calibration of accelerometer and magnetometer): magnetometer and accelerometer data, collected in the rover reference frame, are first transformed into the coordinate systems of the cameras and then into the $\Sigma_{uwb}$ system by using the camera orientations (computed during the photogrammetric reconstruction) and the previously-estimated transformation from $\Sigma_{photo}$ to $\Sigma_{uwb}$. To be more specific, let $g_{rover,i}$ and $m_{rover,i}$ be the (calibrated) accelerometer and magnetometer measurement vectors taken during the acquisition of the i-th image. Since during image acquisition the camera is assumed to be still, the accelerometer measurement $g_{rover,i}$ corresponds to a measurement of the vertical direction in the rover reference frame (i.e., the opposite of the gravity direction). Furthermore, since these vectors correspond to directions, they can be assumed to be unit vectors (i.e., they can be normalized). Let $R_{rover2cam}$, $t_{rover2cam}$ be the rotation matrix and translation vector corresponding to the transformation from the rover to the camera reference frame; then, the measurement vectors in the camera reference frame are obtained by applying the corresponding rotation to $g_{rover,i}$ and $m_{rover,i}$:
$$g_{cam,i} = R_{rover2cam}\, g_{rover,i}, \qquad m_{cam,i} = R_{rover2cam}\, m_{rover,i} \qquad (3)$$
Let $R_{cam,i}$, $t_{cam,i}$ be the rotation matrix and translation vector corresponding to the i-th camera position and orientation in the photogrammetric reconstruction computed by PhotoScan; then:
$$g_{photo,i} = R_{cam,i}\, g_{cam,i}, \qquad m_{photo,i} = R_{cam,i}\, m_{cam,i} \qquad (4)$$
Finally, the rotation $R_{photo2uwb}$ of the previously estimated transformation from $\Sigma_{photo}$ to $\Sigma_{uwb}$ is applied to $g_{photo,i}$ and $m_{photo,i}$:
$$g_{uwb,i} = R_{photo2uwb}\, g_{photo,i}, \qquad m_{uwb,i} = R_{photo2uwb}\, m_{photo,i} \qquad (5)$$
The obtained $g_{uwb,i}$ and $m_{uwb,i}$ are in the UWB reference system $\Sigma_{uwb}$. Since the rover is firmly held during camera acquisitions, the (unnormalized) estimate of the vertical direction $g_{vert}$ is obtained by averaging the accelerometer measurements in the $\Sigma_{uwb}$ reference system:
g v e r t = i = 1 507 g u w b , i 507
After normalization of g v e r t , then the north direction is estimated by averaging the projections of magnetometer measurements (in the Σ u w b coordinate system) on the plane orthogonal to the vertical direction. Let Π g be the projection matrix to such a plane, then:
n o r t h = i = 1 507 Π g m u w b , i 507
To conclude, the n o r t h vector is normalized, and the 3D model is rotated according to the estimated north-vertical directions. Clearly, since the median is a more robust estimator with respect to the mean, it shall be considered in (6) and (7), in particular when the number of images is relatively low.
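The north-vertical estimation described above can be sketched numerically as follows. This is a minimal illustration, not the authors' implementation: the function name and the use of NumPy are assumptions, and the per-component median is used in place of the mean, as the text suggests for robustness. Inputs are accelerometer and magnetometer samples already rotated into the Σ_uwb frame.

```python
import numpy as np

def estimate_vertical_and_north(g_uwb, m_uwb):
    """g_uwb, m_uwb: (N, 3) arrays of accelerometer and magnetometer
    samples, one per image, already rotated into the UWB frame."""
    # Robust estimate of the vertical direction (opposite of gravity):
    # per-component median of the accelerometer samples, then normalized.
    g_vert = np.median(g_uwb, axis=0)
    g_vert /= np.linalg.norm(g_vert)

    # Projection onto the plane orthogonal to the vertical direction
    # (for a unit vector g, the projector is I - g g^T).
    P = np.eye(3) - np.outer(g_vert, g_vert)

    # North direction: robust average of the horizontal components of
    # the magnetometer samples, then normalized.
    north = np.median(m_uwb @ P, axis=0)
    north /= np.linalg.norm(north)
    return g_vert, north
```

Note that, unlike the mean, the per-component median of projected vectors is only approximately orthogonal to the vertical direction, which is acceptable here since the north vector is re-normalized before rotating the model.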
Table 4 reports the results obtained by comparing the photogrammetric and TLS reconstructions. To be more specific, the following cases are considered:

(A) Relative error case: obtained by registering the two point clouds using the ICP algorithm; UWB anchor positions estimated through the self-positioning procedure.
(B) As in (A), but discarding the final part of the left wing of the bastion.
(C) Relative error case: as in (A), but using the experimentally determined optimal scale of the photogrammetric reconstruction; this case should provide relative errors similar to those obtained using ground CPs.
(D) As in (C), but discarding the final part of the left wing of the bastion.
(E) Absolute error case: obtained with UWB anchor positions estimated through the self-positioning procedure.

For comparison, results obtained by using anchor positions (in local coordinates) surveyed with a Leica TCR 702 Total Station are considered as well:

(F) Similar to Case (A), but with surveyed anchor positions.
(G) Similar to Case (E), but with surveyed anchor positions.
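The relative error cases rely on registration with the ICP algorithm [38]. A minimal point-to-point ICP sketch is given below: an illustrative reimplementation (nearest-neighbour correspondences plus a closed-form SVD alignment step), not the processing pipeline actually used for the results in this paper; the function name and the use of NumPy/SciPy are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(src, dst, n_iter=30):
    """Rigidly align point cloud `src` (N, 3) to `dst` (M, 3)."""
    tree = cKDTree(dst)           # reference cloud, queried repeatedly
    cur = src.copy()
    for _ in range(n_iter):
        # 1. Nearest-neighbour correspondences.
        _, idx = tree.query(cur)
        matched = dst[idx]
        # 2. Best rigid transform via SVD (Kabsch algorithm).
        mu_s, mu_d = cur.mean(axis=0), matched.mean(axis=0)
        H = (cur - mu_s).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:  # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_d - R @ mu_s
        # 3. Apply the transform and iterate.
        cur = cur @ R.T + t
    return cur
```

In practice, production implementations add outlier rejection, distance thresholds and convergence tests; this sketch only shows the basic iteration.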
The error map and error distribution for Case (A) are shown in Figure 7; for comparison, those for Case (C) are shown in Figure 8.
Figure 9a shows the map of point-to-point distances between the TLS and photogrammetric point clouds in the WGS84 UTM32 coordinate system, i.e., Case (E). To ease the comparison, Figure 9b reports a top view of the two overlapping models (TLS in blue, photogrammetric in gray).
Finally, Figure 10a shows the map of point-to-point distances between the TLS and photogrammetric point clouds in Case (G). Figure 10b reports a top view of the two overlapping models (TLS in blue, photogrammetric in gray).
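The statistics reported in Tables 1-4 (average, RMS and maximum of point-to-point distances between an evaluated point cloud and the reference cloud) can be sketched as follows. This is an illustrative reimplementation with an assumed function name, not the tool actually used for the paper's figures.

```python
import numpy as np
from scipy.spatial import cKDTree

def cloud_error_stats(evaluated, reference):
    """Point-to-point (nearest-neighbour) distance of every point of the
    evaluated cloud to the reference cloud, summarized as in Tables 1-4."""
    d, _ = cKDTree(reference).query(evaluated)  # one distance per point
    return {"average": float(d.mean()),
            "rms": float(np.sqrt((d ** 2).mean())),
            "max": float(d.max())}
```

For instance, in Case (A) the evaluated cloud would be the ICP-registered photogrammetric cloud and the reference the TLS cloud; the "without left wing" variants presumably crop that area from both clouds before computing the distances.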

5. Discussion

Table 1, obtained by comparing different scans of the Leica Pegasus backpack, shows that the RMS difference between the two georeferenced point clouds is 5.8 cm (absolute error case), whereas it reduces to 3.1 cm after registering them with the ICP algorithm (relative error case). These results are in accordance with the Pegasus backpack specifications: 3 cm of relative accuracy indoors, whereas the absolute position error can grow up to 50 cm for a ten-minute walk [15].
Although at first glance the numerical results reported above might suggest that the relative error of the proposed photogrammetric system can be smaller than the Pegasus backpack's, this is contradicted by the results obtained in Case (F) in Table 4. Case (F) is analogous to (A), except that the anchor positions were surveyed with a Leica TCR 702 Total Station. Case (F) should therefore be considered an optimal anchor positioning case, i.e., UWB self-positioning is expected to lead to equal or worse performance. Consequently, the fact that the RMS for Case (A) is lower than in Case (F) (and lower than the backpack's) should be considered just a matter of chance. Instead, Case (F) shows that the expected "best" relative RMS error for photogrammetry with UWB self-positioning (9.7 cm) is worse than that of the Pegasus backpack (5.8 cm).
Even though the differences between the two Pegasus acquisitions are within the backpack's official tolerance margins, Figure 3 shows that the most significant differences are actually due to missing points in one of the two point clouds. Taking into account also the laser scanning sensors' positions and orientations on the backpack (Figure 1b), this suggests that the completeness of the acquired dataset may also depend on the backpack's orientation with respect to the objects of interest during the survey: varying the backpack orientation is therefore a preferable working condition to ensure the acquisition of more complete datasets.
Comparing the backpack's relative RMS errors, the negative influence of the final part of the bastion's left wing is clear (Figure 4c): this area is intrinsically difficult to reconstruct, and both mobile mapping systems performed poorly in this part. Note that this area has a more significant influence on the backpack error (whose RMS drops from 8.2 cm to 4.6 cm when the area is neglected, Table 2) than on the photogrammetric one (which changes just from 6.1 cm to 4.9 cm, Table 4, Cases (A) and (B)).
Comparison of the Pegasus 3D model with the reference one (TLS, Section 2) confirms that the Pegasus reconstruction is usually very accurate, both in the relative (RMS = 4.6 cm, discarding the final part of the left wing) and in the absolute error case (RMS = 16.1 cm).
Comparison of Cases (A) and (C) in Table 4 shows that UWB self-positioning in the low-cost photogrammetric system presented in Section 4 provides a good estimate of the scaling factor and, consequently, a reconstruction accuracy close to the optimal one (RMS = 6.1 cm and 5.9 cm in the self-positioning and optimal scale cases, respectively). This shows that, when georeferencing is not needed, photogrammetric reconstruction with UWB self-positioning can be considered as an alternative to control points to easily obtain a fairly reliable metric reconstruction.
Unlike the Pegasus, which maintains good performance in both the relative and absolute error cases, the performance degradation of the low-cost system of Section 4 is significant when moving from local to map coordinates (from a relative RMS of 6.1 cm to an absolute RMS of 50.3 cm). Two main factors cause this degradation: the estimation error of the GNSS point in Σ_uwb coordinates, and the estimation of the orientation of Σ_uwb with respect to the georeferenced coordinate system. The first error is caused by the positioning estimation error of the UWB system, whereas the second is mostly due to IMU measurement errors and to calibration errors of the camera-UWB rover system.
The following observations are now in order:
  • The obtained results show that both the mobile mapping systems (Leica Pegasus and UWB-based photogrammetry) allowed producing accurate 3D models in the relative error case, with quite comparable accuracy.
  • Absolute accuracy (map coordinates) showed a clear difference between the two portable systems (partially explained by the better IMU in the Leica Pegasus backpack with respect to the photogrammetric system proposed in this paper).
  • Results shown in this paper confirm the nominal characteristics of the Leica Pegasus backpack, as listed in its specifications [15].
  • Given its acceptable weight and quite good portability, the Leica Pegasus backpack is a very good candidate to produce accurate 3D models in areas where the GNSS signal is not available or that are hard to reach with other instruments (the weight of the Leica ScanStation C10 is similar to the Pegasus one; however, the backpack is easier for a human operator to carry in certain difficult environments).
  • Given the great portability of a standard camera and of Pozyx devices, which are small (the maximum side size is 6 cm) and lightweight (12 g, approximately), the proposed system is particularly well suited for mobile mapping applications where instruments have to be carried for long periods by human operators.
  • In this paper, UWB self-positioning was performed assuming that anchors were distributed on a 2D planar surface. Future investigation will extend UWB self-positioning to more general cases.
  • Comparison of Figure 6a,b shows that the proposed UWB self-positioning procedure led to quite significant errors in certain estimated anchor positions. This had a relatively small effect on the scale estimation error; however, it negatively influenced the north-vertical direction estimation used for georeferencing (compare Cases (E) and (G) in Table 4).
  • Calibration errors of the camera-UWB rover system affect the performance of the photogrammetric system in the georeferenced case. Refinements shall be investigated, in particular for the relative orientation estimation.
  • The synchronization procedure between camera and UWB acquisitions, presented in Section 4, is clearly subject to estimation errors. A more robust synchronization shall be considered to improve the results.
  • Further investigations will be dedicated to reducing the photogrammetric error in the georeferenced case. Possible solutions to be investigated include: including the outdoor area in the photogrammetric reconstruction, in order to make the estimates of GNSS positions in the reconstruction more reliable, and increasing the number of outdoor UWB anchors and/or GNSS points used for georeferencing the 3D model.
  • Both the Leica Pegasus backpack and the photogrammetric system presented in Section 4 allowed for significantly reducing the survey duration with respect to TLS. Most of the time of the backpack survey was spent on calibrating the backpack's sensors, whereas data acquisition was very fast (a few minutes). Setting up the UWB system was also quick (a few minutes), but image acquisition for the photogrammetric reconstruction took longer than the backpack's data acquisition.

6. Conclusions

This paper compared indoor surveys of a medieval bastion in Padua carried out by means of (i) TLS, (ii) the Leica Pegasus backpack and (iii) photogrammetry. In the latter case, metric reconstruction (and 3D model georeferencing) were obtained by means of UWB Pozyx devices spread on the bastion ground and a GNSS measurement outdoors.
Comparison between the Pegasus backpack and the photogrammetric system is motivated by the fact that they share common factors of interest: mobility, portability and possible usage in areas where GNSS is not available. Clearly, given the integration of more sensors and the drastically different cost, the performance of the Pegasus backpack is better than that of the photogrammetric system proposed here. Both the considered mobile mapping systems allowed producing accurate 3D models in terms of relative error (RMS of 8.2 cm and 6.1 cm for the Leica Pegasus backpack and photogrammetry, respectively); however, the performance of the Leica backpack was shown to be much better in terms of absolute error (map coordinates): 16.1 cm vs. 50.3 cm.
Given its much lower cost (less than $2k for the UWB devices, whereas the GNSS receiver alone costs approximately $4k) and its ease of use, the proposed photogrammetric reconstruction method might be considered for applications whose absolute accuracy requirements are less strict than the accuracy ensured by TLS and the Pegasus backpack.
Future investigations will be dedicated to the improvement of the performance of the presented photogrammetric-based mapping system.

Acknowledgments

The authors thank the key account manager of Leica Geosystems (Hexagon), Giovanni Abate, for his valuable help in supporting the development of this project. Furthermore, the authors acknowledge Comitato Mura di Padova and Fondazione Cassa di Risparmio di Padova e Rovigo for their financial support related to the Culturalmente 2015 fellowship.

Author Contributions

All authors conceived of and designed the experiments. Andrea Masiero, Francesca Fissore and Alberto Guarnieri performed the surveys and produced the 3D models. Andrea Masiero analyzed the data. Andrea Masiero, Francesca Fissore and Alberto Guarnieri wrote the paper. Francesco Pirotti, Antonio Vettore and Domenico Visintini supervised the work and contributed to paper revision.

Conflicts of Interest

The authors declare no conflict of interest. The founding sponsors had no role in the design of the study; in the collection, analyses or interpretation of data; in the writing of the manuscript; nor in the decision to publish the results.

References

  1. El-Sheimy, N.; Schwarz, K. Navigating urban areas by VISAT–A mobile mapping system integrating GPS/INS/digital cameras for GIS applications. Navigation 1998, 45, 275–285. [Google Scholar] [CrossRef]
  2. Toth, C. Sensor integration in airborne mapping. In Proceedings of the 18th IEEE Instrumentation and Measurement Technology Conference, Budapest, Hungary, 21–23 May 2001; Volume 3, pp. 2000–2005. [Google Scholar]
  3. Remondino, F.; Barazzetti, L.; Nex, F.; Scaioni, M.; Sarazzi, D. UAV photogrammetry for mapping and 3D modeling-current status and future perspectives. Int. Arch. Photogramm. Remote Sens. Spa. Inf. Sci. 2011, 38. [Google Scholar] [CrossRef]
  4. Al Hamad, A.; El Sheimy, N. Smartphone based mobile mapping systems. Int. Arch. Photogramm. Remote Sens. Spa. Inf. Sci. 2014, 40, 29–34. [Google Scholar] [CrossRef]
  5. Piras, M.; Di Pietra, V.; Visintini, D. 3D modeling of industrial heritage building using COTSs system: Test, limits and performances. Int. Arch. Photogramm. Remote Sens. Spa. Inf. Sci. 2017, 42, 281–288. [Google Scholar] [CrossRef]
  6. Chiang, K.W.; Tsai, M.L.; Chu, C.H. The development of an UAV borne direct georeferenced photogrammetric platform for ground control point free applications. Sensors 2012, 12, 9161–9180. [Google Scholar] [CrossRef] [PubMed]
  7. Ballarin, M.; Balletti, C.; Faccio, P.; Guerra, F.; Saetta, A.; Vernier, P. Survey methods for seismic vulnerability assessment of historical masonry buildings. Int. Arch. Photogramm. Remote Sens. Spa. Inf. Sci. 2017, 42, 55–59. [Google Scholar] [CrossRef]
  8. Masiero, A.; Fissore, F.; Vettore, A. A low-cost UWB based solution for direct georeferencing UAV photogrammetry. Remote Sens. 2017, 9, 414. [Google Scholar] [CrossRef]
  9. Alsubaie, N.M.; Youssef, A.A.; El-Sheimy, N. Improving the accuracy of direct geo-referencing of smartphone-based mobile mapping systems using relative orientation and scene geometric constraints. Sensors 2017, 17, 2237. [Google Scholar] [CrossRef] [PubMed]
  10. Fissore, F.; Pirotti, F.; Vettore, A. Open source web tool for tracking in a low-cost MMS. Int. Arch. Photogramm. Remote Sens. Spa. Inf. Sci. 2017, 99–104. [Google Scholar] [CrossRef]
  11. Ellum, C.; El-Sheimy, N. The development of a backpack mobile mapping system. Int. Arch. Photogramm. Remote Sens. 2000, 33, 184–191. [Google Scholar]
  12. Flener, C.; Vaaja, M.; Jaakkola, A.; Krooks, A.; Kaartinen, H.; Kukko, A.; Kasvi, E.; Hyyppä, H.; Hyyppä, J.; Alho, P. Seamless mapping of river channels at high resolution using mobile LiDAR and UAV-photography. Remote Sens. 2013, 5, 6382–6407. [Google Scholar] [CrossRef]
  13. Kukko, A.; Kaartinen, H.; Hyyppä, J.; Chen, Y. Multiplatform Mobile Laser Scanning: Usability and Performance. Sensors 2012, 12, 11712–11733. [Google Scholar] [CrossRef]
  14. Leica Geosystems. Leica Pegasus: Backpack Wearable Mobile Mapping Solution. Available online: https://leica-geosystems.com/products/mobile-sensor-platforms/capture-platforms/leica-pegasus-backpack (accessed on 3 March 2018).
  15. Leica Geosystems. Leica Pegasus: Backpack. Mobile reality capture. Backpack specifications. Available online: https://leica-geosystems.com/-/media/files/leicageosystems/products/datasheets/leica_pegasusbackpack_ds.ashx?la=en-gb (accessed on 3 March 2018).
  16. Mikhail, E.M.; Bethel, J.S.; Mc Glone, J.C. Introduction to Modern Photogrammetry; Wiley & Sons: New York, NY, USA, 2001. [Google Scholar]
  17. McGlone, C.; Mikhail, E.; Bethel, J.; Mullen, R. Manual of Photogrammetry; ASPRS: Bethesda, MD, USA, 1980. [Google Scholar]
  18. Luhmann, T.; Robson, S.; Kyle, S.; Harley, I. Close Range Photogrammetry: Principles, Techniques And Applications; Whittles: Dunbeath, UK, 2006. [Google Scholar]
  19. Atkinson, K.B. Close Range Photogrammetry and Machine Vision; Whittles: Dunbeath, UK, 1996. [Google Scholar]
  20. Sahinoglu, Z.; Gezici, S.; Guvenc, I. Ultra-Wideband Positioning Systems; Cambridge University Press: Cambridge, UK, 2008; Volume 2. [Google Scholar]
  21. Lee, J.S.; Su, Y.W.; Shen, C.C. A comparative study of wireless protocols: Bluetooth, UWB, ZigBee, and Wi-Fi. In Proceedings of the 33rd Annual Conference of the IEEE Industrial Electronics Society, Taipei, Taiwan, 5–8 November 2007; pp. 46–51. [Google Scholar]
  22. Groves, P.D. Principles of GNSS, Inertial, and Multisensor Integrated Navigation Systems; Artech House: Norwood, MA, USA, 2013. [Google Scholar]
  23. Hofmann-Wellenhof, B.; Lichtenegger, H.; Wasle, E. GNSS—Global Navigation Satellite Systems: GPS, GLONASS, Galileo, and More; Springer Science & Business Media: New York, NY, USA, 2007. [Google Scholar]
  24. Pozyx Labs. Pozyx Positioning System. Available online: https://www.pozyx.io/ (accessed on 3 March 2018).
  25. Habib, A.; Morgan, M. Automatic calibration of low-cost digital cameras. Opt. Eng. 2003, 42, 948–955. [Google Scholar]
  26. Heikkila, J.; Silven, O. A four-step camera calibration procedure with implicit image correction. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Juan, PR, USA, 17–19 June 1997; pp. 1106–1112. [Google Scholar]
  27. Remondino, F.; Fraser, C. Digital camera calibration methods: considerations and comparisons. Int. Arch. Photogramm. Remote Sens. Spa. Inf. Sci. 2006, 36, 266–272. [Google Scholar]
  28. Fraser, C.; Stamatopoulos, C. Automated target-free camera calibration. In Proceedings of the ASPRS 2014 Annual Conference, Louisville, KY, USA, 23–28 March 2014; Volume 2328. [Google Scholar]
  29. Luhmann, T.; Fraser, C.; Maas, H.G. Sensor modelling and camera calibration for close-range photogrammetry. J. Photogramm. Remote Sens. 2015, 115, 37–46. [Google Scholar] [CrossRef]
  30. Agisoft PhotoScan. Available online: http://www.agisoft.com/ (accessed on 3 March 2018).
  31. Topcon Positioning Systems, Inc. HiPer V. Available online: https://www.topconpositioning.com/gnss/integrated-gnss-receivers/hiper-v (accessed on 3 March 2018).
  32. Farrell, J.; Barth, M. The Global Positioning System and Inertial Navigation; McGraw-Hill: New York, NY, USA, 1999. [Google Scholar]
  33. Farrell, J. Aided Navigation: GPS with High Rate Sensors; McGraw-Hill: New York, NY, USA, 2008. [Google Scholar]
  34. Leonard, J.; Durrant-Whyte, H. Simultaneous map building and localization for an autonomous mobile robot. In Proceedings of the IROS ’91. IEEE/RSJ International Workshop on Intelligent Robots and Systems ’91. ’Intelligence for Mechanical Systems, Osaka, Japan, 3–5 November 1991; Volume 3, pp. 1442–1447. [Google Scholar]
  35. Smith, R.C.; Cheeseman, P. On the representation and estimation of spatial uncertainty. Int. J. Robot. Res. 1986, 5, 56–68. [Google Scholar] [CrossRef]
  36. Masiero, A.; Fissore, F.; Guarnieri, A.P.M.; Vettore, A. Comparison of low-cost photogrammetric survey with TLS and Leica Pegasus backpack 3D models. Int. Arch. Photogramm. Remote Sens. Spa. Inf. Sci. 2017, 42, 147–153. [Google Scholar] [CrossRef]
  37. NetGEO. Available online: www.netgeo.it/ (accessed on 3 March 2018).
  38. Besl, P.; McKay, N. A method for registration of 3-D shapes. IEEE Trans. Pattern Anal. Mach. Intell. 1992, 4, 239–256. [Google Scholar] [CrossRef]
  39. Chen, Y.; Medioni, G. Object modelling by registration of multiple range images. Image Vis. Comput. 1992, 10, 145–155. [Google Scholar] [CrossRef]
  40. Hol, J. Sensor Fusion and Calibration of Inertial Sensors, Vision, Ultra-Wideband and GPS. Ph.D. Thesis, Linköping University, Linköping, Sweden, 2011. [Google Scholar]
  41. Toth, C.; Jozkow, G.; Ostrowski, S.; Grejner-Brzezinska, D. Positioning slow moving platforms by UWB technology in GPS-challenged areas. In Proceedings of the The 9th International Symposium on Mobile Mapping Technology, Sydney, Australia, 9–11 December 2015. [Google Scholar]
  42. Dierenbach, K.; Ostrowski, S.; Jozkow, G.; Toth, C.; Grejner-Brzezinska, D.; Koppanyi, Z. UWB for Navigation in GNSS Compromised Environments. In Proceedings of the 28th International Technical Meeting of The Satellite Division of the Institute of Navigation (ION GNSS+ 2015), Tampa, FL, USA, 14–18 September 2015; pp. 2380–2389. [Google Scholar]
  43. Goel, S.; Kealy, A.; Gikas, V.; Retscher, G.; Toth, C.; Brzezinska, D.G.; Lohani, B. Cooperative Localization of Unmanned Aerial Vehicles Using GNSS, MEMS Inertial, and UWB Sensors. J. Surv. Eng. 2017, 143, 04017007. [Google Scholar] [CrossRef]
  44. Monica, S.; Ferrari, G. An experimental model for UWB distance measurements and its application to localization problems. In Proceedings of the 2014 IEEE International Conference on Ultra-WideBand (ICUWB), Paris, France, 1–3 September 2014; pp. 297–302. [Google Scholar]
  45. Wen, K.; Yu, K.; Li, Y. An experimental correction model for UWB through-the-wall distance measurements. In Proceedings of the 2016 IEEE International Conference on Ubiquitous Wireless Broadband (ICUWB), Nanjing, China, 16–19 October 2016; pp. 1–4. [Google Scholar]
  46. Anderson, B.D.; Moore, J.B. Optimal Filtering; Courier Corporation: North Chelmsford, MA, USA, 2012. [Google Scholar]
  47. Pirkl, G.; Munaretto, D.; Fischer, C.; An, C.; Lukowicz, P.; Klepal, M.; Timm-Giel, A.; Widmer, J.; Pesch, D.; Gellersen, H. Virtual lifeline: Multimodal sensor data fusion for robust navigation in unknown environments. Pervasive Mob. Comput. 2012, 8, 388–401. [Google Scholar]
  48. Saeedi, S.; Moussa, A.; El-Sheimy, N. Context-Aware Personal Navigation Using Embedded Sensor Fusion in Smartphones. Sensors 2014, 14, 5742–5767. [Google Scholar] [CrossRef] [PubMed]
  49. Masiero, A.; Guarnieri, A.; Pirotti, F.; Vettore, A. A Particle Filter for Smartphone-Based Indoor Pedestrian Navigation. Micromachines 2014, 5, 1012–1033. [Google Scholar] [CrossRef]
  50. Hol, J.; Schön, T.; Gustafsson, F. Modeling and Calibration of Inertial and Vision Sensors. Int. J. Robot. Res. 2010, 29, 231–244. [Google Scholar] [CrossRef]
Figure 1. (a) UWB Pozyx device (beside a centimeter ruler). (b) Leica Pegasus backpack.
Figure 2. (a) A section of the tunnel of the Impossible bastion. (b) Setup of the Leica ScanStation C10 over a traverse mark and target pole over the back target.
Figure 3. Leica Pegasus precision assessment: comparison of two 3D models generated in successive surveys. (a) Top view of the Pegasus backpack 3D model and line u (in red) corresponding to the corridor direction. (b) Profile view of the corridor of the Pegasus backpack 3D model. (c) Average error between the two 3D models evaluated on slices distributed along the line u. (d) Average error between the two 3D models after registration with ICP.
Figure 4. Leica Pegasus 3D model accuracy assessment: comparison with the TLS 3D model. Point clouds registered in local coordinates with the ICP algorithm. (a) Histogram of point-to-point distances between point clouds. (b) Histogram of point-to-point distances discarding the end part of the left wing of the bastion. (c) Final part of left wing of the bastion.
Figure 5. Leica Pegasus 3D model accuracy assessment: comparison with the TLS 3D model. Comparison of georeferenced point clouds (WGS84 UTM32). (a) Map of point-to-point distances. (b) Histogram of distances shown in (a).
Figure 6. Pozyx anchor positions during the survey in the Impossible bastion, estimated with respect to the PhotoScan 3D model (local coordinate system): surveyed with the Leica TCR 702 Total Station (red dots); estimated with self-positioning of Pozyx anchors (blue dots).
Figure 7. Photogrammetric 3D model accuracy assessment: comparison with the TLS 3D model. Point clouds registered in local coordinates with the ICP algorithm. (a) Map of point-to-point distances. (b) Histogram of distances shown in (a).
Figure 8. Photogrammetric 3D model (with optimal scale factor) accuracy assessment: comparison with the TLS 3D model. Point clouds registered in local coordinates with the ICP algorithm. (a) Map of point-to-point distances. (b) Histogram of distances shown in (a).
Figure 9. (a) Map of point-to-point distances between the TLS and photogrammetric point clouds in the WGS84 UTM32 coordinate system. (b) Top view of the two georeferenced point clouds (TLS in blue, photogrammetric in gray).
Figure 10. (a) Map of point-to-point distances between the TLS and photogrammetric point clouds in the WGS84 UTM32 coordinate system. UWB anchor positions surveyed with the Leica TCR 702 Total Station. (b) Top view of the two georeferenced point clouds (TLS in blue, photogrammetric in gray).
Table 1. Leica Pegasus 3D model precision assessment.

                    Average Error (cm)   RMS (cm)   Maximum Error (cm)
absolute error             4.7             5.8           110.3
relative error             2.2             3.1            60.5
Table 2. Leica Pegasus 3D model accuracy assessment: relative error with respect to the TLS 3D model.

                                       Average Error (cm)   RMS (cm)   Maximum Error (cm)
relative error                                4.3             8.2           193.4
relative error (without left wing)            3.6             4.6            79.3
Table 3. Leica Pegasus 3D model accuracy assessment: absolute error with respect to the TLS 3D model.

                                       Average Error (cm)   RMS (cm)   Maximum Error (cm)
absolute error                               12.8            16.1           139.3
absolute error (without left wing)           11.7            14.4            67.8
Table 4. Photogrammetric 3D model accuracy assessment: comparison with the TLS 3D model.

                                                     Average Error (cm)   RMS (cm)   Max Error (cm)
(A) relative error                                          3.6             6.1          115.9
(B) relative error, without left wing                       3.1             4.9           79.7
(C) relative error + opt. scale                             2.2             5.9          120.9
(D) relative error + opt. scale, without left wing          1.9             4.7           90.8
(E) absolute error                                         42.0            50.3          134.9
(F) relative error + surveyed anchor positions              6.6             9.7          114.0
(G) absolute error + surveyed anchor positions             23.7            30.7          109.2
