Article

Quantifying the Measurement Precision of a Commercial Ultrasonic Real-Time Location System for Camera Pose Estimation in Indoor Photogrammetry

Department of Geomatics Engineering, The University of Calgary, 2500 University Drive NW, Calgary, AB T2N 1N4, Canada
* Author to whom correspondence should be addressed.
Sensors 2026, 26(1), 319; https://doi.org/10.3390/s26010319
Submission received: 21 November 2025 / Revised: 18 December 2025 / Accepted: 19 December 2025 / Published: 3 January 2026
(This article belongs to the Special Issue Sensors for Object Detection, Pose Estimation, and 3D Reconstruction)

Abstract

Photogrammetric reconstruction from indoor imagery requires either labor-intensive ground control points (GCPs) or positioning sensor integration. While global navigation satellite system technology revolutionized aerial photogrammetry by enabling direct georeferencing through integrated sensor orientation (ISO), indoor environments lack an equivalent positioning solution. Before indoor positioning systems can be adopted for photogrammetric applications, their fundamental measurement precision must be established. This study characterizes the repeatability and temporal stability of the ZeroKey Quantum real-time location system (RTLS) as a prerequisite to testing reconstruction accuracy when RTLS measurements provide camera pose constraints in photogrammetric bundle adjustment. Through systematic tripod-mounted observations across 30 test locations in a controlled laboratory environment, optimal data collection protocols were determined, temporal stability was investigated, and measurement precision was quantified. An automated position-based stationary detection algorithm using a 20 mm threshold successfully identified all 30 stationary periods for durations of 30 s or less. Optimal duration analysis revealed that 1 s observation windows achieve 3 mm position precision and 1° orientation precision after brief settling, enabling practical workflows with worst-case total collection time of 2.5 s per station. Per-axis uncertainties were quantified as 1.6 mm, 1.7 mm, and 1.1 mm root mean square (RMS) for position and 0.08°, 0.09°, and 0.07° RMS for orientation. These findings demonstrate that ultrasonic RTLS achieves millimeter-level position repeatability and sub-degree orientation repeatability, establishing the measurement precision necessary to justify subsequent accuracy testing through photogrammetric bundle adjustment.

1. Introduction

1.1. Background

Photogrammetric reconstruction produces three-dimensional (3D) models from overlapping photographs, but without external reference information, these models exist in an arbitrary coordinate system with unknown scale. Resolving the scale ambiguity is essential for extracting metric information from the reconstruction. When aerial photogrammetry was first developed, this was achieved through extensive networks of ground control points (GCPs)—points with known coordinates that tie the reconstructed model to a global coordinate system and establish correct scale. However, establishing and surveying these GCP networks was labor-intensive, severely limiting operational efficiency. The integration of global navigation satellite system (GNSS) receivers with aerial cameras in the 1990s revolutionized the field by enabling direct georeferencing through integrated sensor orientation (ISO), where GNSS measurements are incorporated as weighted observations in photogrammetric bundle adjustment [1,2]. GNSS-assisted aerial triangulation dramatically reduced ground control requirements, and combined GNSS/inertial navigation system solutions enabled high-accuracy reconstruction with minimal or no GCPs [3,4]. This technological advancement transformed aerial photogrammetry from a ground-control-intensive process into a rapid, efficient workflow.
Indoor environments lack an equivalent positioning solution. GNSS signals do not penetrate building structures, rendering GNSS-based ISO impossible for indoor applications. This limitation motivates investigation of indoor positioning systems (IPSs) that could provide camera pose constraints for indoor photogrammetric reconstruction.

1.2. Related Work

IPSs have been implemented using various positioning technologies, including Wi-Fi, Bluetooth Low Energy (BLE), ultra-wideband (UWB), and ultrasonic technologies. Comprehensive surveys have examined these technologies from perspectives of accuracy, coverage, cost, and application suitability [5,6,7]. Wi-Fi fingerprinting typically achieves meter-level accuracy [5], while BLE fingerprinting achieves 1–3 m accuracy [6]. UWB systems achieve 10–30 cm accuracy through time-of-arrival ranging [8], and ultrasonic ranging achieves 1–50 cm accuracy, with some implementations reaching millimeter-level performance [9,10]. Of these technologies, only UWB and ultrasonic technologies demonstrate accuracy potentially suitable for photogrammetric applications.
Significant research has addressed the integration of positioning systems with visual systems for dynamic applications involving moving cameras. While UWB-vision fusion has been investigated extensively for real-time applications [11,12,13,14,15], ultrasonic-vision integration has received comparatively less attention [16]. These approaches employ sensor fusion frameworks—such as visual odometry [11], visual-inertial odometry [13], and simultaneous localization and mapping [14]—that continuously estimate position as the camera moves. Such frameworks prioritize real-time responsiveness and trajectory continuity for applications such as mobile robotics, pedestrian navigation, and unmanned aerial vehicle (UAV) localization.
These sensor fusion frameworks can be characterized as tightly or loosely coupled. Tightly coupled integration techniques fuse raw measurements from multiple sensors simultaneously within a single estimation framework [14,17]. Loosely coupled techniques instead combine already-processed outputs from independent systems [18]. The present study proposes a loosely coupled approach where RTLS position and orientation measurements serve as weighted constraints in photogrammetric bundle adjustment. Regardless of fusion approach, calibration for sensor error models may be required depending on system characteristics [19], and geometric calibration of mounting parameters (lever arm and boresight) is necessary for accurate direct georeferencing [20,21,22]. However, mounting parameter calibration is beyond the scope of this precision characterization study.
For photogrammetric applications specifically, existing positioning integration approaches have employed post-processing methods rather than direct incorporation as weighted constraints within bundle adjustment. Reference [23] demonstrated UWB-based positioning for UAV photogrammetry by applying a post-processing transformation to align reconstruction with the positioning coordinate frame, while [24] used ultrasonic positioning for post-processing depth correction of structure-from-motion point clouds. Post-processing approaches can only correct the final reconstruction as a rigid body, treating systematic errors uniformly across the entire model. In contrast, ISO incorporates positioning measurements directly into bundle adjustment optimization, allowing the solution to account for measurement uncertainties at each camera station.
In summary, existing research on positioning-vision integration exhibits three key gaps: (1) UWB has received substantially more attention than ultrasonic positioning, despite ultrasonic systems being capable of superior accuracy; (2) integration approaches have focused on real-time tracking for dynamic applications rather than precise pose estimation at discrete capture instants; and (3) photogrammetric applications have employed post-processing corrections rather than direct incorporation of positioning measurements as weighted constraints in bundle adjustment. Systematic characterization of ultrasonic positioning system precision for stationary camera pose estimation, as required for photogrammetric ISO, remains relatively unexplored.

1.3. Research Objectives

These gaps motivate evaluation of a real-time location system (RTLS) as a positioning solution for indoor photogrammetry. The present study quantifies the measurement precision of the Quantum RTLS (ZeroKey Inc., Calgary, AB, Canada)—a commercial ultrasonic positioning system with an integrated inertial measurement unit (IMU) and manufacturer-specified position accuracy of 1.5 mm [25]. The envisioned operational workflow involves efficient handheld data collection where an operator walks through a space, pausing briefly at each station to capture imagery and record pose estimates—eliminating both GCPs and tripod setup. Realizing this workflow requires a positioning system with sufficient measurement accuracy. However, precision is a necessary prerequisite for accuracy: sensors with poor repeatability cannot provide accurate constraints. This study establishes precision thresholds appropriate for photogrammetric applications and identifies data collection protocols necessary for the RTLS to achieve them.
Before the envisioned handheld workflow can be validated, fundamental RTLS measurement characteristics must be established under controlled conditions. This study therefore employs tripod-mounted observations to establish optimal data collection protocols and quantify achievable precision. This approach isolates RTLS measurement behavior from operator motion artifacts, providing the empirical foundation required before progressing to handheld deployment.
This paper addresses the following research objectives in a controlled laboratory environment:
  • Develop an automated stationary detection algorithm to support the envisioned walk-around workflow, enabling the system to detect when the operator has paused and signal when measurements meeting precision thresholds have been acquired.
  • Evaluate IMU sensor fusion performance across all five proprietary configuration profiles to identify which provides suitable orientation constraints for stationary photogrammetric observations.
  • Determine optimal data collection protocols by evaluating both duration (observation windows from 0.25 s to 90 s) and strategy (immediate collection versus threshold-based settling).
  • Quantify achievable position and orientation precision under stationary conditions, establishing baseline performance characteristics to inform the weighting of camera pose constraints in photogrammetric bundle adjustment.
The findings provide the empirical foundation necessary for subsequent accuracy testing of RTLS-assisted photogrammetric reconstruction, establishing whether ultrasonic positioning can offer indoor photogrammetry what GNSS provided for aerial photogrammetry—a practical alternative to ground control.

1.4. Organization

The remainder of this paper is organized as follows: Section 2 describes the ZeroKey Quantum RTLS architecture, experimental design, and analytical framework. Section 3 presents experimental results and analysis, including stationary detection algorithm evaluation, IMU profile comparison, optimal data collection protocol analysis, and precision quantification. Section 4 synthesizes the findings and establishes accuracy testing as the focus for subsequent work.

2. Materials and Methods

2.1. ZeroKey Quantum RTLS

The ZeroKey Quantum RTLS is a commercial ultrasonic IPS. The following system description is based on the manufacturer’s white paper [26] and technical documentation [27].
The system comprises fixed anchor nodes with known positions, one or more mobile nodes that serve as the tracked devices, and a Gateway that serves as the communications bridge to a connected computer. The mobile node body frame coordinate system is defined with its origin at the center of the ultrasonic transducer, as shown in Figure 1. The positive X-axis points toward the device’s action button, the positive Y-axis points toward the micro-USB port, and the Z-axis completes a right-handed coordinate system.
The Quantum RTLS implements a hybrid electromagnetic-ultrasonic ranging protocol that eliminates the strict clock synchronization requirements of conventional time-of-flight (ToF) systems. When the mobile device initiates a ranging sequence, it transmits both an electromagnetic signal and an ultrasonic pulse simultaneously. The electromagnetic wave arrives at the anchor node essentially instantaneously and serves as a precise time reference, while the ultrasonic pulse arrives after a measurable delay. By comparing the arrival times of these two signals, the anchor node can calculate the distance to the mobile node without requiring synchronized clocks between devices.
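In essence, the ranging principle reduces to scaling the arrival-time gap between the two signals by the speed of sound. A minimal sketch of this relationship (the function name and the fixed 343 m/s sound speed are illustrative assumptions; the actual system presumably compensates for temperature and other propagation effects):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C (assumed constant for illustration)

def range_from_arrival_times(t_em: float, t_us: float) -> float:
    """Distance from the arrival-time gap between the (effectively
    instantaneous) electromagnetic reference and the ultrasonic pulse."""
    dt = t_us - t_em  # acoustic travel time in seconds
    if dt < 0:
        raise ValueError("ultrasonic pulse cannot arrive before the EM reference")
    return SPEED_OF_SOUND * dt

# e.g. a 10 ms gap corresponds to roughly 3.43 m of range
```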
The system fuses range measurements from multiple anchors with inertial measurements from the mobile node’s integrated IMU to produce six-degree-of-freedom pose estimates. The position of the mobile node transducer is expressed as Cartesian coordinates (X, Y, Z), and the orientation of the mobile node body frame relative to the local anchor network coordinate system is expressed as a unit quaternion.
The system requires the mobile device to have line-of-sight to a minimum of four anchors at all times for 3D positioning [29]. Before operation, the anchor network must undergo a guided calibration procedure to establish the local coordinate system [30]. The local frame is a right-handed Cartesian coordinate system with the negative Z-axis aligned with local gravity. The system selects an arbitrary anchor as the origin. Heading is also defined arbitrarily. The proprietary nature of the calibration algorithm prevents direct examination of how anchor geometry influences coordinate system establishment.
The integrated IMU provides orientation estimates through five selectable profiles optimized for different motion patterns [31]: Profile 1 (Pedestrian), Profile 2 (Tooltip), Profile 3 (Hybrid Autonomous Guided Vehicle), Profile 4 (Autonomous Guided Vehicle), and Profile 5 (Generic). Each profile employs proprietary sensor fusion algorithms and requires an initialization training procedure each time positioning begins to train the sensor fusion algorithm for the expected motion characteristics. All five profiles were evaluated empirically to determine which demonstrates precision characteristics best suited for photogrammetric bundle adjustment.

2.2. Experimental Design and Data Collection

2.2.1. Test Environment and Infrastructure

Experiments were conducted in the Geospatial Vision Metrology Laboratory at the University of Calgary, where a ZeroKey Quantum RTLS was deployed to provide positioning coverage over a 6 m × 8 m floor area. Eight anchor nodes were mounted on the walls and ceiling at heights ranging from 2.3 m to 3.2 m above the floor, selected to provide optimal geometric coverage at the typical handheld camera height of approximately 1.5 m, reflecting the intended smartphone photogrammetric workflow. The anchor network geometry was designed to satisfy three requirements:
  • Ensuring at least four anchors maintain line-of-sight to any position within the tracked volume to enable robust positioning.
  • Maximizing geometric diversity through varied anchor heights and spatial distribution to strengthen positioning accuracy throughout the test area.
  • Maintaining practical installation constraints by mounting anchors on existing walls and ceiling surfaces without specialized infrastructure.
Figure 2 presents a perspective view of the test area (rendered from a point cloud acquired via terrestrial laser scanning) showing the eight anchor node positions within the laboratory space. The anchor network was calibrated using the manufacturer’s guided calibration procedure to establish the local coordinate system.
Figure 3 shows the anchor network geometry in plan view, illustrating the 6 m × 8 m floor area and the distribution of anchor nodes around the perimeter and overhead. A regular grid of 30 test points with 1 m spacing was established within this area, covering the central region where geometric constraints from the anchor network are strongest. These grid points served as observation locations for systematic data collection, enabling evaluation of positioning performance as a function of location within the tracked volume.

2.2.2. Data Collection

Data collection followed a systematic protocol designed to evaluate measurement repeatability and temporal stability. At each grid point, a ZeroKey mobile node was positioned on a tripod with the transducer facing upward at approximately 1.5 m height—approximating typical handheld camera positions—and remained stationary for 90 s while continuously logging position and orientation estimates at 20 Hz. This tripod-mounted approach isolated RTLS measurement characteristics from operator motion artifacts, establishing proof-of-concept validation under controlled conditions before progressing to handheld deployment.
The 90 s observation duration was selected to enable investigation of temporal stability and convergence characteristics while remaining practical for multi-station surveys. Each stationary period provided approximately 1800 pose estimates (90 s × 20 Hz), enabling robust statistical characterization of measurement precision and systematic behavior.
Data collection was repeated across all 30 grid points for each of the five IMU profiles, generating comprehensive datasets for profile comparison. Prior to data collection for each profile, the system was initialized according to the manufacturer’s instructions.

2.3. Analytical Framework

2.3.1. Stationary Period Detection

Automated detection of stationary periods from the continuous data stream is essential for the envisioned walk-around workflow. A position-based detection approach using consecutive position differences was implemented, avoiding potential issues with velocity estimates derived from internal filtering rather than direct measurement. The algorithm computed the Euclidean distance between consecutive position estimates at the 20 Hz sampling rate, classifying samples as stationary when the distance fell below a specified threshold. Stationary periods were then identified as continuous runs of stationary samples exceeding a minimum duration requirement.
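The position-difference detector described above can be sketched as follows (a simplified reconstruction for illustration, not the study's implementation; function and parameter names are assumed, with defaults mirroring the 20 Hz sampling rate and the thresholds examined in this paper):

```python
import numpy as np

def detect_stationary_periods(positions, fs=20.0, threshold=0.020, min_duration=1.0):
    """Identify stationary periods in an (N, 3) array of positions [m].

    A sample is flagged stationary when the Euclidean step from the
    previous sample falls below `threshold` (20 mm here); runs of
    flagged samples shorter than `min_duration` seconds are discarded.
    Returns a list of (start_index, end_index) pairs, end exclusive.
    """
    pos = np.asarray(positions, dtype=float)
    steps = np.linalg.norm(np.diff(pos, axis=0), axis=1)  # N-1 step lengths
    min_samples = int(round(min_duration * fs))
    periods, run_start = [], None
    for i, is_still in enumerate(steps < threshold):
        if is_still and run_start is None:
            run_start = i
        elif not is_still and run_start is not None:
            if i - run_start >= min_samples:
                periods.append((run_start, i + 1))
            run_start = None
    if run_start is not None and len(steps) - run_start >= min_samples:
        periods.append((run_start, len(pos)))
    return periods
```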
The position threshold parameter required empirical determination. To identify the minimum threshold achieving reliable stationary detection, systematic testing was performed across threshold values of 15 mm, 20 mm, and 25 mm. For each threshold, detection success was assessed by comparing the number of detected periods against the 30 known stationary periods, with ground truth boundaries established by timestamps recorded during data collection. The interaction between threshold and minimum duration was also characterized by varying the minimum duration parameter from 1 s to 25 s, with particular attention to whether detected period durations matched the true stationary period lengths (nominally 90 s).
Following threshold determination, the optimal threshold was adopted for all subsequent analyses. Detection performance was evaluated across the five IMU profiles and three stationary duration conditions (30 s, 60 s, and 90 s), measuring both the number of periods successfully identified and the 3D precision achieved for each detected period.

2.3.2. Position and Orientation Estimate Analysis

This analysis used the full 90 s collections at each grid point to establish baseline precision metrics and identify any temporal or spatial performance patterns that would influence subsequent data collection optimization.
For orientation analysis, quaternions from each 90 s period were averaged using the Markley method [32] to ensure mathematically rigorous averaging across the rotation space. For photogrammetric applications, the orientation quaternions were converted to Cardan angles (ω, ϕ, κ), representing sequential rotations about the X, Y’, and Z’’ axes and aligning with standard close-range photogrammetric notation for camera pose parameters [33].
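The Markley method computes the average rotation as the eigenvector, associated with the largest eigenvalue, of the accumulated quaternion outer-product matrix. A minimal sketch (the function name is assumed; this is not the study's implementation):

```python
import numpy as np

def average_quaternions_markley(quats):
    """Markley quaternion averaging: the mean rotation is the eigenvector
    of M = sum(q q^T) with the largest eigenvalue. Input: (N, 4) array of
    unit quaternions; sign ambiguity between samples (q vs. -q) cancels
    in the outer product, so no pre-alignment is needed."""
    q = np.asarray(quats, dtype=float)
    M = q.T @ q                          # 4x4 symmetric accumulator
    eigvals, eigvecs = np.linalg.eigh(M)  # eigenvalues in ascending order
    mean_q = eigvecs[:, -1]              # eigenvector of largest eigenvalue
    return mean_q / np.linalg.norm(mean_q)
```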
The tripod-mounted data collection employed the mobile node with the ultrasonic transducer facing upward, simulating the envisioned smartphone mounting configuration where the mobile node would be rigidly attached to a smartphone held in landscape orientation and tilted upward to capture imagery. This mounting orientation results in nominal camera orientations of ω ≈ 90° (depending on the upward tilt angle), ϕ ≈ 0° (representing the approximate heading at which the mobile node was manually placed), and κ ≈ 0° (due to the landscape orientation). This physically realistic configuration enabled the measured orientations to be assessed against known nominal angles to validate whether the RTLS produces accurate orientation measurements for the intended photogrammetric application.
Position precision was characterized by 95% confidence ellipses for horizontal (XY) components and 95% confidence intervals for vertical (Z) components. Standard deviations were calculated separately for each Cardan angle to quantify rotational stability and identify component-specific characteristics.
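A 95% horizontal confidence ellipse can be derived from the eigendecomposition of the XY sample covariance, scaling the eigenvalues by the 95% chi-square quantile with 2 degrees of freedom (about 5.991). A minimal sketch under those standard assumptions (function name illustrative):

```python
import numpy as np

CHI2_95_2DOF = 5.991  # 95% quantile of chi-square with 2 d.o.f.

def horizontal_confidence_ellipse(xy):
    """95% confidence ellipse of the horizontal (XY) scatter.
    Returns (semi_major, semi_minor, orientation_rad), where the
    orientation is the direction of the major axis from +X."""
    cov = np.cov(np.asarray(xy, dtype=float), rowvar=False)   # 2x2 covariance
    eigvals, eigvecs = np.linalg.eigh(cov)                    # ascending order
    semi_minor, semi_major = np.sqrt(eigvals * CHI2_95_2DOF)
    theta = np.arctan2(eigvecs[1, 1], eigvecs[0, 1])          # major-axis angle
    return semi_major, semi_minor, theta
```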

2.3.3. Optimal Duration Analysis

Optimal duration analysis employed a two-stage approach to determine the minimum observation time required at each station to achieve pose estimates meeting precision thresholds. Candidate durations ranging from 0.25 s to 90 s were systematically evaluated, with the minimum viable duration set to 0.25 s (5 epochs at 20 Hz) to ensure reliable precision calculations.
The 3D position precision threshold was set at 3 mm—double the manufacturer’s 1.5 mm accuracy specification—to account for potential systematic errors and ensure robust performance under varying conditions. The precision threshold for each Cardan angle was set at 1°, a conservative value that the system easily satisfied. At the achieved precision level of approximately 0.1° (Section 3.4), orientation uncertainty contributes less than 9 mm positional uncertainty at object distances of 5 m, typical of indoor close-range applications.
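The positional effect of the orientation threshold follows from small-angle geometry: an angular error dθ displaces a ray by roughly d·tan(dθ) at object distance d. A quick check of the figure quoted above (function name illustrative):

```python
import math

def angular_to_positional_mm(dtheta_deg: float, distance_m: float) -> float:
    """Approximate lateral displacement [mm] caused by an orientation
    error dtheta at a given object distance."""
    return distance_m * math.tan(math.radians(dtheta_deg)) * 1000.0

# 0.1 deg at 5 m -> about 8.7 mm, consistent with the <9 mm figure above
```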
First-window analysis evaluated the precision achievable from immediate data collection by extracting the initial portion of each 90 s dataset for each candidate duration and computing the resulting position and orientation precision. For each candidate duration, the mean and range of 3D position precision and per-angle orientation precision values across all 30 grid points were computed to characterize both typical performance and variability between locations.
Building on these baseline results, sliding-window threshold analysis identified the earliest window within each stationary period that achieved the precision criterion. Position and orientation measurements were analyzed independently, as their precision may evolve differently over time. Windows of fixed duration were advanced through the 90 s stationary period at 0.05 s (1-epoch) intervals. The analysis recorded:
  • Success rate: the percentage of grid points where the precision criterion was met.
  • Wait time: the delay from the start of the stationary period until the start of the first threshold-meeting window.
  • Collection time: wait time plus window duration.
  • Final precision: the precision values achieved during the first threshold-meeting window.
An optimal collection duration was selected based on these results—the shortest duration achieving 100% success rate for both position and orientation. Visualizations of the 95% position confidence regions were prepared for this optimal duration using both the first-window and sliding-window threshold approaches, demonstrating the improvements achievable through threshold-based settling compared to immediate data collection.
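The sliding-window search can be sketched as below. This is a simplified reconstruction, not the study's implementation: the 3D precision metric here (the norm of the per-axis sample standard deviations) and the function name are assumptions, since the paper's exact precision definition is not restated at this point.

```python
import numpy as np

def earliest_threshold_window(positions, fs=20.0, window=1.0,
                              threshold=0.003, step=0.05):
    """Slide a fixed-duration window through a stationary period and
    return (wait_time_s, precision_m) for the first window whose 3D
    position precision meets `threshold`, or None if none qualifies.
    Precision = norm of per-axis sample standard deviations (assumed)."""
    pos = np.asarray(positions, dtype=float)
    w = int(round(window * fs))                  # window length in epochs
    stride = max(1, int(round(step * fs)))       # 0.05 s = 1 epoch at 20 Hz
    for start in range(0, len(pos) - w + 1, stride):
        chunk = pos[start:start + w]
        prec = float(np.linalg.norm(chunk.std(axis=0, ddof=1)))
        if prec <= threshold:
            return start / fs, prec
    return None
```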

2.3.4. Observation Uncertainty Quantification

Lastly, per-component position and orientation precision were quantified to inform the weighting of camera pose constraints in photogrammetric bundle adjustment. Using the earliest windows meeting both position and orientation thresholds at the optimal duration across all grid points, individual standard deviations for each position axis and each Cardan angle were computed, along with their root mean square (RMS) values. This approach provides a balanced characterization of measurement uncertainty that accounts for spatial variability without being dominated by outliers.
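Aggregating per-station standard deviations by RMS, and converting the result to a bundle-adjustment weight, can be sketched as follows (a minimal illustration; the weight convention 1/σ² is the standard least-squares choice, not a detail stated by the paper):

```python
import numpy as np

def rms(values):
    """Root mean square of per-station standard deviations, used to
    summarize measurement uncertainty across all grid points."""
    v = np.asarray(values, dtype=float)
    return float(np.sqrt(np.mean(v ** 2)))

def observation_weight(sigma):
    """Least-squares weight for a pose observation with uncertainty sigma."""
    return 1.0 / sigma ** 2
```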

3. Results

3.1. Stationary Period Detection

3.1.1. Threshold Determination

Systematic testing across threshold values of 15 mm, 20 mm, and 25 mm identified 20 mm as the minimum threshold achieving reliable detection of all 30 stationary periods.
Thresholds below 20 mm resulted in over-detection due to fragmentation of legitimate stationary periods. At 15 mm with 1 s minimum duration, 32 periods were detected instead of 30, with station 6—as shown in Figure 3b—fragmented into three separate detections (67.3 s, 6.7 s, and 20.4 s) due to instantaneous position jitter exceeding the stricter threshold. This fragmentation was consistent across minimum durations from 1 s to 10 s. Increasing the minimum duration to 25 s filtered the spurious fragments (yielding 30 detections), but the detected duration for station 6 remained only 67.3 s compared to the nominal 90 s stationary period. Correct detection of the full stationary period would require minimum durations exceeding 67 s, contradicting the goal of rapid data collection.
Conversely, increasing the threshold to 25 mm introduced a minor false positive at 1 s minimum duration: 31 periods were detected, including a spurious 1.1 s detection during a transition between stations. This was easily resolved by increasing the minimum duration to 5 s, which achieved correct detection of all 30 periods with accurate durations. Unlike the 15 mm threshold, which required impractically long minimum durations to avoid fragmenting legitimate stationary periods, the 25 mm threshold remained viable for efficient workflows. This asymmetry suggests that erring toward larger thresholds is preferable, as spurious transition detections are more easily filtered than fragmented stationary periods.
The 20 mm threshold achieved 100% detection success at minimum durations as short as 1 s, accommodating instantaneous sample-to-sample variability while maintaining sensitivity to true stationary periods. While this threshold exceeds the manufacturer’s 1.5 mm accuracy specification by an order of magnitude, it reflects instantaneous jitter characteristics of the real-time sensor fusion algorithm rather than time-averaged precision. The precision analysis in Section 3.4 confirms that millimeter-level repeatability is achieved once stationarity is established and measurements are averaged over appropriate collection windows. The 20 mm threshold was adopted for all subsequent stationary detection analyses.

3.1.2. Detection Performance

The automated stationary detection algorithm was evaluated across the five IMU profiles at three stationary period durations (30 s, 60 s, and 90 s) to assess detection reliability and characterize precision across profiles. Table 1 presents detection results showing the number of successfully detected periods, detection accuracy percentages, undetected point IDs, and 3D precision statistics for each profile-duration combination.
All five IMU profiles achieved 100% detection of 30 stationary periods at 30 s duration, with mean 3D precision ranging from 2.4 mm (Profile 2) to 4.1 mm (Profile 5). However, extending stationary periods to 60 s and 90 s reduced detection reliability for Profiles 1, 4, and 5, while Profiles 2 and 3 maintained 100% detection across all durations. Profile 1 showed the most severe duration sensitivity, declining to 90% for 90 s periods (failures at grid points 1, 22, and 26). Profiles 4 and 5 exhibited moderate duration sensitivity, with detection rates dropping to 97% at 60 s and further declining at 90 s. These findings indicate that RTLS’s proprietary filtering algorithms prioritize real-time responsiveness over extended stationary stability, making longer collection periods counterproductive for certain profiles.
Profile 2 was selected for all subsequent analyses due to its consistent 100% detection performance across all durations and its superior precision characteristics—both the best mean precision (2.4 mm) and the best maximum precision (5.9 mm) among the five profiles. While Profile 3 showed similar detection performance, Profile 2's precision and design characteristics make it the optimal choice for the envisioned workflow. All subsequent results present data from Profile 2 only, with complete results for all five profiles available in the Appendix A to support application-specific optimization.

3.2. Position and Orientation Estimate Analysis

Position estimates demonstrated consistent sub-centimeter precision across the tracked volume. Figure 4 presents the 95% confidence regions calculated from the 90 s stationary periods at all 30 grid points. Horizontal confidence ellipses achieved a mean semi-major axis length of 5.1 mm with a maximum of 15.3 mm, while vertical confidence intervals averaged 2.0 mm with a maximum of 4.5 mm.
Orientation estimates demonstrated excellent temporal precision but exhibited expected spatial variation. Table 2 presents orientation statistics from 90 s stationary periods across all 30 grid points, showing both spatial mean values (variation across grid points) and temporal precision (mean standard deviation within each 90 s period).
Estimates for ω clustered tightly around 89.6°, demonstrating excellent agreement with the nominal 90° pitch angle for the tilted smartphone configuration. Measurements for κ centered at −0.4°, closely matching the expected 0° roll angle for landscape orientation. Values for ϕ ranged from −22.8° to 10.6°, with a spatial standard deviation of 8.3°—substantially larger variation than ω and κ. This variation reflects physical heading differences resulting from manual rig placement at each grid point rather than measurement instability, as visual alignment without indexed mounting hardware introduces heading variability.
Temporal standard deviations, averaged across all 30 grid points, demonstrated excellent measurement stability at individual locations: 0.07° ± 0.05° for ω, 0.12° ± 0.09° for ϕ, and 0.08° ± 0.07° for κ. The temporal precision of ϕ—two orders of magnitude better than its spatial variation—confirms that the system accurately tracks heading changes, with the observed spatial variation reflecting real differences in placement rather than measurement instability.

3.3. Optimal Duration Analysis

3.3.1. First-Window Analysis

First-window analysis examined precision achievable from immediate data collection using windows extracted from the beginning of each stationary period. Figure 5 presents the mean 3D precision and individual Cardan angle precisions across all 30 grid points as a function of collection duration.
Mean position precision improved from 10.7 mm at 0.5 s to 3.6 mm at 30 s, demonstrating the expected inverse relationship between collection duration and measurement uncertainty. However, mean precision did not achieve the 3 mm threshold until between 45 s and 60 s. More critically, the worst-case precision across all grid points remained above 6.5 mm even after 90 s, revealing that some locations exhibited persistent high variability.
For the Cardan angles, ω precision exhibited the fastest convergence and tightest distribution, with maximum values below 0.8° for all durations. Precision for κ demonstrated rapid convergence, with maximum values falling below the 1° threshold by approximately 4–5 s. Precision for ϕ showed the slowest convergence among the three angles, with maximum values requiring approximately 11–12 s to fall below 1°.

3.3.2. Sliding-Window Threshold Analysis

Sliding-window threshold analysis investigated whether allowing for an initial settling period could improve precision. Table 3 compares position precision achieved by immediate first windows versus windows positioned after threshold conditions were met.
The analysis demonstrated 100% success rates for all window durations up to 30 s, indicating that the 3 mm threshold could be achieved at all grid points given sufficient settling time. While first windows of 1 s achieved 11.1 mm mean position precision, the same 1 s window achieved 2.5 mm precision after a median wait time of only 0.60 s, confirming the presence of an initial settling period during which measurement precision stabilizes.
Among durations achieving 100% success, a clear trade-off emerged between wait time and collection duration. Shorter windows (0.5 s to 1 s) required median wait times of about 0.6 s but yielded median total collection times of 1.60 s or less. Longer windows (10 s to 30 s) required minimal median wait times but extended total collection times to 10 s or more. The maximum wait times revealed substantial spatial variability, with some grid locations requiring considerably longer settling periods than others.
Orientation measurements demonstrated fundamentally different convergence behavior. Table 4 presents corresponding sliding-window results for orientation precision, showing that all three Cardan angles achieved 100% success rates across all window durations, with median wait times of 0.00 s and maximum wait times of 0.30 s or less for durations up to 5 s. Near-immediate convergence confirmed that orientation estimates stabilized substantially faster than position estimates, establishing position as the critical bottleneck in data collection protocol optimization.
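The sliding-window search behind Tables 3 and 4 can be sketched as follows: the window start is advanced epoch by epoch until the window's combined standard deviation satisfies the threshold, and the offset of that window is the reported wait time. This is a hypothetical reimplementation under the same 20 Hz rate and 3 mm threshold; the authors' exact search may differ.

```python
import numpy as np

def sliding_window_wait(xyz, window_s=1.0, threshold_mm=3.0,
                        rate_hz=20.0, step_s=0.05):
    """Earliest window of length window_s whose 3D precision meets
    threshold_mm; returns (wait_time_s, precision_mm), or None on failure."""
    n = int(round(window_s * rate_hz))
    step = max(1, int(round(step_s * rate_hz)))  # one epoch at 20 Hz
    for start in range(0, len(xyz) - n + 1, step):
        sigma = xyz[start:start + n].std(axis=0, ddof=1)
        prec = float(np.sqrt((sigma ** 2).sum()))
        if prec <= threshold_mm:
            return start / rate_hz, prec
    return None  # threshold never met within this stationary period
```

Total collection time per station is then `wait_time_s + window_s`, which is how the median and maximum collection times in the tables combine the two quantities.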

3.3.3. Optimal Duration Recommendation

Based on this analysis, a 1 s duration was selected as optimal for the photogrammetric workflow. This duration achieved 100% success for position estimates, with a median total collection time of 1.60 s and a worst-case collection time of 2.50 s, while providing sufficient samples (20 epochs at 20 Hz) for robust precision estimation. Orientation measurements required no additional wait time at this duration, confirming that the 1 s window captured stable estimates for both position and orientation. Shorter windows, while faster, provided fewer samples for precision calculations.
To demonstrate the effectiveness of the optimized protocol, 95% confidence regions were computed for both immediate 1 s collection without settling and 1 s collection after threshold-based settling.
Figure 6 shows confidence regions for immediate 1 s windows collected from the start of each stationary period. The horizontal confidence ellipses show highly variable and enlarged uncertainty regions, with a mean semi-major axis length of 27.0 mm and a maximum of 72.3 mm. Vertical confidence intervals similarly demonstrate poor precision, with a mean interval of 7.1 mm and a maximum of 19.5 mm. The elongated ellipses indicate strongly directional uncertainty patterns, suggesting that the system has not yet stabilized to provide uniform precision across measurement axes.
The 95% confidence regions for the optimized approach using 1 s windows positioned after threshold-based settling are given in Figure 7. Horizontal confidence ellipses demonstrated a mean semi-major axis length of 5.3 mm and a maximum of 7.1 mm—comparable to the 90 s extended collection shown in Figure 4. The vertical confidence intervals averaged 1.7 mm with a maximum of 4.5 mm, nearly identical to the 90 s results (2.0 mm mean, 4.5 mm maximum). The dramatic reduction compared to Figure 6 demonstrates that the settling period addresses initialization noise rather than fundamental geometric limitations.
This comparison demonstrates that the initial settling period, rather than extended averaging, is the critical factor for achieving precise measurements. The optimized 1 s protocol captures steady-state performance while dramatically reducing collection time, enabling efficient large-scale surveys without compromising data quality.
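The 95% horizontal confidence ellipses compared above can be constructed from the XY sample covariance by the standard eigendecomposition, scaled by the 95% chi-square quantile with two degrees of freedom. The sketch below is our own formulation under that assumption; the paper does not state its exact construction.

```python
import numpy as np

CHI2_95_2DOF = 5.991  # 95% quantile of the chi-square distribution, 2 dof

def horizontal_confidence_ellipse(xy):
    """95% confidence ellipse of the XY samples.

    xy: (N, 2) array of horizontal positions (mm).
    Returns (semi_major_mm, semi_minor_mm, azimuth_deg of the major axis).
    """
    cov = np.cov(xy, rowvar=False)          # 2x2 sample covariance
    evals, evecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    semi_minor, semi_major = np.sqrt(np.maximum(evals, 0.0) * CHI2_95_2DOF)
    major_axis = evecs[:, 1]                # eigenvector of largest eigenvalue
    azimuth = np.degrees(np.arctan2(major_axis[0], major_axis[1]))
    return float(semi_major), float(semi_minor), float(azimuth)
```

Strongly elongated ellipses (large ratio of semi-major to semi-minor axis), as in Figure 6, indicate directional uncertainty; near-circular ellipses, as in Figure 7, indicate uniform precision across the horizontal axes.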

3.4. Observation Uncertainty Quantification

Table 5 presents per-component precision statistics calculated from the 1 s threshold-qualified windows at each of the 30 grid points. For practical implementation in photogrammetric software, the RMS values across all grid points provide conservative yet realistic uncertainty estimates. For position, these uncertainties (1.6 mm in X, 1.7 mm in Y, and 1.1 mm in Z) represent the achievable precision using the optimized 1 s collection protocol with appropriate settling time and are consistent with the manufacturer’s 1.5 mm specification. For orientation, the uncertainties (0.08° for ω , 0.09° for ϕ , and 0.07° for κ ) demonstrate sub-degree precision.
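Collapsing the 30 per-grid-point standard deviations into a single conservative value per axis amounts to taking their root mean square, which can be sketched as:

```python
import numpy as np

def per_component_rms(sigmas):
    """RMS of per-grid-point standard deviations, one value per axis.

    sigmas: (n_points, n_components) array, e.g. 30 x 3 for X/Y/Z in mm.
    The RMS exceeds the plain mean whenever the per-point values vary,
    which is what makes it a conservative single-number summary.
    """
    s = np.asarray(sigmas, dtype=float)
    return np.sqrt((s ** 2).mean(axis=0))
```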

4. Discussion

This research characterized the measurement precision and temporal stability of the ZeroKey Quantum RTLS to establish optimal data collection protocols and determine whether its repeatability characteristics justify subsequent accuracy testing.
The automated stationary detection algorithm achieved 100% success for Profile 2 (Tooltip) across all observation durations, demonstrating that the envisioned walk-around workflow with brief pauses is technically feasible. Both the stationary detection and optimal duration analyses converge on the same conclusion: shorter data collection periods are superior for the Quantum RTLS. Just 1 s of data collection achieves both the 3 mm position precision threshold (after brief settling) and the 1° orientation precision threshold. The critical role of the settling period (median 0.6 s, maximum 1.5 s) rather than extended averaging indicates that operational workflows should allow the system to stabilize after operator motion ceases, after which measurements can be acquired rapidly.
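A position-based stationary detector of the kind described can be sketched as below. This is a simplified illustration using the 20 mm threshold; the authors' algorithm may differ in details such as how the reference position is updated.

```python
import numpy as np

def detect_stationary_periods(xyz, times, threshold_mm=20.0, min_duration_s=0.5):
    """Flag stationary periods as runs of epochs staying within threshold_mm
    of an anchor position.

    xyz: (N, 3) positions (mm); times: (N,) epoch timestamps (s).
    Returns a list of (first_epoch_index, last_epoch_index) pairs.
    """
    periods = []
    start, anchor = 0, xyz[0]
    for i in range(1, len(xyz)):
        if np.linalg.norm(xyz[i] - anchor) > threshold_mm:
            # motion detected: close the current run if long enough
            if times[i - 1] - times[start] >= min_duration_s:
                periods.append((start, i - 1))
            start, anchor = i, xyz[i]
    if times[-1] - times[start] >= min_duration_s:
        periods.append((start, len(xyz) - 1))
    return periods
```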
The quantified per-component uncertainties provide the stochastic modeling parameters necessary for incorporating RTLS measurements as weighted constraints in photogrammetric bundle adjustment. The system’s millimeter-level position precision and sub-degree orientation precision demonstrate measurement repeatability appropriate for close-range photogrammetric applications. However, the relationship between these observation uncertainties and final reconstruction accuracy cannot be predicted without bundle adjustment testing.
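As one hypothetical way the quantified uncertainties could seed a stochastic model, the Table 5 RMS values can be turned into an a priori weight matrix for one camera station's pose constraints (weights as inverse variances, angles converted to radians); this is a sketch of the general principle, not the authors' adjustment software.

```python
import numpy as np

# RMS uncertainties from Table 5 (assumed here as the a priori sigmas)
SIGMA_POS_MM = np.array([1.6, 1.7, 1.1])      # X, Y, Z
SIGMA_ANG_DEG = np.array([0.08, 0.09, 0.07])  # omega, phi, kappa

def pose_weight_matrix():
    """6x6 diagonal weight matrix P = diag(1 / sigma^2) for one station,
    positions in mm and angles in radians."""
    sigma = np.concatenate([SIGMA_POS_MM, np.radians(SIGMA_ANG_DEG)])
    return np.diag(1.0 / sigma ** 2)
```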
This study characterized RTLS precision under a single anchor network configuration; sensitivity to anchor geometry warrants investigation in future work. Future work will also assess whether RTLS constraints yield accurate photogrammetric reconstructions, comparing results against traditional GCP-based approaches.

Author Contributions

Conceptualization, D.D.L. and F.N.; methodology, F.N. and D.D.L.; software, F.N.; validation, D.D.L.; formal analysis, F.N. and D.D.L.; investigation, F.N.; resources, D.D.L.; data curation, F.N.; writing—original draft preparation, F.N.; writing—review and editing, D.D.L. and F.N.; visualization, F.N.; supervision, D.D.L.; project administration, D.D.L.; funding acquisition, D.D.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Natural Sciences and Engineering Research Council of Canada (NSERC), grant numbers RGPIN-2018-03775 and RGPIN-2024-03793. The APC was funded by NSERC (grant number RGPIN-2024-03793).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study can be made available upon request from the corresponding author.

Acknowledgments

The authors gratefully acknowledge ZeroKey Inc. for providing the Quantum RTLS hardware used in this study.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
BLE: Bluetooth Low Energy
GCP: ground control point
GNSS: global navigation satellite system
IMU: inertial measurement unit
IPS: indoor positioning system
ISO: integrated sensor orientation
RMS: root mean square
RTLS: real-time location system
ToF: time-of-flight
UAV: unmanned aerial vehicle
UWB: ultra-wideband

Appendix A

Appendix A.1. IMU Profile 1 Results

Figure A1. IMU Profile 1 95% confidence regions from 90 s stationary periods at all 30 grid points: (a) Horizontal confidence ellipses showing XY precision across the tracked area; (b) Vertical confidence intervals showing Z precision for each grid point.
Table A1. IMU Profile 1 orientation statistics from 90 s stationary periods, showing spatial mean values (variation across 30 grid points) and temporal precision (mean standard deviation within each stationary period) for ω , ϕ , and κ Cardan angles.
Angle | Spatial Mean (°) | Minimum (°) | Maximum (°) | Temporal Precision (°)
ω | 92.0 ± 0.3 | 91.3 | 92.4 | 0.05 ± 0.05
ϕ | −5.1 ± 4.3 | −11.9 | 6.1 | 0.16 ± 0.15
κ | −0.2 ± 0.5 | −0.9 | 1.0 | 0.05 ± 0.07
Figure A2. IMU Profile 1 running precision versus collection duration: (a) 3D position precision; (b) ω precision; (c) ϕ precision; (d) κ precision.
Table A2. IMU Profile 1 sliding-window position precision and wait time statistics for collection durations ranging from 0.25 s to 60 s.
Time (s) | Initial Precision (mm) | Success Rate (%) | Median Wait (s) | Maximum Wait (s) | Median Collection Time (s) | Maximum Collection Time (s) | Final Precision (mm)
0.25 | 11.2 | 100 | 0.65 | 2.25 | 0.90 | 2.50 | 2.4
0.50 | 15.7 | 100 | 0.90 | 2.25 | 1.40 | 2.75 | 2.5
1 | 17.7 | 100 | 1.15 | 2.45 | 2.15 | 3.45 | 2.7
2 | 16.4 | 100 | 1.10 | 2.35 | 3.10 | 4.35 | 2.8
3 | 14.4 | 100 | 1.00 | 2.30 | 4.00 | 5.30 | 2.8
5 | 11.7 | 100 | 0.90 | 2.05 | 5.90 | 7.05 | 2.8
10 | 8.6 | 100 | 0.75 | 1.95 | 10.75 | 11.95 | 2.8
20 | 6.1 | 100 | 0.55 | 1.85 | 20.55 | 21.85 | 2.8
30 | 5.0 | 100 | 0.30 | 1.65 | 30.30 | 31.65 | 2.7
45 | 4.1 | 100 | 0.00 | 1.25 | 45.00 | 46.25 | 2.6
60 | 3.6 | 100 | 0.00 | 0.70 | 60.00 | 60.70 | 2.5
Table A3. IMU Profile 1 sliding-window orientation precision and wait time statistics for collection durations ranging from 0.25 s to 60 s.
Time (s) | Success Rate (%) | Median Wait (s) | Maximum Wait (s) | Median Collection Time (s) | Maximum Collection Time (s)
0.25 | 100 | 0.00 | 0.10 | 0.25 | 0.35
0.50 | 100 | 0.00 | 0.45 | 0.50 | 0.95
1 | 100 | 0.00 | 0.40 | 1.00 | 1.40
2 | 100 | 0.00 | 1.85 | 2.00 | 3.85
3 | 100 | 0.00 | 1.80 | 3.00 | 4.80
5 | 100 | 0.00 | 1.75 | 5.00 | 6.75
10 | 100 | 0.00 | 1.60 | 10.00 | 11.60
20 | 100 | 0.00 | 1.30 | 20.00 | 21.30
30 | 100 | 0.00 | 1.00 | 30.00 | 31.00
45 | 100 | 0.00 | 0.60 | 45.00 | 45.60
60 | 100 | 0.00 | 0.00 | 60.00 | 60.00
Figure A3. IMU Profile 1 95% confidence regions from immediate 1 s data collection at all 30 grid points without settling period: (a) Horizontal confidence ellipses showing XY precision across the tracked area; (b) Vertical confidence intervals showing Z precision for each grid point.
Figure A4. IMU Profile 1 95% confidence regions from optimized 1 s data collection at all 30 grid points after threshold-based settling: (a) Horizontal confidence ellipses showing XY precision across the tracked area; (b) Vertical confidence intervals showing Z precision for each grid point.
Table A4. IMU Profile 1 per-component precision statistics for position and orientation measurements from optimized 1 s data collection across 30 grid points.
Position (mm)
Axis | Minimum | Maximum | Mean | RMS
X | 0.3 | 2.8 | 1.3 | 1.5
Y | 0.2 | 2.8 | 1.8 | 2.0
Z | 0.2 | 2.6 | 0.9 | 1.0
Orientation (°)
Angle | Minimum | Maximum | Mean | RMS
ω | 0.01 | 0.17 | 0.05 | 0.07
ϕ | 0.01 | 0.13 | 0.06 | 0.06
κ | 0.00 | 0.03 | 0.01 | 0.02

Appendix A.2. IMU Profile 3 Results

Table A5. IMU Profile 3 orientation statistics from 90 s stationary periods, showing spatial mean values (variation across 30 grid points) and temporal precision (mean standard deviation within each stationary period) for ω , ϕ , and κ Cardan angles.
Angle | Spatial Mean (°) | Minimum (°) | Maximum (°) | Temporal Precision (°)
ω | 90.8 ± 0.6 | 89.8 | 91.7 | 0.04 ± 0.07
ϕ | −4.7 ± 6.9 | −16.9 | 13.9 | 0.16 ± 0.23
κ | −0.1 ± 0.5 | −1.2 | 1.0 | 0.02 ± 0.04
Figure A5. IMU Profile 3 95% confidence regions from 90 s stationary periods at all 30 grid points: (a) Horizontal confidence ellipses showing XY precision across the tracked area; (b) Vertical confidence intervals showing Z precision for each grid point.
Figure A6. IMU Profile 3 running precision versus collection duration: (a) 3D position precision; (b) ω precision; (c) ϕ precision; (d) κ precision.
Table A6. IMU Profile 3 sliding-window position precision and wait time statistics for collection durations ranging from 0.25 s to 60 s.
Time (s) | Initial Precision (mm) | Success Rate (%) | Median Wait (s) | Maximum Wait (s) | Median Collection Time (s) | Maximum Collection Time (s) | Final Precision (mm)
0.25 | 9.6 | 100 | 0.47 | 1.80 | 0.72 | 2.05 | 2.4
0.50 | 11.7 | 100 | 0.75 | 2.40 | 1.25 | 2.90 | 2.6
1 | 14.8 | 100 | 1.02 | 3.40 | 2.02 | 4.40 | 2.6
2 | 14.9 | 100 | 1.05 | 3.40 | 3.05 | 5.40 | 2.6
3 | 13.7 | 100 | 1.02 | 3.40 | 4.02 | 6.40 | 2.7
5 | 11.4 | 100 | 0.97 | 10.65 | 5.97 | 15.65 | 2.7
10 | 8.9 | 100 | 0.80 | 10.45 | 10.80 | 20.45 | 2.7
20 | 6.7 | 100 | 0.53 | 9.95 | 20.53 | 29.95 | 2.7
30 | 5.5 | 100 | 0.35 | 9.85 | 30.35 | 39.85 | 2.6
45 | 4.6 | 100 | 0.20 | 9.75 | 45.20 | 54.75 | 2.5
60 | 4.0 | 100 | 0.05 | 9.65 | 60.05 | 69.65 | 2.5
Table A7. IMU Profile 3 sliding-window orientation precision and wait time statistics for collection durations ranging from 0.25 s to 60 s.
Time (s) | Success Rate (%) | Median Wait (s) | Maximum Wait (s) | Median Collection Time (s) | Maximum Collection Time (s)
0.25 | 100 | 0.00 | 0.10 | 0.25 | 0.35
0.50 | 100 | 0.00 | 0.30 | 0.50 | 0.80
1 | 100 | 0.00 | 0.30 | 1.00 | 1.30
2 | 100 | 0.00 | 1.00 | 2.00 | 3.00
3 | 100 | 0.00 | 2.60 | 3.00 | 5.60
5 | 100 | 0.00 | 6.30 | 5.00 | 11.30
10 | 100 | 0.00 | 6.15 | 10.00 | 16.15
20 | 100 | 0.00 | 5.95 | 20.00 | 25.95
30 | 100 | 0.00 | 5.75 | 30.00 | 35.75
45 | 100 | 0.00 | 5.40 | 45.00 | 50.40
60 | 100 | 0.00 | 5.05 | 60.00 | 65.05
Table A8. IMU Profile 3 per-component precision statistics for position and orientation measurements from optimized 1 s data collection across 30 grid points.
Position (mm)
Axis | Minimum | Maximum | Mean | RMS
X | 0.0 | 2.7 | 1.5 | 1.7
Y | 0.0 | 2.6 | 1.7 | 1.8
Z | 0.0 | 2.6 | 0.8 | 1.0
Orientation (°)
Angle | Minimum | Maximum | Mean | RMS
ω | 0.00 | 0.10 | 0.03 | 0.04
ϕ | 0.00 | 0.08 | 0.03 | 0.03
κ | 0.00 | 0.05 | 0.01 | 0.02
Figure A7. IMU Profile 3 95% confidence regions from immediate 1 s data collection at all 30 grid points without settling period: (a) Horizontal confidence ellipses showing XY precision across the tracked area; (b) Vertical confidence intervals showing Z precision for each grid point.
Figure A8. IMU Profile 3 95% confidence regions from optimized 1 s data collection at all 30 grid points after threshold-based settling: (a) Horizontal confidence ellipses showing XY precision across the tracked area; (b) Vertical confidence intervals showing Z precision for each grid point.

Appendix A.3. IMU Profile 4 Results

Figure A9. IMU Profile 4 95% confidence regions from 90 s stationary periods at all 30 grid points: (a) Horizontal confidence ellipses showing XY precision across the tracked area; (b) Vertical confidence intervals showing Z precision for each grid point.
Table A9. IMU Profile 4 orientation statistics from 90 s stationary periods, showing spatial mean values (variation across 30 grid points) and temporal precision (mean standard deviation within each stationary period) for ω , ϕ , and κ Cardan angles.
Angle | Spatial Mean (°) | Minimum (°) | Maximum (°) | Temporal Precision (°)
ω | 90.8 ± 0.9 | 88.8 | 92.1 | 0.04 ± 0.06
ϕ | −6.5 ± 5.0 | −14.2 | 2.5 | 0.23 ± 0.16
κ | −0.6 ± 0.6 | −1.5 | 0.3 | 0.04 ± 0.06
Figure A10. IMU Profile 4 running precision versus collection duration: (a) 3D position precision; (b) ω precision; (c) ϕ precision; (d) κ precision.
Table A10. IMU Profile 4 sliding-window position precision and wait time statistics for collection durations ranging from 0.25 s to 60 s.
Time (s) | Initial Precision (mm) | Success Rate (%) | Median Wait (s) | Maximum Wait (s) | Median Collection Time (s) | Maximum Collection Time (s) | Final Precision (mm)
0.25 | 10.9 | 100 | 0.55 | 1.60 | 0.80 | 1.85 | 2.4
0.50 | 13.4 | 100 | 1.00 | 2.70 | 1.50 | 3.20 | 2.5
1 | 18.1 | 100 | 1.15 | 2.70 | 2.15 | 3.70 | 2.6
2 | 17.3 | 100 | 1.15 | 2.65 | 3.15 | 4.65 | 2.7
3 | 4.9 | 100 | 1.10 | 2.60 | 4.10 | 5.60 | 2.7
5 | 12.0 | 100 | 1.10 | 2.40 | 6.10 | 7.40 | 2.8
10 | 8.7 | 100 | 1.00 | 2.00 | 11.00 | 12.00 | 2.7
20 | 6.3 | 100 | 0.75 | 1.90 | 20.75 | 21.90 | 2.7
30 | 5.2 | 100 | 0.50 | 1.75 | 30.50 | 31.75 | 2.8
45 | 4.4 | 100 | 0.35 | 1.45 | 45.35 | 46.45 | 2.7
60 | 3.8 | 100 | 0.20 | 1.25 | 60.20 | 61.25 | 2.7
Table A11. IMU Profile 4 sliding-window orientation precision and wait time statistics for collection durations ranging from 0.25 s to 60 s.
Time (s) | Success Rate (%) | Median Wait (s) | Maximum Wait (s) | Median Collection Time (s) | Maximum Collection Time (s)
0.25 | 100 | 0.00 | 0.15 | 0.25 | 0.40
0.50 | 100 | 0.00 | 0.75 | 0.50 | 1.25
1 | 100 | 0.00 | 0.75 | 1.00 | 1.75
2 | 100 | 0.00 | 1.35 | 2.00 | 3.35
3 | 100 | 0.00 | 1.30 | 3.00 | 4.30
5 | 100 | 0.00 | 1.70 | 5.00 | 6.70
10 | 100 | 0.00 | 1.40 | 10.00 | 10.40
20 | 100 | 0.00 | 0.75 | 20.00 | 20.75
30 | 100 | 0.00 | 0.55 | 30.00 | 30.55
45 | 100 | 0.00 | 0.20 | 45.00 | 45.20
60 | 100 | 0.00 | 0.05 | 60.00 | 60.05
Figure A11. IMU Profile 4 95% confidence regions from immediate 1 s data collection at all 30 grid points without settling period: (a) Horizontal confidence ellipses showing XY precision across the tracked area; (b) Vertical confidence intervals showing Z precision for each grid point.
Figure A12. IMU Profile 4 95% confidence regions from optimized 1 s data collection at all 30 grid points after threshold-based settling: (a) Horizontal confidence ellipses showing XY precision across the tracked area; (b) Vertical confidence intervals showing Z precision for each grid point.
Table A12. IMU Profile 4 per-component precision statistics for position and orientation measurements from optimized 1 s data collection across 30 grid points.
Position (mm)
Axis | Minimum | Maximum | Mean | RMS
X | 0.1 | 2.9 | 1.4 | 1.6
Y | 0.2 | 2.9 | 1.6 | 1.8
Z | 0.1 | 2.4 | 0.9 | 1.1
Orientation (°)
Angle | Minimum | Maximum | Mean | RMS
ω | 0.00 | 0.06 | 0.02 | 0.03
ϕ | 0.00 | 0.26 | 0.11 | 0.13
κ | 0.00 | 0.08 | 0.02 | 0.02

Appendix A.4. IMU Profile 5 Results

Figure A13. IMU Profile 5 95% confidence regions from 90 s stationary periods at all 30 grid points: (a) Horizontal confidence ellipses showing XY precision across the tracked area; (b) Vertical confidence intervals showing Z precision for each grid point.
Table A13. IMU Profile 5 orientation statistics from 90 s stationary periods, showing spatial mean values (variation across 30 grid points) and temporal precision (mean standard deviation within each stationary period) for ω , ϕ , and κ Cardan angles.
Angle | Spatial Mean (°) | Minimum (°) | Maximum (°) | Temporal Precision (°)
ω | 90.2 ± 0.7 | 88.4 | 91.4 | 0.05 ± 0.08
ϕ | −6.6 ± 5.3 | −19.5 | 3.5 | 0.12 ± 0.11
κ | 0.1 ± 0.7 | −0.7 | 2.1 | 0.06 ± 0.06
Figure A14. IMU Profile 5 running precision versus collection duration: (a) 3D position precision; (b) ω precision; (c) ϕ precision; (d) κ precision.
Table A14. IMU Profile 5 sliding-window position precision and wait time statistics for collection durations ranging from 0.25 s to 60 s.
Time (s) | Initial Precision (mm) | Success Rate (%) | Median Wait (s) | Maximum Wait (s) | Median Collection Time (s) | Maximum Collection Time (s) | Final Precision (mm)
0.25 | 12.2 | 100 | 0.78 | 1.90 | 1.03 | 2.15 | 2.4
0.50 | 16.2 | 100 | 1.05 | 2.65 | 1.55 | 3.15 | 2.4
1 | 19.8 | 100 | 1.15 | 2.60 | 2.15 | 3.60 | 2.7
2 | 18.5 | 100 | 1.12 | 3.70 | 3.12 | 5.70 | 2.7
3 | 16.6 | 100 | 1.10 | 3.70 | 4.10 | 6.70 | 2.7
5 | 14.5 | 100 | 1.08 | 3.65 | 6.08 | 8.65 | 2.7
10 | 10.9 | 100 | 0.95 | 3.55 | 10.95 | 13.55 | 2.8
20 | 8.1 | 100 | 0.75 | 3.40 | 20.75 | 23.40 | 2.8
30 | 6.7 | 100 | 0.57 | 3.35 | 30.57 | 33.35 | 2.8
45 | 5.6 | 100 | 0.42 | 3.30 | 45.42 | 48.30 | 2.7
60 | 4.9 | 100 | 0.10 | 3.10 | 60.10 | 63.10 | 2.6
Table A15. IMU Profile 5 sliding-window orientation precision and wait time statistics for collection durations ranging from 0.25 s to 60 s.
Time (s) | Success Rate (%) | Median Wait (s) | Maximum Wait (s) | Median Collection Time (s) | Maximum Collection Time (s)
0.25 | 100 | 0.00 | 0.20 | 0.25 | 0.45
0.50 | 100 | 0.00 | 0.30 | 0.50 | 0.80
1 | 100 | 0.00 | 0.80 | 1.00 | 1.80
2 | 100 | 0.00 | 0.75 | 2.00 | 2.75
3 | 100 | 0.00 | 0.70 | 3.00 | 3.70
5 | 100 | 0.00 | 2.55 | 5.00 | 7.55
10 | 100 | 0.00 | 1.90 | 10.00 | 11.90
20 | 100 | 0.00 | 0.00 | 20.00 | 20.00
30 | 100 | 0.00 | 0.00 | 30.00 | 30.00
45 | 100 | 0.00 | 0.00 | 45.00 | 45.00
60 | 100 | 0.00 | 0.00 | 60.00 | 60.00
Figure A15. IMU Profile 5 95% confidence regions from immediate 1 s data collection at all 30 grid points without settling period: (a) Horizontal confidence ellipses showing XY precision across the tracked area; (b) Vertical confidence intervals showing Z precision for each grid point.
Figure A16. IMU Profile 5 95% confidence regions from optimized 1 s data collection at all 30 grid points after threshold-based settling: (a) Horizontal confidence ellipses showing XY precision across the tracked area; (b) Vertical confidence intervals showing Z precision for each grid point.
Table A16. IMU Profile 5 per-component precision statistics for position and orientation measurements from optimized 1 s data collection across 30 grid points.
Position (mm)
Axis | Minimum | Maximum | Mean | RMS
X | 0.1 | 3.0 | 1.2 | 1.4
Y | 0.1 | 2.9 | 1.9 | 2.1
Z | 0.1 | 2.6 | 0.9 | 1.1
Orientation (°)
Angle | Minimum | Maximum | Mean | RMS
ω | 0.00 | 0.19 | 0.07 | 0.09
ϕ | 0.02 | 0.52 | 0.07 | 0.12
κ | 0.00 | 0.05 | 0.02 | 0.02

References

  1. Ackermann, F. Practical Experience with GPS Supported Aerial Triangulation. Photogramm. Rec. 1994, 14, 860–874. [Google Scholar] [CrossRef]
  2. Lucas, J.R. Aerotriangulation without Ground Control. Photogramm. Eng. Remote Sens. 1987, 53, 311–314. [Google Scholar]
  3. Schwarz, K.; Chapman, M.; Cannon, E.; Gong, P. An Integrated INS/GPS Approach to the Georeferencing of Remotely Sensed Data. Photogramm. Eng. Remote Sens. 1993, 59, 1667–1674. [Google Scholar]
  4. Cramer, M. Direct Geocoding—Is Aerial Triangulation Obsolete? Herbert Wichmann Verlag: Heidelberg, Germany, 1999; pp. 59–70. [Google Scholar]
  5. Zafari, F.; Gkelias, A.; Leung, K.K. A Survey of Indoor Localization Systems and Technologies. IEEE Commun. Surv. Tutor. 2019, 21, 2568–2599. [Google Scholar] [CrossRef]
  6. Sartayeva, Y.; Chan, H.C.B.; Ho, Y.H.; Chong, P.H.J. A Survey of Indoor Positioning Systems Based on a Six-Layer Model. Comput. Netw. 2023, 237, 110042. [Google Scholar] [CrossRef]
  7. Liu, H.; Darabi, H.; Banerjee, P.; Liu, J. Survey of Wireless Indoor Positioning Techniques and Systems. IEEE Trans. Syst. Man Cybern. Part C (Appl. Rev.) 2007, 37, 1067–1080. [Google Scholar] [CrossRef]
  8. Alarifi, A.; Al-Salman, A.; Alsaleh, M.; Alnafessah, A.; Al-Hadhrami, S.; Al-Ammar, M.A.; Al-Khalifa, H.S. Ultra Wideband Indoor Positioning Technologies: Analysis and Recent Advances. Sensors 2016, 16, 707. [Google Scholar] [CrossRef] [PubMed]
  9. Qi, J.; Liu, G.-P. A Robust High-Accuracy Ultrasound Indoor Positioning System Based on a Wireless Sensor Network. Sensors 2017, 17, 2554. [Google Scholar] [CrossRef]
  10. Brena, R.F.; García-Vázquez, J.P.; Galván-Tejada, C.E.; Muñoz-Rodriguez, D.; Vargas-Rosales, C.; Fangmeyer, J., Jr. Evolution of Indoor Positioning Technologies: A Survey. J. Sens. 2017, 2017, 2630413. [Google Scholar] [CrossRef]
  11. Zeng, Q.; Liu, D.; Lv, C. UWB/Binocular VO Fusion Algorithm Based on Adaptive Kalman Filter. Sensors 2019, 19, 4044. [Google Scholar] [CrossRef] [PubMed]
  12. Peng, P.; Yu, C.; Xia, Q.; Zheng, Z.; Zhao, K.; Chen, W. An Indoor Positioning Method Based on UWB and Visual Fusion. Sensors 2022, 22, 1394. [Google Scholar] [CrossRef] [PubMed]
  13. Lin, H.-Y.; Zhan, J.-R. GNSS-Denied UAV Indoor Navigation with UWB Incorporated Visual Inertial Odometry. Measurement 2023, 206, 112256. [Google Scholar] [CrossRef]
  14. Jiang, P.; Hu, C.; Wang, T.; Lv, K.; Guo, T.; Jiang, J.; Hu, W. Research on a Visual/Ultra-Wideband Tightly Coupled Fusion Localization Algorithm. Sensors 2024, 24, 1710. [Google Scholar] [CrossRef]
  15. Mu, H.; Yu, C.; Jiang, S.; Luo, Y.; Zhao, K.; Chen, W. Indoor Pedestrian Positioning Method Based on Ultra-Wideband with a Graph Convolutional Network and Visual Fusion. Sensors 2024, 24, 6732. [Google Scholar] [CrossRef]
  16. Park, I.; Cho, S. Fusion Localization for Indoor Airplane Inspection Using Visual Inertial Odometry and Ultrasonic RTLS. Sci. Rep. 2023, 13, 18117. [Google Scholar] [CrossRef]
  17. Fan, G.; Wang, Q.; Yang, G.; Liu, P. RFG-TVIU: Robust Factor Graph for Tightly Coupled Vision/IMU/UWB Integration. Front. Neurorobot 2024, 18, 1343644. [Google Scholar] [CrossRef] [PubMed]
  18. Sadruddin, H.; Mahmoud, A.; Atia, M. An Indoor Navigation System Using Stereo Vision, IMU and UWB Sensor Fusion. In Proceedings of the 2019 IEEE SENSORS, Montreal, QC, Canada, 27–30 October 2019; pp. 1–4.
  19. Lutz, P.; Schuster, M.J.; Steidle, F. Visual-Inertial SLAM Aided Estimation of Anchor Poses and Sensor Error Model Parameters of UWB Radio Modules. In Proceedings of the 2019 19th International Conference on Advanced Robotics (ICAR), Belo Horizonte, Brazil, 2–6 December 2019; pp. 739–746.
  20. Rau, J.-Y.; Habib, A.F.; Kersting, A.P.; Chiang, K.-W.; Bang, K.-I.; Tseng, Y.-H.; Li, Y.-H. Direct Sensor Orientation of a Land-Based Mobile Mapping System. Sensors 2011, 11, 7243–7261.
  21. Kersting, A.P.; Habib, A.; Rau, J. New Method for the Calibration of Multi-Camera Mobile Mapping Systems. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, XXXIX-B1, 121–126.
  22. Ma, H.; Ma, H.; Liu, K.; Luo, W.; Zhang, L. Direct Georeferencing for the Images in an Airborne LiDAR System by Automatic Boresight Misalignments Calibration. Sensors 2020, 20, 5056.
  23. Masiero, A.; Fissore, F.; Vettore, A. A Low Cost UWB Based Solution for Direct Georeferencing UAV Photogrammetry. Remote Sens. 2017, 9, 414.
  24. Mulindwa, D.B. Indoor 3D Reconstruction Using Camera, IMU and Ultrasonic Sensors. J. Sens. Technol. 2020, 10, 15–30.
  25. ZeroKey Spatial Intelligence. Home. Available online: https://zerokey.com/ (accessed on 16 December 2025).
  26. Leskiw, D.C.; Gao, D.G.; Lowe, M. ZeroKey’s Smart Space Indoor Positioning System [White Paper]; ZeroKey Inc.: Calgary, AB, Canada, 2020.
  27. ZeroKey Spatial Intelligence. ZeroKey Info Center. Available online: https://infocentre.zerokey.com/articles/ (accessed on 16 December 2025).
  28. ZeroKey Spatial Intelligence. Changes in the Mobile Body Frame from Quantum RTLS 1.0 to 2.0. Available online: https://infocentre.zerokey.com/articles/changes-in-the-mobile-body-frame-from-quantum-rtls (accessed on 16 December 2025).
  29. ZeroKey Spatial Intelligence. Anchor Layouts for Starter Kits. Available online: https://infocentre.zerokey.com/articles/anchor-layouts-for-starter-kits (accessed on 16 December 2025).
  30. ZeroKey Spatial Intelligence. Calibrate Your System with the ZeroKey Calibration Wizard. Available online: https://infocentre.zerokey.com/articles/how-to-calibrate-your-positioning-system (accessed on 16 December 2025).
  31. ZeroKey Spatial Intelligence. Mobile Training and Mounting Steps. Available online: https://infocentre.zerokey.com/articles/mobile-training-and-mounting-steps (accessed on 16 December 2025).
  32. Markley, F.L.; Cheng, Y.; Crassidis, J.L.; Oshman, Y. Averaging Quaternions. J. Guid. Control Dyn. 2007, 30, 1193–1197.
  33. Luhmann, T.; Robson, S.; Kyle, S.; Boehm, J. Close-Range Photogrammetry and 3D Imaging; De Gruyter: Berlin, Germany, 2013; ISBN 978-3-11-030278-3.
Figure 1. ZeroKey mobile node body frame coordinate system. The origin is located at the center of the ultrasonic transducer, with the positive X-axis pointing toward the action button, the positive Y-axis pointing toward the micro-USB port, and the Z-axis completing a right-handed coordinate system. Adapted from [28].
Figure 2. Perspective view of the laboratory test environment showing anchor node deployment. Eight anchor nodes are distributed around the perimeter and overhead of the 6 m × 8 m test area, visualized using a high-resolution terrestrial laser scanning point cloud.
Figure 3. Anchor network geometry and test grid layout in plan view: (a) Eight anchor nodes distributed around the perimeter of the 6 m × 8 m floor area; (b) Anchor network with regular grid of 30 test points at 1 m spacing covering the central region.
Figure 4. 95% confidence regions from 90 s stationary periods at all 30 grid points: (a) Horizontal confidence ellipses showing XY precision across the tracked area; (b) Vertical confidence intervals showing Z precision for each grid point.
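The horizontal confidence ellipses of Figure 4a follow from the sample covariance of the XY coordinates within each stationary period. The paper does not give its plotting pipeline, so the sketch below is a hedged reconstruction using the standard 2-DOF chi-square quantile (5.991 for 95%); the function name and the synthetic sample are illustrative only.

```python
import numpy as np

def confidence_ellipse_95(xy):
    """Semi-axes (metres) and orientation (degrees) of the 95% confidence
    ellipse for 2D samples (N x 2 array), via the XY sample covariance."""
    cov = np.cov(xy, rowvar=False)          # 2x2 sample covariance
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    k = 5.991                               # chi2.ppf(0.95, df=2)
    semi_minor, semi_major = np.sqrt(k * eigvals)
    # orientation of the major axis (last eigenvector) w.r.t. the X axis
    angle = np.degrees(np.arctan2(eigvecs[1, 1], eigvecs[0, 1]))
    return semi_major, semi_minor, angle

# synthetic fixes with ~1.6 mm / 1.1 mm per-axis scatter (cf. Table 5)
rng = np.random.default_rng(0)
xy = rng.normal(scale=[0.0016, 0.0011], size=(300, 2))
a, b, theta = confidence_ellipse_95(xy)
```

With per-axis standard deviations on the order of 1–2 mm, the resulting semi-axes are a few millimetres, consistent with the ellipses shown in Figure 4a.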
Figure 5. Running precision versus collection duration: (a) 3D position precision; (b) ω precision; (c) ϕ precision; (d) κ precision.
Figure 6. 95% confidence regions from immediate 1 s data collection at all 30 grid points without settling period: (a) Horizontal confidence ellipses showing XY precision across the tracked area; (b) Vertical confidence intervals showing Z precision for each grid point.
Figure 7. 95% confidence regions from optimized 1 s data collection at all 30 grid points after threshold-based settling: (a) Horizontal confidence ellipses showing XY precision across the tracked area; (b) Vertical confidence intervals showing Z precision for each grid point.
Table 1. Stationary detection performance across five inertial measurement unit (IMU) profiles and three observation durations (30 s, 60 s, 90 s), showing number of detected periods, detection accuracy, undetected grid point IDs, and 3D precision statistics.
IMU Profile | Duration (s) | Points Detected | Detection Rate (%) | Undetected Point IDs | Mean Precision (mm) | Minimum Precision (mm) | Maximum Precision (mm)
1 | 30 | 30 | 100 | – | 2.9 | 0.7 | 9.0
1 | 60 | 30 | 100 | – | 2.8 | |
1 | 90 | 27 | 90 | 1, 22, 26 | 2.8 | |
2 | 30 | 30 | 100 | – | 2.4 | 1.2 | 5.9
2 | 60 | 30 | 100 | – | 2.4 | |
2 | 90 | 30 | 100 | – | 2.4 | |
3 | 30 | 30 | 100 | – | 3.3 | 1.0 | 15.5
3 | 60 | 30 | 100 | – | 3.2 | |
3 | 90 | 30 | 100 | – | 3.3 | |
4 | 30 | 30 | 100 | – | 3.3 | 1.5 | 6.3
4 | 60 | 29 | 97 | 6 | 3.2 | |
4 | 90 | 29 | 97 | 6 | 3.2 | |
5 | 30 | 30 | 100 | – | 4.1 | 1.1 | 12.1
5 | 60 | 29 | 97 | 6 | 4.0 | |
5 | 90 | 28 | 93 | 6, 28 | 4.0 | |
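The stationary periods counted in Table 1 come from the position-based detector with a 20 mm threshold described in the abstract. The authors' exact algorithm is not reproduced in this section; the sketch below is one plausible reconstruction in which a period runs while every fix stays within the threshold of the period's first fix, with `min_samples` an illustrative assumption.

```python
import numpy as np

def detect_stationary(positions, threshold=0.020, min_samples=10):
    """Split a track (N x 3 array, metres) into stationary periods.
    A period lasts while each fix stays within `threshold` of the first
    fix of that period (hedged reconstruction, not the authors' code)."""
    periods, start = [], 0
    for i in range(1, len(positions)):
        if np.linalg.norm(positions[i] - positions[start]) > threshold:
            if i - start >= min_samples:
                periods.append((start, i))
            start = i
    if len(positions) - start >= min_samples:
        periods.append((start, len(positions)))
    return periods

# synthetic track: hold at the origin, move 1 m in X, hold again
still1 = np.zeros((50, 3))
move = np.linspace([0.0, 0, 0], [1.0, 0, 0], 20)
still2 = np.tile([1.0, 0, 0], (50, 1))
track = np.vstack([still1, move, still2])
periods = detect_stationary(track)
```

On this synthetic track the detector returns two periods, one per hold, and discards the transit because no run of fixes stays inside the 20 mm ball long enough.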
Table 2. Orientation statistics from 90 s stationary periods, showing spatial mean values (variation across 30 grid points) and temporal precision (mean standard deviation within each stationary period) for ω , ϕ , and κ Cardan angles.
Angle | Spatial Mean (°) | Minimum (°) | Maximum (°) | Temporal Precision (°)
ω | 89.6 ± 0.6 | 88.1 | 90.6 | 0.07 ± 0.05
ϕ | −5.8 ± 8.3 | −22.8 | 10.6 | 0.12 ± 0.09
κ | −0.4 ± 0.7 | −1.5 | 1.2 | 0.08 ± 0.07
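Mean orientations over a stationary period cannot be formed by naively averaging angle components; the reference list includes the quaternion-averaging method of Markley et al. [32], in which the average is the eigenvector of the accumulated outer-product matrix belonging to its largest eigenvalue. A minimal sketch (scalar-last quaternion convention assumed here):

```python
import numpy as np

def average_quaternion(quats):
    """Markley et al. averaging: build M = sum(q q^T) over unit
    quaternions and return the eigenvector of the largest eigenvalue.
    Sign-robust: q and -q contribute identically to M."""
    q = np.asarray(quats, dtype=float)
    M = q.T @ q                        # 4x4 accumulation matrix
    eigvals, eigvecs = np.linalg.eigh(M)
    q_avg = eigvecs[:, -1]             # eigenvector of largest eigenvalue
    return q_avg / np.linalg.norm(q_avg)

# identical rotations reported with opposite signs average consistently
q = np.array([0.0, 0.0, np.sin(0.1), np.cos(0.1)])
q_hat = average_quaternion([q, -q, q])
```

The eigenvector formulation sidesteps the sign ambiguity of unit quaternions, which would corrupt a simple component-wise mean.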
Table 3. Sliding-window position precision and wait time statistics for collection durations ranging from 0.25 s to 60 s.
Time (s) | Initial Precision (mm) | Success Rate (%) | Median Wait (s) | Maximum Wait (s) | Median Collection Time (s) | Maximum Collection Time (s) | Final Precision (mm)
0.25 | 7.8 | 100 | 0.38 | 1.35 | 0.63 | 1.60 | 2.6
0.50 | 10.7 | 100 | 0.57 | 1.55 | 1.07 | 2.05 | 2.3
1 | 11.1 | 100 | 0.60 | 1.50 | 1.60 | 2.50 | 2.5
2 | 9.7 | 100 | 0.68 | 3.05 | 2.68 | 5.05 | 2.7
3 | 8.6 | 100 | 0.60 | 3.00 | 3.60 | 6.00 | 2.7
5 | 7.3 | 100 | 0.55 | 2.95 | 5.55 | 7.95 | 2.7
10 | 5.5 | 100 | 0.45 | 2.80 | 10.45 | 12.80 | 2.7
20 | 4.1 | 100 | 0.17 | 13.35 | 20.17 | 33.35 | 2.6
30 | 3.6 | 100 | 0.03 | 5.45 | 30.03 | 35.45 | 2.6
45 | 3.1 | 97 | – | – | – | – | –
60 | 2.8 | 97 | – | – | – | – | –
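The wait and collection statistics in Table 3 follow from sliding a fixed-duration window along each stationary period and taking the earliest window whose 3D position scatter meets the precision target: the wait is that window's start offset, and the collection time is wait plus duration. The sketch below is an illustrative reconstruction under assumed names; the 3 mm target matches the precision reported for 1 s windows.

```python
import numpy as np

def first_good_window(times, positions, duration, target=0.003):
    """Earliest fixed-duration window whose 3D standard deviation meets
    `target` (metres); returns (wait, collection_time) or None."""
    for i in range(len(times)):
        if times[i] + duration > times[-1] + 1e-9:
            break                          # window would overrun the data
        j = np.searchsorted(times, times[i] + duration, side="right")
        sigma3d = np.sqrt(np.var(positions[i:j], axis=0).sum())
        if sigma3d <= target:
            wait = times[i] - times[0]
            return wait, wait + duration
    return None

# synthetic 20 Hz track: large jitter for the first second, then quiet
t = np.arange(0, 5, 0.05)
rng = np.random.default_rng(1)
pos = rng.normal(scale=0.0005, size=(len(t), 3))
pos[t < 1.0] += rng.normal(scale=0.02, size=(int((t < 1.0).sum()), 3))
res = first_good_window(t, pos, duration=1.0)
```

On this track the first qualifying 1 s window begins once the initial jitter has settled, mirroring the brief settling behaviour reported for the RTLS.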
Table 4. Sliding-window orientation precision and wait time statistics for collection durations ranging from 0.25 s to 60 s.
Time (s) | Success Rate (%) | Median Wait (s) | Maximum Wait (s) | Median Collection Time (s) | Maximum Collection Time (s)
0.25 | 100 | 0.00 | 0.25 | 0.25 | 0.50
0.50 | 100 | 0.00 | 0.30 | 0.50 | 0.80
1 | 100 | 0.00 | 0.15 | 1.00 | 1.15
2 | 100 | 0.00 | 0.15 | 2.00 | 2.15
3 | 100 | 0.00 | 0.20 | 3.00 | 3.20
5 | 100 | 0.00 | 0.15 | 5.00 | 5.15
10 | 100 | 0.00 | 0.05 | 10.00 | 10.05
20 | 100 | 0.00 | 0.00 | 20.00 | 20.00
30 | 100 | 0.00 | 0.00 | 30.00 | 30.00
45 | 100 | 0.00 | 0.00 | 45.00 | 45.00
60 | 100 | 0.00 | 0.00 | 60.00 | 60.00
Table 5. Per-component precision statistics for position and orientation measurements from optimized 1 s data collection across 30 grid points.
Position (mm)
Axis | Minimum | Maximum | Mean | RMS
X | 0.1 | 2.6 | 1.4 | 1.6
Y | 0.1 | 2.6 | 1.6 | 1.7
Z | 0.1 | 2.3 | 0.9 | 1.1
Orientation (°)
Angle | Minimum | Maximum | Mean | RMS
ω | 0.01 | 0.17 | 0.06 | 0.08
ϕ | 0.02 | 0.32 | 0.06 | 0.09
κ | 0.00 | 0.20 | 0.05 | 0.07
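Table 5 reports both the arithmetic mean and the RMS of the per-station standard deviations; the RMS, sqrt(mean(σᵢ²)), weights occasional poor stations more heavily and so never falls below the mean, which is why the RMS column slightly exceeds the mean column on every axis. A short illustration with hypothetical per-station values:

```python
import numpy as np

# per-station standard deviations for one axis (hypothetical values, mm)
sigmas = np.array([1.4, 0.8, 2.1, 1.6, 0.9])

mean_sigma = sigmas.mean()                  # arithmetic mean of sigmas
rms_sigma = np.sqrt(np.mean(sigmas**2))     # root mean square of sigmas
# RMS >= mean always (Jensen's inequality), equality only when all
# stations have identical precision
```

For these hypothetical values the mean is 1.36 mm and the RMS about 1.44 mm, reproducing the small mean-to-RMS gap visible in Table 5.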

