Article

Uncertainty Characterisation of Mobile Robot Localisation Techniques using Optical Surveying Grade Instruments

by Benjamin J. McLoughlin *, Harry A. G. Pointon, John P. McLoughlin, Andy Shaw and Frederic A. Bezombes *
Engineering and Technology Research Institute, Liverpool John Moores University, 3 Byrom St, Liverpool L3 3AF, UK
* Authors to whom correspondence should be addressed.
Sensors 2018, 18(7), 2274; https://doi.org/10.3390/s18072274
Submission received: 15 June 2018 / Revised: 7 July 2018 / Accepted: 11 July 2018 / Published: 13 July 2018
(This article belongs to the Special Issue GNSS and Fusion with Other Sensors)

Abstract
Recent developments in localisation systems for autonomous robotic technology have been a driving factor in the deployment of robots in a wide variety of environments. Estimating sensor measurement noise is an essential factor when producing uncertainty models for state-of-the-art robotic positioning systems. In this paper, a surveying grade optical instrument in the form of a Trimble S7 Robotic Total Station is utilised to dynamically characterise the error of the positioning sensors of a ground-based unmanned robot. The error characteristics are used as inputs into the construction of a Localisation Extended Kalman Filter which fuses Pozyx Ultra-wideband range measurements with odometry to obtain an optimal position estimation, whilst using the path generated from the remote tracking feature of the Robotic Total Station as a ground truth metric. Experiments show that the proposed method yields an improved positional estimation compared to the Pozyx system's native firmware algorithm, as well as producing a smoother trajectory.

1. Introduction

Localisation is a fundamental aspect in the area of mobile robotics. The ability for a robot to relate itself to an operating environment, a global frame of reference and other robotic systems is a key challenge in the development and deployment of autonomous systems.
In order to achieve autonomy, a mobile unmanned robot must be equipped with a localisation or positioning system that consistently and precisely determines robot pose, i.e., position and heading, as it navigates throughout an environment. Localisation methods and techniques are classified into two categories, relative and absolute. Relative localisation methods are generally conducted within the body frame of the platform, through the integration of techniques such as vision based odometry systems without image georeferencing [1,2], dead reckoning via inertial measurement units (IMU) [3,4] and wheel odometry (in the case of ground based systems) to determine the speed of the robotic platform [5,6]. Absolute techniques refer to the localisation of co-ordinate reference frames that are external to the robot, for example the use of independent landmarks [7,8], known correspondence reference points [9] or a Global Navigation Satellite System (GNSS) [10]. However, common implementations include the collaboration of relative and absolute positioning systems [11,12]. This combination is usually integrated within state estimation algorithms such as Particle Filters (PF) and both linear and nonlinear Kalman Filters, where such multi-sensor fusion techniques are designed whilst considering motion models and robot idiosyncrasies [13].
Considering outdoor environments, one notable example and a frequently implemented localisation technique is the use of GNSS; when combined with an inertial navigation system (INS), GNSS is often implemented as a standard technique for many commercial and off-the-shelf unmanned systems. For example, GNSS receivers on unmanned aerial vehicles (UAVs) are often operated in conjunction with high resolution cameras for image georeferencing and photogrammetry techniques [14]. The implementation of the differential variant of GNSS for 3D light detection and ranging (LiDAR) systems for point cloud registration and georeferenced map generation has also been investigated [15]. However, the degraded performance of GNSS in mountainous terrain, in urban locations surrounded by tall structures and in indoor environments has led others to explore alternative localisation techniques [16,17]. For this work, an alternative method for localisation in the form of Ultra-wideband (UWB) is investigated.

1.1. Ultra-Wideband Localisation Systems

UWB is a wireless radio technique that transmits and receives narrow pulses at a nanosecond rate. Unlike conventional radio frequency identification (RFID) based systems that operate on single bands, UWB transmits over a broad spectrum of radio frequencies [18]. The ability to transmit over this extensive bandwidth decreases the power spectral density, therefore enabling UWB based systems to avoid interference with other RF signals [19,20]. Range measurements produced from UWB are achieved from the accurate resolution of the Time of Arrival (TOA), commonly known as Time of Flight (TOF), of a pulsed waveform travelling between a target node (tag) and a set of reference nodes (anchors) [21]. Furthermore, as a result of the large operating bandwidth, the UWB system obtains a high time resolution; therefore, positioning techniques can provide accurate range estimations [21]. Due to the aforementioned capabilities, UWB positioning systems have previously been applied to human tracking [22], sporting scenarios [23] and engineering applications such as the direct geo-referencing of images from an aerial platform [24]. Static characterisation of the range uncertainty behaviour between two static UWB nodes has been previously investigated [25]. In order to calculate the range uncertainty in [25], static range readings were taken at various distances between two nodes and compared to a true distance for error calculation. Although this method allows static range error characterisation, it does not provide a dynamic error reading of the UWB system under the intended operating conditions of the deployed system. To obtain a more accurate judgment of the expected range error, the system would need to be monitored over various types of movement in continuous rather than discrete samples.
Deploying robotic systems with a means to make judgments about, and interact with, the surrounding environment is entirely dependent upon an imperative element of robotics: uncertainty [9]. Uncertainty can arise from sources such as unpredictable environments, actuation factors, inaccurate system modelling and noise due to onboard sensors. For the case of this study, the range uncertainty of a network of sensor nodes from a UWB localisation system is characterised using a surveying grade optical instrument in the form of a Robotic Total Station (RTS).

1.2. Robotic Total Stations

A Trimble S7 RTS (Sunnyvale, CA, USA) was used as the principal method for uncertainty estimation and final system appraisal in the experiments. In normal operation, the RTS system is used in the laying out of building works, the inspection of existing and ongoing building works and the surveying of land. The RTS takes measurement readings through the use of an optical and remote distance sensing unit with an expected accuracy of 2 mm + 2 ppm in standard prism mode and 4 mm + 2 ppm in tracking prism mode, depending on the prism in use [26]. In the case of Electronic Distance Measurement (EDM) devices, the accuracy of a measurement is dependent on a number of factors. However, an estimate of the error may be obtained by applying the stated accuracy specification over an example distance of 1 km, as follows [27]:
$$\pm\left(2\ \mathrm{mm} + \left(2\ \mathrm{ppm} \times 1 \times 10^{6}\ \mathrm{mm}\right)\right) = \pm 4\ \mathrm{mm}.\tag{1}$$
Therefore, the expected measurement error during this study may be taken as the upper bounding error found at the maximum operating distance of 20 m, i.e., $\pm\left(2\ \mathrm{mm} + \left(2\ \mathrm{ppm} \times 2 \times 10^{4}\ \mathrm{mm}\right)\right) = \pm 2.04\ \mathrm{mm}$. The bearing of the target is taken through the use of an absolute encoder whose accuracy may be defined within DIN 18723, the specification for theodolite accuracy from the German Institute for Standards, at 1″ (arc second), or $2.77 \times 10^{-4}$ degrees, standard deviation [26]. The range over which the RTS is capable of tracking depends on the prism or target in use but, for the prism used in this experiment, distances of up to 500 m could be measured.
The main functionality of the UWB system is to dynamically localise the platform on which the mobile tag is mounted. Therefore, utilising an RTS with active target tracking features provides a foundation to characterise the range uncertainty of the robot in its intended operating conditions, as opposed to a static test. Using the RTS as a ground truth comparison metric to dynamically characterise the uncertainty of a UWB sensor network for state estimation system design is an unexplored area of research. There has also been little investigation into the use of such techniques in the field of mobile robot positioning using UWB systems. Work has been conducted that studies the implementation of an RTS system as a ground truth for the assessment of the validity of other localisation systems [28]. Another study to note, conducted by [29], focuses on using UWB for position estimation of a large scale laser scanning unit for 3D Building Information Modeling (BIM). Additionally, a further analysis has been conducted into the viability of using UWB systems to track operatives on construction sites, and this investigation utilised an RTS for the ground truth [30]. The aforementioned investigations demonstrated the viability of UWB in such applications; however, the measurement uncertainty and the lack of state estimation made such systems unsuitable for applications requiring high positional accuracy. Other work carried out using an RTS to actively track unmanned systems consists of the autonomous and remote positioning of a Micro Aerial Vehicle (MAV) using real-time input from the RTS [31]. An RTS has also been deployed as a substitute for GNSS as a measurement input source into an Extended Kalman Filter (EKF) for indoor environments [32].
Therefore, this paper seeks to utilise an RTS as a technique to determine the uncertainty of a UWB positioning system, as well as the uncertainty of a set of quadrature wheel encoders used on a mobile robotic platform. These uncertainty metrics are used as an input into an EKF for recursive state estimation to improve the overall position and orientation estimation, otherwise known as the pose of the robot.

2. State Estimation Formulation

2.1. Problem Formulation

A discrete EKF was used as the sensor fusion object to combine the initial position estimation from a control input with external range measurement updates acquired from the UWB anchors. The EKF calculates a linear approximation for a set of nonlinear functions based on the first-order Taylor expansion [9]. This is achieved through a two-stage iteration process where the future state of the system is predicted based on the current state and a state transition model. This is then corrected using a measurement from an external source. The nonlinear state transition and measurement models for an EKF are described in Equations (2) and (3), where $\hat{X}_k$ and $\hat{Z}_k$ represent the state and measurement vector estimations at time k, the control input into the system is represented as $u_k$, and $w_{k-1}$ and $v_{k-1}$ are the system and measurement noise terms:
$$\hat{X}_k = f(X_{k-1}, u_k; w_{k-1}),\tag{2}$$
$$\hat{Z}_k = h(X_{k-1}, v_{k-1}).\tag{3}$$
For the system implemented in this study, the range measurements acquired from the UWB anchors meant that the system was assessed as a range-based localisation problem. For mobile robots, this problem involves the continuous estimation of the robot's state in terms of its planar Cartesian coordinates and heading, $X = [x\ y\ \phi]^T$ [33], in relation to the reference anchors in the environment, which are assumed to have static positions mapped in $M_{B_i} = [x_{B_i}\ y_{B_i}]$, $i \in [1, \ldots, N_B]$, and which provide a set of range measurements $Z_{B_i} = [r_{B_i}], i \in [1, \ldots, N_B]$, at time k, where $N_B$ is the number of UWB anchors present. The EKF was used to integrate the initial position estimate with the external range measurements acquired from the UWB anchors. The EKF model used for localisation is based on that presented by Thrun et al. [9]. In the case of this study, the initial state prediction was based on systematic mathematical modelling fused with a control odometry input, where quadrature encoders were used to determine the angular displacement of the robot's wheels at time k and a time-synchronised gyroscope was used for heading calculation. The formulations of the state transition and measurement models are presented below.

2.1.1. State Transition Model

An input odometry model proposed by [13] and also adopted by [25] was chosen to formulate the state transition of the robot:
$$\hat{X}_k = f(\hat{X}_{k-1}, u_k),\tag{4}$$
$$f(\hat{X}_{k-1}, u_k) = \hat{X}_{k-1} + \begin{bmatrix} \delta D_k \cos\left(\phi_k + \frac{\delta\phi_k}{2}\right) \\ \delta D_k \sin\left(\phi_k + \frac{\delta\phi_k}{2}\right) \\ \delta\phi_k \end{bmatrix},\tag{5}$$
where $\delta D_k$ is the linear displacement at time k and is calculated using the radii of the wheels $r_L$, $r_R$ and the angular displacements $\delta\theta_{L_k}$ and $\delta\theta_{R_k}$ acquired from the pulses of the encoders:
$$\delta D_k = \frac{r_L\,\delta\theta_{L_k} + r_R\,\delta\theta_{R_k}}{2}.\tag{6}$$
Therefore, with the state vector defined as $X = [x\ y\ \phi]^T$, the input vector is represented as $u_k = [\theta_L\ \theta_R\ \delta\phi]^T$, enabling the system and input Jacobians $G_x$ and $G_u$ to be pre-formulated.
$$G_{x_k} = \frac{\partial f}{\partial x}\left(\hat{X}_k, u_k\right),\tag{7}$$
$$G_{x_k} = \begin{bmatrix} 1 & 0 & -\delta D_k \sin\left(\phi_k + \frac{\delta\phi_k}{2}\right) \\ 0 & 1 & \delta D_k \cos\left(\phi_k + \frac{\delta\phi_k}{2}\right) \\ 0 & 0 & 1 \end{bmatrix},\tag{8}$$
$$G_{u_k} = \frac{\partial f}{\partial u}\left(\hat{X}_k, u_k\right),\tag{9}$$
$$G_{u_k} = \begin{bmatrix} \frac{r_L}{2}\cos\left(\phi_k + \frac{\delta\phi_k}{2}\right) & \frac{r_R}{2}\cos\left(\phi_k + \frac{\delta\phi_k}{2}\right) & -\frac{\delta D_k}{2}\sin\left(\phi_k + \frac{\delta\phi_k}{2}\right) \\ \frac{r_L}{2}\sin\left(\phi_k + \frac{\delta\phi_k}{2}\right) & \frac{r_R}{2}\sin\left(\phi_k + \frac{\delta\phi_k}{2}\right) & \frac{\delta D_k}{2}\cos\left(\phi_k + \frac{\delta\phi_k}{2}\right) \\ 0 & 0 & 1 \end{bmatrix}.\tag{10}$$
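For reference, the state transition function and the Jacobians of Equations (5)–(10) can be derived symbolically and then evaluated numerically on each filter iteration, mirroring the symbolic workflow described at the end of Section 2. The following is a minimal sketch using Python and sympy as a stand-in for the MATLAB Symbolic toolbox used by the authors; the symbol names follow the equations above and the code is illustrative rather than the original implementation.

# Minimal sketch (illustrative, not the authors' MATLAB code): derive the system and
# input Jacobians G_x and G_u of the odometry state transition model symbolically,
# then lambdify them for numeric evaluation inside the filter loop.
import sympy as sp

x, y, phi, dphi, th_L, th_R, r_L, r_R = sp.symbols('x y phi dphi theta_L theta_R r_L r_R')

dD = (r_L * th_L + r_R * th_R) / 2               # Equation (6), th_* being the per-step angular displacements
f = sp.Matrix([x + dD * sp.cos(phi + dphi / 2),  # Equation (5)
               y + dD * sp.sin(phi + dphi / 2),
               phi + dphi])

G_x = f.jacobian([x, y, phi])                    # Equation (8)
G_u = f.jacobian([th_L, th_R, dphi])             # Equation (10)

# Fast numeric evaluation on each filter iteration
G_x_num = sp.lambdify((x, y, phi, dphi, th_L, th_R, r_L, r_R), G_x, 'numpy')
G_u_num = sp.lambdify((x, y, phi, dphi, th_L, th_R, r_L, r_R), G_u, 'numpy')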

2.1.2. Measurement Model

The observed sensor measurements collected from the UWB devices are range measurements representing the Euclidean distances between the static reference anchors $A_{B_i}$ and the estimated position of the robot. At each measurement update, six anchors each provided a range measurement, $Z_{B_i} = [z_1, z_2, \ldots, z_6]$, which is represented within the measurement function h below. Uncertainty within the measurements is represented by the noise variance R. An overview of the EKF algorithm is shown in Algorithm 1:
Algorithm 1 Range-based EKF Localisation
   Prediction:
   1: $\hat{X}_k = f(\hat{X}_{k-1}, u_k)$
   2: $G_{x_k} = \frac{\partial f}{\partial x}(\hat{X}_k, u_k)$
   3: $G_{u_k} = \frac{\partial f}{\partial u}(\hat{X}_k, u_k)$
   4: $\hat{P}_k = G_{x_k} \hat{P}_{k-1} G_{x_k}^T + G_{u_k} Q\, G_{u_k}^T$
   Correction:
   5: $\hat{Z}_{B_i} = h(\hat{X}_k, A_{B_i})$
   6: $H_{z_k} = \frac{\partial h}{\partial x}(\hat{X}_k)$
   7: $K_k = \hat{P}_k H_{z_k}^T (H_{z_k} \hat{P}_k H_{z_k}^T + R)^{-1}$
   8: $y = Z_{B_i} - \hat{Z}_{B_i}$
   9: $\hat{X}_k = \hat{X}_k + K_k y$
   10: $\hat{P}_k = (I - K_k H_{z_k}) \hat{P}_k$
   11: if ! measurement_is_available then
   12:    do Prediction
   13: else
   14:    do Correction
   15: end if
$$\hat{Z}_{B_i} = h(\hat{X}_k, A_{B_i}),\tag{11}$$
$$h(\hat{X}_k, A_B) = \begin{bmatrix} \sqrt{(x_k - x_{b_1})^2 + (y_k - y_{b_1})^2} \\ \sqrt{(x_k - x_{b_2})^2 + (y_k - y_{b_2})^2} \\ \sqrt{(x_k - x_{b_3})^2 + (y_k - y_{b_3})^2} \\ \sqrt{(x_k - x_{b_4})^2 + (y_k - y_{b_4})^2} \\ \sqrt{(x_k - x_{b_5})^2 + (y_k - y_{b_5})^2} \\ \sqrt{(x_k - x_{b_6})^2 + (y_k - y_{b_6})^2} \end{bmatrix},\tag{12}$$
and the Jacobian of the measurement model is obtained as:
$$H_{z_k} = \frac{\partial h}{\partial x}(\hat{X}_k),\tag{13}$$
$$H_{z_k} = \begin{bmatrix}
\frac{x_k - x_{b_1}}{\sqrt{(x_k - x_{b_1})^2 + (y_k - y_{b_1})^2}} & \frac{y_k - y_{b_1}}{\sqrt{(x_k - x_{b_1})^2 + (y_k - y_{b_1})^2}} & 0 \\
\frac{x_k - x_{b_2}}{\sqrt{(x_k - x_{b_2})^2 + (y_k - y_{b_2})^2}} & \frac{y_k - y_{b_2}}{\sqrt{(x_k - x_{b_2})^2 + (y_k - y_{b_2})^2}} & 0 \\
\frac{x_k - x_{b_3}}{\sqrt{(x_k - x_{b_3})^2 + (y_k - y_{b_3})^2}} & \frac{y_k - y_{b_3}}{\sqrt{(x_k - x_{b_3})^2 + (y_k - y_{b_3})^2}} & 0 \\
\frac{x_k - x_{b_4}}{\sqrt{(x_k - x_{b_4})^2 + (y_k - y_{b_4})^2}} & \frac{y_k - y_{b_4}}{\sqrt{(x_k - x_{b_4})^2 + (y_k - y_{b_4})^2}} & 0 \\
\frac{x_k - x_{b_5}}{\sqrt{(x_k - x_{b_5})^2 + (y_k - y_{b_5})^2}} & \frac{y_k - y_{b_5}}{\sqrt{(x_k - x_{b_5})^2 + (y_k - y_{b_5})^2}} & 0 \\
\frac{x_k - x_{b_6}}{\sqrt{(x_k - x_{b_6})^2 + (y_k - y_{b_6})^2}} & \frac{y_k - y_{b_6}}{\sqrt{(x_k - x_{b_6})^2 + (y_k - y_{b_6})^2}} & 0
\end{bmatrix}.\tag{14}$$
The EKF uses the first-order Taylor expansion to linearly approximate the nonlinear state transition and measurement models. This is where the Jacobian matrices are utilised to represent the first-order partial derivatives of these models for linear approximation [9]. The Jacobian matrices and their inverse (used within the state transition and the measurement models) were generated symbolically using the MATLAB Symbolic toolbox (R2018a, Mathworks, Natick, MA, USA) prior to implementation. The three Jacobian matrices were then solved numerically upon each iteration of the filter. The filter was run offline using the MATLAB Control Systems toolbox.
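To make the above concrete, the sketch below shows one predict/correct iteration of Algorithm 1 in Python/NumPy. It is an illustrative stand-in for the MATLAB implementation described above: the state transition function f and its Jacobians are assumed to be supplied externally (for instance from a symbolic derivation as in Section 2.1.1), and the anchor coordinates and the noise covariances Q and R are placeholders.

# Illustrative NumPy sketch of one iteration of Algorithm 1 (not the authors' MATLAB code).
# anchors: (6, 2) array of anchor positions; Q, R: input and measurement noise covariances.
import numpy as np

def h(X, anchors):
    # Predicted range to each anchor, Equation (12)
    return np.hypot(X[0] - anchors[:, 0], X[1] - anchors[:, 1])

def H_jacobian(X, anchors):
    # Measurement Jacobian, Equation (14): one row [dx/d, dy/d, 0] per anchor
    d = h(X, anchors)
    return np.column_stack(((X[0] - anchors[:, 0]) / d,
                            (X[1] - anchors[:, 1]) / d,
                            np.zeros(len(anchors))))

def ekf_step(X, P, u, z, anchors, Q, R, f, jacobians):
    # Prediction (Algorithm 1, lines 1-4)
    X = f(X, u)
    G_x, G_u = jacobians(X, u)
    P = G_x @ P @ G_x.T + G_u @ Q @ G_u.T
    # Correction (Algorithm 1, lines 5-10); skipped when no range measurement is available
    if z is not None:
        H = H_jacobian(X, anchors)
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        X = X + K @ (z - h(X, anchors))
        P = (np.eye(3) - K @ H) @ P
    return X, P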

3. Methodology

The methodology implemented for this work consisted of the configuration and calibration of the RTS to set a base reference frame, which was used consistently for all sensor systems throughout the testing procedures. To understand the range error behaviour of the UWB system and the linear displacement error of the quadrature encoders, characterisation procedures were executed using the RTS as the ground truth. The results were then implemented into the design of the EKF as the sensor measurement noise and the system input noise variance. Additionally, localisation tests were then carried out to quantify the variation in localisation accuracy between the RTS, the device-specific UWB trilateration algorithm and the EKF localisation using range measurements.

3.1. Robotic Testing Platform

The testbed employed was a skid steer, rigid chassis, four wheeled remotely operated platform similar to the design used in [34]. The platform's chassis was constructed from 10 mm square extruded aluminium and was driven using four EMG30 DC motors controlled by Quimat DC H-Bridge motor controllers [35]. The full system was powered by a four-cell lithium polymer battery. The motors include built-in incremental quadrature encoders mounted to the motors' extended shafts. The encoders provide 360 pulses per revolution (ppr), or a pulse every 1° change in angular displacement [35]. The encoders were used to accurately monitor the wheels' rotation. The on-board processing for data collection, platform control and sensor interaction was achieved using an Nvidia Jetson TX1 (Santa Clara, CA, USA). A Trimble AT360 reflective target was mounted rigidly to the top surface of the platform via a 10 mm extruded aluminium mast, allowing for a clear line of sight. An image of the robotic platform can be seen in Figure 1.

3.2. Robotic Total Station Configuration

The RTS was set up in a static location with calibrated known rectangular Northing, Easting and Elevation coordinates $X_{RTS} = [1996.085, 4977.284, 198.959]^T$. As these coordinates are referenced to the British National Grid (BNG), the system allows direct Cartesian output while consistently retaining a north reference. The RTS was orientated by taking a back-sight observation measurement through the centrally mounted telescope to a 360° prism at another known point (the back-sight prism and RTS position are shown in Figure 1). Once configured, the TSC3 controller was responsible for operating the RTS through a radio link, whereby the RTS measured both the angle and distance to a remote target and the TSC3 calculated the corresponding coordinates of the point within the established reference coordinate system. The RTS was then set to acquire an initial lock prior to robot movement. This enabled a consistent active tracking operation from the RTS using the AT360 target mounted on the robot. Example trajectories produced from the RTS are shown in Figure 2. In order for the trajectory analysis of the RTS to be used as a true baseline in comparison to the other sources of position estimation in this study, it was vital that the real-time NMEA string acquired from the RTS was time synchronised with the rest of the system. This resulted in the generation of a custom Trimble RS232 port monitoring program, which was compatible with and able to communicate with the other software packages specific to the various implemented localisation systems. This is further covered in Section 3.4, System Level Architecture.

3.3. UWB System

The UWB system used for this work is known as Pozyx [36]. Pozyx is a small and lightweight time of flight UWB positioning system that operates using a tag and anchor node network. Each compatible Pozyx transceiver device possesses its own identification code, therefore allowing the mobile operating tag to directly relate a range measurement to a specific anchor. Pozyx firmware deploys a lateration algorithm based on a linear least squares (LLS) method to calculate an estimated location of the tag in relation to the anchor positions. The system configuration settings are split into four sections that cover operating channel, bitrate, pulse repetition frequency (PRF) and preamble length [36]. For this study, the following configuration parameters were used:
  • Channel—5,
  • Bitrate—110 kbit/s,
  • PRF—64 MHz,
  • Preamble Length—1024.
The Pozyx firmware positioning algorithm “UWB Positioning Only” was utilised in a 2D mode for metric comparison against the RTS and the results from the EKF.
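As an illustration of how raw anchor ranges of the kind fused in the EKF can be collected, the sketch below polls each anchor in turn from a Python script, assuming the pypozyx library's two-way ranging call; the anchor network IDs and port selection are placeholders rather than the values used in the experiments.

# Sketch of per-anchor range collection (assumes the pypozyx ranging API; anchor IDs are hypothetical).
from pypozyx import PozyxSerial, get_first_pozyx_serial_port, DeviceRange, POZYX_SUCCESS

ANCHOR_IDS = [0x6A01, 0x6A02, 0x6A03, 0x6A04, 0x6A05, 0x6A06]   # hypothetical anchor network IDs

pozyx = PozyxSerial(get_first_pozyx_serial_port())

def read_ranges():
    # One two-way-ranging measurement per anchor, converted from mm to metres; None on failure
    ranges = []
    for anchor_id in ANCHOR_IDS:
        measurement = DeviceRange()
        status = pozyx.doRanging(anchor_id, measurement)
        ranges.append(measurement.distance / 1000.0 if status == POZYX_SUCCESS else None)
    return ranges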

3.4. System Level Architecture

The system architecture set to operate on-board the robot was designed around the open source Robot Operating System (ROS) framework [37]. ROS is a meta-operating system that enables the integration of multiple sources of information from different devices for the purposes of control. For this work, comparing the trajectory acquired from an RTS with various other sources required precise data synchronisation. Utilising the node based message publishing and subscription protocol through which ROS functions, custom ROS-compatible Trimble software was designed to integrate the RTS with ROS, thereby time synchronising the RTS with the other data sources within the system. Another beneficial factor considered when using ROS for the operating framework was the ability to configure a master/slave network that enabled consistent communication and synchronised data transfer between two different platforms, in this case a Jetson TX1 (Ubuntu 16.04) (master) and a ground station (Ubuntu 16.04) (slave), where the latter was configured to monitor the incoming pseudo-NMEA GPGGA string acquired from the RTS. Figure 3 shows a graphical overview of the implemented system level architecture.
Dynamic tracking using the RTS is achieved via a lock onto a target, either in the traditional sense using systems such as the Trimble MT1000 (Figure 1a) or aided through high frequency Infrared LED in the case of the Trimble AT360 target (Figure 1b). Although the RTS is capable of tracking MT1000 targets unaided, the prism is equipped with IRLEDs to allow multiple addressed targets to be used in the same process. Figure 1 also shows the RTS with a lock onto the target when mounted on the deployed system. RTS lock is made when a confirmed, stable distance measurement is made to the AT360 target located on the mast of the robot. Once this lock is confirmed, the robot is instructed to follow a predefined set of paths. Data from the RTS is actively collected and stored using the ROSBag functionality of the ROS system along with data collected from the Pozyx and quadrature encoders for post trajectory comparisons.
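To illustrate the data flow, a minimal version of the RTS port monitoring node can be written as a ROS publisher that forwards each pseudo-NMEA GGA sentence so that RTS fixes share the ROS clock with the Pozyx and encoder topics recorded in the rosbag. The sketch below is illustrative only; the serial port settings and topic name are placeholders, not those of the custom Trimble program described above.

#!/usr/bin/env python
# Illustrative RTS serial monitoring node (not the custom Trimble program used in the paper).
# Reads pseudo-NMEA GGA sentences over RS232 and republishes them on a ROS topic.
import rospy
import serial
from std_msgs.msg import String

def main():
    rospy.init_node('rts_monitor')
    pub = rospy.Publisher('rts/gga', String, queue_size=10)
    port = serial.Serial('/dev/ttyUSB0', 9600, timeout=1.0)   # placeholder port and baud rate
    while not rospy.is_shutdown():
        line = port.readline().decode('ascii', errors='ignore').strip()
        if line.startswith('$GPGGA'):
            pub.publish(String(data=line))   # stamped on receipt when recorded to a rosbag
    port.close()

if __name__ == '__main__':
    main()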

3.5. Range Error Characterisation

In order to characterise the range error of the UWB system, the UWB anchors were placed within a 224 m² environment and the RTS was used to take a static measurement directly to the UWB antenna on each anchor to acquire six precise positions, all within the BNG coordinate frame. The robot, equipped with the UWB tag and RTS reflective target, was then set to execute three paths at a constant speed. This enabled linear interpolation between the UWB and RTS measurements for error analysis. At each update, the robot would receive six distance readings from the UWB anchors and a corresponding position update from the RTS. As the true positions of the anchors and the RTS are known priors, the acquisition of the true position of the robot using the RTS enabled the calculation of the true Euclidean distance $d_{TA_i}^k$ between the robot and anchor i:
$$d_{TA_i}^k = \sqrt{\left(x_{T_R}^k - x_{T_{A_i}}\right)^2 + \left(y_{T_R}^k - y_{T_{A_i}}\right)^2},\tag{15}$$
where $x_{T_R}^k$, $y_{T_R}^k$ and $x_{T_{A_i}}$, $y_{T_{A_i}}$ are the true Cartesian positions of the robot at time k and of anchor i, respectively.
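A possible post-processing routine for this characterisation is sketched below: the RTS track is linearly interpolated onto the UWB measurement timestamps (valid here because the robot moved at constant speed), the true range to each surveyed anchor is computed from Equation (15), and the per-anchor mean and standard deviation of the range error are extracted. The array names and shapes are assumptions for illustration rather than the original processing scripts.

# Sketch of the range error characterisation (assumed array layout, not the original scripts).
# rts_t, rts_xy: RTS timestamps (N,) and positions (N, 2) in the BNG frame
# uwb_t, uwb_r : UWB timestamps (M,) and measured ranges (M, 6), one column per anchor
# anchors      : surveyed anchor positions (6, 2) measured by the RTS
import numpy as np

def range_error_stats(rts_t, rts_xy, uwb_t, uwb_r, anchors):
    # Interpolate the RTS ground truth position onto the UWB timestamps (constant speed path)
    x_true = np.interp(uwb_t, rts_t, rts_xy[:, 0])
    y_true = np.interp(uwb_t, rts_t, rts_xy[:, 1])
    # True Euclidean distance to each anchor, Equation (15)
    d_true = np.hypot(x_true[:, None] - anchors[:, 0], y_true[:, None] - anchors[:, 1])
    err = uwb_r - d_true
    return err.mean(axis=0), err.std(axis=0)   # per-anchor mean error and standard deviation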

3.6. Encoder Error Characterisation

As the encoders were to be used as a control input into the EKF, the noise variance of the input was a required parameter in the construction of the EKF. Instead of repeatedly tuning this variable until the EKF yielded the expected results, the RTS was used to estimate the noise variance of the encoders. In order to achieve this, the robot was instructed to complete a 10 m track for 10 repetitions. Each 10 m track was split into 2 m segments based on the displacement across the ground given by the RTS. This was done to account for any possible wheel slippage. Time parameters were then cross referenced to identify the displacement estimation given by the encoders at the time corresponding to when the RTS displacement completed each segment. Time alignment between the two sources was achieved through linear interpolation, as this test was conducted at a constant speed. The encoder distance and the final distance error were then calculated through Equations (16) and (17), respectively:
$$d_{enc} = N_{enc_k}\, C\, r,\tag{16}$$
where $N_{enc_k}$ is the number of encoder pulses at time k, C is the constant angular displacement per encoder pulse, set at 0.01745 rad, and r is the radius of the wheel:
$$d_{err} = d_{enc} - d_{RTS}.\tag{17}$$
The 10 repetitions of the 10 m track resulted in an error standard deviation of 0.1185 m. All encoders were assumed to have the same error deviation.
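For completeness, Equations (16) and (17) reduce to a few lines of post-processing; the sketch below assumes the pulse counts have already been aligned to the 2 m RTS segments by linear interpolation, and the wheel radius value is a placeholder rather than that of the actual platform.

# Sketch of the encoder error characterisation, Equations (16) and (17) (illustrative values).
# pulses: encoder pulse counts accumulated over each 2 m segment
# d_rts : displacement over ground reported by the RTS for the same segments (m)
import numpy as np

C = 0.01745            # angular displacement per encoder pulse (rad), i.e., 1 degree
WHEEL_RADIUS = 0.05    # wheel radius r in metres (placeholder value)

def encoder_error(pulses, d_rts):
    d_enc = np.asarray(pulses) * C * WHEEL_RADIUS    # Equation (16)
    d_err = d_enc - np.asarray(d_rts)                # Equation (17)
    return d_err.mean(), d_err.std()                 # the deviation feeds the EKF input noise variance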

3.7. Localisation

The configuration setup concerning the position of the RTS and UWB anchors for the range error characterisation was maintained for the localisation experiments. The results gathered from the EKF localisation algorithm were compared to the trajectory acquired by the RTS and the UWB. The robot was instructed to navigate various types of trajectories, where one was carried out at a constant velocity to allow for linear interpolation for metric positional comparison in terms of (x,y) between the RTS and the resulting estimations from the EKF. The linearity of the data was assessed to determine its suitability for interpolation. Although the general EKF algorithm performs ideally in pose estimations within the local body frame of the robot, the relationship to a global frame of reference is absent without providing an initial state vector. To convert the operating frame of the EKF to that of the RTS, the initial state vector X 0 was set to equal the first global position pose acquired from the RTS as shown below:
$$X_0 = \begin{bmatrix} RTS_{x_0} \\ RTS_{y_0} \\ RTS_{\phi_0} \end{bmatrix}.\tag{18}$$

4. Results and Analysis

4.1. Range Error Characterisation Results

The behaviour of the range error of the UWB system was characterised with the main focus of determining the precision of the sensor, as the variation of the error is a key input for the EKF. As the robot navigates along the repeated path shown in Figure 4, the distances between the robot and each anchor alternate, providing the opportunity to analyse how the error behaves in a dynamic scenario.
As each anchor provided a range measurement to the robot, the RTS calculated a corresponding true distance. The errors for each anchor follow a normal distribution, shown by the Gaussian plots in Figure 5, where the standard deviation is seen to be similar for each anchor. However, due to the heading of the robot changing, there were notable periods of time where the tag was facing in a contrasting direction to an anchor, creating a non-clear line of sight (NCLOS) condition. These periods may produce an extra source of error along the robot's path, and this error may increase at larger operating ranges. As the UWB anchor range error is highly dependent upon the environment [38], re-characterisation would be required for each new anchor setup if specific uncertainty estimates for each anchor are used. If a system is to be deployed in a similar configuration and environment, a generalised uncertainty found by taking an average of the specific uncertainties of each anchor may be applicable. However, if the anchor configuration and environment have changed, re-characterisation of the anchors is required. An averaged uncertainty in this form allowed for a generalised uncertainty implementation.
Error results gathered from each anchor during the path were then combined to provide a generalised error metric. The combined errors are represented in Figure 6. All error statistics for each single anchor and the dataset containing combined anchor errors are shown in Table 1.

4.2. Localisation Techniques Assessment

The results of the localisation experiments for this work can be seen in Figure 7. Both the EKF and the native UWB system firmware localisation technique produce significantly better results than the odometry technique, where wheel slippages are clearly noticeable as the robot appears to drift in relation to the true path of the RTS. The path generated from the UWB system is fairly accurate in most areas compared to the ground truth of the RTS. However, in some areas within the environment enclosed by the anchors, the UWB localisation is seen to be slightly noisy, which is something that the path generated by the EKF clearly reduces. Overall, the EKF follows the ground truth trajectory of the RTS more closely than the UWB system. However, it is noticeable that the EKF does drift in areas where the UWB seems to be affected by noise. These noisy measurements from the UWB may be a factor in the slight drifts of the EKF, showing that the noise variance of the UWB may in fact be a dynamic variable depending upon the area in which the robot is situated in relation to the external anchor positions and coverage.
Figure 8 shows the results of the EKF acquired from the robot's path, which was conducted at a constant speed. Each pose produced from the EKF and UWB, as well as the corresponding timestamps, was used to apply a linear interpolation process on the Northing and Easting plane of the RTS path. This was carried out to achieve a point-to-point metric comparison. The EKF is again seen to be less noisy than the native UWB localisation algorithm. However, it is also noticeable that the pose of the EKF drifts slightly when the UWB pose estimation from the system's native algorithm also produces a high level of noise. This is observed directly north in relation to the position of the RTS. This shift in position from the UWB demonstrates again how the variance in noise may be dynamically changing depending upon the environmental conditions at that specific time. The final statistical comparison between the UWB and the EKF on the 2D plane can be found in Table 2. Results show that the EKF achieves a smaller mean error on both the x- and y-axes and a reduced error deviation on the y-axis, although it yields a slightly increased deviation on the x-axis.

5. Conclusions

In conclusion, this paper demonstrated the effectiveness of utilising a surveying grade optical instrument, in the form of a robotic total station, as a ground truth metric and as a means to dynamically characterise a generalised range uncertainty of a UWB sensor network, all within an intended operating environment. The resulting uncertainty was used as an input into an EKF fused with robot odometry to yield a reduced localisation error compared to the UWB system's native localisation algorithm. Although the odometry is limited when used standalone, when fused with the range measurements from the UWB it produces a less noisy overall trajectory. However, the EKF is seen to drift in areas of the environment where the noise of the UWB increases. This has therefore led to planning future work aimed at the inclusion of a generalised adaptive uncertainty metric in the EKF, which can also be achieved using the active target tracking feature of the RTS. This is an experiment that cannot be carried out through static tests between two UWB devices, as it is environment dependent. The generalised adaptive uncertainty would be taken as a function of the range from the robot to the anchor locations and the robot's position within the environment, combined with an estimate of the level of obstruction between the robot and anchors dependent upon the robot's global heading. Additionally, further experimentation will be undertaken into a specific uncertainty characterisation that may lead to a more effective implementation. This uncertainty would be specific to each individual anchor as opposed to a generalised metric. With this investigation, a comparison study can be completed that assesses the three uncertainty methods: generalised, adaptive generalised and specific, in order to observe which method yields the optimal results.

Author Contributions

B.J.M. and H.A.G.P. designed the unmanned system to conduct and perform the experiments and analysed all datasets and wrote the paper; B.J.M. designed the control algorithm; J.P.M. specialised in the operation of the surveying grade equipment; A.S. is coordinator for DigiArt; F.A.B. managed the project and contributed towards the modification and refinement of the paper.

Funding

This research was funded by Horizon 2020, the EU Research and Innovation programme (Grant No. 660566).

Acknowledgments

This work forms part of the Horizon 2020 project known as DigiArt, which seeks to generate 3D virtual worlds of archaeological sites, offering new pathways to a deeper understanding of European culture.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
EKF    Extended Kalman Filter
TS     Total Station
RTS    Robotic Total Station
GNSS   Global Navigation Satellite System
IMU    Inertial Measurement Unit
INS    Inertial Navigation System
LiDAR  Light Detection and Ranging
UWB    Ultra-Wideband
EDM    Electronic Distance Measurement
RFID   Radio Frequency Identification
TOA    Time of Arrival
TOF    Time of Flight
BNG    British National Grid
ROS    Robot Operating System

References

  1. Maimone, M.; Cheng, Y.; Matthies, L. Two years of visual odometry on the mars exploration rovers. J. Field Robot. 2007, 24, 169–186. [Google Scholar] [CrossRef]
  2. Park, J.H.; Shin, Y.D.; Bae, J.H.; Baeg, M.H. Spatial uncertainty model for visual features using a Kinect™ sensor. Sensors 2012, 12, 8640–8662. [Google Scholar] [CrossRef] [PubMed]
  3. Alatise, M.B.; Hancke, G.P. Pose Estimation of a Mobile Robot Based on Fusion of IMU Data and Vision Data Using an Extended Kalman Filter. Sensors 2017, 17, 2164. [Google Scholar] [CrossRef] [PubMed]
  4. Fox, D.; Burgard, W.; Kruppa, H.; Thrun, S. A probabilistic approach to collaborative multi-robot localization. Auton. Robots 2000, 8, 325–344. [Google Scholar] [CrossRef]
  5. Cho, B.S.; Moon, W.S.; Seo, W.J.; Baek, K.R. A dead reckoning localization system for mobile robots using inertial sensors and wheel revolution encoding. J. Mech. Sci. Technol. 2011, 25, 2907–2917. [Google Scholar] [CrossRef]
  6. Lee, S.; Song, J.B. Robust mobile robot localization using optical flow sensors and encoders. In Proceedings of the 2004 IEEE International Conference on Robotics and Automation, New Orleans, LA, USA, 26 April–1 May 2004. [Google Scholar]
  7. Se, S.; Lowe, D.; Little, J. Mobile robot localization and mapping with uncertainty using scale-invariant visual landmarks. Int. J. Robot. Res. 2002, 21, 735–758. [Google Scholar] [CrossRef]
  8. Betke, M.; Gurvits, L. Mobile robot localization using landmarks. IEEE Trans. Robot. Autom. 1997, 13, 251–263. [Google Scholar] [CrossRef] [Green Version]
  9. Thrun, S.; Burgard, W.; Fox, D. Probabilistic Robotics; MIT Press: Cambridge, MA, USA, 2005. [Google Scholar]
  10. Goel, P.; Roumeliotis, S.I.; Sukhatme, G.S. Robust localization using relative and absolute position estimates. In Proceedings of the 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS’99), Kyongju, Korea, 17–21 October 1999. [Google Scholar]
  11. Agrawal, M.; Konolige, K. Real-time localization in outdoor environments using stereo vision and inexpensive gps. In Proceedings of the 18th International Conference on Pattern Recognition (ICPR 2006), Hong Kong, China, 20–24 August 2006. [Google Scholar]
  12. Caron, F.; Duflos, E.; Pomorski, D.; Vanheeghe, P. GPS/IMU data fusion using multisensor Kalman filtering: Introduction of contextual aspects. Inf. Fusion 2006, 7, 221–230. [Google Scholar] [CrossRef]
  13. Eliazar, A.I.; Parr, R. Learning probabilistic motion models for mobile robots. In Proceedings of the Twenty-First International Conference on Machine Learning, Banff, AB, Canada, 4–8 July 2004. [Google Scholar]
  14. Mostafa, M.M.; Schwarz, K.-P. Digital image georeferencing from a multiple camera system by GPS/INS. ISPRS J. Photogramm. Remote Sens. 2001, 56, 1–12. [Google Scholar] [CrossRef]
  15. Christiansen, M.P.; Laursen, M.S.; Jørgensen, R.N.; Skovsen, S.; Gislum, R. Designing and Testing a UAV Mapping System for Agricultural Field Surveying. Sensors 2017, 17, 2703. [Google Scholar] [CrossRef] [PubMed]
  16. Bachrach, A.; Prentice, S.; He, R.; Roy, N. RANGE–Robust autonomous navigation in GPS-denied environments. J. Field Robot. 2011, 28, 644–666. [Google Scholar] [CrossRef]
  17. Bachrach, A.; Prentice, S.; He, R.; Henry, P.; Huang, A.S.; Krainin, M.; Roy, N. Estimation, planning, and mapping for autonomous flight using an RGB-D camera in GPS-denied environments. Int. J. Robot. Res. 2012, 31, 1320–1343. [Google Scholar] [CrossRef] [Green Version]
  18. Aiello, G.R.; Rogerson, G.D. Ultra-wideband wireless systems. IEEE Microw. Mag. 2003, 4, 36–47. [Google Scholar] [CrossRef]
  19. Liu, H.; Darabi, H.; Banerjee, P.; Liu, J. Survey of wireless indoor positioning techniques and systems. IEEE Trans. Syst. Man Cybern. Part C (Appl. Rev.) 2007, 37, 1067–1080. [Google Scholar] [CrossRef]
  20. Gezici, S.; Tian, Z.; Giannakis, G.B.; Kobayashi, H.; Molisch, A.F.; Poor, H.V.; Sahinoglu, Z. Localization via ultra-wideband radios: A look at positioning aspects for future sensor networks. IEEE Signal Process. Mag. 2005, 22, 70–84. [Google Scholar] [CrossRef]
  21. Fontana, R.J. Recent system applications of short-pulse ultra-wideband (UWB) technology. IEEE Trans. Microw. Theory Tech. 2004, 52, 2087–2104. [Google Scholar] [CrossRef]
  22. Bharadwaj, R.; Parini, C.; Alomainy, A. Experimental investigation of 3-D human body localization using wearable ultra-wideband antennas. IEEE Trans. Antennas Propag. 2015, 63, 5035–5044. [Google Scholar] [CrossRef]
  23. Mucchi, L.; Trippi, F.; Carpini, A. Ultra wide band real-time location system for cinematic survey in sports. In Proceedings of the 2010 3rd International Symposium on Applied Sciences in Biomedical and Communication Technologies (ISABEL), Rome, Italy, 7–10 November 2010. [Google Scholar]
  24. Masiero, A.; Fissore, F.; Vettore, A. A low cost UWB based solution for direct georeferencing UAV photogrammetry. Remote Sens. 2017, 9, 414. [Google Scholar] [CrossRef]
  25. Conceição, T.; dos Santos, F.N.; Costa, P.; Moreira, A.P. Robot Localization System in a Hard Outdoor Environment; Springer International Publishing: Cham, Switzerland, 2018. [Google Scholar]
  26. Trimble Geospatial. Trimble S7 Total Station Datasheet. Available online: https://drive.google.com/file/d/0BxW3dqQ5gdnTNkdZRWNRMFdyWGc/view (accessed on 10 June 2018).
  27. Braun, J.; Stroner, M.; Urban, R.; Dvoracek, F. Suppression of systematic errors of electronic distance meters for measurement of short distances. Sensors 2015, 15, 19264–19301. [Google Scholar] [CrossRef] [PubMed]
  28. Martinez, G. Field tests on flat ground of an intensity-difference based monocular visual odometry algorithm for planetary rovers. In Proceedings of the 2017 Fifteenth IAPR International Conference on Machine Vision Applications (MVA), Nagoya, Japan, 8–12 May 2017. [Google Scholar]
  29. Cheng, T.; Venugopal, M.; Teizer, J.; Vela, P.A. Performance evaluation of ultra wideband technology for construction resource location tracking in harsh environments. Autom. Constr. 2011, 20, 1173–1184. [Google Scholar] [CrossRef]
  30. Lau, L.; Quan, Y.; Wan, J.; Zhou, N.; Wen, C.; Qian, N.; Jing, F. An autonomous ultra-wide band-based attitude and position determination technique for indoor mobile laser scanning. ISPRS Int. J. Geo-Inf. 2018, 7, 155. [Google Scholar] [CrossRef]
  31. Roberts, C.; Boorer, P. Kinematic positioning using a robotic total station as applied to small-scale UAVs. J. Spat. Sci. 2017, 61, 29–45. [Google Scholar] [CrossRef]
  32. Keller, F.; Sternberg, H. Multi-sensor platform for indoor mobile mapping: System calibration and using a total station for indoor applications. Remote Sens. 2013, 5, 5805–5824. [Google Scholar] [CrossRef]
  33. Kiriy, E.; Buehler, M. Three-State Extended Kalman Filter for Mobile Robot Localization; Tech. Rep. TR-CIM; McGill University: Montreal, QC, Canada, 2002; Volume 5, p. 23. [Google Scholar]
  34. Teslić, L.; Škrjanc, I.; Klančar, G. EKF-based localization of a wheeled mobile robot in structured environments. J. Intell. Robot. Syst. 2011, 62, 187–203. [Google Scholar] [CrossRef]
  35. Active Robotics. EMG30 Motor with Quadrature Encoders Datasheet. Available online: http://www.robot-electronics.co.uk/htm/emg30.htm (accessed on 10 June 2018).
  36. Pozyx Labs. Pozyx BVBA, Vrijdagmarkt 10/201, 9000 Gent, Belgium. Available online: https://www.pozyx.io/ (accessed on 26 May 2018).
  37. Quigley, M.; Conley, K.; Gerkey, B.; Faust, J.; Foote, T.; Leibs, J.; Ng, A.Y. ROS: An open-source Robot Operating System. In Proceedings of the ICRA Workshop on Open Source Software, Kobe, Japan, 12–17 May 2009. [Google Scholar]
  38. Liang, Q. Radar sensor wireless channel modeling in foliage environment: UWB versus narrowband. IEEE Sens. J. 2011, 11, 1448–1457. [Google Scholar] [CrossRef]
Figure 1. Experimental setup (a) backsight prism; (b) unmanned developmental platform.
Figure 2. Example trajectories achieved using RTS active target tracking.
Figure 3. System level architecture.
Figure 4. Robot path for Range Error Characterisation.
Figure 5. Error Distributions of UWB range measurements.
Figure 6. Error distribution for all anchors.
Figure 7. Resulting paths from all techniques and RTS active tracking.
Figure 8. Trajectory comparison between EKF and UWB.
Table 1. Range error statistics for each anchor.

Anchor      Mean Error (m)    Standard Deviation of Error (m)
Anchor 1    0.0301            0.1216
Anchor 2    0.0235            0.1336
Anchor 3    0.0237            0.1287
Anchor 4    0.1014            0.1325
Anchor 5    0.1081            0.1194
Anchor 6    0.0867            0.1256
Combined    0.0622            0.1323
Table 2. Positional errors in terms of mean and standard deviation (x—Easting, y—Northing).

Axis        Mean Error (m)    Standard Deviation of Error (m)
UWB (x)     0.0621            0.1478
UWB (y)     0.0718            0.1510
EKF (x)     0.0167            0.1611
EKF (y)     0.0071            0.1326
