
Real-Time Indoor Positioning Approach Using iBeacons and Smartphone Sensors

College of Surveying and GeoInformatics, Tongji University, Shanghai 200092, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2020, 10(6), 2003; https://doi.org/10.3390/app10062003
Submission received: 4 January 2020 / Revised: 3 March 2020 / Accepted: 10 March 2020 / Published: 15 March 2020
(This article belongs to the Section Earth Sciences)

Abstract
For localization in daily life, low-cost indoor positioning systems should provide real-time locations with reasonable accuracy. Considering the flexible deployment and low price of the iBeacon technique, we develop a real-time fusion workflow to improve the localization accuracy of smartphones. First, we propose an iBeacon-based method that integrates a trilateration algorithm with a specific fingerprinting method to resist RSS fluctuations and obtain accurate locations as the baseline result. Second, as turns are pivotal for positioning, we segment pedestrian trajectories according to turns. Then, we apply a Kalman filter (KF) to the heading measurements in each segment, which improves the locations derived by pedestrian dead reckoning (PDR). Finally, we devise another KF to fuse the iBeacon-based approach with the PDR to overcome orientation noise. We implemented this fusion workflow on an Android smartphone and conducted real-time experiments on a building floor. Two different routes with sharp turns were selected. The positioning accuracy of the iBeacon-based method is an RMSE of 2.75 m. When the smartphone is held steadily, the fusion positioning tests result in RMSEs of 2.39 and 2.22 m for the two routes, while the tests with orientation noise still result in RMSEs of 3.48 and 3.66 m. These results demonstrate that our fusion workflow can improve the accuracy of iBeacon positioning and alleviate the influence of PDR drifting.

1. Introduction

Indoor positioning is important for indoor awareness, and it can support queries on the locations of users. For pedestrian indoor positioning, many approaches rely on distinct sensors such as WiFi, Bluetooth, magnetic sensors [1], and inertial measurement units (IMUs). There is no generic solution for all kinds of indoor scenarios. In general, the reported studies can reach an accuracy of 2–5 m in calibrated indoor environments [2].
Indoor positioning methods can be categorized into centroid positioning [3], multilateration (hyperbolic positioning) [4], trilateration, fingerprinting positioning, pedestrian dead reckoning (PDR), vision-based positioning, etc. According to observation data, different positioning techniques include received signal strength indication (RSSI), time of arrival (TOA), time difference of arrival (TDOA), angle of arrival (AOA), image, etc. PDR is the simplest approach which calculates the next location by determining heading directions and the displacement from the start. PDR is an auxiliary approach and it relies on inertial sensors only. However, its drifting error would accumulate during walking, and thus PDR cannot independently generate accurate locations in the long term [5].
Other indoor positioning methods require a sensor network in buildings to compute the absolute location of signal sources. Algorithms based on intersection, such as TOA, TDOA, and AOA, are sensitive to the measurement of time and angle. These methods are not suitable for smartphones, since their low-cost sensors may not meet the requirement of high accuracy. In contrast, current fusion solutions for smartphones include combinations of WiFi and PDR [6], geomagnetism/WiFi/PDR [7,8], or WiFi/Bluetooth [9]. Their typical positioning methods include fingerprinting and trilateration based on the RSSI of WiFi and/or Bluetooth devices. These means are often employed to provide PDR with an initial location.
Compared to WiFi APs, a Bluetooth Low Energy (BLE) technology named iBeacon has merits in low price, low energy consumption, and no requirement of Internet [10,11]. It is ideal to support smartphone applications. However, previous research shows positioning based on iBeacon has a relatively lower accuracy than WiFi APs [12], due to the attenuation property of iBeacon signals.
Although a group of studies [13,14] has been reported on the fusion of iBeacon and PDR, the critical problem of signal fluctuation still needs to be addressed for accuracy improvement. Filtering is employed to mitigate the fluctuations of iBeacon signals, but it is only for static reference points [10,11]. Regarding the key parameters of PDR, some researchers [13] fuse ranging data of iBeacon in an extended Kalman filter to calibrate PDR results, yet without discussion of non-line-of-sight (NLOS) conditions.
In this paper, we focus on improving the accuracy of iBeacon positioning, and we also look for another way of fusing iBeacon and PDR to resist data noise. Our objective is a real-time indoor positioning solution with stable accuracy (e.g., 1–3 m). PDR lacks an estimate of the initial location, and thus we first propose an iBeacon positioning method to provide accurate coordinates (i.e., the baseline result). As we seek a lightweight implementation on smartphones, trilateration is a preferable option due to its simplicity. However, its ranging accuracy often suffers from RSSI fluctuations caused by NLOS conditions. Therefore, we develop a fusion method of fingerprinting and trilateration which can mitigate ranging errors and improve the positioning accuracy of iBeacon RSSI. Another key issue for the PDR approach is heading estimation. We develop a heading estimation method based on trajectory segmentation and a Kalman filter (KF) fusing the measurements of heading and angular rate. Based on the baseline result, PDR data are fused to correct the pedestrian's trajectory locally, especially to preserve trajectory continuity.
More specifically, we propose a fusion workflow to improve the localization accuracy of smartphones. It involves the received signal strength (RSS) of iBeacons and the accelerometer and gyroscope data for PDR. According to the log-distance path loss model, the RSS between a beacon and the measured location can be transformed into ranging data. Based on the iterative algorithm of trilateration [15], we add a weight matrix to emphasize the importance of near beacons. Real-time locations are derived by enhancing the trilateration with a specific fingerprinting result, i.e., the BLE-based method. During a tracking process, turns are pivotal for positioning since the walking state changes at them. To better estimate locations around turns, we segment pedestrian trajectories into linear motions in terms of turn locations. As the moving direction is relatively stable in each segment, we can apply and update a KF for the segment to correct the headings. Finally, we devise another KF to derive locations in each segmented path by fusing the BLE-based positioning and the PDR. As a result, the BLE-based method lays the foundation for the fusion workflow and overcomes the drifting of the PDR.
We implemented this approach on an Android smartphone and conducted real-time tests on a floor with an area of 44 m × 17 m. As we attach importance to turns, different routes with sharp turns (180° and 90°) were selected. The experiments regarding our fusion approach were conducted with starting location errors and heading orientation noise. The results demonstrate the proposed BLE-based method indeed improves the positioning accuracy of iBeacon. Moreover, the whole fusion approach can effectively alleviate the positioning error caused by inaccurate initial positions and orientation noise.
The remainder of this paper is organized as follows. Section 2 briefly introduces the related work on iBeacon/BLE techniques and PDR. Section 3 elaborates on our research method including BLE-based method, PDR, and the fusion workflow for indoor positioning. Section 4 presents our experimental results and Section 5 discusses them with positioning accuracy. Finally, Section 6 concludes this paper with some future work.

2. Related Work

2.1. RSS-Based Method

In this section, we briefly review the current solutions for indoor positioning regarding iBeacons and PDR. RSS-based methods can be roughly categorized into fingerprinting and ranging-based methods (e.g., trilateration [3,16]). In general, the fingerprints or radio map contain NLOS information of the environment, although updating them costs some effort [17]. The accuracy of fingerprinting depends on the number of BLE beacons used for computation [18]. Specifically, the multiple channels used by BLE beacons result in RSSI variation over a wider range than WiFi APs [19]. Researchers have proposed batch filtering to mitigate the multipath effect of RSS. In a related study [11], the researchers set a window for batch filtering and achieved the best performance (around 2.6 m for 90% of the time) with a dense configuration of beacons; a sparse configuration of beacons results in errors of less than 4.8 m. In the same testbed, WiFi-based positioning involves an error of less than 8.5 m for 95% of the time.
Another group of methods focuses on the automation of fingerprint generation [20] or crowdsourcing [8,21]. These methods aim to automate the collection and update of fingerprints (i.e., radio map) [22,23]. Users can provide feedback to boost the matching accuracy of WiFi fingerprinting. However, these methods always have low stability.
In contrast, ranging-based methods estimate the coordinates of the receiver with transformed distances from RSSI, and they are prone to incorrect estimates due to the low ranging accuracy [16]. The accuracy of RSSI is influenced by many factors such as obstruction of walls, NLOS, multipath effect, etc. As low ranging accuracy is the bottleneck, we add a weight matrix to the iterative algorithm of trilateration [15] to stress the importance of near iBeacons for ranging. In this paper, we leverage the NLOS information in iBeacon fingerprints to correct the trilateration results. Our tests show the positioning accuracy is competitive with other RSS-based methods of WiFi or iBeacon.

2.2. PDR-Related Method

PDR methods can provide continuous trajectories, yet they are prone to accumulative errors [24]. Related research aims to improve the estimation of step detection, stride length, and heading orientation [24,25]. Commonly, PDR methods need a starting location (initial state), and their orientation errors accumulate during walking. Accurate headings are therefore quite important for PDR. To compensate for direction errors, PDR is incorporated with other positioning techniques. Li et al. presented tests on the combination of PDR, WiFi fingerprinting, and magnetic matching (MM) [7]. The RMS of the position error can be reduced to 3.8 m with PDR/WiFi, and to 3.2 m with PDR/WiFi/MM.
Map matching methods are applied with PDR to better estimate locations [26,27]. Zhou et al. (2017) alleviated the accumulated error of PDR by correcting locations with a navigation network [26]. With the help of a particle filter on the network, the positioning accuracy of PDR can be raised to 1.23 m. In addition, PDR, human activity recognition (HAR), and indoor landmarks have been combined to better estimate user trajectories [28]. Although these rigid constraints can improve the PDR trajectory, they may not be suitable for real-time positioning without a priori knowledge. To compensate for heading errors of the smartphone, we design a method that subdivides pedestrian trajectories into linear motions in terms of turns and updates the Kalman filtering for each trajectory segment.

2.3. Data Fusion

Data fusion can produce more consistent and accurate results than any single positioning technique. Hafner et al. (2013) compared the fusion results of KF and particle filter (PF) in WiFi positioning tests [29]. By applying fingerprinting with all available WiFi APs, they concluded there is no clear difference between KF and PF trajectories. However, in the case of fewer APs (three), the performance of PF is better than KF.
Leppäkoski et al. (2013) proposed a complementary extended Kalman filter (CEKF) to fuse WiFi RSS and foot-mounted PDR data and designed a PF to fuse PDR data with map information [6]. Although the PF generates user trajectories with the higher accuracy, the PF algorithm is quite time-consuming. The CEKF only needs 0.2 s for computation, while the PF needs more than 100 s in 10 m × 10 m area. Thus, PF may not be the first choice for smartphone applications.
KF has high computational efficiency and is suitable for real-time applications. Recently, Jenny et al. (2017) developed a tablet application to test the fusion of iBeacon and PDR with a KF in a small region. The result of the fusion positioning is accurate (around 1 m) [14]. However, the robustness of this positioning approach is not clear since the test route is simple. Another study fuses iBeacon ranging data with the PDR computation [13]: based on the start location derived by iBeacon and WiFi, the ranging result of iBeacon RSS is fused in an extended Kalman filter to calibrate the PDR results.
In this paper, we focus on the fusion of iBeacon and PDR data on the basis of the accuracy improvement of iBeacon positioning. In our designed KF, the accurate coordinates (i.e., ‘baseline’) of RSS-based positioning are fused with the heading and stride length of PDR. The accuracy of fusion is comparable to other WiFi/iBeacon localization, while the computation load is limited.

3. Research Method

3.1. Overview

As mentioned above, we aim to improve the accuracy of the iBeacon positioning method on smartphones, and fuse it with PDR in a designed Kalman filter (KF) for real-time applications. A novel aspect of the proposed BLE-based method is that we introduce a specific fingerprinting method to fuse with trilateration. This fusion can resist the ranging error caused by RSS fluctuations of iBeacon. The BLE-based method lays a solid foundation for the later fusion with the PDR result. Similar to the method in [15], we adopt the single point positioning algorithm of GNSS for trilateration, and we set up a weight matrix to differentiate the contributions of beacons at various distances. For the fingerprinting method, we adopt cosine similarity to reveal the consistent trend between two RSSI vectors. This trend helps us exclude outliers and deal with the RSS differences between different smartphones. The proposed iBeacon-based approach achieves an accuracy close to that of WiFi-based approaches. The easy implementation of this approach on smartphones is also a valuable improvement for iBeacon-based positioning.
In addition, we develop a new implementation of PDR to resist heading errors via trajectory segmentation. We detect turns first with gyroscope data and subdivide user trajectory into segments according to these turns. We design a KF with heading measurement and the current angular rate, apply it to each segment, and update the filter for the next segment. This implementation can alleviate the error of headings in smartphone. Finally, we design a fusion method based on KF to combine the BLE-based method and the PDR implementation. It ensures both the accuracy and continuity of real-time trajectories.
The whole fusion workflow for indoor positioning is summarized in Figure 1. The related smartphone sensors include the Bluetooth module, gyroscope, accelerometer, and magnetic sensor. After merging the two RSS-based positioning methods (trilateration and fingerprinting), we can add the absolute coordinates into the PDR computation. From the initial location, we confirm each step with its length and check the attitude (including heading orientation). Meanwhile, we apply orientation filtering to correct the attitude measurement; the orientation filter is updated at each turn. After the whole PDR procedure, the location estimate and the current heading orientation are ready. The BLE-derived location and the PDR location are input to the designed KF (see Section 3.4) when the PDR location is out of the current room or the filtering interval (e.g., 10 s) has been reached. Finally, the fused location estimate can effectively resist drifting and jumping in a trajectory. The following subsections explain the details.

3.2. BLE-Based Method

By combining ranging of RSSI and the radio map, RSSI data of iBeacons can be used for accurate positioning. Trilateration can provide location estimates according to real-time RSSI, while fingerprinting can support it with the NLOS features in the radio map.
Trilateration. Inspired by the notion of pseudo ranges in GNSS positioning, we use RSSI ranging for the trilateration method. We also derive the related formula in a way similar to single point positioning with pseudo ranges. As indoor positioning is conducted over a small range, the propagation time from a beacon to the receiver is quite short; in this sense, the clock error is negligible. Equation (1) presents this simple ranging model, where ρ_i is the pseudo range and v represents stochastic noise. Because Equation (1) is a nonlinear function of the location coordinates (x, y, z), we expand it according to the Taylor series and keep the linear terms only. The derived linear form is given in Equation (2).
$$ d_i = \rho_i + v = \sqrt{(x_i - x)^2 + (y_i - y)^2 + (z_i - z)^2} + v \qquad (1) $$

$$ d_i - \rho_0^i = \frac{x_0 - x_i}{\rho_0^i}\,\Delta x + \frac{y_0 - y_i}{\rho_0^i}\,\Delta y + \frac{z_0 - z_i}{\rho_0^i}\,\Delta z \qquad (2) $$
According to the 'log-distance path loss' model of radio signals [30], RSSI is a function of distance (see Equation (3)). P(d_0) indicates the RSSI at a reference location at distance d_0 from a known transmitter, and d indicates the distance from the current location to the transmitter. The parameter n reflects the attenuation of the signal. Equation (4) shows the simplified case where d_0 is a unit distance (i.e., 1 m) and A represents the RSSI at the unit distance.
$$ \mathrm{RSSI} = P(d_0) - 10\,n\,\lg\!\left(\frac{d}{d_0}\right) \qquad (3) $$

$$ \mathrm{RSSI} = A - 10\,n\,\lg(d), \qquad d_0 = 1,\quad A = P(d_0) \qquad (4) $$
Conversely, an RSSI value can be used to infer the distance between a BLE beacon (the transmitter) and the current location of the smartphone (the receiver). Thus, the distance between the receiving location and a beacon is given by Equation (5):
$$ d = 10^{\frac{A - \mathrm{RSSI}}{10\,n}} \qquad (5) $$
To implement Equation (5), two parameters have to be determined for this transformation from RSSI to length, i.e., A and n. The two parameters reflect the characteristics of specific environments. For the testbed in this paper, we pre-computed the two parameters at reference points via a surveying adjustment. With more than four known distances between the selected points and BLE beacons, Equation (6) presents the error equation of the two parameters, which is derived from Equation (4). For k (k > 4) distances, we can obtain the adjusted values of A and n by inputting these distances (d_k), the RSSI measurements (RSSI_{d_k}), and the related RSSI estimates (RSSI^0_{d_k}, the initial values).
$$ \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_k \end{bmatrix} = \begin{bmatrix} 1 & -10\,\lg(d_1) \\ 1 & -10\,\lg(d_2) \\ \vdots & \vdots \\ 1 & -10\,\lg(d_k) \end{bmatrix} \begin{bmatrix} A \\ n \end{bmatrix} - \begin{bmatrix} \mathrm{RSSI}_{d_1} - \mathrm{RSSI}_{d_1}^{0} \\ \mathrm{RSSI}_{d_2} - \mathrm{RSSI}_{d_2}^{0} \\ \vdots \\ \mathrm{RSSI}_{d_k} - \mathrm{RSSI}_{d_k}^{0} \end{bmatrix} \qquad (6) $$
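As a concrete illustration, the following Python/NumPy sketch fits A and n by least squares and then inverts Equation (5) to map RSSI back to distance; the numbers in the example call are made up for demonstration and are not the paper's measurements.

```python
import numpy as np

def calibrate_path_loss(distances, rssi_measured):
    """Least-squares fit of A and n in RSSI = A - 10*n*lg(d), a sketch of Equation (6).

    distances: known transmitter-receiver distances (m) at reference points.
    rssi_measured: averaged RSSI values (dBm) observed at those distances.
    """
    d = np.asarray(distances, dtype=float)
    rssi = np.asarray(rssi_measured, dtype=float)
    # Design matrix: one column for A, one for n (coefficient -10*lg(d)).
    B = np.column_stack([np.ones_like(d), -10.0 * np.log10(d)])
    params, *_ = np.linalg.lstsq(B, rssi, rcond=None)
    A, n = params
    return A, n

def rssi_to_distance(rssi, A, n):
    """Invert Equation (5): d = 10**((A - RSSI) / (10*n))."""
    return 10.0 ** ((A - rssi) / (10.0 * n))

# Example with illustrative (made-up) measurements:
A, n = calibrate_path_loss([1, 2, 4, 8, 10], [-62, -66, -70, -74, -76])
print(round(A, 2), round(n, 2), round(rssi_to_distance(-70, A, n), 2))
```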
After obtaining the environmental parameters of Equation (5), we can generate real-time locations by applying the trilateration algorithm in an iterative way (see Equation (7)). The error equation of location coordinates is derived from Equation (2):
$$ V = BX - l, \quad l = \begin{bmatrix} d_1 - \rho_0^1 \\ d_2 - \rho_0^2 \\ \vdots \\ d_n - \rho_0^n \end{bmatrix}, \quad X = \begin{bmatrix} \Delta x \\ \Delta y \\ \Delta z \end{bmatrix}, \quad B = \begin{bmatrix} \dfrac{x_0 - x_1}{\rho_0^1} & \dfrac{y_0 - y_1}{\rho_0^1} & \dfrac{z_0 - z_1}{\rho_0^1} \\ \dfrac{x_0 - x_2}{\rho_0^2} & \dfrac{y_0 - y_2}{\rho_0^2} & \dfrac{z_0 - z_2}{\rho_0^2} \\ \vdots & \vdots & \vdots \\ \dfrac{x_0 - x_n}{\rho_0^n} & \dfrac{y_0 - y_n}{\rho_0^n} & \dfrac{z_0 - z_n}{\rho_0^n} \end{bmatrix}, \quad X_0 = \begin{bmatrix} x_0 \\ y_0 \\ z_0 \end{bmatrix} \qquad (7) $$
In Equation (7), (x_n, y_n, z_n) refers to the coordinates of beacon n, d_n represents the measured distance between beacon n and the current location, and ρ^n_0 stands for the initial guess of this distance. Accordingly, the least-squares solution of the coordinate corrections is presented in Equation (8). The n × n matrix P is the weight matrix, which reflects the importance of each adopted distance. These distances are all derived from the related RSSI values (always negative), and we consider that an RSSI with a high value indicates high reliability. Thus, we set the initial P as a diagonal matrix whose diagonal elements are 1/RSSI².
$$ \hat{X} = (B^{T} P B)^{-1} B^{T} P\, l \qquad (8) $$
After obtaining the coordinate corrections, we can readily derive the coordinates of the current location by adding the corrections to the initial values (Equation (9)). The vector [x, y, z]^T indicates the final estimate of the current location.
$$ \begin{bmatrix} x \\ y \\ z \end{bmatrix} = X_0 + \hat{X} \qquad (9) $$
It should be noted that the calculation in Equation (8) is iterative. Considering the possible non-convergence of the iteration, we set a threshold on the iterated [x, y, z]^T as the termination condition. For instance, the iteration stops when two successive estimates differ by more than the threshold (e.g., 2 m). In this way, we ensure the iteration terminates and the derived coordinates are not exaggerated.
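The weighted iterative trilateration of Equations (7)–(9) can be sketched as below. The divergence guard and the 1/RSSI² weights follow the description above, while the convergence tolerance is an assumed value; in the experiments of Section 4 the z coordinate is fixed, which effectively reduces this to a 2-D solve.

```python
import numpy as np

def trilaterate(beacon_xyz, rssi, A, n, x0, max_iter=10, diverge_thresh=2.0):
    """Weighted iterative trilateration, a sketch of Equations (7)-(9).

    beacon_xyz: (m, 3) beacon coordinates; rssi: (m,) measured RSSI values (dBm);
    A, n: calibrated path-loss parameters; x0: initial coordinate guess (3,).
    """
    beacon_xyz = np.asarray(beacon_xyz, dtype=float)
    rssi = np.asarray(rssi, dtype=float)
    d = 10.0 ** ((A - rssi) / (10.0 * n))        # RSSI-derived ranges, Equation (5)
    P = np.diag(1.0 / rssi ** 2)                 # nearer (stronger) beacons weigh more
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        diff = x - beacon_xyz                    # (m, 3)
        rho0 = np.linalg.norm(diff, axis=1)      # predicted ranges at the current guess
        B = diff / rho0[:, None]                 # design matrix of Equation (7)
        l = d - rho0                             # misclosure vector
        dx = np.linalg.solve(B.T @ P @ B, B.T @ P @ l)   # Equation (8)
        if np.linalg.norm(dx) > diverge_thresh:  # termination guard from the text (e.g., 2 m)
            break
        x += dx                                  # Equation (9)
        if np.linalg.norm(dx) < 1e-3:            # assumed convergence tolerance
            break
    return x
```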
Fingerprinting. Fingerprinting is another RSS-based localization technique adopted in this paper. This method is introduced in Section 2. Here, we aim to use the NLOS information in the database (radio map) to correct trilateration results. We select a deterministic matching method for positioning, that is, the comparison of measured RSSI vector and reference RSSI vector to rank the correlation degree.
The implementation of fingerprinting consists of several steps. First, in the offline phase, we set up predefined reference points to collect the database (i.e., the RSSI vector of the iBeacons at each point). Second, in the online phase, we compare the current RSSI vector with the database and calculate the location coordinates by the weighted K-nearest-neighbor (WKNN) method. As the RSS values of an iBeacon always fluctuate, we do not select the minimum Euclidean distance between RSSI vectors as the primary criterion for location matching: this criterion is prone to wrong estimates due to data noise and the sensor differences between smartphones.
$$ \cos(\theta) = \frac{V_i \cdot V_j}{\lVert V_i \rVert\,\lVert V_j \rVert} \qquad (10) $$
More specifically, we use cosine similarity to measure the closeness of RSSI vectors with respect to locations (Equation (10)). When two RSSI vectors have the same size, their cosine similarity is calculated from their dot product and norms (‖V_i‖ and ‖V_j‖). In this way, we can check whether the trends of the RSSI vectors are consistent, and include the similar RSSI vectors to reckon the location. Compared to the minimum Euclidean distance, cosine similarity is less sensitive to RSS variations. It provides more candidates around the genuine location and thus avoids outliers that would otherwise be selected by the minimum Euclidean distance. In addition, distinct smartphone sensors often receive different RSS values from the same iBeacon; in this case, we can perceive the overall difference via the cosine similarity between the RSSI vector of a phone and that of the database.
As the values of cosine similarity have been normalized, the largest value relates to the maximum weight. We calculate the final coordinates with k candidate points for the current location by WKNN (see Equation (11)). In this way, we can obtain the positioning result from the fingerprinting method.
$$ x = \sum_{i=1}^{k} w_i x_i, \qquad y = \sum_{i=1}^{k} w_i y_i, \qquad w_i = \frac{cs_i}{\sum_{j=1}^{k} cs_j} \qquad (11) $$
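A minimal sketch of this matching step is given below. It assumes the measured vector and the radio-map vectors list the beacons in the same order; handling of missing beacons is omitted, and `k = 3` is an illustrative choice.

```python
import numpy as np

def fingerprint_wknn(rssi_vec, ref_points, ref_vectors, k=3):
    """WKNN fingerprinting with cosine similarity, a sketch of Equations (10)-(11).

    rssi_vec: current RSSI vector (same beacon order as the radio map).
    ref_points: (N, 2) coordinates of the reference points.
    ref_vectors: (N, B) RSSI vectors of the radio map.
    """
    v = np.asarray(rssi_vec, dtype=float)
    refs = np.asarray(ref_vectors, dtype=float)
    pts = np.asarray(ref_points, dtype=float)
    # Cosine similarity between the measured vector and each reference vector.
    cs = refs @ v / (np.linalg.norm(refs, axis=1) * np.linalg.norm(v))
    top = np.argsort(cs)[-k:]            # k most similar reference points
    w = cs[top] / cs[top].sum()          # normalized weights, Equation (11)
    return w @ pts[top]                  # weighted (x, y) of the current location
```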
Fusion of the two methods. The results derived from trilateration and fingerprinting represent real-time and pre-collected information, respectively. We fuse them to obtain a better location estimate and compensate for the ranging errors of iBeacon RSS. Here, we consider the constraint of a pedestrian's motion scope, which helps to generate reasonable location estimates. In a continuous movement, the last location correlates with the following one, so we limit the motion to a reasonable scope in terms of the computational frequency of the trilateration positioning. For example, 1 m is the scope of the next step for a pedestrian when the frequency of trilateration is 1 Hz; in other words, as the rate of the BLE-derived locations is 1 Hz, we set the stride threshold to 1 m/s. After the location estimates of the two methods are generated, we compute their distances to the last location. Assuming the resulting distance of trilateration is d_1 and that of fingerprinting is d_2, their fusion weights are listed in Equation (12). A long distance receives a small weight when both distances are equal to or larger than 1 m; in contrast, a long distance gains a large weight when d_1 and d_2 are both shorter than 1 m. Based on the coordinates derived by the two methods, we employ the weights of Equation (12) to obtain the weighted coordinates of the current location. The fused location estimate is more continuous and involves fewer 'jumps'.
$$ w_1 = \frac{d_2}{d_1 + d_2},\quad w_2 = \frac{d_1}{d_1 + d_2}, \qquad d_1 \ge 1,\ d_2 \ge 1; \qquad\qquad w_1 = \frac{d_1}{d_1 + d_2},\quad w_2 = \frac{d_2}{d_1 + d_2}, \qquad d_1 < 1,\ d_2 < 1 \qquad (12) $$
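The sketch below applies Equation (12). The branch for the mixed case (one distance inside and one outside the 1 m scope) is not specified in the text, so keeping the in-scope estimate there is our own assumption.

```python
import numpy as np

def fuse_ble(last_xy, tri_xy, fp_xy):
    """Weight and combine the trilateration and fingerprinting estimates (Equation (12)).

    A 1 m motion scope per BLE epoch is assumed, matching the 1 Hz rate in the text.
    """
    last_xy, tri_xy, fp_xy = map(np.asarray, (last_xy, tri_xy, fp_xy))
    d1 = np.linalg.norm(tri_xy - last_xy)        # displacement of the trilateration result
    d2 = np.linalg.norm(fp_xy - last_xy)         # displacement of the fingerprinting result
    s = d1 + d2
    if s == 0:
        return tri_xy
    if d1 >= 1.0 and d2 >= 1.0:                  # both outside the scope:
        w1, w2 = d2 / s, d1 / s                  # the farther estimate gets the smaller weight
    elif d1 < 1.0 and d2 < 1.0:                  # both within the scope:
        w1, w2 = d1 / s, d2 / s                  # the farther estimate gets the larger weight
    else:
        # Mixed case not covered by Equation (12); assumption: keep the in-scope estimate.
        w1, w2 = (1.0, 0.0) if d1 < d2 else (0.0, 1.0)
    return w1 * tri_xy + w2 * fp_xy
```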

3.3. PDR on Smartphone

PDR is a lightweight independent positioning method which can provide continuous location estimates. It requires no specific indoor positioning infrastructure (e.g., BLE beacons or WiFi APs), but it needs an absolute location as the initial state. Figure 2 shows the coordinate calculation of each step. The coordinates of location P_k are computed by the simple principle shown in Equation (13). Basically, three problems need to be solved for PDR: (1) counting steps; (2) estimating stride length; and (3) detecting orientation. We use a smartphone running the Android operating system (Android phone for short), where sensor information is easily accessed. Three sensors are adopted for the PDR computation: the gyroscope, accelerometer, and magnetic sensor. As shown in Figure 3, their readings refer to three rotations, i.e., the X-axis (roll), Y-axis (pitch), and Z-axis (yaw).
The Android platform provides a default orientation reading derived from the accelerometer and magnetic sensor data, and we employ this Android orientation in the PDR computation. However, the orientation error accumulates fast due to the low accuracy of the phone's sensors; in particular, the heading orientation varies sharply after turns. We therefore adopt the gyroscope to detect turning locations and separate a pedestrian's trajectory according to the turns.
Given a starting point, PDR can be conducted by measuring the heading orientation and stride length at each step. Figure 2 and Equation (13) present this simple computation process. At the step k, its coordinates ( x k , y k ) are the cumulative result of the previous k-1 steps. In the following part, we introduce the means for step detection and orientation correction.
$$ x_k = x_0 + \sum_{n=1}^{k} d_n \sin\theta_n, \qquad y_k = y_0 + \sum_{n=1}^{k} d_n \cos\theta_n \qquad (13) $$
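Equation (13) amounts to the following accumulation. In this sketch, headings are taken as radians measured from the y axis, matching the sine/cosine roles above; the step values in the example call are illustrative only.

```python
import numpy as np

def pdr_track(start_xy, strides, headings):
    """Accumulate PDR positions following Equation (13), a minimal sketch.

    strides: per-step lengths d_n (m); headings: per-step orientations theta_n
    (radians, measured from the y axis so that theta = 0 moves along +y).
    """
    x, y = start_xy
    track = [(x, y)]
    for d, theta in zip(strides, headings):
        x += d * np.sin(theta)
        y += d * np.cos(theta)
        track.append((x, y))
    return track

# Two 0.7 m steps straight ahead, then one step after a 90-degree right turn:
print(pdr_track((0.0, 0.0), [0.7, 0.7, 0.7], [0.0, 0.0, np.pi / 2]))
```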
Here, we leverage the accelerometer in the Android phone to count steps [25]. First, we compute the vector module of the accelerometer readings in the three directions (X, Y, and Z). These module values form a waveform over time (Figure 4). A step is detected by checking two peaks which meet the condition of time difference (e.g., longer than 0.2 s); meanwhile, the difference between peak and valley should be greater than a tolerance. To filter noise in the waveform, a peak is confirmed only when a continuous rise is detected at least twice.
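A sketch of this peak-based step detector is shown below. The 0.2 s spacing and the two-rise confirmation follow the text, while the acceleration tolerance is an assumed value.

```python
import numpy as np

def count_steps(acc_xyz, timestamps, min_interval=0.2, tolerance=1.5):
    """Peak-based step counting on the accelerometer magnitude, a sketch.

    acc_xyz: (N, 3) accelerometer samples; timestamps: (N,) seconds.
    tolerance is the assumed minimum peak-valley difference (m/s^2).
    """
    mag = np.linalg.norm(np.asarray(acc_xyz, dtype=float), axis=1)
    steps = 0
    last_peak_t = -np.inf
    valley = mag[0]                 # lowest magnitude seen since the last accepted peak
    rises = 0
    for i in range(1, len(mag)):
        if mag[i] > mag[i - 1]:
            rises += 1
        else:
            # A falling sample after >= 2 consecutive rises marks a candidate peak at i-1.
            if (rises >= 2 and mag[i - 1] - valley > tolerance
                    and timestamps[i - 1] - last_peak_t > min_interval):
                steps += 1
                last_peak_t = timestamps[i - 1]
                valley = mag[i]
            valley = min(valley, mag[i])
            rises = 0
    return steps
```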
Moreover, we adopt a simple but effective linear model to determine stride length [24]. The computational cost of this method is low and it is suitable for real-time positioning. In this model, we only need to fix the appropriate variables of a and b, and then stride length L can be decided (Equation (14)). Here, the parameter f stands for the current step frequency.
$$ L = a\,f + b \qquad (14) $$
The most important part of PDR is to obtain the correct orientation: according to Equation (13), a correct direction ensures accurate coordinates. As mentioned above, the accuracy of the orientation measurement is limited by the smartphone sensors. Figure 5 gives an example of orientation deviation after a turn. Although the real-time orientation reading can vary sharply, in most cases a pedestrian walks in a relatively stable direction and pace before making a turn. Thus, the user's current movement before the next turn is considered a linear motion.
To curb the accumulative error of orientation, we divide a pedestrian's trajectory into linear motions according to turn locations. First, we detect turns by checking peaks and valleys in the gyroscope data (Figure 6). We compute the modulus of each gyroscope reading vector (g_x, g_y, g_z) and locate the exceptional peaks representing turns (see Figure 6). A tolerance is set for the gyroscope data to detect turn locations accurately, since large angular rates always occur at turns; in the other cases, the gyroscope data fluctuate in a small range around zero. In this way, we separate a pedestrian's movement into segments by the detected turns.
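The turn detection can be sketched as a threshold test on the gyroscope magnitude; the rate threshold below is an assumed value, not the tolerance used in the experiments.

```python
import numpy as np

def detect_turn_indices(gyro_xyz, rate_threshold=1.0):
    """Locate turns as peaks in the gyroscope magnitude, a sketch.

    gyro_xyz: (N, 3) angular-rate samples (rad/s). Outside turns the magnitude
    fluctuates near zero, so local maxima above the threshold are taken as turns.
    """
    g = np.linalg.norm(np.asarray(gyro_xyz, dtype=float), axis=1)
    turns = []
    for i in range(1, len(g) - 1):
        if g[i] > rate_threshold and g[i] >= g[i - 1] and g[i] >= g[i + 1]:
            turns.append(i)
    return turns   # indices used to cut the trajectory into linear segments
```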
As the motion in each segment is linear, we use a KF to alleviate the accumulative error of orientation. To improve the orientation in each segment, we set up the prediction equations of the KF, whose state includes the heading orientation (θ) and the angular rate (φ) (see Equation (15)). X̂_k⁻ is the predicted state vector at time k, P_k⁻ is the predicted covariance matrix at time k, and Q represents the stochastic (process) error. Δt indicates the time interval between updates. As this filter is applied to a linear motion, the angular rate at the last moment can be regarded as equal to the current one, and the predicted θ is the last θ plus the angle change calculated from Δt and φ_{k−1}. The state vector X̂_k is then computed by applying the update equations of this KF, which yields the filtered heading orientation θ_k at time k. As the motions of two segments have different orientations, the covariance matrix P_k is reset to an identity matrix after each turn, which avoids the influence of the last segment on the next one.
$$ \hat{X}_k^{-} = \begin{bmatrix} \theta_k \\ \varphi_k \end{bmatrix} = F_k \hat{X}_{k-1} = \begin{bmatrix} 1 & \Delta t \\ 0 & 1 \end{bmatrix} \begin{bmatrix} \theta_{k-1} \\ \varphi_{k-1} \end{bmatrix}, \qquad P_k^{-} = F_k P_{k-1} F_k^{T} + Q \qquad (15) $$
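A compact sketch of this per-segment filter follows. Treating both the Android orientation and the gyroscope rate as direct measurements (so that H = I), and the values of Q and R, are our own assumptions rather than the paper's tuned settings.

```python
import numpy as np

class HeadingKF:
    """Per-segment Kalman filter on heading and angular rate, a sketch of Equation (15)."""

    def __init__(self, theta0, phi0, q=0.01, r=0.1):
        self.x = np.array([theta0, phi0], dtype=float)   # state [theta, phi]
        self.P = np.eye(2)                               # reset to identity at every turn
        self.Q = q * np.eye(2)                           # assumed process noise
        self.R = r * np.eye(2)                           # assumed measurement noise

    def reset(self, theta0, phi0):
        """Called after a detected turn: new linear segment, covariance reset."""
        self.x = np.array([theta0, phi0], dtype=float)
        self.P = np.eye(2)

    def step(self, theta_meas, phi_meas, dt):
        F = np.array([[1.0, dt], [0.0, 1.0]])
        # Prediction (Equation (15)).
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q
        # Update with the measured heading and angular rate (H = I assumed).
        z = np.array([theta_meas, phi_meas])
        K = self.P @ np.linalg.inv(self.P + self.R)
        self.x = self.x + K @ (z - self.x)
        self.P = self.P - K @ self.P
        return self.x[0]                                 # filtered heading theta_k
```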

3.4. Fusion Method

Both the BLE-derived locations and the PDR results have pros and cons. A pedestrian's trajectory derived by the BLE-based approach may still contain inconsistencies, whereas the locations generated by PDR are more continuous. However, PDR needs absolute coordinates as the starting location, and it cannot yield accurate locations over the long term due to the drift of orientation. Thus, we combine the two types of approaches for real-time indoor positioning, and a KF is employed to fuse them and yield more accurate trajectories.
The real-time BLE-derived locations provide the initial state for the PDR method and can correct the drift of the PDR result when the trajectory accidentally leaves the current room. Considering the linear motion in each segment of the user trajectory, the last stride length is theoretically equal to the current one. We build the prediction equations of the Kalman filter in Equations (16) and (17). The matrix Q in Equation (17) stands for the covariance matrix of the process noise, and P_k⁻ is the covariance matrix predicted at time k.
$$ \hat{X}_k^{-} = \begin{bmatrix} x_k \\ y_k \\ d_k \end{bmatrix} = F_k \hat{X}_{k-1} = \begin{bmatrix} 1 & 0 & \cos\theta_{k-1} \\ 0 & 1 & \sin\theta_{k-1} \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_{k-1} \\ y_{k-1} \\ d_{k-1} \end{bmatrix} \qquad (16) $$

$$ P_k^{-} = F_k P_{k-1} F_k^{T} + Q \qquad (17) $$
According to Kalman filtering, the update equations of this KF are presented in Equations (18)–(20). The matrix H_k in Equation (18) is an identity matrix, and Z_k refers to the BLE-derived location at time slot k. The matrix R represents the covariance matrix of the measurement noise (Equation (20)). After the Kalman gain K is calculated by Equation (20), the state vector X̂_k and the covariance matrix P_k can be updated; X̂_k contains the fused coordinates (x_k, y_k) and the stride length d_k.
$$ \hat{X}_k = \hat{X}_k^{-} + K\,(Z_k - H_k \hat{X}_k^{-}) \qquad (18) $$

$$ P_k = P_k^{-} - K H_k P_k^{-} \qquad (19) $$

$$ K = P_k^{-} H_k^{T} \left( H_k P_k^{-} H_k^{T} + R \right)^{-1} \qquad (20) $$
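The prediction-update cycle of Equations (16)–(20) can be sketched as follows. Since H is the identity, the measurement vector here is taken as the BLE-derived coordinates plus the PDR stride length, and Q and R are assumed diagonal covariances rather than the paper's tuned values.

```python
import numpy as np

class FusionKF:
    """Kalman filter fusing BLE locations with PDR, a sketch of Equations (16)-(20)."""

    def __init__(self, x0, y0, stride0, q=0.05, r=1.0):
        self.x = np.array([x0, y0, stride0], dtype=float)   # state [x, y, d]
        self.P = np.eye(3)
        self.Q = q * np.eye(3)                               # assumed process noise
        self.R = r * np.eye(3)                               # assumed measurement noise

    def step(self, theta_prev, ble_xy, pdr_stride):
        # Prediction (Equations (16)-(17)): advance one stride along the last heading.
        F = np.array([[1.0, 0.0, np.cos(theta_prev)],
                      [0.0, 1.0, np.sin(theta_prev)],
                      [0.0, 0.0, 1.0]])
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q
        # Update (Equations (18)-(20)) with H = I.
        z = np.array([ble_xy[0], ble_xy[1], pdr_stride])
        K = self.P @ np.linalg.inv(self.P + self.R)
        self.x = self.x + K @ (z - self.x)
        self.P = self.P - K @ self.P
        return self.x[:2]                                    # fused (x, y)
```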
The BLE-based approach may have a lower output frequency than PDR due to the RSSI data scanning, and thus the locations generated by this fusion algorithm are used periodically (e.g., every 10 s) to correct the drifting error of PDR. In addition, we introduce a simple geometric restriction to correct location estimates, i.e., the boundary of the space/room; this correction is triggered if a location estimate falls unreasonably outside the room. To ensure the proposed algorithm can be applied to general navigation scenarios, we do not use any map matching in this fusion algorithm.

4. Experiment

Experiments were conducted on the fifth floor of the College of Surveying and Geo-informatics, Tongji University (Figure 7). The floor area is around 44 m × 17 m, and the minimum width of the corridor is 1.7 m. We adopted a Galaxy S6 smartphone and its sensors for positioning. During the experiments, the smartphone was held horizontally in both steady and swaying states. The sampling rates of the gyroscope and accelerometer were both 5.5 Hz, and the computational rate of the BLE-based positioning was set to 1 Hz. We developed a smartphone application (app) to collect the sensor data, implement our fusion positioning algorithm, and visualize the positioning results on the floor plan.
The main purpose of these experiments was to investigate the positioning accuracy of the proposed fusion workflow, especially for the cases with sharp turns. In addition, we considered the influence on accuracy of two different errors. The first one is about inaccurate initial positions, and the second one involves frequently disturbed heading orientations (yaw) during walking.
Therefore, we designed two paths (see Figure 8) including multiple U-turns (180°) and 90° turns. The first route has two U-turns and one left turn, and the second is a round-trip route in the corridor with three U-turns.
First, we implemented the BLE-based positioning method fusing trilateration and fingerprinting of iBeacon. Only 10 BLE beacons were deployed; they were set within an effective range (no more than 10 m) but not too close to each other. As these beacons were mounted at similar heights (around 1.3 m above the floor), we fixed the z value in Equation (7). To implement the fingerprinting method, we set up reference points at regular intervals to collect the radio map (Figure 9a). We sampled RSSI data at each point at a rate of 1 Hz. As mentioned above, we used cosine similarity for localization (Equation (10)).
To obtain the ranging parameters of RSSI for trilateration (see Equation (5)), we selected four points and collected RSSI data (Figure 9b). The links between the four stations and iBeacons included NLOS cases. The two parameters A and n were computed with least squares adjustment (see Equation (6)). The calculated values are −61.94 and 1.36 for A and n, respectively.
Figure 10 shows the BLE-derived locations of the two routes, and the positioning accuracy is presented in Table 1. As the ground truth is known, we computed the root-mean-square error (RMSE) for the two routes. We ran the BLE-based localization three times for each route and averaged the results of the six trials to calculate the total RMSE. The RMSEs of Routes 1 and 2 are 2.71 and 2.77 m, respectively. The column 'RMSE X' shows the major error (2.57 and 2.69 m) lies along the X axis (i.e., the heading direction). The positioning error mainly comes from the ranging errors introduced by RSS fluctuations: in this corridor, the distances between the iBeacons and the smartphone are relatively short in the Y direction, so the ranging errors in the X direction are much larger. We finally obtained an overall RMSE of 2.75 m for the BLE-based method, which is a relatively high accuracy for independent BLE positioning.
Table 2 presents the separate RMSEs of trilateration and fingerprinting. Compared with Table 1, one can see the accuracy improvement of the proposed BLE-based method.
The next test was for the proposed fusion positioning workflow. As mentioned above, we considered two errors for real-time tracking: an inaccurate initial position and disturbed orientations. Figure 11 presents the positioning result of Route 1 with an inaccurate initial position whose error is 2.30 m. Figure 11a,b presents the raw PDR locations and those with filtered orientations, respectively. There is a certain accuracy improvement around turn locations; however, the orientation filtering still cannot eliminate the systematic drifting caused by the initial position and the directional measurements. In contrast, the final trajectory overcomes both types of error (Figure 11c).
Figure 12 presents the real-time trajectories when the smartphone was randomly swayed between the left and right directions. Figure 12a shows the orientation noise severely influences the positioning accuracy: the angles at all the turn locations are incorrect. The orientations and the related location estimates are improved in Figure 12b, although the error is still considerable. The location accuracy is largely improved by the proposed fusion workflow, and Figure 12c shows that this orientation noise can be alleviated by the result of the BLE-based method.
Figure 13 presents the tracking result of Route 2 with an inaccurate initial position whose error is 2.03 m. Figure 13a presents an obvious systematic error in the absolute coordinates. Similarly, orientation filtering provides local corrections (e.g., more overlapping segments), although the systematic error still exists (Figure 13b). In Figure 13c, the systematic drifting is removed in the beginning phase, and the four segments of the trajectory divided by the three U-turns highly overlap with the ground truth, which shows the proposed fusion workflow can deal with the changes at turns as planned.
When the smartphone is swayed, the PDR computation becomes error-prone because of the noise of the orientation measurement. The first segment of the trajectory in Figure 14a overshoots the turn point, which deteriorates the precision of the following location estimates. The proposed fusion method corrects most PDR locations back to the corridor area (Figure 14c). The improvement of accuracy also validates the use of the BLE-based method against severe deviations of PDR.
The above results demonstrate the BLE-based method can compensate for the drifting error of PDR, and their fusion can improve positioning accuracy. For instance, the accuracy in the inaccurate-initial-position cases is slightly higher than that of the independent BLE positioning. With disturbed orientations, the proposed fusion method can resist the noisy measurements and still obtain a stable accuracy. Table 3 lists the RMSEs of the four cases; we ran the proposed fusion workflow four times for each route and calculated the RMSE by averaging the test results. The accuracy is between 2 and 4 m. Similar to the results in Table 1, the main error occurs in the heading direction (column 'RMSE X (m)').
A conclusion of the experimental results is that the orientation filter cannot independently improve location accuracy of the PDR. Although it can partially improve the direction and location estimates, it cannot reduce systematic errors. Instead, the proposed fusion positioning workflow can curb systematic errors.

5. Discussions

In this paper, we focus on the accuracy of real-time positioning, especially for cases with sharp turns. The accuracy of the location estimate of our trilateration method is limited by the fluctuating RSS values. Figure 15 presents the RSSI distributions of two BLE beacons at the same location. The most-frequent RSSI of both beacons is −71 dBm, but the deviation of individual RSS values can exceed 10 dBm (e.g., −84 dBm). Supposing the most-frequent RSSI corresponds to the distance d̃, the resulting ranging error can exceed 20 m when d̃ reaches 10 m (see Equation (5)); this ranging error has to be alleviated for indoor positioning.
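A rough worked example with the path-loss parameters calibrated in Section 4 (A = −61.94 dBm, n = 1.36) illustrates the scale of this effect, assuming a 10 dBm downward fluctuation:

$$ \mathrm{RSSI}(10\,\mathrm{m}) = A - 10\,n\,\lg(10) = -61.94 - 13.6 \approx -75.5\ \mathrm{dBm}, \qquad d' = 10^{\frac{-61.94-(-85.5)}{13.6}} \approx 54\ \mathrm{m}, $$

so a single 10 dBm dip at d̃ = 10 m already inflates the estimated range by more than 40 m, well beyond the 20 m figure cited above.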
Our solution is to fuse the trilateration result with RSSI fingerprints to compensate for this error, which allows us to compute BLE-based locations with a relatively high accuracy. To prevent the location estimates from 'jumping' dramatically, we average the RSSI values in each interval (e.g., 1 s) and run the trilateration algorithm iteratively (Equations (7) and (8)) to obtain reasonable estimates. The fusion of trilateration and fingerprinting effectively curbs the divergence of the location error (see Equation (12)).
The BLE-based method provides the baseline accuracy for the whole fusion positioning approach. A related study [12] comparing the positioning accuracy of iBeacons and WiFi APs shows that WiFi APs outperform iBeacons in fingerprinting positioning accuracy: the WiFi positioning error is within 5 m for 90% of the time, whereas the iBeacon error is within 5 m for less than 70% of the time (Figure 16a).
We generated the cumulative distribution function (CDF) of our BLE-based positioning method (Figure 16b). It shows that the positioning error is within 4.50 m for 90% of the time. This result indicates our BLE-based method boosts the positioning accuracy of iBeacon to a level comparable with WiFi positioning.
We also provide CDF graphs for the four cases (Figure 17). The 90% CDF values are listed in Table 4. For 90% of the time, the errors of the two inaccurate-initial-position cases are within 3.78 and 3.54 m, respectively; they are better than the BLE-based method (4.50 m) when the smartphone is held steadily. The error introduced by an inaccurate initial position is promptly corrected, and the accuracy of the entire trajectory is stable (see Figure 11c, Figure 13c, and Figure 17). The influence of the turns is limited as well. The accuracy of real-time tracking is competitive with the fingerprinting methods of iBeacon or WiFi APs in the literature [11].
For the case of disturbed orientations, swaying the phone certainly enlarges the positioning error of PDR. Such errors have also been alleviated: the 90% errors of the two routes are 5.84 and 5.88 m, respectively (see Figure 17 and Table 4). The real-time tracking results are still not far from the actual motions (see Figure 12c and Figure 14c). In practice, we cannot expect the gyroscope in a smartphone to always be calibrated; the experiments with noisy headings confirm that our fusion positioning approach can be applied to such noisy situations.
The performance of the orientation filtering was also investigated. To check the angle change at each turn, we computed the differences between the headings around the turn locations. The angle differences at the three U-turns of Route 2 are listed in Table 5. In the case of an inaccurate initial position, the raw angle difference at the first U-turn (column '1st turn raw value') contains an obvious error (−208.206°), while the values after orientation filtering are close to the true values (the 'filtered value' columns). With disturbed orientations, the values after orientation filtering are still not accurate. This comparison indicates our orientation filtering method may not be valid when the smartphone is swayed.
From the test results in Figure 11b, Figure 12b, Figure 13b and Figure 14b, we find the orientation filter can only locally correct locations of each route segment. This orientation error is reduced when the PDR is conducted in a short range and frequently corrected by the BLE-based localization. The PDR enriches the route details during the tracking. Accordingly, ‘jumping’ cases are reduced as well (see Figure 11c).
Regarding the PDR algorithm, we also compared the accuracy of step counting between our method and the Android built-in step counter. Table 6 presents the average of four tests for each route. The step counts of our PDR algorithm are more accurate than those of the built-in counter. There are only slight differences between our counts and the actual steps, and these counts are sufficient for generating PDR results at a high accuracy.

6. Conclusions

This paper proposes a low-cost and real-time approach of indoor positioning based on iBeacon and PDR. It consists of several parts: (1) the BLE-based method of iBeacon, which fuses trilateration and fingerprinting; (2) a PDR method that considers filtering on heading orientation; and (3) the fusion approach of iBeacon and PDR to correct ‘jumping’ positions and PDR drifting error.
This approach was implemented on an Android smartphone, and we conducted real-time positioning experiments in a building. Real-time tracking tests were conducted on routes with sharp turns, and we considered errors in the initial positions and heading orientations. The experiments show the positioning accuracy of our BLE-based method is an RMSE of 2.75 m. The tests with inaccurate initial positions on the two routes result in RMSEs of 2.39 and 2.22 m, respectively, while the tests with disturbed orientations result in RMSEs of 3.48 and 3.66 m. The results demonstrate our fusion method can improve the accuracy of real-time trajectories and alleviate the influence of inaccurate initial positions, systematic drifting, and orientation noise. The BLE-based positioning method provides the baseline accuracy, and the PDR method smooths the real-time trajectories and enhances their continuity.
In the future, we plan to investigate the RSSI fluctuation of iBeacons and improve their ranging accuracy. In addition, we will conduct more experiments in larger indoor spaces with sparse iBeacon configurations, and investigate the compensation of ranging errors over longer distances. As a smartphone can be held in different positions (horizontally or vertically, in hand or in a pocket), and these positions may lead to wrong attitude measurements for PDR, we will further develop the PDR algorithm to ensure accurate headings for different smartphone positions.
Thus far, we have adopted few restrictions for indoor positioning, since we intend to present the improvement obtained from the raw measurements of iBeacons and smartphone sensors. In future work, a specific human motion model could be introduced to predict the movement of pedestrians, and the transfer probability between different locations could be considered with map constraints. Other fusion approaches for indoor positioning (e.g., grid-based filters) will be investigated for real-time applications as well. Finally, different data sources, such as geomagnetic data, can be introduced for fusion.

Author Contributions

Conceptualization, L.L. and B.L.; Methodology, L.L., B.L., and L.Y.; Supervision, B.L.; Writing—Original draft, L.L., B.L., and T.L.; and Writing—Review and editing, L.Y. and T.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by National Natural Science Foundation of China (41874030), The Scientific and Technological Innovation Plan from Shanghai Science and Technology Committee (18511101801), The National Key Research and Development Program of China (2016YFB0501802), China Postdoctoral Science Foundation (grant number 2019M651581), and The Fundamental Research Funds for the Central Universities.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
BLE    Bluetooth Low Energy
KF     Kalman filter
NLOS   non-line-of-sight
RSS    received signal strength
RSSI   received signal strength indication
RMSE   root mean squared error
PDR    pedestrian dead reckoning
CDF    cumulative distribution function

References

1. Bilke, A.; Sieck, J. Using the Magnetic Field for Indoor Localisation on a Mobile Phone. In Progress in Location-Based Services; Krisp, J.M., Ed.; Springer: Berlin/Heidelberg, Germany, 2013; pp. 195–208.
2. Chen Ruizhi, C.L. Indoor Positioning with Smartphones: The State-of-the-art and the Challenges. Acta Geod. Cartogr. Sin. 2017, 46, 1316.
3. Pivato, P.; Palopoli, L.; Petri, D. Accuracy of RSS-Based Centroid Localization Algorithms in an Indoor Environment. IEEE Trans. Instrum. Meas. 2011, 60, 3451–3460.
4. Wan, J.; Yu, N.; Feng, R.; Wu, Y.; Su, C. Localization refinement for wireless sensor networks. Comput. Commun. 2009, 32, 1515–1524.
5. Deng, Z.A.; Guofeng, W.; Ying, H.; Di, W. Heading Estimation for Indoor Pedestrian Navigation Using a Smartphone in the Pocket. Sensors 2015, 15, 21518–21536.
6. Leppäkoski, H.; Collin, J.; Takala, J. Pedestrian Navigation Based on Inertial Sensors, Indoor Map, and WLAN Signals. J. Signal Process. Syst. 2013, 71, 287–296.
7. Li, Y.; Zhuang, Y.; Lan, H.; Zhou, Q.; Niu, X.; El-Sheimy, N. A hybrid WiFi/magnetic matching/PDR approach for indoor navigation with smartphone sensors. IEEE Commun. Lett. 2015, 20, 169–172.
8. Li, W.; Wei, D.; Lai, Q.; Li, X.; Yuan, H. Geomagnetism-Aided Indoor WiFi Radio-Map Construction via Smartphone Crowdsourcing. Sensors 2018, 18, 1462.
9. Kanaris, L.; Kokkinis, A.; Liotta, A.; Stavrou, S. Fusing bluetooth beacon data with WiFi radiomaps for improved indoor localization. Sensors 2017, 17, 812.
10. Fard, H.K.; Chen, Y.; Son, K.K. Indoor positioning of mobile devices with agile iBeacon deployment. In Proceedings of the 2015 IEEE 28th Canadian Conference on Electrical and Computer Engineering (CCECE), Halifax, NS, Canada, 3–6 May 2015; pp. 275–279.
11. Faragher, R.; Harle, R. Location Fingerprinting with Bluetooth Low Energy Beacons. IEEE J. Sel. Areas Commun. 2015, 33, 2418–2428.
12. Yang, L.; Li, B.; Li, H.; Shen, Y. iBeacon/WiFi Signal Characteristics Analysis for Indoor Positioning Using Mobile Phone. In China Satellite Navigation Conference (CSNC) 2017 Proceedings; Sun, J., Liu, J., Yang, Y., Fan, S., Yu, W., Eds.; Springer: Singapore, 2017; Volume 1, pp. 405–416.
13. Chen, Z.; Zhu, Q.; Soh, Y.C. Smartphone inertial sensor-based indoor localization and tracking with iBeacon corrections. IEEE Trans. Ind. Inform. 2016, 12, 1540–1549.
14. Jenny, R.; Peilin, Z.; Mohamed, A.; Oliver, T. An Improved BLE Indoor Localization with Kalman-Based Fusion: An Experimental Study. Sensors 2017, 17, 951.
15. Cho, S.Y. Localization of the arbitrary deployed APs for indoor wireless location-based applications. IEEE Trans. Consum. Electron. 2010, 56, 532–539.
16. Yang, Z.; Liu, Y. Quality of Trilateration: Confidence-Based Iterative Localization. IEEE Trans. Parallel Distrib. Syst. 2010, 21, 631–640.
17. Mirowski, P.; Milioris, D.; Whiting, P.; Ho, T.K. Probabilistic radio-frequency fingerprinting and localization on the run. Bell Labs Tech. J. 2014, 18, 111–133.
18. Pelant, J.; Tlamsa, Z.; Benes, V.; Polak, L.; Kaller, O.; Bolecek, L.; Kufa, J.; Sebesta, J.; Kratochvil, T. BLE device indoor localization based on RSS fingerprinting mapped by propagation modes. In Proceedings of the 27th International Conference Radioelektronika (RADIOELEKTRONIKA), Brno, Czech Republic, 19–20 April 2017; pp. 1–5.
19. Faragher, R.; Harle, R. An analysis of the accuracy of bluetooth low energy for indoor positioning applications. In Proceedings of the 27th International Technical Meeting of the Satellite Division of the Institute of Navigation, ION GNSS 2014, Tampa, FL, USA, 8–12 September 2014; Volume 1, pp. 201–210.
20. Chen, J.; Zhang, Y.; Xue, W. Unsupervised Indoor Localization Based on Smartphone Sensors, iBeacon and WiFi. Sensors 2018, 18, 1378.
21. Yan, L.; Hoeber, O.; Chen, Y. Enhancing WiFi fingerprinting for indoor positioning using human-centric collaborative feedback. Hum. Centric Comput. Inf. Sci. 2013, 3, 2.
22. Woo, S.; Jeong, S.; Mok, E.; Xia, L.; Choi, C.; Pyeon, M.; Heo, J. Application of WiFi-based indoor positioning system for labor tracking at construction sites: A case study in Guangzhou MTR. Autom. Constr. 2011, 20, 3–13.
23. Hossain, A.M.; Soh, W.S. A survey of calibration-free indoor positioning systems. Comput. Commun. 2015, 66, 1–13.
24. Li, F.; Zhao, C.; Ding, G.; Gong, J.; Liu, C.; Zhao, F. A Reliable and Accurate Indoor Localization Method Using Phone Inertial Sensors. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing, UbiComp '12, New York, NY, USA, 5–8 September 2012; pp. 421–430.
25. Qian, J.; Ma, J.; Ying, R.; Liu, P.; Ling, P. An improved indoor localization method using smartphone inertial sensors. In Proceedings of the International Conference on Indoor Positioning and Indoor Navigation, Montbeliard-Belfort, France, 28–31 October 2013; pp. 1–7.
26. Zhou, Y.; Zheng, X.; Xiong, H.; Chen, R. Robust Indoor Mobile Localization with a Semantic Augmented Route Network Graph. ISPRS Int. J. Geo-Inf. 2017, 6, 221.
27. Park, J.; Chen, J.; Cho, Y.K. Self-corrective knowledge-based hybrid tracking system using BIM and multimodal sensors. Adv. Eng. Inf. 2017, 32, 126–138.
28. Sheng, G.; Hanjiang, X.; Xianwei, Z.; Yan, Z. Activity Recognition and Semantic Description for Indoor Mobile Localization. Sensors 2017, 17, 649.
29. Hafner, P.; Moder, T.; Wieser, M.; Bernoulli, T. Evaluation of smartphone-based indoor positioning using different Bayes filters. In Proceedings of the International Conference on Indoor Positioning and Indoor Navigation, Montbeliard-Belfort, France, 28–31 October 2013; pp. 1–10.
30. Akl, R.; Tummala, D.; Li, X. Indoor propagation modeling at 2.4 GHz for IEEE 802.11 networks. In Proceedings of the Sixth IASTED International Multi-Conference on Wireless and Optical Communications: Conference on Wireless Networks and Emerging Technologies, Banff, AB, Canada, 3–5 July 2006.
Figure 1. The data fusion workflow of Bluetooth Low Energy (BLE) positioning and pedestrian dead reckoning (PDR) locations.
Figure 2. Coordinate calculation of each step in PDR.
Figure 3. Axes of the Android phone.
Figure 4. The vector module of accelerometer readings during walking.
Figure 5. Heading orientation deviation after a turn.
Figure 6. Detection of turns in gyroscope data.
Figure 7. The test floor.
Figure 8. Two routes in the testbed. The red dot indicates the starting location and the black lines represent the ground truth: (a) Route 1 contains two U-turns and one left turn; and (b) Route 2 includes three U-turns.
Figure 9. Reference points for fingerprinting: (a) points for fingerprinting in the offline phase; and (b) four points for the adjustment of ranging parameters.
Figure 10. The BLE-derived locations. The black lines indicate the ground truth: (a) Route 1; and (b) Route 2.
Figure 11. Trajectory of Route 1 with an inaccurate initial position, where red lines represent the trail of the user and black lines indicate the ground truth: (a) PDR locations; (b) PDR locations after orientation filtering; and (c) the path derived from the fusion workflow.
Figure 12. Trajectory of Route 1 when the heading was randomly swayed: (a) PDR locations; (b) PDR locations after orientation filtering; and (c) the path derived from the fusion workflow.
Figure 13. Trajectory of Route 2 with an inaccurate initial position: (a) PDR locations; (b) PDR locations after orientation filtering; and (c) the path derived from the fusion workflow.
Figure 14. Trajectory of Route 2 when the heading was randomly swayed: (a) PDR locations; (b) PDR locations after orientation filtering; and (c) the path derived from the fusion workflow.
Figure 15. RSSI histograms of two BLE beacons: (a) Beacon ‘AC:23:3F:20:8D:67’; and (b) Beacon ‘AC:23:3F:20:8D:69’.
Figure 16. Comparison of CDF between a previous study and this paper: (a) CDF of positioning accuracy of iBeacons and WiFi APs [12]; and (b) CDF of our BLE-based method.
Figure 17. CDF of both routes. Orange indicates the case of inaccurate initial positions, while blue represents the case of disturbed orientations: (a) Route 1; and (b) Route 2.
Table 1. Accuracy of the positioning method fusing trilateration and fingerprinting of iBeacon.

           RMSE XY (m)   RMSE X (m)   RMSE Y (m)   Mean Error X (m)   Mean Error Y (m)
Route 1    2.71          2.57         0.85         1.90               0.60
Route 2    2.77          2.69         0.66         2.06               0.50
All        2.75          2.65         0.74         2.00               0.53
Table 2. Positioning accuracy of trilateration and fingerprinting.

        Trilateration RMSE (m)   Fingerprinting RMSE (m)
All     3.42                     3.22
Table 3. Positioning accuracy of the four cases of Routes 1 and 2.

Case                              RMSE XY (m)   RMSE X (m)   RMSE Y (m)   Mean Error X (m)   Mean Error Y (m)
Route 1 – inaccurate start        2.39          2.27         0.77         1.82               0.64
Route 1 – disturbed orientation   3.48          3.34         0.98         2.58               0.80
Route 2 – inaccurate start        2.22          2.13         0.60         1.71               0.48
Route 2 – disturbed orientation   3.66          3.54         0.92         2.74               0.70
Table 4. CDF of the four cases.

              BLE    Route 1            Route 1                 Route 2            Route 2
                     Inaccurate Start   Disturbed Orientation   Inaccurate Start   Disturbed Orientation
CDF 90% (m)   4.50   3.78               5.84                    3.54               5.88
Table 5. Angle differences of heading in Route 2.

Case                        1st Turn Raw   1st Turn Filtered   1st Turn Truth   2nd Turn Raw   2nd Turn Filtered   2nd Turn Truth   3rd Turn Raw   3rd Turn Filtered   3rd Turn Truth
Inaccurate start (°)        −208.206       −199.900            −180             172.384        176.029             180              −179.480       −178.770            −180
Disturbed orientation (°)   −178.126       −185.673            −180             176.291        135.914             180              −152.557       −149.704            −180
Table 6. Step counts of our PDR algorithm and the Android API.

          Our Method   Android   Ground Truth
Route 1   104          89        102
Route 2   179          174       184
