Article

An Enhanced Indoor Three-Dimensional Localization System with Sensor Fusion Based on Ultra-Wideband Ranging and Dual Barometer Altimetry †

1 Department of Mechatronics Engineering, Hanyang University, Ansan 15588, Republic of Korea
2 Robotics Department, Hanyang University ERICA, Ansan 15588, Republic of Korea
* Author to whom correspondence should be addressed.
This paper is an extended version of the conference paper: Bao, L.; Li, K.; Li, W.; Shin, K.; Kim, W. A Sensor Fusion Strategy for Indoor Target Three-dimensional Localization based on Ultra-Wideband and Barometric Altimeter Measurements. In Proceedings of the 2022 19th International Conference on Ubiquitous Robots (UR), Jeju, Republic of Korea, 4–6 July 2022.
Sensors 2024, 24(11), 3341; https://doi.org/10.3390/s24113341
Submission received: 23 April 2024 / Revised: 12 May 2024 / Accepted: 21 May 2024 / Published: 23 May 2024

Abstract
Accurate three-dimensional (3D) localization within indoor environments is crucial for enhancing item-based application services, yet current systems often struggle with localization accuracy and height estimation. This study introduces an advanced 3D localization system that integrates updated ultra-wideband (UWB) sensors and dual barometric pressure (BMP) sensors. Utilizing three fixed UWB anchors, the system employs geometric modeling and Kalman filtering for precise tag 3D spatial localization. Building on our previous research on indoor height measurement with dual BMP sensors, the proposed system demonstrates significant improvements in data processing speed and stability. Our enhancements include a new geometric localization model and an optimized Kalman filtering algorithm, which are validated by a high-precision motion capture system. The results show that the localization error is significantly reduced, with height accuracy of approximately ±0.05 m, and the Root Mean Square Error (RMSE) of the 3D localization system reaches 0.0740 m. The system offers expanded locatable space and faster data output rates, delivering reliable performance that supports advanced applications requiring detailed 3D indoor localization.

1. Introduction

Indoor localization technology is essential for enabling precise personnel tracking and efficient robot navigation in complex environments [1,2,3]. Its applications significantly improve efficiency and convenience [4]. Indoor localization can be divided into two categories according to the size of the localization area. The first category addresses larger areas, such as halls or multi-story buildings [5,6]. This scenario often requires multiple anchor points as references [7] or multi-sensor fusion [8], and the localization requirements are meter-level planar localization and floor identification. The second category focuses on high-precision localization within a slightly smaller indoor area. Some existing technologies have already achieved indoor two-dimensional (2D) localization, for example, visual features [9], Light Detection and Ranging (LiDAR) [10,11], and wireless localization [12]. However, with the advancement of indoor three-dimensional (3D) localization sensor technology, emerging services such as indoor drone control [2,13], virtual reality (VR) [14,15], and augmented reality (AR) [16] demand higher localization accuracy. They also require more stringent height estimation and more frequent updates of information [17,18]. Additionally, while exploring the potential of these technologies, it is important to consider the cost factor [19] to ensure that technological solutions have the potential for widespread application. Based on our previous research [20], this study proposes an enhanced indoor 3D localization system, which fuses ultra-wideband (UWB) sensor ranging and barometric pressure (BMP) sensor height measurement.
Commonly used wireless sensors for ranging and localization include Wi-Fi, Bluetooth, and Radio Frequency Identification (RFID) [21]. They estimate the target’s location by measuring signal strength or time differences [1], with meter-level accuracy. Although they satisfy commercial and industrial needs to a certain extent, they still face challenges in positioning accuracy and stability in complex indoor environments. Additionally, visual positioning systems and LiDAR offer higher accuracy, but their implementation costs and demands on computational resources limit their potential for widespread application [9,10]. In the realm of indoor localization, UWB technology stands out for its high precision in distance measurement by transmitting ultra-short pulse signals [22]. This method achieves centimeter-level ranging accuracy, surpassing traditional Wi-Fi and Bluetooth technologies in both accuracy and interference resistance [3,23,24]. The target’s 3D location can be calculated from the distance measurements between the target and each UWB base station. However, despite the superior distance measurement capabilities of UWB, existing research indicates persistent errors in height estimation [13], underscoring the need for continued research to optimize localization algorithms and enhance system performance.
In contrast, barometric pressure sensors (BMPs) enhance height measurement accuracy [25]. The earliest mercury barometer was invented in the 17th century by physicist Evangelista Torricelli [26]. Thanks to ongoing technological advancements, modern electronic barometers [27], which convert barometric pressure into electrical signals, are widely used. These BMP sensors can precisely measure barometric pressure and output digital signals, thereby estimating the corresponding height. By measuring atmospheric pressure changes, BMP sensors provide critical data for adjusting the vertical positioning in indoor localization systems, which is often flawed in UWB-only setups. However, the barometric pressure value at a given location can vary over time [6]. Outdoors, the shifting of high- and low-pressure weather systems can alter barometric pressure [28]. Especially strong convective weather accelerates changes in barometric pressure, leading to increased height drift errors [29]. Additionally, changes in air temperature cause air expansion or contraction, resulting in decreased or increased barometric pressure. In indoor environments, factors such as the building’s sealing, ventilation conditions, and the temperature difference between indoors and outdoors predominantly influence barometric pressure [30]. The barometric pressure in well-sealed indoor environments is relatively stable. In contrast, the opening and closing of doors and windows, along with the ventilation system, can cause fluctuations in indoor barometric pressure, which limits the single BMP sensor’s performance for height estimation in indoor environments.
Current indoor localization systems often fail to deliver accurate height measurements, a gap our system addresses by integrating BMP sensors known for their reliable barometric pressure measurements. This integration enhances not only vertical but also horizontal localization precision. Based on our previous research [20], we have upgraded the hardware and software. By observing the trend of barometric pressure changes at different indoor locations over 30 min using BMP sensors, we confirmed once again that the drift of barometric pressure in the same indoor space tends to be consistent. In this study, we introduce an enhanced indoor 3D localization system. The barometric pressure values at the tag are transmitted to the main controller via Wi-Fi signals. The relative height between the tag and anchor is then calculated using derived formulas. The system’s main controller employs a dual-core processor with Wi-Fi capability, enhancing the rate and stability of data transfer and processing. The height measurements obtained from the dual BMP sensors are used to assist the UWB sensors in spatial ranging. The distance measurements from the UWB sensors are refined and corrected using a fitting equation. The localization system includes three anchors at the same height, where UWB sensors on each anchor measure the distances to the tag. The tag’s planar position is estimated through geometric modeling and the centroid method of triangles. Finally, Kalman filtering is applied to obtain the most accurate estimate of the tag’s location, resulting in a smoother and more precise localization trajectory. Compared to previous research, the new hardware support and software architecture optimization have enabled the system to obtain sensor information more quickly and stably.
The dispersed anchor setup allows for spatial localization over a larger area, and the redesigned geometric localization model and Kalman filtering algorithm have further improved localization accuracy.
Relative to similar localization strategies [13,31], which typically exhibit height estimation errors exceeding 0.2 m, our dual-BMP-sensor method significantly improves accuracy in height estimation, thus enhancing the system’s 3D localization precision. This work’s main contributions are as follows:
  • Proposed and validated a method for estimating tag height based on dual BMP sensors, effectively compensating for most indoor barometric pressure deviations, and providing a more accurate height estimation than achievable with a single BMP sensor. For the challenge of indoor height estimation, our accuracy is approximately ±0.05 m, with an RMSE of 0.0282 m.
  • Developed a hardware framework that enhances the system to be more efficient and stable, with a localization output rate reaching 37 Hz, a nine-fold increase compared to earlier designs.
  • Our portable localization system covers a larger measurable range. The proposed geometric localization model and the Kalman filtering technique are empirically validated, showing a 2D localization RMSE of 0.0585 m, and a 3D localization RMSE of 0.0740 m. Compared to indoor localization systems with a similar number of anchors, ours offers an extended measurable range and superior accuracy.

2. Hardware System Design

The hardware system framework is depicted in Figure 1. The target (Tag) contains one UWB sensor and one BMP sensor. The indoor localization system features three anchors (Anchor (A), Anchor (B), Anchor (C)), each equipped with three UWB sensors and one BMP sensor to pinpoint the target’s (Tag) location. The main controller, an ESP32 (manufactured by Espressif Systems, Shanghai, China), is responsible for receiving data from the sensors, calculating the tag’s location, and sending critical data to an external computer for data collection. The distances between the tag and each anchor are estimated by UWB technology. The ranging results are collected by the sub-controller of Anchor (A) and transmitted to the main controller via I2C communication. The BMP sensor on the tag measures the barometric pressure in real time, which is collected and processed by an ESP8266 controller (manufactured by Espressif Systems, China), and then sent to the main controller by Wi-Fi wireless communication. Furthermore, thanks to the dual-core processor of the main controller (ESP32), the Wi-Fi signal-reading process and the location calculation process do not interfere with each other, greatly enhancing the stability of the localization system.
Figure 2 shows the tag’s hardware design. The tag primarily consists of a controller (ESP8266), a BMP sensor, and a UWB sensor, all powered by a shared 5 V battery. The controller (ESP8266) and the BMP sensor are soldered onto a printed circuit board (PCB), with their I2C ports connected by the internal circuit. Finally, these components are assembled in a 3D-printed plastic frame. Additionally, a marker for the motion capture system (MCS) is mounted on top of the frame for experimental validation. The anchors of the localization system are mounted on three tripods, which are powered by either a computer or batteries, as shown in Figure 3. The UWB sensors in each anchor need to be adjusted to the same height using the tripods. The proposed localization system is also easy to set up in new indoor environments due to the portability of the tripods. The specific parameters of the hardware devices in the designed system are detailed in Table 1.

3. Spatial Distance Measurement

3.1. Height Measurement with Dual BMP Sensors

The barometric pressure varies with changes in altitude and can drift over time. BMP sensors estimate height by measuring barometric pressure values. In a previous study [20], we proposed a method for estimating indoor target height using dual BMP sensors. In this new study, we have revisited the observation of barometric pressure variations at fixed indoor locations, and conducted re-validation in the updated localization system.

3.1.1. Observation of Indoor Barometric Pressure Measurements

In an office indoor environment, we collected barometric pressure data using two BMP sensors. The BMP sensor of Anchor (A) remained stationary, and initially, the tag’s BMP sensor was positioned close to Anchor (A) at the same height. A few seconds after initiating the timer, the tag moved to another location and remained stationary, with a horizontal distance of 2 m and a height drop to −0.5 m. This observation lasted for 30 min (i.e., 1800 s). Due to the significant noise in the raw measurement data from the BMP sensor, a weighted average filter was used to minimize the noise. The results of this observation are displayed in Figure 4.
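The paper applies a weighted average filter to the raw BMP readings but does not specify its parameters. A minimal sketch of one plausible form, a sliding window with linearly increasing weights (window size and weighting scheme are illustrative assumptions, not the paper's actual filter), is:

```python
from collections import deque

def make_weighted_filter(window=8):
    """Weighted moving-average filter: newer samples get larger weights.

    The window size and linear weighting are illustrative assumptions;
    the paper does not specify its exact filter parameters.
    """
    buf = deque(maxlen=window)

    def step(raw_pressure):
        buf.append(raw_pressure)
        weights = range(1, len(buf) + 1)   # 1, 2, ..., n (newest sample heaviest)
        total = sum(w * p for w, p in zip(weights, buf))
        return total / sum(weights)

    return step
```

A stream of raw pressures would be smoothed sample by sample, e.g. `filt = make_weighted_filter(); smoothed = [filt(p) for p in readings]`.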
The observation results show a continuous drift in the indoor barometric pressure values over thirty minutes. After filtering, the data noise from the BMP sensors was reduced. However, even if the two sensors are of the same model, the barometric pressure values they measure at the same height location will still differ slightly by $P_{diff}$. To unify this deviation, the final output barometric pressure value of the tag $P_{Tag}(t)$ was adjusted by adding $P_{diff}$ as in the following equations:
$P_{diff} = P_{Anchor}(0) - P_{Tag}(0),$  (1)
$P_{Tag}(t) = P_{Tag}(t) + P_{diff},$  (2)
where $P_{Anchor}(0)$ is the initial barometric pressure value of Anchor (A), $P_{Tag}(0)$ is the initial barometric pressure value of the tag, and $P_{Tag}(t)$ on the right-hand side is the barometric pressure value of the tag measured at the current moment. After the tag’s location was lowered, its barometric pressure value increased. During the observed 30 min, the trends of barometric drift at the two stationary positions were almost identical.
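Equations (1) and (2) above amount to a one-time offset calibration between the two sensors; a minimal sketch (function names are ours):

```python
def pressure_offset(p_anchor0, p_tag0):
    # Equation (1): fixed offset between the two sensors measured at the same height
    return p_anchor0 - p_tag0

def unify_tag_pressure(p_tag_t, p_diff):
    # Equation (2): shift the tag's current reading onto the anchor sensor's scale
    return p_tag_t + p_diff
```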

3.1.2. Relative Height Estimation Based on Dual BMP Sensors

After obtaining the final barometric pressure data from the BMP sensors ($P_{Tag}(t)$ and $P_{Anchor}(t)$), the height values of the tag and the anchor ($H_{Tag}(t)$ and $H_{Anchor}(t)$) can be determined by the barometric height calculation [32] as follows:
$H_{Anchor}(t) = \left[ 1 - \left( \frac{P_{Anchor}(t)}{P_0} \right)^{\frac{1}{5.257}} \right] \cdot \frac{T + 273.15}{0.0065},$  (3)
$H_{Tag}(t) = \left[ 1 - \left( \frac{P_{Tag}(t)}{P_0} \right)^{\frac{1}{5.257}} \right] \cdot \frac{T + 273.15}{0.0065},$  (4)
where the reference barometric pressure $P_0$ is equal to Anchor (A)’s initial value $P_{Anchor}(0)$, and $T$ is the temperature in °C. The height data derived from this are presented in Figure 5, where the estimated height values at the stationary location float with changes in barometric pressure.
After the tag descended to −0.5 m, the height results calculated by Equation (4) were also approximately −0.5 m. This confirms that BMP sensors possess a reasonable level of accuracy for measuring height changes over short periods in indoor environments. However, after thirty minutes of observation, the height drift estimated by a single BMP sensor reached nearly 1.5 m at its maximum. Consequently, the results from a single sensor demonstrate considerable uncertainty after several minutes. Because the two BMP sensors exhibit similar barometric drift trends in the indoor environment, we propose a method using dual BMP sensors to more accurately measure the indoor tag height $H_{dual}(t)$, as illustrated in Equations (5) and (6):
$H_{dual}(t) = H_{Tag}(t) - H_{Anchor}(t)$  (5)
$= \left[ \left( \frac{P_{Anchor}(t)}{P_{Anchor}(0)} \right)^{\frac{1}{5.257}} - \left( \frac{P_{Tag}(t)}{P_{Anchor}(0)} \right)^{\frac{1}{5.257}} \right] \cdot K_{temp},$  (6)
where $P_{Anchor}(0)$ is the initial barometric pressure of Anchor (A) and serves as the reference barometric pressure. In the indoor environment, the temperatures of the tag and the anchor are similar, so the temperature term is uniformly replaced by the constant $K_{temp}$, whose value is 44,330. The relative height calculated in this way is shown in red in Figure 5. After 30 min, the estimated tag height drifted by about 0.2 m, roughly two-fifteenths of the error from a single sensor. This result has practical value for measuring tag height in indoor environments.
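The dual-sensor height of Equation (6) can be sketched directly (pressure units are arbitrary as long as all three arguments share them; the function name is ours):

```python
K_TEMP = 44330.0  # (T + 273.15) / 0.0065 evaluated near 15 degrees C, as in the paper

def dual_bmp_height(p_anchor_t, p_tag_t, p_anchor_0):
    """Relative tag height from Equation (6).

    p_anchor_0 is Anchor (A)'s initial pressure, used as the reference P0.
    Returns a negative height when the tag sits below the anchor plane.
    """
    exp = 1.0 / 5.257
    return ((p_anchor_t / p_anchor_0) ** exp
            - (p_tag_t / p_anchor_0) ** exp) * K_TEMP
```

With a tag roughly 0.5 m below the anchor (about 6 Pa higher pressure at sea level), the function returns approximately −0.5.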
In the designed localization system, the filtered barometric pressure value from the tag is transmitted to the main controller via Wi-Fi. The time spent on Wi-Fi communication can impact the stability of the main process. Thus, this study leverages the dual-core processing capability of the controller (ESP32), dedicating one core specifically to handling Wi-Fi data transmission. Consequently, the data output of the localization system is accelerated and stabilized. Meanwhile, if the localization duration extends, the height estimated by the dual BMP sensors may still experience some drift. Therefore, the designed localization system includes a barometric pressure recalibration function. When the ranging part of the UWB sensor detects that the tag is very close to Anchor (A), the barometric pressure difference $P_{diff}$ is updated as follows:
$P_{diff} = P_{Anchor}(t) - P_{Tag}(t).$  (7)
Meanwhile, the height estimation is further optimized in the designed 3D localization algorithm.

3.2. Distance Measurement Optimization for UWB Sensors

Patch-type UWB sensors, limited by their physical structure, may have ranging affected by orientation [33]. In this study, UWB sensors based on the new generation DW3000 chip (manufactured by Qorvo, Greensboro, NC, USA) are used for ranging. They also feature 2 dBi gain antennas to enhance the UWB sensors’ omnidirectional ranging capabilities. The ranging principle involves recording the time it takes to transmit and receive extremely short pulse signals, which is used to calculate the distance between two devices [22].
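The time-of-flight ranging principle can be illustrated with a simplified single-sided two-way calculation; note that real DW3000 deployments typically use double-sided two-way ranging to cancel clock drift, so this is an illustrative sketch rather than the chip's actual protocol:

```python
C = 299_702_547.0  # approximate speed of light in air, m/s

def tof_distance(t_round, t_reply):
    """Single-sided two-way ranging sketch.

    t_round: initiator's time from poll TX to response RX (seconds)
    t_reply: responder's internal turnaround time (seconds)
    The one-way time of flight is half the round trip minus the reply delay.
    """
    tof = (t_round - t_reply) / 2.0
    return tof * C
```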
Similar to BMP sensors, UWB sensors are subject to measurement noise. Noisy data or sudden erroneous extreme values can compromise the accuracy of the localization system. Therefore, filters are applied to refine the raw data output from the UWB sensors, yielding more stable distance measurements $L_{Filter}$. However, due to the hardware limitations of UWB sensors, the filtered measurements still deviate from the actual distances. For example, when the actual distance between two UWB sensors is 3 m, the filtered output from the sensor devices is 3.18 m. Therefore, more pairs of measured and actual values need to be sampled to calibrate the UWB sensor [34,35].
Considering the range of indoor measurements, we sampled distance values at multiple fixed points from 0.1 m to 10 m. As illustrated in Figure 6, these measurement results were used to fit a calibration equation using Matlab’s Curve Fitting Toolbox, resulting in the following equation:
$L_{UWB}(t) = a \cdot L_{Filter}(t) + b,$  (8)
where the fitting parameter $a$ is 0.9954, parameter $b$ is −16.02, and the confidence bound for the fitting coefficients is 95%. When the filtered sensor measurement $L_{Filter}(t)$ is input in real time, the system outputs the calibrated value $L_{UWB}(t)$. The final distance estimate obtained after calibration is closer to the real distance.
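The linear fit of Equation (8) can be reproduced with an ordinary least-squares line fit. The sample points below are synthetic, placed exactly on the paper's fitted line; the millimeter unit is our assumption (it makes the offset b = −16.02 physically plausible), not something the paper states:

```python
import numpy as np

# Synthetic calibration samples lying on the paper's fitted line
# L_UWB = 0.9954 * L_Filter - 16.02 (values in mm, an assumed unit);
# a real calibration would use measured/ground-truth pairs from 0.1-10 m.
l_filter = np.array([100.0, 1000.0, 3000.0, 5000.0, 10000.0])
l_true = 0.9954 * l_filter - 16.02

# Least-squares fit of Equation (8): recovers a and b
a, b = np.polyfit(l_filter, l_true, deg=1)

def calibrate(measurement):
    """Map a filtered UWB reading to a calibrated distance estimate."""
    return a * measurement + b
```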

4. Indoor 3D Localization Method

For 3D localization of the tag, a minimum of four reference anchors is typically required, with additional anchors used to enhance localization accuracy [36]. In this study, we use three UWB sensors at the same height as the localization anchor points to establish a geometric localization model for tag 3D localization. Furthermore, BMP sensors are utilized to determine the tag’s height relative to a reference plane and to enhance the accuracy of the height estimation. The designed localization procedure is summarized in Algorithm 1.
Algorithm 1 Indoor 3D localization method.
Input: Distance values measured by UWB sensors: $L_{UWB_A}(t)$, $L_{UWB_B}(t)$, $L_{UWB_C}(t)$; barometric pressure values measured by BMP sensors: $P_{Anchor}(t)$, $P_{Tag}(t)$.
Output: The coordinates of the tag’s 3D indoor location: $(Tag_x(t), Tag_y(t), Tag_z(t))$.
1: if $L_{UWB_A}(t) <$ a predefined close distance value for calibration then
2:  $P_{diff} \leftarrow P_{Anchor}(t) - P_{Tag}(t)$
3: end if
4: $P_{Tag}(t) \leftarrow P_{Tag}(t) + P_{diff}$
5: $H_{dual}(t) \leftarrow$ dual BMP sensor height calculation with Equation (6)
6: $P_z(t) \leftarrow H_{dual}(t)$
7: $\bar{L}_{UWB_A}(t), \bar{L}_{UWB_B}(t), \bar{L}_{UWB_C}(t) \leftarrow$ projection of $L_{UWB_A}(t), L_{UWB_B}(t), L_{UWB_C}(t)$ using $H_{dual}(t)$ with Equation (9)
8: $P_x(t), P_y(t) \leftarrow$ Geometric Localization Model($\bar{L}_{UWB_A}(t), \bar{L}_{UWB_B}(t), \bar{L}_{UWB_C}(t)$)
9: $Tag_x(t), Tag_y(t), Tag_z(t) \leftarrow$ Kalman Filter Function($P_x(t), P_y(t), P_z(t)$)
10: return $(Tag_x(t), Tag_y(t), Tag_z(t))$

4.1. Geometric Localization Model

Anchors for localization are distributed in an indoor space. The UWB sensors on the three anchors are adjusted to the same height $H_{Tripod}$ using tripods. In this horizontal plane, Anchor (A) is positioned midway between Anchor (B) and Anchor (C). As depicted in Figure 7, the geometric model is based on these three anchors. The tag location is denoted as point $P$. The locations of the three anchors are denoted as points $A$, $B$, $C$. The point $O$ is the midpoint of line segment $BC$. Therefore, the localization system is defined with the x-axis in the direction of $OA$, the y-axis in the direction of $BC$, and the z-axis facing vertically upwards. The lengths $AP$, $BP$, $CP$ represent the true distances of the tag from each anchor. The projection of the tag point $P$ onto the anchors’ plane is the point $K$, and the coordinates $(P_x, P_y, P_z)$ of point $P$ need to be computed.
The UWB sensors at each anchor measure the distances ($L_{UWB_A}(t)$, $L_{UWB_B}(t)$, $L_{UWB_C}(t)$) from the tag point $P$ in real time. At a given moment, these measurements are denoted as $\widehat{AP}$, $\widehat{BP}$, and $\widehat{CP}$, respectively. To calculate the tag’s coordinates $(P_x(t), P_y(t))$ on the X-axis and Y-axis, it is necessary to project the UWB measurements from the real 3D environment onto the anchors’ plane. Equation (9) employs the Pythagorean theorem to calculate the projected distance values from the UWB measurements, using the tag’s height estimated by the dual BMP sensors as $PK$ in the geometric model. The lengths obtained are $\bar{L}_{UWB_A}(t)$, $\bar{L}_{UWB_B}(t)$, $\bar{L}_{UWB_C}(t)$, corresponding to line segments $\widehat{AK}$, $\widehat{BK}$, and $\widehat{CK}$, respectively:
$\bar{L}_{UWB}(t) = \sqrt{L_{UWB}(t)^2 - H_{dual}(t)^2}.$  (9)
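The projection of Equation (9) is a one-line Pythagorean computation. The clamp to zero when noise makes the height exceed the range is a guard of our own, not specified in the paper:

```python
import math

def project_to_anchor_plane(l_uwb, h_dual):
    """Equation (9): project a calibrated 3D UWB range onto the anchors' plane.

    Returns sqrt(L^2 - H^2). If sensor noise makes |H| exceed L, the tag is
    treated as directly above/below the anchor (projected distance 0) --
    an assumption added here for numerical robustness.
    """
    d2 = l_uwb ** 2 - h_dual ** 2
    return math.sqrt(d2) if d2 > 0.0 else 0.0
```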
Within the plane of the anchors, circles are drawn with each anchor as the center and its measured projected distance as the radius. For instance, a circle is drawn with Anchor C as the center and $\widehat{CK}$ as the radius. Ideally, the three circles should intersect at point $K$. However, since the lengths $\widehat{AP}$, $\widehat{BP}$, and $\widehat{CP}$ estimated by the UWB sensors are approximations, the circles might intersect at multiple points, as depicted in Figure 8. The three intersections $\bar{P}_{AB}$, $\bar{P}_{AC}$, $\bar{P}_{BC}$ that are closest to each other are selected to form a triangle, and its centroid is used as the estimate of point $K$. To determine the centroid’s position, each pair of circles needs to provide one vertex for the final triangle.
The geometric calculation starts by determining whether these circles intersect, based on the locations of points $A$, $B$, and $C$, and the lengths $\widehat{AP}$, $\widehat{BP}$, and $\widehat{CP}$. When each pair of circles has two intersection points, a total of six intersection points is generated. Additionally, due to UWB data noise, the measured values of $\widehat{AP}$, $\widehat{BP}$, and $\widehat{CP}$ may result in circles that do not intersect. In this case, the vertex between the two circles needs to be redefined. Therefore, this study addresses the possible geometric scenarios separately. As an example, the intersection of circles A and C is divided into two cases below. The results from the other two pairs of circles are then combined to find the triangle’s centroid.
Case 1: Two circles intersect at two points.
There are two intersection points when two circles intersect. As shown in Figure 9, the intersection points $P_1$ and $P_2$ to be found are symmetrical about the line segment $AC$. The area of $\triangle ACP_1$ is calculated using Heron’s formula, and the lengths of $P_1G$ and $CG$ are determined as shown in Equations (10)–(12):
$S_{\triangle ACP_1} = \sqrt{f \cdot (f - AC) \cdot (f - AP_1) \cdot (f - CP_1)}, \quad \text{where } f = 0.5 \cdot (AC + AP_1 + CP_1),$  (10)
$P_1G = 2 \cdot S_{\triangle ACP_1} / AC,$  (11)
$CG = \sqrt{CP_1^2 - P_1G^2},$  (12)
Then, the coordinates of point $G$ can be obtained by the principle of similar triangles, i.e., from the ratio of the lengths $CG$ and $AC$. Based on point $G$, the coordinates of the points $P_1$ and $P_2$ in this plane are solved from the perpendicular relation and the length of $P_1G$. Then, the point $P_2$ inside circle B is chosen as the coordinates of $\bar{P}_{AC}$. Using the same method, $\bar{P}_{AB}$ and $\bar{P}_{BC}$ are determined through geometric calculation.
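Case 1 above can be sketched as follows, mirroring the Heron's-formula construction of Equations (10)–(12). The sketch assumes the foot of the perpendicular $G$ lies between $A$ and $C$, the configuration shown in Figure 9; function and variable names are ours:

```python
import math

def circle_intersections(ax, ay, cx, cy, r_a, r_c):
    """Case 1: the two intersection points of circles (A, r_a) and (C, r_c).

    Follows the paper's Heron's-formula construction (Equations (10)-(12));
    assumes the circles genuinely intersect at two points and that the
    perpendicular foot G falls between the two centers.
    """
    d = math.hypot(cx - ax, cy - ay)                    # |AC|
    f = 0.5 * (d + r_a + r_c)                           # semi-perimeter
    s = math.sqrt(f * (f - d) * (f - r_a) * (f - r_c))  # triangle area, Eq. (10)
    h = 2.0 * s / d                                     # P1G, Eq. (11)
    cg = math.sqrt(max(r_c ** 2 - h ** 2, 0.0))         # CG, Eq. (12)
    # Foot of the perpendicular G on line AC, measured from C toward A
    gx = cx + (ax - cx) * cg / d
    gy = cy + (ay - cy) * cg / d
    # Unit vector perpendicular to AC; P1 and P2 are symmetric about AC
    ux, uy = -(cy - ay) / d, (cx - ax) / d
    return (gx + ux * h, gy + uy * h), (gx - ux * h, gy - uy * h)
```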
Case 2: Two circles are tangent or non-intersecting.
As shown in Figure 10, the inclusion relationship between circle A and circle C is determined by the relationship among the lengths $\widehat{AP}$, $\widehat{CP}$, and $AC$. When the two circles are tangent, the shared tangent point can be used as a vertex ($\bar{P}_{AC}$) of the required triangle. When the two circles have no intersection point, either the two circles are disjoint, or one circle contains the other. In these unsolved cases, the point $\bar{P}_{AC}$ is determined by the proportions of $\widehat{AP}$ and $\widehat{CP}$.
The cases where the two circles are tangent or non-intersecting are calculated as in Equations (13) and (14):
$\bar{P}_{ACx} = \begin{cases} A_x + (C_x - A_x) \cdot \widehat{AP}/(\widehat{AP} + \widehat{CP}) & \text{if } AC \geq (\widehat{AP} + \widehat{CP}) \\ A_x - (OA/AC) \cdot (\widehat{AP} + AC + \widehat{CP})/2 & \text{if } \widehat{AP} \geq (AC + \widehat{CP}) \\ C_x + (OA/AC) \cdot (\widehat{AP} + AC + \widehat{CP})/2 & \text{if } \widehat{CP} \geq (AC + \widehat{AP}) \end{cases},$  (13)
$\bar{P}_{ACy} = \begin{cases} A_y + (C_y - A_y) \cdot \widehat{AP}/(\widehat{AP} + \widehat{CP}) & \text{if } AC \geq (\widehat{AP} + \widehat{CP}) \\ A_y + (OC/AC) \cdot (\widehat{AP} + AC + \widehat{CP})/2 & \text{if } \widehat{AP} \geq (AC + \widehat{CP}) \\ C_y - (OC/AC) \cdot (\widehat{AP} + AC + \widehat{CP})/2 & \text{if } \widehat{CP} \geq (AC + \widehat{AP}) \end{cases},$  (14)
where $(A_x, A_y)$ represents the coordinates of point $A$, $(C_x, C_y)$ represents the coordinates of point $C$, and $(\bar{P}_{ACx}, \bar{P}_{ACy})$ are the coordinates of point $\bar{P}_{AC}$. The three scenarios in the piecewise function correspond to the following: the two circles do not encompass each other; circle A contains circle C; and circle C contains circle A. When the condition in the equation holds with equality, the two circles are tangent.
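Case 2 can be sketched with a simplified reading of Equations (13) and (14): the vertex is placed on the line of centers, split in proportion to the two radii, and the containment cases take the midpoint of the gap between the two boundaries. The paper's $OA/AC$ and $OC/AC$ direction factors are replaced here by the unit vector along $AC$, which is an assumption of this sketch:

```python
import math

def fallback_vertex(ax, ay, cx, cy, r_a, r_c):
    """Case 2: shared vertex when the two circles are tangent or non-intersecting.

    Simplified interpretation of Equations (13)-(14): the vertex lies on the
    line of centers. Disjoint circles split AC in proportion to the radii;
    containment cases take the midpoint of the radial gap.
    """
    d = math.hypot(cx - ax, cy - ay)
    ex, ey = (cx - ax) / d, (cy - ay) / d          # unit vector A -> C
    if d >= r_a + r_c:                             # disjoint or externally tangent
        t = r_a / (r_a + r_c)
        return ax + (cx - ax) * t, ay + (cy - ay) * t
    m = (r_a + d + r_c) / 2.0                      # half of AP + AC + CP
    if r_a >= d + r_c:                             # circle A contains circle C
        return ax + ex * m, ay + ey * m
    return cx - ex * m, cy - ey * m                # circle C contains circle A
```

For externally tangent circles the proportional split lands exactly on the tangent point, matching the equality case of Equation (13).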
Similarly, geometrically resolving circles A and B and circles B and C yields a total of three vertices: $\bar{P}_{AB}$, $\bar{P}_{AC}$, and $\bar{P}_{BC}$. Figure 11 shows a possible scenario where circles A and C do not intersect but circle B intersects circles A and C. For any other possible scenario, the three pairs of circles can likewise be solved based on Case 1 and Case 2, forming a triangle from which the centroid $K$ is computed.
Finally, by the centroid method, the location $(P_x, P_y)$ of the point $P$ projected onto the anchors’ plane is estimated. Then, the tag’s height $H_{dual}(t)$ estimated by the dual BMP sensors is used again. The tag’s 3D coordinates are denoted as:
$P_x = \frac{1}{3}(\bar{P}_{ABx} + \bar{P}_{ACx} + \bar{P}_{BCx}), \quad P_y = \frac{1}{3}(\bar{P}_{ABy} + \bar{P}_{ACy} + \bar{P}_{BCy}), \quad P_z = H_{dual}.$  (15)
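Equation (15) combines the three vertices and the BMP height in one step; a minimal sketch (the function name is ours):

```python
def centroid_3d(p_ab, p_ac, p_bc, h_dual):
    """Equation (15): tag coordinates from the triangle centroid plus BMP height.

    p_ab, p_ac, p_bc are (x, y) vertices from the three circle pairs;
    h_dual is the dual-BMP height used directly as the z coordinate.
    """
    px = (p_ab[0] + p_ac[0] + p_bc[0]) / 3.0
    py = (p_ab[1] + p_ac[1] + p_bc[1]) / 3.0
    return px, py, h_dual
```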

4.2. Optimization of Location Estimation Based on Kalman Filtering

Due to the fluctuations in the data measured by the sensors, the tag’s location calculated using the geometric model may deviate from the actual value. The Kalman filtering algorithm is an optimization estimation algorithm that is very effective in dealing with linear dynamic systems containing Gaussian noise. To optimize the location estimation, this study employs the Kalman filtering method to predict and update the optimal coordinates of the tag.
First, the timing information of the localization system’s running loop can be used to estimate the tag’s velocity $v(t-1)$ at the previous moment $(t-1)$:
$v(t-1) = [v_x(t-1), v_y(t-1), v_z(t-1)]' = [\Delta P_x(t-1), \Delta P_y(t-1), \Delta P_z(t-1)]' \cdot \frac{1}{\Delta t(t-1)},$  (16)
where the tag’s velocity $v(t-1)$ in the three dimensions consists of the components $[v_x(t-1), v_y(t-1), v_z(t-1)]'$. The tag’s height $P_z(t-1)$ is equal to the height estimated by the dual BMP sensors, $H_{dual}(t-1)$. $\Delta t(t-1)$ denotes the elapsed time of the previous loop, and the change in locations is represented by $[\Delta P_x(t-1), \Delta P_y(t-1), \Delta P_z(t-1)]'$.
Assuming that the tag’s velocity is nearly constant over a very short period of time, the predicted location $X(t|t-1)$ at the current moment is as follows:
$U(t) = v(t-1) \cdot \Delta t(t),$  (17)
$X(t|t-1) = X(t-1|t-1) + U(t),$  (18)
where the subscript $(t|t-1)$ denotes the estimate at moment $(t)$ based on moment $(t-1)$. $X(t-1|t-1)$ is the optimal location estimate at the previous moment, $[P_x(t-1), P_y(t-1), P_z(t-1)]'$, and the control input $U(t)$ is the predicted location increment.
Then, the update part of the Kalman filter for the localization data is as follows:
$P(t|t-1) = A \cdot P(t-1|t-1) \cdot A^T + Q,$  (19)
$K(t) = P(t|t-1) H^T (H P(t|t-1) H^T + R)^{-1},$  (20)
$Z(t) = [P_x(t), P_y(t), P_z(t)]',$  (21)
$\hat{X}(t|t) = \hat{X}(t|t-1) + K(t) \cdot (Z(t) - X(t|t-1)),$  (22)
$P(t|t) = (I - K(t)) \cdot P(t|t-1).$  (23)
Herein, $P$ represents the prediction error covariance, $K(t)$ is the Kalman gain, and $\hat{X}(t|t)$ is the state estimate. $A$ is the state transition matrix, $H$ is the observation matrix, and $I$ is a 3 × 3 identity matrix. $Z(t)$ is the actual measurement at the current moment, namely the tag’s coordinates calculated by the geometric localization model in Equation (15). Referring to the tests of the UWB and BMP sensors in Section 3, the process noise covariance matrix $Q$ in the Kalman filter is set to $\mathrm{diag}([1 \times 10^{-4}, 1 \times 10^{-4}, 1 \times 10^{-3}])$, and the measurement noise covariance matrix $R$ is set to $\mathrm{diag}([4 \times 10^{-4}, 4 \times 10^{-4}, 5 \times 10^{-3}])$. Finally, the positioning system’s output at the current moment is $\hat{X}(t|t)$, i.e., the estimated location of the tag, $[Tag_x(t), Tag_y(t), Tag_z(t)]'$.
In addition, at the start of the localization system’s operation, the Kalman filter cannot function accurately immediately due to the lack of initial measurement data for prediction. Therefore, at the first moment, the filter performs no computation, and its output is directly set to the input positioning data. The initial error covariance matrix is set to the diagonal matrix $\mathrm{diag}([1, 1, 1])$, and from the second moment onwards, the filter operates normally.
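The filter of Section 4.2 can be sketched compactly. Consistent with the scalar form of Equations (19)–(23), $A$ and $H$ are taken here as 3 × 3 identity matrices; the velocity/timing bookkeeping is simplified (one shared $\Delta t$ per step), and the class name is ours:

```python
import numpy as np

class TagKalmanFilter:
    """Constant-velocity Kalman filter over the tag's (x, y, z) position.

    Mirrors Section 4.2: A = H = I, the velocity estimated from the two
    previous fixes enters as a control input (Eqs. (16)-(18)), and the
    first measurement is passed through unfiltered.
    """
    def __init__(self):
        self.Q = np.diag([1e-4, 1e-4, 1e-3])   # process noise covariance
        self.R = np.diag([4e-4, 4e-4, 5e-3])   # measurement noise covariance
        self.P = np.eye(3)                     # initial error covariance diag([1,1,1])
        self.x = None                          # optimal position estimate
        self.prev = None                       # previous estimate, for velocity
        self.prev_t = None

    def step(self, z, t):
        z = np.asarray(z, dtype=float)
        if self.x is None:                     # first moment: passthrough
            self.x, self.prev, self.prev_t = z.copy(), z.copy(), t
            return tuple(self.x)
        dt = t - self.prev_t
        v = (self.x - self.prev) / max(dt, 1e-6)     # Eq. (16), rough velocity
        x_pred = self.x + v * dt                     # Eqs. (17)-(18), A = I
        P_pred = self.P + self.Q                     # Eq. (19)
        K = P_pred @ np.linalg.inv(P_pred + self.R)  # Eq. (20), H = I
        self.prev, self.prev_t = self.x.copy(), t
        self.x = x_pred + K @ (z - x_pred)           # Eq. (22)
        self.P = (np.eye(3) - K) @ P_pred            # Eq. (23)
        return tuple(self.x)
```

Fed a stream of geometric-model fixes with loop timestamps, the filter returns the smoothed tag trajectory; with the small $Q$ and $R$ above, the gain stays close to 1, so the output tracks the measurements while damping jitter.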
This section introduced the proposed indoor 3D localization method utilizing sensor fusion. UWB sensors at three anchors calculated distances to the tag, while dual BMP sensors provided height estimates, enabling the mapping of UWB sensors’ range measurements onto a 2D plane. The geometric localization model, based on these measurements, integrated various relationships and compensated for errors. The tag’s location on the anchor’s plane was determined using the centroid method, and Kalman filtering was applied to enhance the accuracy of the 3D location estimates. The scheme will be validated experimentally in the following section.
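Two of the steps summarized above, projecting each UWB range onto the anchors’ plane using the BMP-estimated height and taking the centroid of the candidate intersection points, can be sketched as follows (function names are ours; the full geometric model in Section 4 additionally handles tangent and non-intersecting circle cases with error compensation):

```python
import math

def project_range(r_uwb, dz):
    """Project a 3D UWB range onto the anchors' plane, given the BMP-estimated
    height difference dz between the tag and the anchor plane (Pythagoras)."""
    return math.sqrt(max(r_uwb**2 - dz**2, 0.0))

def circle_intersections(c1, r1, c2, r2):
    """Intersection points of two range circles in the plane (0, 1, or 2 points)."""
    (x1, y1), (x2, y2) = c1, c2
    d = math.hypot(x2 - x1, y2 - y1)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return []  # no intersection; the paper's model compensates these cases
    a = (r1**2 - r2**2 + d**2) / (2 * d)
    h = math.sqrt(max(r1**2 - a**2, 0.0))
    xm, ym = x1 + a * (x2 - x1) / d, y1 + a * (y2 - y1) / d
    dx, dy = -(y2 - y1) / d, (x2 - x1) / d
    return [(xm + h * dx, ym + h * dy), (xm - h * dx, ym - h * dy)]

def centroid(points):
    """Centroid of the candidate intersection points: the tag's planar estimate."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)
```

This is illustrative only; selecting which intersection points enter the centroid follows the geometric relations established in Section 4.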

5. Experiments and Evaluations

5.1. Experimental Setup

The experimental validation was conducted in two indoor environments: a laboratory equipped with a motion capture system, and a hall with more space. The experimental setup for the two scenarios is shown in Figure 12. The UWB sensors at the individual anchors were adjusted to the same height H_Tripod by means of tripods, and the parameters of the anchors’ positions are listed in Table 2.
In the laboratory environment, a motion capture system (produced by OptiTrack) was used to capture the tag’s reference positions, with the marker fixed on the tag. This system covers a measurable area of 3 m (length) × 2.5 m (width) × 2 m (height) with millimeter accuracy. Results from the motion capture system serve as reference locations for the tag, used to evaluate the localization system’s accuracy. Additionally, to verify the maximum locatable range of the proposed system, a static tag localization experiment was carried out in a spacious hall setting.
Moreover, Figure 12 shows the base coordinate systems of both the proposed localization system and the motion capture system. The base coordinate system of the localization system {O} is as described in the geometric localization model of Section 4.1. The base coordinate system of the motion capture system {MCS} is established through a device containing three MCS markers, positioned directly below {O} and parallel to the ground. Since the two systems report positions in different frames, the frames must be registered before their data can be compared. Each frame of the motion capture system is transformed into the frame of the localization system as follows:
$$P'_{M,i} = {}^{O}R_{MCS} \, P_{M,i} + {}^{O}d_{MCS} = \begin{bmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} x_{P_{M,i}} \\ y_{P_{M,i}} \\ z_{P_{M,i}} \end{bmatrix} + \begin{bmatrix} 0 \\ 0 \\ -H_{Tripod} \end{bmatrix} = \begin{bmatrix} z_{P_{M,i}} \\ x_{P_{M,i}} \\ y_{P_{M,i}} - H_{Tripod} \end{bmatrix}.$$
where $P_{M,i}$ is the $i$-th reference location in the {MCS} coordinate system. It is transformed to the location $P'_{M,i}$ in the {O} coordinate system by the rotation matrix ${}^{O}R_{MCS}$ and the translation vector ${}^{O}d_{MCS}$, whose specific values are obtained from the location information in the schematic. The result of the transformation is $(z_{P_{M,i}}, x_{P_{M,i}}, y_{P_{M,i}} - H_{Tripod})$. It should be noted that the two-frame registration process introduces a registration error: the measurement point $P_O$ of the localization system is the tag’s UWB antenna, while the measurement point $P_M$ of the motion capture system is the MCS marker. Because these two points deviate by about 2 cm, attitude changes of the moving tag may slightly affect the registration accuracy.
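The frame registration above can be transcribed almost directly (a small Python sketch with our own names; H_TRIPOD is the 1.5 m anchor height from Table 2, and the rotation and translation values follow the transformation given in the text):

```python
import numpy as np

H_TRIPOD = 1.5  # anchor height [m], from Table 2

# Rotation from {MCS} to {O} and the accompanying translation
R_mcs_to_o = np.array([[0.0, 0.0, 1.0],
                       [1.0, 0.0, 0.0],
                       [0.0, 1.0, 0.0]])
d_mcs_to_o = np.array([0.0, 0.0, -H_TRIPOD])

def register(p_mcs):
    """Transform a reference point from the {MCS} frame into the {O} frame,
    yielding (z, x, y - H_Tripod)."""
    return R_mcs_to_o @ np.asarray(p_mcs, dtype=float) + d_mcs_to_o
```

For example, an MCS point (0.2, 1.7, 0.8) maps to (0.8, 0.2, 0.2) in {O}.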
In addition, the Root Mean Square Error (RMSE) is a common measure of the difference between the estimated and reference values as defined in Equation (25). Here, E r r i denotes the Euclidean distance between the i-th estimated location and its corresponding reference location. A smaller RMSE value indicates a lesser difference between the estimated and the reference values, signifying higher localization accuracy:
$$\mathrm{RMSE} = \sqrt{ \frac{1}{n} \sum_{i=1}^{n} Err_i^{2} }.$$
After registering the frames of the motion capture system to the frame of the localization system, the specific definition of $Err_i$ for the different error evaluation objects is as follows:
  • Errors on the different coordinate axes ($Err_{x,i}$, $Err_{y,i}$, $Err_{z,i}$):
    $$Err_{x,i} = x_{ref,i} - x_{est,i}, \qquad Err_{y,i} = y_{ref,i} - y_{est,i}, \qquad Err_{z,i} = z_{ref,i} - z_{est,i}.$$
  • Two-dimensional localization error on the X-Y plane ($Err_{2D,i}$):
    $$Err_{2D,i} = \sqrt{Err_{x,i}^{2} + Err_{y,i}^{2}}.$$
  • Three-dimensional localization error across the X-Y-Z axes ($Err_{3D,i}$):
    $$Err_{3D,i} = \sqrt{Err_{x,i}^{2} + Err_{y,i}^{2} + Err_{z,i}^{2}},$$
    where $x_{ref,i}$, $y_{ref,i}$, and $z_{ref,i}$ are the tag’s reference location data collected by the motion capture system, and $x_{est,i}$, $y_{est,i}$, and $z_{est,i}$ are the location data estimated by the proposed localization system.
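The error definitions and the RMSE of Equation (25) can be written compactly (a short Python sketch; function names are ours):

```python
import math

def errors(ref, est):
    """Per-axis, 2D, and 3D errors between a reference and an estimated point."""
    ex = ref[0] - est[0]
    ey = ref[1] - est[1]
    ez = ref[2] - est[2]
    e2d = math.hypot(ex, ey)                    # Err_2D
    e3d = math.sqrt(ex**2 + ey**2 + ez**2)      # Err_3D
    return ex, ey, ez, e2d, e3d

def rmse(errs):
    """Root Mean Square Error over a sequence of scalar errors, Equation (25)."""
    return math.sqrt(sum(e * e for e in errs) / len(errs))
```

Feeding the per-sample Err_3D values into `rmse` yields the 3D RMSE figures reported in Tables 3 and 4.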

5.2. Indoor 3D Localization Experiment

The experiment aims to verify the accuracy of the indoor 3D localization system, particularly the accuracy of the height estimation. For this purpose, the tag’s moving reference trajectory was set as two rectangles (1.4 m × 1.6 m) with different heights. The reference locations of the tag were recorded by the high-precision motion capture system.
The tag’s trajectory, as shown in Figure 13, began at the point labeled ’Start’, moved around a horizontal rectangle, then vertically up about 0.5 m. After completing another horizontal loop around the rectangle, it concluded at the point labeled ’End’. The tag’s reference trajectory is depicted as the ‘Reference’ in the figure, with the geometric localization results shown in blue circles. The final Kalman filtering results produced by the localization system are indicated by the red line. The system’s output rate was 37 Hz, processing nearly 2600 data sets in 70.5 s.
To better differentiate the accuracy of localization in each dimension, Figure 14 displays the comparison of the tag’s positions on the X-axis, Y-axis, and Z-axis. The gray background in the figure indicates the phase where the tag was ascending, flanked by rectangular path sections. It is evident that the system’s localization results closely align with the reference trajectory. On the X and Y axes, there are relatively larger errors at the corners of the rectangular path. The optimization by the Kalman filter significantly enhances the geometric localization results.
The analysis of the results from the indoor 3D localization experiment is presented in Table 3. For the final output of the localization system, the RMSE was about 0.041 m for the X- and Y-axes, and 0.028 m for the Z-axis. When the per-axis errors were combined, the 2D and 3D localization errors were correspondingly larger. Additionally, the cumulative distribution function (CDF) shown in Figure 15 demonstrates the error distribution of the 3D localization results.
The accuracy of the tag’s height estimation is crucial for 3D localization. In the Z-axis part, the results of the height error estimated by the dual barometric pressure sensors are as shown in Figure 16. Before filtering, the height error of the geometric localization results was about ±0.1 m. After filtering, the error was reduced to approximately ±0.05 m.
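The dual-BMP height estimation itself is detailed in Section 3. Purely to illustrate the underlying principle, and not the paper’s exact formulation, the relative height between the tag and the anchor can be recovered from their pressure pair via the hypsometric relation:

```python
import math

def relative_height(p_tag, p_anchor, temp_c=20.0):
    """Approximate relative height [m] from a tag/anchor pressure pair [hPa]
    via the hypsometric equation. Illustrative only: the paper's dual-BMP
    scheme additionally filters the raw pressure readings before comparison."""
    T = temp_c + 273.15   # ambient temperature [K]
    R_d = 287.05          # specific gas constant of dry air [J/(kg K)]
    g = 9.80665           # gravitational acceleration [m/s^2]
    return (R_d * T / g) * math.log(p_anchor / p_tag)
```

A pressure at the tag only 0.25 hPa below the anchor’s already corresponds to roughly 2 m of height difference, which is why the differential two-sensor scheme, rather than a single absolute barometer, is needed for the ±0.05 m accuracy reported here.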

5.3. Locatable Range Verification Experiment

To verify the maximum locatable range of the proposed system, we conducted experiments in a more spacious hall. The three anchors were placed at the same planar positions as in the previous experimental setup, and the anchor height H_Tripod was set to 1.3 m. As shown in Figure 17, the stationary reference points for the test were set at four vertices of a 6 m (length) × 6 m (width) × 2 m (height) cuboid. In the coordinate system of the localization system, the positions of these four reference points were Point ① (3, 3, 1.1), Point ② (−3, 3, 1.1), Point ③ (−3, −3, −0.9), and Point ④ (3, −3, −0.9). For each reference point, the localization system separately sampled 380 localization data points (about 10 s). In Figure 17, the geometric localization results are shown as blue points, and the Kalman filtering results as red points.
Table 4 presents the RMSE comparisons of the 2D and 3D errors for the four points, calculated against their respective reference points. Figure 18 shows the detailed distribution of the 3D localization error results, where the blue boxes are the geometric localization results and the purple boxes are the Kalman filtering results.
The analysis and boxplots indicate that Kalman filtering improved the geometric localization results at all four points. Notably, the error variability narrowed, the median error decreased, and the average 3D localization RMSE improved from 0.0860 m to 0.0740 m. These results confirm that the system can localize the tag within a 6 m (length) × 6 m (width) × 2 m (height) range.

6. Discussion

In this section, we compare the proposed system with our previous study and with other related studies.

6.1. Comparison with Our Previous Study

This study successfully implemented indoor 3D localization, with the proposed system being evaluated through two experiments. The first experiment displayed the localization trajectories of dynamic tags within a limited range, while the second demonstrated localization estimates of static tags over extended distances.
The 2D RMSE results for both experiments were nearly identical, approximately 0.058 m. For the 3D results, the RMSE at longer distances was 0.074 m, an increase of 0.01 m compared with the results obtained near the anchors.
Compared to our previous study [20], this research developed an enhanced indoor 3D localization system, with significant upgrades and updates to both hardware and software:
  • In terms of hardware, the UWB sensor was upgraded from a patch type to an antenna type. To account for the propagation direction and range of the antenna signal, a short antenna with 2 dBi gain was selected to enhance the indoor omnidirectional ranging capability for the UWB sensors.
  • Additionally, the main controller was upgraded from ESP8266 to the dual-core ESP32 chip. By rationally allocating tasks through the software, one core was specifically used for Wi-Fi data reading, thus making the data output of the localization system faster and more stable. The localization output rate reached 37 Hz, which was a nine-fold increase.
  • The dispersed arrangement of the anchors allowed the locatable area of the new localization framework to expand. The verified locatable range of the system was 6 m (length) × 6 m (width) × 2 m (height), which was approximately three times larger compared to the previous localization device.
  • In particular, the RMSE of the 3D localization system reached 0.074 m, which improved the localization accuracy by 40.7% compared to our previous study.

6.2. Comparison with Other Related Studies

Although a high-precision motion capture system was employed for experimental validation, such systems rely on fixed motion capture cameras that are expensive and not easily relocated. Our localization system features quick anchor setup, and the estimated tag trajectories closely match the reference values, offering better cost-effectiveness in some application scenarios.
In height estimation, localization using three UWB sensors makes it difficult to distinguish between positive and negative tag heights. A study on indoor 3D drone localization increased the accuracy of height estimation by adding a UWB sensor at a different height [13]; however, that study reports that an error of approximately 0.2 m still remains in the height estimate. Moreover, as bandwidth utilization increases, its low sensor data output rate (5 Hz) significantly impacts localization accuracy. Ma et al. designed a 3D localization system for indoor mobile robots using four UWB anchors at different heights [31], compensating for the signal interference of patch-type UWB sensors, with decimeter-level accuracy meeting the application requirements. In areas with small height differences, a single BMP sensor shows insignificant pressure variations [6], and its accuracy is further degraded by barometric drift. In contrast, our proposed method using dual BMP sensors excels in height estimation, achieving an accuracy of ±0.05 m.
Table 5 and Table 6 compare the localization performance of more different studies by RMSE results. In 2D planar localization, our approach achieves over twice the accuracy compared to methods that utilize additional UWB sensors [13,37,38]. Zigbee-based localization methods are slightly less precise [39], and LiDAR faces challenges with repetitive localization accuracy [40].
In indoor 3D localization, a substantial number of sensors are required to support localization in larger buildings, where significant barometric changes can be used to estimate floor levels [6]. For room-sized environments, an indoor localization method that integrates SLAM and UWB technologies demonstrates notable accuracy [41]. Yoon et al. developed a system that combines IMU and UWB, achieving high-accuracy localization in smaller areas for entertainment scenarios [42]. This study achieved a 60% performance improvement over similar research with a comparable measurement scope [13]. While the localization accuracy provided by four anchors at varying heights meets the needs of the intended scenarios [31], inaccuracies in any single UWB range measurement can significantly increase the error in height estimation. To tackle this issue, our system incorporates dual BMP sensors to improve the precision of the height estimation. By utilizing the geometric localization model and Kalman filtering, the system’s indoor 3D localization RMSE is optimized to 0.074 m.
However, the proposed system is not without flaws. Continuous thick obstructions between UWB sensors can impact the accuracy of some distance measurements, reducing overall localization accuracy. The height estimation of the indoor dual BMP sensors has been validated, but the relative barometric pressure measurements can become unstable near air-conditioning vents, where local airflow is independent of the overall indoor barometric pressure. Meanwhile, employing additional reference anchors would furnish the localization system with more measurement data, but more non-linear factors must then be considered in real-world localization applications. We plan to further optimize and explore these aspects in subsequent studies. Furthermore, limited by the size of the battery, future improvements could involve using smaller lithium batteries and integrating all components on a single PCB to further reduce the size of the tag. Alternatively, a smartwatch integrating UWB sensors, BMP sensors, and a Wi-Fi transceiver could serve as alternative tag hardware.

7. Conclusions

This study developed an enhanced indoor 3D localization system utilizing UWB and BMP sensors. The system features dispersed anchors as reference points within an indoor environment. The anchors were set on tripods and could be easily arranged in new environments. Filters reduced measurement noise for both types of sensors. BMP sensors measured barometric pressure at various indoor heights. The barometric pressure value at the tag was sent to the main controller through a Wi-Fi enabled microcontroller. The tag’s relative height was estimated by comparing it with the barometric pressure value at the anchor point, and the error of the estimated height result was about ±5 cm. This height value is also used to project the UWB sensor’s measurements onto the anchor’s plane, which helps reduce errors in 2D localization. UWB sensors at the three anchors calculated distances to the tag. The established geometric localization model considered various geometric relations and compensated for potential errors. The tag’s projection location on the anchor’s plane was determined using the centroid method. Finally, Kalman filtering optimized the location estimation.
We validated the localization performance and locatable range of the proposed system through indoor 3D localization experiments. The system has fast and stable output. In particular, the height estimation scheme with dual barometers estimated the height results with an accuracy of about ±0.05 m. The RMSE of the 2D localization reached 0.0585 m, and the RMSE of the 3D localization reached 0.0740 m. Compared to indoor localization systems in similar environments, our system has a larger measurable range and higher localization accuracy.
In future research, we plan to use more anchors to provide measurement data for the indoor 3D localization system and to consider more non-linear factors to optimize its localization performance, targeting specific application scenarios such as indoor drone localization.

Author Contributions

Conceptualization, L.B.; methodology, L.B. and K.L.; software, L.B. and W.D.; validation, L.B., J.L. and K.L.; formal analysis, J.L., W.D. and W.L.; investigation, L.B., K.L. and W.L.; resources, K.S. and W.K.; data curation, L.B. and W.D.; writing—original draft preparation, L.B.; writing—review and editing, K.S. and W.K.; visualization, L.B. and J.L.; supervision, K.S.; project administration, W.K.; funding acquisition, W.K. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2022R1C1C1008306), and in part by the China Scholarship Council (CSC) (No. 202108260014).

Institutional Review Board Statement

The study did not require ethical approval.

Informed Consent Statement

The study did not involve humans.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
2D    Two-Dimensional
3D    Three-Dimensional
BMP   Barometric Pressure
UWB   Ultra-Wideband
MCS   Motion Capture System
CDF   Cumulative Distribution Function
RMSE  Root Mean Square Error

References

  1. Zafari, F.; Gkelias, A.; Leung, K.K. A Survey of Indoor Localization Systems and Technologies. IEEE Commun. Surv. Tutor. 2019, 21, 2568–2599.
  2. Ledergerber, A.; Hamer, M.; D’Andrea, R. A robot self-localization system using one-way ultra-wideband communication. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 3131–3137.
  3. Chen, Y.; Zhao, Y.; Lyu, J.; Dai, X.; Huang, R.; Zhang, Q.; Jing, H. Nuclear Power Plant Indoor Personnel Positioning Scheme and Test. In Proceedings of the 2023 5th International Conference on Electronics and Communication, Network and Computer Technology (ECNCT), Guangzhou, China, 18–20 August 2023; pp. 289–292.
  4. Laoudias, C.; Moreira, A.; Kim, S.; Lee, S.; Wirola, L.; Fischione, C. A Survey of Enabling Technologies for Network Localization, Tracking, and Navigation. IEEE Commun. Surv. Tutor. 2018, 20, 3607–3644.
  5. Li, J.; Bi, Y.; Li, K.; Wang, K.; Lin, F.; Chen, B.M. Accurate 3D Localization for MAV Swarms by UWB and IMU Fusion. In Proceedings of the 2018 IEEE 14th International Conference on Control and Automation (ICCA), Anchorage, AK, USA, 12–15 June 2018; pp. 100–105.
  6. Si, M.; Wang, Y.; Zhou, N.; Seow, C.; Siljak, H. A Hybrid Indoor Altimetry Based on Barometer and UWB. Sensors 2023, 23, 4180.
  7. Li, Z.; Dehaene, W.; Gielen, G. A 3-tier UWB-based indoor localization system for ultra-low-power sensor networks. IEEE Trans. Wirel. Commun. 2009, 8, 2813–2818.
  8. Xu, S.; Chen, R.; Guo, G.; Li, Z.; Qian, L.; Ye, F.; Liu, Z.; Huang, L. Bluetooth, Floor-Plan, and Microelectromechanical Systems-Assisted Wide-Area Audio Indoor Localization System: Apply to Smartphones. IEEE Trans. Ind. Electron. 2022, 69, 11744–11754.
  9. Piciarelli, C. Visual Indoor Localization in Known Environments. IEEE Signal Process. Lett. 2016, 23, 1330–1334.
  10. Wu, Y.; Zhao, C.; Lyu, Y. DMLL: Differential-Map-Aided LiDAR-Based Localization. IEEE Trans. Instrum. Meas. 2023, 72, 1–14.
  11. Zeng, Q.; Tao, X.; Yu, H.; Ji, X.; Chang, T.; Hu, Y. An Indoor 2-D LiDAR SLAM and Localization Method Based on Artificial Landmark Assistance. IEEE Sens. J. 2024, 24, 3681–3692.
  12. Bochem, A.; Zhang, H. Robustness Enhanced Sensor Assisted Monte Carlo Localization for Wireless Sensor Networks and the Internet of Things. IEEE Access 2022, 10, 33408–33420.
  13. Strohmeier, M.; Walter, T.; Rothe, J.; Montenegro, S. Ultra-Wideband Based Pose Estimation for Small Unmanned Aerial Vehicles. IEEE Access 2018, 6, 57526–57535.
  14. Buck, L.; Vargas, M.F.; McDonnell, R. The Effect of Spatial Audio on the Virtual Representation of Personal Space. In Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Christchurch, New Zealand, 12–16 March 2022; pp. 354–356.
  15. Kumar, C.P.; Poovaiah, R.; Sen, A.; Ganadas, P. Single access point based indoor localization technique for augmented reality gaming for children. In Proceedings of the 2014 IEEE Students’ Technology Symposium, Kharagpur, India, 28 February–2 March 2014; pp. 229–232.
  16. Lan, Y.S.; Sun, S.W.; Shih, H.C.; Hua, K.L.; Chang, P.C. O-Shooting: An Orientation-based Basketball Shooting Mixed Reality Game Based on Environment 3D Scanning and Object Positioning. In Proceedings of the 2018 IEEE 7th Global Conference on Consumer Electronics (GCCE), Nara, Japan, 9–12 October 2018; pp. 678–679.
  17. Zhao, Y.; Fan, X.; Xu, C.Z.; Li, X. ER-CRLB: An Extended Recursive Cramér–Rao Lower Bound Fundamental Analysis Method for Indoor Localization Systems. IEEE Trans. Veh. Technol. 2017, 66, 1605–1618.
  18. Liu, X.; Cen, J.; Zhan, Y.; Tang, C. An Adaptive Fingerprint Database Updating Method for Room Localization. IEEE Access 2019, 7, 42626–42638.
  19. Bouhdid, B.; Akkari, W.; Belghith, A. Accuracy/Cost Trade-Off in Localization Problem for Wireless Sensor Networks. In Proceedings of the 2018 International Conference on Control, Automation and Diagnosis (ICCAD), Marrakech, Morocco, 19–21 March 2018; pp. 1–6.
  20. Bao, L.; Li, K.; Li, W.; Shin, K.; Kim, W. A Sensor Fusion Strategy for Indoor Target Three-dimensional Localization based on Ultra-Wideband and Barometric Altimeter Measurements. In Proceedings of the 2022 19th International Conference on Ubiquitous Robots (UR), Jeju, Republic of Korea, 4–6 July 2022; pp. 181–187.
  21. Sesyuk, A.; Ioannou, S.; Raspopoulos, M. A Survey of 3D Indoor Localization Systems and Technologies. Sensors 2022, 22, 9380.
  22. Wu, Y.; Wang, W.; Xu, H.; Kang, H. Research on Indoor Sports Positioning Algorithm Based on UWB. In Proceedings of the 2021 International Conference on Control, Automation and Information Sciences (ICCAIS), Xi’an, China, 14–17 October 2021; pp. 638–643.
  23. Xu, H.; Zhu, Y.; Wang, G. On the anti-multipath performance of UWB signals in indoor environments. In Proceedings of the ICMMT 4th International Conference on Microwave and Millimeter Wave Technology, Beijing, China, 18–21 August 2004; pp. 163–166.
  24. Zhang, W.; Liu, J.; Wang, J. A multi-sensor fusion method based on EKF on granary robot. In Proceedings of the 2020 Chinese Control And Decision Conference (CCDC), Hefei, China, 22–24 August 2020; pp. 4838–4843.
  25. Li, Y.; Chen, W.; Wang, J.; Nie, X. Precise Indoor and Outdoor Altitude Estimation Based on Smartphone. IEEE Trans. Instrum. Meas. 2023, 72, 9513111.
  26. West, J.B. Torricelli and the Ocean of Air: The First Measurement of Barometric Pressure. Physiology 2013, 28, 66–73.
  27. Bhadoria, G.; Thomas, C.K.; Panchal, S.; Nayak, S. Multi Purpose Flight Controller for UAV. In Proceedings of the 2023 3rd International Conference on Advancement in Electronics & Communication Engineering (AECE), Ghaziabad, India, 23–24 November 2023; pp. 79–82.
  28. Bao, X.; Xiong, Z.; Sheng, S.; Dai, Y.; Bao, S.; Liu, J. Barometer measurement error modeling and correction for UAH altitude tracking. In Proceedings of the 2017 29th Chinese Control and Decision Conference (CCDC), Chongqing, China, 28–30 May 2017; pp. 3166–3171.
  29. Bo, L.; Chao, X.; Xiaohui, L.; Wenli, W. Research and experimental validation of the method for barometric altimeter aid GPS in challenged environment. In Proceedings of the 2017 13th IEEE International Conference on Electronic Measurement & Instruments (ICEMI), Yangzhou, China, 20–23 October 2017; pp. 88–92.
  30. Kalamees, T.; Kurnitski, J.; Jokisalo, J.; Eskola, L.; Jokiranta, K.; Vinha, J. Measured and simulated air pressure conditions in Finnish residential buildings. Build. Serv. Eng. Res. Technol. 2010, 31, 177–190.
  31. Ma, J.; Duan, X.; Shang, C.; Ma, M.; Zhang, D. Improved Extreme Learning Machine Based UWB Positioning for Mobile Robots with Signal Interference. Machines 2022, 10, 218.
  32. Pierleoni, P.; Belli, A.; Maurizi, L.; Palma, L.; Pernini, L.; Paniccia, M.; Valenti, S. A Wearable Fall Detector for Elderly People Based on AHRS and Barometric Sensor. IEEE Sens. J. 2016, 16, 6733–6744.
  33. Mondal, R.; Reddy, P.S.; Sarkar, D.C.; Sarkar, P.P. Compact ultra-wideband antenna: Improvement of gain and FBR across the entire bandwidth using FSS. IET Microw. Antennas Propag. 2020, 14, 66–74.
  34. Zhang, W.; Zhu, X.; Zhao, Z.; Liu, Y.; Yang, S. High Accuracy Positioning System Based on Multistation UWB Time-of-Flight Measurements. In Proceedings of the 2020 IEEE International Conference on Computational Electromagnetics (ICCEM), Singapore, 24–26 August 2020; pp. 268–270.
  35. Feng, T.; Yu, Y.; Wu, L.; Bai, Y.; Xiao, Z.; Lu, Z. A Human-Tracking Robot Using Ultra Wideband Technology. IEEE Access 2018, 6, 42541–42550.
  36. Lazzari, F.; Buffi, A.; Nepa, P.; Lazzari, S. Numerical Investigation of an UWB Localization Technique for Unmanned Aerial Vehicles in Outdoor Scenarios. IEEE Sens. J. 2017, 17, 2896–2903.
  37. Guo, H.; Li, M. Indoor Positioning Optimization Based on Genetic Algorithm and RBF Neural Network. In Proceedings of the 2020 IEEE International Conference on Power, Intelligent Computing and Systems (ICPICS), Shenyang, China, 28–30 July 2020; pp. 778–781.
  38. Pan, H.; Qi, X.; Liu, M.; Liu, L. Map-aided and UWB-based anchor placement method in indoor localization. Neural Comput. Appl. 2021, 33, 11845–11859.
  39. Oğuz-Ekim, P. TDOA based localization and its application to the initialization of LiDAR based autonomous robots. Robot. Auton. Syst. 2020, 131, 103590.
  40. Zhang, L.; Zhang, S.; Leng, C.T. A Study on the Location System Based on Zigbee for Mobile Robot. Appl. Mech. Mater. 2014, 651, 612–615.
  41. Tiemann, J.; Ramsey, A.; Wietfeld, C. Enhanced UAV Indoor Navigation through SLAM-Augmented UWB Localization. In Proceedings of the 2018 IEEE International Conference on Communications Workshops (ICC Workshops), Kansas City, MO, USA, 20–24 May 2018; pp. 1–6.
  42. Yoon, P.K.; Zihajehzadeh, S.; Kang, B.S.; Park, E.J. Robust Biomechanical Model-Based 3-D Indoor Localization and Tracking Method Using UWB and IMU. IEEE Sens. J. 2017, 17, 1084–1096.
Figure 1. The hardware system framework.
Figure 2. The tag’s hardware design.
Figure 3. Hardware of the localization system’s anchors.
Figure 4. Barometric pressure values measured by BMP sensors and filtering results in 30 min.
Figure 5. Estimated height values and relative height values based on dual BMP sensors in 30 min.
Figure 6. The calibration equation fitted based on the sampled measurement values and actual distance values.
Figure 7. Geometric model based on the three anchors.
Figure 8. Geometric localization model.
Figure 9. Geometric localization model of circle A and circle C intersecting at two points.
Figure 10. Geometric localization model of circle A and circle C without intersections.
Figure 11. A possible scenario of the geometric localization model.
Figure 12. The schematic of the experimental setup.
Figure 13. Results of the tag’s trajectories in the indoor 3D localization experiment.
Figure 14. Comparison of the tag’s coordinates in three dimensions.
Figure 15. CDF results for the 3D localization errors in the localization experiment.
Figure 16. Results of the height errors in the localization experiment.
Figure 17. Results of the tag’s locations in the locatable range validation experiment.
Figure 18. Boxplot results for the 3D localization errors in the locatable range validation experiment.
Table 1. The parameters of the devices used in the hardware system.

Device | Microchip | Board Model | Communication Mode
Controller (ESP32) | ESP32 | Arduino Nano ESP32 | Serial, I2C, Wi-Fi
Controller (ESP8266) | ESP8266 | ESP8266 D1 Mini | I2C, Wi-Fi
Sub-Controller | ATmega2560 | Arduino Mega | Serial, I2C
BMP Sensor | MS5611 | GY-63 | I2C
UWB Sensor | DW3000 | D-DWM-PG3.9 | Serial, UWB
Table 2. The parameters of the experimental setup.

H_Tripod | OA | OB | OC
1.5 m | 2.8 m | 2.4 m | 2.4 m
Table 3. Analysis of the indoor 3D localization experimental results.

| Coordinate Axis | Localization Result | Maximum Error [m] | RMSE * [m] |
| --- | --- | --- | --- |
| X | Geometric | 0.1524 | 0.0516 |
| | Filtering | 0.1137 | 0.0415 |
| Y | Geometric | 0.1374 | 0.0402 |
| | Filtering | 0.1403 | 0.0392 |
| Z | Geometric | 0.1434 | 0.0451 |
| | Filtering | 0.0616 | 0.0282 |
| X-Y (2D) | Geometric | 0.1781 | 0.0649 |
| | Filtering | 0.1602 | 0.0578 |
| X-Y-Z (3D) | Geometric | 0.1983 | 0.0790 |
| | Filtering | 0.1604 | 0.0643 |

* RMSE: Root Mean Square Error.
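The maximum-error and RMSE figures reported above can be reproduced from the estimated trajectory and the motion-capture reference. A minimal sketch (not the authors' code), assuming both trajectories are N × 3 arrays of X, Y, Z coordinates sampled at matching timestamps:

```python
import numpy as np

def localization_errors(est, gt):
    """Per-axis, 2D (X-Y), and 3D maximum error and RMSE between an
    estimated trajectory and a ground-truth reference (N x 3 arrays)."""
    est, gt = np.asarray(est, float), np.asarray(gt, float)
    diff = est - gt  # signed per-axis errors for every sample
    metrics = {}
    for i, axis in enumerate("XYZ"):
        metrics[axis] = {
            "max_error": float(np.max(np.abs(diff[:, i]))),
            "rmse": float(np.sqrt(np.mean(diff[:, i] ** 2))),
        }
    d2 = np.linalg.norm(diff[:, :2], axis=1)  # planar (X-Y) error per sample
    d3 = np.linalg.norm(diff, axis=1)         # full 3D error per sample
    for name, d in (("X-Y (2D)", d2), ("X-Y-Z (3D)", d3)):
        metrics[name] = {
            "max_error": float(d.max()),
            "rmse": float(np.sqrt(np.mean(d ** 2))),
        }
    return metrics

# Example with a toy two-sample trajectory offset by 0.03 m along X:
est = [[1.03, 2.0, 0.5], [2.03, 3.0, 1.0]]
gt = [[1.00, 2.0, 0.5], [2.00, 3.0, 1.0]]
m = localization_errors(est, gt)
```

Applying this separately to the geometric and filtered outputs yields the two rows per axis shown in the table.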
Table 4. Analysis of the locatable range validation experimental results.

| Point | Localization Methods | RMSE (2D) [m] | RMSE (3D) [m] |
| --- | --- | --- | --- |
| | Geometric | 0.0575 | 0.0777 |
| | Filtering | 0.0541 | 0.0688 |
| | Geometric | 0.0797 | 0.0928 |
| | Filtering | 0.0778 | 0.0856 |
| | Geometric | 0.0599 | 0.0835 |
| | Filtering | 0.0592 | 0.0696 |
| | Geometric | 0.0626 | 0.0900 |
| | Filtering | 0.0430 | 0.0720 |
| Average Value | Geometric | 0.0649 | 0.0860 |
| | Filtering | 0.0585 | 0.0740 |
Table 5. Comparison of 2D localization accuracy of different methods.

| No. | Reference | UWB | BMP | LiDAR | Zigbee | Range (L × W *) [m] | RMSE [m] |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | Proposed | | | | | 6 × 6 | 0.058 |
| 2 | [13] | | | | | 5 × 5 | 0.158 |
| 3 | [37] | | | | | 14 × 12 | 0.100 |
| 4 | [38] | | | | | 15 × 10 | 0.560 |
| 5 | [39] | | | | | 20 × 20 | 0.350 |
| 6 | [40] | | | | | 4 × 4 | 0.636 |

* L × W: Length × Width.
Table 6. Comparison of 3D localization accuracy of different methods.

| No. | Reference | UWB | BMP | SLAM | IMU | Range (L × W × H *) [m] | RMSE [m] |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | Proposed | | | | | 6 × 6 × 2 | 0.074 |
| 2 | [6] | | | | | 20 × 20 × 36 | 0.144 |
| 3 | [13] | | | | | 5 × 5 × 5 | 0.185 |
| 4 | [31] | | | | | 5 × 5 × 3 | 0.145 |
| 5 | [41] | | | | | 6 × 6 × 4.5 | 0.139 |
| 6 | [42] | | | | | 2 × 2 × 1 | 0.130 |

* L × W × H: Length × Width × Height.

Bao, L.; Li, K.; Lee, J.; Dong, W.; Li, W.; Shin, K.; Kim, W. An Enhanced Indoor Three-Dimensional Localization System with Sensor Fusion Based on Ultra-Wideband Ranging and Dual Barometer Altimetry. Sensors 2024, 24, 3341. https://doi.org/10.3390/s24113341

