Symmetry
  • Article
  • Open Access

18 February 2026

Symmetry-Enhanced Indoor Occupant Locating and Motionless Alarm System: Fusion of BP Neural Network and DS-TWR Technology

1 School of Environmental Science and Safety Engineering, Tianjin University of Technology, Tianjin 300384, China
2 School of Physics, Nankai University, Tianjin 300071, China
3 Information Research Institute of the Ministry of Emergency Management, Beijing 100029, China
4 Tianjin Zhishang Safety Technology Consulting Service Co., Ltd., Tianjin 300461, China

Abstract

To address the critical demand for real-time dynamic tracking of personnel in complex buildings during emergency rescue, a novel system is proposed that integrates Back Propagation (BP) neural networks with Double-Sided Two-Way Ranging (DS-TWR) technology to achieve precise indoor localization and motionless detection. Comprising hardware (positioning base stations, tags, POE switches, routers, and a computer) and software (developed on LabVIEW), the system leverages the symmetric signal transmission of DS-TWR and the adaptive learning capability of BP neural networks to effectively mitigate multipath interference, enhancing positioning consistency and accuracy. Thresholds for the time period and movement distance were set to determine whether an occupant was trapped. When tested in several common building structures, the system demonstrated good stability and high accuracy: the average RMSE of the positioning system was within 0.012–0.018 m (static state) and 0.048–0.065 m (dynamic state). Furthermore, the system can monitor and display the movement trajectory of each person in real time, and automatically raises an alarm when anyone is trapped at a fire scene. Hence, rescue measures can be taken promptly according to the alarm information provided by the system, effectively ensuring the safety of personnel and improving the efficiency of fire rescue work. The proposed approach provides a symmetry-driven framework for intelligent building safety.

1. Introduction

Large-scale fires in buildings are currently occurring at an increasing rate globally. In large public spaces such as shopping malls, museums, hospitals, and convention centers, fires pose significant challenges due to the expansive layouts, limited exits, and narrow passageways of these buildings. These structural characteristics hinder fire-fighters from identifying the real-time status and locations of trapped individuals [1], while victims themselves often struggle to perceive their surroundings, severely complicating rescue efforts. When individuals are rendered unconscious due to smoke inhalation or structural collapse, their lives are threatened as they cannot be detected and treated in a timely manner [2]. In 2024, China reported 2001 fire-related fatalities. A fire in a commercial and residential building in Xinyu City, Jiangxi Province, resulted in 39 deaths and nine injuries; a fire in an elderly apartment in Chengde City, Hebei Province, caused 20 deaths; and a fire at Jiuding Shopping Plaza in Zigong City, Sichuan Province, led to 16 deaths and 39 injuries.
In the field of emergency rescue, current traditional technologies for monitoring personnel status inside buildings primarily include infrared, ultrasonic, and computer vision technologies [3,4,5]. These methods are constrained by environmental lighting, monitoring angles, distance, and interior building structures, making large-scale search and rescue operations challenging. Moreover, these technologies mainly identify the static state through physiological signals such as human body heat radiation, respiration, and heartbeat, which lag behind events and can easily cause the critical rescue window to be missed. In addition, various positioning technologies have been proposed for personnel positioning, including geomagnetic positioning, ultrasound-based methods, inertial sensors, visual-based positioning, and approaches utilizing Radio Frequency Identification (RFID), WiFi, Bluetooth, Ultra-Wide Band (UWB), Zigbee, and cellular mobile networks [6,7,8,9,10,11]. Most of these technologies suffer from defects such as large positioning errors, high deployment costs, short transmission range, or poor real-time performance. In contrast, UWB technology, a low-power radio solution, is widely adopted in high-risk production sites such as mines and chemical plants, due to its high communication bandwidth, high security, high time resolution, and strong penetration.
However, due to multipath and non-line-of-sight (NLOS) propagation, UWB ranging results may still have deviations. Although the combination of UWB and machine learning has been explored in prior studies [12,13,14,15,16,17], existing research rarely integrates the symmetry of building structures into the system design and algorithm optimization, and few studies realize multifunctional integration of positioning, real-time trajectory monitoring and motionless alarm. Recent deep learning-based UWB localization studies have adopted CNNs [14,16], LSTMs [14,15] and GNNs [16]. CNNs excel in extracting spatial features of ranging data but require large-scale labeled datasets and high computational resources; LSTMs are suitable for dynamic positioning with time-series ranging data but have high latency and complex model training; GNNs can model the spatial correlation of base stations but require complex graph structure construction and are difficult to implement on industrial control platforms such as LabVIEW. In contrast, BP neural networks feature low computational complexity, fast training and inference speed, and mature engineering implementation, which is more suitable for fire rescue scenarios with strict real-time response and the limited computational resources of command center computers.
This paper proposes a building occupant locating and motionless alarm system that utilizes BP (Back Propagation) neural network and UWB-based DS-TWR (Double-Sided Two-Way Ranging) ranging technology. The system overcomes the positioning deviations caused by multipath effects and NLOS propagation during the DS-TWR ranging process, achieving centimeter-level positioning accuracy. Moreover, developed on the LabVIEW 2025 Q1 software platform, the system features a visual interface for real-time location, trajectory monitoring and an automatic motionless alarm, thereby shortening the search and rescue time and avoiding missing the golden rescue period. This system can be integrated into the fire command center, combined with existing fire detection systems, to provide intelligent services for emergency rescue in complex environments of large public places. The novelty of this work lies in three aspects: (1) A structure-signal-algorithm triple symmetric synergy-driven system framework is proposed to improve positioning consistency. The symmetric deployment of base stations (e.g., four corners of a rectangle) in symmetrical buildings conforms to the symmetric characteristics of electromagnetic wave propagation, the symmetric bidirectional communication mechanism of DS-TWR offsets clock skew, and the BP neural network is trained with symmetrically sampled data to enhance the generalization ability of ranging-coordinate mapping. Additionally, the Xavier initialization of weights in BP neural network is a symmetric algorithm that maintains the stability of activation values and gradients across all layers in the network by setting a symmetric distribution for weights. (2) The system not only achieves high-precision positioning but also integrates the visualization of real-time trajectory monitoring and motionless alarm functions, specifically addressing the needs of fire rescue scenarios. 
(3) The system has good transferability across rectangular, square and L-shaped layouts, with accurate trajectory display and stable operation even in metal obstacle environments. It features low computational complexity, enabling real-time response for fire rescue and other emergency scenarios.

2. Positioning Algorithm Based on BP Neural Network and DS-TWR Technology

2.1. DS-TWR Ranging Technology

UWB calculates the current location of personnel by exchanging information between pre-deployed positioning base stations and internal personnel tags. UWB positioning algorithms are primarily categorized into two types. One uses the time difference in signal transmission between tags and base stations for positioning (e.g., Time of Arrival (TOA) positioning algorithm and Time Difference of Arrival (TDOA) positioning algorithm [18]), which typically require strict clock synchronization. The other measures the distance between the tag and the base stations, and then calculates the tag position using methods such as trilateration, which can overcome errors caused by time asynchrony between tags and base stations. The selection of the UWB ranging method directly impacts the UWB system’s localization accuracy.
The ranging approaches mainly include Received Signal Strength Indicator (RSSI) ranging and Time of Flight (TOF) ranging. RSSI-based ranging exhibits significant deviations between measured and actual distances because signal strength is easily affected by multipath interference. TOF ranging employs bidirectional communication, with variants such as Single-Sided Two-Way Ranging (SS-TWR) and Double-Sided Two-Way Ranging (DS-TWR). DS-TWR adds an additional communication round on top of SS-TWR, utilizing symmetric bidirectional signal exchange (i.e., symmetric uplink and downlink communication) to offset clock skew. This ensures minimal error even under prolonged response times or low clock precision. Consequently, the occupant locating and motionless alarm system designed for buildings adopts DS-TWR for its ranging implementation.
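The clock-skew cancellation described above can be illustrated with the standard asymmetric DS-TWR time-of-flight formula. The following Python sketch uses hypothetical interval values; it is an illustration of the ranging principle, not the firmware implementation:

```python
# Minimal sketch of the symmetric DS-TWR time-of-flight estimate.
# The four intervals come from the two round trips between tag and base
# station; the symmetric form cancels the first-order error caused by
# clock skew between the two devices.
C = 299_792_458.0  # speed of light, m/s

def ds_twr_distance(t_round1, t_reply1, t_round2, t_reply2):
    """Distance (m) from the four DS-TWR intervals (s), where
    t_round1 = 2*tof + t_reply1 and t_round2 = 2*tof + t_reply2."""
    tof = (t_round1 * t_round2 - t_reply1 * t_reply2) / (
        t_round1 + t_round2 + t_reply1 + t_reply2)
    return tof * C

# Example: a true range of 3 m with unequal (asymmetric) reply delays
# still recovers the correct distance.
tof_true = 3.0 / C
t_reply1, t_reply2 = 200e-6, 210e-6            # hypothetical device delays
d = ds_twr_distance(2 * tof_true + t_reply1, t_reply1,
                    2 * tof_true + t_reply2, t_reply2)
```

Note that the two reply delays need not be equal; the quotient form removes their first-order contribution, which is why DS-TWR tolerates low clock precision.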
After the distance between the tag and each base station is measured, previous research usually adopts trilateration to calculate the tag’s position. Owing to the multipath effects of radio transmission and the influence of signal strength and clock accuracy, the DS-TWR ranging results often contain deviations. If the ranging data are substituted directly into the trilateration formulas, this deviation is inevitably carried into the positioning results.
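For reference, the trilateration baseline discussed above can be sketched as a linearized least-squares solve. This is a generic textbook formulation, not the paper's exact implementation; any ranging bias propagates directly into the solved coordinates, which motivates the neural network alternative:

```python
import numpy as np

def trilaterate(anchors, ranges):
    """Linearized least-squares trilateration.
    anchors: (n, 2) base-station coordinates; ranges: (n,) measured distances.
    Subtracting the first circle equation from the others yields a linear
    system A @ [x, y] = b in the unknown tag position."""
    x0, y0 = anchors[0]
    r0 = ranges[0]
    A, b = [], []
    for (xi, yi), ri in zip(anchors[1:], ranges[1:]):
        A.append([2.0 * (xi - x0), 2.0 * (yi - y0)])
        b.append(r0**2 - ri**2 + xi**2 - x0**2 + yi**2 - y0**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol

# Example with four corner anchors of a 7 m x 5 m area and exact ranges.
anchors = np.array([[0.0, 0.0], [7.0, 0.0], [0.0, 5.0], [7.0, 5.0]])
ranges = np.linalg.norm(anchors - np.array([3.0, 2.0]), axis=1)
est = trilaterate(anchors, ranges)
```

With noise-free ranges the solve is exact; with biased DS-TWR ranges the same bias appears in `est`, which is the deviation the BP neural network is trained to absorb.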

2.2. BP Neural Network

The BP neural network, as an intelligent information processing system, is a multilayer feed-forward network trained with error back propagation. It has strong capabilities for non-linear mapping and self-learning [19], which can eliminate the influence of ranging error and effectively improve positioning accuracy. Therefore, a BP neural network is integrated to calculate the coordinates of personnel.
Compared with Support Vector Regression (SVR), random forest, and deep neural networks, BP neural network is selected in this work due to the following considerations: (1) The input-output relationship of the positioning system (distance to coordinates) is relatively simple, and BP neural network can achieve sufficient accuracy with lower computational complexity, which is conducive to real-time operation; (2) BP neural network has good adaptability to symmetrical spatial data, and the training process is easy to combine with the symmetric characteristics of the system; (3) BP neural network is more mature and easier to implement on the LabVIEW platform, reducing the difficulty of system integration. Although advanced models may achieve slightly higher accuracy in some cases, their computational cost and deployment complexity are not suitable for real-time positioning and alarm systems in fire rescue scenarios.
A BP neural network consists of an input layer, one or more hidden layers, and an output layer [20], as shown in Figure 1. Signals enter the network through neurons in the input layer, are processed by activation functions in the hidden layer, and are then transmitted to the output layer, where they undergo another activation function before final output. Neurons in each layer are fully connected to all neurons in adjacent layers through weighted connections. The output of each node affects only the inputs of nodes in the next layer, and neurons within the same layer are independent of each other.
Figure 1. Structure diagram of BP neural network model.
Taking a node in the latter of any two adjacent layers as an example, the neuron model in a BP neural network is illustrated in Figure 2, where x_i represents the input term of the function, w_ij denotes the weight between nodes, j indicates the j-th node, θ_j represents the threshold of the j-th node, and f_j is the activation function of this model. The activation function in BP neural networks is typically the sigmoid function, expressed as [21]:
        f_j = 1 / (1 + e^(−(Σ_i x_i · w_ij + θ_j)))
Figure 2. Neuron model diagram.
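As a concrete illustration, the sigmoid neuron described above can be written in a few lines of Python (the input values and weights below are hypothetical, purely for demonstration):

```python
import math

def neuron_output(x, w, theta):
    """Sigmoid activation of one node j:
    f_j = 1 / (1 + exp(-(sum_i x_i * w_ij + theta_j)))."""
    s = sum(xi * wi for xi, wi in zip(x, w)) + theta
    return 1.0 / (1.0 + math.exp(-s))

# A zero weighted sum maps to the sigmoid midpoint 0.5;
# larger weighted sums saturate towards 1, smaller towards 0.
y_mid = neuron_output([0.0, 0.0], [1.0, 1.0], 0.0)   # -> 0.5
```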
The training process of the BP neural network algorithm is mainly divided into two stages: forward propagation of signals and backpropagation of errors, as illustrated in the flowchart in Figure 3.
Figure 3. Flowchart of BP neural network algorithm.

2.3. BP Neural Network Training to Enhance Positioning Accuracy

The BP neural network learns the functional relationship between input (DS-TWR ranging data) and output (actual coordinates) through training, and uses the trained model to calculate personnel position coordinates. Detailed parameter determination, sample collection and training process are moved to Appendix A, and the core training settings and results are summarized as follows:
(1) Sample collection and division: A 7 m × 5 m rectangular symmetrical experimental area is constructed, with four base stations deployed at the four corners (symmetric layout). A total of 400 symmetrically and uniformly distributed sample points are selected in the area, and DS-TWR is used to measure the distance from the tag to each base station to form a 400 × 4 raw dataset, which is randomly divided into training set (240 groups, 60%), validation set (80 groups, 20%) and test set (80 groups, 20%).
(2) Network parameter settings: Training is conducted in MATLAB R2022b with the core parameters shown in Table 1; the data are normalized to (0,1), Xavier weight initialization and an adaptive learning rate (10% reduction every 200 iterations) are adopted, and early stopping (triggered when the validation error increases for five consecutive iterations) prevents overfitting.
Table 1. Input parameters for BP neural network.
(3) Hidden layer node determination: Considering the input layer nodes m = 4 (four groups of ranging data), output layer nodes n = 2 (x/y coordinates), and sample size k = 400, the number of hidden layer nodes calculated by empirical formulas [22,23] ranges from 6 to 13. Trial-and-error and sensitivity analysis are adopted to find the minimum training and validation error. The training performance for 6–15 nodes is shown in Table 2. The optimal number of hidden layer nodes is determined to be 13, giving a network topology of 4-13-2.
Table 2. Test errors for different numbers of hidden layer nodes.
(4) Model verification: The trained BP model is validated on the test set (80 points) and compared with baseline models (trilateration, random forest, k-nearest neighbor) using Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), Mean Absolute Error (MAE) and cumulative distribution function (CDF) evaluation metrics.
Figure 4 displays the root mean square errors of the different models, and Table 3 lists the prediction error metrics for the four models. The BP model’s predictions outperform those of the trilateration method, random forest, and k-nearest neighbor. Compared with the trilateration method, the BP model achieves 81.67% lower RMSE, 51.72% lower MAPE, and 56.53% lower MAE. Compared with the random forest and k-nearest neighbor models, the BP model also achieves lower error metrics, indicating that it is more suitable for this positioning system, especially in symmetrical spatial environments.
Figure 4. Comparison of RMSE for the different models.
Table 3. Prediction errors for the four models.
The CDF curves in Figure 5 show that the BP model has a higher cumulative probability of small errors, with 95% of errors within 0.06 m and 50% within 0.02 m, verifying its high precision and stability. These results demonstrate that the BP model can effectively enhance positioning accuracy.
Figure 5. CDF curves of prediction errors for different models.
In summary, the BP neural network model trained with symmetrical spatial data can effectively enhance positioning accuracy, outperforming other machine learning models. Therefore, the system developed in this paper adopts a combination of DS-TWR ranging and BP neural network for positioning calculation.
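The evaluation metrics used in this comparison can be computed as follows. The Euclidean per-point error is standard; note that the exact MAPE definition used in the paper is not stated, so the relative-error form below (error relative to the true range from the origin) is an assumption:

```python
import numpy as np

def error_metrics(pred, true):
    """RMSE, MAE, MAPE and 95th-percentile (CDF) of Euclidean positioning
    errors between predicted and true (n, 2) coordinate arrays."""
    err = np.linalg.norm(pred - true, axis=1)          # per-point error (m)
    rmse = float(np.sqrt(np.mean(err ** 2)))
    mae = float(np.mean(err))
    # Assumed MAPE form: error relative to the true distance from origin.
    mape = float(np.mean(err / np.linalg.norm(true, axis=1)) * 100.0)
    p95 = float(np.percentile(err, 95))                # CDF 95% bound
    return rmse, mae, mape, p95

# Example with a uniform 5 cm offset: every per-point error is 0.05 m.
true = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 1.0]])
pred = true + np.array([0.03, 0.04])
rmse, mae, mape, p95 = error_metrics(pred, true)
```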

3. Design of the System

The system consists of two parts: the hardware equipment required for locating and the software programming that performs the calculations for the various functions. Its core functions are high-precision personnel positioning, real-time trajectory display and automatic motionless alarm for fire environments.

3.1. Overview of System Functions

Based on LabVIEW platform, BP neural network and UWB DS-TWR technology, the system realizes the accurate positioning of personnel in fire environments, with real-time display of personnel position, movement trajectory and automatic alarm information on computer terminals. The core functions include:
(1) High-precision positioning: Eliminating UWB ranging error through BP neural network to achieve centimeter-level positioning of personnel in the fire.
(2) Personnel locating and trajectory monitoring. Collecting hardware information via LabVIEW, calculating real-time position through the proposed algorithm, and recording and displaying personnel movement trajectory.
(3) Automatic motionless alarm: Judging the movement status of the personnel according to configured thresholds, and triggering a real-time alarm to the command center for the trapped personnel.

3.2. Hardware Equipment

The hardware equipment includes UWB positioning base stations, positioning tags, POE switches, routers and a computer, as shown in Figure 6. Base stations are symmetrically deployed within the positioning area to provide position parameters. Tags are worn by personnel in the building area to collect real-time position information. POE switches are responsible for data transmission in the system and power supply to base stations. Routers connect all base stations to the same local network and assign IP addresses to the base stations. The computer processes the collected data, calculates the specific position of each person via the BP neural network algorithm and displays it on the terminal. The POE switches, routers and computer are all placed in the Building Fire Protection Command and Control Center. The working process is: wireless signal exchange between tags and base stations → base stations collect ranging data → data transmission to computer via POE switches/routers → BP neural network calculation for real-time position → feedback to system interface for locating, monitoring and alarm.
Figure 6. Structural framework of the system.

3.3. Software Programming

The system utilizes the LabVIEW platform to collect the hardware information of the positioning system, imports the DS-TWR ranging data into the BP neural network (trained in MATLAB and integrated into LabVIEW via DLL files) to calculate the position information of personnel in the fire scene, and displays it on the front-end system interface in real time. A trajectory display module draws the real-time movement trajectory, and an alarm function triggers the automatic alarm according to the movement status of the personnel. The relevant programming diagrams are attached in Appendix B.
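Conceptually, the inference step that LabVIEW performs through the exported model reduces to a plain forward pass through the 4-13-2 network. The sketch below assumes the weight matrices and normalization limits have been exported from the trained MATLAB model; the zero-weight values in the example are placeholders, not real trained parameters:

```python
import numpy as np

def bp_forward(d, W1, b1, W2, b2, d_lo, d_hi, xy_lo, xy_hi):
    """Forward pass of the trained 4-13-2 network: normalize the four
    DS-TWR ranges to (0,1), apply the sigmoid hidden layer, then
    denormalize the linear output back to coordinates in metres."""
    z = (d - d_lo) / (d_hi - d_lo)                 # (0,1) input normalization
    h = 1.0 / (1.0 + np.exp(-(W1 @ z + b1)))       # hidden layer (sigmoid)
    y = W2 @ h + b2                                # linear output layer
    return xy_lo + y * (xy_hi - xy_lo)             # back to metres

# Shape check with placeholder (zero) weights for the 4-13-2 topology.
W1, b1 = np.zeros((13, 4)), np.zeros(13)
W2, b2 = np.zeros((2, 13)), np.zeros(2)
xy = bp_forward(np.array([1.0, 2.0, 3.0, 4.0]), W1, b1, W2, b2,
                0.0, 10.0, np.zeros(2), np.array([7.0, 5.0]))
```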
Figure 7a shows the interactive interface of the system, with the white area displaying the layout of the positioning area. In the area numbered 1, operations and parameter settings can be performed, including starting record, refreshing record, current position display, trajectory display and alarm display. The trajectory color of different individuals can be set in the area numbered 2. The parameter setting panel (Figure 7b) supports background map import (JPG/PNG/BMP), base station parameter configuration, personnel name definition and alarm threshold setting (movement distance/time period). Taking PNG image format as an example, after selecting the image, the system extracts its metadata, outputs the image data to generate a flattened pixel map, and imports the map into an XY plot as a background layer.
Figure 7. Display of the system interface, (a) interactive interface of the system, and (b) control panel parameter setting bar.
The system’s real-time performance test results show an update rate of 10 Hz (position data updated every 100 ms), average latency of 80 ms from data collection to position display, 15% average CPU usage and 800 MB memory usage on an Intel Core i7-12700H/16 GB RAM computer, which can run stably on ordinary fire command center computers and meet the real-time response requirements of emergency scenarios.

4. Functional Test

To fully verify the performance of the system in different structures and complex environments, multiple test scenarios were designed, including a rectangular laboratory (14 m × 10 m), a square office (7 m × 7 m), and an L-shaped corridor (10 m × 3 m + 5 m × 4 m). The locating, motion-monitoring and motionless-alarm system was built in each scenario, as shown in Figure 8. Base stations were deployed in each scenario, with their planar positions determined according to the actual environment. After construction was complete, the relevant parameters of the positioning base stations were input into the system for testing and verification.
Figure 8. Physical image of test scenarios, (a) rectangular laboratory, (b) square office, and (c) L-shaped corridor.
Personnel at a fire scene are usually in motion. Therefore, when a person remains stationary for a long time, it can be assumed that they may have fallen into a coma or become trapped. The alarm range was set to 0.2 m and the alarm time period to 30 s, consistent with commonly used firefighter pagers. When the distance traveled by a person within 30 s does not exceed 20 cm, the system displays an alarm message. The thresholds for movement range and alarm time period can be adjusted for different situations.
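A minimal sketch of this threshold logic is shown below. It is an illustrative Python model, not the system's LabVIEW implementation, and it measures displacement over the window rather than total path length, which is one reasonable reading of the criterion:

```python
import math
from collections import deque

class MotionlessAlarm:
    """Sliding-window motionless check: alarm when the tag's displacement
    over the last `window_s` seconds stays below `dist_m`."""
    def __init__(self, dist_m=0.2, window_s=30.0):
        self.dist_m, self.window_s = dist_m, window_s
        self.hist = deque()  # (timestamp, x, y) samples

    def update(self, t, x, y):
        """Feed one position sample; return True if the alarm should fire."""
        self.hist.append((t, x, y))
        # Drop samples older than the window, keeping one boundary sample.
        while len(self.hist) > 1 and self.hist[1][0] <= t - self.window_s:
            self.hist.popleft()
        t0, x0, y0 = self.hist[0]
        covered = (t - t0) >= self.window_s      # full window observed?
        return covered and math.hypot(x - x0, y - y0) < self.dist_m
```

With the default thresholds, the first `True` is returned once a full 30 s window has been observed with less than 0.2 m of displacement; changing `dist_m` or `window_s` mirrors the adjustable thresholds described above.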

4.1. Static Positioning Accuracy and Stability Test

To verify the positioning accuracy and stability of the system under static conditions, multi-point testing was conducted in different regions of the L-shaped corridor. Seven test points were selected: P1(1.00, 1.20), P2(1.50, 5.00), P3(2.50, 9.00), P4(2.80, 4.20), P5(3.50, 3.80), P6(4.00, 2.00) and P7(7.50, 3.00). These points cover locations such as corners, bends, and central areas, as shown in Figure 9a. For each test point, the coordinate data of the tag were recorded every 2 s over 20 min, resulting in a total of 600 data points. Figure 9b shows the coordinate data recorded at test point P1(1.00, 1.20), and Table 4 summarizes the static positioning errors of the seven test points.
Figure 9. Positioning under static state, (a) test points and (b) coordinate data at P1.
Table 4. Static positioning errors of the seven test points.
The results indicate that in the stationary state, the positioning error of the system is approximately 0.012–0.018 m. Specifically, the X-coordinate fluctuates within ±0.025 to ±0.032 m, while the Y-coordinate fluctuates within ±0.035 to ±0.042 m, confirming the system’s favorable static positioning accuracy. Furthermore, 95% of the coordinate data collected over 20 min at the same test point are concentrated within ±0.02 m of the true coordinates, which verifies the system’s stability.

4.2. Motion Trajectory Monitoring Accuracy Test

To test the accuracy of the system in detecting the trajectory of personnel during movement, preset routes were designed in the rectangular laboratory, square office, and L-shaped corridor. A person wearing a tag was assigned to walk strictly along the preset route from the starting point to the end point. The trajectory recorded by the system during this process was compared with the actual walking trajectory. Figure 10 shows the comparison in the rectangular laboratory: the black line represents the trajectory recorded by the system, and the red line represents the actual walking trajectory. Table 5 summarizes the trajectory monitoring errors in the different scenarios.
Figure 10. Comparison of recorded trajectory and actual walking path. (The black line represents the trajectory recorded by the system, and the red line represents the actual walking trajectory).
Table 5. Trajectory monitoring errors in different scenarios.
It can be seen from Figure 10 and Table 5 that the trajectory recorded by the system is basically consistent with the actual walking path. A few coordinate points show significant errors, mainly concentrated at the corners. The maximum deviation is 0.10 m (L-shaped corridor), and the average RMSE is within 0.048–0.065 m. The CDF metrics show that 95% of the trajectory errors are within 0.070–0.095 m, indicating that the system exhibits excellent positioning accuracy under dynamic conditions and transfers well to different layouts.

4.3. Real-Time Position and Trajectory Display Function Test

In order to test the real-time position and trajectory display function, two individuals were asked to carry positioning tags and walk in any direction in the square office. To distinguish between the individuals, one is referred to as Zhang San, and the other as Li Si.
When the walking trajectories of all individuals in the area need to be viewed, the mode ‘Show all trajectories’ should be chosen, as shown in Figure 11, and the walking routes of both Zhang San and Li Si are then shown simultaneously. If only the trajectory of a specific person (such as Zhang San) is required, the mode ‘Show Zhang San’s Trajectory’ should be chosen, which displays only the walking trajectory of the designated person while the trajectories of the others are hidden.
Figure 11. Display of the walking trajectories.
When the ‘Show current position’ mode is selected, the current position of all individuals can be automatically displayed in the interface, as shown in Figure 12. When it is necessary to display the position of only a certain person (such as Zhang San), the button ‘display Zhang San’s position’ should be clicked, and then the current position of the specified individual would be shown in the interface.
Figure 12. Display of the real-time positions.
The test results show that the real-time position and trajectory display function runs well and meets the design requirements.

4.4. Alarm Display Function Test

The alarm indication light is set to trigger an alarm based on the following criterion: If the tag moves less than 0.2 m within 30 s, the corresponding individual will be judged to be in danger, the indicator light will turn red, and the name of the individual will be shown in the name display box of the alarm personnel.
The test procedure and results are as follows:
(1) Two participants (Zhang San and Li Si), each wearing a positioning tag, begin walking and keep walking for 5 min.
The alarm indicator light does not change, as shown in Figure 13a.
Figure 13. Display of alarm function: (a) both walking, (b) static/walking, and (c) both static.
(2) After 5 min, Li Si becomes stationary, while Zhang San continues to walk.
At this time, the alarm indicator light turns red after about 2 s, and ‘Li Si alarm’ is displayed, as shown in Figure 13b.
(3) Then, after 3 min, Zhang San stops walking. Both participants remain stationary for 2 min.
At this moment, ‘Zhang San alarm; Li Si alarm’ is displayed. During the 2 min in which the two participants remain stationary, the alarm indicator light stays red, and the name display box of the alarm personnel continues to show ‘Zhang San alarm; Li Si alarm’, as shown in Figure 13c.
The comprehensive test results prove that the system issues an alarm once someone becomes motionless, and the name display function correctly shows the names of the alarm personnel. Although the alarm indicator light has a delay of about two seconds, this does not affect timely rescue.

4.5. Complex Environment Test

To verify the performance of the system in complex environments, we deployed several metal-structured devices and high-power electromagnetic equipment in the rectangular laboratory, and remeasured the static positioning accuracy and dynamic trajectory accuracy. Table 6 summarizes the test results.
Table 6. System performance in complex environments.
The results show that in complex environments, the system’s positioning accuracy slightly decreases, but still maintains a high level (static RMSE < 0.03 m, dynamic RMSE < 0.09 m). The alarm delay increases slightly (≤2.5 s), which is within the acceptable range for fire rescue. This indicates that the system has good anti-interference ability and practicality under metal and electromagnetic environments.

5. Discussion

Table 7 summarizes several studies on indoor positioning systems, which basically cover the current common positioning technologies and system functions.
Table 7. Summary of the relevant literature.

5.1. Technology

At present, indoor positioning technology for emergency rescue mainly includes UWB, ZigBee, RFID, inertial navigation technology, etc. Among these positioning methods, UWB is the most widely used owing to its fast signal transmission, low production cost and high positioning accuracy [36]. However, UWB ranging can still produce biased data due to the multipath effect and NLOS propagation. Combining two or more technologies can effectively improve positioning accuracy. For example, the positioning system developed by Vey et al. [34] using UWB, an altimeter and radio has an accuracy of 0.36 m.
Thus, this paper integrates a BP neural network to establish a nonlinear function model between ranging data and actual coordinates, effectively reducing positioning errors. Compared with alternative models such as random forest and k-nearest neighbor, the BP neural network is more suitable for this system owing to its high accuracy, low computational complexity and good compatibility with the LabVIEW platform, which are crucial for real-time positioning and alarm in fire rescue scenarios.
The symmetry-driven design is the key innovation of this system. For highly asymmetric building structures (e.g., irregular curved layouts, non-uniformly distributed obstacles) and cluttered environments, the system’s performance will have a certain degree of degradation: (1) In highly asymmetric layouts, the symmetric matching between base station deployment and electromagnetic wave propagation will be broken, leading to increased DS-TWR ranging errors. (2) The BP neural network trained with symmetrical spatial data has limited generalization ability to asymmetric spatial features, which will result in an increase in positioning RMSE. (3) In severely cluttered environments, multipath and NLOS propagation will be aggravated, leading to unstable DS-TWR ranging data and an increase in dynamic positioning RMSE, and the alarm delay may increase.

5.2. Accuracy

As shown in Table 7, the positioning accuracy of the current related research is 0.36–10 m, while our research achieves a much higher centimeter-level accuracy (0.012–0.018 m in static state and 0.048–0.065 m in dynamic state).
This paper fuses a BP neural network with UWB (DS-TWR) technology, reducing the positioning error to the centimeter level (an order of magnitude lower than existing UWB-based systems). It should be emphasized that the reported centimeter-level accuracy relies on prior calibration and training (systematic symmetric sample collection and BP neural network parameter optimization) and controlled deployment conditions (a symmetric base station layout, a fixed spatial reference, and low environmental interference in the test scenarios); in actual fire scenes with random obstacles and dynamic environmental changes, the system’s positioning accuracy will decrease.

5.3. Function

The system realizes three core functions for fire rescue scenarios, which are rarely integrated in the existing research (as shown in Table 7):
(1) High-precision personnel positioning. During emergencies such as fires or earthquakes, the critical prerequisite for search and rescue operations is rapidly determining occupants’ positions and current statuses. This system achieves real-time visualization of the number and distribution of on-site personnel in the fire command center, facilitating the dispatch of rescue forces to densely populated areas and improving rescue efficiency.
(2) Real-time trajectory display. By supporting separate viewing of each person’s walking path, the system helps the command center guide firefighter rescue and evacuation and prevents people from getting lost in complex building structures. Test results in different layouts (rectangular, square, L-shaped) show that the system transfers well and that the trajectory display is accurate and reliable.
(3) Automatic motionless alarm. During a fire, individuals often fall into a coma due to smoke poisoning or the sudden collapse of building structures and cannot move or evacuate independently. The automatic motionless alarm is the key function designed for this situation: it triggers a real-time alarm when a person remains stationary beyond the set time threshold, with the alarm information including the trapped person’s name and location to ensure timely rescue. The system transfers well across rectangular, square, and L-shaped layouts and works normally in metal-obstacle environments, enhancing its practicality in actual fire rescue.
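The motionless-detection rule described above (thresholds on elapsed time and movement distance) can be sketched as a sliding-window check; the 30 s / 0.5 m thresholds here are illustrative placeholders, not the system’s actual settings:

```python
# Minimal sketch of the motionless-alarm rule: flag a person as trapped when
# their displacement over a full time window stays below a distance threshold.
# The 30 s / 0.5 m thresholds are illustrative, not the system's settings.
import math
from collections import deque

class MotionlessDetector:
    def __init__(self, time_threshold_s=30.0, dist_threshold_m=0.5):
        self.time_threshold = time_threshold_s
        self.dist_threshold = dist_threshold_m
        self.history = deque()  # (timestamp, x, y) position fixes

    def update(self, t, x, y):
        """Add a position fix; return True if the person appears trapped."""
        self.history.append((t, x, y))
        # Keep one sample at or beyond the window edge so we can tell
        # whether the time window is fully covered.
        while len(self.history) >= 2 and t - self.history[1][0] >= self.time_threshold:
            self.history.popleft()
        if t - self.history[0][0] < self.time_threshold:
            return False  # not enough history yet
        max_disp = max(math.hypot(x - xi, y - yi) for _, xi, yi in self.history)
        return max_disp < self.dist_threshold
```

For example, feeding one fix per second for a person who walks for 30 s and then stops produces no alarm while moving and an alarm once the stationary period fills the window.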

6. Conclusions

For personnel trapped in a fire, early detection can significantly increase their chance of survival.
(1) A symmetry-enhanced indoor occupant locating and motionless alarm system was designed, which integrates UWB (DS-TWR) ranging technology with a BP neural network algorithm on the LabVIEW platform, forming a structure-signal-algorithm triple symmetric synergy. The system’s centimeter-level positioning accuracy (0.012–0.018 m static, 0.048–0.065 m dynamic) is achieved under prior calibration/training and controlled symmetric deployment conditions, and it achieves higher positioning accuracy and stronger adaptability to symmetric building structures than existing systems.
(2) The system was tested in multiple scenarios (rectangular laboratory, square office, L-shaped corridor) and complex environments (metal obstacles + electromagnetic interference). The results show good positioning accuracy and stability, accurate real-time position/trajectory display, and a timely, automatic motionless alarm (alarm delay ≤ 2.5 s). The system maintains high positioning accuracy (static RMSE <0.03 m, dynamic RMSE <0.09 m) in complex environments, with good anti-interference ability.
(3) The system can effectively assist fire command centers with real-time positioning and trajectory monitoring of on-site personnel, and automatically trigger alarm signals for trapped motionless personnel, ensuring timely rescue. It has good practical value and application prospects in fire rescue and other emergency scenarios, with low computational complexity and easy integration with existing fire detection systems.

7. Future Work

In this paper, a locating, monitoring, and motionless-alarm system is designed. On the basis of accurate positioning, trapped personnel can be promptly identified and located. Many electronics manufacturers such as Apple and Xiaomi have integrated UWB chips into their mobile phones. In the future, mobile phones could replace UWB positioning tags, reducing the deployment cost of indoor positioning infrastructure and improving the operability of the system. To realize this goal, the following key technical issues need to be addressed: (1) Compatibility between mobile UWB chips and existing base stations: conduct compatibility tests between mainstream mobile UWB chips (e.g., Apple U1, Xiaomi UWB chip) and base stations, and develop adaptive protocols if necessary. (2) Signal transmission differences: analyze the differences in signal transmission power, bandwidth, and ranging accuracy between mobile phone UWB chips and professional tags, and optimize the BP neural network model to adapt to these differences. (3) Power consumption optimization: mobile phones have limited battery capacity, so a low-power UWB communication mechanism must be designed to ensure long-term operation in emergency scenarios.
It is very important to train the BP neural network according to the actual situation. In future research, we will further expand the training sample set to include more building layouts (especially highly asymmetric layouts) and environmental conditions, improving the generalization ability of the model. In addition, the lightweight fusion of the BP neural network with shallow CNNs/LSTMs will be explored to balance positioning accuracy and computational complexity, improving the system’s performance in highly asymmetric building structures. Moreover, future research should explore the application of this system in modern asymmetric building structures: the optimal base station deployment strategy and the corresponding BP neural network optimization will be studied for asymmetric structures to extend the application scope of the system to more complex buildings.

Author Contributions

Conceptualization, L.W. and X.M.; methodology, L.W. and W.C.; software, Z.W.; validation, Z.W. and A.S.; formal analysis, X.M.; data curation, X.M.; writing—original draft preparation, Z.W.; writing—review and editing, L.W.; visualization, Z.W.; supervision, L.W.; project administration, L.W.; funding acquisition, L.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by National Natural Science Foundation of China, grant number 52074192 and Tianjin Municipal Science and Technology Bureau, grant number 24YDTPJC00110.

Data Availability Statement

Dataset available on request from the authors.

Conflicts of Interest

The authors declare no conflicts of interest. Author Aijun Sun was employed by the company Tianjin Zhishang Safety Technology Consulting Service Co., Ltd. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Appendix A. BP Neural Network Training Implementation Details

This appendix includes the detailed construction process of the symmetrical experimental scenario, systematic sample collection steps, complete BP neural network training process flow, and original parameter adjustment records.
(1) The symmetrical experimental scenario and samples
To ensure the systematicity and representativeness of the data, the construction of the experimental scenario follows a universal principle in architectural design: the spatial symmetry of the structure. Since most architectural plans are rectangular with symmetric structures, a 7 m × 5 m rectangular area was selected and constructed for the experiment. At the initial stage, a local coordinate system was established using a total station, and the actual coordinates of all positioning base station antennas were accurately measured, providing a unified spatial reference based on geometric symmetry for the entire measurement system.
The hardware deployment also reflects the symmetry principle. Following the classic full-coverage positioning model, four positioning base stations were deployed at the four corners of the rectangular area. This four-corner symmetric layout conforms to the symmetric propagation characteristics of electromagnetic waves and provides a geometrically optimal reference framework for trilateration algorithms.
The core strategy for experimental data collection is to fully utilize the symmetrical geometric characteristics of the rectangular region for systematic sampling. In order to effectively support the precise and efficient training of BP neural networks and fully consider the impact of data size on model performance, this study symmetrically and uniformly selected 400 sample points for data collection throughout the entire rectangular region. These 400 points are systematically selected to cover possible personnel positions within the area, while ensuring balanced data distribution, as shown in Figure A1. A raw ranging dataset with a scale of 400 × 4 was generated by measuring the distance from the tag to each base station using DS-TWR technology.
Figure A1. 400 positioning points for BP neural network training and testing.
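A symmetric point set of this kind can be sketched as a uniform 20 × 20 grid over the 7 m × 5 m area; the 0.25 m wall margin used below is a hypothetical value for illustration, and the actual point layout is the one shown in Figure A1:

```python
# Sketch of a symmetric 400-point sampling grid for the 7 m x 5 m area.
# The uniform 20 x 20 layout and the 0.25 m wall margin are assumptions
# for illustration; the paper's actual layout is shown in Figure A1.
import numpy as np

xs = np.linspace(0.25, 6.75, 20)   # along the 7 m side
ys = np.linspace(0.25, 4.75, 20)   # along the 5 m side
points = np.array([(x, y) for x in xs for y in ys])  # shape (400, 2)

# The grid is mirror-symmetric about the room centre (3.5, 2.5): reflecting
# every point through the centre reproduces the same point set.
center = np.array([3.5, 2.5])
mirrored = 2 * center - points
```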
The 400 groups of data were divided into a training set, validation set, and test set by random sampling at a ratio of 6:2:2: 240 groups as the training set, 80 as the validation set (for overfitting control), and 80 as the test set.
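The 6:2:2 random split can be sketched as follows; the random seed is arbitrary, chosen only for repeatability:

```python
# Sketch of the 6:2:2 random split of the 400 samples into training,
# validation, and test index sets. The seed is arbitrary.
import numpy as np

rng = np.random.default_rng(42)
indices = rng.permutation(400)
train_idx = indices[:240]    # 60% training
val_idx = indices[240:320]   # 20% validation (overfitting control)
test_idx = indices[320:]     # 20% test
```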
(2) BP neural network training process and parameters
Network parameters were continuously adjusted and optimized to enhance generalization and convergence capabilities. The software environment for BP neural network training is MATLAB R2022b, and the training procedure is as follows: (1) Data preprocessing: Normalize the input distance data and output coordinate data to the range [0, 1] to accelerate network convergence. (2) Network initialization: Adopt Xavier initialization for weights to avoid gradient vanishing, and initialize thresholds to 0. (3) Forward propagation: Calculate the output of the hidden layer and output layer using the sigmoid activation function. (4) Error calculation: Use mean squared error (MSE) as the loss function. (5) Backpropagation: Adjust weights and thresholds using the gradient descent algorithm, with an adaptive learning rate (reduce by 10% every 200 iterations). (6) Overfitting control: Monitor the validation set error during training. If the validation error increases for five consecutive iterations, stop training early to avoid overfitting. The final parameter settings for BP neural network are shown in Table 1 of the main text.
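The six training steps above can be sketched in NumPy for the 4-13-2 network. Synthetic range data stand in for the real DS-TWR samples, and only the hyperparameters stated in the text (sigmoid activations, MSE loss, 10% learning-rate decay every 200 iterations, early stop after five consecutive validation-error increases) are taken from the paper; everything else is an illustrative assumption, not the MATLAB implementation:

```python
# Sketch of the described BP training procedure (steps 1-6), using NumPy
# and synthetic data in place of the real DS-TWR ranging samples.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def xavier(n_in, n_out):
    limit = np.sqrt(6.0 / (n_in + n_out))
    return rng.uniform(-limit, limit, size=(n_in, n_out))

def train(X, Y, Xv, Yv, hidden=13, lr=0.5, max_iter=2000):
    # Step 1: normalize inputs and outputs to [0, 1].
    x_min, x_max = X.min(0), X.max(0)
    y_min, y_max = Y.min(0), Y.max(0)
    Xn, Yn = (X - x_min) / (x_max - x_min), (Y - y_min) / (y_max - y_min)
    Xvn, Yvn = (Xv - x_min) / (x_max - x_min), (Yv - y_min) / (y_max - y_min)
    # Step 2: Xavier weight initialization, zero thresholds (biases).
    W1, b1 = xavier(X.shape[1], hidden), np.zeros(hidden)
    W2, b2 = xavier(hidden, Y.shape[1]), np.zeros(Y.shape[1])
    best_val, bad_streak = np.inf, 0
    n = len(Xn)
    for it in range(max_iter):
        # Step 3: forward propagation with sigmoid activations.
        H = sigmoid(Xn @ W1 + b1)
        P = sigmoid(H @ W2 + b2)
        # Step 4: MSE loss; Step 5: gradient descent on weights/thresholds.
        dP = (P - Yn) * P * (1 - P)
        dH = (dP @ W2.T) * H * (1 - H)
        W2 -= lr * H.T @ dP / n; b2 -= lr * dP.mean(0)
        W1 -= lr * Xn.T @ dH / n; b1 -= lr * dH.mean(0)
        if (it + 1) % 200 == 0:
            lr *= 0.9                 # reduce learning rate by 10%
        # Step 6: early stopping on the validation error.
        Pv = sigmoid(sigmoid(Xvn @ W1 + b1) @ W2 + b2)
        val = np.mean((Pv - Yvn) ** 2)
        if val < best_val:
            best_val, bad_streak = val, 0
        else:
            bad_streak += 1
            if bad_streak >= 5:
                break
    return W1, b1, W2, b2, best_val

# Synthetic stand-in data: random points in a 7 m x 5 m room and their
# exact distances to four corner anchors (noise-free for illustration).
anchors = np.array([[0.0, 0.0], [7.0, 0.0], [7.0, 5.0], [0.0, 5.0]])
pts = rng.uniform([0, 0], [7, 5], size=(400, 2))
dists = np.linalg.norm(pts[:, None, :] - anchors[None], axis=2)
W1, b1, W2, b2, val = train(dists[:240], pts[:240],
                            dists[240:320], pts[240:320])
```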
(3) Determination of the number of hidden nodes
A BP network inserts one or more hidden layers of neurons between the input and output layers. These neurons have no direct connection with the outside world, but changes in their states affect the mapping between input and output. Each hidden layer can contain several nodes, and the number of hidden layers and of nodes per layer can be chosen according to the specific task.
Our positioning system aims to obtain the position of each tag from its distances to the base stations. Since this mapping is not complicated, a BP neural network with a single hidden layer was selected to process the data. The number of hidden-layer nodes affects the network’s performance: too many nodes lead to long training times and weak generalization, while too few reduce fault tolerance.
The number of hidden-layer nodes was determined according to empirical Formulas (1) and (2), combined with sensitivity analysis. The empirical formulas are [22,23]:
l = √(m + n) + α    (1)
∑_{i=0}^{m} C_l^i > k    (2)
where l is the number of hidden-layer nodes; m is the number of input-layer nodes; n is the number of output-layer nodes; α is a constant between 1 and 10; i is an integer between 0 and m; C_l^i is the number of combinations of i elements chosen from l; and k is the number of samples.
There are four positioning base stations in this system, yielding four DS-TWR ranging values between the base stations and the tag. These serve as the four input-layer nodes, i.e., m = 4. The output layer gives the tag’s current position coordinates x and y, so the output layer has two nodes, i.e., n = 2. With the selected sample size of 400, the range of hidden-layer node numbers calculated from Formulas (1) and (2) is 6–13.
To determine the optimal number of hidden-layer nodes within this range, a sensitivity analysis was performed: the trial-and-error method was used to train the network with each candidate value, and the results are shown in Table 2 of the main text. With 13 hidden nodes, the network achieved the best training performance, with the smallest maximum error, minimum error, and root mean square error (RMSE); the validation-set error was also lowest at 13 nodes, indicating good generalization. Therefore, the network topology was determined as 4-13-2. This choice balances fitting and generalization: too few nodes (e.g., 6–8) lead to underfitting (high training error), while too many (e.g., 14–15, tested additionally) cause overfitting (high validation error).
(4) BP neural network training
Taking the positioning area of 7 m by 5 m as an example, 240 positioning points were selected as the training set, 80 as the validation set, and 80 as the test set. As shown in Figure 4, four base stations were set up at the four corners of the region, and the tags were placed at the 400 positioning points to measure the distance between the tags and the base stations.
A coordinate system was established with base station 1 as the origin, the direction toward base station 2 as the positive x axis, and the direction toward base station 4 as the positive y axis. Since the system is based on DS-TWR ranging, the distances measured by DS-TWR between the tags at the 400 positioning points and each base station were taken as input data, and the actual position coordinates of the 400 points were used as output data for training and validation of the BP neural network. After training was completed, the generated mathematical function model was saved. The data for BP neural network learning are shown in Table A1.
Table A1. Sample points for BP neural network learning.
Distances between the tag and the base stations (columns 2–5) and the real coordinates of the locating tag (x, y) are in meters.
Number | Base Station 1 | Base Station 2 | Base Station 3 | Base Station 4 | x | y
1 | 4.08 | 5.66 | 3.97 | 0.94 | 0.598 | 3.948
2 | 3.53 | 5.27 | 4.10 | 1.36 | 0.598 | 3.386
3 | 2.97 | 4.87 | 4.32 | 1.90 | 0.598 | 2.824
… | … | … | … | … | … | …
400 | 4.05 | 0.74 | 4.07 | 5.70 | 3.904 | 0.5675
(5) Validation of the BP neural network training results
The generated mathematical function model was validated using the test set (80 points). The trilateration method, random forest regression, and k-nearest neighbor regression were chosen as baselines for comparison with BP, as these methods are widely used and reproducible for indoor localization. The random forest and k-nearest neighbor regressors were implemented in MATLAB R2022b with default parameters (100 trees for random forest; k = 5 for k-nearest neighbor). The DS-TWR ranging data of the 80 test points were input into the BP neural network model, the trilateration method, the random forest model, and the k-nearest neighbor model, respectively. The calculated coordinates, denoted (x1, y1), were compared with the real coordinates (x, y).
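The trilateration baseline can be sketched with the classical linear least-squares formulation, in which subtracting the first circle equation from the others linearizes the system; the anchor positions follow the paper’s four-corner setup, but the exact baseline implementation used in the paper may differ:

```python
# Minimal linear least-squares trilateration baseline. Anchors are placed at
# the four corners of the 7 m x 5 m area as in the paper; this particular
# formulation is a common textbook approach, not necessarily the paper's code.
import numpy as np

def trilaterate(anchors, dists):
    """Solve for (x, y) from >= 3 anchor positions and measured ranges."""
    x0, y0 = anchors[0]
    d0 = dists[0]
    # Subtracting the first circle equation from the others linearizes them:
    # 2(xi - x0)x + 2(yi - y0)y = d0^2 - di^2 + xi^2 - x0^2 + yi^2 - y0^2
    A = 2 * (anchors[1:] - anchors[0])
    b = (d0**2 - dists[1:]**2
         + anchors[1:, 0]**2 - x0**2
         + anchors[1:, 1]**2 - y0**2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

anchors = np.array([[0.0, 0.0], [7.0, 0.0], [7.0, 5.0], [0.0, 5.0]])
true_pos = np.array([3.0, 2.0])
dists = np.linalg.norm(anchors - true_pos, axis=1)
est = trilaterate(anchors, dists)   # recovers (3.0, 2.0) for exact ranges
```

With noise-free ranges the estimate is exact; the positioning error of this baseline in practice comes entirely from DS-TWR ranging errors, which is what the BP model learns to compensate.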
To evaluate the prediction results more intuitively, four evaluation measures were selected: root mean square error (RMSE), mean absolute percentage error (MAPE), mean absolute error (MAE), and the cumulative distribution function (CDF) of the error. MAE and RMSE quantify the gap between predicted and true values, with a range of [0, +∞); the closer to 0, the more accurate the model. MAPE is the average percentage of relative error between predicted and actual values, also in [0, +∞); again, closer to 0 is better. The CDF curve shows the cumulative probability of errors falling within a given range.
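These measures can be computed directly from predicted and true coordinates; the values below are toy illustrations, not results from the paper:

```python
# The four evaluation measures used above, computed for a toy set of
# predicted vs. true positions (values are illustrative only).
import numpy as np

def rmse(err):
    return np.sqrt(np.mean(err**2))

def mae(err):
    return np.mean(np.abs(err))

def mape(pred, true):
    """Mean absolute percentage error, in percent (true values nonzero)."""
    return 100.0 * np.mean(np.abs((pred - true) / true))

def error_cdf(err, threshold):
    """Fraction of test points with position error <= threshold (one CDF point)."""
    return np.mean(err <= threshold)

true = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 1.0]])
pred = np.array([[1.1, 2.0], [3.0, 3.9], [5.0, 1.0]])
err = np.linalg.norm(pred - true, axis=1)   # per-point position error
```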
The specific comparison is shown in Section 2.3 (4) of the main text.

Appendix B. Programming Diagrams of the System

Figure A2. Loading background.
Figure A3. Recording motion track coordinate data.
Figure A4. Motion trajectory display.
Figure A5. Motionless alarm display.

References

  1. Khan, A.A.; Khan, M.A.; Leung, K.; Huang, X.; Luo, M.; Usmani, A. A review of critical fire event library for buildings and safety framework for smart firefighting. Int. J. Disaster Risk Reduct. 2022, 83, 103412. [Google Scholar] [CrossRef]
  2. Qi, Y.; Pan, Z.; Hong, Y.; Yang, M.H.; Van Den Hengel, A.; Wu, Q. The Road to Know-Where: An Object-and-Room Informed Sequential BERT for Indoor Vision-Language Navigation. arXiv 2021. [Google Scholar] [CrossRef]
  3. Xiao, J.; Zhou, Z.; Yi, Y.; Ni, L.M. A Survey on Wireless Indoor Localization from the Device Perspective. ACM Comput. Surv. 2016, 49, 1–31. [Google Scholar] [CrossRef]
  4. Zafari, F.; Gkelias, A.; Leung, K.K. A Survey of Indoor Localization Systems and Technologies. IEEE Commun. Surv. Tutor. 2019, 21, 2568–2599. [Google Scholar] [CrossRef]
  5. Wang, H.; Wang, G.; Li, X. Image-based occupancy positioning system using pose-estimation model for demand-oriented ventilation. J. Build. Eng. 2021, 39, 102220. [Google Scholar] [CrossRef]
  6. Gomes, E.L.; Fonseca, M.; Lazzaretti, A.E.; Munaretto, A.; Guerber, C. Clustering and Hierarchical Classification for High-Precision RFID Indoor Location Systems. IEEE Sens. J. 2022, 22, 5141–5149. [Google Scholar] [CrossRef]
  7. Xu, J.; Yang, Z.; Chen, H.; Liu, Y.; Zhou, X.; Li, J.; Lane, N. Embracing Spatial Awareness for Reliable WiFi-Based Indoor Location Systems. In Proceedings of the 2018 IEEE 15th International Conference on Mobile Ad Hoc and Sensor Systems (MASS), Chengdu, China, 9–12 October 2018; pp. 281–289. [Google Scholar] [CrossRef]
  8. Terán, M.; Aranda, J.; Carrillo, H.; Mendez, D.; Parra, C. IoT-based system for indoor location using bluetooth low energy. In Proceedings of the 2017 IEEE Colombian Conference on Communications and Computing (COLCOM), Cartagena, Colombia, 16–18 August 2017; pp. 1–6. [Google Scholar] [CrossRef]
  9. Großwindhager, B.; Stocker, M.; Rath, M.; Boano, C.A.; Römer, K. SnapLoc: An Ultra-Fast UWB-Based Indoor Localization System for an Unlimited Number of Tags. In Proceedings of the 18th ACM/IEEE International Conference on Information Processing in Sensor Networks, Montreal, QC, Canada, 16–18 April 2019; pp. 61–72. [Google Scholar]
  10. Yang, S.; Liu, J.; Gong, X.; Huang, G.; Bai, Y. A Robust Heading Estimation Solution for Smartphone Multisensor-Integrated Indoor Positioning. IEEE Internet Things J. 2021, 8, 17186–17198. [Google Scholar] [CrossRef]
  11. Zhou, H.; Cong, H.; Wang, Y.; Dou, Z. A computer-vision-based deep learning model of smoke diffusion. Process Saf. Environ. Prot. 2024, 187, 721–735. [Google Scholar] [CrossRef]
  12. Wang, P.; Lian, Z.; Núñez-Andrés, M.A.; Tian, Y.; Wang, M.; Chai, H.; Bi, J.; Liu, X. A GCN-GRU-KAN-Based Framework for UWB 3D localization in adverse geometric configurations. Measurement 2026, 258, 119066. [Google Scholar] [CrossRef]
  13. Liu, Q.; Yin, Z.; Zhao, Y.; Wu, Z.; Wu, M. UWB LOS/NLOS identification in multiple indoor environments using deep learning methods. Phys. Commun. 2022, 52, 101695. [Google Scholar] [CrossRef]
  14. Tu, C.; Zhang, J.; Quan, Z.; Ding, Y. UWB indoor localization method based on neural network multi-classification for NLOS distance correction. Sens. Actuators A Phys. 2024, 379, 115904. [Google Scholar] [CrossRef]
  15. Yang, H.; Wang, Y.; Seow, C.K.; Sun, M.; Joseph, W.; Plets, D. A novel credibility evaluation and mitigation for ranging measurement in UWB localization. Measurement 2025, 256, 117721. [Google Scholar] [CrossRef]
  16. Kordi, K.A.; Roslee, M.; Alias, M.Y.; Alhammadi, A.; Waseem, A.; Osman, A.F. Survey of Indoor Localization Based on Deep Learning. Comput. Mater. Contin. 2024, 79, 3261–3298. [Google Scholar] [CrossRef]
  17. Osman, A.; Shamsfakhr, F.; Vecchio, M.; Antonelli, F. Adaptive GNSS–UWB Sensor Fusion for Reliable Localization in Precision Agriculture. Smart Agric. Technol. 2026, 13, 101846. [Google Scholar] [CrossRef]
  18. Liu, C.; Yun, J. A Joint TDOA/FDOA Localization Algorithm Using Bi-iterative Method with Optimal Step Length. Chin. J. Electron. 2021, 30, 119–126. [Google Scholar] [CrossRef]
  19. Tran, H.Q.; Ha, C. Machine learning in indoor visible light positioning systems: A review. Neurocomputing 2022, 491, 117–131. [Google Scholar] [CrossRef]
  20. Chen, X.; Zhang, M.; Ruan, K.; Gong, C.; Zhang, Y.; Yang, S.X. A Ranging Model Based on BP Neural Network. Intell. Autom. Soft Comput. 2015, 22, 325–329. [Google Scholar] [CrossRef]
  21. Zhao, L.; Ren, Y.; Wang, Q.; Deng, L.; Zhang, F. Visible Light Indoor Positioning System Based on Pisarenko Harmonic Decomposition and Neural Network. Chin. J. Electron. 2024, 33, 195–203. [Google Scholar] [CrossRef]
  22. Xu, Y.; Wang, K.; Jiang, C.; Li, Z.; Yang, C.; Liu, D.; Zhang, H. Motion-Constrained GNSS/INS Integrated Navigation Method Based on BP Neural Network. Remote Sens. 2023, 15, 154. [Google Scholar] [CrossRef]
  23. Hao, W.; Huang, Y.; Zhao, G. Acoustic sources localization for composite plate using arrival time and BP neural network. Polym. Test. 2022, 115, 107754. [Google Scholar] [CrossRef]
  24. Chong, Y.; Xu, X.; Guo, N.; Shu, L.; Zhang, Q.; Yu, Z.; Wen, T. Cooperative Localization of Firefighters Based on Relative Ranging Constraints of UWB and Autonomous Navigation. Electronics 2023, 12, 1181. [Google Scholar] [CrossRef]
  25. Schmitt, S.; Will, H.; Hillebrandt, T.; Kyas, M. A virtual indoor localization testbed for Wireless Sensor Networks. In Proceedings of the 10th Annual IEEE International Conference on Sensing, Communications and Networking, New Orleans, LA, USA, 24–27 June 2013; pp. 239–241. [Google Scholar]
  26. Han, R.Q. Application of inertial navigation high precision positioning system based on SVM optimization. Syst. Soft Comput. 2024, 6, 2772–9419. [Google Scholar] [CrossRef]
  27. Yang, G.; Zhu, S.; Li, Q.; Zhao, K. UWB/INS Based Indoor Positioning and NLOS Detection Algorithm for Firefighters. In Proceedings of the 2020 IEEE 22nd International Conference on High Performance Computing and Communications; IEEE 18th International Conference on Smart City; IEEE 6th International Conference on Data Science and Systems (HPCC/SmartCity/DSS), Cuvu, Fiji, 14–16 December 2020; pp. 909–916. [Google Scholar] [CrossRef]
  28. Li, J.; Xie, Z.; Sun, X.; Tang, J.; Liu, H.; Stankovic, J.A. An Automatic and Accurate Localization System for Firefighters. In Proceedings of the 2018 IEEE/ACM Third International Conference on Internet-of-Things Design and Implementation (IoTDI), Orlando, FL, USA, 17–20 April 2018; pp. 13–24. [Google Scholar] [CrossRef]
  29. Li, T.; Wang, Q.; Xu, Y.; An, L.; Wang, M. Design and Implementation of Autonomous Navigation and Search and Rescue System for Firefighters Based on Cloud Platform. J. Command Control 2023, 9, 303–313. [Google Scholar] [CrossRef]
  30. Gandhi, S.R.; Ganz, A.; Mullett, G. FIREGUIDE: Firefighter guide and tracker. In Proceedings of the 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology, Buenos Aires, Argentina, 31 August–4 September 2010; pp. 2037–2040. [Google Scholar] [CrossRef]
  31. Pascucci, F.; Setola, R. An Indoor localization Framework for Hybrid Rescue Teams. IFAC Proc. Vol. 2011, 44, 4765–4770. [Google Scholar] [CrossRef]
  32. Aleksandar, M.; Vojin, Š. Indoor navigation system for firefighters. In Proceedings of the 2011 19th Telecommunications Forum (TELFOR), Belgrade, Serbia, 22–24 November 2011; pp. 1324–1327. [Google Scholar] [CrossRef]
  33. Berrahal, S.; Boudriga, N.; Chammem, M. WBAN-Assisted Navigation for Firefighters in Indoor Environments. Ad Hoc Sens. Wirel. Netw. 2016, 33, 81–119. [Google Scholar]
  34. Vey, Q.; Spies, F.; Pestourie, B.; Genon-Catalot, D.; Van Den Bossche, A.; Val, T.; Dalce, R.; Schrive, J. POUCET: A Multi-Technology Indoor Positioning Solution for Firefighters and Soldiers. In Proceedings of the 2021 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Lloret de Mar, Spain, 29 November–2 December 2021. [Google Scholar] [CrossRef]
  35. Nilsson, J.O.; Zachariah, D.; Skog, I.; Händel, P. Cooperative localization by dual foot-mounted inertial sensors and inter-agent ranging. EURASIP J. Adv. Signal Process. 2013, 2013, 164. [Google Scholar] [CrossRef]
  36. Ruiz, A.R.J.; Granja, F.S. Comparing Ubisense, BeSpoon, and DecaWave UWB Location Systems: Indoor Performance Analysis. IEEE Trans. Instrum. Meas. 2017, 66, 2106–2117. [Google Scholar] [CrossRef]