Research on the Wearable Augmented Reality Seeking System for Rescue-Guidance in Buildings †

Abstract: When a construction disaster occurs, the first-line rescue personnel often enter the disaster site immediately, and every second counts in rescuing the people who need help. However, the rescue personnel may not be familiar with the indoor layouts of different buildings. If the indoor paths are complicated, or when fire smoke obstructs the line of sight, the rescue personnel are prone to spatial disorientation, which can put them in danger. Therefore, we have developed the "Wearable Augmented reality Seeking System" (WASS) to assist rescue personnel in reading the information provided by the "Building Information Guiding System" (BIGS). This system allows them to enter an unfamiliar space and reach the target rescue position, retreat to the entrance, or find an alternative escape route. The WASS is based on the HoloLens augmented reality system, which displays 3D digital information such as indoor layouts, one's current location, spatial images captured by an infrared camera and a depth camera, and 3D virtual guiding symbols or text. The WASS includes two modules. First, the augmented reality gesture interaction module allows one to read the positioning anchor information of the BIGS. The rescue personnel can communicate via gestures, select the task target, and follow the 3D virtual guidance symbols in the air to reach the relay anchor points and finally arrive at the target position. Second, the service support module, including a lighting source and backup power, ensures that the QR code recognition process and long-term operation of the WASS are successful.


Introduction
In 2021, a fire broke out in a building in Changhua County, Taiwan. A firefighter who was the first to enter the fire scene was left alone because his companion was injured and retreated. He was thought to be disoriented in the thick smoke and was eventually found to have exhausted his oxygen cylinder and died in a room without any windows.
According to the Ministry of the Interior survey report in May 2022, the main cause of the firefighter's death was the inhalation of toxic gases, resulting in hypoxic shock and suffocation [1]. Therefore, we have established a Wearable Augmented reality Seeking System (WASS) to help firefighters and rescue personnel obtain 3D guidance information for the locations of firefighting or escape facilities immediately upon entering an unfamiliar interior space, and to see through thick smoke to make out the objects and heat sources in the surrounding area. We expect the WASS to improve the success rates of rescue services and the safety of firefighters and rescue personnel.

Literature Review
In the context of identifying the location of people indoors, the most frequently researched technology is indoor positioning technology [2][3][4]. Nowadays, the primary indoor positioning process first involves setting up auxiliary nodes with fixed positions in the indoor environment. The positions of these nodes are known. Position information such as radio frequency identification (RFID) tags is directly stored in the nodes, while other data, such as infrared and ultrasonic data, are stored in the databases of computer terminals [2].
The positioning system needs to measure the distance from the measured node to the auxiliary node to determine the relative position. Distance measurement usually requires transmitting and receiving equipment. According to the positions of the transmitter and receiver, the positioning techniques are divided into two types. In the first type, the transmitter is located at the measured node and the receiver at the auxiliary node; this involves techniques such as infrared, ultrasonic, and radio frequency identification (RFID). In the second type, the transmitter is located at the auxiliary node and the receiver at the measured node; this involves techniques such as WiFi, ultra-wideband (UWB), and ZigBee [2].
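To make the distance-based positioning step concrete, the following is an illustrative sketch (not part of the WASS implementation) of how a measured node's 2D position can be recovered from distances to fixed auxiliary nodes by linearized least-squares trilateration; the anchor coordinates and distances are hypothetical:

```python
import numpy as np

def trilaterate(anchors, distances):
    """Estimate a 2D position from known auxiliary-node positions and
    measured distances, via the standard linearized least-squares form."""
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    # Subtracting the last anchor's circle equation from the others
    # eliminates the quadratic terms in the unknown position (x, y).
    A = 2 * (anchors[:-1] - anchors[-1])
    b = (d[-1] ** 2 - d[:-1] ** 2
         + np.sum(anchors[:-1] ** 2, axis=1) - np.sum(anchors[-1] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Three anchors at known positions; distances measured from point (2, 1).
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = np.array([2.0, 1.0])
dists = [np.linalg.norm(true_pos - a) for a in anchors]
print(trilaterate(anchors, dists))  # ≈ [2. 1.]
```

With more than three anchors the same least-squares formulation averages out measurement noise, which is why practical systems deploy redundant auxiliary nodes.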
The characteristics of the above positioning technologies are presented in Table 1. However, none of the current mainstream indoor positioning systems mentioned above meet all of the relevant requirements (e.g., low cost and high accuracy). The auxiliary nodes installed in the buildings must operate for long periods without battery replacement, and the system must still work under extreme environmental conditions, such as power outages, high humidity, high temperatures, and dense smoke.

Case Study
In practical applications, thermal imaging cameras (TICs) are the best tools for firefighters to see through smoke and find the source of a fire in a scene obscured by dense smoke. In accordance with the "Fire Bureau of Taichung City Government Guiding Principles for Operation and Maintenance of Disaster Relief Equipment" [5], the appropriate circumstances in which one should use a thermal imaging camera include the following.

Fire Cases
TICs are used to detect and display the temperature around the fire site, to search for fire points and hidden fire sources, and to determine the direction of fire spread.

Identifying the Environment
Under the influence of factors such as dense smoke, insufficient light, and a closed environment generated at the disaster site, visibility can be reduced, which is harmful to the safety of rescuers. The thermal images displayed by thermal imaging cameras can be used to preliminarily distinguish the terrain and features of the site.

Search and Rescue of Human Life
Any object with a temperature above absolute zero emits infrared radiation, with an intensity that depends on the strength of its internal molecular vibrations. Rescuers can use this feature to search for people who need to be rescued.
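The contrast that makes a warm body stand out to an infrared sensor can be roughly quantified with the Stefan-Boltzmann law. The following sketch (our illustration, with an assumed grey-body emissivity of 0.95) compares the radiant exitance of skin at about 33 °C with that of a 20 °C wall:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiant_exitance(temp_k, emissivity=0.95):
    """Total thermal power radiated per unit area (grey-body model)."""
    return emissivity * SIGMA * temp_k ** 4

skin = radiant_exitance(306.0)  # ~33 degC skin surface
wall = radiant_exitance(293.0)  # ~20 degC wall
print(f"skin: {skin:.0f} W/m^2, wall: {wall:.0f} W/m^2")
print(f"contrast: {skin / wall:.2f}x")
```

Because the radiated power scales with the fourth power of absolute temperature, even a modest temperature difference yields a measurable contrast for a thermal sensor.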

Chemical Tank Disasters
If a chemical tank body is impacted, overturned, or leaking, it may ignite or catch fire, and pressure accumulates as the temperature rises. Thermal imaging cameras can detect temperature changes in the tank body, allowing one to take appropriate protective measures to prevent further disasters (Figure 1).
Since TICs are handheld devices, they hinder the ability of firefighters or rescue personnel to carry other things, extinguish fires, or assist rescuers. Moreover, a TIC has only a single function. If its functions were integrated into the design of a wearable device, the dexterity of the user's hands would be increased.

Design of the AR System
Based on the above analysis, we designed the Wearable Augmented reality Seeking System (WASS), which allows search and rescue personnel to see infrared images directly through smart glasses without using a handheld thermal camera. At the same time, through the depth sensor, the outlines of surrounding objects can be seen clearly in the dark, even if these objects do not generate heat themselves. Additionally, by reading the information provided by the "Building Information Guiding System" (BIGS), the WASS lets rescue personnel view 3D guiding arrows and information floating in the air, guiding them in reaching the target rescue position, retreating to the entrance, or finding an alternative escape route.
We used HoloLens as the basic device of the WASS. HoloLens has hardware equipment such as infrared cameras, depth sensors, and inertial sensors (Figure 2). It also has an open development environment, "Research Mode", that allows researchers to develop software to control the hardware equipment on it.
According to the introductory page of the website for HoloLens 2, the Research Mode is for research applications and has access to the following streams:
1. Visible Light Environment Tracking Cameras: grayscale cameras used by the system for head tracking and map building.
2. Depth Camera, operating in two modes:
AHAT, high-frequency (45 FPS) near-depth sensing used for hand tracking. Different from the first version's short-throw mode, AHAT gives a pseudo-depth with phase wrap beyond 1 m.
Long-throw, low-frequency (1-5 FPS) far-depth sensing used for spatial mapping.
Two versions of the IR-reflectivity stream are used by the HoloLens to compute depth. These images are illuminated by infrared and unaffected by ambient visible light.
The Research Mode is designed for academic and industrial researchers exploring new ideas in the fields of Computer Vision and Robotics. It is not intended for applications deployed in enterprise environments or available through the Microsoft Store or other distribution channels [8].
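The practical consequence of the AHAT pseudo-depth wrap can be illustrated as follows. This is our own sketch, not Research Mode API code: it assumes a 1 m ambiguity interval and uses a coarse depth estimate (such as one from the long-throw stream) to choose the wrap count:

```python
import numpy as np

WRAP_M = 1.0  # assumed AHAT ambiguity interval (pseudo-depth wraps beyond ~1 m)

def unwrap_ahat(pseudo_depth, coarse_depth, wrap=WRAP_M):
    """Resolve wrapped near-depth values using a coarse depth estimate
    (e.g., from the long-throw stream) to choose the wrap count."""
    pseudo = np.asarray(pseudo_depth, dtype=float)
    coarse = np.asarray(coarse_depth, dtype=float)
    n = np.round((coarse - pseudo) / wrap)  # integer number of wraps
    return pseudo + n * wrap

# A surface at 2.35 m shows up as 0.35 m in the wrapped AHAT stream;
# a coarse long-throw reading of ~2.2 m resolves the ambiguity.
print(unwrap_ahat([0.35], [2.2]))
```

This ambiguity is why AHAT is suited to hand tracking at arm's length, while the long-throw mode serves far-depth sensing.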

Experiment and Discussion
The WASS is based on the HoloLens augmented reality system. In Research Mode, we built a Visual Studio environment to complete the programming so that, when one is wearing the HoloLens, they can see four views: two visible light camera views on the left and right, an infrared camera view, and a depth sensor view (Figure 3).
As shown in Figure 4, visible light cameras do not permit the viewing of images in a low-light environment. Still, infrared cameras allow one to see hot objects such as hands, and depth sensors can depict the contours and depths of surrounding objects. This helps search and rescue personnel clearly see other rescuers, those who need rescuing, and escape paths in the dark and amidst thick smoke, also helping them to avoid hitting surrounding objects.
Compared with the normal helmets used by firefighters (Figure 5), the WASS displays 3D digital information in the HoloLens, including indoor layouts, one's current location, spatial images captured by the infrared camera and depth camera, and 3D virtual guiding symbols or text. Regarding the system's integration into firefighters' helmets, we used a fire helmet with a shallow brim so that the HoloLens glasses can be lifted when necessary. At the same time, there is an extended brim at the back of the fire helmet to protect the battery of the HoloLens from impact and other damage (Figure 6). The simulation of a firefighter wearing a WASS-based device is shown in Figure 7.
The WASS includes the following modules:
1. The augmented reality gesture interaction module, which helps one to read the positioning anchor information of the BIGS. The rescue personnel can communicate via gestures, select the task target, and follow the 3D virtual guidance symbols in the air to reach the relay anchor points and finally arrive at the target position.
2. The service support module, which includes a lighting source and backup power to ensure the QR code recognition process and the long-term successful operation of the WASS.
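The paper does not specify the encoding of the BIGS positioning anchors. As an illustration only, the following sketch parses a hypothetical JSON payload (building ID, floor, and indoor coordinates) of the kind the gesture interaction module might read after QR code recognition; the field names are our assumptions:

```python
import json

def parse_anchor_payload(qr_text):
    """Parse a hypothetical BIGS positioning-anchor payload.

    Assumed format (not from the paper): a JSON object carrying the
    building ID, floor number, and the anchor's indoor coordinates
    in metres.
    """
    data = json.loads(qr_text)
    required = {"building", "floor", "x", "y"}
    missing = required - data.keys()
    if missing:
        raise ValueError(f"anchor payload missing fields: {sorted(missing)}")
    return {
        "building": str(data["building"]),
        "floor": int(data["floor"]),
        "position": (float(data["x"]), float(data["y"])),
    }

anchor = parse_anchor_payload('{"building": "B1", "floor": 3, "x": 12.5, "y": 4.0}')
print(anchor["position"])  # → (12.5, 4.0)
```

Validating the payload before use matters here: the lighting source in the service support module exists precisely because QR recognition is fragile in poor conditions, so a partially decoded code should fail loudly rather than yield a wrong anchor position.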

Conclusions
In this study, we applied HoloLens Research Mode in an open environment, in combination with other creative programs, to construct the WASS. The user of the WASS can see images provided by visible light cameras, infrared cameras, and depth cameras on the screen of the smart glasses. Using the WASS, rescue personnel can still work in dark or densely smoky environments and see their fellow rescuers, those who need to be rescued, surrounding objects, and escape routes. The WASS can also be used with the BIGS indoor space database to see 3D guidance arrows and information inertially positioned in space. The WASS can help firefighters overcome spatial disorientation in extreme environments.

Figure 3 .
Figure 3. Four views captured by different cameras and sensors equipped on HoloLens, launched by the program used for this study. The red-to-white gradient color bar on the screen is the coordinate showing the extent of the HoloLens tilt or rotation detected by the Inertial Measurement Unit (IMU).

Figure 4 .
Figure 4. Visible light cameras and IR cameras cannot see objects clearly, but the depth sensor allows one to see nearby things.

Figure 5 .
Figure 5. Helmets used by active-duty firefighters in Taichung City.

Figure 6 .
Figure 6. The WASS consists of a fire helmet and HoloLens 2, as well as other accessories.

Figure 7 .
Figure 7. Simulation of a firefighter wearing a WASS-based device.
