1.1. Background
The Bureau of Labor Statistics records indicate an increasing trend in worker fatalities since 2009 [1]. Among the total worker fatalities in the US, the construction industry alone accounts for almost 20% of fatalities in private industry [2]. Approximately 60% of these fatalities in 2017 resulted from falls, strikes by objects, electrocutions, and workers being caught in or between objects (e.g., equipment); these are also known as the fatal four in construction [2]. Further, transportation accidents accounted for 40% of the total worker fatalities in the same year [1]. These statistics indicate that fatal accidents are common in workplaces and that the passive safety measures adopted during construction are insufficient to prevent such accidents. Further, studies have found that construction superintendents are unable to identify all possible hazards in work zones [3,4,5,6]. As such, researchers have determined the need for a supplementary system capable of identifying possible hazards in active work zones. Such a system acts as an additional sense for construction personnel and assists in minimizing fatal risks by alerting workers to these possible risks. On this account, researchers have explored the applicability of real-time hazard detection systems for preventing fatalities and severe injuries.
Studies to develop real-time hazard detection systems have used a wide variety of sensing technologies. One of the most frequently used is proximity-detection sensing [7,8,9,10,11,12,13,14,15], which protects construction workers, drivers, equipment operators, etc., from proximity hazards by generating real-time alerts. Various studies addressing proximity issues on job sites have utilized different technologies, such as radio frequency identification (RFID) [8,11], ultra-wideband (UWB) [12,16], the global positioning system (GPS) [10], inertial measurement unit (IMU) sensors [10], Bluetooth low energy (BLE)-based sensors [7], and unmanned aerial vehicles (UAVs) [17]. These studies [8,9,10] have demonstrated the capabilities of their systems to minimize proximity hazards on construction sites, mainly the fatalities and severe injuries resulting from workers being struck by vehicles or equipment. Besides proximity issues, researchers have also attempted to resolve other types of hazards on construction sites, such as using sensor data to automate the monitoring of temporary structures [18,19] and to prevent fall hazards [20]. Many studies have used computer vision techniques to identify unsafe worker behavior [21] and the failure of temporary structures [22,23].
Most of these automated approaches focus on improving the overall safety of construction work zones. However, these studies do not address individualized safety issues, which are more important when attempting to prevent individual worker fatalities. Thus, researchers are now interested in developing systems capable of generating individualized information. Accordingly, researchers have developed various wearable technologies to resolve construction safety issues. In recent years, there has been a significant increase in studies exploring the application of wearable devices in construction. One of the most studied topics is the application of wearable devices to identify unsafe worker postures [24,25,26,27,28,29,30,31]. These wearable devices monitor the posture and body movement of workers while they perform various construction activities, and unsafe postures are reported back to the workers. Accordingly, workers correct their postures and thereby avoid fatalities or injuries resulting from them. For this purpose, researchers have mostly used IMU-based devices [25,32,33], pressure sensors [30], and built-in smartphone sensors [26,27,31]. Other studies have used wearable devices to detect physical fatigue among workers [34,35,36], as well as the psychological status of workers [36,37], in order to prevent accidents due to unsafe worker behaviors. Researchers have also used insole pressure sensors to detect potential worker instability [38] and thus prevent fall hazards.
The studies with wearable technologies have mainly focused on preventing hazards related to unsafe worker behaviors on construction sites. However, there are limited studies on using wearable technologies to warn workers of potential hazards surrounding them. An augmented reality-based wearable device proposed by Kim et al. [39] is capable of alerting workers to possible hazards. Their system warns workers of the orientations and distances of potential hazards and indicates the level of safety for individual workers. However, the system is designed to generate warnings based on what workers can see; thus, it may not prevent worker injury from hazards beyond their vision. Despite the high potential of their system to alert construction workers, limited sight due to obstructions or low vision, and limited hearing due to loud noises on construction sites, remain limitations as well. As such, depending only on the senses of workers is not an effective way to prevent construction accidents; a few recent studies found that visual and auditory alerts often fail to alert workers [40,41]. Studies have shown that additional sensory systems can play a significant role in supplementing these limited senses [42,43]. Therefore, this study focuses on exploring the use of a wearable system that activates the sense of touch in workers. For this purpose, the researchers developed a tactile signal-based wearable system that informs workers of detected hazards through vibration signals on their backs.
1.2. Research with Tactile Sensors
While exploring studies utilizing tactile sensors, it was observed that most of these studies focused on assisting people with navigation. Van Erp [44] investigated the effectiveness of perceiving tactile directional information using 15 tactors on a waist belt around the torso of static test participants. Other studies [45,46,47] demonstrated the application of such tactile systems on rugged terrain for military use and concluded that tactile-based navigation systems can guide users effectively during local navigation. Grierson et al. [48,49] tested a wearable belt with four vibrating motors to facilitate navigation for older people with limited visual capability or low memory. The vibration motors are fixed on the belt so that they lie at the front, back, left, and right of the body. The system is integrated with a GPS for location tracking, and it demonstrated the capability to guide people to pre-set destinations by using vibration signals to specify direction and distance. However, this study was limited to testing four obvious directions (front, back, left, and right), which might have resulted in better performance of the system. Other studies [50,51,52] demonstrated the capability of a system with eight vibration motors equally spaced around the waist to assist the navigation of visually impaired people. Marston et al. [53,54] demonstrated the applicability of a wrist-worn vibrotactile system to assist the navigation of visually impaired individuals. The system is integrated with a GPS and uses a single vibration motor to indicate the correct heading of the person wearing it. The problem with this system is that, at turning points, a person has to spend some time turning around to determine the correct direction to move forward. Furthermore, researchers have also developed a wearable system capable of transforming visual information captured with a camera into tactile signals to assist navigation [55].
Regarding the interfaces of the vibration motors used in the previously discussed navigation assistance systems, most of these studies [44,45,48,49,50,51,52] arranged their motors around a waist belt at equal spacing. A few studies [56,57] used a 3 × 3 array of motors on the back to guide navigation; these back-array systems created sequential vibrations of the motors to indicate direction to users. Srikulwong and O'Neill [58] compared the effectiveness of transmitting directional information using three different vibration motor interfaces: 3 × 3 back arrays at 50 mm and 80 mm spacing between motors, and eight motors arranged around a waist belt. The study identified the waist-belt interface as the most effective for navigation, while the 3 × 3 back array at 50 mm spacing performed worst. The researchers also concluded that directional information can be delivered effectively with vibration motors arranged around the body so as to resemble actual directions in the horizontal plane. With such an arrangement, it is easier for users to identify directions based on the location of a single vibrating motor around their body.
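The arrangement described above amounts to quantizing a horizontal direction to the nearest motor around the waist. A minimal sketch of such a mapping, assuming eight equally spaced motors indexed clockwise with motor 0 at the front (the function and its conventions are illustrative, not taken from the cited studies):

```python
def motor_for_direction(angle_deg: float, n_motors: int = 8) -> int:
    """Quantize a horizontal direction (0 deg = straight ahead,
    increasing clockwise) to the index of the nearest motor on a
    waist belt with n_motors equally spaced motors (motor 0 in front)."""
    sector = 360.0 / n_motors            # 45 degrees per motor for 8 motors
    return round((angle_deg % 360.0) / sector) % n_motors
```

With this convention, a hazard directly to the wearer's right (90 degrees) activates motor 2, and one directly behind (180 degrees) activates motor 4, so the vibrating location itself conveys the direction.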
Apart from the studies on the feasibility of vibration-based systems for navigation, Durá-Gil et al. [50] investigated different vibration patterns for effectively conveying directional information. Among the eight vibration motors placed at equal spacing around the waist, the five motors at the front were used to indicate directions, such as continuing straight and turning left or right at different angles, while the motors at the back were used to indicate that users should stop or slow down. The researchers conducted a pairwise comparison to identify a suitable vibration pattern for each piece of directional information from a set of candidate patterns. Their study concluded that single-burst vibrations deliver directional information more effectively than sequential vibrations.
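The contrast between single-burst and sequential patterns can be sketched as simple on/off timelines; the durations and motor indices below are illustrative assumptions, not values from the cited study:

```python
# A pattern is a list of (motor_index, on_ms, off_ms) steps.
# Single burst: one motor vibrates once.
single_burst = [(2, 400, 0)]

# Sequential: several motors vibrate one after another.
sequential = [(1, 150, 50), (2, 150, 50), (3, 150, 0)]

def total_duration_ms(pattern) -> int:
    """Total playback time of a pattern in milliseconds."""
    return sum(on + off for _, on, off in pattern)
```

Representing patterns as explicit step lists keeps the direction (which motor) separate from the message type (how the motor vibrates), mirroring the study's split between directional and stop/slow signals.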
It is evident that researchers have achieved significant progress in developing tactile sensor-based navigation devices, mainly to help people who require assistance to navigate freely. These developments have demonstrated the feasibility of using such tactile-based applications to enhance the capability of construction workers to sense surrounding hazards. In response to the problem of construction safety, Cho and Park [59] developed a prototype tactile sensory system and investigated signal parameters with the aim of providing an artificial sensing ability that improves the perception capability of workers. Their study mainly focused on investigating three basic signal parameters (signal intensity, signal duration, and duration between signals) to build basic language parameters for construction safety applications. The researchers identified eight distinct signal units based on those signal parameters for effective communication with construction workers. However, their study did not account for system design factors that support the effective construction of messages comprising hazard information. To address this issue, the present study focuses on determining the configuration of a tactile sensory communication system for the perception of hazard information.
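One way eight distinct signal units can arise from three parameters is by taking two levels per parameter, giving 2 × 2 × 2 = 8 combinations. The sketch below enumerates such a scheme; the specific level values are hypothetical and not taken from the cited study:

```python
from itertools import product

# Hypothetical two-level values for each of the three signal parameters.
INTENSITY = ("low", "high")   # signal intensity
DURATION = (200, 600)         # signal duration (ms)
GAP = (100, 400)              # duration between signals (ms)

# Enumerate all combinations: 2 x 2 x 2 = 8 distinct signal units.
signal_units = [
    {"intensity": i, "duration_ms": d, "gap_ms": g}
    for i, d, g in product(INTENSITY, DURATION, GAP)
]
```

Each unit could then serve as one "letter" of a tactile vocabulary, with sequences of units composing messages about hazard type or urgency.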
59] developed a prototype tactile sensory system and investigated signal parameters in hope to provide an artificial sensing ability to improve the perception capability of workers. Their study mainly focused on investigating three basic signal parameters—signal intensity, signal duration, and duration between signals—to build basic language parameters for construction safety applications. The researchers identified eight distinct signal units based on those signal parameters, for effective communication with construction workers. However, their study did not account for system design factors that can help effective construction of messages that comprise hazard information. In order to address this issue, this study focuses on determining the configuration of a tactile sensory communication system for the perception of hazard information.