Integration with 3D Visualization and IoT-Based Sensors for Real-Time Structural Health Monitoring

Real-time monitoring of a structure's displacement and acceleration provides vital information for applications such as active control and damage warning systems. Recent developments in the Internet of Things (IoT) and client-side web technologies enable a wireless microcontroller board with sensors to process structure-related data in real time and to interact with servers, so that end-users can view the processed results in a browser on a computer or mobile phone. Unlike traditional structural health monitoring (SHM) systems that deliver warnings based on an earthquake's peak acceleration, we built a real-time SHM system that converts raw sensor readings into movements and rotations of the monitored structure's three-dimensional (3D) model. This approach displays the overall dynamic movement of the structure directly from measured displacement data, rather than predicting displacement statically through force analysis such as finite element analysis. As an application of our research outcomes, patterns of movement associated with a structure type can be collected to cross-validate results derived from traditional stress-strain analysis. In this work, we overcome several challenges in displaying the 3D effects in real time. Using our proposed algorithm, which converts global displacements into each element's local movements, the system can calculate each element's (e.g., column's, beam's, or floor's) rotation and displacement in its local coordinate system, even though the sensors only provide displacements in the global coordinate system. We also suggest a sensor deployment method that minimizes overall sensor costs while still displaying the essential 3D movements.
To process the enormous amount of sensor data in real time, we designed a novel data structure for saving sensor data that represents the relationships among multiple sensor devices as well as each sensor's spatial location and unique identifier. Moreover, we built a sensor device that sends its monitoring data over a wireless network to a local server or the cloud, so that the SHM web application can integrate these components to show the real-time 3D movements. In this paper, a 3D model of a two-story structure is created to demonstrate the SHM system's functionality and validate our proposed algorithm.


Introduction
Important civil structures, such as bridges, energy utilities, nuclear power plants, and dams, require regular monitoring of their behaviors to support management decisions and reduce the loss of lives in natural disasters. Moreover, with such monitoring, necessary maintenance can be carried out in time to minimize hazardous impacts on the large number of resources commonly employed within these constructions [1]. Structural health monitoring (SHM) aims to develop a system that can continuously check a structure, deliver its current responses, and even alert when the structure exceeds its design limit. A structure's behaviors often vary with its age, usage, or environmental factors. SHM systems in past research were designed with pre-assigned or specific types of sensor deployments due to pre-defined calculations or in-device computations. Some SHM studies examined monitoring approaches by evaluating a system that handles big data and heterogeneous sensors in an edge-cloud computing environment.

Related Works
Earlier SHM research studied methods or sensor devices that issue an early warning when the external force exceeds the design capacity of the structure, or implemented closed sensor networks for monitoring [2,3,10-15]. Recent works have focused on structural damage detection. Aloisio et al. [16] conducted a variance-based sensitivity analysis of various damage indicators to understand their possible usage limits. Zhou and Wahab [17] incorporated the cosine measure and modal assurance criterion (MAC) indicators in transmissibility-based structural damage detection. Sony and Sadhu [18] used various numerical simulations to validate the synchrosqueezing transform (SST) for progressive damage assessment. Applying IoT to SHM, Muttillo et al. [19] built an IoT sensor system that integrated ADXL355 sensors, a SAM3X8E ARM Cortex-M3 microcontroller, and a personal computer via an RS485 communication bus; their system evaluated a damage indicator but did not send monitoring data wirelessly to a server on the internet. Abdelgawad and Yelamarthi [20] used a USB Wi-Fi module, a Raspberry Pi 2 board, buffers, ADCs, DACs, and PZTs to build an IoT platform whose results are sent to a server hosted in ThingWorx. Martinez et al. [21] created an IoT-based prototype that uses a Raspberry Pi 3 board, a USB wireless card, and accelerometer sensors to show acceleration values in an x-y chart on a server. Aba et al. [22] developed an IoT-based sensor device that monitors petroleum pipelines and shows the results in real time on the ThingSpeak website.

Sensor Device
Our design choice emphasizes a device that can easily be installed in an existing structure and built at low cost. Hence, we selected the NodeMCU (a microcontroller board with a 32-bit ESP8266 chip), the ADXL345 accelerometer sensor [23], and a micro-SD board (see Figure 1). This integration occupies almost all the general-purpose input/output (GPIO) pins that the NodeMCU provides (i.e., ten 3.3 V digital GPIOs and one 1.8 V analog input). In addition to its low cost, the ADXL345 provides a better detection range and precision than the MMA7455, MMA8451, MPU6050, MMA7660, and ADXL335 chips. The micro-SD board is not essential, but it saves the monitoring results locally as a backup, and the backup data can be analyzed offline if needed.


The wireless capability is provided by the ESP8266 chip, and a small amount of data can be saved in its flash memory. The ESP8266 supports a station (STA) mode: given a service set identifier (SSID), the device connects to that wireless network and obtains a local IP address. We stored several web pages in the ESP8266 chip's built-in flash memory, letting users save configuration variables there. Because the user interface is a web page, users can use a browser to set the key variables (e.g., the wireless network's SSID, account and password, server host, etc.). During the device's initialization stage, a universally unique identifier (UUID) is generated and saved in the flash memory. The server uses this UUID to associate the device with its deployed location, its relation to other devices, and the corresponding moving point in the 3D model.
When the server receives real-time monitoring records associated with a device's identification number, the overall movement of the 3D structural model is calculated according to our algorithm and shown to the client.
It is critical to ensure sufficient power for a sensor device to monitor the structure in real time. The micro-USB port on the NodeMCU powers the entire device; using it broadens the device's power-source options (e.g., a replaceable high-capacity battery) and increases its portability. In addition, the device buffers the monitoring data in the built-in flash memory and then pushes the data over the network to an internet server, where all the computations are performed. Reducing the number of transmissions to the server and shifting computation from the sensor device to the server should reduce power consumption.
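The buffer-then-push idea can be sketched as follows, written in JavaScript (the system's server-side language) for illustration rather than as the device's actual ESP8266 firmware; the batch size and the `send` callback are our own assumptions:

```javascript
// Sketch: accumulate samples in a buffer and make one network
// transmission per full batch instead of one per sample, which is the
// power-saving behavior described above. Illustrative only.
function makeBatcher(batchSize, send) {
  let buffer = [];
  return function record(sample) {
    buffer.push(sample);           // buffer the sample locally
    if (buffer.length >= batchSize) {
      send(buffer);                // one transmission for the whole batch
      buffer = [];                 // start a fresh batch
    }
  };
}
```

For example, with a batch size of 3, seven recorded samples trigger only two transmissions; the last partial batch stays buffered until it fills.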

System Architecture
Modern software widely applies web technologies to facilitate interoperability between programs and to exchange data among servers. Thanks to recent advances in JavaScript technology, particularly the server-side run-time environment (i.e., Node.js) and client-side frameworks and libraries, software can enjoy JavaScript's benefits: scalability, high performance, and asynchronous non-blocking execution. Hence, our server software integrates several JavaScript server-side and user-interface libraries: Node.js, Socket.IO, Three.js, and C3.js (see Figure 2). All the server-side JavaScript components run in the Node.js environment. Socket.IO handles real-time communication, sending bi-directional messages for pre-defined events between the client side (i.e., the user's browser) and the server side. Each axis's displacement values, as well as each 3D element's transformation and movement data, are embedded in the server-to-client messages sent via the protocol Socket.IO establishes on both sides. Three.js, on the client-side page, uses WebGL to provide application programming interfaces (APIs) for creating and rendering the 3D model. We created an online 3D model editor and a real-time display based on Three.js; since Three.js can run in any browser for online 3D model rendering and editing, the 3D model can be displayed on any client [24]. C3.js is the chart library that shows real-time acceleration values on an x-y line chart.

The software architecture of our server is layered. The topmost layer is HTTP, responsible for the requests and responses between the sensor devices and the server; all IoT sensor monitoring results are pushed to this server at each device's sampling frequency via the HTTP protocol. All the required computations sit between the HTTP and data-storage layers.
Inside the computation layer, each monitoring point's global displacements are calculated from the monitored acceleration values, and these global displacements are then converted into each 3D structural element's rotation and shift. For data storage, the system uses MongoDB, a document-oriented (NoSQL) database.
The SHM system supports two kinds of usage (see Figure 3). First, the 3D model is built according to the real-world civil structure, and all sensor device details (e.g., locations and identification numbers) are scaled down to the model; the model and sensor details are saved in the database. When monitoring starts, all real-time monitoring results collected from the IoT sensor devices are sent to the server via the wireless network. To reduce the transmitted data, the monitoring results are first sent to the local server, which then sends a replica and the computation results to the cloud server. During this monitoring phase, once a user's client accesses either server, the user sees the 3D model moving in real time.
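For illustration, a server-to-client message of the kind described above might look like the following; the event name and every field name are hypothetical, not the system's actual protocol:

```javascript
// Sketch of one real-time frame the server could push to browsers.
// With Socket.IO this would be sent as, e.g., io.emit("frame", payload)
// and received with socket.on("frame", render). Field names are ours.
const payload = {
  timestamp: 1614850000123, // server clock time, milliseconds
  // per-axis global displacement at one monitoring point, meters
  displacements: { x: 0.004, y: -0.002, z: 0.0 },
  // per-element local rotation (radians) and shift (meters) for the 3D model
  elements: [
    { id: "column-01", rotation: { rx: 0.001, ry: 0.002 }, shift: { z: -0.00001 } },
  ],
};
```

A message per sampling step keeps the client's Three.js scene in lockstep with the server-side computation.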


Challenges and Assumptions
The real-world structural displacement combines the deformation of each element (e.g., bending or shrinking) and the ground movement. The actual deformation of a structure depends on its material, shape, and the applied force, and after the structure is built, the composition of the material is often unknown. That is why most SHM sensor devices only monitor acceleration and issue warnings when the monitored acceleration approaches the design value. The many unknown factors of an already-built structure make it particularly challenging for our system to display each structural element's actual deformation and displacement on its 3D model from the three-dimensional monitored displacements.
The overall displacements derived from the sensors' monitoring results are the combined effect of horizontal and vertical forces. Traditional systems use forces and a material stress-strain model to derive the displacements of each structural element. To treat the monitored displacements as the aftereffect of complex combined forces and use only displacements to illustrate the structure's 3D model movement, one assumption has to be made: we assume that every column section between two monitoring points is a rigid body with no deformation. This assumption overlooks the initially linear and later non-linear behaviors of most construction materials. We attempt to overcome this weakness by putting more sensors on one column. If the four corners of a floor are monitoring points supported by columns, the deformation of the floor can be determined from the displacements of the column at each corner.
Our primary focus is on horizontal displacements. We limit our problem to horizontal measurements, meaning that the vertical (i.e., z-axis) measurement is neglected and treated as a shift of the entire structure. Because horizontal forces are the main cause of a structure's horizontal displacement, the contribution of vertical forces to the horizontal displacement can be neglected.

Structural Global Displacement
Displacement can be derived from numerical integration of acceleration: integrating acceleration over time gives velocity, and the area under the velocity curve over time is the displacement. Previous studies [25-28] have suggested methods to calculate displacement from accelerometer data, so we applied those integration equations to obtain the global displacement at each sensor location.
Obtaining the global displacement at each monitoring point is the first step in visualizing the 3D model's movements. However, the global displacement alone is insufficient to understand how each connected structural element moves (e.g., how a floor or column moves), because elements in the 3D model rotate and move in their local coordinate systems. As a result, we must translate the global displacements into each element's rotation and displacement in its local coordinate system.
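The integration step can be sketched as follows. This is a minimal trapezoidal double integration with mean removal on each pass to limit drift; the function names and the sampling period `dt` are our own, and the cited studies [25-28] describe more robust correction schemes:

```javascript
// Subtract the mean so a constant bias does not accumulate during integration.
function removeMean(samples) {
  const mean = samples.reduce((s, v) => s + v, 0) / samples.length;
  return samples.map((v) => v - mean);
}

// Trapezoidal rule: cumulative area under the sample curve, step dt seconds.
function integrateTrapezoidal(samples, dt) {
  const out = [0];
  for (let i = 1; i < samples.length; i++) {
    out.push(out[i - 1] + ((samples[i - 1] + samples[i]) / 2) * dt);
  }
  return out;
}

// Acceleration -> velocity -> displacement, detrending between passes.
function accelerationToDisplacement(accel, dt) {
  const velocity = removeMean(integrateTrapezoidal(removeMean(accel), dt));
  return integrateTrapezoidal(velocity, dt);
}
```

In practice the device's sampling frequency fixes `dt`, and the output series gives the global displacement history at that sensor's location.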

Conversion from Global Displacement to Element's Local Rotation and Displacement
Earthquakes produce three-dimensional ground forces, and the displacements and rotations of the structure can be viewed as the result of these three ground forces. Figure 4 shows the movements of two connected columns (i.e., Column01 and Column12) caused by the horizontal forces, from their position at time t_0 to their position at t_1. Every point P_ij is a sensor monitoring location, denoting the i-th position at the j-th step. To illustrate how the global-to-local conversion equations are derived in this situation, we divide the movement between these two timestamps into six steps. The first three steps are displacements along the Y_G, X_G, and Z_G axes, and the latter three steps are rotations about the X_L, Y_L, and Z_L axes. Since all six steps are caused by forces along X_G and Y_G, there is no rotation about the Z_L axis.
Taking Column01 as an example, its two rotation angles are

r_y0 = sin⁻¹((X_1(t_1) − X_0(t_1)) / L_1)    (1)

r_x0 = sin⁻¹((Y_1(t_1) − Y_0(t_1)) / (L_1 cos r_y0))    (2)

where L_1 is the length of Column01, X_1(t_1) and Y_1(t_1) are the coordinates of the top point of Column01 at time t_1, X_0(t_1) and Y_0(t_1) are the coordinates of its bottom point at time t_1, r_y0 is the angle about the Y_L axis, and r_x0 is the angle about the X_L axis. Both rotations also contribute a z-axis displacement d_z1, whose value at time t_1 can be established after r_y0 and r_x0 are computed:

d_z1 = L_1 (1 − cos r_y0 cos r_x0)    (3)

Z_1(t_1) = L_1 cos r_y0 cos r_x0    (4)

When multiple columns are connected and two rotations are applied at a particular column n, its two rotation angles and z-axis displacement can be calculated from the generalized equations:

r_yn−1 = sin⁻¹((X_n(t_i) − X_n−1(t_i)) / L_n)    (5)

r_xn−1 = sin⁻¹((Y_n(t_i) − Y_n−1(t_i)) / (L_n cos r_yn−1))    (6)

d_zn = L_n (1 − cos r_yn−1 cos r_xn−1)    (7)

Z_n(t_i) = L_n cos r_yn−1 cos r_xn−1    (8)

where L_n is the length of the n-th column, X_n(t_i) and Y_n(t_i) are the coordinates of the top point of the n-th column at time t_i, X_n−1(t_i) and Y_n−1(t_i) are the coordinates of its bottom point at time t_i, r_yn−1 is the angle about the Y_L axis, and r_xn−1 is the angle about the X_L axis. However, 3D models are designed to rotate about the center of the object, unlike the above scenario of rotating about the bottom of each column. To translate the rotation in Figure 5 to a rotation at the bottom, the 3D model element must include a shift after it rotates by the required angle. Given multiple connected columns, at the n-th column we have to consider two types of shifts. The first comes from the displacement of its bottom point, which is caused by the (n−1)-th element's shift. The bottom point's shift is measured from time t_i−1 to t_i, and since its movement is in three directions, each direction's displacement is the difference of the coordinate values on that axis.
Therefore, the bottom point's displacement along the x-axis is X_n−1(t_i) − X_n−1(t_i−1), and the same principle applies to the y- and z-axes. The second type of movement, caused by the rotation, is illustrated in Figure 6. For the n-th column rotating about the x- and y-axes, the total displacements of its center point are:

d_cxn−total = (L_n/2) sin r_yn−1 + (X_n−1(t_i) − X_n−1(t_i−1))    (9)

d_cyn−total = (L_n/2) sin r_xn−1 + (Y_n−1(t_i) − Y_n−1(t_i−1))    (10)

The displacement at the z-axis is the sum of the center point's decline introduced by both rotations (i.e., d_czn + d_czn′) plus the bottom point's movement:

d_czn−total = (L_n/2)(1 − cos r_yn−1) + (L_n/2)(1 − cos r_xn−1) + (Z_n−1(t_i) − Z_n−1(t_i−1))    (11)
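The generalized column conversion above can be sketched as a small function. The function name and data layout are our own illustration, and it assumes the displacement ratios stay within ±1 (i.e., rotations small enough that the arcsine is defined):

```javascript
// Sketch: convert a column's global top/bottom coordinates into its
// local rotation angles and z-axis shortening, following the generalized
// column equations above. `L` is the column length; `top` and `bottom`
// hold global x/y coordinates at the same time step. Illustrative only.
function columnLocalRotation(L, top, bottom) {
  const ry = Math.asin((top.x - bottom.x) / L);                 // angle about local Y
  const rx = Math.asin((top.y - bottom.y) / (L * Math.cos(ry))); // angle about local X
  const dz = L * (1 - Math.cos(ry) * Math.cos(rx));             // z-axis shortening
  return { rx, ry, dz };
}
```

For an undisplaced column the function returns zero rotation and zero shortening, and a pure x-offset of L·sin(θ) at the top recovers the angle θ about the local Y axis.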
The global coordinate of the center point measured at time t_i is given in the following equation.
We can apply the same idea to derive the conversion from global displacement to a beam element's local rotation and displacement. For a beam parallel to the y-axis, its rotations can be about the z- or x-axis (see Figure 7).
r_zn−1 = sin⁻¹((Z_n(t_i) − Z_n−1(t_i)) / (L_n cos r_xn−1))    (14)

d_yn = L_n (1 − cos r_xn−1 cos r_zn−1)    (15)

Similarly, for a beam parallel to the x-axis with rotations about the z- or y-axis, the equations for obtaining the angles are Equation (16) and Equation (14), respectively.

Sensor Deployment
Past research [29-31] has discussed the importance of IoT device deployment. In our system, sensor location is particularly critical because it determines the level of detail of the monitored structure's overall transformation and movement. Sensors are installed at the monitoring points, and the sections between two sensor devices are treated as connected columns or beams. As a result, when a structure is modeled, its model must be composed of separate elements whose ends are the monitoring points. Sensors must be added to columns and joints before being attached to beams. By adding more sensor devices to each structural element (i.e., beam or column), non-linear deformation can be shown; Figure 8 shows that more detailed deformation can be displayed as the number of sensors increases. Note that the minimum sensor installation is at the joints on the columns; otherwise, the equations above cannot be properly applied to display movements in our SHM system.


Timestamp
A timestamp, or time, is critical to a real-time system. The sensor device only contains a timer that measures time in milliseconds, starting from zero when it boots; the timestamp keeps increasing until the device shuts down. We obtain the time by mapping the timestamp from the device's timer to the server's clock time, sending a request and receiving the server's response to make the inquiry and record the time. Figure 9 shows that the timestamp x_1 should be mapped to the server's clock time t_1. Due to latency, by the time the server records the clock time tr_1 associated with x_1, the time dtr_1 has already passed; hence, the mapped server time cannot be exact.

We calculate the average latency and use it to adjust the clock time, bringing the recorded mapping time closer to the time that should be mapped. The following equations show how we map timer values x_1 and x_2 to clock times t_1 and t_2, respectively.
According to the equations above, the mapped clock time in each sensor device is an approximate value. Therefore, it is difficult for all devices to sample acceleration at the same clock time; worse, different devices experience different network latencies. The best strategy is therefore to reduce the latency, which makes a local server that communicates with all the devices the better solution.
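A minimal sketch of this latency-adjusted mapping, assuming the one-way latency is half the measured round trip (the function and variable names are ours, not the paper's):

```javascript
// One sync exchange: the device notes its timer when it sends the request
// and when it receives the reply carrying the server's clock time. The
// estimated offset maps any later timer value to approximate clock time.
function estimateOffset(timerAtSend, timerAtReceive, serverClock) {
  const latency = (timerAtReceive - timerAtSend) / 2; // assumed symmetric
  // serverClock corresponds roughly to the device instant timerAtSend + latency
  return serverClock - (timerAtSend + latency);
}

function toClockTime(timer, offset) {
  return timer + offset; // approximate server clock time for this timer value
}
```

For instance, a request sent at timer 1000 ms and answered at 1040 ms with server clock 500,000 ms yields an estimated latency of 20 ms and an offset of 498,980 ms, so a later sample at timer 2000 maps to clock time 500,980.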

Data Format
To use our suggested method to display the overall movement of the structure in its 3D model, the SHM system needs to save more than measurement data. Sensor deployment information and the relations among different types of structural elements, such as floors, columns, and beams, are also required. For example, to calculate the displacement of a column, all the monitoring points at both ends of the connected parts are needed.
All the relations are organized in key-value pairs and arrays in the JavaScript Object Notation (JSON) format. Figure 10 shows an example of how the data are saved.  The structural 3D model is created after the user edits it on the website. The model data saved in the JSON format contain camera information, geometries, materials, and objects. After the monitoring data are converted into each object's rotation, displacement, or even transformation, the dynamic behaviors under external forces can be shown in its 3D model.
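As an illustration of such a structure (the exact keys in Figure 10 may differ; `sensors`, `elements`, and the field names below are assumptions), the JSON can tie each sensor's unique identifier to its spatial location and list the monitored joints each element connects:

```javascript
// Illustrative shape of the relations JSON; the actual fields in
// Figure 10 may differ. Each sensor carries a unique ID and a spatial
// location, and each element lists the joints it connects.
const relations = {
  sensors: [
    { id: "dev-01", joint: "J1", position: [0, 0, 0] },
    { id: "dev-02", joint: "J2", position: [0, 3, 0] }
  ],
  elements: [
    { type: "column", name: "C1", joints: ["J1", "J2"] }
  ]
};

// To move a column, gather the monitoring points at both of its ends.
function sensorsForElement(data, name) {
  const element = data.elements.find(e => e.name === name);
  return data.sensors.filter(s => element.joints.includes(s.joint));
}
```

With this shape, looking up a column's name yields exactly the sensor devices whose measurements drive that element's movement.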

Sensor Device Software
The sensor device's user interface is a tab-like web page (see Figure 11). The "Setup" tab allows users to connect to a wireless network so that the device can acquire an IP address from the router (see Figure 12). Users configure the server IP and hostname in the "Cloud" tab. After all the required information is set, the "Home" tab shows the device's IP and identification number. The "Home" tab provides the real-time acceleration measurements in the x, y, and z directions after the enable button is clicked. Users can also select a different sample rate. The buffer size input specifies the internal memory buffer for caching the monitoring data. In the "Setup" tab, a user account and password can be set to prevent unauthorized access to the sensor box's user interface, so that saved configurations cannot be altered.

Server Software
The implementation contains three major parts: (1) project, structural model, and sensor information; (2) a 3D model building tool; and (3) a real-time display of 3D model movement. Users can create a project that includes various structures. Sensor device locations and identifications are saved in the server. After the user creates a 3D model in our application, the model data are automatically saved in the server for later display. The last part is the real-time view of the 3D model movement. The user can move the camera position or zoom the view in and out to see the movement from different angles. Figure 13 shows the 3D model editor in our server application inside our SHM system. All the structural elements are built using the "box" object. The Three.js client-side component allows users to use any browser to edit their 3D models according to the to-be-monitored structure. Because a browser is used, editing from anywhere is possible. We also want to emphasize why global-to-local conversion is necessary in real-time display. In Figure 14, the local coordinate system of each structural element is at its center. It is important to note that the displacement, rotation, and transformation are all expressed in each element's local coordinate system instead of the global coordinate system. Each element's local coordinate system changes after its local rotation or shift, which introduces the challenge of global-to-local conversion.
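A minimal 2D sketch of one such conversion step, under the paper's rigid-body assumption: given the global horizontal displacements of a column's two end joints, the element's local displacement (at its center) and its in-plane rotation follow from simple geometry. This reduction and the function name are illustrative, not the paper's full 3D equations.

```javascript
// Illustrative 2D reduction of global-to-local conversion under the
// rigid-body assumption; not the paper's full 3D equations.
function columnLocalMotion(bottomDx, topDx, length) {
  // bottomDx/topDx: global horizontal displacements of the two end joints;
  // length: undeformed column length.
  const centerShift = (bottomDx + topDx) / 2;            // local displacement at center
  const rotation = Math.atan2(topDx - bottomDx, length); // local in-plane rotation (rad)
  return { centerShift, rotation };
}
```

Because each element's local frame moves with the element, the result of one step becomes the starting frame for the next, which is exactly what makes the conversion challenging in real time.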

Validation
Our validation targets the correct display of real-time 3D movements according to the structural monitoring data sent by multiple sensor devices. Unsynchronized monitoring timestamps across different sensor devices or inaccurate global-to-local conversion equations will cause detached structural model parts (e.g., a beam disconnecting from a column) or unreasonable element movements (e.g., a clockwise rotation rendered as an incorrect counterclockwise rotation).
There are two parts in the validation. First, we test the time adjustment in the sensor device. Four devices are put on the small-scale shaking table and shaken with a maximum acceleration of 0.2 g. The monitored acceleration values are then examined at each timestamp. The second part is a simulation for investigating our conversion approach, a step before we later conduct a scaled-structure experiment on a shaking table. We build a two-story 3D model for displaying the movement. On the model, sensors are located at each joint, so we use eight sensor devices to send the simulation data to the server on the local network. The movements of the 3D model are checked via a browser that accesses the server. Figure 15 shows the procedure we use to validate the conversion equations.

Time and Latency
Since the four devices are moved with the same acceleration, after our time adjustment for each device, all of them are expected to report the same acceleration value at the same timestamp. This implies that their wave patterns should be very close to each other. According to Figure 16, the time adjustment is accurate because the four monitoring curves are very close. Table 1 shows the standard deviation of the acceleration values of the four devices at each timestamp. A very low maximum standard deviation also validates our time adjustment method.

Table 1. Standard deviation of the acceleration values of the four devices.
Maximum Standard Deviation: 0.0950045
Minimum Standard Deviation: 0
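The per-timestamp check behind Table 1 can be sketched as follows (the function names and data shape are illustrative): for each timestamp, compute the standard deviation of the four devices' readings; a small maximum indicates well-aligned timers.

```javascript
// Illustrative per-timestamp spread check behind Table 1.
function stdDev(values) {
  const mean = values.reduce((a, b) => a + b, 0) / values.length;
  const variance =
    values.reduce((a, v) => a + (v - mean) ** 2, 0) / values.length;
  return Math.sqrt(variance);
}

function perTimestampStdDev(series) {
  // series: { timestamp: [reading from each of the four devices] }
  return Object.fromEntries(
    Object.entries(series).map(([t, vals]) => [t, stdDev(vals)])
  );
}
```

The maximum of these per-timestamp values is the figure reported in Table 1.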

3D Movement Calculation
Our server hosting the real-time SHM system runs on a machine with a one-core CPU, 512 MB of RAM, and Ubuntu OS on the local network. The two-story structural 3D model (see Figure 13) is built to validate our proposed global-to-local conversion. A sensor is assigned to each joint, so eight devices are used in the simulation. The acceleration data pushed by the sensors continuously contribute the displacements to the server at a sampling rate of 20 Hz. It is noteworthy that the monitoring sample interval is different from the display interval; in our validation, we use one second. This avoids a display lag on the client's browser, which would be caused by too-frequent updates and heavy client-server data traffic.
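One common way to turn acceleration samples into displacements is double numerical integration, sketched below with the trapezoidal rule. This is an illustrative assumption about the data path, not necessarily the paper's exact method; as the conclusions note, such integration drifts and is not ideal.

```javascript
// Illustrative double integration (trapezoidal rule) from acceleration
// samples to displacements; drift accumulates over time, so this is a
// sketch of the data path, not a recommended displacement estimator.
function integrateTwice(accel, dt) {
  let v = 0, d = 0, prevA = 0;
  const displacements = [];
  for (const a of accel) {
    const prevV = v;
    v += ((prevA + a) / 2) * dt; // acceleration -> velocity
    d += ((prevV + v) / 2) * dt; // velocity -> displacement
    displacements.push(d);
    prevA = a;
  }
  return displacements;
}
```

At the 20 Hz sample rate used here, `dt` would be 0.05 s.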
Two criteria were used to validate the global-to-local conversion. First, at each joint, connected elements cannot detach during any movement, because our conversion equations assume rigid-body rotations and displacements with no broken joints. The other criterion examines whether rotations and displacements occur in the correct direction. Three people inspected the movements using their browsers at the same time, checked the model movements from various angles by moving the camera, and then made observation notes for each movement. Figure 17 shows that eight movements repeat the same sequence continuously because the sensor devices keep delivering a series of pre-defined acceleration values. Throughout the whole simulation, the inspectors saw no disjoints between elements or wrong rotations, indicating that the system displayed movements correctly and validating our conversion equations. Because a successful 3D model movement also relies on synchronous monitoring time and accurate movement interpolation, correct 3D model movements also indirectly validate our time synchronization and interpolation methods. These validations are performed through the simulation, which prepares our SHM system to reproduce the actual movements of the to-be-monitored structure.
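The first criterion, no detached joints, could also be expressed as an automated check (the names and data shape below are illustrative; the paper's inspection was performed visually by three people): the endpoint positions that different elements report for a shared joint must coincide within a tolerance.

```javascript
// Illustrative automated form of the "no detached joints" criterion.
function jointsIntact(frame, tolerance = 1e-6) {
  // frame: { jointName: [position reported by each connected element] }
  return Object.values(frame).every(positions => {
    const [ref, ...rest] = positions;
    return rest.every(p =>
      Math.hypot(p[0] - ref[0], p[1] - ref[1], p[2] - ref[2]) <= tolerance
    );
  });
}
```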

Discussion and Conclusions
In this paper, we proposed an SHM system that demonstrates real-time structural movements in a 3D model on a website, so that users can view monitoring results from any client device with a browser. In our SHM system, users can build a structural 3D model according to the real-world structure and save all the sensor information. With the model and sensor data, once the real-world structure moves under external forces, the user can see the real-time movements on the display webpage.
Our validation results showed that our conversion method, monitoring time synchronization approach, and displacement adjustment equation could display the overall behaviors correctly. Our current result is the initial step toward accurately displaying real-world structural monitoring results in a 3D model; that is, we still have to consider the elastic and non-linear behaviors of structural elements. We understand that our conversion equations have limitations due to our rigid-body assumptions, since real-world structures under external forces do not merely act as rigid bodies. Although we can use more sensors to approximately display bent structural elements, we will need to study a 3D deformation equation to show the non-linear behaviors of every structural element in the future. We also know that using acceleration values to obtain displacements might not be ideal. However, we argue that this deficiency does not impact our proposed approach. In other words, once we can apply a more accurate displacement measurement technique, our SHM system can display more realistic 3D movements of the to-be-monitored structure.