Research on ATT-BiLSTM-Based Restoration Method for Deflection Monitoring Data of a Steel Truss Bridge
Abstract
1. Introduction
2. Project Examples
3. Monitoring Point Selection and Anomalous Data Detection
3.1. Selection of Monitoring Points
3.2. Abnormal Data Detection
- 1. Original sequence
- 2. Sequence normalization
- 3. Calculate the sequence of association coefficients
- 4. Gray correlation calculation
4. ATT-BiLSTM Anomaly Data Processing
4.1. ATT-BiLSTM Data Preprocessing
- (1) Data standardization
- Dataset organization: according to the deflection monitoring data, the records are organized into the original feature dataset.
- Z-score normalization: z = (x − μ)/σ, where μ and σ are the mean and standard deviation of each feature.
- (2) Division of the dataset
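The two preprocessing steps can be sketched as follows. The synthetic data and the 80/20 chronological split are assumptions for illustration; the paper does not state its exact split ratio.

```python
import numpy as np

# Hypothetical deflection series from three monitoring points (one column each),
# standing in for the 100,000-point series described in the paper.
rng = np.random.default_rng(0)
data = rng.normal(loc=10.0, scale=2.0, size=(1000, 3))

# Z-score normalization: z = (x - mu) / sigma, per feature column.
mu, sigma = data.mean(axis=0), data.std(axis=0)
z = (data - mu) / sigma

# Chronological split into training and test sets (80/20 is an assumption).
split = int(0.8 * len(z))
train, test = z[:split], z[split:]
print(train.shape, test.shape)  # (800, 3) (200, 3)
```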
4.2. ATT-BiLSTM Model Setup and Training
- (1) Hyperparameter settings
- Learning Rate: The learning rate is a pivotal hyperparameter that dictates the step size of parameter updates and therefore the rate of convergence. A modest value was chosen to guarantee steady convergence during training: learning_rate = 0.001. Experiments confirmed that this rate allowed the model to converge quickly without stepping past the optimal solution.
- Dropout Rate: The dropout rate controls the random disconnection of neurons during training. It is set to 0.01 (dropout_rate = 0.01) to preserve the model’s expressive capacity while mitigating the risk of overfitting.
- Number of Layers: The number of network layers refers to the number of LSTM layers in the model. According to related studies, a two-layer bidirectional LSTM is beneficial for the model to capture long-term dependencies in the input sequences. Therefore, the number of network layers in this model is set to two, which is specified by num_layers = 2 [29].
- Hidden Size: The number of hidden nodes, i.e., the number of neurons in each LSTM layer, is a pivotal model parameter. To let the model learn the task features without becoming excessively complex, it was set to 128 after parameter tuning (hidden_size = 128).
- (2) Model structure setup
- BiLSTM Layer: A bidirectional LSTM layer, created with nn.LSTM, models the input sequence in both directions. It has input_size input features and num_layers stacked layers; because the layer is bidirectional, it outputs hidden_size × 2 features.
- Attention Mechanism: The attention mechanism strengthens the model’s focus on the informative parts of the input sequence. Its inputs and outputs both have size hidden_size × 2, implemented as two linear layers created with nn.Linear.
- Fully Connected Layer: A linear layer, created with nn.Linear, maps the attention output to the final output dimension, with hidden_size × 2 input features and 1 output feature.
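Assembled in PyTorch, the three components above might look like the following sketch. The layer sizes (hidden_size = 128, num_layers = 2, dropout_rate = 0.01) come from the text; the softmax-weighted attention pooling and the choice of input_size = 2 (the two flanking measurement points, as in the repair tables later) are assumptions, not the authors’ exact implementation.

```python
import torch
import torch.nn as nn

class ATTBiLSTM(nn.Module):
    """Sketch of the ATT-BiLSTM described in the text (assumed details noted above)."""

    def __init__(self, input_size=2, hidden_size=128, num_layers=2, dropout_rate=0.01):
        super().__init__()
        # Two-layer bidirectional LSTM; outputs hidden_size * 2 features per step.
        self.bilstm = nn.LSTM(input_size, hidden_size, num_layers=num_layers,
                              batch_first=True, bidirectional=True,
                              dropout=dropout_rate)
        # Attention: two linear layers mapping hidden_size*2 -> hidden_size*2.
        self.att = nn.Sequential(
            nn.Linear(hidden_size * 2, hidden_size * 2),
            nn.Tanh(),
            nn.Linear(hidden_size * 2, hidden_size * 2),
        )
        # Fully connected layer mapping hidden_size*2 -> 1 (deflection value).
        self.fc = nn.Linear(hidden_size * 2, 1)

    def forward(self, x):                       # x: (batch, seq_len, input_size)
        out, _ = self.bilstm(x)                 # (batch, seq_len, hidden*2)
        scores = self.att(out)                  # (batch, seq_len, hidden*2)
        weights = torch.softmax(scores, dim=1)  # attention weights over time steps
        context = (weights * out).sum(dim=1)    # (batch, hidden*2)
        return self.fc(context)                 # (batch, 1)

model = ATTBiLSTM()
y = model(torch.randn(4, 20, 2))
print(y.shape)  # torch.Size([4, 1])
```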
- (3) Loss function and optimizer selection
- Loss Function: The primary objective of this model is to predict deflection values, which is a regression task. The mean squared error, the average squared difference between predicted and actual values, is therefore used as the loss function. It is calculated as MSE = (1/n) Σᵢ (yᵢ − ŷᵢ)², where n is the sample size, yᵢ the actual value, and ŷᵢ the predicted value.
- Optimizer: The Adam optimizer (torch.optim.Adam) is an adaptive optimization algorithm that adjusts the learning rate for each parameter individually, which makes training converge more efficiently.
- (4) Training process
- Number of Iterations: The number of training iterations depends on the size of the training set, the complexity of the model, and the available computational resources. Too many iterations risk overfitting, whereas too few prevent the model from learning adequately. Through a series of experiments and adjustments, the number of iterations was set to three.
- Batch Size: The choice of batch size is constrained by computational resources. A larger batch size requires more memory but can speed up training, while a smaller batch size yields noisier gradient estimates. Following numerous experiments and adjustments, the batch size was set to 128.
- Loss Curve: The loss curve plots the value of the loss function over the course of training. During training, the backpropagation algorithm adjusts the model parameters to minimize the squared difference between predicted and actual values, so the model’s prediction accuracy improves gradually and the recorded loss values form the loss curve. The smaller the final loss, the better the model fits the training data.
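A minimal training loop combining the choices above (MSE loss, Adam with lr = 0.001, batch size 128, three epochs) might look like this; the tiny model and synthetic data are placeholders standing in for the ATT-BiLSTM and the deflection dataset.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Placeholder model and data; only the training choices mirror the text.
model = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)

x = torch.randn(1024, 2)                 # stand-in input features
y = x.sum(dim=1, keepdim=True)           # stand-in regression targets
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(x, y), batch_size=128, shuffle=True)

losses = []                              # per-epoch values for the loss curve
for epoch in range(3):                   # number of iterations from the text
    running = 0.0
    for xb, yb in loader:
        optimizer.zero_grad()
        loss = criterion(model(xb), yb)  # squared error between prediction and target
        loss.backward()                  # backpropagation adjusts the parameters
        optimizer.step()
        running += loss.item() * len(xb)
    losses.append(running / len(x))
```

Plotting `losses` against the epoch index gives the loss curve discussed above.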
4.3. ATT-BiLSTM Anomaly Data Repair
5. SVR Anomaly Data Handling
5.1. SVR Data Preprocessing
5.2. SVR Model Setup and Training
- (1) Model setup
- Kernel Function Selection: Given the evident nonlinear characteristics exhibited by the deflection monitoring data, the radial basis function (RBF) was selected as the kernel function for this model (kernel = ‘rbf’).
- Hyperparameter Settings: The regularization parameter C governs the severity of the penalty imposed for errors; it prevents the model from overfitting the training set, over-adapting to noise, and losing the capacity to generalize to new data. Balancing this against the time required for model training, a moderate degree of regularization was chosen, C = 1.0. The epsilon parameter defines the model’s tolerance band during training, i.e., the extent to which samples may deviate without contributing to the loss. Based on tuning experiments and heuristic rules, epsilon was set to 0.2.
- Optimization Algorithm: The SMO algorithm is employed to solve the dual problem of the regression, a convex optimization problem with constraints, and to identify the optimal function [30].
- (2) Model training
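In scikit-learn (an assumption — the paper does not name its implementation library), the configuration above translates to the following sketch; its libsvm backend solves the dual problem with an SMO-type algorithm. The synthetic data mimic the repair tables later in the paper, where the two flanking measurement points predict the point being repaired.

```python
import numpy as np
from sklearn.svm import SVR  # libsvm backend, which uses an SMO-type solver

# SVR configured as described above: RBF kernel, C = 1.0, epsilon = 0.2.
rng = np.random.default_rng(0)
X_train = rng.normal(10.0, 2.0, size=(500, 2))  # stand-ins for points 1 and 3
y_train = X_train.mean(axis=1)                  # stand-in for point 2

model = SVR(kernel='rbf', C=1.0, epsilon=0.2)
model.fit(X_train, y_train)
pred = model.predict(X_train[:5])               # repaired values for 5 samples
```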
5.3. SVR Anomaly Data Repair
6. ATT-BiLSTM and SVR Repair Accuracy Evaluation
6.1. ATT-BiLSTM and SVR Accuracy Validation
6.2. Evaluation of ATT-BiLSTM Error Metrics
- Mean Square Error (MSE): A statistical measure of the discrepancy between the restored value and the true value; the closer the MSE is to 0, the higher the restoration accuracy. The MSE is calculated as shown in Formula (10), where n denotes the sample size, yᵢ the actual observations, and ŷᵢ the model predictions.
- Root Mean Square Error (RMSE): The square root of the mean square error (MSE), which is a metric used to quantify the discrepancy between the values predicted by a model and the actual values [34].
- Coefficient of Determination (R2): A statistical measure of how well a model aligns with observed data. A value of R2 close to 1 indicates a stronger match between the model and the data, suggesting a higher degree of model fit.
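The three metrics follow directly from their definitions above; a minimal NumPy sketch (the sample values are illustrative only):

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error between restored and true values (Formula (10))."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return np.mean((y_true - y_pred) ** 2)

def rmse(y_true, y_pred):
    """Root mean squared error: the square root of the MSE."""
    return np.sqrt(mse(y_true, y_pred))

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - residual sum / total sum of squares."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

y_true = [3.0, 2.0, 4.0, 5.0]   # illustrative values, not the paper's data
y_pred = [2.8, 2.1, 4.2, 4.9]
print(round(mse(y_true, y_pred), 4))  # 0.025
```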
- (1) Global performance analysis
- The MSE is calculated to be 0.0226, indicating that the mean squared error between the predicted and true values of the model in the entire dataset is minimal. This finding reflects the model’s strong predictive ability in the global range and its ability to accurately fit the overall distribution of the data.
- The RMSE is 0.1505, which further confirms that the absolute value of the model’s prediction error is small, indicating that the model’s prediction accuracy in the whole dataset is high.
- R2 is 0.9943, which is close to 1, indicating that the model fits the global dataset well and retains strong predictive power.
- (2) Local performance analysis
- The MSE was calculated to be 0.5576, which is high relative to the global MSE. The error nevertheless remains within an acceptable range, since anomalous data typically carry higher uncertainty and complexity [35].
- The RMSE is 0.7467, which is marginally higher than the global RMSE due to the nature of the anomalous data. However, the accuracy generally meets the requirements.
- The R2 value of 0.9388 indicates a slight decrease compared to the global R2 value, yet it remains close to 1, suggesting that the model continues to adequately fit the anomalous data and effectively identify and repair most anomalous data while maintaining high prediction accuracy. The comparison graph of the original anomalous data and the repaired anomalous data is shown in Figure 12.
7. Conclusions and Prospects
7.1. Conclusions
7.2. Prospects
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Fahmy, A.S.; El-Madawy, M.E.T.; Gobran, Y.A. Using artificial neural networks in the design of orthotropic bridge decks. Alex. Eng. J. 2016, 55, 3195–3203. [Google Scholar] [CrossRef]
- Martins, H.M.; Thrall, A.P.; Byers, D.D.; Zoli, T.P. Behavior of incrementally launched modular steel truss bridges. Eng. Struct. 2025, 326, 119509. [Google Scholar] [CrossRef]
- Zhai, M.S.; Qian, J.C.; Chu, L.; Tao, Y.R. Load limit values of steel bridge decks based on fatigue reliability. J. Traffic Transp. Eng. 2024, 24, 245–256. (In Chinese) [Google Scholar]
- López, S.; Barros, B.; Buitrago, M.; Oswaldo, M.N.; Jose, M.A.; Belen, R. Reliability-based structural assessment of steel truss bridges subjected to failure scenarios. Eng. Struct. 2025, 341, 120850. [Google Scholar] [CrossRef]
- Yue, Q.R.; Xu, G.; Liu, X.G. Research on intelligent identification and monitoring method of bridge cracks. China J. Highw. Transp. 2024, 37, 16–28. (In Chinese) [Google Scholar]
- Deng, L.; Lai, S.; Ma, J.; Lei, L.; Zhong, M.; Liao, L.; Zhou, Z. Visualization and monitoring information management of bridge structure health and safety early warning based on BIM. J. Asian Archit. Build. Eng. 2022, 21, 427–438. [Google Scholar] [CrossRef]
- Civera, M.; Pecorelli, M.L.; Ceravolo, R.; Surace, C.; Fragonara, L.Z. A multi-objective genetic algorithm strategy for robust optimal sensor placement. Comput.-Aided Civ. Infrastruct. Eng. 2021, 36, 1185–1202. [Google Scholar] [CrossRef]
- Nicoletti, V.; Quarchioni, S.; Amico, L.; Gara, F. Assessment of different optimal sensor placement methods for dynamic monitoring of civil structures and infrastructures. Struct. Infrastruct. Eng. 2024, 1–16. [Google Scholar] [CrossRef]
- Zhang, Y.; Lei, Y. Data Anomaly Detection of Bridge Structures Using Convolutional Neural Network Based on Structural Vibration Signals. Symmetry 2021, 13, 1186. [Google Scholar] [CrossRef]
- Mao, Y.X.; Xiao, F.; Tian, G.; Xiang, Y.J. Sensitivity analysis and sensor placement for damage identification of steel truss bridge. Structures 2025, 73, 108310. [Google Scholar] [CrossRef]
- Chang, L.; Li, R.F.; Li, Z.W. Research on construction unloading monitoring technology of triangular pyramid space steel structure with a span of 120 m. J. Build. Struct. 2020, 41, 142–148+165. (In Chinese) [Google Scholar]
- Liu, Z.H.; Qin, X.P.; Li, L.; Yi, H.Y.; Liu, H.L. Fault diagnosis of bridge strain sensors based on mutual correlation analysis. Technol. Highw. Transp. 2022, 38, 77–83. (In Chinese) [Google Scholar]
- Fu, M.; Liang, Y.; Feng, Q.; Wu, B.; Tang, G. Research on the application of multi-source data analysis for bridge safety monitoring in the reconstruction and demolition process. Buildings 2022, 12, 1195. [Google Scholar] [CrossRef]
- Li, Y.W.; Ding, Y.L.; Zhao, H.W.; Sun, Z. Data-driven structural condition assessment for high-speed railway bridges using multi-band FIR filtering and clustering. Structures 2022, 41, 1546–1558. [Google Scholar] [CrossRef]
- Sadhu, A.; Peplinski, J.E.; Mohammadkhorasani, A.; Moreu, F. A Review of Data Management and Visualization Techniques for Structural Health Monitoring Using BIM and Virtual or Augmented Reality. J. Struct. Eng. 2023, 149, 1–18. [Google Scholar] [CrossRef]
- Zhao, H.Q.; Jian, F.L.; Dan, D.H.; Zhao, Y.M.; Yin, X.B. Design and Application of Health Monitoring System for Long-Span Continuous Steel Truss Bridges. World Bridges 2024, 52, 56–63. (In Chinese) [Google Scholar]
- Feng, J.; Gao, K.; Wu, G.; Xu, Y.; Jiang, H. A deep learning-based interferometric synthetic aperture radar framework for abnormal displacement deformation prediction of bridges. Adv. Struct. Eng. 2023, 26, 3005–3020. [Google Scholar] [CrossRef]
- Zhu, Q.X.; Wang, H.; Mao, J.X.; Wan, H.P.; Zheng, W.Z.; Zhang, Y.M. Investigation of temperature effects on steel-truss bridge based on long-term monitoring data: Case study. J. Bridge Eng. 2020, 25, 05020007. [Google Scholar] [CrossRef]
- Piao, C.H.; Ji, M.M.; Zhang, Z.G.; Liu, Y.H.; Li, Z.Y.; Dong, X. Research on identification of train load and local health state of bridge deck system based on CNN-LSTM deep learning. J. China Railw. Soc. 2022, 44, 135–145. (In Chinese) [Google Scholar]
- Wang, Z.C.; Wang, Y. Bridge weigh-in-motion through bidirectional Recurrent Neural Network with long short-term memory and attention mechanism. Smart Struct. Syst. 2021, 27, 241–256. [Google Scholar]
- Chen, Y.; Sun, H.; Feng, Z. Study on seismic isolation of long span double deck steel truss continuous girder bridge. Appl. Sci. 2022, 12, 2567. [Google Scholar] [CrossRef]
- Yang, G.W.; Zheng, Y.P. General design and key technologies of Fuzhou Daoqingzhou Bridge. Bridge Constr. 2020, 50, 62–68. (In Chinese) [Google Scholar]
- Lu, J.; Xiang, Y.; Wei, X.M.; Huang, Y.Y. A high-precision calculation model and method for analyzing the large deflection deformation of microbeams under electrostatic force. Eng. Mech. 2009, 26, 250–256. (In Chinese) [Google Scholar]
- Chen, S.L.; Liu, Y.Q.; Zhang, Y.B. Experimental Study on Static and Dynamic Performance of Long-Span Steel Truss Girder Bridges. Railw. Stand. Des. 2016, 60, 38–42. (In Chinese) [Google Scholar]
- Zhang, J.; Zhang, A.; Li, J.; Li, F.; Peng, J. Gray correlation analysis and prediction on permanent deformation of subgrade filled with construction and demolition materials. Materials 2019, 12, 3035. [Google Scholar] [CrossRef] [PubMed]
- Song, S.H.; Niu, Y.N.; Kong, L.P. Correlation analysis of pore structure and frost resistance of carbon nanotube concrete based on gray relational theory. Struct. Concr. 2024, 25, 2855–2867. [Google Scholar] [CrossRef]
- Han, M.; Zhang, R.Q.; Xu, M.L. A Variable Selection Algorithm Based on Improved Grey Relational Analysis. Control Decis. 2017, 32, 1647–1652. (In Chinese) [Google Scholar]
- Han, Y.; Li, J.; Ma, H.Y.; Sun, Z.P.; Pang, K. CNN-LSTM based structural damage diagnosis method for bridges. Foreign Electron. Meas. Technol. 2021, 40, 1–6. (In Chinese) [Google Scholar]
- Xu, Z.K.; Chen, J.; Shen, J.X.; Xiang, M.J. Recursive long short-term memory network for predicting nonlinear structural seismic response. Eng. Struct. 2022, 250, 113406. [Google Scholar] [CrossRef]
- Sun, L.M.; Shang, Z.Q.; Xia, Y. Current status and prospect of bridge structural health monitoring research in the context of big data. China J. Highw. Transp. 2019, 32, 1–20. (In Chinese) [Google Scholar]
- Tanioka, K.; Hiwa, S. Low-Rank Approximation of Difference between Correlation Matrices Using Inner Product. Appl. Sci. 2021, 11, 4582. [Google Scholar] [CrossRef]
- Waller, N.G. Fungible Correlation Matrices: A Method for Generating Nonsingular, Singular, and Improper Correlation Matrices for Monte Carlo Research. Multivar. Behav. Res. 2016, 51, 554–568. [Google Scholar] [CrossRef]
- Liu, H.J.; Chen, C.; Guo, Z.Q.; Xia, Y.Y.; Yu, X.; Li, S.J. Overall grouting compactness detection of bridge prestressed bellows based on RF feature selection and the GA-SVM model. Constr. Build. Mater. 2021, 301, 124323. [Google Scholar] [CrossRef]
- Alatise, M.B.; Hancke, G.P. Pose Estimation of a Mobile Robot Based on Fusion of IMU Data and Vision Data Using an Extended Kalman Filter. Sensors 2017, 17, 2164. [Google Scholar] [CrossRef]
- Mao, J.X.; Wang, H.; Spencer, B.F. Toward Data Anomaly Detection for Automated Structural Health Monitoring: Exploiting Generative Adversarial Nets and Autoencoders. Struct. Health Monit. 2021, 20, 1609–1626. [Google Scholar] [CrossRef]
Data Point | Gr1_2 | Gr2_3 | Gr3_1 |
---|---|---|---|
0 | 0.884405 | 0.868338 | 0.870946 |
1 | 0.876814 | 0.857433 | 0.864177 |
2 | 0.849149 | 0.843059 | 0.845116 |
3 | 0.818202 | 0.810212 | 0.806560 |
4 | 0.803600 | 0.798713 | 0.788723 |
5 | 0.809120 | 0.807189 | 0.803115 |
6 | 0.797163 | 0.785178 | 0.805654 |
… | … | … | … |
99,995 | 0.929417 | 0.929678 | 0.929959 |
99,996 | 0.929417 | 0.929678 | 0.929959 |
99,997 | 0.931791 | 0.929678 | 0.932353 |
99,998 | 0.932966 | 0.931992 | 0.933562 |
99,999 | 0.934044 | 0.931992 | 0.934650 |
Total: 100,000 |
Abnormal Data Point | Abnormal Data Values | Abnormal Data Point | Abnormal Data Values |
---|---|---|---|
6 | 11.8750 | 99,199 | 15.6875 |
311 | 3.9375 | 99,200 | 15.5000 |
312 | 3.5000 | 99,201 | 15.5625 |
313 | 3.0625 | 99,202 | 15.3125 |
314 | 2.9375 | 99,203 | 14.5625 |
… | … | … | … |
Total: 4061 |
Data Point | Measurement Point 1 Input Data | Actual Anomaly Data for Measurement Point 2 | Measurement Point 3 Input Data | Measurement Point 2 ATT-BiLSTM Repair Data |
---|---|---|---|---|
6 | 12.2500 | 11.8750 | 11.6875 | 12.450481 |
311 | 3.8750 | 3.9375 | 2.3125 | 4.201364 |
312 | 3.5000 | 3.5000 | 2.0625 | 3.868597 |
313 | 3.1875 | 3.0625 | 1.6875 | 3.577300 |
314 | 2.8750 | 2.9375 | 1.3750 | 3.414109 |
… | … | … | … | … |
99,199 | 16.1250 | 15.6875 | 15.8750 | 16.220303 |
99,200 | 15.8750 | 15.5000 | 15.8750 | 16.057016 |
99,201 | 15.6250 | 15.5625 | 15.5625 | 15.795417 |
99,202 | 14.9375 | 15.3125 | 15.3750 | 15.320224 |
99,203 | 13.9375 | 14.5625 | 15.0000 | 14.631486 |
Total: 4061 |
Data Point | Measurement Point 1 Input Data | Measurement Point 3 Input Data | Measurement Point 2 SVR Restoration Data |
---|---|---|---|
6 | 12.2500 | 11.6875 | 12.414220 |
311 | 3.8750 | 2.3125 | 3.614474 |
312 | 3.5000 | 2.0625 | 2.768712 |
313 | 3.1875 | 1.6875 | 2.438040 |
314 | 2.8750 | 1.3750 | 2.078867 |
… | … | … | … |
99,199 | 16.1250 | 15.8750 | 16.027738 |
99,200 | 15.8750 | 15.8750 | 15.763535 |
99,201 | 15.6250 | 15.5625 | 15.206753 |
99,202 | 14.9375 | 15.3750 | 14.994968 |
99,203 | 13.9375 | 15.0000 | 14.723242 |
Total: 4061 |
Data Type | Deflection Monitoring Point | Point1_Data | Point2_Data | Point3_Data |
---|---|---|---|---|
Original anomaly data | Point1_data | 1.000000 | 0.962409 | 0.954059 |
Point2_data | 0.962409 | 1.000000 | 0.960798 | |
Point3_data | 0.954059 | 0.960798 | 1.000000 | |
Anomalous data repaired by ATT-BiLSTM | Point1_data | 1.000000 | 0.989927 | 0.954059 |
Point2_data | 0.989927 | 1.000000 | 0.985518 | |
Point3_data | 0.954059 | 0.985518 | 1.000000 | |
Anomalous data repaired by SVR | Point1_data | 1.000000 | 0.985372 | 0.954059 |
Point2_data | 0.985372 | 1.000000 | 0.985186 | |
Point3_data | 0.954059 | 0.985186 | 1.000000 |
Data Type | Deflection Monitoring Point | Point1_Data | Point2_Data | Point3_Data |
---|---|---|---|---|
All original data | Point1_data | 1.000000 | 0.983706 | 0.978996 |
Point2_data | 0.983706 | 1.000000 | 0.982825 | |
Point3_data | 0.978996 | 0.982825 | 1.000000 | |
All data after ATT-BiLSTM repair | Point1_data | 1.000000 | 0.986342 | 0.978996 |
Point2_data | 0.986342 | 1.000000 | 0.985148 | |
Point3_data | 0.978996 | 0.985148 | 1.000000 | |
All data after SVR repair | Point1_data | 1.000000 | 0.984546 | 0.978996 |
Point2_data | 0.984546 | 1.000000 | 0.983359 | |
Point3_data | 0.978996 | 0.983359 | 1.000000 |
Share and Cite
Chen, Y.; Liu, R.; Wang, J.; Pan, F.; Lian, F.; Cheng, H. Research on ATT-BiLSTM-Based Restoration Method for Deflection Monitoring Data of a Steel Truss Bridge. Appl. Sci. 2025, 15, 8622. https://doi.org/10.3390/app15158622