Article

Recursive Time Series Prediction Modeling of Long-Term Trends in Surface Settlement During Railway Tunnel Construction

School of Civil Engineering, Central South University, Changsha 410000, China
* Author to whom correspondence should be addressed.
CivilEng 2025, 6(2), 19; https://doi.org/10.3390/civileng6020019
Submission received: 17 December 2024 / Revised: 10 February 2025 / Accepted: 1 April 2025 / Published: 3 April 2025

Abstract

The surface settlement of railroad tunnels is dynamically updated as construction progresses and exhibits complex nonlinear characteristics, and the accuracy of the nonlinear regression fitting method currently used on site for prediction needs to be improved. To prevent surface settlement and surrounding rock collapse during railroad tunnel construction, while also ensuring the safety of the tunnel and existing structures, we propose a recursive prediction model for the long-term trend of surface settlement utilizing singular spectrum analysis (SSA), improved sand cat swarm optimization (ISCSO), and a kernel extreme learning machine (KELM). First, SSA decomposition, which adaptively decomposes one-dimensional nonlinear time series, is used to decompose and reorganize the early surface settlement data. The dynamic sliding window method is then introduced to construct the prediction dataset, which is trained using the KELM, with ISCSO employed to optimize the key KELM parameters; the long-term trend curves of surface settlement are obtained through recursive time series prediction. The superiority and effectiveness of ISCSO and of the model are verified through numerical experiments and simulation experiments based on an engineering case, providing a reference for the early warning and control of surface settlement during the construction of similar tunnels.

1. Introduction

In recent years, China’s railroad construction has ventured deep into arduous and dangerous areas characterized by intertwined mountains and complex terrain with many tunnels. On-site construction faces harsh environments and complex geological conditions [1]. Construction disturbances may induce surface settlement above the tunnel [2], posing risks such as the collapse of the ground and surrounding rock, which can easily lead to safety accidents [3]. Therefore, it is particularly important to excel in predicting surface settlement during railroad tunnel construction to ensure the safety of the tunnel and existing structures [4].
There are various research methods for predicting surface settlement in tunnels. Traditional methods include the empirical formula, random medium theory [5], numerical simulation [6], and elastic strain methods [7]; the empirical formula, numerical simulation, and random medium theory methods are used most often. As multidisciplinary integration develops, machine learning algorithms are increasingly being utilized to predict tunnel surface settlement, both as parameter-based multiple-input single-output prediction and as time series prediction. Accordingly, research on tunnel surface settlement prediction can be divided into five directions.
The first research direction applies the empirical formula method, primarily improving Peck’s formula, to predict surface settlement [8]. For example, Khademian et al. (2017) compared the maximum ground settlement values predicted by Peck’s formula and numerical methods for subway tunnel construction [9]. Ma et al. (2023) considered the theory of strata slip fractures and derived a revised formula for the width coefficient of the tunnel surface settlement trough based on Peck’s formula, validating the revised formula through examples [10].
The second research direction applies the random medium theory to predict surface settlement during tunnel construction. For instance, Xu et al. (2023) improved the traditional random medium theory and constructed a prediction model for surface settlement when a tunnel passes beneath an existing tunnel, addressing the issue of large prediction errors [11]. Shang et al. (2024) proposed an enhanced surface subsidence prediction model based on the stochastic medium theory, which predicts surface deformation caused by tunnel passage [12].
The third research direction applies numerical simulations, mainly predicting surface settlement through finite element analysis. For example, Ngoc et al. (2022) used finite element software to study the influence of tunnel distance on settlement troughs and predict troughs [13]. Singh et al. (2018) developed a finite element numerical model to analyze the influence of geotechnical conditions and tunnel parameters on surface settlement [14]. Maroof et al. (2024) used a retrospective analysis to predict surface settlement during urban railway tunnel construction and compared the predicted results using finite element analysis software [15]. Ercelebi (2010) combined finite element models with semi-theoretical predictions for short-term surface subsidence [16]. Fattah et al. (2012) studied the shape of settlement troughs caused by tunnel excavation in viscous foundations via finite element analysis [17].
The fourth direction involves parameter-based multiple-input single-output prediction using machine learning. This approach takes geological conditions, tunnel design parameters, and construction parameters during tunnel excavation as inputs and uses surface settlement values as outputs. For example, Hu et al. (2023) constructed a hybrid prediction model combining particle swarm optimization (PSO) and backpropagation (BP) neural networks to forecast tunnel surface settlement, leveraging the function approximation capabilities of BP neural networks and thereby providing theoretical support for the safety management of rectangular jacked tunnels crossing existing highways [18]. Ambrozic and Turk (2003) used an artificial neural network (ANN) to predict surface subsidence above mining tunnels [19]. Kumar et al. (2021) used Vanilla and Stacked Long Short-Term Memory (LSTM) networks to predict surface subsidence during underground mining excavation, utilizing monitoring data gathered by a modified Persistent Scatterer Interferometric Synthetic Aperture Radar (PSInSAR) [20]. Bai et al. (2021) constructed a surface settlement prediction model utilizing differential evolution (DE) and support vector regression (SVR) by integrating the characteristics of surface settlement data from shield construction tunnels [21].
The fifth direction involves surface settlement prediction over time using machine learning techniques. For example, Wang et al. (2021) proposed a dynamic prediction method for surface settlement in tunnel entrance sections utilizing GWO-OSELM, considering the nonlinear dynamic variations in surface settlement monitoring values. This provides a new approach for the long-term monitoring of surface settlement in mountainous tunnels [22]. Yin et al. (2024) proposed a deep learning-based method for predicting maximum surface settlement, addressing the challenges existing prediction methods face in simultaneously analyzing nonlinear feature relationships and bidirectional time series data for shield tunnel surface settlement [23].
The application of existing surface settlement prediction methods has achieved some results, but they still have some shortcomings:
  • Peck’s formula does not apply to shallow-buried tunnels [24], and it is more sensitive to geological conditions, making parameter determination difficult and greatly affecting prediction accuracy in areas with complex and variable geological conditions.
  • The random medium theory can be applied to shallow-buried tunnels [25], but its application in engineering is hindered by the complexity of its calculation methods [26].
  • Numerical simulation methods mainly study the settlement trough of the tunnel section surface settlement and cannot predict the long-term trend of surface settlement.
  • Machine learning multi-input predictions based on static geological conditions, construction parameters, and other data all fail to capture the dynamic changes in the construction process. Additionally, the currently used machine learning algorithms have deficiencies. For example, BP neural networks are slow, require a large amount of data for training, and are prone to overfitting [27]. SVR only applies to small data samples, and selecting kernel functions is challenging [28]. The LSTM model is prone to overfitting and contains multiple parameters, which results in a long training time.
  • The current time series surface settlement prediction mostly focuses on data fitting and fails to provide accurate predictions, specifically in predicting future long-term surface settlement trends based on previously recorded surface settlement data.
Based on the above analysis, this study combines SSA, which adaptively decomposes the one-dimensional nonlinear surface settlement time series into simple feature components, with a dynamic sliding window algorithm that enhances the temporal sensitivity of the prediction model. A KELM, which has excellent learning ability and fitting adaptability, is used for training and prediction, and ISCSO, which enhances global search ability, local optimization accuracy, and convergence speed, is used for parameter optimization, addressing the difficulty of determining the KELM parameters. The resulting SSA-ISCSO-KELM model can accurately reflect adverse long-term surface settlement trends during tunnel construction, provide warning information, and achieve long-term recursive time series prediction of railway tunnel surface settlement.

2. Methods

2.1. Surface Settlement Data Pre-Processing

2.1.1. Cubic Hermite Interpolation

According to the “Technical Specifications for Monitoring and Measurement of Railway Tunnel” (Q/CR 9218-2024) [29], the monitoring frequency of surface settlement in railway tunnels depends on the distance between the monitoring section and the tunnel face, as shown in Table 1. Consequently, the tunnel surface settlement monitoring intervals are non-isochronous, and at the same time, there may be missing monitoring data during the monitoring process. To eliminate the effects of non-isochronous and missing interval data on the accuracy of model predictions, this study used the cubic Hermite interpolation method to interpolate the data with irregular time intervals.
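The resampling step can be sketched briefly. The study itself was implemented in MATLAB R2021b; the snippet below is an illustrative Python equivalent using a shape-preserving cubic Hermite (PCHIP) interpolant, and the monitoring days and settlement values are hypothetical.

import numpy as np
from scipy.interpolate import PchipInterpolator  # piecewise cubic Hermite interpolation

# Hypothetical non-isochronous monitoring record: (elapsed day, cumulative settlement in mm)
days = np.array([0.0, 0.5, 1.0, 2.0, 3.5, 5.0, 7.0, 10.0])
settlement = np.array([0.0, 0.6, 1.1, 2.0, 3.2, 4.1, 5.0, 6.2])

# Fit the interpolant and resample at a fixed 0.5-day step to obtain an equal-interval series
interp = PchipInterpolator(days, settlement)
t_uniform = np.arange(days[0], days[-1] + 1e-9, 0.5)
settlement_uniform = interp(t_uniform)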

2.1.2. SSA

SSA represents a specific case of principal component analysis [30], wherein singular value decomposition is applied to the trajectory matrix of a time series, subsequently extracting the principal components of the signal by differentiating between the signal and noise within the singular spectrum, thereby achieving denoising. It is particularly suitable for analyzing one-dimensional time series and can extract trend, period, and residual terms from the time series [31]. Due to its lack of assumptions regarding parameter models and stability conditions, it is more adaptable when dealing with time series data with nonlinear features. Singular spectrum analysis mainly includes three steps: embedding, decomposition, and group reconstruction.
1. Embedding
For a one-dimensional time series $X = (x_1, x_2, \ldots, x_N)$, where $N$ is the length of the series, an appropriate window length $L$ is first selected. The data are then arranged using a sliding-window lag approach and mapped into a trajectory matrix with $L$ rows and $K$ columns, as shown in Equation (1), where $K = N - L + 1$. The general rule is to set $L$ to approximately $N/3$.

$X = \begin{bmatrix} x_1 & x_2 & \cdots & x_K \\ x_2 & x_3 & \cdots & x_{K+1} \\ \vdots & \vdots & \ddots & \vdots \\ x_L & x_{L+1} & \cdots & x_N \end{bmatrix}$ (1)
2. Decomposition
The decomposition method employed in SSA is singular value decomposition. First, the covariance matrix $S = X X^{\mathrm{T}}$ is constructed from the trajectory matrix $X$, and the eigenvalues $\lambda_m$ of $S$ and their corresponding eigenvectors $U_m$ are calculated. Singular value decomposition then yields $L$ components $X_m$, each with simple features, as shown in Equations (2) and (3).

$V_m = X^{\mathrm{T}} U_m / \sqrt{\lambda_m}, \quad m = 1, 2, \ldots, L$ (2)

$X = \sum_{m=1}^{L} X_m = \sum_{m=1}^{L} \sqrt{\lambda_m}\, U_m V_m^{\mathrm{T}}$ (3)

where $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_L \ge 0$, and $U$ and $V$ are the left and right singular vector matrices, both of which are unit orthogonal.
3. Group reconstruction
The surface settlement curve has nonlinear characteristics and large random fluctuations, so the routinely used grouping and reconstruction approach cannot meet the requirements of recursive time series prediction of surface settlement. Therefore, based on the $L$ simple-feature components $X_m$ obtained from the SSA decomposition and the trend of the original surface settlement curve, this study accounted for the random fluctuations in surface settlement, fitted the settlement trend, and regrouped the components into corresponding trend, half-period, and residual terms. Each grouped component was then predicted separately, ensuring a more accurate prediction of the long-term surface settlement trend.
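A minimal SSA sketch is given below (Python, for illustration only): it embeds the series, performs the singular value decomposition of Equations (2) and (3), and recovers each elementary component by diagonal averaging. The synthetic series and the component indices used for grouping mirror the example in Section 4.2.2 but are otherwise placeholders.

import numpy as np

def ssa_decompose(x, L):
    """Basic SSA: embed, decompose by SVD, and rebuild L elementary series by diagonal averaging."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    K = N - L + 1
    # Trajectory matrix (L x K): column j is the lagged window x[j], ..., x[j+L-1]
    X = np.column_stack([x[j:j + L] for j in range(K)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for m in range(len(s)):
        Xm = s[m] * np.outer(U[:, m], Vt[m])                       # rank-one elementary matrix
        # Diagonal averaging (Hankelization) maps Xm back to a length-N series
        comps.append(np.array([Xm[::-1, :].diagonal(k).mean() for k in range(-L + 1, K)]))
    return np.array(comps)

# Hypothetical monotone settlement series of 20 equal-interval points, window L ≈ N/3 = 7
series = np.cumsum(0.2 + 0.3 * np.random.rand(20))
comps = ssa_decompose(series, L=7)
trend = comps[0:5].sum(axis=0)        # grouping of components 1-5 as the trend term
half_period = comps[5]                # component 6 as the half-period term
residual = comps[6]                   # component 7 as the residual term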

2.1.3. Sliding Window Method

As shown in Figure 1, the blue shaded area denotes the sliding window interval, while the orange area indicates the predicted values obtained through the rolling forecasting method. Tunnel surface settlement monitoring data were dynamically updated as construction progressed, ensuring the model remained timely and dynamic. This study introduced the sliding window method, which allowed new settlement sample data monitored on-site to be incorporated into the model through a rolling mechanism, enabling recursive time series updates and predictions of surface settlement data.
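The sliding-window mechanism can be sketched as follows (Python, illustrative; the window length and forecast horizon are placeholders). Each training sample maps a fixed-length window of past values to the next value, and the recursive forecast slides the window forward by feeding every new prediction back in as the latest observation.

import numpy as np

def make_windows(series, window):
    """Turn a 1-D series into (input window, next value) training pairs."""
    series = np.asarray(series, dtype=float)
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X, y

def recursive_forecast(predict_fn, history, window, steps):
    """Roll the window forward recursively, feeding each prediction back as the newest observation."""
    buf = list(np.asarray(history, dtype=float)[-window:])
    out = []
    for _ in range(steps):
        nxt = float(predict_fn(np.array(buf).reshape(1, -1)))
        out.append(nxt)
        buf = buf[1:] + [nxt]          # slide the window one step forward
    return np.array(out)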

2.2. KELM

ELM, a single hidden layer feedforward neural network proposed by Huang et al. (2006) [32], boasts advantages such as fast training speed and strong generalization abilities. In 2012, Huang et al. introduced the kernel function into ELM to establish the KELM model [33], effectively improving the model’s predictive performance and stability. The output function and weight (β) formula are as follows:
$f(x) = \left[ k(x, x_1), \ldots, k(x, x_N) \right] \left( \dfrac{I}{C} + \Omega \right)^{-1} Y$ (4)

$\beta = H^{\mathrm{T}} \left( \dfrac{I}{C} + H H^{\mathrm{T}} \right)^{-1} Y$ (5)
This study employed the radial basis function kernel, known for its robust generalization ability, represented as follows:
$k(x_i, x_j) = \exp\left( -\dfrac{\| x_i - x_j \|^2}{\sigma^2} \right)$ (6)
where σ is the kernel function parameter.
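A compact KELM sketch following Equations (4)–(6) is shown below (Python, illustrative, not the authors' MATLAB implementation). The closed-form solve corresponds to Equation (4), with the kernel matrix Ω playing the role of HH^T, and C and σ are the two parameters that ISCSO later optimizes.

import numpy as np

def rbf_kernel(A, B, sigma):
    """k(a, b) = exp(-||a - b||^2 / sigma^2) for all row pairs of A and B (Equation (6))."""
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-np.maximum(d2, 0.0) / sigma**2)

class KELM:
    """Kernel extreme learning machine with closed-form training (Equation (4))."""
    def __init__(self, C=100.0, sigma=1.0):
        self.C, self.sigma = C, sigma
    def fit(self, X, y):
        self.X = np.asarray(X, dtype=float)
        Omega = rbf_kernel(self.X, self.X, self.sigma)              # kernel matrix
        n = self.X.shape[0]
        # alpha = (I/C + Omega)^(-1) y, so that f(x) = k(x, X) @ alpha
        self.alpha = np.linalg.solve(np.eye(n) / self.C + Omega, np.asarray(y, dtype=float))
        return self
    def predict(self, Xq):
        return rbf_kernel(np.asarray(Xq, dtype=float), self.X, self.sigma) @ self.alpha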

2.3. ISCSO

In practical applications of KELM, the regularization coefficient ( C ) and kernel function parameter ( σ ) significantly impact the prediction results and are difficult to determine. This study therefore introduced the sand cat swarm optimization (SCSO) algorithm [34], which is suited to complex optimization problems, and improved it by enhancing its global search ability, local optimization accuracy, and convergence speed. The resulting ISCSO algorithm is used to determine the parameters C and σ, improving the accuracy and stability of the KELM predictions.

2.3.1. SCSO Principle

1. Search for the prey stage
The search for prey stage, i.e., the global search process, is where the algorithm explores the solution space to find potential optimal solutions through stochasticity and exploration. The inspiration for SCSO comes from sand cats’ ability to search and prey on their victims by perceiving noises below 2 kHz. To simulate the auditory characteristics of sand cats, the algorithm model assumes that the sand cats’ auditory sensitivity decreases linearly to zero with the number of iterations (Iteration). This decreasing search range helps the algorithm to search widely in the solution space initially and gradually focus on the neighborhood of the potentially optimal solution, as shown in Equation (7).
$r_G = S_M - \dfrac{S_M \cdot \mathrm{Iter}_c}{\mathrm{Iter}_{\max}}$ (7)
During the sand cats’ prey search process, to avoid each position update falling into a local optimum, the sensitivity ( r ) of each sand cat varies based on its auditory characteristics, ensuring randomization of search directions. This is expressed as follows:
$r = r_G \cdot \mathrm{rand}(0, 1)$ (8)
Each sand cat updates its position based on the optimal position within the group, its own position, and its sensitivity ( r ). The new position is randomly determined somewhere between the best position of the sand cat group and the cat’s current position. This process gradually guides the algorithm toward the potential optimal solution while maintaining a certain degree of randomness to prevent premature convergence to a local optimum. This is illustrated in Equation (9).
$P(t+1) = r \cdot \left( P_{best}(t) - \mathrm{rand}(0, 1) \cdot P(t) \right)$ (9)
2. Prey stage
The prey stage, i.e., the local optimization process, involves the algorithm finding the optimal solution by fine-tuning the existing solution. In a sand cat’s attack phase, it determines a random position between the optimal position within the group and its own position ( P ). The sand cat then moves closer to the prey, ensuring that the algorithm approximates the optimal solution during the attack phase while maintaining a certain degree of randomness to avoid falling into a local optimum. Assuming that the sand cat’s sensitivity range forms a circle, it uses the roulette wheel algorithm to randomly select an angle θ between 0 and 360 degrees for the attack. This mechanism helps the algorithm to gradually approach and find the optimal solution during the attack phase. The formula is as follows:
$P_{rnd} = \left| \mathrm{rand}(0, 1) \cdot P_{best}(t) - P(t) \right|$ (10)

$P(t+1) = P_{best}(t) - r \cdot P_{rnd} \cdot \cos(\theta)$ (11)
3. Search and prey stage transition
The transition between the search and prey stages balances global search and local optimization, ensuring that the algorithm gathers enough information before smoothly moving into the local optimization phase and finding the globally optimal solution. The transition employs an adaptive strategy determined by the parameter $R$: when $|R| \le 1$, the sand cat is in the attack phase; otherwise, it transitions to the search phase, as shown in Equations (12) and (13).
$R = 2 \cdot r_G \cdot \mathrm{rand}(0, 1) - r_G$ (12)

$P(t+1) = \begin{cases} r \cdot \left( P_{best}(t) - \mathrm{rand}(0, 1) \cdot P(t) \right), & |R| > 1 \\ P_{best}(t) - r \cdot P_{rnd} \cdot \cos(\theta), & |R| \le 1 \end{cases}$ (13)
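As a concrete reference for Equations (7)–(13), a baseline SCSO loop might look like the following sketch (Python, illustrative; the population size, iteration count, and test objective are placeholders).

import numpy as np

def scso(obj, lb, ub, dim, pop=30, iters=200, s_m=2.0):
    """Baseline sand cat swarm optimization; minimizes obj over the box [lb, ub]^dim."""
    P = lb + (ub - lb) * np.random.rand(pop, dim)               # random initial positions
    best = min(P, key=obj).copy()
    for t in range(iters):
        r_g = s_m - s_m * t / iters                              # Eq. (7): linearly decreasing sensitivity
        for i in range(pop):
            r = r_g * np.random.rand()                           # Eq. (8)
            R = 2.0 * r_g * np.random.rand() - r_g               # Eq. (12)
            if abs(R) > 1:                                       # search phase, Eq. (9)
                P[i] = r * (best - np.random.rand(dim) * P[i])
            else:                                                # attack phase, Eqs. (10)-(11)
                p_rnd = np.abs(np.random.rand(dim) * best - P[i])
                theta = 2.0 * np.pi * np.random.rand()           # roulette-style random angle
                P[i] = best - r * p_rnd * np.cos(theta)
            P[i] = np.clip(P[i], lb, ub)
            if obj(P[i]) < obj(best):
                best = P[i].copy()
    return best

# Example: best = scso(lambda x: np.sum(x**2), lb=-10.0, ub=10.0, dim=5)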

2.3.2. ISCSO

1. Initialization of Optimizing Latin Hypercube (OLM) with the Maximum–Minimum Idea
In the SCSO, the initial position of each sand cat is randomly selected within the defined boundary. To address the issue of weak global search capability due to the random distribution of sand cat positions, this study introduced the OLM method to optimize the initial distribution of the sand cat population. OLM maximizes the minimum distance between each initialized sand cat position and the rest, optimizing their initial distribution. This ensures a uniform and discrete distribution of sand cats within the defined boundary, preventing aggregation. The optimization effect is shown in Figure 2.
2. Adaptive adjustment of sand cat sensitivity
Switching between global search and local optimization, i.e., between the search and attack phases, is crucial in optimization algorithms. In SCSO, the parameter $R$ serves as the conversion mechanism between the two phases. $R$ is determined by the sand cat's auditory sensitivity $r_G$ and fluctuates within the range $[-2 r_G, 2 r_G]$, where $r_G$ decreases linearly as the number of iterations increases, as indicated in Equation (7). This does not align with the natural behavior of sand cats during collaborative hunting. Therefore, this study proposes a nonlinear adaptive formula for the sand cat sensitivity, as shown in Equation (14):
$r_G = 2 \left( 1 - \left( \dfrac{\mathrm{Iter}_c}{\mathrm{Iter}_{\max}} \right)^{k} \right)^{m}$ (14)
where k and m are adjustment coefficients, with both taking values ≥ 1.
Equation (14) keeps the sensitivity high in early iterations to maximize population diversity and decays it faster in later iterations to accelerate local optimization, resulting in a more balanced and stable switch between the search and attack phases, as shown in Figure 3.
3. Golden sine operator improved search strategy
SCSO lacks thoroughness in its search strategy, leading to low solution accuracy and slow convergence in later iterations. In this study, by incorporating the golden sine algorithm (GSA) proposed by Zhang and Wang (2020) [35], which combines the sine function with the golden ratio, the sand cat algorithm was able to search in areas that yield good results, significantly improving the algorithm’s convergence speed and optimization accuracy. The improved search strategy formula is as follows:
$P(t+1) = P(t) \cdot \left| \sin(r_1) \right| + r_2 \cdot \sin(r_1) \cdot \left| x_1 \cdot P_{best}(t) - x_2 \cdot P(t) \right|$ (15)
where $r_1$ is a random number in $[0, 2\pi]$; $r_2$ is a random number in $[0, \pi]$; and $x_1$ and $x_2$ are coefficients obtained via the golden ratio that guide the sand cat's position update toward the prey.
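The three modifications can be viewed as drop-in replacements for the corresponding steps of the baseline loop above. The sketch below (Python, illustrative) shows a maximin-optimized Latin hypercube initializer, the nonlinear sensitivity schedule of Equation (14), and the golden sine update of Equation (15); the golden-ratio initialization of x1 and x2 follows a common Gold-SA convention and is an assumption, since the paper does not spell it out.

import numpy as np

TAU = (np.sqrt(5) - 1) / 2                                       # golden ratio coefficient

def maximin_lhs(pop, dim, lb, ub, candidates=50):
    """Latin hypercube initialization, keeping the design with the largest minimum pairwise distance (OLM idea)."""
    best, best_d = None, -np.inf
    for _ in range(candidates):
        sample = (np.argsort(np.random.rand(pop, dim), axis=0) + np.random.rand(pop, dim)) / pop
        d = min(np.linalg.norm(sample[i] - sample[j])
                for i in range(pop) for j in range(i + 1, pop))
        if d > best_d:
            best, best_d = sample, d
    return lb + best * (ub - lb)

def adaptive_rg(t, iters, k=3.7, m=33.33):
    """Nonlinear sensitivity of Equation (14): stays high early (diversity), decays quickly later."""
    return 2.0 * (1.0 - (t / iters) ** k) ** m

def golden_sine_step(pos, best, x1, x2):
    """Golden sine search update of Equation (15)."""
    r1 = 2.0 * np.pi * np.random.rand()
    r2 = np.pi * np.random.rand()
    return pos * np.abs(np.sin(r1)) + r2 * np.sin(r1) * np.abs(x1 * best - x2 * pos)

# Assumed golden-ratio initialization of the position-update coefficients (see lead-in):
x1 = -np.pi + (1 - TAU) * 2 * np.pi
x2 = -np.pi + TAU * 2 * np.pi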

3. Surface Settlement Prediction Model Based on SSA-ISCSO-KELM

3.1. Indicators for Evaluating Projected Results

To evaluate the prediction accuracy of the models, the Mean Absolute Error (MAE), Mean Squared Error (MSE), Root Mean Squared Error (RMSE), Mean Absolute Percentage Error (MAPE), and Mean Bias Error (MBE) were selected as the evaluation indexes, as defined in Equations (16)–(20). MBE may be positive or negative; the smaller the values of MAE, MSE, RMSE, and MAPE, and the smaller the absolute value of MBE, the better the prediction performance of the model.
$\mathrm{MAE} = \dfrac{1}{n} \sum_{i=1}^{n} \left| x_i - \hat{x}_i \right|$ (16)

$\mathrm{MSE} = \dfrac{1}{n} \sum_{i=1}^{n} \left( x_i - \hat{x}_i \right)^2$ (17)

$\mathrm{RMSE} = \sqrt{ \dfrac{1}{n} \sum_{i=1}^{n} \left( x_i - \hat{x}_i \right)^2 }$ (18)

$\mathrm{MAPE} = \dfrac{1}{n} \sum_{i=1}^{n} \left| \dfrac{x_i - \hat{x}_i}{x_i} \right| \times 100\%$ (19)

$\mathrm{MBE} = \dfrac{1}{n} \sum_{i=1}^{n} \left( x_i - \hat{x}_i \right)$ (20)
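For completeness, the five indicators of Equations (16)–(20) can be computed directly from the measured and predicted series, for example as in the following sketch (Python, illustrative).

import numpy as np

def evaluate(y_true, y_pred):
    """Evaluation indicators of Equations (16)-(20); MAPE assumes no zero measurements."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_true - y_pred
    mae = np.mean(np.abs(err))
    mse = np.mean(err ** 2)
    rmse = np.sqrt(mse)
    mape = np.mean(np.abs(err / y_true)) * 100.0      # in percent
    mbe = np.mean(err)
    return {"MAE": mae, "MSE": mse, "RMSE": rmse, "MAPE": mape, "MBE": mbe}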

3.2. Overview of Predictive Models

This study was implemented in MATLAB R2021b. The construction of the prediction model entails data pre-processing, parameter optimization, data prediction, and model accuracy evaluation, as shown in Figure 4.
(1) Data Pre-processing
① Cubic Hermite interpolation is performed on the non-isochronous surface settlement data to obtain isochronous data.
② SSA decomposition and reorganization are used to obtain the trend, half-cycle, and residual terms.
③ The prediction dataset for different terms is constructed using the sliding window method.
(2) Parameter Optimization and Data Prediction
① The initial parameters of the ISCSO algorithm are set, and the MAE between the recursive prediction results and the true values of the pre-prediction dataset is used as the fitness function for iteratively updating the sand cat positions (a sketch of this fitness evaluation is given after this list). The optimal regularization coefficient ( C ) and kernel function parameter ( σ ) are thereby obtained.
② Predictions are trained and tested using the kernel extreme learning machine (KELM) to obtain the various change values for the later period.
(3) Model Accuracy Evaluation Analysis
① The predicted late surface settlement value is obtained by summing up the various change values.
② The predicted late-stage surface settlement values and the actual values are inserted into Equations (16)–(20) to calculate the evaluation index values.
③ The models’ prediction effectiveness and degree of fit are evaluated, and the reasons for the differences are analyzed. The overall prediction flowchart is shown in Figure 4.
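Tying the sketches above together, the fitness evaluation used in step (2)① can be illustrated as follows (Python, illustrative; it reuses make_windows, KELM, recursive_forecast, evaluate, and scso from the earlier sketches, and the split sizes are placeholders): a candidate pair (C, σ) trains a KELM on the sliding-window pairs of one grouped component, rolls a recursive forecast over a held-out tail of the pre-prediction data, and returns the MAE as the fitness.

def fitness(params, component, window=3, holdout=4):
    """MAE of a recursive forecast over the last `holdout` points for candidate (C, sigma)."""
    C, sigma = params
    train, check = component[:-holdout], component[-holdout:]
    X, y = make_windows(train, window)                            # sliding-window training pairs
    model = KELM(C=C, sigma=sigma).fit(X, y)
    pred = recursive_forecast(model.predict, train, window, len(check))
    return evaluate(check, pred)["MAE"]

# ISCSO minimizes this fitness over (C, sigma); the best pair is then used to retrain the KELM
# on all pre-prediction data of each grouped component and roll the forecast further forward, e.g.
# best_params = scso(lambda p: fitness(p, trend), lb=1e-3, ub=1e3, dim=2)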

4. Simulation Experiments and Analysis

4.1. Performance Analysis of the ISCSO Algorithm

To verify the superior performance of the ISCSO algorithm proposed in this study, SCSO, sparrow search algorithm (SSA), gray wolf optimization algorithm (GWO), and seagull optimization algorithm (SOA) were selected for simulation and analysis.

4.1.1. Test Function Selection

Eight test functions were selected for the simulation experiment, with the basic information shown in Table 2. Among them, F1–F4 are single-peak functions used to test the algorithm’s optimization accuracy, while F5–F8 are multi-peak functions used to test the algorithm’s global optimization ability. The experimental software used is MATLAB R2021b, and the hardware configuration is equipped with an I9-13900H processor and 16 GB of RAM.

4.1.2. Algorithm Parameterization

The maximum number of iterations for all algorithms was 1000, and the population size was 50. The parameter settings for each algorithm are shown in Table 3.

4.1.3. Comparative Analysis of Algorithm Performance

To avoid the randomness in experimental results caused by random errors, each algorithm was independently run 50 times under the same parameter conditions. The average value and standard deviation of the test results were used as the basis for measuring the performance of the algorithms, where the average value reflects the algorithm’s optimization accuracy, and the standard deviation reflects the algorithm’s stability.
As can be seen from Figure 5, for single-peak functions, the ISCSO algorithm significantly outperforms the other algorithms in terms of convergence speed and optimization accuracy; for multiple-peak functions, the ISCSO algorithm exhibits a faster convergence speed than the other algorithms, with an optimization accuracy comparable to that of SSA and other algorithms. In summary, the ISCSO algorithm excels in convergence speed and optimization accuracy compared to the other algorithms.
As can be seen from Table 4, for single-peak functions, ISCSO achieves the best average optimization value, and its standard deviation is only slightly higher than that of SCSO for the F4 function. For multi-peak functions, particularly the F5 and F6 functions, ISCSO achieves the best average optimization value. For the F7 function, it is slightly lower than that of SOA, and for the F8 function, it is slightly lower than that of SSA. ISCSO has the lowest standard deviation for the F5, F6, F7, and F8 functions. In summary, ISCSO outperforms the other algorithms regarding optimization accuracy and stability.

4.2. Performance Analysis of SSA-ISCSO-KELM Prediction Models

4.2.1. Data Acquisition

Railways have stricter standards for both subgrade settlement and tunnel surface settlement. When the expected settlement value exceeds the acceptable range specified, it is necessary to improve the geological conditions through grouting reinforcement before construction. The PS tunnel passes through the lower Jurassic Ziliujing Group strata, with 19% of the surrounding rock being V-class and 78% being IV-class. The tunnel encounters adverse geological conditions, such as soft rock deformation, fracture zones, joints and fissure zones, and bedding-parallel bias. The shallow-buried section of the tunnel entrance features high and steep slopes, long, one-sided slopes, and other unfavorable factors. The risk of deformation and safety hazards during construction increases when the tunnel passes under gullies and existing structures. Therefore, it is necessary to accurately predict and monitor the surface settlement of the tunnel.
The PS tunnel employs the step method for construction on Ⅲ- and Ⅳ-class surrounding rocks and the three-step method for construction on V-class surrounding rock. Surface settlement monitoring points are arranged at intervals of 2–5 m along the testing sections, with each monitoring section having 10 such points, as illustrated in Figure 6. This study took the 855, 860, 865, and 870 monitoring sections at the DK461 mileage in the V-class surrounding rock section as examples, utilizing 40 monitoring points for prediction.

4.2.2. Surface Settlement Data Preprocessing

The monitoring frequency for the PS tunnel follows what is shown in Table 1. From the start of surface settlement monitoring to the end of data extraction, the monitoring values were first processed using the cubic Hermite interpolation method. Taking the DK461-860-DB07 monitoring point as an example, interpolation was performed at 0.5-day intervals to obtain an equal-interval surface settlement curve, as shown in Figure 7.
This study used the first 10 days of surface settlement data, corresponding to 20 data points, as the basis for training and recursively predicting the following 10 days. SSA decomposition was applied to these initial 10 days of data, with the window length L set to approximately 20/3 ≈ 7, yielding seven simple feature components (X_m), as illustrated in Figure 8.
The initial settlement of 860-DB07 generally shows a smooth and incremental upward trend. The trend term, half-periodic term, and residual term are reorganized as follows: the trend term contains components 1, 2, 3, 4, and 5; the half-periodic term contains component 6; and the residual term contains component 7. The reorganization is shown in Figure 9.

4.2.3. Model Training Predictions

This study compared three approaches for surface settlement prediction: the nonlinear regression fitting method commonly used at railway tunnel construction sites, the SSA-ISCSO-KELM model, and the SSA-ISCSO-BIGRU model. The BIGRU model, a commonly used time series prediction model, excels at capturing long-term dependencies in time series data, has high computational efficiency, and has a certain degree of generalization ability.
The parameter settings for both the SSA-ISCSO-KELM and SSA-ISCSO-BIGRU prediction models are provided in Table 5. The kernel function was uniformly selected as the RBF, with the sand cat sensitivity adaptive parameters set as k = 3.7 and m = 33.33, and the sliding window step size set to 3. Each monitoring point was independently run 30 times under the same parameter conditions to avoid any randomness.

4.2.4. Analysis and Comparison of Forecast Results

After training and prediction, the surface settlement curves predicted by the different models were obtained. Two surface settlement monitoring points were randomly selected at each mileage section for the result presentation, as shown in Figure 10. The settlement data were then inserted into Equations (16)–(20) to calculate the evaluation index values for each prediction model, as shown in Table 6.
Table 6 shows that the SSA-ISCSO-KELM prediction model is only slightly inferior to nonlinear regression prediction in the evaluation indicators for 855-DB04 and 860-DB09. Additionally, the evaluation indicators for the 10-day predictions are better than those of the other models.
Nonlinear regression prediction can only respond to the overall movement trend of the data when facing surface settlement data with complex information. It cannot accurately learn the characteristics of the original data or react to the peak fluctuations contained within the data, especially under the complex conditions of frequent fluctuations in surface settlement and uplift. At the same time, it can be observed from Figure 10 that the closer the tunnel surface settlement is to the tunnel face, the greater the settlement growth. However, nonlinear regression prediction cannot accurately respond to the trend of surface settlement in the later stages when the degree of settlement growth is larger.
The SSA-ISCSO-BIGRU model has shown satisfactory performance in the initial stage of surface settlement training. However, the model experienced overfitting in the prediction of future settlement at a later stage and was not able to generalize well to new, unseen data. Prediction errors accumulate over subsequent predictions, leading to the predicted settlement values eventually converging to a constant or a straight line. Additionally, the BIGRU model has numerous parameters to optimize, and the initialization process for parameter optimization is time-consuming. It takes approximately 75 min to perform the first prediction when the population size is 100 and the number of iterations is 200.
The SSA-ISCSO-KELM prediction model, which combines SSA recombination decomposition with the kernel learning method, can more effectively handle surface settlements with nonlinear characteristics. It considers both the surface settlement’s overall motion trend and the influence of random fluctuations. Furthermore, the KELM model has few parameters and is less sensitive to parameter selection. It can perform 30 predictions in just 27 min.
Therefore, the SSA-ISCSO-KELM prediction model can more accurately predict the future long-term trend of surface settlement based on the pre-surface settlement monitoring data. Furthermore, it can recursively predict future settlements by updating real-time data as construction progresses. When the predicted settlement or deformation rate exceeds the control benchmark, specialized construction methods can be adopted in advance, such as using steel supports for initial reinforcement, optimizing excavation through in-cavity grouting to stabilize the surrounding rock, and reinforcing the ground surface through grouting to reduce uneven settlement. This helps prevent uncontrollable settlement during construction before measures are taken, effectively controlling the surface settlement of tunnels and preparing for safe construction.

5. Conclusions

This study proposes a recursive time series prediction model for long-term surface settlement trends based on SSA-ISCSO-KELM to prevent the risk of ground and surrounding rock collapse during railway tunnel construction. The model uses cubic Hermite interpolation and SSA decomposition to handle non-equidistant, missing, and noisy surface settlement data. Improvements were made to SCSO to enhance its global search capability, local optimization accuracy, and convergence speed, and numerical experiments verified the superiority of ISCSO in terms of optimization accuracy and convergence speed. By optimizing the KELM parameters through ISCSO, recursive surface settlement prediction is achieved. A simulation example based on the PS tunnel shows that the model can accurately predict and reflect the long-term trend of surface settlement. The model can be extended to deformation research in projects with similar geological conditions, providing a reference for their prediction and control work. However, the model also has limitations, as some parameters, such as the size of the search boundary and the selection of the decomposition and recombination components, need to be determined through testing. In future work, we will systematically test the size of the search boundaries for different projects to ensure that the model works effectively in other projects. The impact of different decomposition and recombination choices on the model's performance will also be explored, and the optimal combination will be identified.

Author Contributions

Conceptualization, F.Z. and Q.W.; methodology, Q.W.; software, Q.W.; validation, F.Z., Q.W. and Z.W.; formal analysis, Z.W.; investigation, J.C.; resources, Q.W.; data curation, D.J.; writing—original draft preparation, Q.W.; writing—review and editing, F.Z.; visualization, L.X.; supervision, F.Z.; project administration, Q.W.; funding acquisition, F.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by China Railway 18th Bureau Group Co., Ltd. (Grant No. CR18Gk-24-Y-t-16), with special gratitude extended to Youbo Wei for his invaluable support and assistance.

Data Availability Statement

Processed datasets are available from the corresponding author (Q.W.) upon reasonable request due to privacy restrictions imposed by collaborating railway authorities.

Acknowledgments

The authors gratefully acknowledge their advisors and other researchers in the division for their help in the writing of this study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zhang, F.; Wu, Z.; Wang, X.; He, Y. Safety risk assessment of railway tunnel construction by drilling and blasting method in complex and dangerous areas. J. Railw. Sci. Eng. 2023, 20, 1891–1901. [Google Scholar]
  2. Wang, Z.; Wong, R.C.K.; Li, S.; Qiao, L. Finite element analysis of long-term surface settlement above a shallow tunnel in soft ground. Tunn. Undergr. Space Technol. 2012, 30, 85–92. [Google Scholar]
  3. Zhang, F.; Xu, W.; Wang, M.; Luo, L. Risk assessment and trend research of railway tunnel construction safety. J. Saf. Environ. 2024, 24, 2473–2482. [Google Scholar]
  4. Saeed, H.; Uygar, E. Equation for Maximum Ground Surface Settlement due to Bored Tunnelling in Cohesive and Cohesionless Soils Obtained by Numerical Simulations. Arab. J. Sci. Eng. 2022, 47, 5139–5165. [Google Scholar] [CrossRef]
  5. Litwiniszyn, J. The Theories and Model Research of Movements of Ground. Available online: https://www.researchgate.net/publication/285379913_The_theories_and_model_research_of_movements_of_ground (accessed on 1 December 2024).
  6. Swoboda, G.; Abu-Krisha, A. Three-dimensional numerical modelling for TBM tunnelling in consolidated clay. Tunn. Undergr. Space Technol. 1999, 14, 327–333. [Google Scholar] [CrossRef]
  7. Sagaseta, C. Analysis of undraind soil deformation due to ground loss. Géotechnique 1987, 37, 301–320. [Google Scholar] [CrossRef]
  8. Peck, R.B. Deep Excavation and tunnelling in soft ground. In Proceedings of the 7th International Conference on Soil Mechanics and Foundation Engineering, Mexico City, Mexico, 25–29 August 1969; Volume 4, pp. 225–290. [Google Scholar]
  9. Khademian, A.; Abdollahipour, H.; Bagherpour, R.; Faramarzi, L. Model uncertainty of various settlement estimation methods in shallow tunnels excavation; case study: Qom subway tunnel. J. Afr. Earth Sci. 2017, 134, 658–664. [Google Scholar] [CrossRef]
  10. Ma, Z.; Xie, X.; Jia, H.; Zhao, J.; He, S.; Wang, X. Prediction and Application of Surface Settlement of Shallow Buried Tunnels Taking into Account Strata Slip Cracks. Processes 2023, 11, 1575. [Google Scholar] [CrossRef]
  11. Xu, Q.; Zhu, Y.; Lei, S.; Liu, Y.; Zhao, W.; Fang, Z.; Wang, C.; Xu, S. Improved stochastic medium theoretical model for predicting deformation of existing tunnels and strata caused by excavation of new undercrossing tunnels. Chin. J. Geotech. Eng. 2023, 45, 301–309. [Google Scholar]
  12. Shang, X.; Miao, S.; Wang, H.; Yang, P.; Xia, D. A prediction model for surface settlement during the construction of variable cross-section tunnels under existing structures based on stochastic medium theory. Tunn. Undergr. Space Technol. 2024, 155, 106177. [Google Scholar] [CrossRef]
  13. Thai, D.N.; Kien, D.V.; Vi, P.V.; Quang, N.V. Prediction of Surface Settlement Due to Twin Tunnel Construction in Soft Ground of Hanoi Metro Line 03. Int. J. GEOMATE 2022, 22, 66–72. [Google Scholar]
  14. Singh, D.K.; Aromal, V.; Mandal, A. Prediction of surface settlements in subway tunnels by regression analysis. Int. J. Geotech. Eng. 2020, 14, 836–842. [Google Scholar] [CrossRef]
  15. Maroof, A.; Mohammadzadeh, S.D.; Karballaeezadeh, N.; Bajgiran, K.S.; Mosavi, A.; Felde, I. Investigation of Empirical and Analytical Methods Accuracy for Surface Settlement Prediction in Train Tunnel Excavation Projects. Acta Polytech. Hung. 2024, 21, 167–186. [Google Scholar] [CrossRef]
  16. Ercelebi, S.G.; Copur, H.; Ocak, I. Surface settlement predictions for Istanbul Metro tunnels excavated by EPB-TBM. Environ. Earth Sci. 2010, 62, 357–365. [Google Scholar] [CrossRef]
  17. Fattah, M.Y.; Shlash, K.T.; Salim, N.M. Prediction of settlement trough induced by tunneling in cohesive ground. Acta Geotech. 2012, 8, 167–179. [Google Scholar] [CrossRef]
  18. Hu, D.; Hu, Y.; Yi, S.; Liang, X.; Li, Y.; Yang, X. Prediction method of surface settlement of rectangular pipe jacking tunnel based on improved PSO-BP neural network. Sci. Rep. 2023, 13, 1–15. [Google Scholar] [CrossRef]
  19. Ambrožič, T.; Turk, G. Prediction of subsidence due to underground mining by artificial neural networks. Comput. Geosci. 2003, 29, 627–637. [Google Scholar] [CrossRef]
  20. Kumar, S.; Kumar, D.; Donta, P.K.; Amgoth, T. Land subsidence prediction using recurrent neural networks. Stoch. Environ. Res. Risk Assess. 2021, 36, 373–388. [Google Scholar] [CrossRef]
  21. Bai, X.; Rong, M.; Wen, Z.; Zhang, N. Prediction of Surface Settlement in Construction Stage of Earth Pressure Balance Shield Tunnel Based on Differential Evolution Algorithm-Support Vector Regression Machine. Tunn. Constr. 2021, 41, 336–345. [Google Scholar]
  22. Wang, S.H.; Zhu, B.Q. Time series prediction for ground settlement in portal section of mountain tunnels. Chin. J. Geotech. Eng. 2021, 43, 813–821. [Google Scholar]
  23. Yin, Q.; Zhou, Y.; Rao, J.A. Deep learning-based method for predicting surface settlement induced by shield tunnel construction. J. Cent. South Univ. (Sci. Technol.) 2024, 55, 607–617. [Google Scholar]
  24. Han, X.; Li, N.; Standing, J.R. An adaptability study of Gaussian equation applied to predicting ground settlements induced by tunneling in China. Rock Soil Mech. 2007, 1, 23–28. [Google Scholar]
  25. Franza, A.; Marshall, A.M. Empirical and semi-analytical methods for evaluating tunnelling-induced ground movements in sands. Tunn. Undergr. Space Technol. 2019, 88, 47–62. [Google Scholar] [CrossRef]
  26. Song, Z.; Tian, X.; Zhang, Y. A New Modified Peck Formula for Predicting the Surface Settlement Based on Stochastic Medium Theory. Adv. Civ. Eng. 2019, 7328190. [Google Scholar] [CrossRef]
  27. Liu, M.; Zhang, Y.; Guo, J.; Chen, J. An Optimized Adaptive BP Neural Network Based on Improved Lion Swarm Optimization Algorithm and Its Applications. Arab. J. Sci. Eng. 2023, 49, 3417–3434. [Google Scholar] [CrossRef]
  28. Jin, C.; Jin, S.-W. Software reliability prediction model based on support vector regression with improved estimation of distribution algorithms. Appl. Soft Comput. 2014, 15, 113–120. [Google Scholar] [CrossRef]
  29. QCR 9218-2024; Technical Specification for Monitoring and Measurement of Railway Tunnels. China State Railway Group Co., Ltd.: Beijing, China, 2024.
  30. Golyandina, N.; Korobeynikov, A. Basic Singular Spectrum Analysis and forecasting with R. Comput. Stat. Data Anal. 2014, 71, 934–954. [Google Scholar] [CrossRef]
  31. Yang, X.; Ren, Z.; Zhou, G.; Yi, J.; He, Y. CNN-BiLSTM Short-term Air Conditioning Load Prediction Model Based on Singular Spectrum Analysis. Chin. J. 2024, 52, 64–73. [Google Scholar]
  32. Huang, G.-B.; Zhu, Q.-Y.; Siew, C.-K. Extreme learning machine: Theory and applications. Neurocomputing 2006, 70, 489–501. [Google Scholar] [CrossRef]
  33. Huang, G.B.; Zhou, H.M.; Ding, X.J.; Zhang, R. Extreme learning machine for regression and multiclass classification. IEEE Trans. Syst. Man Cybern. Part B (Cybern.) 2012, 42, 513–529. [Google Scholar] [CrossRef]
  34. Seyyedabbasi, A.; Kiani, F. Sand Cat swarm optimization: A nature-inspired algorithm to solve global optimization problems. Eng. Comput. 2022, 39, 2627–2651. [Google Scholar] [CrossRef]
  35. Zhang, J.; Wang, J.S. Improved Whale Optimization Algorithm Based on Nonlinear Adaptive Weight and Golden Sine Operator. IEEE Access 2020, 8, 77013–77048. [Google Scholar] [CrossRef]
Figure 1. Sliding window schematic.
Figure 2. Comparison plot of population initialization: (a) OLM initialization and (b) stochastic initialization.
Figure 3. Adaptive curve of sand cat sensitivity.
Figure 4. Forecasting flowchart.
Figure 5. Comparison of algorithm iteration curves: (a) F1 iteration curve; (b) F2 iteration curve; (c) F3 iteration curve; (d) F4 iteration curve; (e) F5 iteration curve; (f) F6 iteration curve; (g) F7 iteration curve; and (h) F8 iteration curve.
Figure 6. Surface Settlement Monitoring Plan Schematic.
Figure 7. DK461-860-DB07 surface settlement curves.
Figure 8. Simple characteristic component decomposition (860-DB07).
Figure 9. Simple feature component reconstruction (860-DB07).
Figure 10. Comparison of settlement prediction curves by model: (a) 855-DB04; (b) 860-DB08; (c) 860-DB01; (d) 860-DB09; (e) 865-DB00; (f) 865-DB03; (g) 870-DB02, and (h) 870-DB05.
Table 1. Surface settlement monitoring frequency table.
Distance of Monitoring Section from Excavation Surface (m) | Monitoring Frequency
(0–1)B | Two times per day
(1–2)B | One time per day
(2–5)B | Two times per two to three days
>5B | One time per seven days
B represents the excavation width of the tunnel.
Table 2. Test function basic information.
Function Name | Function Formula | Dimension | Test Range | Optimal Solution
Sphere | $F_1(x) = \sum_{i=1}^{n} x_i^2$ | 30 | [−100, 100] | 0
Schwefel's Problem 2.22 | $F_2(x) = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i|$ | 30 | [−100, 100] | 0
Schwefel's Problem 2.21 | $F_3(x) = \max_i \{ |x_i|, 1 \le i \le n \}$ | 30 | [−100, 100] | 0
Quartic | $F_4(x) = \sum_{i=1}^{n} i x_i^4 + \mathrm{rand}[0, 1)$ | 30 | [−1.28, 1.28] | 0
Ackley's | $F_5(x) = -20 \exp\left(-0.2 \sqrt{\frac{1}{30} \sum_{i=1}^{n} x_i^2}\right) - \exp\left(\frac{1}{30} \sum_{i=1}^{n} \cos 2\pi x_i\right) + 20 + e$ | 30 | [−32, 32] | 0
Kowalik's | $F_6(x) = \sum_{i=1}^{11} \left[ a_i - \frac{x_1 (b_i^2 + b_i x_2)}{b_i^2 + b_i x_3 + x_4} \right]^2$ | 4 | [−5, 5] | 0.0003075
Goldstein–Price | $F_7(x) = \left[ 1 + (x_1 + x_2 + 1)^2 (19 - 14 x_1 + 3 x_1^2 - 14 x_2 + 6 x_1 x_2 + 3 x_2^2) \right] \times \left[ 30 + (2 x_1 - 3 x_2)^2 (18 - 32 x_1 + 12 x_1^2 + 48 x_2 - 36 x_1 x_2 + 27 x_2^2) \right]$ | 2 | [−2, 2] | 3
Shekel's | $F_8(x) = -\sum_{i=1}^{5} \left[ (x - a_i)(x - a_i)^{\mathrm{T}} + c_i \right]^{-1}$ | 4 | [0, 10] | −10
Table 3. Algorithm parameter setting.
Optimization Algorithm | Parameter Settings
ISCSO | $r_G$ decreases nonlinearly (adaptively) from 2 to 0
SCSO | $r_G$ decreases linearly from 2 to 0
SSA | ST = 0.8, PD = 0.2, SD = 0.2
GWO | a decreases linearly from 2 to 0
SOA | fc decreases linearly from 2 to 0
Table 4. Test function optimization results.
Fn | ISCSO (Average / Std) | SCSO (Average / Std) | SSA (Average / Std) | GWO (Average / Std) | SOA (Average / Std)
F1 | 1.83 × 10^−264 / 0 | 6.90 × 10^−237 / 0 | 2.85 × 10^−105 / 1.98 × 10^−104 | 3.83 × 10^−70 / 8.93 × 10^−70 | 1.13 × 10^−15 / 4.03 × 10^−15
F2 | 1.93 × 10^−135 / 1.14 × 10^−134 | 2.12 × 10^−123 / 8.92 × 10^−123 | 2.34 × 10^−89 / 1.64 × 10^−88 | 6.90 × 10^−40 / 5.71 × 10^−40 | 3.45 × 10^−14 / 1.19 × 10^−13
F3 | 2.01 × 10^−119 / 1.38 × 10^−118 | 2.33 × 10^−102 / 1.48 × 10^−101 | 1.58 × 10^−44 / 1.06 × 10^−43 | 1.52 × 10^−17 / 2.94 × 10^−17 | 0.0089404 / 0.0122475
F4 | 3.86 × 10^−5 / 0.0001051 | 4.56 × 10^−5 / 5.28 × 10^−5 | 0.0001310 / 0.0001272 | 0.0004439 / 0.0002373 | 0.0211484 / 0.0115433
F5 | 8.88 × 10^−16 / 0 | 8.88 × 10^−16 / 0 | 8.88 × 10^−16 / 0 | 1.35 × 10^−14 / 2.58 × 10^−15 | 1.46 × 10^−9 / 1.96 × 10^−9
F6 | 0.0003074 / 6.469 × 10^−10 | 0.0004723 / 0.0003517 | 0.0003101 / 5.24 × 10^−6 | 0.0032068 / 0.0069275 | 0.0010744 / 0.0003040
F7 | 3.0000009 / 1.13 × 10^−6 | 3.0000011 / 1.739 × 10^−6 | 4.08 / 5.2908978 | 3.0000031 / 3.12 × 10^−6 | 3.0000007 / 1.64 × 10^−6
F8 | −10.153195 / 5.80 × 10^−12 | −5.909652 / 2.2726361 | −10.05124 / 0.713720 | −9.442788 / 1.759854 | −1.219453 / 0.803839
Table 5. Predictive model parameter setting.
Predictive Model | Population Size | Maximum Iterations | Optimized Parameters | Search Boundary
SSA-ISCSO-KELM | 200 | 400 | Regularization coefficient; kernel parameter | [−10^17, 10^17]
SSA-ISCSO-BIGRU | 100 | 200 | Learning rate; learning rate decline factor; regularization factor; number of hidden layers | [5 × 10^−10, 1]; [5 × 10^−10, 1]; [0, 100]; [1, 15]
Table 6. Comparison of evaluation indicators of different forecasting models.
Predictive Model | Forecast Horizon | MAE | MSE | RMSE | MAPE | MBE

855-DB04
Nonlinear regression | 5 d | 0.148 | 0.028 | 0.166 | 4.953% | 0.032
SSA-ISCSO-KELM | 5 d | 0.213 | 0.069 | 0.260 | 5.741% | −0.111
SSA-ISCSO-BIGRU | 5 d | 1.010 | 1.373 | 1.146 | 30.461% | −0.862
Nonlinear regression | 10 d | 0.286 | 0.132 | 0.363 | 8.077% | 0.224
SSA-ISCSO-KELM | 10 d | 0.221 | 0.075 | 0.265 | 5.722% | 0.026
SSA-ISCSO-BIGRU | 10 d | 1.393 | 2.380 | 1.502 | 38.568% | −1.204

855-DB08
Nonlinear regression | 5 d | 0.530 | 0.331 | 0.575 | 16.561% | −0.530
SSA-ISCSO-KELM | 5 d | 0.205 | 0.059 | 0.231 | 6.632% | −0.162
SSA-ISCSO-BIGRU | 5 d | 0.899 | 1.030 | 1.013 | 27.556% | −0.880
Nonlinear regression | 10 d | 0.770 | 0.695 | 0.834 | 19.354% | −0.770
SSA-ISCSO-KELM | 10 d | 0.221 | 0.077 | 0.258 | 5.861% | −0.138
SSA-ISCSO-BIGRU | 10 d | 1.624 | 3.388 | 1.839 | 39.482% | −1.610

860-DB01
Nonlinear regression | 5 d | 0.250 | 0.090 | 0.299 | 7.393% | −0.209
SSA-ISCSO-KELM | 5 d | 0.207 | 0.062 | 0.244 | 6.653% | −0.085
SSA-ISCSO-BIGRU | 5 d | 0.270 | 0.103 | 0.312 | 8.783% | 0.382
Nonlinear regression | 10 d | 0.347 | 0.147 | 0.384 | 9.184% | −0.326
SSA-ISCSO-KELM | 10 d | 0.226 | 0.073 | 0.259 | 6.343% | −0.152
SSA-ISCSO-BIGRU | 10 d | 0.367 | 0.203 | 0.435 | 10.345% | 0.388

860-DB09
Nonlinear regression | 5 d | 0.188 | 0.112 | 0.334 | 5.212% | −0.173
SSA-ISCSO-KELM | 5 d | 0.192 | 0.109 | 0.329 | 5.428% | −0.130
SSA-ISCSO-BIGRU | 5 d | 0.434 | 0.362 | 0.601 | 12.803% | −0.393
Nonlinear regression | 10 d | 0.882 | 1.369 | 1.170 | 18.113% | −0.875
SSA-ISCSO-KELM | 10 d | 0.872 | 1.381 | 1.173 | 17.955% | −0.832
SSA-ISCSO-BIGRU | 10 d | 1.398 | 3.109 | 1.763 | 29.663% | −1.340

865-DB00
Nonlinear regression | 5 d | 0.306 | 0.154 | 0.393 | 5.826% | −0.307
SSA-ISCSO-KELM | 5 d | 0.183 | 0.066 | 0.254 | 3.469% | −0.136
SSA-ISCSO-BIGRU | 5 d | 0.311 | 0.184 | 0.427 | 5.877% | −0.291
Nonlinear regression | 10 d | 0.751 | 0.800 | 0.894 | 12.513% | −0.751
SSA-ISCSO-KELM | 10 d | 0.548 | 0.460 | 0.674 | 9.064% | −0.516
SSA-ISCSO-BIGRU | 10 d | 0.880 | 1.160 | 1.075 | 14.546% | −0.872

865-DB03
Nonlinear regression | 5 d | 0.156 | 0.026 | 0.160 | 3.668% | −0.051
SSA-ISCSO-KELM | 5 d | 0.151 | 0.037 | 0.193 | 3.540% | −0.074
SSA-ISCSO-BIGRU | 5 d | 0.298 | 0.103 | 0.315 | 6.954% | −0.534
Nonlinear regression | 10 d | 0.111 | 0.016 | 0.128 | 2.509% | −0.020
SSA-ISCSO-KELM | 10 d | 0.197 | 0.057 | 0.235 | 4.212% | 0.020
SSA-ISCSO-BIGRU | 10 d | 0.591 | 0.470 | 0.681 | 12.295% | −0.880

870-DB02
Nonlinear regression | 5 d | 0.270 | 0.095 | 0.308 | 6.109% | −0.264
SSA-ISCSO-KELM | 5 d | 0.165 | 0.043 | 0.203 | 3.813% | 0.050
SSA-ISCSO-BIGRU | 5 d | 0.446 | 0.262 | 0.511 | 10.064% | −0.477
Nonlinear regression | 10 d | 0.674 | 0.658 | 0.811 | 12.779% | −0.671
SSA-ISCSO-KELM | 10 d | 0.410 | 0.296 | 0.516 | 7.787% | −0.258
SSA-ISCSO-BIGRU | 10 d | 1.040 | 1.523 | 1.234 | 19.773% | −1.080

870-DB05
Nonlinear regression | 5 d | 0.361 | 0.180 | 0.424 | 6.457% | −0.282
SSA-ISCSO-KELM | 5 d | 0.256 | 0.093 | 0.303 | 4.774% | −0.039
SSA-ISCSO-BIGRU | 5 d | 0.542 | 0.415 | 0.643 | 9.638% | −0.460
Nonlinear regression | 10 d | 0.775 | 0.865 | 0.930 | 12.142% | −0.736
SSA-ISCSO-KELM | 10 d | 0.552 | 0.484 | 0.687 | 8.703% | −0.437
SSA-ISCSO-BIGRU | 10 d | 1.114 | 1.719 | 1.310 | 17.509% | −1.049
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
