Article

Probabilistic Design of Retaining Wall Using Machine Learning Methods

1 Department of Civil Engineering, National Institute of Technology, Patna, Bihar 800005, India
2 Department of Civil and Environmental Engineering, Ruhr-Universität Bochum, 44801 Bochum, Germany
* Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(12), 5411; https://doi.org/10.3390/app11125411
Submission received: 8 May 2021 / Revised: 28 May 2021 / Accepted: 8 June 2021 / Published: 10 June 2021
(This article belongs to the Special Issue Artificial Neural Networks Applied in Civil Engineering)

Abstract
Retaining walls are geostructures that provide permanent lateral support to vertical slopes of soil, and analyzing the failure probability of such structures is essential. To keep geotechnical practice on par with technological advances, artificial intelligence techniques are implemented here for the reliability analysis of the structure. Designing the structure on the basis of its probability of failure leads to an economical design. The machine learning models used for predicting the factor of safety of the wall are the Emotional Neural Network, Multivariate Adaptive Regression Splines, and SOS–LSSVM. The First-Order Second Moment method is used to calculate the reliability index of the wall. In addition, these models are assessed based on the results they produce, and the best model among them is identified for extensive field study in the future. The overall performance evaluation, through various accuracy measures, determined SOS–LSSVM to be the best model. The obtained results show that the reliability indices calculated by the AI methods differ from the reference values by less than 2%. These methodologies simplify the problem while increasing the precision of the result; artificial intelligence has removed cumbersome calculations in many fields and disciplines, and the techniques used in this study are evolved versions of older algorithms. This work aims to clarify the probabilistic approach toward designing such structures, using artificial intelligence to simplify the practical evaluations.

1. Introduction

Forecasting the failure of geotechnical structures and then seeking remedial measures to avoid those failures is a primary concern of researchers today. In geotechnical practice, retaining walls have been used for decades to mitigate excessive ground movements near deep excavation projects. They prevent subsequent damage to neighboring buildings and infrastructure. Slopes close to deep excavations may experience failures such as rotational failure, translational failure, wedge failure, compound failure, flows, spreads, etc. Therefore, designing a retaining wall is a crucial research area for geotechnical engineers. To quantify the margin against failure, a parameter called the Factor of Safety (FOS), the ratio of the resisting force to the driving force, is calculated based on different criteria such as overturning, sliding, and bearing capacity failure [1]. In this regard, many scholars have conducted reliability analyses on cantilever walls [2,3,4].
Incorporating Artificial Intelligence (AI) into civil engineering has required researchers and engineers from different fields to join efforts in developing solutions. Geotechnical researchers have employed Artificial Neural Networks (ANNs) in different aspects of such problems, including computing algorithms, reliability methods, structural design, etc. The application of neural networks to the design of retaining structures dates back to 2005, when Goh and Kulhawy [5] studied the reliability of retaining wall performance using a neural network approach. Chen et al. [3] explored the prediction of safety factor values of retaining walls, as major resistance systems against ground forces, through AI techniques. A probabilistic model was proposed by [6] for estimating failure probabilities during excavations using observational evidence; they developed a Bayesian network with distance-based Bayesian model updating, using an ANN to create the response surface relationship. Evaluating structural or other geotechnical problems with ANNs has thus attracted many researchers, but the drawbacks of this technique have led to the development of many other algorithms.
The drawbacks of conventional neural networks (NNs) have led to advanced and modified techniques [7,8], three of which are employed in this paper. Each of the three models used in this study employs 10-fold cross-validation to obtain an optimal model for each algorithm, and the three optimal models are then compared [9]. Incorporating emotional parameters such as anxiety and confidence into a neural network yields an EmNN [10]. This methodology has been applied in various studies, such as determining the compressive strength of concrete and modeling rainfall–runoff. MARS is a multivariate analysis technique that extracts complex mappings from high-dimensional data and creates a simple, easy-to-understand model [11,12]. Its main applications in civil engineering include modeling doweled pavement performance, estimating the deformation of asphalt mixtures, and determining the undrained shear strength of clay [13,14]. The hybrid technique SOS–LSSVM comprises two algorithms: an LSSVM for better learning and an SOS for optimizing the learning process [15,16].
In this study, the reliability of a cantilever retaining wall is evaluated based on the sliding failure criterion. Several soil parameters enter the design of the wall, such as the cohesion, the angle of shearing resistance, the angle of wall friction, and the unit weight of the soil. Retaining wall design is typically based on deterministic formulations that do not consider the natural variability of geotechnical parameters. Because of the inherent uncertainty of soil properties, there is an increasing trend in geotechnical engineering to adopt reliability-based designs to mitigate construction uncertainties. The four parameters mentioned above are taken as characteristic variables in this study, and three machine learning algorithms, namely the Emotional Neural Network (EmNN), Multivariate Adaptive Regression Splines (MARS), and the Symbiotic Organisms Search–Least Squares Support Vector Machine (SOS–LSSVM), are used to model the retaining wall. Reliability analysis is performed to measure the ability of a structure to meet its requirements over a specified period [17]. A reliability index, which relates directly to the failure probability of the structure, is calculated using the First-Order Second Moment method (FOSM), and the models are then assessed based on their prediction capability, reliability index, adaptability, learning, etc. [18]. There are three types of probabilistic uncertainty analysis methodologies [19], namely (i) analytical solutions, (ii) approximation approaches (e.g., First-Order Second Moment), and (iii) numerical approaches (e.g., Monte Carlo simulation). The FOSM approach is one of the most extensively utilized approaches in civil engineering applications due to its simplicity. The name FOSM refers to the fact that it employs the first-order terms of the Taylor series expansion about the mean value of each input variable and needs up to the second moments of the uncertain variables. Many researchers have described the detailed methodology of FOSM, and the reader is referred to them; for instance, see [18,20,21].
The amalgamation of civil engineering and artificial intelligence has led to an interdisciplinary approach in which complex design problems can be modeled. Various civil engineering sectors have incorporated artificial intelligence algorithms, with neural networks being the most widely applied technique for robust calculation and modeling, to solve problems precisely and with quantified certainty. Considering the intrinsic uncertainties in geomaterials, in their behavior, and in their interaction with other structural elements, performing probabilistic analysis in this field is essential. With the emergence of soft computing in geotechnical engineering, simulation models and the involved constitutive models are becoming progressively more comprehensive. However, stochastic analyses require a large number of model evaluations, which may make probabilistic analysis computationally unaffordable for practitioners. AI methods such as NNs can be used in different aspects such as design, monitoring, and safety analysis to surrogate the extensive computational models with less computational effort. Therefore, a well-trained and validated AI model gives practicing engineers the possibility of performing probabilistic analyses and evaluating the reliability of structures. This paper elaborates a retaining wall safety analysis, as a case study, using neural networks, support vector machines, and adaptive regression splines. A reliability index is calculated using FOSM to measure the safety of the structure within a given time period. The work involves the probabilistic design and reliability analysis of a cantilever retaining wall, incorporating 10-fold cross-validation, and a comprehensive performance assessment is carried out to conclude which AI model performs best. The details of this paper have been worked out thoroughly so that readers can understand the gateway between AI and geotechnics and connect their applications.

2. Methods

2.1. Emotional Neural Network

Humans have consistently outperformed other living species in decision making. Emotions either emphasize the external stimuli that trigger an emotion or the internal responses included in the emotional state [22]. The decision-making processes, cognition, and learning of animals and humans have been studied by many researchers [23,24,25,26]. Machine learning has given all fields of expertise a new direction, and its evolution is relatively rapid. Incorporating emotional skills into a neural network leads to the EmNN. The emotional neural network consists of emotional neurons, two emotional parameters (anxiety and confidence), and emotional weights. This simulated ideology increases the learning and decision-making capability of a model. The emotional factors are altered or updated during the learning phase, and the emotional weights are used along with the conventional weights to make decisions. The learning algorithm employed in the EmNN is Emotional Back Propagation (EmBP) [27]. In the Emotional Neural Network, the two emotional neurons are non-processing neurons receiving global average values of the input space instead of pixels or segments as in conventional neurons. In addition, the emotional weights are updated using two emotional coefficients rather than the conventional learning or momentum rates. Anxiety and confidence, being two crucial decision-making parameters, are incorporated in the model. The rationale behind this is to recognize a pattern based on the general impression in addition to the precise details of the subject. In general, when a human starts to learn, anxiety is high and confidence is low; after sufficient practice, the confidence level increases and anxiety becomes almost negligible. EmNN employs this concept in the learning process. In practice, as the epochs progress and the network is trained, anxiety tells the system to pay less attention to the derivative of the error and to rely on the global average value of the training pattern across all nodes. The other term, confidence, lets the system pay more and more attention to the change in the weights as the training epochs progress, acting as an increasing inertia term that modifies the change from one pattern to the next. EmNN employs EmBP as a learning algorithm, which is an evolved version of conventional Back Propagation (BP). BP has been popular in neural networks since Rumelhart et al. [28]; it gained fame because of its simplicity of implementation and quick training when adequate training datasets are available. The reader is referred to [29] for more details about EmNN.
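The update rule below is only a schematic, hedged reading of the EmBP behavior just described, not the exact formulation of [27,29]: the anxiety coefficient weights a term driven by the global average of the input pattern, the confidence coefficient adds a growing inertia on the previous weight change, and both coefficients evolve over the epochs. All names, scales, and decay schedules are illustrative.

```python
import numpy as np

def embp_update(w, grad, prev_dw, x_pattern, anxiety, confidence, lr=0.1, momentum=0.5):
    """Schematic EmBP-style weight update (illustrative reading of the description above).

    grad      : conventional back-propagation gradient of the error w.r.t. the weights
    prev_dw   : previous weight change (inertia term)
    x_pattern : current input pattern; its global average drives the anxiety term
    """
    y_avg = np.mean(x_pattern)              # global average of the input pattern
    dw = (lr * grad                         # conventional gradient term
          + momentum * prev_dw              # conventional momentum term
          + anxiety * y_avg                 # anxiety: coarse, pattern-level contribution
          + confidence * prev_dw)           # confidence: growing inertia on past changes
    return w - dw, dw

# toy schedule: anxiety starts high and decays, confidence starts low and grows
w, prev_dw = np.zeros(4), np.zeros(4)
for epoch in range(100):
    anxiety = 0.05 * (1.0 - epoch / 100.0)
    confidence = 0.05 * (epoch / 100.0)
    grad = np.random.randn(4) * 0.01        # placeholder gradient for illustration
    w, prev_dw = embp_update(w, grad, prev_dw, x_pattern=np.random.rand(4),
                             anxiety=anxiety, confidence=confidence)
```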

2.2. Symbiotic Organisms Search-Least Square Support Vector Machine

The AI approach has outperformed conventional approaches in different disciplines, and its robustness has advanced the research platform and improved the way models learn. SOS–LSSVM is a hybrid of two computational techniques. In this complementary system, the LSSVM operates as a supervised learning-based predictor to establish an accurate input–output connection from the dataset, while the SOS works to improve the LSSVM parameters [30,31]. In this study, the predictive tool LSSVM and the metaheuristic optimization algorithm SOS are integrated. In addition, a cross-validation technique is employed to validate the training and testing process, and several performance parameters are computed to compare the results and their validity. To provide a detailed description of this method, the metaheuristic SOS algorithm and the LSSVM are explained in the following.

2.2.1. LSSVM

SVM is one of the most widely used predictive models, and it has been modified by researchers into the LSSVM. SVM uses quadratic programming for its loss function, whereas LSSVM uses a least squares linear system as the loss function [32]. The algorithm is grounded in statistical learning theory and differs from SVM in that it uses equality constraints while operating on a least squares cost function. The optimization problem and its constraints can be written as the following formulations:
$$\min_{w,\,e} \; J_p(w, e) = \frac{1}{2} w^T w + \gamma \, \frac{1}{2} \sum_{k=1}^{N} e_k^2,$$
subject to
$$y_k = w^T \phi(x_k) + b + e_k, \quad k = 1, 2, \ldots, N,$$
where w is a vector of undetermined parameters, ϕ(·) is a nonlinear function mapping the input space to a high-dimensional feature space, e_k ∈ R are error variables, and γ is a regularization constant that is always greater than zero.
$$y(x) = \sum_{k=1}^{N} \alpha_k \, K(x, x_k) + b,$$
Equation (3) shows the function estimation by the LSSVM model, where α_k and b are the solution of the linear system in Equation (4). The kernel function used here is the Radial Basis Function, defined as follows:
$$K(x_k, x_l) = \exp\!\left( -\frac{\lVert x_k - x_l \rVert^2}{2 \sigma^2} \right),$$
where σ is a parameter of the kernel function. γ and σ need to be specified for better predictive results from the model.
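For concreteness, a minimal sketch of this formulation is given below: the dual problem of the objective and constraint above reduces to a single linear system in the coefficients α and the bias b, which is solved directly and then used for prediction via Equation (3). This is a generic, illustrative LSSVM implementation with an RBF kernel, not the code used in this study.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma):
    # K(x_k, x_l) = exp(-||x_k - x_l||^2 / (2 sigma^2))
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    """Solve the LSSVM dual linear system for the coefficients alpha and the bias b."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma      # regularization gamma enters the diagonal
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]                  # alpha, b

def lssvm_predict(X_train, alpha, b, sigma, X_new):
    # y(x) = sum_k alpha_k K(x, x_k) + b
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b
```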

2.2.2. Symbiotic Organisms Search (SOS)

Symbiotic organisms search is a relatively new metaheuristic algorithm introduced by [31] and subsequently applied and developed by various researchers [33,34]. It is a symbiosis-inspired technique that iteratively moves the population of solutions, i.e., the ecosystem of organisms, toward better and more promising areas of the search space while searching for the global optimum. Each organism in the ecosystem has a unique fitness value, which indicates its degree of adaptation to the desired objective. The steps involved in the SOS algorithm are as follows:
  • Generation of the initial population
  • Do
    • Mutualism
    • Commensalism
    • Parasitism
    • Best solution update
  • Until the criteria of stopping are satisfied.
Researchers have compared SOS with other metaheuristic techniques such as the Genetic Algorithm, Particle Swarm Optimization, Differential Evolution, etc., and concluded that SOS performs better: it is more effective and efficient and produces better results. In addition, SOS has been applied in many research fields to solve engineering and other problems [15,35,36].
SOS–LSSVM is a hybrid AI technique that creates a symbiotic environment between the two algorithms: the LSSVM acts as a supervised learning-based predictor, while the SOS optimizes the parameters used in the LSSVM algorithm, i.e., γ and σ.
The SOS–LSSVM technique is completed in eight significant steps, grouped into an initialization and training phase followed by a testing phase [31]; a condensed sketch is given after the list below.
  • Data are collected for training the model.
  • The LSSVM model is used to capture the uncertain input–output relationship; its parameters σ and γ are tuned.
  • SOS algorithm:
    This algorithm searches over combinations of the σ and γ parameters and assembles the best set of these two parameters. SOS employs the mutualism, commensalism, and parasitism phases to gradually improve the fitness values of the candidate solutions.
  • Evaluation of fitness:
    For the evaluation of the system, a fitness function is developed that measures the accuracy of the learning system; the best combination of γ and σ corresponds to the best fitness value. The dataset is not split randomly once; instead, it is divided into learning and validation subsets, and, to avoid sampling bias, 10-fold cross-validation is performed. The fitness function uses the mean square error (MSE) as its measure of aptness.
  • Criteria of termination:
    The termination criterion used in this technique is the iteration number specified in the SOS algorithm.
  • Optimal σ and γ parameters:
    The loop stops, and the optimal σ and γ parameters are obtained.
  • The optimal set of σ and γ parameters is then used for developing the model that is applied to the test data.
  • Data testing:
    The held-out test dataset is evaluated, and the predictions are used to assess the performance and accuracy of the model.
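The eight steps above can be condensed into a short sketch: an SOS loop proposes (γ, σ) pairs, and the 10-fold cross-validated MSE of the corresponding LSSVM serves as the fitness to be minimized. The mutualism, commensalism, and parasitism operators below follow the basic update rules of [31] in simplified form; the parameter bounds, ecosystem size, iteration count, and the `cv_mse_of_lssvm` callable in the usage line are illustrative placeholders, not the settings of this study.

```python
import numpy as np

def sos_optimize(fitness, bounds, n_org=20, iters=50, seed=0):
    """Minimal SOS loop (mutualism, commensalism, parasitism) minimizing `fitness`.

    fitness : callable mapping a parameter vector (e.g. [gamma, sigma]) to a scalar
              to be minimized, here the 10-fold CV mean squared error of an LSSVM
    bounds  : (low, high) per dimension, e.g. [(1e-1, 1e5), (1e-2, 1e2)]
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    eco = lo + rng.random((n_org, len(bounds))) * (hi - lo)   # initial ecosystem
    fit = np.array([fitness(x) for x in eco])

    for _ in range(iters):
        best = eco[fit.argmin()].copy()
        for i in range(n_org):
            j = rng.choice([k for k in range(n_org) if k != i])
            # --- mutualism ---
            mutual = (eco[i] + eco[j]) / 2.0
            bf1, bf2 = rng.integers(1, 3, size=2)             # benefit factors (1 or 2)
            cand_i = np.clip(eco[i] + rng.random() * (best - mutual * bf1), lo, hi)
            cand_j = np.clip(eco[j] + rng.random() * (best - mutual * bf2), lo, hi)
            for idx, cand in ((i, cand_i), (j, cand_j)):
                f = fitness(cand)
                if f < fit[idx]:
                    eco[idx], fit[idx] = cand, f
            # --- commensalism ---
            cand = np.clip(eco[i] + rng.uniform(-1, 1) * (best - eco[j]), lo, hi)
            f = fitness(cand)
            if f < fit[i]:
                eco[i], fit[i] = cand, f
            # --- parasitism ---
            parasite = eco[i].copy()
            dims = rng.random(len(bounds)) < 0.5               # dimensions to mutate
            parasite[dims] = lo[dims] + rng.random(dims.sum()) * (hi - lo)[dims]
            f = fitness(parasite)
            if f < fit[j]:
                eco[j], fit[j] = parasite, f
    return eco[fit.argmin()], fit.min()

# usage sketch: (gamma, sigma), best_mse = sos_optimize(cv_mse_of_lssvm,
#                                                       [(1e-1, 1e5), (1e-2, 1e2)])
```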

2.3. Multivariate Adaptive Regression Splines

MARS is a non-parametric modeling technique that uses a data-driven approach [37]. In this study, it is used to capture the multidimensionality and intrinsic non-linearity of the data associated with the retaining wall. In this methodology, the training dataset is divided into different linear segments (splines) with different gradients (slopes), and no assumptions are made about the relationship between the dependent and independent variables. A flexible MARS model results from smooth curves known as Basis Functions (BFs), which connect the splines, so the model can handle both the linear and the nonlinear behavior of the problem. The points that connect the pieces are called knots; knots indicate the beginning of one region and the end of another and are placed within the range of the input variables. MARS searches the interactions among the variables and goes through the candidate knots to generate BFs. An adaptive regression algorithm with two phases, forward and backward, is employed for placing the knots. In the forward phase, BFs are defined by placing knots at candidate positions using the adaptive regression technique; at each step, the model adds the knot and its corresponding BFs that ensure the maximum reduction in the residual sum of squares. Adding BFs continues until their number reaches the maximum limit, which results in a complex model. The backward phase then removes the redundant BFs. Jekabsons' open MARS code is used to perform the analysis in this paper [38].
After the optimal MARS model is obtained, the relative importance of the parameters can be assessed based on the input variables and the BFs.
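As an illustration of the structure such a model takes, the sketch below evaluates hinge basis functions of the form max(0, x − knot) and max(0, knot − x) and their products to produce a prediction; the knots and coefficients shown are placeholders only (the basis functions actually obtained in this study are listed in Table 3).

```python
import numpy as np

def hinge(x, knot, direction=+1):
    """MARS basis function: max(0, x - knot) if direction=+1, else max(0, knot - x)."""
    return np.maximum(0.0, direction * (x - knot))

def mars_predict(x, terms, intercept):
    """Evaluate a fitted MARS model: intercept + sum of coefficient * product-of-hinges.

    terms: list of (coefficient, [(var_index, knot, direction), ...]) tuples.
    """
    y = intercept
    for coef, hinges in terms:
        bf = 1.0
        for var, knot, direction in hinges:
            bf *= hinge(x[var], knot, direction)
        y += coef * bf
    return y

# placeholder model with two basis functions (knots and coefficients are illustrative only)
terms = [
    ( 1.20, [(3, 0.70, +1)]),                  # BF ~ max(0, x4 - 0.70)
    (-0.29, [(1, 0.90, -1), (0, 0.38, +1)]),   # interaction of two hinges
]
print(mars_predict(np.array([0.5, 0.8, 0.9, 0.75]), terms, intercept=0.60))
```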

2.4. Cross-Validation

Selecting a model and then evaluating it based on its performance plays a crucial role in machine learning; estimating the properties of a model and selecting the model that performs best are two distinct tasks. Many methods have been proposed by researchers and applied to different models. Because of its simplicity and universality, cross-validation is widely used to select and evaluate models; K-fold cross-validation, in particular, is a sample reuse methodology that allows many experiments with different algorithms on the same dataset. In this research work, 10-fold cross-validation is employed for all three proposed models, i.e., MARS, EmNN, and SOS–LSSVM. The dataset is split into ten folds, giving ten combinations of 90% training data and 10% testing data, which are fed to all three algorithms. The model with the minimum Mean Absolute Error (MAE) is chosen from each of MARS, EmNN, and SOS–LSSVM, and these optimal models are then compared with each other on different scales and graphs. Following the methodology of [39], for each dataset we assign a rank of three to the model with the best value of each performance index and a rank of one to the model with the worst value; the overall performance rating of each model is then generated by summing its ranks.
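A minimal sketch of this selection loop is given below, assuming scikit-learn's KFold for the ten splits and a generic fit/predict model interface; the `MarsModel`, `EmNNModel`, and `SosLssvmModel` names in the commented usage lines are placeholders for the three predictors described in Section 2, not actual classes.

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.metrics import mean_absolute_error

def ten_fold_best_model(make_model, X, y, n_splits=10, seed=42):
    """Train one model per fold and return the fold model with the lowest test MAE."""
    best_mae, best_model = np.inf, None
    for train_idx, test_idx in KFold(n_splits=n_splits, shuffle=True,
                                     random_state=seed).split(X):
        model = make_model()                       # fresh, untrained instance
        model.fit(X[train_idx], y[train_idx])
        mae = mean_absolute_error(y[test_idx], model.predict(X[test_idx]))
        if mae < best_mae:
            best_mae, best_model = mae, model
    return best_model, best_mae

# usage sketch (one call per algorithm, then compare the three optimal models):
# best_mars,  mae_mars  = ten_fold_best_model(lambda: MarsModel(), X, y)
# best_emnn,  mae_emnn  = ten_fold_best_model(lambda: EmNNModel(), X, y)
# best_lssvm, mae_lssvm = ten_fold_best_model(lambda: SosLssvmModel(), X, y)
```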

3. Case Example

Cantilever walls rely on their own weight to resist sliding and overturning, but they also benefit from the weight of the backfill above the heel of the wall. Reinforced concrete cantilever walls come in various geometries; they are much cheaper to erect than gravity walls and can be prefabricated and transported directly to the site. Sliding, as one of the potential modes of collapse, must always be considered in reliability-based designs, and the minimum safety factor against sliding is taken as 1.5. The most significant sliding force component usually comes from the lateral earth pressure acting on the active (backfill) side of the wall; such force may be intensified by the presence of vertical or horizontal loads on the backfill surface. Figure 1 depicts the geometry of the cantilever wall as well as the major variables considered in this article. The safety factor against sliding can be expressed by the following, in which F_R is the sum of the horizontal resisting forces and F_D is the sum of the horizontal driving forces:
$$FOS = \frac{F_R}{F_D} = \frac{V \tan\delta + 1.4\,c + P_p}{P_a \cos(i)},$$
where V is the sum of the total vertical forces acting on the base, and P_a and P_p are the active and passive Rankine pressures, respectively. The following equations are used to calculate the active and passive pressures:
$$P_p = \frac{1}{2} \, K_p \, \gamma \, (1.2)^2, \quad \text{where } K_p = \frac{\cos i + \sqrt{(\cos i)^2 - (\cos\phi)^2}}{\cos i - \sqrt{(\cos i)^2 - (\cos\phi)^2}}.$$
$$P_a = \frac{1}{2} \, K_a \, \gamma \, (6.2)^2, \quad \text{where } K_a = \frac{\cos i - \sqrt{(\cos i)^2 - (\cos\phi)^2}}{\cos i + \sqrt{(\cos i)^2 - (\cos\phi)^2}}.$$
As illustrated in Figure 1, the soil profile behind the retaining wall has a slope of i = 20°, which enters the calculation of the soil mass in the above equations. Moreover, c is the cohesion, ϕ is the angle of shearing resistance, and δ represents the angle of wall friction (δ = 2/3 ϕ). To generate all the datasets, the above-mentioned formulation was scripted in MATLAB to evaluate the FOS for various combinations of the input variables.
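The authors scripted this formulation in MATLAB; the sketch below re-expresses the same relations in Python for illustration. The vertical force V and the geometric constants (the 6.2 m height in the active pressure term, the 1.2 m depth assumed in the passive term, and the 1.4 m base length assumed to multiply the cohesion) depend on the wall geometry of Figure 1 and are treated here as given, partly assumed, inputs.

```python
import numpy as np

def rankine_coefficients(phi_deg, i_deg):
    """Rankine active/passive coefficients for a backfill sloping at angle i (see above)."""
    ci, cphi = np.cos(np.radians(i_deg)), np.cos(np.radians(phi_deg))
    root = np.sqrt(ci ** 2 - cphi ** 2)
    ka = (ci - root) / (ci + root)
    kp = (ci + root) / (ci - root)
    return ka, kp

def fos_sliding(c, phi_deg, delta_deg, gamma, v_total, i_deg=20.0,
                h_active=6.2, d_passive=1.2, b_cohesion=1.4):
    """Factor of safety against sliding, following the FOS expression above.

    v_total    : total vertical force on the base (from the wall geometry, assumed given)
    d_passive  : depth over which passive pressure acts (assumed 1.2 m)
    b_cohesion : base length multiplying the cohesion (assumed 1.4 m from the '1.4 c' term)
    """
    ka, kp = rankine_coefficients(phi_deg, i_deg)
    p_a = 0.5 * ka * gamma * h_active ** 2           # active thrust
    p_p = 0.5 * kp * gamma * d_passive ** 2          # passive resistance
    f_r = v_total * np.tan(np.radians(delta_deg)) + b_cohesion * c + p_p
    f_d = p_a * np.cos(np.radians(i_deg))
    return f_r / f_d

# illustrative call with the mean soil parameters of this study and a placeholder V
print(fos_sliding(c=20.0, phi_deg=30.0, delta_deg=20.0, gamma=20.0, v_total=500.0))
```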
This study contributes both to the field of AI and to geotechnical engineering; it draws on new machine learning techniques that are being employed across research fields from medicine to engineering. Cohesion (c), angle of shearing resistance (ϕ), angle of wall friction (δ), and unit weight (γ) were taken as the primary variables, each following a lognormal distribution. For this study of retaining walls, data were collected, and the Coefficient Of Variation (COV) values were taken from previous studies [39,40]. The COV values for ϕ and δ are considered equal to 15%, with mean values of 30° and 20°, respectively. The random parameter of cohesion is assumed to have a mean value of 20 kPa. The fourth material parameter considered as a random variable is the unit weight of the soil mass γ, with a mean value of 20 kN/m³ and a COV of 8%. Afterwards, 80 datasets were generated randomly, of which 90% were used for training and 10% for testing.
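A minimal sketch of this data generation step is given below: each variable is drawn from a lognormal distribution whose underlying parameters follow from the stated mean and COV. The COV of cohesion is not stated above, so the 10% used here is purely an assumed placeholder; the FOS of each sample would then be evaluated with the sliding formulation sketched earlier.

```python
import numpy as np

def lognormal_samples(mean, cov, size, rng):
    """Draw lognormal samples with a prescribed mean and coefficient of variation."""
    sigma_ln = np.sqrt(np.log(1.0 + cov ** 2))   # std. dev. of ln(X)
    mu_ln = np.log(mean) - 0.5 * sigma_ln ** 2   # mean of ln(X)
    return rng.lognormal(mu_ln, sigma_ln, size)

rng = np.random.default_rng(1)
n = 80
c     = lognormal_samples(20.0, 0.10, n, rng)   # cohesion [kPa]; COV assumed for illustration only
phi   = lognormal_samples(30.0, 0.15, n, rng)   # angle of shearing resistance [deg], COV = 15%
delta = lognormal_samples(20.0, 0.15, n, rng)   # angle of wall friction [deg], COV = 15%
gamma = lognormal_samples(20.0, 0.08, n, rng)   # unit weight [kN/m^3], COV = 8%
# The FOS of each sample would then follow from the sliding formulation sketched above.
```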
Three models, MARS, EmNN, and SOS–LSSVM, were used to predict the factor of safety. After modeling the problem, different assessments were carried out to analyze the performance of the three models, and the models were compared with each other using various error measurements. In addition, a reliability analysis was performed by calculating the reliability index with the FOSM method, which assumes that all the variables are independent. From the reliability index, the failure probability is calculated, which can further be used for design purposes.
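A generic FOSM sketch for the sliding performance function g = FOS − 1 is shown below, not the authors' code: the first-order partial derivatives are approximated by central finite differences at the mean values of the four independent variables, the variance of g follows from their second moments, and β = μ_g/σ_g. The failure probability then follows in the usual way as Pf = Φ(−β), which is consistent with the β–Pf pairs reported in Table 2.

```python
import numpy as np
from scipy.stats import norm

def fosm_beta(g, means, stds, h=1e-4):
    """First-Order Second Moment reliability index for independent random variables.

    g     : performance function, g(x) > 0 means 'safe' (here: FOS(x) - 1)
    means : mean values of the random variables
    stds  : standard deviations of the random variables
    """
    means, stds = np.asarray(means, float), np.asarray(stds, float)
    mu_g, var_g = g(means), 0.0
    for i in range(len(means)):
        dx = np.zeros_like(means)
        dx[i] = h * max(abs(means[i]), 1.0)
        grad_i = (g(means + dx) - g(means - dx)) / (2.0 * dx[i])   # central difference
        var_g += (grad_i * stds[i]) ** 2                            # independent variables
    beta = mu_g / np.sqrt(var_g)
    return beta, norm.cdf(-beta)                                    # (beta, Pf)

# usage sketch: g wraps the FOS prediction of a trained surrogate (or the sliding formula),
# with means [c, phi, delta, gamma] = [20, 30, 20, 20] and the corresponding COVs.
```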

4. Result and Performance Assessment of Models

4.1. Errors and Other Parameters

To measure how well each model performs, different error measures and indices are computed. Assessing the performance of the models on different scales gives an overview of how each model performed and makes it easier for a researcher, programmer, or reader to compare the models based on their results. In this study, the Root Mean Square Error (RMSE), Mean Absolute Error (MAE), Mean Bias Error (MBE), Bias Factor, Nash–Sutcliffe Efficiency (NS), Variance Account Factor (VAF), Coefficient of Determination (R2), Adjusted Determination Coefficient (Adj. R2), Performance Index (PI), RMSE-to-Standard-Deviation Ratio (RSR), Normalized Mean Bias Error (NMBE), Mean Absolute Percentage Error (MAPE), Willmott's Index (WI), Legates and McCabe's Index (LMI), Expanded Uncertainty (U95), t-Statistic, Global Performance Indicator (GPI), Correlation Coefficient (R), and Weighted Mean Absolute Percentage Error (WMAPE) are computed for each model. Table 1 shows the values of these errors and parameters, which define the trend and relation between the predicted and observed values.
The Scatter Index (SI) employed in this study is a statistical quantity used to measure the error in the predicted values [41]; it is computed by dividing the RMSE by the mean of the observed values and indicates whether the values forecasted by the models are adequate. The a20-index is an engineering index used to assess the reliability of the predictions [42]. It is calculated as follows:
$$a_{20}\text{-index} = \frac{m_{20}}{M},$$
where m20 is the number of samples for which the ratio of observed to predicted FOS falls between 0.80 and 1.20, and M is the total number of samples in the dataset. The physical significance of this parameter is that it gives the proportion of predictions that deviate by no more than ±20% from the experimental values. A total rank is also calculated in this paper based on the approach presented in [43]. After calculating all the parameters mentioned above, the models are ranked accordingly: for each parameter, the model with the worst value is ranked one and the model with the best value is ranked three (as three models are used: EmNN, MARS, and SOS–LSSVM). Thereafter, all the ranks are added to obtain a total rank, and the model with the highest total rank is treated as the best model. This gives an overall view of the prediction capability, trend formation, and performance of each model.
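A short sketch of the a20-index and of the total-rank scheme just described is given below; the two metrics in the usage example are taken from Table 1, while the dictionary layout and the lack of explicit tie handling are illustrative simplifications.

```python
import numpy as np

def a20_index(observed, predicted):
    """Fraction of samples whose observed/predicted FOS ratio lies in [0.80, 1.20]."""
    ratio = np.asarray(observed) / np.asarray(predicted)
    return np.mean((ratio >= 0.80) & (ratio <= 1.20))

def total_ranks(scores, higher_is_better):
    """Rank models from 1 (worst) to 3 (best) per metric and sum the ranks per model.

    scores           : dict of {metric: {model: value}}
    higher_is_better : dict of {metric: bool}
    """
    totals = {}
    for metric, per_model in scores.items():
        # order models from worst to best for this metric
        ordered = sorted(per_model, key=per_model.get,
                         reverse=not higher_is_better[metric])
        for rank, model in enumerate(ordered, start=1):
            totals[model] = totals.get(model, 0) + rank
    return totals

# illustrative use with two of the metrics from Table 1
scores = {"RMSE": {"MARS": 0.0017, "EmNN": 0.0183, "SOS-LSSVM": 0.0002},
          "R2":   {"MARS": 0.9999, "EmNN": 0.9860, "SOS-LSSVM": 1.0000}}
print(total_ranks(scores, {"RMSE": False, "R2": True}))
```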
The reliability index of each model is calculated and compared with the reliability index obtained from the corresponding reference dataset [17]. The three optimal models have different reference β values and are compared accordingly. From Table 2, it is observed that for SOS–LSSVM the model's β value coincides with the reference value; since both show the same reliability index, SOS–LSSVM is confirmed as the better model. In addition, the probability of failure (Pf) is calculated from the reliability index, and the result agrees with the conclusion drawn from β.
As discussed before, MARS produces several basis functions; the basis functions obtained and the resulting regression equation are given in Table 3, where x1, x2, x3, and x4 are the variables c, ϕ, γ, and δ. The regularization constant and the kernel function parameter optimized by SOS for the LSSVM are 100,000 and 28.4958, respectively.

4.2. Taylor Diagram

A Taylor diagram is a graphical representation of how closely a pattern (or set of patterns) matches observations, quantified in terms of the correlation, the root mean square error, and the amplitude of the variations (standard deviations). This diagram evaluates different aspects of complex models and allows a comparative analysis of these models against the reference (observed) data [44]. From Figure 2, it can be seen that the predictions of SOS–LSSVM and MARS do not deviate much from the observed values and essentially overlap the reference point, whereas EmNN deviates slightly from the reference: it shows a lower correlation and a higher standard deviation and RMSE than the other two models.

4.3. AOC–REC Curve

The Regression Error Characteristic (REC) curve is a probability curve and a metric used to check the performance of a regression model [45]. The Area Over the Curve (AOC) measures how much the model predictions deviate from the actual data. From Figure 3, it can be seen that the AOC value of the SOS–LSSVM model is comparatively small; therefore, it outperforms the other two models. In addition, the AOC value for MARS is smaller than that for EmNN, making MARS a better model than EmNN.

4.4. R Curve

The performance curve shows whether the models follow the trend of the reference data. The corresponding R-values (correlation coefficients) are given in Table 1. Figure 4 shows the performance curve, and it can be seen that all three models overlap each other and follow approximately the same trend. A slight deviation can be observed in the EmNN model, which is consistent with the other criteria indicating that EmNN did not perform as well.

5. Conclusions

The probabilistic design of a retaining wall has been discussed in this paper using different machine learning methods, namely MARS, EmNN, and SOS–LSSVM. Retaining walls are widely used in various infrastructures, such as road and bridge construction, to stabilize slopes. A probabilistic analysis against sliding was performed on a case study to evaluate its safety and to establish the intelligent prediction models. The input and output datasets were generated considering the uncertainties in the soil parameters and fed into the AI models for training. The results of each predictor were then assessed based on performance parameters and graphs such as the Taylor diagram and the Regression Error Characteristic curve; for instance, different errors and efficiencies, such as WMAPE, RMSE, MAE, NS, etc., were calculated. The obtained results validate the ability of all three applied advanced methods to evaluate the safety criteria of a cantilever retaining wall with relatively high accuracy. Therefore, the proposed computational intelligence systems can benefit engineers in design and monitoring processes. SOS is an optimization technique used to optimize the parameters of the LSSVM model, and in the current study it results in a better prediction of the factor of safety. In addition, the reliability indices calculated for the selected optimal models show that SOS–LSSVM is the better model and has outperformed the other two. MARS and EmNN give slightly deviated results, but the FOS predicted by MARS almost overlaps that of SOS–LSSVM.
Considering the obtained results, for future works in this field or the design of other geostructures, SOS–LSSVM can be considered, as its learning rate is optimized using SOS, which gives an upper hand to this algorithm over the other two techniques. In addition, to improve the quality of results, the model can be fed with more training data to ensure better relationships between the input and output dataset. However, future research should study the impact of a higher number of inputs and spatial randomness in the material properties on the performance of suggested AI methods. Finally, it should be stated that this study employed synthetic data to train and validate the machine learning methods. Nevertheless, the proposed methodologies can be applied analogously to realistic problems.

Author Contributions

Conceptualization, P.S. and E.M.; methodology, P.S.; software, P.M. and P.S.; validation, P.M., P.S. and E.M.; formal analysis, P.M.; investigation, P.M.; resources, P.S.; data curation, P.M.; writing—original draft preparation, P.M.; writing—review and editing, E.M.; visualization, P.M.; supervision, P.S.; project administration, E.M. All authors have read and agreed to the published version of the manuscript.

Funding

We acknowledge support by the Open Access Publication Funds of the Ruhr-Universität Bochum.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. Baecher, G.B.; Christian, J.T. Reliability and Statistics in Geotechnical Engineering; Wiley: New York, NY, USA, 2003. [Google Scholar]
  2. Basha, B.M.; SivakumarBabu, G.L. Optimum design of cantilever sheet pile walls in sandy soils using inverse reliability approach. Comput. Geotech. 2008, 35, 134–143. [Google Scholar]
  3. Chen, H.; Asteris, P.; Armaghani, D.J. Assessing Dynamic Conditions of the Retaining Wall: Developing Two Hybrid Intelligent Models. Appl. Sci. 2019, 9, 1042. [Google Scholar] [CrossRef] [Green Version]
  4. Goh, A.T.; Phoon, K.K.; Kulhawy, F.H. Reliability Analysis of Partial Safety Factor Design Method for Cantilever Retaining Walls in Granular Soils. J. Geotech. Geoenviron. Eng. 2009, 135, 616–622. [Google Scholar] [CrossRef]
  5. Goh, A.T.C.; Kulhawy, F.H. Reliability assessment of serviceability performance of braced retaining walls using a neural network approach. Int. J. Numer. Anal. Meth. Geomech. 2005, 29, 627–642. [Google Scholar] [CrossRef]
  6. He, L.; Yong, L.; Wang, L.; Broggi, M.; Beer, M. Estimation of Failure Probability in Braced Excavation using Bayesian Networks with Integrated Model Updating. Undergr. Space 2020, 5, 315–323. [Google Scholar] [CrossRef]
  7. Deng, J.; Gu, D.; Li, X.; Yue, Z. Structural reliability analysis for implicit performance functions using artificial neural network. Struct. Saf. 2005, 27, 25–48. [Google Scholar] [CrossRef]
  8. Gomes, H.M.; Awruch, A.M. Comparison of response surface and neural network with other methods for structural reliability analysis. Struct. Saf. 2004, 26, 49–67. [Google Scholar] [CrossRef]
  9. Wu, C.Z.; Goh, T.C.A.; Zhang, W.G. Study on Optimization of Mars Model for Prediction of Pile Drivability Based on Cross—Validation; ISSMGE: Singapore, 2019. [Google Scholar]
  10. Biswas, R.; Samui, P.; Rai, B. Determination of compressive strength using relevance vector machine and emotional neural network. Asian J. Civ. Eng. 2019, 20, 1109–1118. [Google Scholar] [CrossRef]
  11. Samui, P. Multivariate adaptive regression spline (Mars) for prediction of elastic modulus of jointed rock mass. Geotech. Geol. Eng. 2012, 31, 249–253. [Google Scholar] [CrossRef]
  12. Zhang, W.; Goh, A.T. Multivariate adaptive regression splines and neural network models for prediction of pile drivability. Geosci. Front. 2016, 7, 45–52. [Google Scholar] [CrossRef] [Green Version]
  13. Attoh-Okine, N.O.; Cooger, K.; Mensah, S. Multivariate adaptive regression (MARS) and hinged hyperplanes (HHP) for doweled pavement performance modeling. Constr. Build. Mater. 2009, 23, 3020–3023. [Google Scholar] [CrossRef]
  14. Cheng, M.Y.; Cao, M.T. Accurately predicting building energy performance using evolutionary multivariate adaptive regression splines. Appl. Soft Comput. 2014, 22, 178–188. [Google Scholar] [CrossRef]
  15. Cheng, M.Y.; Prayogo, D.; Tran, D.H. Optimizing multiple-resources leveling in multiple projects using discrete symbiotic organisms search. J. Comput. Civ. Eng. 2016, 30, 04015036. [Google Scholar] [CrossRef] [Green Version]
  16. Prayogo, D.; Susanto, Y.T.T. Optimizing the prediction accuracy of friction capacity of driven piles in cohesive soil using a novel self-tuning least squares support vector machine. Adv. Civ. Eng. 2018, 2018, 1–9. [Google Scholar] [CrossRef] [Green Version]
  17. Babu, G.; Srivastava, A. Reliability analysis of buried flexible pipe-soil system. J. Pipeline Syst. Eng. Pract. 2010, 1, 33–41. [Google Scholar] [CrossRef]
  18. Wu, T.H.; Kraft, L.M. Safety analysis of slopes. J. Soil Mech. Found. Div. 1970, 96, 609–630. [Google Scholar] [CrossRef]
  19. Mahmoudi, E.; Khaledi, K.; Miro, S.; Koenig, D.; Schanz, T. Probabilistic Analysis of a Rock Salt Cavern with Application. Rock Mech. Rock Eng. 2017, 50, 139–157. [Google Scholar] [CrossRef] [Green Version]
  20. Tang, W.H.; Yucemen, M.S.; Ang, A.H.S. Probability-based short term design of slopes. Can. Geotech. J. 1976, 13, 201–215. [Google Scholar] [CrossRef]
  21. Cornell, C. First-order uncertainty analysis of soils deformation and stability. In Proceedings of the 1st International Conference on Application of Probability and Statistics in Soil and Structural Engineering, Hong Kong, China, 13–16 September 1971. [Google Scholar]
  22. Levine, D.S. Neural network modeling of emotion. Phys. Life Rev. 2007, 4, 37–63. [Google Scholar] [CrossRef]
  23. Fragopanagos, N.; Taylor, J.G. Modelling the interaction of attention and emotion. Neurocomputing 2006, 69, 1977–1983. [Google Scholar] [CrossRef] [Green Version]
  24. Gnadt, W.; Grossberg, S. An autonomous neural system for incrementally learning planned action sequences to navigate towards a rewarded goal. Neural Netw. 2008, 21, 699–758. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  25. Taylor, J.G.; Scherer, K.; Cowie, R. Emotion and brain: Understanding emotions and modelling their recognition. Neural Netw. 2005, 18, 313–316. [Google Scholar] [CrossRef] [PubMed]
  26. Cañamero, L. Emotion understanding from the perspective of autonomous robots research. Neural Netw. 2005, 18, 445–455. [Google Scholar] [CrossRef] [PubMed]
  27. Khashman, A. A modified backpropagation learning algorithm with added emotional coefficients. IEEE Trans. Neural Netw. 2008, 19, 1896–1909. [Google Scholar] [CrossRef] [PubMed]
  28. Rumelhart, D.E.; Hinton, G.E.; Williams, R.J. Learning internal representations by error propagation. In Parallel Distributed Processing: Explorations in the Microstructure of Cognition; Volume 1: Foundations; MIT Press: Cambridge, MA, USA, 1986; pp. 318–362. [Google Scholar]
  29. Khashman, A. Application of an emotional neural network to facial recognition. Neural Comput. Appl. 2008, 18, 309–320. [Google Scholar] [CrossRef]
  30. Huang, D.; Zhao, X. Research on several problems in BP neural network application. In Proceedings of the International Conference on Wireless Communications, Networking and Mobile Computing, Shanghai, China, 21–25 September 2007. [Google Scholar]
  31. Cheng, M.Y.; Prayogo, D. Symbiotic organisms search: A new metaheuristic optimization algorithm. Comput. Struct. 2014, 139, 98–112. [Google Scholar] [CrossRef]
  32. Kim, D.; Sekar, S.K.; Samui, P. Model of least square support vector machine (LSSVM) for prediction of fracture parameters of concrete. Int. J. Concr. Struct. Mater. 2011, 5, 29–33. [Google Scholar]
  33. Cheng, M.Y.; Prayogo, D.; Wu, Y.W. Prediction of permanent deformation in asphalt pavements using a novel symbiotic organisms search–least squares support vector regression. Neural Comput. Appl. 2018, 31, 6261–6273. [Google Scholar]
  34. Cheng, M.Y.; Cao, M.T.; Mendrofa, A.Y.J. Dynamic feature selection for accurately predicting construction productivity using symbiotic organisms search-optimized least square support vector machine. J. Build. Eng. 2021, 35, 101973. [Google Scholar] [CrossRef]
  35. Tejani, G.G.; Savsani, V.J.; Patel, V.K. Adaptive symbiotic organisms search (SOS) algorithm for structural design optimization. J. Comput. Des. Eng. 2016, 3, 226–249. [Google Scholar] [CrossRef] [Green Version]
  36. Tran, D.H.; Cheng, M.Y.; Prayogo, D. A novel Multiple Objective Symbiotic Organisms Search (MOSOS) for time–cost–labor utilization tradeoff problem. Knowl. Based Syst. 2016, 94, 132–145. [Google Scholar] [CrossRef] [Green Version]
  37. Friedman, J.H. Multivariate adaptive regression spline. Ann. Stat. 1991, 19, 1–67. [Google Scholar] [CrossRef]
  38. Jekabsons, G. VariReg: A Software Tool for Regression Modeling Using Various Modeling Methods; Riga Technical University: Riga, Latvia, 2010. [Google Scholar]
  39. Zhang, H.; Zhou, J.; Tahir, M.M.; Armaghani, J.D.; Pham, B.; Huynh, V.A. A Combination of Feature Selection and Random Forest Techniques to Solve a Problem Related to Blast-Induced Ground Vibration. Appl. Sci. 2020, 10, 869. [Google Scholar] [CrossRef] [Green Version]
  40. Duncan, J.M. Factors of safety and reliability in geotechnical engineering. J. Geotech. Geoenviron. Eng. 2000, 126, 307–316. [Google Scholar] [CrossRef]
  41. Phoon, K.K.; Kulhawy, F.H. Characterization of geotechnical variability. Can. Geotech. J. 1999, 36, 612–624. [Google Scholar] [CrossRef]
  42. Clancy, R.M.; Kaitala, J.E.; Zambresky, L.F. The Fleet Numerical Oceanography Center global spectral ocean wave model. Bull. Am. Meteorol. Soc. 1986, 67, 498–512. [Google Scholar] [CrossRef] [Green Version]
  43. Asteris, P.G.; Mokos, V.G. Concrete compressive strength using artificial neural networks. Neural Comput. Appl. 2020, 32, 11807–11826. [Google Scholar] [CrossRef]
  44. Taylor, K.E. Summarizing multiple aspects of model performance in a single diagram. J. Geophys. Res. 2001, 106, 7183–7192. [Google Scholar] [CrossRef]
  45. Fawcett, T. An introduction to ROC analysis. Pattern Recognit. Lett. 2006, 27, 861–874. [Google Scholar] [CrossRef]
Figure 1. Considered retaining wall.
Figure 2. Taylor diagram for (a) SOS–LSSVM, (b) EmNN, and (c) MARS method.
Figure 3. REC curve.
Figure 4. Performance curve for models.
Table 1. Performance evaluation of applied methods via various error measurements.
| No. | Metric | MARS (rank) | EmNN (rank) | SOS–LSSVM (rank) |
|---|---|---|---|---|
| 1 | WMAPE | 0.0062 (2) | 0.0726 (1) | 0.0008 (3) |
| 2 | NS | 0.9999 (2) | 0.9860 (1) | 1.0000 (3) |
| 3 | RMSE | 0.0017 (2) | 0.0183 (1) | 0.0002 (3) |
| 4 | VAF | 99.9914 (2) | 98.7306 (1) | 99.9998 (3) |
| 5 | R2 | 0.9999 (2) | 0.9860 (1) | 1.0000 (3) |
| 6 | Adj. R2 | 0.9997 (2) | 0.9674 (1) | 1.0000 (3) |
| 7 | PI | 1.9980 (2) | 1.9364 (1) | 1.9998 (3) |
| 8 | RMSD | 0.0017 (2) | 0.0183 (1) | 0.0002 (3) |
| 9 | Bias Factor | 0.8729 (2) | 0.8371 (1) | 0.8755 (3) |
| 10 | RSR | 0.0106 (2) | 0.1182 (1) | 0.0015 (3) |
| 11 | NMBE | −0.3611 (2) | 2.7843 (1) | 0.0196 (3) |
| 12 | MAPE | 0.0010 (2) | 0.0062 (1) | 0.0000 (3) |
| 13 | WI | 1.0000 (2) | 0.9966 (1) | 1.0000 (3) |
| 14 | MAE | 0.0013 (2) | 0.0155 (1) | 0.0002 (3) |
| 15 | MBE | −0.0008 (2) | 0.0059 (1) | 0.0000 (3) |
| 16 | LMI | 0.9859 (2) | 0.8364 (1) | 0.9984 (3) |
| 17 | U95 | 0.0043 (2) | 0.0500 (1) | 0.0005 (3) |
| 18 | t-stat | 0.0028 (2) | 0.0142 (1) | 0.0001 (3) |
| 19 | GPI | −1.8 × 10^−15 (1) | 1.08 × 10^−09 (2) | 5.98292 × 10^−22 (3) |
| 20 | R | 1.0000 (2) | 0.9978 (1) | 1.0000 (3) |
| 21 | SI | 0.7733 (2) | 8.5962 (1) | 0.0899 (3) |
| 22 | a20-index | 1.0000 (2) | 0.8750 (1) | 1.0000 (2) |
| 23 | AOC | 0.0011 (2) | 0.0133 (1) | 0.0001 (3) |
| | Total rank | 45 | 24 | 68 |
Table 2. Reliability index (β) of models.

| Model | Reference β | Model's β | Model's Pf |
|---|---|---|---|
| MARS | 1.6450 | 1.6418 | 0.0500 |
| EmNN | 1.6405 | 1.6211 | 0.0525 |
| SOS–LSSVM | 1.8661 | 1.8661 | 0.0310 |
Table 3. Basis Functions for MARS.

| BF | Equation |
|---|---|
| BF1 | max(0, x4 − 0.695789507393035) |
| BF2 | max(0, 0.695789507393035 − x4) |
| BF3 | max(0, x2 − 0.897423953169214) |
| BF4 | max(0, 0.897423953169214 − x2) |
| BF5 | max(0, x3 − 0.902027009611503) |
| BF6 | max(0, 0.902027009611503 − x3) |
| BF7 | BF4 × max(0, x1 − 0.382105313562625) |
| BF8 | BF4 × max(0, 0.382105313562625 − x1) |
| BF9 | BF2 × max(0, 0.557894812924718 − x1) |
| BF10 | BF6 × max(0, x4 − 0.81157903285569) |
| BF11 | BF6 × max(0, 0.81157903285569 − x4) |
| BF12 | BF4 × max(0, x3 − 0.408783600920691) |
| BF13 | BF4 × max(0, 0.408783600920691 − x3) |
| BF14 | BF1 × max(0, x1 − 0.760000104402251) |
| Main equation | y = 0.597051492431488 + 1.2033558916995 × BF1 − 1.01013033595132 × BF2 + 0.262683442461409 × BF3 − 0.290024870554842 × BF4 − 0.0824750745638103 × BF5 + 0.124577962754319 × BF6 − 0.334859635113939 × BF7 + 0.254991811556967 × BF8 + 0.446434437906744 × BF9 + 0.27641860123112 × BF10 − 0.0700841355527202 × BF11 + 0.0920347477963486 × BF12 − 0.0681746540919311 × BF13 + 0.819393035778277 × BF14 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
