# Estimation of Tsunami Bore Forces on a Coastal Bridge Using an Extreme Learning Machine


## Abstract


## 1. Introduction

**Figure 1.** (**a**) Damaged piers of the JR Rail Viaduct at the Tsuya River (photo: S. Dashti); (**b**) Damages to the deck and piers of Utatsu Bridge (photo: Kenji Kosa).

Consequently, Huang et al. [17] presented an algorithm for single-hidden-layer feed-forward neural networks, known as the Extreme Learning Machine (ELM). This algorithm avoids the problems caused by gradient-descent-based algorithms, such as back propagation, that are commonly applied to artificial neural networks (ANNs). ELM methods substantially reduce the time required to train a neural network: it has been verified that with the ELM the training procedure is fast and produces robust behavior [17]. Consequently, numerous studies have applied the ELM algorithm to problems in various engineering fields [18,19,20,21,22,23,24,25,26].

## 2. Methodology

#### 2.1. Experiments

#### 2.2. Model

**Figure 3.** Bridge model dimensions and the positive directions of drag force, uplift and overturning moment. (**a**) Model A3; (**b**) Model A4; (**c**) Model A5; (**d**) Model A6.

#### 2.3. Setup

Water depths of 1.0 m, 0.95 m and 0.90 m were applied in the flume. Ten solitary wave heights were utilized in this research to cover a range of small to large tsunami waves, i.e., ${W}_{h}$ = 0.04, 0.08, 0.12, 0.16, 0.20, 0.24, 0.28, 0.32, 0.36 and 0.40 m. To ensure repeatability and accuracy, each hydraulic test was repeated three times; after eliminating the highest and lowest values, the average was recorded and is presented in the Results section. The experimental errors were found to be less than 4.5%. The maximum values of the vertical force, horizontal force and overturning moment were taken as the results.
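The incident wave velocity used as a model input (Section 2.4) is not tabulated here. As an illustration only, the first-order solitary-wave celerity $c = \sqrt{g(d + H)}$ (a standard shallow-water result, not a formula stated in this paper) can be evaluated for the listed depths and wave heights:

```python
import math

G = 9.81  # gravitational acceleration (m/s^2)

def solitary_wave_celerity(depth, height):
    """First-order solitary-wave celerity: c = sqrt(g * (d + H))."""
    return math.sqrt(G * (depth + height))

# Celerity for the shallowest flume depth across the full range of wave heights
depths = [1.0, 0.95, 0.90]
heights = [0.04, 0.08, 0.12, 0.16, 0.20, 0.24, 0.28, 0.32, 0.36, 0.40]
for h in heights:
    c = solitary_wave_celerity(depths[-1], h)
    print(f"H = {h:.2f} m -> c = {c:.2f} m/s")
```

Higher waves travel faster at a given depth, which is why wave height and velocity are correlated but distinct inputs.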

#### 2.4. Input and Output Variables

| Inputs | Parameter Description | Parameter Characterization |
|---|---|---|
| Input 1 | ${W}_{h}$ | Wave height |
| Input 2 | T | Velocity |

| Outputs | Parameter Description | Parameter Characterization |
|---|---|---|
| Output 1 | FX | Uplift |
| Output 2 | FZ | Drag force |
| Output 3 | MY | Overturning moment |

#### 2.5. Extreme Learning Machine

#### 2.5.1. Single Hidden Layer Feed-Forward Neural Network

${a}_{i}$ and ${b}_{i}$ represent the learning parameters of the hidden nodes, and ${\beta}_{i}$ represents the weight connecting the ith hidden node to the output node. $G\left({a}_{i},{b}_{i},x\right)$ is the output value of the ith hidden node with respect to the input x. For an additive hidden node with the activation function $g\left(x\right):R\to R$ (e.g., sigmoid or threshold), $G\left({a}_{i},{b}_{i},x\right)$ is as per [27]:
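The referenced equation did not survive extraction; the standard additive-node form from Huang et al. [27], consistent with the surrounding notation, is:

```latex
G(a_i, b_i, x) = g(a_i \cdot x + b_i), \qquad a_i \in R^n, \; b_i \in R
```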

where ${a}_{i}$ denotes the weight vector connecting the input layer to the ith hidden node, and ${b}_{i}$ represents the bias of the ith hidden node; ${a}_{i}\cdot x$ represents the inner product of the vectors ${a}_{i}$ and x in ${R}^{n}$. In addition, $G\left({a}_{i},{b}_{i},x\right)$ may be obtained for an RBF hidden node with the activation function $g\left(x\right):R\to R$ (e.g., Gaussian), as per Huang et al. [27]:
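The referenced equation did not survive extraction; the standard RBF-node form from Huang et al. [27], consistent with the surrounding notation, is:

```latex
G(a_i, b_i, x) = g\left(b_i \left\| x - a_i \right\|\right), \qquad a_i \in R^n, \; b_i \in R^+
```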

where ${a}_{i}$ and ${b}_{i}$ represent the centre and impact factor of the ith RBF node, respectively.

${R}^{+}$ indicates the set of all positive real values. The RBF network is a particular case of the SLFN with RBF nodes in its hidden layer. For N arbitrary distinct samples $\left({x}_{i},{t}_{i}\right)\in {R}^{n}\times {R}^{m}$, ${x}_{i}$ is an n × 1 input vector and ${t}_{i}$ is an m × 1 target vector. If an SLFN with L hidden nodes can estimate the N samples with zero error, then there exist ${\beta}_{i}$, ${a}_{i}$ and ${b}_{i}$ such that [27]:
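The referenced equation did not survive extraction; the standard zero-error interpolation condition from Huang et al. [27], consistent with the surrounding notation, is:

```latex
f_L(x_j) = \sum_{i=1}^{L} \beta_i \, G(a_i, b_i, x_j) = t_j, \qquad j = 1, \dots, N
```

In matrix form this is written $H\beta = T$, where H is the hidden-layer output matrix; the ELM solves for $\beta$ through the Moore-Penrose generalized inverse of H [27].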

#### 2.5.2. Concept of ELM

${a}_{i}$ and ${b}_{i}$, the hidden-node parameters of the ELM, need not be tuned throughout learning and may conveniently be assigned random values. Liang et al. [31] presented this in the following theorems:

**Theorem 1.**

**Theorem 2.**
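The training procedure these theorems justify is short: assign random hidden-layer weights and biases, compute the hidden-layer output matrix H, and solve for the output weights analytically with the Moore-Penrose pseudoinverse. A minimal NumPy sketch (illustrative only; the variable names and the synthetic target are ours, not the paper's):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def elm_train(X, T, n_hidden, seed=None):
    """Train a sigmoid additive-node ELM: random (a_i, b_i), analytic beta."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    A = rng.uniform(-1.0, 1.0, size=(n_features, n_hidden))  # input weights a_i
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                # biases b_i
    H = sigmoid(X @ A + b)                                   # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                             # Moore-Penrose solution of H beta = T
    return A, b, beta

def elm_predict(X, A, b, beta):
    return sigmoid(X @ A + b) @ beta

# Toy usage: two inputs (as in Section 2.4) -> one output
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(100, 2))
T = (X[:, 0] * 3.0 + np.sin(X[:, 1]))[:, None]   # synthetic target, not the flume data
A, b, beta = elm_train(X, T, n_hidden=10, seed=0)
rmse = np.sqrt(np.mean((elm_predict(X, A, b, beta) - T) ** 2))
```

Because no gradient iterations are needed, training reduces to one matrix factorization, which is why ELM training times are much shorter than back propagation (Section 4).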

#### 2.6. Artificial Neural Networks

where ${w}_{ij}$ and ${y}_{j}$ represent the weight and the threshold between the input layer and the hidden layer, respectively, and ${w}_{jk}$ and ${y}_{k}$ those between the hidden layer and the output layer; the outputs of each neuron in the hidden layer and in the output layer are as indicated in Equations (9) and (10):

**ANN Parameters**

| Learning Rate | Momentum | Hidden Nodes | Number of Iterations | Activation Function |
|---|---|---|---|---|
| 0.2 | 0.1 | 3, 6, 10 | 1000 | Continuous log-sigmoid function |
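A generic single-hidden-layer back-propagation network using the parameter values above (learning rate 0.2, momentum 0.1, log-sigmoid activation, 1000 iterations) can be sketched as follows. This is an illustrative implementation on toy data, not the authors' code:

```python
import numpy as np

def logsig(z):
    return 1.0 / (1.0 + np.exp(-z))

class BackpropNet:
    """2-input, single-hidden-layer MLP trained by gradient descent with momentum."""

    def __init__(self, n_in=2, n_hidden=3, n_out=1, lr=0.2, momentum=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.uniform(-0.5, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.uniform(-0.5, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr, self.momentum = lr, momentum
        self.vW1 = np.zeros_like(self.W1)
        self.vW2 = np.zeros_like(self.W2)

    def forward(self, X):
        self.h = logsig(X @ self.W1 + self.b1)     # hidden-layer activations
        return logsig(self.h @ self.W2 + self.b2)  # output-layer activations

    def train_step(self, X, T):
        Y = self.forward(X)
        dY = (Y - T) * Y * (1 - Y)                 # output delta (log-sigmoid derivative)
        dH = (dY @ self.W2.T) * self.h * (1 - self.h)
        # Momentum-smoothed weight updates
        self.vW2 = self.momentum * self.vW2 - self.lr * self.h.T @ dY / len(X)
        self.vW1 = self.momentum * self.vW1 - self.lr * X.T @ dH / len(X)
        self.W2 += self.vW2
        self.W1 += self.vW1
        self.b2 -= self.lr * dY.mean(axis=0)
        self.b1 -= self.lr * dH.mean(axis=0)
        return float(np.mean((Y - T) ** 2))

# 1000 iterations, as in the table; toy logical-OR target
net = BackpropNet(n_hidden=3)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [1.]])
errs = [net.train_step(X, T) for _ in range(1000)]
```

Unlike the ELM above, every weight here is adjusted iteratively, which is the main source of the longer training times reported in Section 4.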

#### 2.7. Genetic Programming

| Parameter | Value |
|---|---|
| Population size | 512 |
| Function set | $+,-,\ast ,/,\surd ,{x}^{2},\mathrm{ln}\left(x\right),{e}^{x},{a}^{x}$ |
| Head size | 5–9 |
| Chromosomes | 20–30 |
| Linking functions | Addition, subtraction, arithmetic, trigonometric, multiplication |
| Number of genes | 2–3 |
| Mutation rate | 92 |
| One-point recombination rate | 0.2 |
| Two-point recombination rate | 0.2 |
| Homologue crossover rate | 99 |
| Crossover rate | 31 |
| Fitness function error type | RMSE |
| Inversion rate | 109 |
| Gene transposition rate | 0.1 |
| Gene recombination rate | 0.1 |
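The full evolutionary machinery is involved, but the core ingredients from the table above, a function set, an RMSE fitness measure, and a population of 512 candidate expressions, can be sketched briefly. This is a generic tree-based GP illustration with protected operators and toy data, not the gene-expression-programming tool the authors used:

```python
import math
import random

# Function set mirroring the GP parameter table (protected variants where needed)
FUNCS = {
    '+': lambda a, b: a + b,
    '-': lambda a, b: a - b,
    '*': lambda a, b: a * b,
    '/': lambda a, b: a / b if abs(b) > 1e-9 else 1.0,  # protected division
    'sqrt': lambda a: math.sqrt(abs(a)),
    'sq': lambda a: a * a,
    'ln': lambda a: math.log(abs(a)) if abs(a) > 1e-9 else 0.0,
    'exp': lambda a: math.exp(min(a, 50.0)),            # clipped to avoid overflow
}
ARITY = {'+': 2, '-': 2, '*': 2, '/': 2, 'sqrt': 1, 'sq': 1, 'ln': 1, 'exp': 1}
TERMINALS = ['x0', 'x1']  # two inputs: wave height and velocity

def random_tree(depth, rng):
    """Grow a random expression tree of at most the given depth."""
    if depth == 0 or rng.random() < 0.3:
        return rng.choice(TERMINALS)
    f = rng.choice(list(FUNCS))
    return (f,) + tuple(random_tree(depth - 1, rng) for _ in range(ARITY[f]))

def evaluate(tree, x):
    """Recursively evaluate a tree on the input vector x."""
    if isinstance(tree, str):
        return x[int(tree[1])]
    f, *args = tree
    return FUNCS[f](*(evaluate(a, x) for a in args))

def rmse_fitness(tree, samples, targets):
    """RMSE fitness, the table's 'Fitness function error type'."""
    errs = []
    for x, t in zip(samples, targets):
        d = evaluate(tree, x) - t
        errs.append(d * d)  # d * d overflows to inf instead of raising, unlike d ** 2
    mse = sum(errs) / len(errs)
    return math.sqrt(mse) if math.isfinite(mse) else float('inf')

rng = random.Random(0)
pop = [random_tree(4, rng) for _ in range(512)]   # population size from the table
samples = [(0.2, 1.5), (0.3, 2.0), (0.4, 2.5)]   # illustrative (height, velocity) pairs
targets = [10.0, 20.0, 35.0]                     # illustrative force values
best = min(pop, key=lambda t: rmse_fitness(t, samples, targets))
```

A real run would then apply the table's mutation, recombination, transposition and crossover operators to the population over many generations; only the selection-by-RMSE step is shown here.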

## 3. Results and Discussion

#### 3.1. Evaluating Accuracy of Proposed Models

The accuracy of the proposed models was evaluated using the root-mean-square error (RMSE), the coefficient of determination (${R}^{2}$) and the Pearson coefficient (r). In these statistics, ${O}_{i}$ and ${P}_{i}$ are the experimental and forecast values of the tsunami bore forces on a coastal bridge, respectively, and n is the total number of test data. $\overline{{\text{P}}_{\text{i}}}$ and $\overline{{\text{O}}_{\text{i}}}$ represent the average values of ${P}_{i}$ and ${O}_{i}$.
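These three statistics can be written directly from their standard definitions. One caveat: ${R}^{2}$ is computed below as $1 - SS_{res}/SS_{tot}$; some authors instead report the square of r, and the paper does not show which convention it uses:

```python
import math

def rmse(obs, pred):
    """Root-mean-square error: sqrt(sum((O_i - P_i)^2) / n)."""
    n = len(obs)
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / n)

def pearson_r(obs, pred):
    """Pearson coefficient r between observed O_i and predicted P_i."""
    n = len(obs)
    mo, mp = sum(obs) / n, sum(pred) / n
    num = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    den = math.sqrt(sum((o - mo) ** 2 for o in obs) *
                    sum((p - mp) ** 2 for p in pred))
    return num / den

def r_squared(obs, pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mo = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mo) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot
```

Perfect predictions give RMSE = 0 and ${R}^{2}$ = r = 1; lower RMSE and higher ${R}^{2}$ and r therefore indicate a better model, as used in Section 4.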

#### 3.2. Architecture of Soft Computing Models

| ELM Parameter | Value | ANN Parameter | Value | GP Parameter | Value |
|---|---|---|---|---|---|
| Number of layers | 3 | Number of layers | 3 | - | - |
| Neurons | Input: 2; Hidden: 3, 6, 10; Output: 1 | Neurons | Input: 2; Hidden: 3, 6, 10; Output: 1 | Neurons | Output: 1 |
| - | - | Number of iterations | 1000 | Population size | 512 |
| Activation function | Sigmoid function | Activation function | Sigmoid function | Function set | $+,-,\times ,\div ,\surd ,{x}^{2},\mathrm{ln},{e}^{x},{a}^{x}$ |
| Learning rule | ELM for SLFNs | Learning rule | Back propagation | Head size | 5–9 |
| - | - | - | - | Chromosomes | 20–30 |
| - | - | - | - | Number of genes | 2–3 |
| - | - | - | - | Mutation rate | 92 |
| - | - | - | - | Crossover rate | 31 |
| - | - | - | - | Inversion rate | 109 |

#### 3.3. Performance Evaluation of Proposed ELM Model

**Figure 6.** Scatter plots of actual and predicted values of tsunami bore force FX on a coastal bridge using the (**a**) ELM, (**b**) GP and (**c**) ANN methods.

**Figure 7.** Scatter plots of actual and predicted values of tsunami bore force FZ on a coastal bridge using the (**a**) ELM, (**b**) GP and (**c**) ANN methods.

**Figure 8.** Scatter plots of actual and predicted values of tsunami bore moment MY on a coastal bridge using the (**a**) ELM, (**b**) GP and (**c**) ANN methods.

## 4. Performance Comparison of ELM, ANN and GP

To compare the models, the statistical indicators RMSE, ${R}^{2}$ and r were applied. Table 6 and Table 7 summarize the prediction accuracy for the test data sets, since training error is not a credible indicator of the prediction potential of a model. These results are averages over many repetitions and iterations performed to find the optimal structures of the models. The average computation time for the ELM modelling was around 310 s using a PC with an Intel Core Duo CPU E7600 @ 3.06 GHz and 2 GB RAM. The average computation times for the ANN and GP modelling on the same PC were 424 and 446 s, respectively. For the ELM modelling, MATLAB software was used.

**Table 6.** Comparative performance statistics of the ELM, ANN and GP predictive models for tsunami bore force FX on a coastal bridge.

| Model | RMSE | ${R}^{2}$ | r |
|---|---|---|---|
| ELM | 19.32297 | 0.8612 | 0.92799 |
| ANN | 36.39317 | 0.6173 | 0.78570 |
| GP | 36.13951 | 0.6107 | 0.78150 |

**Table 7.** Comparative performance statistics of the ELM, ANN and GP predictive models for tsunami bore force FZ on a coastal bridge.

| Model | RMSE | ${R}^{2}$ | r |
|---|---|---|---|
| ELM | 17.00278 | 0.8589 | 0.92676 |
| ANN | 26.92342 | 0.7113 | 0.84336 |
| GP | 28.85849 | 0.7333 | 0.85631 |

**Table 8.** Comparative performance statistics of the ELM, ANN and GP predictive models for tsunami bore moment MY on a coastal bridge.

| Model | RMSE | ${R}^{2}$ | r |
|---|---|---|---|
| ELM | 2.53381 | 0.9807 | 0.99030 |
| ANN | 7.52946 | 0.8674 | 0.93532 |
| GP | 6.81423 | 0.8748 | 0.93132 |

## 5. Conclusions

The performance measures RMSE, r and ${R}^{2}$ indicate that the ELM predictions are superior to those of GP and ANN. Additionally, the results reveal the robustness of the method.

## Acknowledgments

## Author Contributions

## Conflicts of Interest

## References

- Mazinani, I.; Ismail, Z.; Hashim, A.M.; Saba, A. Experimental investigation on tsunami acting on bridges. Int. J. Civ. Archit. Struct. Constr. Eng.
**2014**, 8, 1040–1043. [Google Scholar] - Laituri, M.; Kodrich, K. On line disaster response community: People as sensors of high magnitude disasters using internet GIS. Sensors
**2008**, 8, 3037–3055. [Google Scholar] [CrossRef] - Kataoka, S.; Kusakabe, T.; Nagaya, K. Wave forces acting on bridge girders struck by tsunami. In Proceedings of the 12th Japan Earthquake Engineering Symposium, Tokyo, Japan, 1–2 March 2006; pp. 154–157.
- Iemura, H.; Pradono, M.; Yasuda, T.; Tada, T. Experiments of tsunami force acting on bridge models. J. Earthq. Eng.
**2007**, 29, 902–911. [Google Scholar] - Shoji, G.; Mori, Y. Hydraulic model experiment to simulate the damage of a bridge deck subjected to tsunamis. Annu. J. Coast. Eng.
**2006**, 53, 801–805. [Google Scholar] - Sugimoto, T.; Unjoh, S. Hydraulic model tests on the bridge structures damaged by tsunami and tidal wave. In Proceedings of the 23rd US-Japan Bridge Engineering Workshop, Tsukuba, Japan, 5–7 November 2007; pp. 1–10.
- Araki, S.; Ishino, K.; Deguchi, I. Stability of girder bridge against tsunami fluid force. In Proceedings of the 32nd International Conference on Coastal Engineering (ICCE), Shanghai, China, 30 June–5 July 2010; p. 2.
- Thusyanthan, I.; Martinez, E. Model study of tsunami wave loading on bridges. In Proceedings of the Eighteenth International Offshore and Polar Engineering, Vancouver, BC, Canada, 6–11 July 2008; pp. 528–535.
- Shoji, G.; Hiraki, Y.; Fujima, K.; Shigihara, Y. Evaluation of tsunami fluid force acting on a bridge deck subjected to breaker bores. Proced. Eng.
**2011**, 14, 1079–1088. [Google Scholar] [CrossRef] - Lukkunaprasit, P.; Lau, T.L. Influence of bridge deck on tsunami loading on inland bridge piers. IES J. Part A
**2011**, 4, 115–121. [Google Scholar] [CrossRef] - Lau, T.L. Tsunami force estimation on inland bridges considering complete pier-deck configurations. Ph.D. Thesis, Chulalongkorn University, Bangkok, Thailand, 2009. [Google Scholar]
- Hayatdavoodi, M.; Seiffert, B.; Ertekin, R.C. Experiments and computations of solitary-wave forces on a coastal-bridge deck. Part ii: Deck with girders. Coast. Eng.
**2014**, 88, 210–228. [Google Scholar] [CrossRef] - Mazinani, I.; Ismail, Z.; Hashim, A.M. An overview of tsunami wave force on coastal bridge and open challenges. J. Earthq. Tsunami
**2015**, 9, 1550006. [Google Scholar] [CrossRef] - Holzinger, A. Trends in interactive knowledge discovery for personalized medicine: Cognitive science meets machine learning. Intell. Inform. Bull
**2014**, 15, 6–14. [Google Scholar] - Holzinger, A.; Blanchard, D.; Bloice, M.; Holzinger, K.; Palade, V.; Rabadan, R. Darwin, lamarck, or baldwin: Applying evolutionary algorithms to machine learning techniques. In Proceedings of the 2014 IEEE/WIC/ACM International Joint Conferences on Web Intelligence (WI) and Intelligent Agent Technologies (IAT), Warsaw, Poland, 11–14 August 2014; pp. 449–453.
- Holzinger, A.; Jurisica, I. Knowledge Discovery and Data Mining in Biomedical Informatics: The Future is in Integrative, Interactive Machine Learning Solutions. In Interactive Knowledge Discovery and Data Mining in Biomedical Informatics; Springer-Verlag: Berlin/Heidelberg, Germany, 2014; pp. 1–18. [Google Scholar]
- Huang, G.-B.; Zhu, Q.-Y.; Siew, C.-K. Extreme learning machine: A new learning scheme of feedforward neural networks. In Proceedings of the 2004 IEEE International Joint Conference on Neural Networks, Budapest, Hungary, 25–29 July 2004; pp. 985–990.
- Yu, Q.; Miche, Y.; Séverin, E.; Lendasse, A. Bankruptcy prediction using extreme learning machine and financial expertise. Neurocomputing
**2014**, 128, 296–302. [Google Scholar] [CrossRef] - Wang, X.; Han, M. Online sequential extreme learning machine with kernels for nonstationary time series prediction. Neurocomputing
**2014**, 145, 90–97. [Google Scholar] [CrossRef] - Ghouti, L.; Sheltami, T.R.; Alutaibi, K.S. Mobility prediction in mobile ad hoc networks using extreme learning machines. Proced. Comput. Sci.
**2013**, 19, 305–312. [Google Scholar] [CrossRef] - Wang, D.D.; Wang, R.; Yan, H. Fast prediction of protein–protein interaction sites based on extreme learning machines. Neurocomputing
**2014**, 128, 258–266. [Google Scholar] [CrossRef] - Nian, R.; He, B.; Zheng, B.; van Heeswijk, M.; Yu, Q.; Miche, Y.; Lendasse, A. Extreme learning machine towards dynamic model hypothesis in fish ethology research. Neurocomputing
**2014**, 128, 273–284. [Google Scholar] [CrossRef] - Wong, P.K.; Wong, K.I.; Vong, C.M.; Cheung, C.S. Modeling and optimization of biodiesel engine performance using kernel-based extreme learning machine and cuckoo search. Renew. Energy
**2015**, 74, 640–647. [Google Scholar] [CrossRef] - Zou, H.; Lu, X.; Jiang, H.; Xie, L. A fast and precise indoor localization algorithm based on an online sequential extreme learning machine. Sensors
**2015**, 15, 1804–1824. [Google Scholar] [CrossRef] [PubMed] - Zhou, G.; Zhao, Y.; Guo, F.; Xu, W. A smart high accuracy silicon piezoresistive pressure sensor temperature compensation system. Sensors
**2014**, 14, 12174–12190. [Google Scholar] [CrossRef] [PubMed] - Mansourvar, M.; Shamshirband, S.; Raj, R.G.; Gunalan, R.; Mazinani, I. An automated system for skeletal maturity assessment by extreme learning machines. PLoS ONE
**2015**, 10, e0138493. [Google Scholar] [CrossRef] [PubMed] - Huang, G.-B.; Zhu, Q.-Y.; Siew, C.-K. Extreme learning machine: Theory and applications. Neurocomputing
**2006**, 70, 489–501. [Google Scholar] [CrossRef] - Annema, A.; Hoen, K.; Wallinga, H. Precision requirements for single-layer feedforward neural networks. In Proceedings of the Fourth International Conference on Microelectronics for Neural Networks and Fuzzy Systems, Turin, Italy, 26–28 September 1994; pp. 145–151.
- Huang, S.; Li, C. Distributed extreme learning machine for nonlinear learning over network. Entropy
**2015**, 17, 818–840. [Google Scholar] [CrossRef] - Huang, G.-B.; Chen, L.; Siew, C.-K. Universal approximation using incremental constructive feedforward networks with random hidden nodes. IEEE Trans. Neural Netw.
**2006**, 17, 879–892. [Google Scholar] [CrossRef] [PubMed] - Liang, N.-Y.; Huang, G.-B.; Saratchandran, P.; Sundararajan, N. A fast and accurate online sequential learning algorithm for feedforward networks. IEEE Trans. Neural Netw.
**2006**, 17, 1411–1423. [Google Scholar] [CrossRef] [PubMed] - Singh, R.; Balasundaram, S. Application of extreme learning machine method for time series analysis. Int. J. Intell. Technol.
**2007**, 2, 256–262. [Google Scholar] - Karunanithi, N.; Grenney, W.J.; Whitley, D.; Bovee, K. Neural networks for river flow prediction. J. Comput. Civ. Eng.
**1994**, 8, 201–220. [Google Scholar] [CrossRef] - Govindaraju, R.S. Artificial neural networks in hydrology. I: Preliminary concepts. J. Hydrol. Eng.
**2000**, 5, 115–123. [Google Scholar] - Govindaraju, R.S.; Rao, A.R. Artificial Neural Networks in Hydrology; Springer Netherlands: Amsterdam, The Netherlands, 2010. [Google Scholar]
- Gaur, S.; Deo, M. Real-time wave forecasting using genetic programming. Ocean Eng.
**2008**, 35, 1166–1172. [Google Scholar] [CrossRef] - Koza, J.R. Genetic Programming: On the Programming of Computers by Means of Natural Selection; MIT Press: Cambridge, MA, USA, 1992; Volume 1. [Google Scholar]
- Babovic, V.; Keijzer, M. Rainfall-runoff modeling based on genetic programming. Encycl. Hydrol. Sci.
**2006**. [Google Scholar] [CrossRef] - Khu, S.T.; Liong, S.Y.; Babovic, V.; Madsen, H.; Muttil, N. Genetic programming and its application in real-time runoff forecasting. J. Am. Water Resour. Assoc.
**2001**, 37, 439–451. [Google Scholar] [CrossRef]

© 2016 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Mazinani, I.; Ismail, Z.B.; Shamshirband, S.; Hashim, A.M.; Mansourvar, M.; Zalnezhad, E.
Estimation of Tsunami Bore Forces on a Coastal Bridge Using an Extreme Learning Machine. *Entropy* **2016**, *18*, 167.
https://doi.org/10.3390/e18050167
