Searching for the Best Artificial Neural Network Architecture to Estimate Column and Beam Element Dimensions
Abstract
1. Introduction
2. Materials and Methods
2.1. Harmony Search Algorithm
2.2. Genetic Algorithm
2.3. Design Parameters for Tubular Column Optimization
2.4. Design Parameters for I-Beam Optimization
2.5. Artificial Neural Networks
2.6. Neural Architecture Search and HyperNetExplorer
Hyperparameter Optimization Algorithms
2.7. Machine Learning
2.8. Model Metrics
2.9. Model Evaluation
2.10. Reliability Analysis
3. Numerical Examples
3.1. I-Section Beam Example
3.2. Tubular Column Example
3.3. NAS Algorithm Comparison
3.4. Reliability of the Tool: HyperNet Explorer
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Name | Equation |
---|---|
Sigmoid | σ(x) = 1 / (1 + e^(−x)) |
Tanh | tanh(x) = (e^x − e^(−x)) / (e^x + e^(−x)) |
Mish | mish(x) = x · tanh(ln(1 + e^x)) |
ReLU | relu(x) = max(0, x) |
Leaky ReLU | x for x ≥ 0; αx for x < 0 (small slope α, e.g., 0.01) |
ELU | x for x ≥ 0; α(e^x − 1) for x < 0 |
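The activation functions listed above can be written out directly. The sketch below is a minimal pure-Python version for reference (the study itself uses the PyTorch implementations); the slopes α = 0.01 for Leaky ReLU and α = 1.0 for ELU are common defaults, not values taken from the paper.

```python
import math

def sigmoid(x):
    # 1 / (1 + e^(-x)), squashes any real input into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def log_sigmoid(x):
    # log of the sigmoid, numerically fine for moderate x
    return math.log(sigmoid(x))

def relu(x):
    # zero for negative inputs, identity otherwise
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # small negative slope avoids "dead" neurons
    return x if x >= 0 else alpha * x

def elu(x, alpha=1.0):
    # exponential for negative inputs, continuous at 0
    return x if x >= 0 else alpha * (math.exp(x) - 1.0)

def mish(x):
    # x * tanh(softplus(x)), with softplus(x) = ln(1 + e^x)
    return x * math.tanh(math.log1p(math.exp(x)))
```

Tanh is available directly as `math.tanh`; note that Mish and ELU are both zero at the origin, while Leaky ReLU keeps a nonzero gradient everywhere.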
Parameter Name | Lower Bound | Upper Bound | Options |
---|---|---|---|
Number of Hidden Layers (HLs) | 0 | 2 | 0: one HL; 1: two HLs; 2: three HLs |
Number of Neurons in HL 1 | 0 | 6 | 0: 8; 1: 16; 2: 32; 3: 64; 4: 128; 5: 256; 6: 512 |
Number of Neurons in HL 2 | 0 | 6 | 0: 8; 1: 16; 2: 32; 3: 64; 4: 128; 5: 256; 6: 512 |
Number of Neurons in HL 3 | 0 | 6 | 0: 8; 1: 16; 2: 32; 3: 64; 4: 128; 5: 256; 6: 512 |
Activation Function of HL 1 | 0 | 6 | 0: LeakyReLU; 1: Sigmoid; 2: Tanh; 3: ReLU; 4: LogSigmoid; 5: ELU; 6: Mish |
Activation Function of HL 2 | 0 | 6 | 0: LeakyReLU; 1: Sigmoid; 2: Tanh; 3: ReLU; 4: LogSigmoid; 5: ELU; 6: Mish |
Activation Function of HL 3 | 0 | 6 | 0: LeakyReLU; 1: Sigmoid; 2: Tanh; 3: ReLU; 4: LogSigmoid; 5: ELU; 6: Mish |
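Under the encoding in the table above, each candidate architecture is a seven-gene integer vector. The sketch below decodes such a vector into a hidden-layer specification; the function and variable names are illustrative, not taken from HyperNetExplorer.

```python
# Option tables exactly as defined in the search-space table.
NEURON_OPTIONS = [8, 16, 32, 64, 128, 256, 512]
ACTIVATION_OPTIONS = ["LeakyReLU", "Sigmoid", "Tanh", "ReLU",
                      "LogSigmoid", "ELU", "Mish"]

def decode_design(vector):
    """Map a 7-gene integer vector to a list of hidden-layer specs.

    vector[0]   -> number of hidden layers (0 -> 1, 1 -> 2, 2 -> 3)
    vector[1:4] -> neuron-count index for HL 1..3
    vector[4:7] -> activation-function index for HL 1..3
    Genes belonging to unused layers are simply ignored.
    """
    n_layers = vector[0] + 1
    layers = []
    for i in range(n_layers):
        layers.append({
            "neurons": NEURON_OPTIONS[vector[1 + i]],
            "activation": ACTIVATION_OPTIONS[vector[4 + i]],
        })
    return layers
```

For example, `decode_design([1, 2, 4, 0, 3, 6, 0])` describes a two-hidden-layer network: 32 neurons with ReLU, then 128 neurons with Mish.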
Table: confusion matrix layout, with the true label (Class 1–4) in rows and the predicted label (Class 1–4) in columns.
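The precision, recall, and F1 scores reported in the results tables follow directly from such a confusion matrix. A dependency-free, macro-averaged sketch (the study's exact averaging scheme is not restated here, so treat this as illustrative):

```python
def macro_metrics(cm):
    """Macro-averaged (precision, recall, F1) from a square confusion
    matrix cm, where cm[i][j] counts true class i predicted as class j."""
    k = len(cm)
    precisions, recalls, f1s = [], [], []
    for c in range(k):
        tp = cm[c][c]
        fp = sum(cm[r][c] for r in range(k)) - tp   # column total minus diagonal
        fn = sum(cm[c]) - tp                        # row total minus diagonal
        p = tp / (tp + fp) if tp + fp else 0.0
        r = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * p * r / (p + r) if p + r else 0.0
        precisions.append(p)
        recalls.append(r)
        f1s.append(f1)
    return sum(precisions) / k, sum(recalls) / k, sum(f1s) / k
```

Macro averaging weights each class equally, which is the usual choice when class counts are imbalanced, as they are in the column-section classes later in the paper.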
Symbol | Definition | Value |
---|---|---|
Pn | Population number | 15 |
mt | Maximum iteration number | 500,000 |
HMCR | Harmony memory considering rate | 0.5 |
FW | Fret width | 0.02 |
L | Beam length (cm) | 200 |
E | Modulus of elasticity (kN/cm²) | 20,000 |
 | Horizontal load (kN) | 1–80 |
P | Vertical load (kN) | 100–800 |
 | Moment stress (kN/cm²) | 6 |
 | Minimum beam section width (cm) | 10 |
 | Minimum beam section height (cm) | 10 |
 | Minimum beam web thickness (cm) | 0.9 |
 | Minimum beam flange thickness (cm) | 0.9 |
 | Maximum beam section width (cm) | 50 |
 | Maximum beam section height (cm) | 80 |
 | Maximum beam web thickness (cm) | 5 |
 | Maximum beam flange thickness (cm) | 5 |
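The HMCR and fret width (FW) values in the table plug into the standard harmony search loop. Below is a compact sketch on a toy objective; the pitch-adjusting rate (PAR) is not listed in the table, so the value used here is an assumption, and the sphere function stands in for the actual I-beam cost function.

```python
import random

def harmony_search(objective, bounds, pn=15, iters=3000,
                   hmcr=0.5, fw=0.02, par=0.3, seed=42):
    """Minimise `objective` over box `bounds` with a basic harmony search.

    pn   : harmony memory size (population number, 15 in the table)
    hmcr : probability of drawing a value from memory (0.5 in the table)
    fw   : fret width, pitch-adjustment step as a fraction of the range
    par  : pitch-adjusting rate (assumed; not given in the table)
    """
    rng = random.Random(seed)
    memory = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pn)]
    scores = [objective(h) for h in memory]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:
                x = memory[rng.randrange(pn)][d]      # memory consideration
                if rng.random() < par:                # pitch adjustment
                    x += rng.uniform(-1, 1) * fw * (hi - lo)
            else:
                x = rng.uniform(lo, hi)               # random selection
            new.append(min(hi, max(lo, x)))
        s = objective(new)
        worst = max(range(pn), key=scores.__getitem__)
        if s < scores[worst]:                         # replace worst harmony
            memory[worst], scores[worst] = new, s
    best = min(range(pn), key=scores.__getitem__)
    return memory[best], scores[best]
```

Usage on a 2-D sphere function: `x, fx = harmony_search(lambda v: sum(t * t for t in v), [(-5, 5)] * 2)` drives `fx` toward zero; swapping in the constrained I-beam objective and its design bounds recovers the setup of the table.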
Vertical Load (kN) | Horizontal Load (kN) | Height (cm) | Width (cm) | Web Thickness (Class) | Flange Thickness (Class) | Fx (Objective) |
---|---|---|---|---|---|---|
140 | 5 | 66.07 | 10.24 | thick | thick | 0.008484 |
150 | 3 | 65.55 | 32.22 | thick | thin | 0.006583 |
160 | 43 | 69.99 | 31.14 | thin | thick | 0.005486 |
180 | 42 | 44.64 | 37.71 | thick | thick | 0.014960 |
210 | 3 | 80 | 34.80 | thin | thick | 0.004698 |
350 | 57 | 80 | 50 | thin | thin | 0.008308 |
450 | 12 | 51.75 | 32.29 | thick | thin | 0.033294 |
570 | 61 | 53.96 | 32.95 | thick | thin | 0.045206 |
680 | 27 | 61.06 | 28.75 | thick | thin | 0.047454 |
790 | 48 | 80 | 50 | thin | thin | 0.022872 |
 | Vertical Load | Horizontal Load | Height | Width | Objective Function | Beam Web Thickness | Beam Flange Thickness |
---|---|---|---|---|---|---|---|
Vertical load | 1 | −0.02 | −0.01 | 0.02 | 0.01 | 0.70 | −0.71 |
Horizontal load | −0.02 | 1 | −0.01 | 0.02 | 0.02 | 0.35 | −0.35 |
Height | −0.01 | −0.01 | 1 | 0.73 | −0.45 | −0.22 | −0.03 |
Width | 0.02 | 0.02 | 0.73 | 1 | −0.19 | −0.20 | −0.06 |
Objective function | 0.01 | 0.02 | −0.45 | −0.19 | 1 | 0.08 | 0.03 |
Beam web thickness | 0.70 | 0.35 | −0.22 | −0.20 | 0.08 | 1 | −0.96 |
Beam flange thickness | −0.71 | −0.35 | −0.03 | −0.06 | 0.03 | −0.96 | 1 |
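The coefficients in the table above are standard Pearson correlations. For reproducibility, a dependency-free sketch (the generated dataset itself is not reproduced here):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

Applied column-wise to the I-beam dataset, this yields entries such as the strong negative correlation (−0.96) between web and flange thickness reported in the table.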
Algorithm | Precision | Recall | F1 Score | Max Accuracy | Mean Accuracy |
---|---|---|---|---|---|
Logistic Regression | 0.9467 | 0.5648 | 0.5887 | 0.9203 | 0.9057 |
Linear Discriminant Analysis | 0.9041 | 0.5662 | 0.5675 | 0.9782 | 0.9068 |
K-Nearest Neighbors | 0.9228 | 0.6676 | 0.7264 | 0.9401 | 0.9255 |
AdaBoost | 0.9588 | 0.8403 | 0.8874 | 0.9692 | 0.9625 |
Random Forest | 0.9873 | 0.9729 | 0.9796 | 0.9982 | 0.9918 |
Decision Tree | 0.9809 | 0.9823 | 0.9837 | 0.9982 | 0.9929 |
CatBoost | 0.9878 | 0.9852 | 0.9864 | 0.9982 | 0.9949 |
Bagging | 0.9846 | 0.9847 | 0.9839 | 1.0000 | 0.9953
Algorithm | Precision | Recall | F1 Score | Max Accuracy | Mean Accuracy |
---|---|---|---|---|---|
Linear Discriminant Analysis | 0.9590 | 0.9815 | 0.9692 | 0.9837 | 0.9742 |
AdaBoost | 0.9787 | 0.9771 | 0.9778 | 0.9909 | 0.9819 |
K-Nearest Neighbors | 0.9748 | 0.9841 | 0.9793 | 0.9891 | 0.9829 |
Logistic Regression | 0.9817 | 0.9815 | 0.9816 | 0.9927 | 0.9849 |
CatBoost | 0.9827 | 0.9851 | 0.9838 | 0.9928 | 0.9868 |
Decision Tree | 0.9829 | 0.9856 | 0.9836 | 0.9946 | 0.9871 |
Bagging | 0.9820 | 0.9869 | 0.9846 | 0.9909 | 0.9873 |
Random Forest | 0.9841 | 0.9899 | 0.9865 | 0.9946 | 0.9897 |
Symbol | Definition | Value |
---|---|---|
Pn | Population number | 15 |
mt | Maximum iteration number | 10,000 |
HMCR | Harmony memory considering rate | 0.5 |
FW | Fret width | 0.02 |
L | Column length (cm) | 100–500 |
E | Modulus of elasticity (kgf/cm²) | 0.85 × 10⁶ |
P | External load (kgf) | 100–5000 |
 | Strain (kgf/cm²) | 100–500 |
 | Minimum section center diameter (cm) | 2 |
 | Minimum section thickness (cm) | 0.2 |
 | Maximum section center diameter (cm) | 14 |
 | Maximum section thickness (cm) | 0.9 |
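For context, the tubular column problem behind these parameters is usually posed as in Rao's Engineering Optimization: minimise a material-plus-fabrication cost in the mean diameter d and thickness t, subject to the induced axial stress staying below both the allowable stress and the Euler buckling stress. The sketch below uses that classic textbook formulation; the cost coefficients (9.82 and 2) come from the textbook version and may differ from the exact objective optimised in this study.

```python
import math

def tubular_column_cost(d, t):
    # Classic textbook objective: f(d, t) = 9.82 * d * t + 2 * d
    # (material cost plus a construction-cost term in the diameter)
    return 9.82 * d * t + 2.0 * d

def is_feasible(d, t, P, L, sigma_y, E=0.85e6):
    """Check both stress constraints for load P (kgf), length L (cm),
    allowable stress sigma_y (kgf/cm^2), and modulus E (kgf/cm^2)."""
    induced = P / (math.pi * d * t)                           # axial stress
    buckling = (math.pi ** 2) * E * (d ** 2 + t ** 2) / (8.0 * L ** 2)
    return induced <= sigma_y and induced <= buckling
```

A feasible design from the bounds in the table, e.g. d = 14 cm and t = 0.9 cm under a light load, passes both checks, while a thin section under P = 5000 kgf violates the stress limit.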
Strain (kgf/cm²) | Load (kgf) | Length (cm) | Thickness Class | Center Diameter Class | Fx (Objective) |
---|---|---|---|---|---|
100 | 200 | 300 | A | H | 12.0818 |
100 | 500 | 100 | F | H | 19.5972 |
200 | 2800 | 300 | G | J | 53.5751 |
200 | 4900 | 400 | E | M | 98.3072 |
300 | 700 | 300 | A | J | 18.0965 |
300 | 4000 | 400 | C | L | 60.0191 |
400 | 1900 | 100 | F | H | 18.8173 |
400 | 3300 | 300 | B | J | 37.4192 |
500 | 600 | 400 | A | J | 20.8277 |
500 | 4800 | 200 | E | H | 38.5650 |
 | Class | Number of Data |
---|---|---|
Column section thickness (t) classes | A | 438 |
B | 100 | |
C | 84 | |
D | 84 | |
E | 72 | |
F | 82 | |
G | 324 | |
Column center diameter (d) classes | H | 424 |
J | 258 | |
K | 302 | |
L | 125 | |
M | 56 | |
N | 19 |
 | Strain | External Load | Column Length | Objective Function | Column Section Thickness | Column Center Diameter |
---|---|---|---|---|---|---|
Strain | 1 | 0 | 0 | −0.33 | −0.49 | −0.19 |
External load | 0 | 1 | 0 | 0.30 | 0.46 | 0.47 |
Column Length | 0 | 0 | 1 | 0.01 | −0.39 | 0.51 |
Objective function | −0.33 | 0.30 | 0.01 | 1 | 0.02 | 0.15 |
Column section thickness | −0.49 | 0.46 | −0.39 | 0.02 | 1 | 0 |
Column center diameter | −0.19 | 0.47 | 0.51 | 0.15 | 0 | 1 |
Algorithm | Precision | Recall | F1 Score | Max Accuracy | Mean Accuracy |
---|---|---|---|---|---|
AdaBoost | 0.3951 | 0.3880 | 0.3555 | 0.6017 | 0.5135 |
Logistic Regression | 0.3179 | 0.3274 | 0.2954 | 0.7311 | 0.6360 |
Linear Discriminant Analysis | 0.3573 | 0.3965 | 0.3522 | 0.7227 | 0.6681 |
K-Nearest Neighbors | 0.4270 | 0.4154 | 0.4059 | 0.7311 | 0.6406 |
Decision Tree | 0.5765 | 0.5703 | 0.6083 | 0.7966 | 0.7424 |
Bagging | 0.6404 | 0.6045 | 0.6063 | 0.7983 | 0.7762 |
CatBoost | 0.6191 | 0.5859 | 0.5840 | 0.8151 | 0.7804 |
Random Forest | 0.6609 | 0.6254 | 0.6197 | 0.8151 | 0.7846 |
Algorithm | Precision | Recall | F1 Score | Max Accuracy | Mean Accuracy |
---|---|---|---|---|---|
Logistic Regression | 0.3549 | 0.3827 | 0.3373 | 0.5678 | 0.4865 |
AdaBoost | 0.4062 | 0.4090 | 0.3970 | 0.6807 | 0.6199 |
Linear Discriminant Analysis | 0.5083 | 0.5539 | 0.5012 | 0.7203 | 0.6563 |
K-Nearest Neighbors | 0.7505 | 0.6659 | 0.6824 | 0.8655 | 0.8209 |
Bagging | 0.7896 | 0.8079 | 0.8089 | 0.9580 | 0.8986 |
Decision Tree | 0.8073 | 0.7974 | 0.7810 | 0.9237 | 0.9029 |
Random Forest | 0.8486 | 0.8355 | 0.7989 | 0.9664 | 0.9130 |
CatBoost | 0.8197 | 0.8171 | 0.8079 | 0.9576 | 0.9172 |
Group | n | Mean | Standard Deviation | Median
---|---|---|---|---
GA1D | 1051 | 92.516 | 1.588 | 92.83
HS1D | 1051 | 92.576 | 2.171 | 92.83

Group | n | Mean | Standard Deviation | Median
---|---|---|---|---
GA1T | 1051 | 89.956 | 2.242 | 90.30
HS1T | 1051 | 89.072 | 2.765 | 89.45

Group | n | Mean | Standard Deviation | Median
---|---|---|---|---
GA1TF | 1051 | 98.618 | 0.135 | 98.64
HS1TF | 1051 | 98.590 | 0.852 | 98.64

Group | n | Mean | Standard Deviation | Median
---|---|---|---|---
GA1TW | 1059 | 99.855 | 0.481 | 99.91
HS1TW | 1051 | 99.901 | 0.244 | 99.91

Group | n | Mean | Standard Deviation | Median
---|---|---|---|---
HS1D | 1051 | 92.576 | 2.171 | 92.83
HS2D | 1051 | 92.205 | 2.287 | 92.83

Group | n | Mean | Standard Deviation | Median
---|---|---|---|---
HS1T | 1051 | 89.072 | 2.765 | 89.45
HS2T | 1051 | 89.251 | 2.551 | 89.45

Group | n | Mean | Standard Deviation | Median
---|---|---|---|---
GA1D | 1051 | 92.516 | 1.588 | 92.83
GA2D | 1051 | 92.607 | 1.403 | 92.83

Group | n | Mean | Standard Deviation | Median
---|---|---|---|---
GA1T | 1051 | 89.956 | 2.242 | 90.30
GA2T | 1051 | 89.943 | 2.575 | 90.30
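The group comparisons above rest on the Mann–Whitney U test. A rank-based sketch of the U statistic follows, with ties handled by average ranks; p-values would additionally need the normal approximation or an exact table, which is omitted here.

```python
def mann_whitney_u(a, b):
    """Return (U_a, U_b) for samples a and b, using average ranks for ties."""
    combined = sorted((v, src) for src, xs in ((0, a), (1, b)) for v in xs)
    n = len(combined)
    rank_sum_a = 0.0
    i = 0
    while i < n:
        j = i
        while j < n and combined[j][0] == combined[i][0]:
            j += 1                                  # group of tied values
        avg_rank = (i + 1 + j) / 2.0                # average of ranks i+1 .. j
        for k in range(i, j):
            if combined[k][1] == 0:                 # value came from sample a
                rank_sum_a += avg_rank
        i = j
    n_a, n_b = len(a), len(b)
    u_a = rank_sum_a - n_a * (n_a + 1) / 2.0
    return u_a, n_a * n_b - u_a                     # U_a + U_b = n_a * n_b
```

The test statistic is min(U_a, U_b); comparing, say, the GA1D and HS1D accuracy samples this way checks whether the two NAS drivers produce significantly different accuracy distributions.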
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Ocak, A.; Bekdaş, G.; Nigdeli, S.M.; Işıkdağ, U.; Geem, Z.W. Searching for the Best Artificial Neural Network Architecture to Estimate Column and Beam Element Dimensions. Information 2025, 16, 660. https://doi.org/10.3390/info16080660