Machine-Learning-Based Predictive Models for Compressive Strength, Flexural Strength, and Slump of Concrete
Abstract
1. Introduction
1.1. Constituent Materials and Mixture Design of Concrete
1.2. Artificial Intelligence Applied to Concrete Production
2. Materials and Methods
2.1. Business Understanding
2.1.1. Data Sources
2.1.2. Solution Design and Performance Metrics
2.2. Data Acquisition and Understanding
2.2.1. Statistical Description of the Data
2.2.2. Data Cleaning
2.3. Modeling
2.3.1. Feature Engineering
- Data Transformation
- Variables were grouped together because the components they represent are similar and only differ in brand or reference. The quantities of each grouped variable were summed to create a new single variable.
- All predictor variables were standardized to the production of one cubic meter of concrete. This was implemented by dividing each component quantity by the amount of concrete produced.
- For outlier treatment, quality rules established by concrete production experts were used.
- To facilitate model convergence, input variables were standardized using a standard scaler, and categorical variables were converted to numerical form using a one-hot encoder.
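The transformation steps above can be sketched with pandas and scikit-learn. The column names ("cement_kg", "volume_m3", "plant") are hypothetical, chosen only to illustrate the per-cubic-meter normalization followed by scaling and one-hot encoding:

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import StandardScaler, OneHotEncoder

# Two illustrative batches (hypothetical columns, not the study's schema).
df = pd.DataFrame({
    "cement_kg": [900.0, 2000.0],   # total quantity per batch
    "volume_m3": [3.0, 5.0],        # concrete produced per batch
    "plant": ["A", "B"],
})

# Standardize each component to the production of one cubic meter.
df["cement_kg_per_m3"] = df["cement_kg"] / df["volume_m3"]

# Scale numeric inputs; one-hot encode categorical ones.
preprocess = ColumnTransformer([
    ("num", StandardScaler(), ["cement_kg_per_m3"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["plant"]),
])
X = preprocess.fit_transform(df)
print(X.shape)  # (2, 3): 1 scaled numeric column + 2 one-hot columns
```

In a real pipeline, the `ColumnTransformer` would be fitted on the training split only, so that scaling statistics do not leak from the test data.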
- Feature Selection
2.3.2. Building of Predictive Models
- Regression Tree (RT): Regression trees divide the data into homogeneous subsets through recursive binary partitions. The most discriminative variable is selected first, as the root node, to split the dataset into branches, and the partitioning is repeated until the nodes are sufficiently homogeneous. In the resulting tree structure, the terminal nodes (called leaves) hold the predicted values, and the branches represent conjunctions of features leading to those predictions [20].
- K-Nearest Neighbors Regressor (KNN): This method predicts the value of a given data point as the average of the target values of its K nearest neighbors. The main advantage of KNN regression is that it is simple to apply and can perform well in practice, especially when there is a clear pattern in the data. However, it can be computationally expensive, since it requires calculating distances to the training points for each prediction [41].
- Multi-Layer Perceptron Neural Network Regressor (MLP): A computational model inspired by biological neural networks, featuring a layered organization of neurons where the input layer represents the predictor variables, the output layer represents the target variable, and the hidden layers perform nonlinear mappings of the data [42].
- Support Vector Machine Regressor (SVR): Fits a hyperplane to the data within a tolerance margin; the data points that define this margin are called support vectors. SVR is known for using the kernel trick to map the data into a higher-dimensional space in which a linear fit becomes possible [43].
- Random Forest Regressor (RF): Constructs a large number of decision trees at training time, and the final prediction is obtained by aggregating (for regression, averaging) the outputs of the individual trees. The basic idea is that a group of weak learners can be combined into a strong predictor [44].
- Gradient Boosting Regressor (GBoost): Gradient boosting is an ensemble of multiple models, typically decision trees. The fundamental idea is that each tree is trained on the errors (residuals) of all the preceding trees. The negative gradient of the loss function of the current model is used as an estimate of these errors, which guides the fitting of each new tree [27].
- Extreme Gradient Boosting Regressor (XGBoost): A decision-tree-based technique that accounts for data sparsity during approximate tree learning. It is a gradient boosting algorithm optimized through parallel processing, tree pruning, handling of missing values, and regularization to reduce overfitting and bias [45].
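The candidate regressors above can all be instantiated from scikit-learn (XGBoost lives in its own package). This is a minimal sketch with illustrative default settings, not the tuned hyperparameters reported later in the paper:

```python
from sklearn.tree import DecisionTreeRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor

# Candidate models keyed by the abbreviations used in the text.
models = {
    "RT": DecisionTreeRegressor(random_state=0),
    "KNN": KNeighborsRegressor(n_neighbors=5),
    "MLP": MLPRegressor(hidden_layer_sizes=(100,), max_iter=500, random_state=0),
    "SVR": SVR(kernel="rbf"),
    "RF": RandomForestRegressor(n_estimators=100, random_state=0),
    "GBoost": GradientBoostingRegressor(random_state=0),
}

# XGBoost requires the separate xgboost package:
# from xgboost import XGBRegressor
# models["XGBoost"] = XGBRegressor(random_state=0)
```

Each model would then be fitted on the preprocessed training data and compared under the same cross-validation protocol.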
2.3.3. Model Optimization
- Bayesian Optimization (BO): Builds a probabilistic surrogate model of the objective (typically a Gaussian process) and uses an acquisition function to select the most promising hyperparameter configuration to evaluate next.
- Genetic Algorithms (GA): Evolve a population of candidate hyperparameter configurations through selection, crossover, and mutation, keeping the fittest candidates across generations.
- Particle Swarm Optimization (PSO): Moves a swarm of candidate solutions through the search space, updating each particle toward both its own best-known position and the swarm's global best.
- Tuned Hyperparameters:
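As a sketch of how a genetic algorithm explores a hyperparameter space, the minimal loop below evolves two illustrative parameters (hidden-layer size and initial learning rate). It runs standalone against a toy fitness function; in the study, the fitness would instead be the cross-validated MAPE of the model trained with the candidate hyperparameters:

```python
import random

random.seed(0)
SPACE = {"hidden": (10, 200), "lr": (1e-4, 1e-1)}  # illustrative bounds

def sample():
    """Draw a random hyperparameter configuration from the search space."""
    return {"hidden": random.randint(*SPACE["hidden"]),
            "lr": random.uniform(*SPACE["lr"])}

def fitness(ind):
    # Toy stand-in for cross-validated error (lower is better);
    # it prefers ~134 hidden units and a learning rate near 0.003.
    return abs(ind["hidden"] - 134) / 134 + abs(ind["lr"] - 0.003) / 0.003

def crossover(a, b):
    # Uniform crossover: each gene inherited from either parent.
    return {k: random.choice((a[k], b[k])) for k in a}

def mutate(ind, rate=0.3):
    # Resample each gene with probability `rate`.
    return {k: (sample()[k] if random.random() < rate else v)
            for k, v in ind.items()}

pop = [sample() for _ in range(20)]
for gen in range(30):
    pop.sort(key=fitness)
    parents = pop[:5]  # truncation selection with elitism
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(15)]
best = min(pop, key=fitness)
print(best)
```

Because the parents are carried over unchanged (elitism), the best fitness never degrades from one generation to the next.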
2.3.4. Model Evaluation
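The evaluation protocol referenced later in the paper (10-fold cross-validation scored with MAE and MAPE) can be sketched with scikit-learn; synthetic data stands in for the concrete dataset:

```python
import numpy as np
from sklearn.model_selection import cross_validate, KFold
from sklearn.ensemble import RandomForestRegressor

# Synthetic regression data standing in for the mixture features.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (200, 5))
y = X @ np.array([3.0, 1.0, 0.5, 0.0, 2.0]) + rng.normal(0, 0.1, 200)

cv = KFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_validate(
    RandomForestRegressor(n_estimators=50, random_state=0), X, y, cv=cv,
    scoring={"mae": "neg_mean_absolute_error",
             "mape": "neg_mean_absolute_percentage_error"},
)
# scikit-learn returns negated errors so that "greater is better" holds.
print("MAE: ", -scores["test_mae"].mean())
print("MAPE:", -scores["test_mape"].mean())
```

Averaging the per-fold errors, as here, is how a single MAE/MAPE figure per model would be reported.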
2.4. Deployment
3. Results
3.1. Business Understanding
3.2. Data Acquisition and Understanding
3.3. Modeling
3.3.1. Feature Engineering
- Variables were grouped because the components they represent are similar and only differ in brand or reference. The quantities of each grouped variable were summed to create a new single variable. For example, instead of having three references for cement, a single variable called “Cement 1” was kept. Similarly, instead of having five references for a retarding admixture, a single variable called “Admixture_1 Retarding” was kept.
- Variables containing only zero values were removed.
- To observe the influence of the independent variables on each output variable, linear and non-linear relationships were analyzed with the mutual_info_regression method. This method quantifies the interdependence of two variables: higher mutual information suggests a stronger dependence between them [49]. Unlike the linear correlation coefficient, mutual information is sensitive to dependencies that are not apparent in the covariance [50]. For 28-day compressive strength, in addition to cement, retarding and superplasticizing admixtures, supplementary cementitious materials, and free water, a relationship was also found with the moisture content of the aggregates and their sources. For 28-day flexural strength, the results are similar, but the relationship is stronger with the superplasticizing admixtures than with the retarding ones, and stronger with the plant where the concrete is produced. Finally, for slump prediction, water shows a stronger relationship than in the strength predictions.
- Pearson correlation was analyzed to detect problematically strong relationships between independent variables, verifying that no pair has a correlation above 0.8 or below −0.8, except for pairs formed by the quantity of an aggregate and its moisture content from the same source. This rules out the existence of redundant variables.
- All independent variables were scaled to the production of one cubic meter of concrete.
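The dependence analysis described above can be sketched as follows: mutual information for (possibly non-linear) relevance to the target, and Pearson correlation to flag redundant predictor pairs. The column names and synthetic target are illustrative, not the study's data:

```python
import numpy as np
import pandas as pd
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
X = pd.DataFrame({
    "cement": rng.uniform(200, 600, 300),       # kg/m3, illustrative
    "water": rng.uniform(100, 200, 300),        # L/m3, illustrative
    "agg_moisture": rng.uniform(0, 6, 300),     # %, illustrative
})
# Synthetic target with a dominant, mildly non-linear cement effect.
y = 0.05 * X["cement"] ** 0.9 - 0.02 * X["water"] + rng.normal(0, 0.5, 300)

# Mutual information: higher values suggest stronger dependence on y.
mi = mutual_info_regression(X, y, random_state=0)
print(dict(zip(X.columns, mi.round(2))))  # cement should dominate

# Flag predictor pairs with |r| > 0.8 as candidate redundancies.
corr = X.corr()
redundant = [(a, b) for a in corr for b in corr
             if a < b and abs(corr.loc[a, b]) > 0.8]
print(redundant)  # expected: [] for these independent columns
```

As in the paper, a flagged pair (such as an aggregate quantity and its moisture content) would be inspected rather than dropped automatically.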
3.3.2. Predictive Models
3.3.3. Hyperparameter Optimization
3.4. Deployment
4. Discussion
5. Conclusions
- Various supervised learning models were trained and tested to estimate compressive and flexural strength, as well as slump, revealing promising performance. Among these models, the MLP demonstrated superior efficacy in a 10-fold-cross-validation.
- The MLP model achieved an MAE of 4.36 MPa and an MAPE of 11.41% for compressive strength estimation, indicating its robust predictive power. The results are in line with the established objectives of an MAE below 10 MPa.
- The MLP model for flexural strength prediction achieved an MAE of 0.7 MPa and an MAPE of 14.05%. These results align with the established goal of achieving an MAE less than 1.2 MPa.
- The MLP model for slump prediction achieved an MAE of 11.7 mm and an MAPE of 7.27%. These findings are consistent with the target of achieving an MAE under 20 mm.
- Our approach differed from conventional methods by utilizing GA, BO, and PSO for hyperparameter tuning, with GA proving to be the most effective.
- Comparisons with other studies underscored our models’ competitive performance despite the differences in dataset size and attributes. While our models achieved comparable RMSE values, slight disparities in values were noted, likely attributed to dataset variations. Nonetheless, our predictive models for compressive strength, flexural strength, and slump exhibited promising accuracy and reliability.
- Regarding the datasets constructed for this study, it is worth highlighting that no datasets comparable to ours were found in the studies reviewed. Unlike our case, those studies commonly conclude that larger datasets are needed.
- Finally, our study stands out from other research efforts by carrying out the deployment process in a real-world environment, implementing the predictive models in an operational setting.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
AP | Artificial pozzolans |
BO | Bayesian optimization |
CRISP-DM | Cross-Industry Standard Process for Data Mining |
C-S-H | Calcium silicate hydrate |
FA | Fly ash |
GA | Genetic algorithm |
GBoost | Gradient boosting |
GGBFS | Ground granulated blast furnace slag |
KNN | K-nearest neighbors |
MAE | Mean absolute error |
MAPE | Mean absolute percentage error |
MLP | Multi-layer perceptron regressor |
NP | Natural pozzolans |
PSO | Particle swarm optimization |
RF | Random forest |
RMX | Ready mix |
RT | Regression trees |
SaaS | Software as a Service |
SCMs | Supplementary cementitious materials |
SF | Silica fume |
SVR | Support vector regressor |
TDSP | The Team Data Science Process |
XGBoost | Extreme gradient boosting |
References
- Cement Production Global 2023|Statista. Available online: https://www.statista.com/statistics/1087115/global-cement-production-volume/ (accessed on 4 November 2022).
- Cement and Concrete around the World. Available online: https://gccassociation.org/concretefuture/cement-concrete-around-the-world/ (accessed on 10 November 2022).
- Monteiro, P.J.; Miller, S.A.; Horvath, A. Towards sustainable concrete. Nat. Mater. 2017, 16, 698–699. [Google Scholar] [CrossRef] [PubMed]
- Damme, H.V. Concrete material science: Past, present, and future innovations. Cem. Concr. Res. 2018, 112, 5–24. [Google Scholar] [CrossRef]
- Mehta, P.K.; Monteiro, P.J.M. Concrete: Microstructure, Properties, and Materials; McGraw-Hill Education: New York, NY, USA, 2014; pp. 95–108. [Google Scholar]
- Telechea, S.; Diego, S. Tecnología del Concreto y del Mortero, 5th ed.; 2001. Available online: https://www.academia.edu/49045048/ (accessed on 9 March 2024).
- Smith, I.A. The Design of Fly-Ash Concretes. Proc. Inst. Civ. Eng. 1967, 36, 769–790. [Google Scholar] [CrossRef]
- ASTM C618-23; Standard Specification for Coal Ash and Raw or Calcined Natural Pozzolan for Use in Concrete. American Society for Testing: West Conshohocken, PA, USA, 2023.
- Moreno, J.D. Materiales Cementantes Suplementarios y Sus Efectos en el Concreto. 2018. Available online: https://360enconcreto.com/blog/detalle/efectos-de-cementantes-suplementarios/ (accessed on 9 December 2023).
- Kosmatka, S.; Kerkhoff, B.; Panarese, W. Design and Control of Concrete Mixtures, EB001. In Design and Control of Concrete Mixtures; Canadian Portland Cement Association: Portland, ON, Canada, 2002; pp. 57–72. [Google Scholar]
- Nagrockienė, D.; Girskas, G.; Skripkiūnas, G. Properties of concrete modified with mineral additives. Constr. Build. Mater. 2017, 135, 37–42. [Google Scholar] [CrossRef]
- Osorio, J.D. Resistencias del Concreto|ARGOS 360. 2019. Available online: https://www.360enconcreto.com/blog/detalle/resistencia-mecanica-del-concreto-y-compresion (accessed on 10 November 2022).
- Nguyen, T.T.; Duy, H.P.; Thanh, T.P.; Vu, H.H. Compressive Strength Evaluation of Fiber-Reinforced High-Strength Self-Compacting Concrete with Artificial Intelligence. Adv. Civ. Eng. 2020, 2020, 3012139. [Google Scholar] [CrossRef]
- Azizifar, V.; Babajanzadeh, M. Compressive Strength Prediction of Self-Compacting Concrete Incorporating Silica Fume Using Artificial Intelligence Methods. Civ. Eng. J. 2018, 4, 1542. [Google Scholar] [CrossRef]
- Hassoun, M.H.; Intrator, N.; McKay, S.; Christian, W. Fundamentals of Artificial Neural Networks. Proc. IEEE 1996, 10, 137. [Google Scholar] [CrossRef]
- Chen, S.C.; Le, D.K.; Nguyen, V.S. Adaptive Network-Based Fuzzy Inference System (ANFIS) Controller for an Active Magnetic Bearing System with Unbalance Mass. Lect. Notes Electr. Eng. 2014, 282 LNEE, 433–443. [Google Scholar] [CrossRef]
- Eiben, A.E.; Smith, J.E. Introduction to Evolutionary Computing; Springer: Berlin/Heidelberg, Germany, 2003. [Google Scholar] [CrossRef]
- Cortes, C.; Vapnik, V.; Saitta, L. Support-vector networks. Mach. Learn. 1995, 20, 273–297. [Google Scholar] [CrossRef]
- Quinlan, J.R. Induction of decision trees. Mach. Learn. 1986, 1, 81–106. [Google Scholar] [CrossRef]
- Grajski, K.A.; Breiman, L.; Prisco, G.V.D.; Freeman, W.J. Classification of EEG spatial patterns with a tree-structured methodology: CART. IEEE Trans. Bio-Med. Eng. 1986, 33, 1076–1086. [Google Scholar] [CrossRef] [PubMed]
- Young, B.A.; Hall, A.; Pilon, L.; Gupta, P.; Sant, G. Can the compressive strength of concrete be estimated from knowledge of the mixture proportions?: New insights from statistical analysis and machine learning methods. Cem. Concr. Res. 2019, 115, 379–388. [Google Scholar] [CrossRef]
- Ahmad, M.; Hu, J.L.; Ahmad, F.; Tang, X.W.; Amjad, M.; Iqbal, M.J.; Asim, M.; Farooq, A. Supervised Learning Methods for Modeling Concrete Compressive Strength Prediction at High Temperature. Materials 2021, 14, 1983. [Google Scholar] [CrossRef] [PubMed]
- Huang, J.; Zhou, M.; Yuan, H.; Sabri, M.M.S.; Li, X. Prediction of the Compressive Strength for Cement-Based Materials with Metakaolin Based on the Hybrid Machine Learning Method. Materials 2022, 15, 3500. [Google Scholar] [CrossRef]
- Moradi, N.; Tavana, M.H.; Habibi, M.R.; Amiri, M.; Moradi, M.J.; Farhangi, V. Predicting the Compressive Strength of Concrete Containing Binary Supplementary Cementitious Material Using Machine Learning Approach. Materials 2022, 15, 5336. [Google Scholar] [CrossRef] [PubMed]
- Khan, K.; Salami, B.A.; Jamal, A.; Amin, M.N.; Usman, M.; Al-Faiad, M.A.; Abu-Arab, A.M.; Iqbal, M. Prediction Models for Estimating Compressive Strength of Concrete Made of Manufactured Sand Using Gene Expression Programming Model. Materials 2022, 15, 5823. [Google Scholar] [CrossRef] [PubMed]
- Silva, V.P.; de Alencar Carvalho, R.; da Silva Rêgo, J.H.; Evangelista, F. Machine Learning-Based Prediction of the Compressive Strength of Brazilian Concretes: A Dual-Dataset Study. Materials 2023, 16, 4977. [Google Scholar] [CrossRef] [PubMed]
- Li, D.; Tang, Z.; Kang, Q.; Zhang, X.; Li, Y. Machine Learning-Based Method for Predicting Compressive Strength of Concrete. Processes 2023, 11, 390. [Google Scholar] [CrossRef]
- Kumar, A.; Arora, H.C.; Kapoor, N.R.; Mohammed, M.A.; Kumar, K.; Majumdar, A.; Thinnukool, O. Compressive Strength Prediction of Lightweight Concrete: Machine Learning Models. Sustainability 2022, 14, 2404. [Google Scholar] [CrossRef]
- Ali, A.; Riaz, R.D.; Malik, U.J.; Abbas, S.B.; Usman, M.; Shah, M.U.; Kim, I.H.; Hanif, A.; Faizan, M. Machine Learning-Based Predictive Model for Tensile and Flexural Strength of 3D-Printed Concrete. Materials 2023, 16, 4149. [Google Scholar] [CrossRef]
- Shah, H.A.; Yuan, Q.; Akmal, U.; Shah, S.A.; Salmi, A.; Awad, Y.A.; Shah, L.A.; Iftikhar, Y.; Javed, M.H.; Khan, M.I. Application of Machine Learning Techniques for Predicting Compressive, Splitting Tensile, and Flexural Strengths of Concrete with Metakaolin. Materials 2022, 15, 5435. [Google Scholar] [CrossRef] [PubMed]
- Zheng, D.; Wu, R.; Sufian, M.; Kahla, N.B.; Atig, M.; Deifalla, A.F.; Accouche, O.; Azab, M. Flexural Strength Prediction of Steel Fiber-Reinforced Concrete Using Artificial Intelligence. Materials 2022, 15, 5194. [Google Scholar] [CrossRef] [PubMed]
- Khademi, F.; Akbari, M.; Jamal, S.M.; Nikoo, M. Multiple linear regression, artificial neural network, and fuzzy logic prediction of 28 days compressive strength of concrete. Front. Struct. Civ. Eng. 2017, 11, 90–99. [Google Scholar] [CrossRef]
- Cihan, M.T. Prediction of Concrete Compressive Strength and Slump by Machine Learning Methods. Adv. Civ. Eng. 2019, 2019, 3069046. [Google Scholar] [CrossRef]
- Zhang, X.; Akber, M.Z.; Zheng, W. Predicting the slump of industrially produced concrete using machine learning: A multiclass classification approach. J. Build. Eng. 2022, 58, 104997. [Google Scholar] [CrossRef]
- Chen, Y.; Wu, J.; Zhang, Y.; Fu, L.; Luo, Y.; Liu, Y.; Li, L. Research on Hyperparameter Optimization of Concrete Slump Prediction Model Based on Response Surface Method. Materials 2022, 15, 4721. [Google Scholar] [CrossRef] [PubMed]
- Jaf, D.K.I. Soft Computing and Machine Learning-Based Models to Predict the Slump and Compressive Strength of Self-Compacted Concrete Modified with Fly Ash. Sustainability 2023, 15, 11554. [Google Scholar] [CrossRef]
- Jani, M. What Is the Team Data Science Process? 2022. Available online: https://learn.microsoft.com/en-us/azure/architecture/ai-ml/ (accessed on 9 March 2024).
- ARGOS. Centro Argos Para la Innovación. 2021. Available online: https://argos.co/centro-argos-para-la-innovacion/ (accessed on 29 May 2022).
- Osorio, J.D. Resistencia mecáNica del Concreto y Resistencia a la Compresión. 2018. Available online: https://360enconcreto.com/blog/detalle/resistencia-mecanica-del-concreto-y-compresion/ (accessed on 7 December 2023).
- PyPI. Pandas-Profiling. 2023. Available online: https://pypi.org/project/pandas-profiling/ (accessed on 9 March 2024).
- Singh, A. KNN Algorithm: Guide to Using K-Nearest Neighbor for Regression. 2023. Available online: https://www.analyticsvidhya.com/blog/2018/08/k-nearest-neighbor-introduction-regression-python/ (accessed on 9 March 2024).
- Lek, S.; Park, Y.S. Artificial Neural Networks. In Encyclopedia of Ecology, Five-Volume Set; PHI Learning Pvt. Ltd.: Delhi, India, 2008; pp. 237–245. [Google Scholar] [CrossRef]
- Gholami, R.; Fakhari, N. Support Vector Machine: Principles, Parameters, and Applications. In Handbook of Neural Computation; Academic Press: Cambridge, MA, USA, 2017; pp. 515–535. [Google Scholar] [CrossRef]
- Shrivastava, D.; Sanyal, S.; Maji, A.K.; Kandar, D. Bone cancer detection using machine learning techniques. In Smart Healthcare for Disease Diagnosis and Prevention; Academic Press: Cambridge, MA, USA, 2020; pp. 175–183. [Google Scholar] [CrossRef]
- Morde, V. XGBoost Algorithm: Long May She Reign! 2019. Available online: https://towardsdatascience.com/https-medium-com-vishalmorde-xgboost-algorithm-long-she-may-rein-edd9f99be63d (accessed on 10 November 2023).
- Snoek, J.; Larochelle, H.; Adams, R.P. Practical Bayesian Optimization of Machine Learning Algorithms. Adv. Neural Inf. Process. Syst. 2012, 25. [Google Scholar]
- Hamdia, K.M.; Zhuang, X.; Rabczuk, T. An efficient optimization approach for designing machine learning models based on genetic algorithm. Neural Comput. Appl. 2021, 33, 1923–1933. [Google Scholar] [CrossRef]
- Qolomany, B.; Maabreh, M.; Al-Fuqaha, A.; Gupta, A.; Benhaddou, D. Parameters optimization of deep learning models using Particle swarm optimization. In Proceedings of the 2017 13th International Wireless Communications and Mobile Computing Conference, IWCMC 2017, Valencia, Spain, 26–30 June 2017; pp. 1285–1290. [Google Scholar] [CrossRef]
- Huang, L.; Zhou, X.; Shi, L.; Gong, L. Time Series Feature Selection Method Based on Mutual Information. Appl. Sci. 2024, 14, 1960. [Google Scholar] [CrossRef]
- Kraskov, A.; Stögbauer, H.; Grassberger, P. Estimating mutual information. Phys. Rev. E—Stat. Phys. Plasmas Fluids Relat. Interdiscip. Top. 2004, 69, 16. [Google Scholar] [CrossRef]
Output Variable | Input Variables | Methods | Source |
---|---|---|---|
Compressive Strength | Cement content, water, fine and coarse aggregates, silica fume, nano silica, fly ash, super plasticizer, and temperature. | AdaBoost, random forest, and decision tree. | [22] |
Compressive Strength | Cement content, water-to-binder ratio, cement–sand ratio, metakaolin-to-binder ratio, and superplasticizer. | Random forests and firefly algorithm. | [23] |
Compressive Strength | Water content, cement, fine aggregate, coarse aggregate, binary SCMs (SCM1 and SCM2), the chemical composition of each SCM including SiO2, CaO, Fe2O3, Al2O3, MgO, physical properties of binary SCMs, and age of specimens. | Multi-layer perceptron network. | [24] |
Compressive Strength | Tensile strength of cement, curing age, crushed stone, stone powder content, fineness modulus of sand, water-to-binder ratio, water-to-cement ratio, water, sand ratio, and slump. | Gene expression programming. | [25] |
Compressive Strength | Cement, blast furnace slag, pozzolana, water, coarse aggregate, fine aggregate, and age. | Artificial neural networks. | [26] |
Compressive Strength | Cement, blast furnace slag, fly ash, water, superplasticizer, coarse aggregate, fine aggregate, age. | Gradient boosting regression tree. | [27] |
Compressive Strength | Cement, water content, fine aggregate, normal weight coarse aggregate, lightweight coarse aggregate, and water–cement ratio. | Gaussian progress regression, support vector machine regression, ensemble learning, and optimized GPR. | [28] |
Tensile and Flexural Strength | Water, cement, silica fume, fly ash, coarse aggregate, fine aggregate, viscosity modifying agent, fibers, fiber properties, print speed, and nozzle area. | Gaussian process regression, decision tree regression, support vector machine, and XGBoost regression. | [29]
Compressive, Splitting Tensile, and Flexural Strengths | Cement, metakaolin, water-to-binder ratio, fine aggregate, coarse aggregate, superplasticizer, and age. | Gene expression programming, artificial neural network, M5P model tree algorithm, and random forest. | [30] |
Flexural Strength | Water-to-cement ratio, sand-to-aggregate ratio, superplasticizer, silica fume, steel fiber volume. | Gradient boosting, random forest, and extreme gradient boosting. | [31] |
Compressive Strength | Water-to-cement ratio, coarse aggregate, sand, fine aggregate | multiple linear regression, artificial neural network, and adaptive neuro-fuzzy inference system. | [32] |
Output Variable | Input Variables | Methods | Source |
---|---|---|---|
Slump and Compressive Strength | Water–cement ratio, cement content, compressive strength of cement, fine aggregate, fineness module, chemical admixture, type of aggregate (limestone or basalt). | Regression tree, random forest, support vector machines, artificial neural network, partial least square, bagging, fuzzy logic. | [33] |
Slump | Water, cement, fine aggregate, coarse aggregate, fly ash, water-reducing admixture, superplasticizing admixture. | Regularized logistic regression, linear discriminant analysis, K-nearest neighbors, support vector machines, classification and regression tree, random forest, extreme gradient boosting. | [34] |
Slump | Cement, blast furnace slag, fly ash, water, superplasticizer, coarse aggregate, fine aggregate, flow. | Neural networks. | [35] |
Slump | Cement, water-to-binder ratio, fly ash, sand, coarse aggregate, superplasticizer. | Nonlinear regression, multi-linear regression, and artificial neural networks. | [36] |
Hyperparameter | Description | Algorithm |
---|---|---|
momentum | Controls the influence of previous steps in updating weights during training. Can help prevent oscillations and speed up convergence. | MLP |
hidden_layer_sizes | Defines the number of nodes in each hidden layer of the neural network. | MLP |
solver | Specifies the optimization algorithm used to adjust the weights of the network. Can be Adam or stochastic gradient descent, among others. | MLP |
max_iter | Sets the maximum number of iterations the solver can perform. | MLP |
activation | Defines the activation function used in the hidden layers. | MLP |
learning_rate_init | Some adaptive optimizers adjust the learning rate automatically; this parameter sets its initial value. | MLP
learning_rate | Controls the contribution of each tree to the model update (or the step size in neural networks). A lower value makes the model more robust but may require more trees to achieve good performance. | GBoost, MLP |
n_estimators | Number of trees to build in the boosting process. A higher value usually leads to a more robust model but can also increase training time. | GBoost, random forest |
max_depth | Defines the maximum depth of each tree in the model. A higher value allows the model to learn more complex patterns but can also lead to overfitting. | GBoost, random forest |
min_samples_leaf | Minimum number of samples required in a leaf node. Higher values usually give simpler models and help prevent overfitting. | GBoost, random forest |
max_features | Maximum number of features to consider when building trees. Varying this helps mitigate overfitting and improve model generalization. | Random forest |
subsample | The proportion of training observations used to build each tree. A lower value can make the model more conservative and less prone to overfitting. | GBoost |
criterion | The function used to measure the quality of a split. | GBoost
Variable | Units | Min | Max |
---|---|---|---|
Coarse aggregate—quantity | kg/m3 | 0 | 2500 |
Coarse aggregate—moisture content | % | 0 | 6 |
Fine aggregate—quantity | kg/m3 | 0 | 2000 |
Fine aggregate—moisture content | % | >0 | 15 |
Free Water | L/m3 | >0 | 200 |
Cement 1 | kg/m3 | 200 | 1000 |
Cement 2 | kg/m3 | 0 | 1000 |
Admixture_1 Retarding | g/m3 | >0 | 5000 |
Admixture_2 superplasticizing | g/m3 | >0 | 5000 |
Admixture_3 Set-controlling | g/m3 | 0 | 2000 |
SCMs | kg/m3 | 0 | 100 |
28 d Compressive Strength | MPa | 10 | 100 |
28 d Flexural Strength | MPa | 0 | 10 |
Slump | mm | 0 | 300 |
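The expert quality rules in the table above can be applied as row filters during data cleaning. The sketch below assumes a pandas DataFrame with one column per variable; the column names are hypothetical, while the ranges are copied from the table (a subset is shown):

```python
import pandas as pd

# (min, max, strict_min): strict_min=True encodes the ">0" rows of the table.
RULES = {
    "coarse_agg_kg_m3": (0, 2500, False),
    "fine_agg_moisture_pct": (0, 15, True),
    "free_water_l_m3": (0, 200, True),
    "cement1_kg_m3": (200, 1000, False),
    "compressive_28d_mpa": (10, 100, False),
}

def apply_quality_rules(df: pd.DataFrame) -> pd.DataFrame:
    """Keep only rows whose values fall inside the expert-defined ranges."""
    mask = pd.Series(True, index=df.index)
    for col, (lo, hi, strict) in RULES.items():
        if col in df:
            lower = df[col] > lo if strict else df[col] >= lo
            mask &= lower & (df[col] <= hi)
    return df[mask]

df = pd.DataFrame({"free_water_l_m3": [150, 0, 250],
                   "compressive_28d_mpa": [35, 40, 50]})
print(apply_quality_rules(df))  # keeps only the first row
```

Rows violating any rule are treated as outliers and dropped, matching the outlier-treatment step described in Section 2.3.1.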
Optimization Technique | Momentum | Hidden Layer Sizes | Max_iter | Activation | Solver | Learning Rate Init | MAPE |
---|---|---|---|---|---|---|---|
BO | 0.5200 | 50 | 100 | logistic | adam | 0.003 | 13.38% |
GA | 0.1031 | 134 | 100 | relu | sgd | 0.033 | 11.42% |
PSO | 0.4000 | 65 | 200 | logistic | sgd | 0.003 | 11.73% |
Optimization Technique | n Estimators | Max_depth | Min Samples Split | Min Samples Leaf | MAPE |
---|---|---|---|---|---|
BO | 293 | 25 | 5 | 1 | 12.81% |
GA | 300 | 16 | 2 | 4 | 11.96% |
PSO | 350 | 30 | 6 | 1 | 13.33% |
Optimization GBoost | Learning Rate | n Estimators | Max_depth | Min Samples Leaf | Subsample | MAPE |
---|---|---|---|---|---|---|
BO | 0.0432 | 318 | 16 | 27 | 0.96 | 12.1% |
GA | 0.0431 | 241 | 26 | 26 | 0.64 | 12.5% |
PSO | 0.5 | 400 | 50 | 3 | 0.80 | 14% |
Model | Hyperparameters | MAPE Train | MAE Train | MAPE Test | MAE Test |
---|---|---|---|---|---|
Compressive strength after 28 days | activation = “relu” hidden_layer_sizes = (134) solver = ’sgd’ learning_rate = ’constant’ learning_rate_init = 0.003 momentum = 0.1031 max_iter =100 | 9.37% | 3.54 MPa | 11.41% | 4.36 MPa |
Flexural strength after 28 days | activation=“relu” hidden_layer_sizes = (178) solver = ’sgd’ learning_rate = ’constant’ learning_rate_init = 0.003 momentum = 0.1031 max_iter = 100 | 10.25% | 0.50 MPa | 14.05% | 0.70 MPa |
Slump | activation = “relu” hidden_layer_sizes = (193) solver = ’sgd’ learning_rate = ’constant’ learning_rate_init = 0.003, momentum = 0.1031 max_iter = 100 | 4.83% | 7.08 mm | 7.27% | 11.7 mm |
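The tuned MLP for 28-day compressive strength can be reconstructed from the hyperparameters in the table above; data loading and splitting are omitted, and the random seed is an addition of ours (not reported in the paper) for reproducibility:

```python
from sklearn.neural_network import MLPRegressor

# Hyperparameters taken from the final-model table above.
mlp_compressive = MLPRegressor(
    activation="relu",
    hidden_layer_sizes=(134,),
    solver="sgd",
    learning_rate="constant",
    learning_rate_init=0.003,
    momentum=0.1031,
    max_iter=100,
    random_state=0,  # assumption: seed not reported in the paper
)
# mlp_compressive.fit(X_train, y_train) would then be evaluated with
# MAE and MAPE on the held-out folds, as in the table above.
```

The flexural-strength and slump models differ only in the hidden-layer size (178 and 193 units, respectively).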
Share and Cite
Vargas, J.F.; Oviedo, A.I.; Ortega, N.A.; Orozco, E.; Gómez, A.; Londoño, J.M. Machine-Learning-Based Predictive Models for Compressive Strength, Flexural Strength, and Slump of Concrete. Appl. Sci. 2024, 14, 4426. https://doi.org/10.3390/app14114426