Predictive Modeling of Vickers Hardness Using Machine Learning Techniques on D2 Steel with Various Treatments
Abstract
1. Introduction
2. Materials and Methods
2.1. Materials
- Root Mean Square Error (RMSE): the square root of the Mean Squared Error (MSE); it measures the average squared difference between predicted and actual values (see Equation (2)).
- Mean Absolute Error (MAE): the average of the absolute differences between predicted and actual values (see Equation (3)).
- Coefficient of determination (R²): indicates how well a model’s predictions fit the actual data (see Equation (4)). A value close to 1 indicates that the model explains or fits the data well, whereas a value ≤ 0 indicates that the model does not fit the data. A brief computation sketch is given after this list.
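As a minimal illustration of how these three metrics can be computed, the following sketch uses scikit-learn's metrics module; the hardness values and variable names (y_true, y_pred) are placeholders rather than the authors' actual pipeline.

```python
import numpy as np
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score

# Placeholder ground-truth and predicted Vickers hardness values (HV)
y_true = np.array([533.75, 672.57, 1001.03, 1504.19])
y_pred = np.array([540.10, 660.32, 1010.55, 1490.02])

rmse = np.sqrt(mean_squared_error(y_true, y_pred))  # Equation (2): square root of the MSE
mae = mean_absolute_error(y_true, y_pred)           # Equation (3): mean of absolute errors
r2 = r2_score(y_true, y_pred)                       # Equation (4): coefficient of determination

print(f"RMSE = {rmse:.2f}, MAE = {mae:.2f}, R^2 = {r2:.5f}")
```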
2.2. Methods
- Decision Tree (DT): DT is a supervised learning algorithm that builds classification and regression models using a hierarchical tree structure. It recursively partitions the dataset into smaller subsets based on specific features, eventually reaching leaf nodes with predicted target values. Key parameters for DT include the maximum tree depth, which controls the complexity of the tree, and the splitting criterion, which determines the best feature and threshold for each split. Common criteria include Friedman Mean Squared Error (MSE), squared error, and absolute error [21,22].
  - Friedman MSE: uses the mean squared error with Friedman’s improvement score to evaluate potential splits.
  - Squared Error: the mean squared error serves as the feature selection criterion, aiming to minimize the L2 loss by assessing the reduction of variance at each terminal node.
  - Absolute Error: the mean absolute error minimizes the L1 loss by utilizing the median of each terminal node.
- Adaptive Boosting or ADABoost: It is a meta-estimator that incrementally grows in complexity with each boosting iteration. It employs small decision tree estimators as weak learners, which are added sequentially. Each subsequent model aims to correct the predictions of its predecessor, thereby enhancing overall predictive performance [23].
- Extreme Gradient Boosting or XGBoost: XGBoost is a method designed to improve on gradient boosting. It uses a gradient descent algorithm to minimize the loss when adding new models. In regression tasks, XGBoost employs small decision trees, where each new tree predicts the residuals or errors of the previous trees. These predictions are then combined with those of the previous trees to produce the final prediction [24].
- Random Forest or RF: Random Forest builds classification or regression models from an ensemble of decision trees trained independently of one another. Key parameters include the splitting criterion, which employs functions such as Friedman MSE, squared error, and absolute error; the maximum depth of each tree; and the number of estimators, i.e., the number of trees in the forest [25]. A sketch of how these four regressors and their hyperparameters can be configured is given after this list.
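The following sketch illustrates how the four regressors and the hyperparameters discussed above (splitting criterion, maximum depth, number of estimators) could be set up and tuned with scikit-learn and the xgboost package. The parameter grids are illustrative assumptions, not the exact search spaces used in this work.

```python
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import AdaBoostRegressor, RandomForestRegressor
from sklearn.model_selection import GridSearchCV
from xgboost import XGBRegressor

# Candidate regressors with illustrative hyperparameter grids (assumed values)
models = {
    "DT": (DecisionTreeRegressor(),
           {"criterion": ["friedman_mse", "squared_error", "absolute_error"],
            "max_depth": [10, 20, 30]}),
    "ADABoost": (AdaBoostRegressor(),  # small decision trees as weak learners by default
                 {"n_estimators": [50, 70, 100]}),
    "XGBoost": (XGBRegressor(objective="reg:squarederror"),
                {"n_estimators": [20, 50, 100], "max_depth": [3, 6, 10]}),
    "RF": (RandomForestRegressor(),
           {"criterion": ["friedman_mse", "squared_error", "absolute_error"],
            "max_depth": [10, 20, 30], "n_estimators": [10, 30, 50]}),
}

def tune(X_train, y_train):
    """Grid-search each regressor and return the best estimator and score per technique."""
    best = {}
    for name, (estimator, grid) in models.items():
        search = GridSearchCV(estimator, grid, scoring="r2", cv=5)
        search.fit(X_train, y_train)
        best[name] = (search.best_estimator_, search.best_score_)
    return best
```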
3. Results and Discussion
3.1. Hyperparameters Tuning
3.2. Model Testing and Validation
4. Conclusions
- In this work, a new method based on machine learning techniques was developed to predict the Vickers hardness value from the indentation image, the applied load, whether the sample has a coating or not, and the image scale. The method achieved low RMSE and MAE errors (0.95 and 0.12, respectively), with an R² close to 1 when employing the Random Forest technique.
- Evaluating the size of the images in the database is crucial, since each pixel of the image acts as a descriptor. A larger image introduces redundant information or noise, leading to high RMSE and MAE errors and increasing computation time. Conversely, a very small image can result in a significant loss of information, making learning challenging. Therefore, an image size of 50 × 50 pixels reduces computation time and yields good results (RMSE = 1.54, MAE = 1.15, and R² = 1.00). This size minimizes information loss, facilitating effective data learning (an illustrative sketch of this data representation is given after these conclusions).
- In the metal-mechanic industry, material characterization is crucial. By determining the Vickers hardness value, it is possible to assess the quality of the material, whether it has undergone heat treatment, and if the coating is suitable for specific applications. This prevents the material from fracturing or deforming during short-term operation. Therefore, the proposed method charts a new course, diverging from traditional reliance on corner detection, for characterizing materials based on Vickers hardness, generating more efficient quality control and greater reliability of the final product.
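As an illustrative sketch of the data representation described above (scale, coating flag, applied load, and the grayscale pixels of the indentation image resized to 50 × 50 acting as descriptors), the following assumes OpenCV and NumPy; the function and variable names are hypothetical and not taken from the authors' code.

```python
import cv2
import numpy as np

def build_feature_vector(image_path, scale_um, coated, load_n, size=50):
    """Build one input row: scale, coating flag, load, followed by the resized grayscale pixels."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    resized = cv2.resize(gray, (size, size), interpolation=cv2.INTER_AREA)
    pixels = resized.flatten().astype(np.float32)  # each pixel acts as a descriptor
    return np.concatenate(([scale_um, float(coated), load_n], pixels))

# Hypothetical usage: a coated sample imaged at a 10 um scale under a 10 N load
# row = build_feature_vector("indentation.png", scale_um=10, coated=1, load_n=10)
```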
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Albella, J.M. Láminas Delgadas y Recubrimientos. Preparación, Propiedades y Aplicaciones; Consejo Superior de Investigaciones Científicas: Madrid, Spain, 2003; p. 704.
- Baptista, A.; Silva, F.; Porteiro, J.; Míguez, J.; Pinto, G. Sputtering Physical Vapour Deposition (PVD) Coatings: A Critical Review on Process Improvement and Market Trend Demands. Coatings 2018, 8, 402.
- Ohkubo, I.; Hou, Z.; Lee, J.; Aizawa, T.; Lippmaa, M.; Chikyow, T.; Tsuda, K.; Mori, T. Realization of closed-loop optimization of epitaxial titanium nitride thin-film growth via machine learning. Mater. Today Phys. 2021, 16, 100296.
- Wuest, T.; Weimer, D.; Irgens, C.; Thoben, K.D. Machine learning in manufacturing: Advantages, challenges, and applications. Prod. Manuf. Res. 2016, 4, 23–45.
- Lenz, B.; Hasselbruch, H.; Mehner, A. Automated evaluation of Rockwell adhesion tests for PVD coatings using convolutional neural networks. Surf. Coat. Technol. 2020, 385, 125365.
- Martins, L.A.; Pádua, F.L.; Almeida, P.E. Automatic detection of surface defects on rolled steel using Computer Vision and Artificial Neural Networks. In Proceedings of the IECON Proceedings (Industrial Electronics Conference), Glendale, AZ, USA, 7–10 November 2010; pp. 1081–1086.
- Dobrzański, L.; Staszuk, M.; Honysz, R. Application of artificial intelligence methods in PVD and CVD coatings properties modelling. Arch. Mater. Sci. Eng. 2012, 58, 152–157.
- Mohamad, M.A.; Ali, N.A.; Haron, H. Computational Intelligence Approach for Predicting the Hardness Performances in Titanium Aluminium Nitride (TiAlN) Coating Process. Int. J. Artif. Intell. Expert Syst. 2014, 5, 1–14.
- Wen, C.; Zhang, Y.; Wang, C.; Xue, D.; Bai, Y.; Antonov, S.; Dai, L.; Lookman, T.; Su, Y. Machine learning assisted design of high entropy alloys with desired property. Acta Mater. 2019, 170, 109–117.
- Polanco, J.D.; Jacanamejoy-Jamioy, C.; Mambuscay, C.L.; Piamba, J.F.; Forero, M.G. Automatic Method for Vickers Hardness Estimation by Image Processing. J. Imaging 2023, 9, 8.
- ASTM E92-17; Standard Test Methods for Vickers Hardness and Knoop Hardness of Metallic Materials. ASTM International: West Conshohocken, PA, USA, 2017; pp. 1–27.
- El-Garaihy, W.H.; Alateyah, A.I.; Shaban, M.; Alsharekh, M.F.; Alsunaydih, F.N.; El-Sanabary, S.; Kouta, H.; El-Taybany, Y.; Salem, H.G. A Comparative Study of a Machine Learning Approach and Response Surface Methodology for Optimizing the HPT Processing Parameters of AA6061/SiCp Composites. J. Manuf. Mater. Process. 2023, 7, 148.
- Fu, K.; Zhu, D.; Zhang, Y.; Zhang, C.; Wang, X.; Wang, C.; Jiang, T.; Mao, F.; Zhang, C.; Meng, X.; et al. Predictive Modeling of Tensile Strength in Aluminum Alloys via Machine Learning. Materials 2023, 16, 7236.
- Dovale-Farelo, V.; Tavadze, P.; Lang, L.; Bautista-Hernandez, A.; Romero, A.H. Vickers hardness prediction from machine learning methods. Sci. Rep. 2022, 12, 22475.
- Jeon, J.; Seo, N.; Son, S.B.; Lee, S.J.; Jung, M. Application of machine learning algorithms and SHAP for prediction and feature analysis of tempered martensite hardness in low-alloy steels. Metals 2021, 11, 1159.
- Swetlana, S.; Khatavkar, N.; Singh, A.K. Development of Vickers hardness prediction models via microstructural analysis and machine learning. J. Mater. Sci. 2020, 55, 15845–15856.
- Privezentsev, D.G.; Zhiznyakov, A.L.; Kulkov, Y.Y. Automation of measuring microhardness of materials using metal-graphic images. In Proceedings of the 2019 International Russian Automation Conference, RusAutoCon 2019, Sochi, Russia, 8–14 September 2019; pp. 1–5.
- Tanaka, Y.; Seino, Y.; Hattori, K. Automated Vickers hardness measurement using convolutional neural networks. Int. J. Adv. Manuf. Technol. 2020, 109, 1345–1355.
- Buitrago Diaz, J.C.; Ortega-Portilla, C.; Mambuscay, C.L.; Piamba, J.F.; Forero, M.G. Determination of Vickers Hardness in D2 Steel and TiNbN Coating Using Convolutional Neural Networks. Metals 2023, 13, 1391.
- Gonzalez-Carmona, J.M.; Mambuscay, C.L.; Ortega-Portilla, C.; Hurtado-Macias, A.; Piamba, J.F. TiNbN Hard Coating Deposited at Varied Substrate Temperature by Cathodic Arc: Tribological Performance under Simulated Cutting Conditions. Materials 2023, 16, 4531.
- Loh, W.Y. Classification and regression trees. In Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery; John Wiley & Sons: Hoboken, NJ, USA, 2011; Volume 1.
- Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; et al. Scikit-learn: Machine Learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830.
- Friedman, J.; Hastie, T.; Tibshirani, R. Additive logistic regression: A statistical view of boosting (With discussion and a rejoinder by the authors). Ann. Stat. 2000, 28, 337–407.
- Chen, T.; Guestrin, C. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD ’16, San Francisco, CA, USA, 13–17 August 2016.
- Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32.
Material | Ti (at%) | Nb (at%) | N (at%) | Hardness (HV)
---|---|---|---|---
D2 Steel | – | – | – | 243.25 ± 2.45
D2 Quenched | – | – | – | 787.05 ± 6.25
D2 Tempered | – | – | – | 573.17 ± 55.72
TiNbN 200 °C | 57.86 ± 7.28 | 0.21 ± 0.01 | 41.93 ± 7.27 | 1009.21 ± 14.53
TiNbN 400 °C | 51.06 ± 0.57 | 0.18 ± 0.02 | 48.76 ± 0.58 | 846.24 ± 49.16
TiNbN 600 °C | 44.72 ± 3.09 | 0.15 ± 0.02 | 55.12 ± 3.08 | 844.35 ± 49.02
Scale (μm) | Coating (Yes = 1, No = 0) | Load (N) | Pixel (1,1) | … | Pixel (n,n) | Vickers Hardness (HV)
---|---|---|---|---|---|---
10 | 0 | 10 | 154 | … | 155 | 533.75
10 | 0 | 2 | 162 | … | 156 | 672.57
10 | 1 | 10 | 172 | … | 174 | 1001.03
10 | 1 | 5 | 173 | … | 171 | 1504.19
10 | 1 | 2 | 174 | … | 161 | 1474.07
10 | 1 | 10 | 178 | … | 130 | 808.35
10 | 1 | 3 | 176 | … | 163 | 1381.19
10 | 1 | 10 | 171 | … | 169 | 806.19
20 | 0 | 10 | 205 | … | 254 | 612.50
20 | 0 | 10 | 208 | … | 254 | 793.30
Image Size (px) | Train Time (s) | Test Time (s) | RMSE | MAE | R²
---|---|---|---|---|---|
10 × 10 | 0.19 | 0.00 | 2.74 | 0.10 | 0.99995 |
25 × 25 | 1.13 | 0.00 | 1.96 | 0.09 | 0.99997 |
50 × 50 | 4.70 | 0.02 | 1.54 | 1.15 | 1.00 |
75 × 75 | 11.17 | 0.02 | 5.55 | 0.27 | 0.99978 |
100 × 100 | 20.68 | 0.03 | 4.78 | 0.29 | 0.99984 |
ML | Criterion | Max Depth | Estimators | Mean Score
---|---|---|---|---
DT | Squared Error | 10 | – | 0.9997
ADABoost | – | – | 70 | 0.9801
XGBoost | – | – | 20 | 0.9999
RF | Squared Error | 30 | 30 | 0.9999
ML | Train Score | Test Time (s) | RMSE | MAE | R²
---|---|---|---|---|---|
DT | 1.00 | 0.02 | 1.94 | 0.21 | 0.99997 |
ADABoost | 0.98 | 1.17 | 53.10 | 43.19 | 0.98085 |
XGBoost | 1.00 | 0.02 | 1.71 | 0.71 | 0.99998 |
RF | 1.00 | 0.06 | 0.95 | 0.12 | 0.99999 |
Image | Scale (μm) | Coating | Load (N) | HV True | HV Predicted | % Error
---|---|---|---|---|---|---
– | 10 | 1 | 10 | 931.31 | 992.93 | 6.62
– | 10 | 1 | 5 | 1108.42 | 1100.93 | 0.68
– | 10 | 1 | 5 | 1104.34 | 1099.42 | 0.45
– | 10 | 1 | 10 | 801.35 | 838.91 | 4.69
– | 10 | 1 | 3 | 1269.00 | 1356.37 | 6.88
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).