Research on AI-Driven Classification Possibilities of Ball-Burnished Regular Relief Patterns Using Mixed Symmetrical 2D Image Datasets Derived from 3D-Scanned Topography and Photo Camera
Abstract
1. Introduction
2. Materials and Methods
2.1. Stage 1. Description of Forming RRs by BB and Acquiring Their Artificial 2D Images
2.1.1. Forming Suitable RRs Patterns by BB Operation
2.1.2. Optical 3D Scanning of the RRs Patterns
2.1.3. Pre-Processing of the Raw 3D Scanned Topography, Using Gwyddion Software
2.1.4. Creating 2D “ARTIFICIAL” Images of RRs from Their 3D Topographies in Blender
- Moving the camera, placed normal to the RR topography (see Figure 4a). In this strategy, the camera is perpendicular to the RR plane and follows a zig-zag trajectory, capturing an image every 0.5 mm. In this way, shift fluctuations in the XY-plane were simulated.
- Tilting the camera over the RR topography from a fixed point (see Figure 4b). This strategy scans across the topography like the previous one, but the camera is fixed above the center of the RR and tilts symmetrically so that its line of sight follows the zig-zag trajectory. It was used to capture images of RRs No. 2, 4, and 7 shown in Figure 2a.
- The camera followed a circular trajectory around a fixed point at the geometrical center of the RR area (see Figure 5a) and captured images, ensuring multiple symmetric angular perspectives. For one full revolution of the circular orbit, 100 images were captured, i.e., a 2D image was acquired every 3.6 degrees. The radius of the circular orbit was set to 3 mm, so the virtual camera's offset distance (i.e., the arc length) between two consecutive synthetic images was approximately 0.19 mm (see the scripting sketch after this list).
- The camera rotated in an elliptical orbit around a fixed point at the geometrical center of the RR area above the 3D topography (see Figure 5b) and captured images as in the previous strategy. Again, this strategy was applied only to RRs No. 2, 4, and 7 shown in Figure 2a, in order to skip the areas where the burnished traces did not contain cells of their patterns.
- Workbench: a fast render engine without photorealistic effects.
- Eevee: a lightweight real-time render engine, offering a balance between performance and realism.
- Cycles: a physically based ray-tracing engine producing high-fidelity images, albeit at a higher computational cost.
- High Dynamic Range Imaging (HDRI) lighting environments were used to introduce realistic illumination and reflections, simulating real-world optical conditions [78]. The HDRI environments used are as follows: Hanger Exterior Cloudy, Qwantani Moonrise, Qwantani Night, Lakeside Sunrise.
- The following built-in Blender Studio Lights options were employed: Default, basic.sl, rim.sl, studio.sl.
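For illustration, the circular-orbit capture strategy described above can be scripted through Blender's Python API (bpy). The sketch below is not the authors' script: the object names, the camera height, and the output paths are assumptions, while the 3 mm orbit radius, the 100 images per orbit, and the choice of render engine follow the description above.

```python
import math
import bpy

# Minimal sketch of the circular-orbit capture strategy described above.
# Assumptions: the imported topography object is named "RR_Topography",
# the scene camera is named "Camera", and the camera height is arbitrary.
scene = bpy.context.scene
scene.render.engine = 'CYCLES'        # or 'BLENDER_EEVEE' / 'BLENDER_WORKBENCH'

cam = bpy.data.objects["Camera"]
target = bpy.data.objects["RR_Topography"]

# Keep the camera aimed at the geometrical centre of the RR area.
track = cam.constraints.new(type='TRACK_TO')
track.target = target
track.track_axis = 'TRACK_NEGATIVE_Z'
track.up_axis = 'UP_Y'

radius = 3.0      # orbit radius of 3 mm (scene units assumed to be mm)
height = 10.0     # assumed camera height above the relief plane
n_images = 100    # one full orbit -> one image every 3.6 degrees

for i in range(n_images):
    angle = 2.0 * math.pi * i / n_images
    cam.location = (radius * math.cos(angle), radius * math.sin(angle), height)
    scene.render.filepath = f"//orbit/rr_orbit_{i:03d}.png"
    bpy.ops.render.render(write_still=True)
```

With these settings the arc length between consecutive frames is 2π · 3 mm / 100 ≈ 0.19 mm, which matches the offset distance quoted above.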
2.1.5. Description of the Real Image-Capturing Setup
2.2. Stage 2. Training and Testing the CNN Models
2.2.1. Description of the Evaluated CNN Architectures
- VGG16—utilizes a simple, uniform architecture built from stacked 3 × 3 convolutional filters. Its depth allows the learning of highly abstract, discriminative features critical for identifying subtle surface variations, and it serves as a robust baseline for evaluating newer architectures.
- DenseNet121—features dense connections in which each layer receives input from all preceding layers and contributes to all subsequent ones. This design promotes gradient flow and feature reuse, enhancing learning efficiency and detail preservation while maintaining a compact parameter footprint.
- EfficientNetB0—employs compound scaling to uniformly adjust network depth, width, and resolution. This leads to an optimal balance between accuracy and computational efficiency, making it ideal for scenarios with limited processing power.
- MobileNetV2—is designed for deployment in mobile and embedded systems. This model uses depthwise separable convolutions to significantly reduce computational complexity while maintaining competitive performance. (A minimal sketch of how these backbones can be instantiated for transfer learning is given after this list.)
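All four backbones are available pre-trained on ImageNet in tf.keras.applications. The sketch below illustrates one common way to load them for transfer learning on the eight RR pattern classes; the input size and the classification-head layout are assumptions, not the authors' exact configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# The four evaluated backbones, as provided by tf.keras.applications.
BACKBONES = {
    "VGG16": tf.keras.applications.VGG16,
    "DenseNet121": tf.keras.applications.DenseNet121,
    "EfficientNetB0": tf.keras.applications.EfficientNetB0,
    "MobileNetV2": tf.keras.applications.MobileNetV2,
}

def build_classifier(name, num_classes=8, input_shape=(224, 224, 3)):
    """Load an ImageNet-pretrained backbone and attach a new softmax head
    for the eight RR pattern classes (head size is an assumption)."""
    base = BACKBONES[name](weights="imagenet", include_top=False,
                           input_shape=input_shape)
    base.trainable = False  # frozen backbone = feature-extraction setting
    x = layers.GlobalAveragePooling2D()(base.output)
    x = layers.Dense(256, activation="relu")(x)
    outputs = layers.Dense(num_classes, activation="softmax")(x)
    return models.Model(base.input, outputs), base

model, base = build_classifier("DenseNet121")
```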
2.2.2. Transfer Learning Strategies Used
Feature Extraction Strategy
Gradual Unfreezing Strategy
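A minimal sketch of the two strategies named above, assuming the model and its backbone base from the previous sketch and tf.data pipelines train_ds and val_ds with one-hot labels; the unfreezing schedule, learning rates, and epoch counts are illustrative assumptions rather than the authors' settings.

```python
from tensorflow.keras import optimizers

def train_feature_extraction(model, train_ds, val_ds, epochs=10):
    """Feature extraction: the backbone stays frozen, only the new head is trained."""
    model.compile(optimizer=optimizers.Adam(1e-3),
                  loss="categorical_crossentropy", metrics=["accuracy"])
    model.fit(train_ds, validation_data=val_ds, epochs=epochs)

def train_gradual_unfreezing(model, base, train_ds, val_ds,
                             schedule=(0, 30, 60), epochs_per_phase=5):
    """Gradual unfreezing: train in phases, unfreezing progressively more of the
    backbone's top layers and lowering the learning rate once layers are unfrozen."""
    for n_unfrozen in schedule:
        base.trainable = True
        frozen = base.layers if n_unfrozen == 0 else base.layers[:-n_unfrozen]
        for layer in frozen:
            layer.trainable = False
        # Re-compile after changing layer trainability, as required by Keras.
        model.compile(optimizer=optimizers.Adam(1e-3 if n_unfrozen == 0 else 1e-4),
                      loss="categorical_crossentropy", metrics=["accuracy"])
        model.fit(train_ds, validation_data=val_ds, epochs=epochs_per_phase)
```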
2.2.3. Model Compilation
2.2.4. Training Settings
2.2.5. Evaluation of the CNN-Models
- Accuracy is the percentage of correctly classified images with respect to the total number of classified images:
  $$\mathrm{Accuracy}=\frac{\text{number of correctly classified images}}{\text{total number of classified images}}\times 100\%$$
- Precision measures how many of the trained model's predictions for a given texture are correct, with respect to all predictions of that texture:
  $$\mathrm{Precision}_i=\frac{TP_i}{TP_i+FP_i}$$
- Macro-precision averages the precision of the model for each pattern over all patterns:
  $$\text{Macro-precision}=\frac{1}{N}\sum_{i=1}^{N}\mathrm{Precision}_i$$
- Recall measures how many of the images belonging to a given texture class were correctly identified as members of that class:
  $$\mathrm{Recall}_i=\frac{TP_i}{TP_i+FN_i}$$
- Macro-recall averages the recall of the model for each pattern over all patterns:
  $$\text{Macro-recall}=\frac{1}{N}\sum_{i=1}^{N}\mathrm{Recall}_i$$

Here $TP_i$, $FP_i$, and $FN_i$ denote the true positives, false positives, and false negatives for pattern class $i$, and $N$ is the number of pattern classes; the F1-score reported in the results tables is the harmonic mean of precision and recall.
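These macro-averaged metrics can be computed directly from the true and predicted class labels, for example with scikit-learn. The sketch below is illustrative only (scikit-learn is not stated to be part of the authors' toolchain); y_true and y_pred are placeholders for the test labels and the model's predicted classes.

```python
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, f1_score, confusion_matrix)

def evaluate(y_true, y_pred):
    """Compute the reported metrics from integer class labels.
    y_pred is typically the argmax of the model's softmax outputs."""
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "macro_precision": precision_score(y_true, y_pred, average="macro"),
        "macro_recall": recall_score(y_true, y_pred, average="macro"),
        "macro_f1": f1_score(y_true, y_pred, average="macro"),
        "confusion_matrix": confusion_matrix(y_true, y_pred),
    }
```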
3. Results and Discussion
3.1. Performance Assessment of the Evaluated CNN Models When Tested with RR Images from the Same Dataset with Which They Were Fine-Tuned
3.2. Performance Assessment of the Three Evaluated CNN Models When Fine-Tuned with Different Datasets and Tested with Real Captured RR Images
3.3. Evaluation of the Methodologies Employed for Preparing the RR Image Datasets in Relation to the Time Required
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
Appendix A.1. Resulting Confusion Matrices of the Models Trained via Feature Extraction Transfer Learning Strategy on the Subset of Non-Textured Blender Images, and Tested on Its Own Subset
Appendix A.2. Resulting Confusion Matrices of the Models Trained via Feature Extraction Transfer Learning Strategy on the Training Subset of Textured Blender Images, and Tested on Its Testing Subset
Appendix A.3. Resulting Confusion Matrices of the Models Trained via Feature Extraction Transfer Learning Strategy on the Training Subset of Real Camera Images, and Tested on Its Testing Subset
Appendix B
Appendix B.1. Resulting Confusion Matrices of the Models Trained Using the Gradual Unfreezing Strategy on an Augmented Dataset of Non-Textured Blender Images, and Tested on a Dataset Composed from Images Taken by a Camera
Appendix B.2. Resulting Confusion Matrices of the Models Trained Using the Gradual Unfreezing Strategy on an Augmented Dataset of Textured Blender Images, and Tested on a Dataset Composed from Images Taken by a Camera
Appendix B.3. Resulting Confusion Matrices of the Models Trained Using the Gradual Unfreezing Strategy on an Augmented Dataset Composed from Both Non-Textured and Textured Blender Images, and Tested on a Dataset Composed from Images Taken by a Camera
Appendix B.4. Resulting Confusion Matrices of the Models Trained Using the Gradual Unfreezing Strategy on a Dataset Composed from a Training Subset of Camera Images, and Tested on the Testing Portion of the Same Camera Image Dataset
Appendix B.5. Resulting Confusion Matrices of the Models Trained Using the Gradual Unfreezing Strategy on an Augmented Dataset Composed of Non-Textured and Textured Blender Images as Well as the Testing Subset of Camera Images, and Tested on the Testing Portion of the Same Camera Image Dataset
References
- Mahajan, D.; Tajane, R. A Review on Ball Burnishing Process. Int. J. Sci. Res. Publ. 2013, 3, 1–8. [Google Scholar]
- López de Lacalle, L.N.N.; Lamikiz, A.; Muñoa, J.; Sánchez, J.A.A. Quality Improvement of Ball-End Milled Sculptured Surfaces by Ball Burnishing. Int. J. Mach. Tools Manuf. 2005, 45, 1659–1668. [Google Scholar] [CrossRef]
- Gharbi, F.; Sghaier, S.; Al-Fadhalah, K.J.; Benameur, T. Effect of Ball Burnishing Process on the Surface Quality and Microstructure Properties of Aisi 1010 Steel Plates. J. Mater. Eng. Perform. 2011, 20, 903–910. [Google Scholar] [CrossRef]
- Lin, Y.C.; Wang, S.W.; Lai, H.-Y. The Relationship between Surface Roughness and Burnishing Factor in the Burnishing Process. Int. J. Adv. Manuf. Technol. 2004, 23, 666–671. [Google Scholar] [CrossRef]
- Saldaña-Robles, A.; Plascencia-Mora, H.; Aguilera-Gómez, E.; Saldaña-Robles, A.; Marquez-Herrera, A.; Diosdado-De la Peña, J.A. Influence of Ball-Burnishing on Roughness, Hardness and Corrosion Resistance of AISI 1045 Steel. Surf. Coat. Technol. 2018, 339, 191–198. [Google Scholar] [CrossRef]
- Avilés, R.; Albizuri, J.; Rodríguez, A.; López De Lacalle, L.N. Influence of Low-Plasticity Ball Burnishing on the High-Cycle Fatigue Strength of Medium Carbon AISI 1045 Steel. Int. J. Fatigue 2013, 55, 230–244. [Google Scholar] [CrossRef]
- Rodríguez, A.; López de Lacalle, L.N.; Celaya, A.; Lamikiz, A.; Albizuri, J. Surface Improvement of Shafts by the Deep Ball-Burnishing Technique. Surf. Coat. Technol. 2012, 206, 2817–2824. [Google Scholar] [CrossRef]
- Amdouni, H.; Bouzaiene, H.; Montagne, A.; Van Gorp, A.; Coorevits, T.; Nasri, M.; Iost, A. Experimental Study of a Six New Ball-Burnishing Strategies Effects on the Al-Alloy Flat Surfaces Integrity Enhancement. Int. J. Adv. Manuf. Technol. 2017, 90, 2271–2282. [Google Scholar] [CrossRef]
- Pu, Z.; Song, G.L.; Yang, S.; Outeiro, J.C.; Dillon, O.W.; Puleo, D.A.; Jawahir, I.S. Grain Refined and Basal Textured Surface Produced by Burnishing for Improved Corrosion Performance of AZ31B Mg Alloy. Corros. Sci. 2012, 57, 192–201. [Google Scholar] [CrossRef]
- Swirad, S. The Effect of Burnishing Parameters on Steel Fatigue Strength. Nonconv. Technol. Rev. 2007, 1, 113–118. [Google Scholar]
- Świrad, S.; Wydrzynski, D.; Nieslony, P.; Krolczyk, G.M. Influence of Hydrostatic Burnishing Strategy on the Surface Topography of Martensitic Steel. Measurement 2019, 138, 590–601. [Google Scholar] [CrossRef]
- Swirad, S.; Pawlus, P. The Effect of Ball Burnishing on Dry Fretting. Materials 2021, 14, 7073. [Google Scholar] [CrossRef] [PubMed]
- Odintsov, L.G. Hardening and Finishing of Parts by Surface Plastic Deformation: A Handbook (In Russian). Available online: https://djvu.online/file/ljh1WjzABLPAH (accessed on 23 February 2021).
- Schneider, Yu. Operational Properties of Parts with Regular Microrelief (In Russian); Mashinostroenie: Leningrad, Russia, 1982; 248, 3. [Google Scholar]
- Pande, S.S.; Patel, S.M. Investigations on Vibratory Burnishing Process. Int. J. Mach. Tool Des. Res. 1984, 24, 195–206. [Google Scholar] [CrossRef]
- Jerez-Mesa, R.; Gomez-Gras, G.; Travieso-Rodriguez, J.A. Surface Roughness Assessment after Different Strategy Patterns of Ultrasonic Ball Burnishing. Procedia Manuf. 2017, 13, 710–717. [Google Scholar] [CrossRef]
- Jerez-Mesa, R.; Travieso-Rodriguez, J.A.; Gomez-Gras, G.; Lluma-Fuentes, J. Development, Characterization and Test of an Ultrasonic Vibration-Assisted Ball Burnishing Tool. J. Mater. Process Technol. 2018, 257, 203–212. [Google Scholar] [CrossRef]
- Jerez-Mesa, R.; Plana-García, V.; Llumà, J.; Travieso-Rodriguez, J.A. Enhancing Surface Topology of Udimet®720 Superalloy through Ultrasonic Vibration-Assisted Ball Burnishing. Metals 2020, 10, 915. [Google Scholar] [CrossRef]
- Jagadeesh, G.V.; Gangi Setti, S. A Review on Latest Trends in Ball and Roller Burnishing Processes for Enhancing Surface Finish. Adv. Mater. Process. Technol. 2022, 8, 4499–4523. [Google Scholar] [CrossRef]
- Jalindar Varpe, N.; Gurnani, U.; Hamilton, A.; Ramesh, S.; Aditya Kudva, S.; Fernández-Lucio, P.; González-Barrio, H.; Gómez-Escudero, G.; Pereira, O.; López de Lacalle, L.N.; et al. Analysis of the Influence of the Hydrostatic Ball Burnishing Pressure in the Surface Hardness and Roughness of Medium Carbon Steels. IOP Conf. Ser. Mater. Sci. Eng. 2020, 968, 012021. [Google Scholar] [CrossRef]
- Georgiev, D.S.; Slavov, S.D. Research on Tribological Characteristics of the Planar Sliding Pairs Which Have Regular Shaped Roughness Obtained by Using Vibratory Ball Burnishing Process. In Proceedings of the 3rd International Conference Research and Development in Mechanical Industry, Herceg Novi, Serbia and Montenegro, 30 June–2 July 2003; Volume 2, pp. 719–725. [Google Scholar]
- Slavov, S.D.; Dimitrov, D.M. A Study for Determining the Most Significant Parameters of the Ball-Burnishing Process over Some Roughness Parameters of Planar Surfaces Carried out on CNC Milling Machine; Slătineanu, L., Merticaru, V., Mihalache, A.M., Dodun, O., Ripanu, M.I., Nagit, G., Coteata, M., Boca, M., Ibanescu, R., Panait, C.E., Eds.; EDP Sciences: Les Ulis, France, 2018; Volume 178, p. 02005. [Google Scholar]
- Dzyura, V.; Maruschak, P.; Slavov, S.; Dimitrov, D.; Semehen, V.; Markov, O. Evaluating Some Functional Properties of Surfaces with Partially Regular Microreliefs Formed by Ball-Burnishing. Machines 2023, 11, 633. [Google Scholar] [CrossRef]
- López de Lacalle, L.N.; Lamikiz, A.; Sánchez, J.A.; Arana, L. The Effect of Ball Burnishing on Heat-Treated Steel and Inconel 718 Milled Surfaces. Int. J. Adv. Manuf. Technol. 2007, 32, 958–968. [Google Scholar] [CrossRef]
- López de Lacalle, L.N.; Rodríguez, A.; Lamikiz, A.; Celaya, A.; Alberdi, R. Five-Axis Machining and Burnishing of Complex Parts for the Improvement of Surface Roughness. Mater. Manuf. Process. 2011, 26, 997–1003. [Google Scholar] [CrossRef]
- Slavov, S.; Dimitrov, D.; Iliev, I. Variability of Regular Relief Cells Formed on Complex Functional Surfaces by Simultaneous Five-Axis Ball Burnishing. UPB Sci. Bull. Ser. D Mech. Eng. 2020, 82, 195–206. [Google Scholar]
- Malleswara Rao, J.N.; Chenna Kesava Reddy, A.; Rama Rao, P.V. Experimental Investigation of the Influence of Burnishing Tool Passes on Surface Roughness and Hardness of Brass Specimens. Indian. J. Sci. Technol. 2011, 4, 1113–1118. [Google Scholar] [CrossRef]
- Morimoto, T. Effect of Lubricant Fluid on the Burnishing Process Using a Rotating Ball-Tool. Tribol. Int. 1992, 25, 99–106. [Google Scholar] [CrossRef]
- Moshkovich, A.; Perfilyev, V.; Yutujyan, K.; Rapoport, L. Friction and Wear of Solid Lubricant Films Deposited by Different Types of Burnishing. Wear 2007, 263, 1324–1327. [Google Scholar] [CrossRef]
- Hemanth, S.; Harish, A.; Nithin Bharadwaj, R.; Bhat, A.B.; Sriharsha, C. Design of Roller Burnishing Tool and Its Effect on the Surface Integrity of Al 6061. Mater. Today Proc. 2018, 5, 12848–12854. [Google Scholar] [CrossRef]
- Fel’dman, Y.S.; Kravtsov, A.N. Proficorder Study of Vibratory Ball-Burnished Surfaces. Meas. Tech. 1969, 12, 1789–1790. [Google Scholar] [CrossRef]
- Schneider, Y.G. Formation of Surfaces with Uniform Micropatterns on Precision Machine and Instruments Parts. Precis. Eng. 1984, 6, 219–225. [Google Scholar] [CrossRef]
- Slavov, S. An Algorithm for Generating Optimal Toolpaths for CNC Based Ball-Burnishing Process of Planar Surfaces. In Advances in Intelligent Systems and Computing; Springer: Berlin/Heidelberg, Germany, 2018; Volume 680, pp. 365–375. ISBN 9783319683232. [Google Scholar]
- Rodríguez, A.; López De Lacalle, L.N.; Celaya, A.; Fernández, A.; Lamikiz, A. Ball Burnishing Application for Finishing Sculptured Surfaces in Multi-Axis Machines. Int. J. Mechatron. Manuf. Syst. 2011, 4, 220–237. [Google Scholar] [CrossRef]
- Slavov, S.D.; Dimitrov, D.M. Modelling the Dependence between Regular Reliefs Ridges Height and the Ball Burnishing Regime’s Parameters for 2024 Aluminum Alloy Processed by Using CNC-Lathe Machine. IOP Conf. Ser. Mater. Sci. Eng. 2021, 1037, 012016. [Google Scholar] [CrossRef]
- Slavov, S.D.; Iliev, I.V. Research on the Variability of the Burnishing Force during Processing Surfaces with 3D Shape by Using Simultaneous 5-Axis Ball-Burnishing Process Implemented on CNC Milling Machine. Annu. J. Tech. Univ. Varna Bulg. 2017, 1, 6–12. [Google Scholar] [CrossRef]
- Shiou, F.J.; Banh, Q.N. Development of an Innovative Small Ball-Burnishing Tool Embedded with a Load Cell. Int. J. Adv. Manuf. Technol. 2016, 87, 31–41. [Google Scholar] [CrossRef]
- Shiou, F.J.; Chuang, C.H. Precision Surface Finish of the Mold Steel PDS5 Using an Innovative Ball Burnishing Tool Embedded with a Load Cell. Precis. Eng. 2010, 34, 76–84. [Google Scholar] [CrossRef]
- Luo, H.; Liu, J.; Wang, L.; Zhong, Q. Investigation of the Burnishing Process with PCD Tool on Non-Ferrous Metals. Int. J. Adv. Manuf. Technol. 2005, 25, 454–459. [Google Scholar] [CrossRef]
- Chervach, Y.; Kim, A.; Dorzhiev, D. Burnishing Tool Actuators and Their Influence on the Burnishing Force Components. In Proceedings of the IOP Conference Series: Materials Science and Engineering, Bristol, UK, 10–12 January 2016; Volume 124, p. 12153. [Google Scholar]
- GOST 24773-1981; Surfaces with Regular Microshape. Classification, Parameters and Characteristics. State Standard of the USSR: Moscow, Russia, 1988. Available online: https://gostperevod.com/gost-24773-81.html (accessed on 12 April 2025).
- Joshi, K.; Patil, B. Performance Evaluation of Various Texture Analysis Techniques for Machine Vision-Based Characterisation of Machined Surfaces. Int. J. Comput. Vis. Robot. 2020, 10, 242–259. [Google Scholar] [CrossRef]
- Qiao, Q.; Ahmad, A.; Hu, H.; Wang, K. Advancements in Industrial Product Surface Defect Detection: From Traditional Methods to Modern Advanced Techniques. Front. Artif. Intell. Appl. 2024, 393, 82–89. [Google Scholar] [CrossRef]
- Tatari, M.S. Selection And Parametrization Of Texture Analysis Methods For The Automation Of Industrial Visual Inspection Tasks. Appl. Digit. Image Process. X 1988, 0829, 86–94. [Google Scholar] [CrossRef]
- Joshi, K.; Patil, B. Evaluation of Surface Roughness by Machine Vision Using Neural Networks Approach. In Lecture Notes in Intelligent Transportation and Infrastructure; Springer: Singapore, 2020; pp. 25–31. [Google Scholar] [CrossRef]
- Vakharia, V.; Kiran, M.B.; Dave, N.J.; Kagathara, U. Feature Extraction and Classification of Machined Component Texture Images Using Wavelet and Artificial Intelligence Techniques. In Proceedings of the 2017 8th International Conference on Mechanical and Aerospace Engineering, ICMAE, Prague, Czech Republic, 22–25 July 2017; pp. 140–144. [Google Scholar] [CrossRef]
- Haobo, Y. A Survey of Industrial Surface Defect Detection Based on Deep Learning. In Proceedings of the 2024 International Conference on Cyber-Physical Social Intelligence (ICCSI), Doha, Qatar, 8–12 November 2024; pp. 1–6. [Google Scholar] [CrossRef]
- González, E.; Bianconi, F.; Álvarez, M.X.; Saetta, S.A. Automatic Characterization of the Visual Appearance of Industrial Materials through Colour and Texture Analysis: An Overview of Methods and Applications. Adv. Opt. Technol. 2013, 2013, 503541. [Google Scholar] [CrossRef]
- Salcedo-Sanz, S.; Rojo-Álvarez, J.L.; Martínez-Ramón, M.; Camps-Valls, G. Support Vector Machines in Engineering: An Overview. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2014, 4, 234–267. [Google Scholar] [CrossRef]
- Abiodun, O.I.; Jantan, A.; Omolara, A.E.; Dada, K.V.; Mohamed, N.A.E.; Arshad, H. State-of-the-Art in Artificial Neural Network Applications: A Survey. Heliyon 2018, 4, e00938. [Google Scholar] [CrossRef] [PubMed]
- Hmidani, O.; Ismaili Alaoui, E.M. A Comprehensive Survey of the R-CNN Family for Object Detection. In Proceedings of the 2022 5th International Conference on Advanced Communication Technologies and Networking, CommNet 2022, Marrakech, Morocco, 12–14 December 2022. [Google Scholar] [CrossRef]
- Das, A.; Nandi, A.; Deb, I. Recent Advances in Object Detection Based on YOLO-V4 and Faster RCNN: A Review. In Mathematical Modeling for Computer Applications; Scrivener Publishing: Beverly, MA, USA, 2024; pp. 405–417. [Google Scholar] [CrossRef]
- Zaghdoudi, R.; Seridi, H.; Boudiaf, A. Steel Surface Defects Classification Based on Multi-Derivatives Locally Encoded Transform Feature Histogram (LETRIST) Features. In Proceedings of the 2024 2nd International Conference on Electrical Engineering and Automatic Control, ICEEAC 2024, Setif, Algeria, 12–14 May 2024. [Google Scholar] [CrossRef]
- Sarkale, A.; Shah, K.; Chaudhary, A.; Nagarhalli, T. An Innovative Machine Learning Approach for Object Detection and Recognition. In Proceedings of the 2018 Second International Conference on Inventive Communication and Computational Technologies (ICICCT), Coimbatore, India, 20–21 April 2018; pp. 1008–1010. [Google Scholar]
- Teja Sreenivas, K.; Venkata Raju, K.; Bhavya Spandana, M.; Sri Harshavardhan Reddy, D.; Bhavani, V. Performance Analysis of Convolutional Neural Network When Augmented with New Classes in Classification. In Soft Computing for Problem Solving: SocProS; Springer: Singapore, 2020; pp. 631–644. [Google Scholar]
- Mikołajczyk, A.; Grochowski, M. Data Augmentation for Improving Deep Learning in Image Classification Problem. In Proceedings of the 2018 International Interdisciplinary PhD Workshop, IIPhDW 2018, Swinoujscie, Poland, 9–12 May 2018; pp. 117–122. [Google Scholar] [CrossRef]
- Harianto, R.A.; Pranoto, Y.M.; Gunawan, T.P. Data Augmentation and Faster RCNN Improve Vehicle Detection and Recognition. In Proceedings of the 2021 3rd East Indonesia Conference on Computer and Information Technology (EIConCIT), Surabaya, Indonesia, 9–11 April 2021; pp. 128–133. [Google Scholar]
- Kiran, V.K.; Dash, S.; Parida, P. Improvement on Deep Features through Various Enhancement Techniques for Vehicles Classification. Sens. Imaging 2021, 22, 41. [Google Scholar] [CrossRef]
- Falcone, F.; Iliev, T.; Szolga, A.; Balbayev, G.; Markov, M.; Kalinin, Y.; Markova, V.; Ganchev, T. Towards Implementation of Emotional Intelligence in Human–Machine Collaborative Systems. Electronics 2023, 12, 3852. [Google Scholar] [CrossRef]
- Ding, Z.; Zheng, S.; Zhang, F.; Li, Q.; Guo, C. Lensfree Auto-Focusing Imaging with Coarse-to-Fine Tuning Method. Opt. Lasers Eng. 2024, 181, 108366. [Google Scholar] [CrossRef]
- Xiao, L.; Song, J.; Xie, X.; Fan, C. Enhanced Medical Image Segmentation Using U-Net with Residual Connections and Dual Attention Mechanism. Eng. Appl. Artif. Intell. 2025, 153, 110794. [Google Scholar] [CrossRef]
- Xiao, L.; Zhou, B.; Fan, C. Automatic Brain MRI Tumors Segmentation Based on Deep Fusion of Weak Edge and Context Features. Artif. Intell. Rev. 2025, 58, 154. [Google Scholar] [CrossRef]
- Rydzi, S.; Zahradnikova, B.; Sutova, Z.; Ravas, M.; Hornacek, D.; Tanuska, P. A Predictive Quality Inspection Framework for the Manufacturing Process in the Context of Industry 4.0. Sensors 2024, 24, 5644. [Google Scholar] [CrossRef]
- Dovramadjiev, T.; Dobreva, D.; Murzova, T.; Murzova, M.; Markov, V.; Iliev, I.; Cankova, K.; Jecheva, G.; Staneva, G. Interaction Between Artificial Intelligence, 2D and 3D Open Source Software, and Additive Technologies for the Needs of Design Practice. In World Conference on Information Systems for Business Management; Springer: Singapore, 2024; pp. 339–350. [Google Scholar]
- Nawaz, S.A.; Li, J.; Bhatti, U.A.; Shoukat, M.U.; Ahmad, R.M. AI-Based Object Detection Latest Trends in Remote Sensing, Multimedia and Agriculture Applications. Front. Plant Sci. 2022, 13, 1041514. [Google Scholar] [CrossRef]
- Khan, M.F.; Sajid Farooq, M.; Joghee, S. Increase the Degree of Accuracy by Employing A More Accurate Classification Approach. In Proceedings of the 2023 International Conference on Business Analytics for Technology and Security (ICBATS); Dubai, UAE, 7–8 March 2023; pp. 1–7. [Google Scholar]
- Boesch, G. Very Deep Convolutional Networks (VGG): Essential Guide. Viso.ai. Available online: https://viso.ai/deep-learning/vgg-very-deep-convolutional-networks/ (accessed on 11 May 2025).
- Tammina, S. Transfer Learning Using VGG-16 with Deep Convolutional Neural Network for Classifying Images. Int. J. Sci. Res. Publ. 2019, 9, 9420. [Google Scholar] [CrossRef]
- Lenyk, Z.; Park, J. Microsoft Vision Model ResNet-50 Combines Web-Scale Data and Multi-Task Learning to Achieve State of the Art—Microsoft Research. Available online: https://www.microsoft.com/en-us/research/blog/microsoft-vision-model-resnet-50-combines-web-scale-data-and-multi-task-learning-to-achieve-state-of-the-art/ (accessed on 11 May 2025).
- Dong, K.; Zhou, C.; Ruan, Y.; Li, Y. MobileNetV2 Model for Image Classification. In Proceedings of the 2020 2nd International Conference on Information Technology and Computer Application, ITCA 2020, Guangzhou, China, 18–20 December 2020; pp. 476–480. [Google Scholar] [CrossRef]
- Huang, G.; Liu, Z.; Van Der Maaten, L.; Weinberger, K.Q. Densely Connected Convolutional Networks. In Proceedings of the 30th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2017, Honolulu, HI, USA, 21–26 July 2017; pp. 2261–2269. [Google Scholar] [CrossRef]
- Densely Connected Convolutional Networks|IEEE Conference Publication|IEEE Xplore. Available online: https://ieeexplore.ieee.org/document/8099726 (accessed on 11 May 2025).
- Slavov, S.D.; Dimitrov, D.M.; Konsulova-Bakalova, M.I. Advances in Burnishing Technology. In Advanced Machining and Finishing; Elsevier: Amsterdam, The Netherlands, 2021; pp. 481–525. [Google Scholar]
- Slavov, S.; Markov, O. A Tool for Ball Burnishing with Ability for Wire and Wireless Monitoring of the Deforming Force Values. In Acta Technica Napocensis-Series: Applied Mathematics, Mechanics, and Engineering; Acta Technica Napocensis: Cluj-Napoca, Romania, 2023; Volume 65. [Google Scholar]
- Nečas, D.; Klapetek, P. Gwyddion: An Open-Source Software for SPM Data Analysis. Open Phys. 2012, 10, 181–188. [Google Scholar] [CrossRef]
- Blender.Org—Home of the Blender Project—Free and Open 3D Creation Software. Available online: https://www.blender.org/ (accessed on 20 April 2025).
- Murray, J.D.; VanRyper, W. Encyclopedia of Graphics File Formats; O’Reilly & Associates, Inc.: Sebastopol, CA, USA, 1996; ISBN 1-56592-058-9. [Google Scholar]
- HDRIs • Poly Haven. Available online: https://polyhaven.com/hdris (accessed on 20 April 2025).
- Naidu, G.; Zuva, T.; Sibanda, E.M. A Review of Evaluation Metrics in Machine Learning Algorithms. In Lecture Notes in Networks and Systems; Springer: Cham, Switzerland, 2023; pp. 15–25. [Google Scholar] [CrossRef]
[Example 2D images of RR Textures 1–8 from the three sources: generated by Blender, derived from the Alicona scanning microscope, and captured by the physical photo camera.]
Type of the Image Dataset Used for CNN Model Fine-Tuning | Total Number of Rendered and/or Captured Images | Images Rendered in Blender Software | Images Rendered with Textures Captured by the Alicona Microscope | Images Captured by the Physical Photo Camera | Overall Time for Dataset Preparation [min] | Average Time for a Single Image Extraction [s]
---|---|---|---|---|---|---|
Only Blender rendered images | 34,037 | 34,037 (100%) | - | - | 621 | 1.09 |
Only images rendered with textures from Alicona microscope | 3232 | - | 3232 (100%) | - | 512 | 9.50 |
Only captured images by the physical photo-camera | 2065 | - | - | 2065 (100%) | 452 | 13.13 |
Mixture of Blender rendered images + textures from Alicona microscope | 37,269 | 34,037 (91%) | 3232 (9%) | - | 1133 | 1.82 |
Mixture of Blender rendered images + textures from Alicona microscope + physical camera captured images | 39,334 | 34,037 (87%) | 3232 (8%) | 2065 (5%) | 1585 | 2.42 |
| Type of the Image Dataset Used for Fine-Tuning the CNN Model via Transfer Learning | Metrics | CNN-Model DenseNet121 | CNN-Model MobileNetV2 | CNN-Model VGG16 |
|---|---|---|---|---|
| Rendered non-textured images in Blender | Accuracy | 29% | 24% | 29% |
| | Precision | 26% | 21% | 21% |
| | Recall | 33% | 26% | 29% |
| | F1-score | 29% | 23% | 24% |
| Rendered with textured images from the Alicona microscope | Accuracy | 44% | 26% | 31% |
| | Precision | 30% | 29% | 25% |
| | Recall | 44% | 28% | 29% |
| | F1-score | 36% | 28% | 27% |
| Rendered in Blender non-textured + textured images from the Alicona microscope | Accuracy | 38% | 31% | 32% |
| | Precision | 32% | 36% | 30% |
| | Recall | 41% | 31% | 32% |
| | F1-score | 36% | 33% | 31% |
| Rendered in Blender non-textured + textured images from the Alicona microscope + physical camera captured images mix | Accuracy | 91% | 65% | 96% |
| | Precision | 92% | 77% | 96% |
| | Recall | 91% | 63% | 96% |
| | F1-score | 91% | 69% | 96% |
| Captured images from the physical photo camera | Accuracy | 98% | 98% | 99% |
| | Precision | 98% | 98% | 99% |
| | Recall | 98% | 98% | 99% |
| | F1-score | 98% | 98% | 99% |
| Type of the Image Dataset Used for Fine-Tuning the CNN Model via the Gradual Unfreezing Transfer Learning Strategy | Metrics (Macro-Averaged) | CNN-Model DenseNet121 | CNN-Model VGG16 |
|---|---|---|---|
| Augmented rendered non-textured images in Blender | Accuracy | 44% | 31% |
| | Precision | 46% | 33% |
| | Recall | 44% | 31% |
| | F1-score | 45% | 32% |
| Augmented rendered with textured images from the Alicona microscope | Accuracy | 43% | 33% |
| | Precision | 49% | 36% |
| | Recall | 42% | 33% |
| | F1-score | 45% | 35% |
| Augmented rendered in Blender non-textured + augmented textured images from the Alicona microscope | Accuracy | 40% | 30% |
| | Precision | 44% | 39% |
| | Recall | 40% | 30% |
| | F1-score | 42% | 34% |
| Augmented rendered in Blender non-textured + augmented textured images from the Alicona microscope + physical camera captured images | Accuracy | 92% | 95% |
| | Precision | 92% | 94% |
| | Recall | 92% | 95% |
| | F1-score | 92% | 94% |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).