Proceeding Paper

Integrating Data Science and Numerical Methods for Next-Generation Metal Processing †

by Amir M. Horr * and Rodrigo Gómez Vázquez
LKR Light Metals Technologies Ranshofen, AIT Austrian Institute of Technology, 1210 Vienna, Austria
* Author to whom correspondence should be addressed.
Presented at the 2nd International Electronic Conference on Metals, 5–7 May 2025; Available online: https://sciforum.net/event/IECME2025.
Mater. Proc. 2025, 24(1), 1; https://doi.org/10.3390/materproc2025024001
Published: 21 August 2025
(This article belongs to the Proceedings of The 2nd International Electronic Conference on Metals)

Abstract

The structured integration of analytical methods, numerical simulations, and emerging data science techniques enables a highly efficient and robust modeling approach for manufacturing processes. To successfully implement advanced analytical strategies, numerical methods, and data-driven tools within digital twin or digital shadow frameworks for next-generation metal processing, several critical requirements must be addressed. This paper discusses the foundational elements necessary for the seamless integration of these technologies, with a focus on achieving impactful optimization and precise control of material processes. The research highlights the outcomes of combining data-driven models with high-fidelity numerical simulations, emphasizing their complementary roles in process control and data generation for future-oriented manufacturing modeling.

1. Introduction

Integrating advanced numerical methods with emerging data science techniques holds significant potential for optimizing existing metal processing operations, designing innovative processes, and enabling active process control. This synergy can transform the manufacturing landscape by fostering intelligent human–machine collaboration, enhancing adaptability, improving efficiency, and reducing costs—contributing to more sustainable and environmentally friendly production systems. Data science approaches, including statistical modeling, interpolation, machine learning, and predictive analytics, can uncover intricate patterns within process data, facilitating real-time optimization, predictive-corrective control strategies, and early fault detection [1]. The challenge of constructing robust data models can be addressed through the combined use of numerical simulations for data generation, strategic sampling, and advanced data science methods for interpolation and database development. Furthermore, numerical techniques such as finite element analysis (FEA), computational fluid dynamics (CFD), and optimization algorithms enable high-fidelity simulations of complex, multi-physical, multi-phase, and multi-scale metallurgical processes, supporting precise control and continuous process improvement.
This integrated approach enables the development of digital twin and digital shadow frameworks that accurately replicate metal processing operations, facilitating active process control and ultimately leading to enhanced manufacturing performance and reduced energy consumption. In this research, the development and implementation of data models, interpolators, and machine learning algorithms encompassed the collection and generation of process data, the construction of real-time predictive models, and their deployment within metal processing environments. Numerical simulations were employed to generate results for a range of sampled process scenarios, forming the basis for constructing validated process databases. Additionally, a dynamic framework incorporating an error-feedback mechanism was proposed, allowing the system to continuously learn from operational data and adapt to evolving process conditions by updating both the data models and their underlying databases.

2. Data Methodology: Process Data Handling

Data science techniques are increasingly being employed to enhance process efficiency, reduce energy consumption and operational costs, and enable active control in advanced manufacturing systems. Metal processing involves inherently complex phenomena—such as phase transformations, heat transfer, and deformation mechanics—that must be accurately understood and modeled to enable reliable predictions and control strategies [2,3,4,5]. These critical aspects are captured through numerical simulation methods, including FEA and CFD, as well as through data-driven models tailored to simulate complex manufacturing processes such as casting, extrusion, and additive manufacturing. On the data science front, expertise in solver technologies, data interpolation, machine learning algorithms, statistical modeling, and data preprocessing—such as filtering, mapping, and transformation—is essential. This includes the ability to sample data effectively, construct robust databases, and manage live datasets generated from sensors, simulations, and experimental setups.

2.1. Data Generation

The application of data-driven models in manufacturing processes such as casting is well established, with both the associated challenges and benefits extensively discussed in the literature [6]. However, during database construction and data modeling, the quality of available data often poses significant limitations. Specifically, poor data quality—characterized by imbalance and under-representation across the multi-dimensional process parameter space—can introduce substantial errors and reduce model accuracy. In processes like casting, which exhibit high data gradients (e.g., steep thermal profiles during heating and cooling), the density and distribution of data across various process dimensions critically influence the predictive performance of data models. Therefore, ensuring a well-balanced and representative dataset is essential for developing accurate and reliable models.
In this research, a comprehensive data acquisition and generation strategy was implemented to construct a balanced database and enable the development of robust real-time models for casting processes. A rigorous framework was established to ensure sufficient process data coverage. The development methodology comprises the following key steps and milestones, designed to ensure the accuracy, reliability, and contextual relevance of the resulting models:
  • Definition of Process Parameters: Identification and specification of relevant process parameters and their practical operating ranges for industrial casting processes.
  • Scenario Generation: Formulation of process scenarios by systematically combining variations in the defined parameters to cover the multi-dimensional process space.
  • Snapshot Matrix Construction: Development of a snapshot matrix representing the process characteristics under varying parameter conditions.
  • Numerical Simulation: Execution of computational simulations for the defined scenarios using an open-source CFD solver [7], leading to the creation of a comprehensive process database.
  • Model Development: Construction of real-time data models using the process database, employing a combination of numerical solvers, interpolation techniques, and machine learning algorithms.
  • Design of Experiments (DOE): Implementation of additional process scenarios through DOE methodologies to support model validation and refinement.
  • Model Validation: Rigorous validation of the developed models through comparative studies, employing both statistical and deterministic evaluation techniques to assess model performance and reliability.
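As a hedged illustration of how these steps fit together, the following Python sketch generates scenarios over assumed parameter ranges, substitutes a toy analytic function for the CFD solver, and assembles a snapshot matrix. All parameter names, ranges, and responses are illustrative, not the values used in this study.

```python
import numpy as np

# Illustrative process parameters and operating ranges (assumed values).
PARAMS = {
    "melt_temperature_C": (650.0, 720.0),
    "casting_speed_mm_s": (1.0, 3.0),
    "cooling_flow_l_min": (50.0, 150.0),
}

def generate_scenarios(n, seed=0):
    """Step 2: combine parameter variations into process scenarios."""
    rng = np.random.default_rng(seed)
    lows = np.array([lo for lo, _ in PARAMS.values()])
    highs = np.array([hi for _, hi in PARAMS.values()])
    return lows + rng.random((n, len(PARAMS))) * (highs - lows)

def simulate(scenario):
    """Step 4 placeholder: a real workflow would launch a CFD run here."""
    # Toy surrogate: a smooth response standing in for a temperature field.
    return np.array([scenario.sum(), scenario.prod() ** 0.5])

def build_snapshot_matrix(scenarios):
    """Step 3: one row per scenario, columns hold the simulated response."""
    return np.vstack([simulate(s) for s in scenarios])

scenarios = generate_scenarios(n=8)
snapshots = build_snapshot_matrix(scenarios)
print(scenarios.shape, snapshots.shape)  # (8, 3) (8, 2)
```

In a full pipeline, `simulate` would be replaced by a call that configures and runs the CFD solver, and the snapshot matrix would feed the model-development and validation steps.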
Although a comprehensive representation of the multi-physical and multi-scale nature of casting processes requires fully coupled fluid–thermal–mechanical simulations—including microstructural evolution—this research focuses solely on the development of real-time models based on fluid–thermal simulations, implemented using an open-source CFD solver.

2.2. Data Snapshotting/Model Building

To effectively capture the variability and complexity of casting process parameters across a multi-dimensional parameter space, data snapshotting can be performed using advanced sampling techniques such as Latin Hypercube Sampling (LHS) and Sobol sequences [1]. These methods are employed to cover the whole data space and to generate a representative set of process scenarios by systematically exploring the input parameter space. LHS ensures that each parameter range is sampled evenly by dividing it into intervals of equal probability and randomly selecting values from each interval, thereby maximizing coverage with fewer samples. Sobol sequences, on the other hand, are low-discrepancy quasi-random sequences that provide a more uniform and deterministic distribution of sample points across high-dimensional spaces.
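Both samplers are available in SciPy's quasi-Monte Carlo module; the sketch below draws 16 scenarios over three assumed casting parameters with each method. The parameter bounds are illustrative placeholders.

```python
import numpy as np
from scipy.stats import qmc

# Three illustrative casting parameters (assumed ranges):
# melt temperature (C), casting speed (mm/s), cooling flow rate (l/min).
l_bounds = [650.0, 1.0, 50.0]
u_bounds = [720.0, 3.0, 150.0]

# Latin Hypercube: each parameter range is split into n equal-probability
# strata, and exactly one sample is drawn per stratum.
lhs = qmc.LatinHypercube(d=3, seed=42)
unit_lhs = lhs.random(n=16)
lhs_points = qmc.scale(unit_lhs, l_bounds, u_bounds)

# Sobol: a deterministic low-discrepancy sequence; sample counts that are
# powers of two preserve its balance properties.
sobol = qmc.Sobol(d=3, scramble=True, seed=42)
sobol_points = qmc.scale(sobol.random_base2(m=4), l_bounds, u_bounds)  # 2^4 = 16

# Discrepancy quantifies uniformity of coverage (lower is more uniform).
print(qmc.discrepancy(unit_lhs))
```

The stratification property is what lets LHS achieve balanced coverage with fewer samples than plain random sampling, while Sobol points fill high-dimensional spaces more uniformly as the sample count grows.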
In this study, snapshot matrices were constructed using the LHS technique to ensure a balanced and representative coverage of fundamental process scenarios. The variations in process parameters were systematically combined to define the spatial extent of the search space, capturing the full range of operational limits for baseline scenarios. Additionally, multiple DOE configurations were employed to represent normal, near-boundary, and extreme process conditions, enabling rigorous validation of the data models under realistic industrial scenarios. Post-processing of simulation and model data for real-time applications can be conducted in one-, two-, and three-dimensional formats. This allows for comparative analysis of model predictions and DOE results along lines (1D), within planar contours (2D), or across volumetric domains (3D). For the scope of this research, validation was performed using 1D and 2D contour comparisons.
The selection of appropriate data solvers and their corresponding interpolation schemes plays a critical role in determining the accuracy and robustness of the resulting data models [5]. Particular attention was given to accurately capturing initial process conditions—such as the initial melt temperature—and to ensuring precise fitting in regions with steep thermal gradients, such as zones of rapid cooling or heating. Furthermore, machine learning routines were employed for model calibration and training, enhancing the predictive capability of the models. Finally, the real-time models were validated across a spectrum of process conditions, including normal, near-boundary, and extreme cases. The associated computational performance, including model execution time, was also evaluated to ensure the models meet the practical requirements for real-time deployment in casting process simulations. Figure 1 illustrates a typical snapshot matrix, the correlation of process data, and the graphical interface used for data model generation.
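To illustrate the role of the interpolation scheme, the sketch below fits a radial-basis-function interpolator, one possible choice of data solver and not necessarily the one used in this study, to scattered snapshot data from a toy analytic response with a steep-gradient character.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Assumed two-dimensional parameter space: melt temperature (C) and
# casting speed (mm/s); the "CFD output" is a toy analytic stand-in.
rng = np.random.default_rng(1)
X = rng.uniform([650.0, 1.0], [720.0, 3.0], size=(40, 2))

def toy_cfd(x):
    # Smooth stand-in for a thermal response with a localized peak.
    return np.exp(-((x[:, 0] - 690.0) / 20.0) ** 2) * x[:, 1]

y = toy_cfd(X)

# Thin-plate-spline RBF handles scattered multi-dimensional snapshots and
# interpolates the training points exactly (zero smoothing by default).
model = RBFInterpolator(X, y, kernel="thin_plate_spline")

x_new = np.array([[685.0, 2.0]])
print(float(model(x_new)[0]), float(toy_cfd(x_new)[0]))
```

In regions of steep gradients, the density of snapshot points around the gradient zone, rather than the interpolator alone, governs the achievable accuracy, which is why balanced sampling matters.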

3. Numerical Simulations: Data Generation

One common misunderstanding in the field of data-driven modeling is the belief that data models can fully replace physical understanding, experimental validation, or traditional numerical simulations. While data models—especially those enhanced by machine learning—offer powerful tools for pattern recognition, real-time predictions, and process control, they are inherently dependent on the quality and scope of the data they are trained on. Without a solid foundation in the underlying physics of the process, data models risk becoming black boxes that may produce accurate results under normal process conditions but fail when extrapolated beyond their training domain. Experimental and numerical simulation validations remain essential to ensure that models reflect real-world behavior, particularly in complex, multi-physical process systems like casting. Therefore, rather than serving as a replacement, data models should be viewed as complementary tools that enhance, but do not substitute, physical understanding and rigorous validation.

3.1. CFD Simulation

To investigate the thermal evolution and solidification behavior in two distinct aluminum casting processes—Continuous Casting (CC) and High-Pressure Die Casting (HPDC)—dedicated CFD frameworks were employed. For the CC process, fluid–thermal simulations were conducted using directChillFoam [7], a specialized solver built on the OpenFOAM CFD platform [8], tailored for modeling continuous billet casting. This model captures transient heat transfer and melt flow dynamics, incorporating phase-change phenomena through CALPHAD-based melt-fraction tables. Buoyancy-driven convection effects are modeled using the Boussinesq approximation to account for thermal gradients within the melt.
For the HPDC process, simulations were carried out using the commercial CFD software NovaFlow&Solid (version 6.67) [9], which is optimized for industrial-scale solidification modeling. The primary focus was on capturing the thermal cycling behavior of the die over successive casting cycles. The solver’s flexibility in rapid case reconfiguration enabled efficient parametric studies, maintaining computational feasibility despite the inherently short cycle time characteristic of HPDC operations.

3.1.1. Model Setup

For the CC simulations, a pseudo-2D axisymmetric wedge-shaped computational domain was employed, replicating the geometry validated in the experimental study by [10]. The domain was discretized using structured mesh elements with linear axial grading to enhance resolution in regions of steep thermal gradients. The alloy modeled was Al–6wt%Cu, with thermophysical properties derived from a combination of experimental measurements and CALPHAD-based thermodynamic data. Thermal boundary conditions were applied using heat transfer coefficients (HTCs): primary cooling at the mold–metal interface was modeled using locally averaged, solid-fraction-dependent HTC values, while secondary cooling—representing water spray contact—was implemented using tabulated HTC correlations as a function of surface temperature and cooling intensity. All simulation scenarios were executed in multiple stages, each with a total runtime of 2000 s—exceeding twice the duration necessary to achieve quasi-steady-state conditions for the respective casting cases. During the final 1000 s, time-averaged values of temperature and melt fraction were recorded at the centers of all computational cells, along with their corresponding spatial coordinates (X and Z).
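The tabulated secondary-cooling HTC boundary condition can be sketched as a piecewise-linear lookup over surface temperature; the table values below are hypothetical placeholders, not the correlations used in the study.

```python
import numpy as np

# Hypothetical secondary-cooling HTC table (illustrative values only):
# heat transfer coefficient vs. billet surface temperature.
surface_T_C = np.array([100.0, 200.0, 300.0, 400.0, 500.0])       # surface temperature (C)
htc_W_m2K = np.array([12000.0, 9000.0, 5000.0, 2500.0, 1500.0])   # HTC (W/m^2 K)

def secondary_cooling_htc(T):
    """Piecewise-linear lookup; np.interp clamps outside the tabulated range."""
    return np.interp(T, surface_T_C, htc_W_m2K)

# Resulting boundary heat flux q = h(T) * (T_surface - T_coolant).
T_coolant = 30.0
T_surf = 350.0
q = secondary_cooling_htc(T_surf) * (T_surf - T_coolant)
print(secondary_cooling_htc(T_surf), q)  # 3750.0 1200000.0
```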
In the HPDC simulations, the computational domain encompassed all critical die components, including the cooling channels, ingate system, and piston chamber. The alloy used was AlSi9Cu3(Fe), and the die material was H13 tool steel, selected for its high thermal fatigue resistance. Initial thermal conditions assumed uniform die and chamber temperatures of 220 °C, while coolant loop temperatures ranged from 80 °C to 160 °C, depending on the cooling medium (water or oil). Each simulation case included six complete filling and solidification cycles to capture the cumulative thermal loading and inter-cycle cooling behavior. The model also incorporated spray cooling and air-blowing stages between cycles to reflect realistic die preparation procedures.
This multi-cycle approach enabled accurate prediction of die temperature evolution and its influence on casting quality and repeatability. Figure 2 presents the HPDC and vertical casting machines, along with their corresponding numerical simulation domains.

3.1.2. Running Snapshot Scenarios

To support database development and data model training, multiple transient CC simulations were conducted with systematic variations in key process parameters, including initial melt temperature, cooling flow rate, and casting speed. Each simulation was run for 2000 s to ensure the system reached a quasi-steady-state regime. To account for natural process fluctuations and transient effects, time-averaged fields of temperature and melt fraction were extracted over the final 1000 s of each run. These spatially resolved datasets were exported in CSV format, enabling downstream data analysis and integration into machine learning workflows.
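The averaging-and-export step can be sketched as follows, with a synthetic decaying transient standing in for solver output; the output interval, cell count, and field values are assumptions for illustration.

```python
import io
import numpy as np

# Synthetic per-timestep temperature fields from a 2000 s transient run,
# sampled at an assumed 1 s output interval over 5 cells.
dt = 1.0
times = np.arange(0.0, 2000.0, dt)
n_cells = 5
coords = np.column_stack([np.linspace(0, 0.1, n_cells),   # X (m)
                          np.linspace(0, 0.5, n_cells)])  # Z (m)

rng = np.random.default_rng(0)
# Decaying transient around 660 C plus small fluctuations.
fields = (660.0 + 50.0 * np.exp(-times / 300.0)[:, None]
          + rng.normal(0, 0.5, (times.size, n_cells)))

mask = times >= 1000.0             # quasi-steady averaging window
T_avg = fields[mask].mean(axis=0)  # time-averaged value per cell

# Export one row per cell: coordinates plus time-averaged temperature.
buf = io.StringIO()
np.savetxt(buf, np.column_stack([coords, T_avg]),
           delimiter=",", header="X,Z,T_avg", comments="")
print(buf.getvalue().splitlines()[0])  # X,Z,T_avg
```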
In the HPDC simulations, the piston motion profile was predefined, with only the maximum piston speed varied between cases. Initial melt temperature was also adjusted across scenarios to reflect realistic process variability. Temperature histories were recorded at six strategically selected locations within the die assembly, capturing both transient and cycle-dependent thermal behavior. This approach yielded a compact, cycle-resolved dataset suitable for training data models capable of representing the dynamic thermal evolution characteristic of HPDC processes.

3.2. Database Building

The development of robust, high-fidelity databases is critical for enabling real-time predictive modeling, process optimization, and advanced process control in manufacturing. In this study, process-specific databases were constructed using sampled snapshot matrices derived from verified numerical simulations, complemented by targeted experimental measurements. This hybrid methodology leverages the scalability and consistency of validated computational models while incorporating empirical insights from physical experiments to ensure realism and accuracy. Each sampled process scenario represents a distinct set of operating conditions, for which transient simulations were performed using high-fidelity numerical solvers. The resulting outputs—such as temperature fields, melt-fraction distributions, and flow characteristics—were structured into snapshot matrices, where each row corresponds to a unique scenario and each column captures spatially or temporally resolved process variables. These matrices serve as the foundational structure of the database, supporting downstream applications including surrogate modeling, machine learning, and sensitivity analysis.
To enhance the semantic richness, interoperability, and scalability of the database, a semantic data structure was implemented using an ontological framework [11]. This framework provides a formalized representation of domain knowledge by defining key entities (e.g., materials, process parameters, and initial and boundary conditions), their interrelationships (e.g., affects/is part of), and associated constraints. Ultimately, the integration of sampled snapshot matrices, verified simulations, limited experimental validation, and ontology-based semantic modeling establishes a comprehensive and extensible foundation for data-centric process databases that are particularly suited for complex manufacturing processes such as casting.
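A minimal sketch of such an ontology-style semantic layer stores facts as subject/relation/object triples and answers simple queries over them; the entity and relation names below are illustrative, not the study's actual schema.

```python
# Entities (materials, process parameters, results) and their
# interrelationships expressed as (subject, relation, object) triples.
# All names are hypothetical examples.
triples = {
    ("CastingSpeed", "is_a", "ProcessParameter"),
    ("MeltTemperature", "is_a", "ProcessParameter"),
    ("CastingSpeed", "affects", "SolidificationFront"),
    ("MeltTemperature", "affects", "ThermalField"),
    ("ThermalField", "is_part_of", "CCSimulationResult"),
}

def query(relation=None, subject=None):
    """Return triples matching any combination of subject and relation."""
    return sorted(
        t for t in triples
        if (relation is None or t[1] == relation)
        and (subject is None or t[0] == subject)
    )

# What does casting speed influence?
print(query(relation="affects", subject="CastingSpeed"))
# [('CastingSpeed', 'affects', 'SolidificationFront')]
```

A production system would typically use an RDF/OWL toolchain rather than an in-memory set, but the triple structure and query pattern are the same.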

4. Results

Dynamic manufacturing processes such as casting can greatly benefit from the integration of data models into their design, optimization, and control strategies. These models are particularly effective for analyzing the influence of key process parameters—including initial melt temperature, water-cooling configurations, and casting speed—on product quality and process stability. By leveraging predictive data models, manufacturers can proactively mitigate the risks of common casting defects such as hot tearing, cold cracking, and void formation [12]. To reduce computational cost during database generation, the simulation domain for casting processes—especially those involving round billet geometries—can often be simplified to a quarter section or a quasi-2D axisymmetric wedge. This geometric reduction preserves the essential physics while significantly lowering the computational burden.
A major challenge in applying data models to casting processes lies in accurately capturing the rapid thermal evolution during cooling and solidification, particularly in regions with steep temperature gradients. These high-gradient zones are difficult for data models to generalize from, often leading to reduced prediction accuracy [13]. To address this, additional post-processing steps were implemented to transform raw CFD outputs into formats more suitable for data-driven modeling, including spatial smoothing, normalization, and dimensionality reduction techniques to enhance model interpretability and robustness.
Figure 3 illustrates the temperature evolution during the six-cycle HPDC process, comparing results from CFD simulations and data models and highlighting the descending trends of maximum and minimum die temperatures. Figure 4 presents a comparative analysis of temperature values obtained from CFD simulations and data models across a billet cross-section evaluated at 3000 nodal points.
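The post-processing chain described above, spatial smoothing, normalization, and dimensionality reduction, might look like the following sketch on synthetic snapshot profiles; a truncated SVD stands in for the unspecified dimensionality-reduction technique.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# Synthetic stand-in for raw CFD snapshots: 30 scenarios, each a noisy
# 200-point 1D temperature profile (values are illustrative).
rng = np.random.default_rng(0)
n_scenarios, n_points = 30, 200
profiles = (np.sin(np.linspace(0, np.pi, n_points))
            * rng.uniform(600, 700, (n_scenarios, 1)))
profiles += rng.normal(0, 2.0, profiles.shape)           # solver/sensor noise

smoothed = gaussian_filter1d(profiles, sigma=2.0, axis=1)  # spatial smoothing

lo, hi = smoothed.min(), smoothed.max()
normed = (smoothed - lo) / (hi - lo)                       # min-max normalization

# Truncated SVD: keep the leading modes that explain most of the variance.
U, s, Vt = np.linalg.svd(normed - normed.mean(axis=0), full_matrices=False)
k = 3
reduced = U[:, :k] * s[:k]                                 # (30, 3) latent coordinates
energy = float((s[:k] ** 2).sum() / (s ** 2).sum())
print(reduced.shape)  # (30, 3)
```

Reconstructing profiles from the retained modes (`reduced @ Vt[:k] + mean`) gives the denoised, low-dimensional representation a data model would be trained on.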

5. Discussion

Comparative analyses and performance evaluations of real-time data models were conducted in this study to assess their predictive capabilities for both vertical casting and HPDC processes. Despite promising results, several challenges remain—particularly in modeling the multi-physical and multi-phase nature of vertical casting, which is highly sensitive to the evolution of the thermal field. An accurate prediction of thermal gradients, solidification fronts, and phase transitions is essential, as these directly influence defect formation and material properties.
Similarly, in HPDC, capturing the thermal field evolution across cyclic casting operations is critical for reliable modeling. The repeated heating and cooling of the die, along with temperature-dependent material behavior, necessitate a data model that is not only computationally efficient but also physically informed. To achieve this, the data models must be rigorously trained to reflect the underlying heat transfer mechanisms, cooling dynamics, and temperature-dependent thermophysical properties. To enhance model fidelity, data from well-designed experimental trials and validated numerical simulations were utilized. These datasets serve as a foundation for calibrating and adjusting the data models, ensuring that they can accurately replicate the thermal behavior observed in real-world casting scenarios. This integrated approach strengthens the reliability of data-driven models for predictive control and optimization in industrial casting applications.
The results presented in the preceding sections highlight the effectiveness of data-driven models in capturing steady-state, transient, and generative behaviors within complex manufacturing processes. These models demonstrate promising accuracy and adaptability, offering valuable tools for real-time prediction, control, and optimization. However, the development of comprehensive process databases remains resource-intensive, requiring significant computational and experimental investment. This challenge can be mitigated through the use of advanced sampling strategies—such as LHS or Sobol sequences—and by ensuring balanced data distributions across multi-dimensional parameter spaces to maximize coverage with minimal redundancy.

6. Concluding Remarks

Integrating features from numerical simulations and data-driven models presents a powerful strategy for modeling dynamic manufacturing processes such as CC and HPDC. Validated numerical simulations, grounded in first-principles physics, offer high-fidelity insights into complex phenomena including heat transfer, fluid flow, solidification, and phase transformations. These simulations are capable of resolving fine-scale spatial and temporal dynamics, making them ideal for generating high-quality snapshot data under controlled conditions. However, their computational intensity often renders them impractical for real-time applications or extensive parametric studies. In this research, the integration of numerical simulation outputs into database construction and data model development has been systematically addressed. A key emphasis is placed on the careful selection of data solvers, interpolation methods, and machine learning algorithms—each tailored to the specific characteristics of the manufacturing process, whether it is multi-physical, multi-phase, or multi-scale in nature. This methodological alignment is critical to ensuring that the resulting models are not only accurate but also robust and generalizable.
Moreover, the study highlights the importance of a well-structured data strategy to support the seamless integration of these models into broader digital manufacturing ecosystems. This includes compatibility with real-time control architectures, digital twins, and intelligent process advisory systems. Ultimately, these advancements contribute to the realization of efficient, adaptive, and sustainable manufacturing operations. The insights and methodologies developed here will serve as the foundation for our forthcoming manuscript, which will further explore the deployment of hybrid modeling frameworks in industrial settings.

Author Contributions

A.M.H.: conceptualization, methodology, writing—original draft preparation, software (data models), validation, data curation, writing—review and editing, visualization. R.G.V.: conceptualization, methodology, software (CFD simulations), investigation, data curation, validation, writing—original draft, proofreading. All authors have read and agreed to the published version of the manuscript.

Funding

This research was financially supported by the Austrian Institute of Technology (AIT) under the UF2024 funding program, by the Austrian Research Promotion Agency (FFG) through the AI4SimProd project (ID: 52286160), and by the European Commission under the Horizon Europe program for the metaFacturing project (HORIZON-CL4-2022-RESILIENCE-01; Project ID: 101091635).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The raw data required to reproduce the case studies will be shared upon reasonable request to the authors.

Acknowledgments

The authors gratefully acknowledge the technical and financial support provided by the Austrian Federal Ministry for Innovation, Mobility and Infrastructure, the Federal State of Upper Austria, and the Austrian Institute of Technology (AIT).

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Brunton, S.L.; Kutz, J.N. Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control; Cambridge University Press: Cambridge, UK, 2019. [Google Scholar] [CrossRef]
  2. Tao, F.; Li, Y.; Wei, Y.; Zhang, C.; Zuo, Y. Data–model Fusion Methods and Applications toward Smart Manufacturing and Digital Engineering. Engineering, 2025; in press. [Google Scholar] [CrossRef]
  3. Horr, A.M. Notes on New Physical & Hybrid Modelling Trends for Material Process Simulations. J. Phys. Conf. Ser. 2020, 1603, 012008. [Google Scholar] [CrossRef]
  4. Dogan, A.; Birant, D. Machine learning and data mining in manufacturing. Expert Syst. Appl. 2021, 166, 114060. [Google Scholar] [CrossRef]
  5. Horr, A.M.; Drexler, H. Real-Time Models for Manufacturing Processes: How to Build Predictive Reduced Models. Processes 2025, 13, 252. [Google Scholar] [CrossRef]
  6. Horr, A.M. Real-Time Modeling for Design and Control of Material Additive Manufacturing Processes. Metals 2024, 14, 1273. [Google Scholar] [CrossRef]
  7. Lebon, B. directChillFoam: An OpenFOAM application for direct-chill casting. J. Open Source Softw. 2023, 8, 4871. [Google Scholar] [CrossRef]
  8. The OpenFOAM Foundation. Available online: https://openfoam.org (accessed on 15 August 2025).
  9. NovaFlow&Solid NovaCast Systems AB. 2022. Available online: https://www.novacast.se/product/novaflowsolid/ (accessed on 15 August 2025).
  10. Bennon, W.D.; Incropera, F.P. A continuum model for momentum, heat and species transport in binary solid-liquid phase change systems—I. Model formulation. Int. J. Heat Mass Transf. 1987, 30, 2161–2170. [Google Scholar] [CrossRef]
  11. Horr, A.M. Real-time Modelling and ML Data Training for Digital Twinning of Additive Manufacturing Processes. Berg Huettenmaenn Monatsh. 2024, 169, 48–56. [Google Scholar] [CrossRef]
  12. Vreeman, C.J.; Schloz, J.D.; Krane, M.J.M. Direct Chill Casting of Aluminium Alloys: Modelling and Experiments on Industrial Scale Ingots. J. Heat Transf. 2002, 124, 947–953. [Google Scholar] [CrossRef]
  13. Horr, A.M.; Gómez Vázquez, R.; Blacher, D. Data Models for Casting Processes—Performances, Validations and Challenges. IOP Conf. Ser. Mater. Sci. Eng. 2024, 1315, 012001. [Google Scholar] [CrossRef]
Figure 1. Typical snapshot matrix, the correlation of process data, and the graphical interface used for data model generation.
Figure 2. (a) HPDC machine with numerical simulation domain; (b) vertical casting machines with numerical simulation domain and temperature contours for CFD and data model.
Figure 3. Temperature graphs for CFD and data models with descending maximum and minimum temperature curves: (a) sensor no. 1; (b) sensor no. 2.
Figure 4. Temperature graphs for CFD and data models: (a) DOE no. 1; (b) DOE no. 2.