3D Printing and Implementation of Digital Twins: Current Trends and Limitations

Fabricating objects with desired mechanical properties using 3D printing methods can be expensive and time-consuming, especially when based solely on a trial-and-error modus operandi. Digital twins (DTs) offer a potential solution for understanding, analyzing and improving the fabricated item, service system or production line. However, the development of relevant DTs is still hampered by a number of factors, such as an incomplete understanding of the DT concept, its context and its method of development. In addition, the connection between existing conventional systems and their data is still under development. This work aims to summarize and review the current trends and limitations of DTs for additive manufacturing, in order to provide more insights for further research on DT systems.


Introduction
The challenging conditions of 21st century markets require the constant evolution of currently existing products. From innovative small-to-medium-sized enterprises (SMEs) to vast multinational companies, there is a constant struggle to offer the best product range based on consumers' needs. The discovery of new raw materials, the introduction of new manufacturing methods and the continuous effort towards research and development of new products set a highly competitive commercial environment that evolves at a prodigious pace. Thus, the need to minimize product development time is stronger than ever. However, the traditional methods of industrial prototyping cannot always keep up with the demanding time standards set by companies eager to gain a time advantage over their competitors.
Prototypes were and, in some cases, still are made from traditional materials such as wood or clay. This practice is time- and material-consuming. In addition, these kinds of prototypes served mostly as visual aids and could not always stand as functional prototypes.
As a tool to help overcome the limitations of traditional prototype fabrication, a new manufacturing method was introduced in the late 1980s [1]. This fabrication technique involves the automated fabrication of three-dimensional solid objects from a digital computer-aided design (CAD) file. This was made possible by using additive manufacturing processes, in which the deposition of successive material layers on top of each other leads to the final fabrication of the pre-determined three-dimensional physical object. Each successive layer comprises a sliced horizontal section of the final object.
Additive manufacturing methods give designers the ability to design 3D objects and fabricate them in their office swiftly and cheaply [2]. In this way, designers are able to rapidly examine their design in physical form. Therefore, they can evaluate their design and conduct the necessary modifications that will lead to the ideal product [3][4][5].
However, despite its innovative features, 3D printing technology also has a number of disadvantages. Apart from parameters such as elevated equipment and raw material costs (in some cases), surface quality and structural integrity have been contentious topics, with researchers suggesting a vast number of ways to achieve the desired results.
For example, in the case of metal 3D printing technologies, literature findings suggest that fabricated items can suffer from uncontrolled porosity and brittleness while being prone to shrinkage [6]. In addition, such methods produce a considerable degree of waste while featuring high purchasing and operating costs [6].
On the other hand, fused deposition modelling (FDM) 3D printing technology, which utilizes various polymers as raw materials, despite its major advantages over injection molding and machining/subtractive technologies, also suffers from a number of challenges that impact surface quality and structural integrity. The biggest challenge in this case is the proper tuning of a number of process parameters, prior to printing, that have a direct impact on the surface quality and structural integrity of the 3D-printed item. Such settings include layer height, extrusion-nozzle temperature, deposition speed, retraction speed and distance, etc. What is more, the proper tuning of these parameters is a rather demanding task for the user, since the impact of each parameter and their interactions cannot be easily predicted and evaluated. Correspondingly, failed prints may occur for a number of reasons. Motor stall, nozzle blockage, bearing failure, timing belt breakage, abnormal extrusion and the item detaching from the 3D printer's bed are some of the reported reasons for failed 3D prints [7]. Such cases are estimated to cause a material waste of around 19% according to the literature [8,9]. The total percentage of material wasted in FDM 3D printing processes is estimated to be 34%, plus the potential material used for the fabrication of support structures [9].
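To make the parameter-tuning problem concrete, a minimal sketch is given below: a container for the FDM process settings named above, with a plausibility check. The class name, the parameter set and the numeric ranges (loosely typical for PLA on a desktop printer) are illustrative assumptions, not calibrated recommendations from the literature.

```python
from dataclasses import dataclass

# Hypothetical illustration: the FDM process parameters discussed above,
# with rough plausibility ranges (indicative only, assumed for PLA).
@dataclass
class FdmParameters:
    layer_height_mm: float
    nozzle_temp_c: float
    deposition_speed_mm_s: float
    retraction_distance_mm: float

    def check(self) -> list:
        """Return warnings for settings outside the assumed typical ranges."""
        warnings = []
        if not 0.05 <= self.layer_height_mm <= 0.35:
            warnings.append("layer height outside typical 0.05-0.35 mm range")
        if not 180 <= self.nozzle_temp_c <= 230:
            warnings.append("nozzle temperature outside assumed PLA range")
        if not 10 <= self.deposition_speed_mm_s <= 150:
            warnings.append("deposition speed outside typical range")
        if not 0.0 <= self.retraction_distance_mm <= 8.0:
            warnings.append("retraction distance outside typical range")
        return warnings
```

Such a static range check is exactly what a DT would need to go beyond: the ranges here are independent, whereas in reality the parameters interact, which is why their combined effect cannot be predicted by simple rules.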
In this way, users have to rely on a trial-and-error modus operandi in order to achieve the desired results. However, this is obviously very time-consuming and entails elevated expenses. Therefore, researchers are proposing a number of methods to prevent 3D prints with undesired characteristics. Researchers propose the use of a variety of sensors (optical, embedded, etc.) [10] to gain access to data such as residual strains [11][12][13], temperature distribution within the printed item as well as its surrounding environment [14], extrusion nozzle flow rate [15], layer adhesion [16], etc.
Furthermore, finite element analysis (FEA) and computational fluid dynamics (CFD) modeling are adopted in order to perform process simulation. A limitation of computational simulation is that it is sensitive to the numerical assumptions adopted to simplify the modeling process. Artificial intelligence, and more specifically machine learning, offers a different route: it differs from classical computer programming in that learning takes place from real-world examples, without relying on rules set in advance in the program. As a result, machine learning models can learn extremely complex relationships from large amounts of data without the need for explicitly programmed rules [17].
These methods focus on providing all available data to users beforehand so that they can gain the desired knowledge; thus, they empower users to provide optimized inputs. Despite the profound value of the acquired data, they are of limited value if they are not incorporated into a greater monitoring system that would not only detect flaws and irregularities in situ but would also intervene to correct them during the process. Ideally, this monitoring system would be enriched with predictive capabilities that would alert users that their initial process parameter inputs might lead to a failed print. At this point, the system would also provide users with optimized process parameter settings based on the observations and data inputs collected and processed from previous prints. For such a complete system to exist and function successfully, technologies such as digitization, artificial intelligence (AI), machine learning (ML) and deep learning (DL) have to be utilized alongside big data collection from a variety of sensors providing all relevant data.
As the literature suggests, the appropriate usage of AI is to deal with problems that are not formulated into code and can only be approached/solved intuitively [18]. Machine learning (ML) is considered a subset of AI, and its utilization only recently shifted from pure research to industrial applications [19]. ML can be further subdivided into three categories: supervised learning, unsupervised learning and reinforcement learning, with the majority of ML methods used in 3D printing applications being supervised methods [19]. As digitalization is a trend vastly embraced and currently incorporated in the industrial sector, a great number of datasets are produced which can be used as inputs to an ML process that would subsequently gain relevant knowledge. Moreover, deep learning (DL), a subcategory of ML and AI, applied in convolutional neural networks (CNNs) as well as alternative deep neural network (NN) architectures, has shown great effectiveness in relevant applications such as image recognition and the detection of fabrication irregularities [20,21].
In this context, these technologies fall under the umbrella of Industry 4.0. Industry 4.0 is the term describing the current trend of increased use of automation and data exchange in modern production technologies. It includes cyber-physical systems, the Internet of Things, cloud computing and cognitive computing [22][23][24][25]. Industry 4.0 is commonly referred to as the fourth industrial revolution [26]. According to the pioneering work of Klaus and Hilde Schwab, who are widely recognized as having originally conceived this term and who continue to support it via the non-profit organization "The World Economic Forum", Industry 4.0 is characterized by a fusion of technologies that blurs the barriers between the physical, digital, and biological spheres [27][28][29][30][31]. The implementation of the tools provided by Industry 4.0 has an immense impact on the field of production technologies. Undoubtedly, 3D printing is a pillar technology of Industry 4.0, as are cyber-physical systems. The use of cyber-physical systems to describe and help improve parameters such as surface quality, structural integrity, speed and optimized material usage would have profound benefits.
In this context, a new scientific approach has emerged, coined under the term digital twins (DT). According to the digital twin concept, for every living or non-living being/entity (physical object), a digital "copy" is created, namely its digital twin. Thus, for anything, e.g., people, products, systems, machines, etc., a connection of the physical operating model with a computer model in the virtual environment is attempted. This connection is made by producing and using real-time data. In the computational environment of cyberspace (virtual cyberspace), which can be hosted, e.g., on a data server, heterogeneous information about the physical object is collected. This includes details about the elements and the dynamics of the object, information from the object's environment, as well as information about the experiences of other entities with the studied physical object (e.g., user experience during production, end-user testimonials, etc.). In this way, digital twins can be understood as a representation of a fabrication process or service in the digital world, governed by specific properties and conditions. Therefore, real-world processes and items, along with their surrounding environment, can be digitally transferred to and described in a cyber-world context. A digital twin can thus be utilized to analyze and further understand an item or process, leading to its optimization without spending the resources required by a real-world trial-and-error modus operandi [32][33][34][35].
Digital twin technology has seen vast growth in the last decade. It is a natural evolution and fusion of various emerging technologies such as the Internet of Things, wireless networks, sensor integration, machine learning, artificial intelligence, big data and data visualization in the form of virtual and augmented reality. Undoubtedly, it is one of the pillars of Industry 4.0 towards achieving holistic control of, and communication between, labor, machines and management.
The advantages of a dynamic and fully realistic digital representation of a physical object are numerous. The understanding of physical objects as well as the design of new objects are assisted, and the ability to model and optimize is upgraded. Through the recording and archiving of the operating data of the digital copy (digital twin), with reference to real time, the time tracking (tracing) of behavioral elements of the physical object becomes more feasible. The ability for diagnosis and prognosis (prediction) is also upgraded.
For the implementation of digital twin technology, three distinct steps are primarily followed. Firstly, the digital twin prototype (DTP) must be created, including initial planning and analysis data along with procedures for implementing the physical system. Then, a digital twin of each of the properties of the physical object (a digital twin instance (DTI)) must be designed. Finally, all digital twins of the systemic properties are integrated (into a digital twin aggregate (DTA)) so that, through the data and information obtained, further understanding and prediction become possible. Ideally, the digital twin should represent its natural twin, i.e., the physical object, with realism, accuracy and fidelity, and depict the state of the physical object in real time. For this to happen, the use and integration of innovative technologies is required, some of which are in fact the "pillars" of the fourth industrial revolution (Industry 4.0). These are depicted in Figure 1. The Internet of Things (IoT) can certainly be considered a prerequisite for digital twins because, through wireless connectivity between different objects in the same or different physical locations, real-time data acquisition can be achieved. Sensors, as components of tiny, low-energy embedded systems, are integrated into critical points of the physical object so that the digital twin is constantly updated with valid data from the real physical system in its actual operating environment (in situ). Data networks ensure the smooth exchange of data (measurements and commands) between the physical and digital twins, which are usually in different geographical locations, along with cloud computing. The latter includes all the tools and techniques for developing applications in the cloud computing online environment, so as to maximize the potential for the inclusion of multiple, heterogeneous human and artificial resources.
Other pillar technologies are simulation and emulation, well-known technologies whose methods have recently been evolving rapidly, utilizing all technological developments in areas such as computer-aided design (CAD), data analytics and big data analytics. In addition, data visualization, through computer graphics technologies, virtual reality (VR) and augmented reality (AR), lends maximum realism to the representation of the physical object within the virtual cyberspace. Moreover, artificial intelligence (AI), through its machine learning (ML) and deep learning (DL) technologies, allows the digital twin to self-train (self-learning, self-training) and to adapt its mathematical-computational model to the real physical model almost in real time.
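The three-step structure described above (DTP, DTI, DTA) can be sketched in code. The following is a minimal, hypothetical illustration: a prototype holding the initial design data, instances tracking individual properties via sensor updates, and an aggregate integrating them for system-level queries. All class and method names are assumptions for the sake of the example.

```python
# Hypothetical sketch of the DTP -> DTI -> DTA implementation steps.
class DigitalTwinPrototype:
    """DTP: initial planning/analysis data for the physical system."""
    def __init__(self, design_data: dict):
        self.design_data = design_data


class DigitalTwinInstance:
    """DTI: live state of one property of the physical object."""
    def __init__(self, name: str):
        self.name = name
        self.readings = []

    def update(self, value: float):
        """Ingest one in situ sensor reading for this property."""
        self.readings.append(value)


class DigitalTwinAggregate:
    """DTA: integrates all instances for system-level understanding."""
    def __init__(self, prototype, instances):
        self.prototype = prototype
        self.instances = {i.name: i for i in instances}

    def latest_state(self) -> dict:
        """Snapshot of the most recent reading per tracked property."""
        return {name: (i.readings[-1] if i.readings else None)
                for name, i in self.instances.items()}
```

In a real system, `update` would be driven by the IoT/sensor layer described above, and `latest_state` would feed the visualization and AI components.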
Digital twins incorporate the aforementioned technologies in order to tackle issues often faced in 3D printing processes. In the field of 3D printing, digital twins can be very useful towards process parameter optimization by monitoring the process and detecting possible faults utilizing in situ sensor data as input. There is a constantly growing number of digital twins being used in various fields [36][37][38][39][40][41][42][43][44][45][46][47][48][49]; however, their use in the 3D printing process is still under development or limited.
Digital twin technology can provide significant benefits in all areas of human activity, e.g., in the livestock sector, in shipping, in the automotive industry and in industrial production. The capabilities of digital twins can be applied to improve the design, management, maintenance, development and, generally, all industrial activities related to products, services, equipment, processes and functions as well as human resources. Indicative applications include the design of new products based on the "experience" of all involved, deep knowledge of product properties, product lifecycle management (PLM), forecasting the occurrence of damage to production equipment, valid and timely preventive equipment maintenance, diagnosis of the causes of production problems, analysis of work experience, as well as hyperautomation and support for robotic process automation (RPA). Siemens can be considered one of the leading companies in this field, providing digital twin solutions for various industrial applications [50,51].
The DT concept does not come without shortcomings and potential challenges. The most important ones are probably system interoperability, conformance with standardization principles, software and hardware openness, training, trust in experts (expertise), financing capabilities and, more broadly, a modus operandi that allows us to think as humans and not as machines ("think human").

Current Digital Twin Applications in 3D Printing Technologies
Current literature findings suggest that the compilation of successful digital twins applicable to the various 3D printing technologies is feasible. As such cases are still relatively new, these applications face a number of challenges. The accurate digital representation of the process's hardware and surrounding environment, along with the initial software inputs and the amount of data from different sensors to be processed, imposes a complex workflow that must be followed in order to arrive at a functional DT [13].
DebRoy et al. made a first attempt to describe and build a digital twin for a 3D printing process that utilizes metal powder as raw material. They describe simulation models capable of feeding a DT with inputs and argue that, for building a digital twin of metal 3D printing techniques, combining the knowledge available from both 3D printing and welding methods is preferable [13]. The aspects of the process that the proposed DT attempts to address are heat and material rheological simulation, solidification simulation, metal powder grain structure and texture, modeling of the material's porous microstructure and mechanical properties, as well as residual stresses and subsequent predictions of potential distortion. Their proposed DT comprises a combination of big data and ML, a statistical model, and a control and sensing model, accompanied by a mechanistic model. Knapp et al. proposed the development of a phenomenological framework, using the term digital twin, able to provide forecasts of the most vital parameters affecting the metallurgical properties of the 3D-printed items [52]. In essence, the framework would give users the capability to provide various parameter settings to the 3D printer and, subsequently, to receive estimations of critical parameters such as transient temperature boundaries, molten pool geometrical characteristics, and temporal and spatial variations of cooling and solidification rates. Theoretically, the proposed digital twin of the 3D printing process, when sufficiently fed and validated with experimental data, would cause a shift from the current trial-and-error modus operandi to 'numerical experiments' that do not consume physical-world resources.
On the other hand, Gaikwad et al. proposed their own case of a digital twin utilized for in situ, real-time monitoring of the 3D printing fabrication process as well as material defect forecasting [53]. In this case, the digital twin's functionality was based on the inputs of various sensors, accompanied by machine learning techniques and physics-model prognosis. It was tested in the directed energy deposition (DED) and laser powder-bed fusion (LPBF) metal 3D printing techniques, showing promising results.
In addition, Yang proposed the use of the well-known grey-box modeling approach that is also used in various other fields [54][55][56][57][58]. A grey-box model is depicted in Figure 2. The model, interestingly not coined under the term "digital twin", showed lower forecasting errors by utilizing the obtained sensor data and performing relevant calculations [59]. The case referred to powder-bed fusion metal 3D printing. A common feature of all the aforementioned proposed DT cases in additive manufacturing is their focus on metal 3D printing. This is mainly because metal 3D printing is characterized by elevated equipment and raw material costs; therefore, a valid prediction of the process outcome in qualitative terms offers a profound techno-economic benefit. However, the application of digital twins to 3D printing techniques such as FDM and SLA would also be of great value due to their widespread use, despite the lower costs that these techniques share.
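The grey-box idea, combining a physics-based "white-box" model with a data-driven "black-box" correction, can be sketched as follows. The Newtonian cooling law, the constant-bias residual fit and all function names are illustrative assumptions for this example, not the actual model of ref. [59].

```python
import math

def white_box_temp(t, t0=200.0, t_amb=25.0, k=0.1):
    """White-box part: Newtonian cooling prediction at time t (assumed
    physics, with illustrative initial/ambient temperatures in Celsius)."""
    return t_amb + (t0 - t_amb) * math.exp(-k * t)

def fit_residual_bias(times, measured):
    """Black-box part: fit a constant bias to the physics-model residuals
    from in situ sensor readings (simple least-squares mean)."""
    residuals = [m - white_box_temp(t) for t, m in zip(times, measured)]
    return sum(residuals) / len(residuals)

def grey_box_temp(t, bias):
    """Grey-box prediction = physics model + learned data-driven correction."""
    return white_box_temp(t) + bias
```

The appeal of this structure is that the black-box component only has to learn what the physics misses, so far less sensor data is needed than for a purely data-driven model.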
In this direction, Chhetri et al. proposed a digital twin case that focuses on 3D printing techniques using polymeric materials. It functions under an operating scheme that constantly obtains data about the surface texture and dimensional accuracy of the fabricated object [60]. In this way, the DT obtains data during the manufacturing procedure and makes constant real-time estimations and corrections. In another case, Khan et al. proposed a digital twin that also focuses on the FDM technique, employing polymeric raw materials. This project proposes the use of a CNN deep learning model that detects material defects in situ and in real time, thus reducing the percentage of failed prints. The data input of the CNN model is derived from an optical sensor integrated in the vicinity of the printer's extrusion nozzle.
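A full CNN is beyond a short sketch, but its core operation, sliding a kernel over sensor data and thresholding the response, can be illustrated in one dimension. The edge-detecting kernel, the threshold and the interpretation as an under-extrusion flag are hypothetical choices for this example, not Khan et al.'s trained model.

```python
def convolve1d(signal, kernel):
    """Valid-mode 1D convolution (no padding), the basic CNN building block."""
    n, k = len(signal), len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(n - k + 1)]

def detect_defects(flow_readings, threshold=1.0):
    """Flag windows where an edge-detecting kernel fires, i.e., where the
    extrusion flow changes abruptly (a possible under-extrusion event).
    Kernel and threshold are illustrative assumptions."""
    edge_kernel = [-1.0, 0.0, 1.0]
    response = convolve1d(flow_readings, edge_kernel)
    return [i for i, r in enumerate(response) if abs(r) > threshold]
```

A real CNN stacks many such convolutions with learned kernels over 2D image data; the principle of locality, the same small filter scanned across the input, is the same.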
There are also other such cases, which primarily use techniques such as the finite element method, finite difference and level set methods, as well as fluid volume estimations, and which are more likely to be defined under the term "simulation" than the term "digital twin" [61][62][63][64][65][66][67][68]. While this can be attributed to terminology, since the term digital twin has arisen only very recently, in most cases computational methods and experimental, sensor-derived datasets are utilized in a fragmented way and not under the greater umbrella of a digital twin model. Procedures such as quality control, quality monitoring or production process monitoring can therefore be performed as stand-alone procedures, but they can also be placed under the greater umbrella of digital twins. That is, such procedures can provide data (via the use of optical and other non-invasive sensors) that can be integrated into a digital twin. From that point onward, the digital twin can provide useful conclusions by also employing computational techniques such as AI, ML and big data.

In Situ Monitoring
In contrast to traditional subtractive technologies, where in situ monitoring is commonly implemented, 3D printing techniques have not yet integrated monitoring sensors apart from experimental cases. However, current literature findings suggest that new approaches are being tested that will eventually allow the seamless integration of monitoring techniques and lead to the desired surface quality and mechanical property standards.
An accurate digital mapping and representation of the physical environment surrounding the 3D printer is highly desirable due to its immense role in the outcome of the process. This environment is highly dynamic and is also influenced by the human factor. Airstreams as well as humidity in the surrounding environment are known factors that can affect the overall quality of 3D-printed items [2]. Therefore, the environment needs to be constantly and accurately monitored for the digital twin to be able to predict these interactions. In situ process monitoring is thus based on the analysis of in situ acquired sensor data, with specific datasets subsequently linked to fabricated items' distortions and failures, geometries and specific process conditions using ML models [69,70].
Unfortunately, there are a number of drawbacks to relying solely on data-driven approaches to perform in situ monitoring. Firstly, a large number of completely different datasets derived from various sensors have to be acquired and processed. These can range from temperature and strain readings to high-rate imaging data. Thus, the acquisition and processing of different data streams need to be combined with intra-system compatibility, which remains questionable. In addition, the computational power required for such an operation is elevated, and desktop PCs are seldom up to the task.
Secondly, monitored factors such as built item geometrical characteristics, deposition patterns, temperatures and deposition speeds are all linked to specific acquired signal patterns; thus, data-driven models need to be retrained if any of these factors change. Obviously, this raises a great concern, since such a model will not be transferable to another 3D printing technology.
Thirdly, ML models often require the acquisition of an elevated amount of input sensor data, which increases model complexity and may lead to prediction uncertainty. However, for a digital twin to be utilized in the greater field of 3D printing technologies, a large amount of data is needed in order for the DT to be sufficiently trained and thereby for its long-term accuracy to be improved. Such data to be entered into the DT could be derived from real-world experiments, the literature, various embedded sensors, or numerical and theoretical simulations. Gaikwad et al. [53] propose combining a theoretical model with a lower number of in situ sensor readings in order to reduce the computational burden as well as the amount of sensor data that needs to be acquired and processed. Another suggestion could be that, for each of the common 3D printing processes, there should be a predefined model for each different scenario (i.e., variety of raw materials, 3D printer environment, printer category, etc.). Thus, users would only need to input the data to use the targeted DT of their specific real-time fabrication process. However, further research is still needed to reach this point.
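The "predefined model per scenario" suggestion above could take the shape of a simple registry keyed by the scenario descriptors (technique, material, etc.), returning the targeted DT model for a user's specific setup. The function names, the key choice and the placeholder model objects below are hypothetical.

```python
# Hypothetical sketch: a registry of predefined DT models, one per
# 3D printing scenario, so users only select and feed data.
_DT_REGISTRY = {}

def register_dt(technique, material, model):
    """Register a predefined DT model for a (technique, material) scenario."""
    _DT_REGISTRY[(technique, material)] = model

def get_dt(technique, material):
    """Return the predefined DT model for this scenario, or None if no
    model has been prepared for it yet."""
    return _DT_REGISTRY.get((technique, material))
```

A production version would key on more scenario dimensions (printer category, chamber environment) and ship the registry pre-populated, which is precisely the part that, as noted above, still requires further research.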

Valid Forecast of the 3D Printing Procedure Results
Nowadays, simulation forecasts concerning 3D-printed items mostly deal with general distortions and alterations (such as contractions from residual strains) of the pre-determined, CAD-designed shape of the item, as well as temperature distribution within the fabricated item's body. Figure 3 shows such a distortion of a 3D-printed item, called warping, caused by thermally induced residual strains. Such operations are based on the simple idea of dividing the part into different unit cells and, by computing the properties of each cell, obtaining the properties of the whole part. Figure 4 depicts a mechanistic model based on the literature [34].
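The unit-cell idea can be made concrete with a minimal sketch: each cell carries a locally computed property (here, an elastic modulus degraded by its porosity), and a part-level property is obtained as the volume-weighted average over all cells. The linear porosity penalty, the default modulus value and the function names are simplifying assumptions for illustration only; real homogenization schemes are considerably more involved.

```python
def cell_modulus(base_modulus_gpa, porosity):
    """Local modulus of one unit cell, degraded linearly by porosity
    (an assumed, simplistic degradation law)."""
    return base_modulus_gpa * (1.0 - porosity)

def part_modulus(cells, base_modulus_gpa=3.5):
    """Part-level modulus as the volume-weighted average over unit cells.
    cells: list of (volume, porosity) tuples; 3.5 GPa is an illustrative
    default roughly in the range of printed PLA."""
    total_volume = sum(v for v, _ in cells)
    weighted = sum(v * cell_modulus(base_modulus_gpa, p) for v, p in cells)
    return weighted / total_volume
```

The same cell-by-cell pattern applies to the thermal and distortion fields mentioned above: solve locally per cell, then aggregate to the whole part.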
Being able to forecast the thermal history, deformation, microstructure, material rheological behavior at the extrusion nozzle and mechanical properties is of paramount importance for the DT to make accurate predictions. Only by forecasting the aforementioned information is it possible to tailor the process parameters and gain full control of the fabrication process, to successfully monitor the surrounding environment conditions and to reach the desired results. Figure 5 depicts the process workflow of a digital twin in additive technologies.

Conclusions
3D printing technologies are considered an integral part of the fourth industrial revolution. Their contribution towards bridging the gap between the designed and the final manufactured product is immense. The technology's global adoption, following the expiration of its initial patents and the launch of desktop 3D printers, is an undeniable fact.
However, users still mainly rely on trial and error to reach the desired characteristics of the final parts, which often suffer from a number of imperfections. Process simulations mainly exist and are carried out at an academic level that surpasses the knowledge and resources an average user can allocate (especially desktop FDM 3D printer users, who are the majority).
Digital twins are a newly introduced and very promising way to overcome many problems faced in 3D printing technologies, such as process simulation, process monitoring and control. They can help us understand in depth the contribution of the various processing parameters and their influence on the overall final quality of the fabricated item. In addition, they can provide feedback information for gaining active control of the manufacturing process by performing real-time corrections. The first generations of 3D printing DTs are either still being developed or their tests are still underway. However, future work should focus on developing accessible and easy-to-understand DTs for every 3D printing technique, which will act as a stepping-stone for users to adopt them and contribute to their further evolution.