1. Introduction and Motivation
When discussing topics related to digitization and technological innovation in the context of an industrial system, Industry 4.0 is frequently mentioned. Industry 4.0 pertains to the intelligent interconnection of machines and processes in the industrial sector by utilizing Cyber–Physical Systems (CPSs). The CPS is a technology that enables intelligent control by integrating embedded networked systems [
1]. Industry 4.0 itself is subject to various interpretations. Vogel-Heuser and Hess [
2] examined the primary design principles of Industry 4.0, which include a service-oriented reference architecture, intelligent and self-organizing Cyber–Physical Production Systems (CPPSs), interoperability between CPPSs and humans, and secure data integration across disciplines and the entire lifecycle.
The Digital Twin (DT) has become increasingly prominent in recent years, being widely discussed in the literature on Industry 4.0 and conversations within large companies, particularly in the IT and digital sectors. The term “Digital Twin” is evocative and partially self-explanatory, although not entirely accurate. As a result, two outcomes have emerged. Firstly, the concept has gained significant popularity across various application areas, surpassing its initial domains, such as aerospace engineering, robotics, manufacturing, and IT [
3]. Secondly, there is a lack of consensus concerning the exact definition of the term, its areas of application, and the minimum requirements that distinguish it from other technologies.
Although there is no consensus on the definition of the DT, it is possible to identify the following definition as comprehensive enough for the manufacturing environment: the Digital Twin consists of a virtual representation of a production system that is able to run on different simulation disciplines, characterized by the synchronization between the virtual and real system, thanks to sensed data and connected smart devices, mathematical models, and real-time data elaboration [
4].
Compared to Grieves’ (2005) initial idea [
5], it is important to highlight that a fundamental concept remains intact. This concept emphasizes that a DT is not simply a digital representation of a physical system. What sets a Digital Twin apart from a basic digital model is its ability to engage in a bidirectional exchange of information. It is also possible to identify some of the most common dimensions and peculiar axes that can be used to capture the nature of a DT [
6]: the integration breadth, the connectivity mode, the update frequency, the simulation capabilities, the human interactions, and the digital model richness.
When examining the concept of the DT in the industrial domain, specifically in manufacturing, it is important to consider several crucial factors to characterize the work direction accurately. Relevant examples include the scope of the area that the Digital Twin needs to encompass, the time horizon it needs to capture and simulate, the information systems it must interact with to generate additional value, the technology employed to create the digital component of the Digital Twin, and the reference framework [
7,
8].
Numerous studies and industrial applications have demonstrated that DTs can be used at various decision-making levels. For instance, in the context of manufacturing processes, they can be employed to improve production schedules, detect quality control problems, and anticipate equipment malfunctions. In engineering and manufacturing, the DT has proven to be a versatile and useful decision-support tool across the entire decision-making hierarchy, from operational to strategic decisions.
Starting from a strategic point of view, the work of Kunath and Winkler [
9] has introduced a DT model tailored for manufacturing systems, functioning as decision support for order management within the framework of Cyber–Physical Systems. Simultaneously, Liu et al. [
10] pioneered the application of the DT in Computer-Aided Process Planning, specifically addressing the evaluation of optimal machining process routes. Complementing this, Yi et al. [
11] undertook an extensive case study centering on the intelligent design of a smart assembly process.
Taking a tactical approach, Biesinger et al. [
12] developed a CPS-rooted DT for production planning. Also, Novák et al. [
13] delved into the integration of the DT within smart production planning, focusing on integrating the DT with production planners employing artificial intelligence methods. There are also examples of DT applications to facilitate the optimization of short-term production planning [
14,
15].
Regarding operations, Zhang et al. [
16] proposed a tailored DT for dynamic job-shop scheduling, providing a responsive solution to the evolving demands of manufacturing environments. Previously, Fang et al. [
17] formulated a DT framework supporting scheduling activities in a job-shop scenario. In advancing this narrative, Negri et al. [
18] presented a holistic DT framework designed to tackle scheduling challenges in the presence of uncertainty, enhancing the adaptability in operational scenarios.
Turning attention to the control and monitoring activities, Zhuang et al. [
19] introduced a DT designed to enhance the traceability in the complex products manufacturing process. Venturing beyond traceability, Pietrangeli et al. [
20] structured a DT model leveraging Artificial Neural Networks to monitor the behavior of a specific mechanical component. On a different trajectory, Liu et al. [
21] contributed by implementing a DT for the remote control of a Cyber–Physical Production System.
It is worth observing that, despite the varying application areas and scopes, a substantial portion of these contributions depends on sensors or information systems to enable effective information exchange, while the virtual counterparts of the DT are frequently modeled using simulators, specifically discrete event simulators, and artificial intelligence or neural networks. The choice of information sources and the decision-making scope addressed by digital technologies are highly relevant and delicate matters, showing how different ways of managing and integrating the same technologies can address a broad spectrum of demands and requirements.
This diverse set of contributions, as summarized in
Table 1, highlights the broad applicability and nuanced benefits of Digital Twins across various phases and facets of industrial operations. In
Table 2, the acronyms used in this paper are presented.
This work aims to provide a framework for the integration of a DT supporting decision making for production planning in a discrete manufacturing context, working with a specific company in Italy. The scope of the decision making will be related to operation monitoring and management to make the production planning decisions more informed and resilient in the case of delays and/or disruptive events.
The paper is structured as follows: in
Section 2, the methodology followed for the study is presented; in
Section 3, the objectives and functional requirements of the DT are defined; in
Section 4, the information and material flows are mapped and the integration requirements for the DT are defined; in
Section 5, the results are presented. Section 6 discusses the main points of the framework, and conclusions follow in Section 7.
2. Materials and Methods
The following is a description of the working method used to draft and propose a framework for a DT, fed by MES data, to support decision making in production planning for a specific company in northern Italy.
The framework development proposed in this paper is conducted in five main phases and performed incrementally. The phases are as follows and are presented in
Figure 1:
DT objective definition.
Functional requirements definition.
Information and material flow mapping.
Integration requirements and definition.
Framework proposal.
The development of these points is only partially exhaustive with respect to the actual technical feasibility testing required for the implementation. However, it provides an initial benchmark against which to compare the steps that follow the project definition. Adopting this approach allowed for the development of a solution tailored to the specific business context.
The first phase is the definition of strategic and operational objectives by the company, which expresses the need for decision-making support in production management in order to improve the internal punctuality of the production department and reduce material accumulation in the inter-operational buffers. The aim of this initial phase is to identify the area of business interest and convert it into business objectives. The business objectives will serve as the input for the subsequent phase, which will focus on identifying the technical requirements to be fulfilled by the DT.
The identification of technical objectives requires a comprehensive characterization of the final product, matched with the previously established business objectives. The specific technical requirements will be elaborated upon, including the selected implementation medium and software, while the analysis of the characteristic requirements will be based on Stark's work [
6] as a reference. Notably, developing a DT does not require an a priori effort toward increasing or improving all dimensions. At this framework proposal stage, it is necessary to identify the critical and most relevant ones, and the right trade-offs. The output of this phase is a list of system requirements and the identification of the most suitable software to develop the DT.
The third phase is logically independent of the previous two and focuses on mapping the information and material flows within the company. Data from existing machinery, control systems, sensors, and management software are analyzed. This phase is essential to understand how to seamlessly integrate real data from the production system into the virtual model of the Digital Twin. The mapping of material flows involves an analysis of the existing production processes and the technological infrastructure already in place in the company.
Once the information flow is mapped, the requirements analysis for the integration is conducted, which must be performed for both the enterprise information systems and the techniques and technologies already used in the company. This phase allows for the integration of outputs from the second and third phases of the project, where the technical requirements are contextualized within the examined manufacturing environment, enabling the identification of the information flow exchange requirements within the final DT structure.
In the final stage, the outcomes of the preceding stages will be consolidated and synthesized. The amalgamation of technical elements, functional elements, business objectives, and integration components forms the proposed framework, which will be presented in the Results section.
3. DT Objective and Functional Requirements Definition
The first two phases of the work are related to the identification of the objectives of the DT and the definition of the depth of the functional requirements.
3.1. DT Objective Definition
To identify the objectives of the DT for the company, it was first necessary to identify why the company opted for such a solution. The company has two main categories of goals: operational and organizational. The operational goals focus on enhancing on-time demand satisfaction. The organizational goals aim to improve the efficiency and accuracy with which processing times, cycle times, and makespans are updated for the different product codes. The Digital Twin will autonomously acquire some of this information and adapt automatically, eliminating the need for explicit effort from the company.
The DT will serve two different purposes that align with the business operational goals. The DT will need to monitor the production progress from a predictive process monitoring perspective, providing appropriate alerts when a batch is expected to be completed late, and it will need to propose rescheduling options should it be decided to take action to avoid such a reported delay. This second function requires optimization through the simulation of what-if scenarios, and it is the core functionality supporting and improving decision making.
The overall objective of the DT is to facilitate planning activities and to enable comprehensive, sophisticated, and responsive analysis of the presented scenarios. An illustration of this analysis is simulation optimization: utilizing an appropriate Digital Twin framework, one could effectively assess rescheduling alternatives in the event of a disruptive occurrence, such as extended machine downtimes, significant defects in previously synchronized assembly lots, or supplier delays on inputs. Furthermore, the optimization objective may vary: it may be set to minimize inventory buildup or to reduce subsequent delays to the end customer.
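To make this intended use concrete, the following minimal Python sketch illustrates a generic what-if evaluation loop of the kind described above; it is not the company's implementation. The `simulate` callable stands in for the DES model, the candidate schedules and the dummy simulator are purely illustrative, and the objective can be switched between total tardiness and inventory buildup.

```python
# Minimal sketch of a what-if evaluation loop: each candidate schedule is run
# through a simulator and the best one is returned according to the chosen
# objective. `simulate` stands in for the DES model of the Digital Twin.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class ScenarioResult:
    total_tardiness_h: float   # sum of late hours across batches
    avg_wip_units: float       # average work-in-process (inventory buildup)


def evaluate_reschedules(
    candidates: Dict[str, List[str]],
    simulate: Callable[[List[str]], ScenarioResult],
    objective: str = "tardiness",
) -> str:
    """Run each candidate schedule through the simulator and return the best one."""
    def score(res: ScenarioResult) -> float:
        return res.total_tardiness_h if objective == "tardiness" else res.avg_wip_units

    results = {name: simulate(seq) for name, seq in candidates.items()}
    return min(results, key=lambda name: score(results[name]))


def dummy_simulate(sequence: List[str]) -> ScenarioResult:
    # Dummy stand-in for the DES run: pretends that swapping the two batches halves lateness.
    return ScenarioResult(total_tardiness_h=6.0 if sequence[0] == "B1" else 3.0, avg_wip_units=40.0)


best = evaluate_reschedules({"keep_plan": ["B1", "B2"], "swap": ["B2", "B1"]}, dummy_simulate)
print(best)   # -> 'swap'
```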
3.2. Functional Requirements Definition
After conducting a software/tool selection process, a Discrete Event Simulation (DES) model was chosen as the virtual counterpart of the Digital Twin.
Discrete Event Simulation updates the system state only upon the occurrence of specific events, which makes it the most suitable technical solution for depicting the functioning of a batch-processing factory. Additionally, a primary cause of schedule disruptions is linked to machinery or resource availability, events that can be effectively modeled using Discrete Event Simulation. The DES model will be implemented using open-source software to ease the integration with the various technologies and tools already present in the company. The specific characteristics of the environment under analysis determine the selection of a discrete event simulator.
In this case, the company operates a large number of machines and processes batched orders, making a discrete event simulation model the most suitable approach to describe the system accurately. The interactions that will be defined in the DES model will be obtained from the historical data acquired from the MES. It is important to underline that the data fed into the DES model will be prepared and preprocessed using process mining techniques. The decision to opt for an open-source tool is motivated by the necessity to integrate other existing functions within the company seamlessly and to avoid the substantial initial costs associated with the project, reflecting a cautious approach during these initial stages.
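As an illustration of this modeling approach, the following sketch uses SimPy, one open-source Python DES library of the kind discussed; the specific tool selected by the company is not prescribed here. The machine groups, capacities, and processing times are illustrative stand-ins for values that, in the framework, would be derived from MES historical data.

```python
# Minimal SimPy sketch of batches flowing through two machine groups.
# Machine names, capacities, and processing times are illustrative only.
import simpy

PROC_TIME_H = {"turning": 2.0, "assembly": 3.5}   # per-batch hours (illustrative)


def batch(env, name, route, machines, log):
    for station in route:
        with machines[station].request() as req:
            yield req                                 # queue until a machine is free
            yield env.timeout(PROC_TIME_H[station])   # occupy the machine while processing
    log[name] = env.now                               # record the completion time


env = simpy.Environment()
machines = {
    "turning": simpy.Resource(env, capacity=2),
    "assembly": simpy.Resource(env, capacity=1),
}
completion = {}
for i in range(4):
    env.process(batch(env, f"batch_{i}", ["turning", "assembly"], machines, completion))
env.run()
print(completion)   # simulated completion times, to be compared with planned dates
```

In the actual framework, the routings, capacities, and processing-time distributions would be instantiated from the MES-derived data rather than hard-coded as in this sketch.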
As anticipated in the work of Stark [
6], it is possible to identify eight baseline dimensions for a Digital Twin, which are as follows: the integration breadth, connectivity mode, update frequency, Cyber–Physical System intelligence, simulation capabilities, digital model richness, human interaction, and Product Lifecycle. The functional characteristics of the required DT can be identified by evaluating these dimensions within the studied context. The requirements are described as follows:
Integration breadth: the model has to be integrated into the whole production system.
Connectivity mode: the interconnection between the real and virtual worlds must be bidirectional; however, during the initial usage phase, system decision-makers will scrutinize and filter any insights gained from the simulation and forecasting. The goal for this dimension is to gradually shift to a more automated mode of connection.
Update frequency: while consistent with the Industry 4.0 philosophy, a completely real-time dataflow is currently impractical and would contain too much detail for effective planning. The frequency of updates is primarily determined by the time required to extract data from the MES and to preprocess and clean them before they are fed into the DES model (a minimal sketch of this daily cycle is given after this list). It is prudent to monitor data updates and analyses regarding order progression daily and to implement corrective measures judiciously over time to avoid excessive interventions or rescheduling activities, which can be detrimental if too frequent.
Cyber–Physical System intelligence: in the first phase of the work, the interconnections from the virtual world to the real one will be partially automated and occasionally human-triggered.
Simulation capabilities: Discrete Event Simulation is a specialized method for modeling stochastic, dynamic, and discretely evolving systems. This technique facilitates the dynamic analysis of production progress and enables timely deviation assessment, thereby enhancing the ability to identify underlying causes. Besides assessing deviations, it will simulate diverse order distributions to facilitate more thorough optimization regarding business objectives and overall service levels.
Digital model richness: the digital model will be sufficiently expressive of the intricate relationships among the batches, the orders, and machine availability; the level of complexity will depend on the quality of the data that can be extracted from the company information systems.
Human interaction: this dimension will be the least developed, given that only highly trained decision-makers will interact with the DT, and there will be no highly developed visualization of the digital model.
Product Lifecycle: the represented phase of the products will be the mid-life.
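Regarding the update frequency discussed above, the following minimal sketch outlines one possible daily update cycle. The three steps are stubs standing in for the MES extraction, the process-mining preparation, and the DES run; all names and records are hypothetical.

```python
# Minimal sketch of a daily update cycle. The three steps are stubs: in the
# framework they correspond to the MES export, the process-mining preparation,
# and the DES run that flags batches expected to finish late.
import datetime as dt


def extract_from_mes(day: dt.date) -> list:
    # Stand-in for the (slow) MES extraction that bounds the update frequency.
    return [{"batch": "B1", "station": "turning", "end": "2024-05-02T10:00"}]


def preprocess(raw_events: list) -> list:
    # Stand-in for filtering and cleaning via process mining techniques.
    return [e for e in raw_events if e.get("end")]


def run_des_and_flag_delays(events: list) -> list:
    # Stand-in for the DES run; returns the batches expected to finish late.
    return ["B1"] if events else []


def daily_update(day: dt.date) -> list:
    """One iteration of the monitoring loop, meant to run once per working day."""
    return run_des_and_flag_delays(preprocess(extract_from_mes(day)))


print(daily_update(dt.date.today()))   # e.g. ['B1'] -> alert the planner
```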
4. DT Information and Material Flow Mapping and Integration Requirements
The third and fourth phases are dedicated to the mapping of information and material flows and to the analysis of the integration requirements for the “as-is” company scenario.
4.1. Information and Material Flow Mapping
Regarding the mapping of information and material flows, it is important to briefly describe the reference system. The company under consideration is a manufacturing company that produces final units in batches and has a mixed market response system: it produces both to order and based on the forecasts of its distributors. There are priority treatments for certain types of orders, but there is no explicit rule of customer differentiation, since production campaigns can last from a few days to several months. The production logic is a job shop with hundreds of machines divided into departments on the production floor. Given the breadth of the product range, the extremely volatile production mix, and the duration of the production campaigns, not only can it be challenging to plan monthly production or schedule weekly and daily production, but the data that can be acquired from the information systems can also be very fragmented and complex to aggregate.
The way to track material flows in the company was identified by analyzing historical data from the company’s main information systems using data analysis techniques, data mining, and process mining.
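As an example of this kind of analysis, the following pandas sketch shows how directly-follows relations between stations can be reconstructed from an MES-style event log; the column names and records are illustrative, and a dedicated process-mining library (e.g., pm4py) offers equivalent discovery capabilities out of the box.

```python
# Sketch of reconstructing material flows from an MES-style event log with pandas.
# Each batch's events are ordered in time and the directly-follows transitions
# between stations are counted, giving a first picture of the routing.
from collections import Counter

import pandas as pd

events = pd.DataFrame({
    "batch_id": ["B1", "B1", "B1", "B2", "B2"],
    "station":  ["cutting", "turning", "assembly", "cutting", "assembly"],
    "end_time": pd.to_datetime([
        "2024-05-02 08:00", "2024-05-02 11:00", "2024-05-03 09:00",
        "2024-05-02 09:30", "2024-05-03 12:00",
    ]),
})

dfg = Counter()
for _, trace in events.sort_values("end_time").groupby("batch_id"):
    stations = trace["station"].tolist()
    dfg.update(zip(stations, stations[1:]))     # count station-to-station transitions

print(dfg)   # e.g. ('cutting', 'turning'): 1, ('turning', 'assembly'): 1, ('cutting', 'assembly'): 1
```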
The reference information systems in a manufacturing company are Product Lifecycle Management (PLM), Enterprise Resource Planning (ERP), and the aforementioned MES.
These three systems are independent but intrinsically interconnected. It is possible to define them as follows: the PLM is a strategic methodology for managing data, processes, documents, drawings, and resources throughout the lifecycle of products and services, from conception and development through production to retirement; the ERP consolidates all business processes, such as sales, purchasing, inventory management, finance, and accounting, into a unified system to improve managerial support; the MES is a system designed to manage and control a company’s production processes, including order dispatch, quantity and time optimization, inventory warehousing, and direct machinery connectivity. These three systems integrate the following different information sources: the PLM integrates information from different sources throughout the life of a product, the ERP integrates different business functions, and the MES integrates heterogeneous information coming from the production unit.
Specifically, the MES collects highly accurate and timely information on machine utilization, resource utilization, and batch progress, and the recorded times can be accurate down to real time thanks to the direct connection to the machines via PLCs and SCADA. However, it is important to point out that not all machines are automatically connected to the MES. For many machines in the company, data are entered manually, so the quality, veracity, and synchronization of the data depend to a substantial extent on the accuracy of the operator.
The points of contact of these information systems, which, as explained, already consolidate data, are depicted in
Figure 2. It is evident that these systems need to be frequently updated and checked, and that some data appear, albeit for different purposes, in more than one of the information systems. In general, the PLM contains the most time-stable information, while the MES contains the most rapidly changing information.
Looking at the relationship between the PLM and ERP, the PLM provides the ERP with information related to product master data, the bills of material, and technical specifications. From the ERP, on the other hand, it is possible to identify revision orders to the PLM if the products are discontinued, components are discontinued, suppliers are changed, or products undergo re-engineering changes.
Finally, between the ERP and MES, mainly data related to production and planning are exchanged: from the MES, the progress of specifications is derived; the ERP, on the other hand, supports the drafting of production planning in an aggregate sense.
The communication among these three elements is frequently not seamless and necessitates considerable effort to acquire, extract, cleanse, and transfer data for usability. This is often performed during periods when the plant operates at its lowest productivity, such as overnight.
These are not the only information systems that can be found in the enterprise. The Pyramid of Industrial Automation is shown in
Figure 3, defined as “the hierarchical representation of the different levels of automation and control available in the industrial manufacturing process” by the ISA95 standard [
22].
Integrating the Pyramid of Automation into the relevant business context makes it possible to give a more precise interpretation of the relationships between the various information systems, emphasizing their hierarchical dependencies.
With respect to the previous information, it is possible to identify the ERP as the top of this hierarchical pyramid. Consequently, all other information systems depend on it. It is supported, however, as a data source by the PLM.
The MES operates at the operational level, the second level of the hierarchy. Interestingly, the link between the ERP and MES is mediated by the Global Planning System, a business decision support information system that, through built-in features, can support decision-makers in the planning arena. SCADA, PLCs, and sensors represent the supervisory, control, and field-level devices beneath the MES.
Although the ANSI/ISA95 standard was developed specifically to guide the integration of various enterprise information systems more consciously and sensibly, meeting an industry challenge, the complex problem of dataflow management and data passing structures remains.
4.2. Integration Requirements Definition
The objectives of integrating the DT’s virtual counterpart with its real reference system must be clear. In this case, evaluating the business information systems and the other techniques used to link these systems together was important.
The functional objective of the DT’s two-way linkage is shown in
Figure 4. The data exchange process must be appropriately structured to allow information systems to be accessed from the real world. Through data analysis, data mining, and process mining, it is then required that the data be adequately prepared for use within the DES model.
The results of the DES model must provide the following two categories of output: the monitoring of production progress, and predictive process monitoring that signals when and by how much actual production will deviate from planned production through appropriate alerts and flags. From there, if the decision-maker triggers the command, the simulation system must be able to provide proposals for managing the delay, which may consist of rescheduling or of adjustments to production planning over a longer horizon.
Should the proposed options be accepted, one would then return to the physical world with an impact on the production system. Should a decision be made to accept the delay instead, the decision-maker would still have chosen with better information than before, and would be better positioned to assess the costs and benefits of the choices to be made.
This would begin the exchange cycle again, collecting data from the physical world to pass on to the virtual world and vice versa.
The DT unit, therefore, can be seen as a structure with autonomous functionality (e.g., the DES model, with the possibility to provide what-if analyses), but not independent of its own reference system, nor other information systems, nor even of data filtering, cleaning, and preprocessing techniques, which should remain external to the simulation model itself.
Given that the data structure and the simulation scenarios obtained through the DES model will rely on the historical data obtained from the MES, and given the difficulty of handling such a large volume of data, process mining and other data analysis techniques are required to appropriately filter, prepare, and preprocess the data that will be fed into the DES model.
It is important to note that historical data from the MES will also be used to have benchmark metrics regarding the processing times, cycle times, delivery times, and downtime. The simulation model will then be validated on the appropriately processed historical data, and the real situation will be progressively simulated with the updated data that the MES provides.
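By way of illustration, the following sketch derives benchmark metrics of this kind (processing times per station, cycle times, and punctuality per batch) from a preprocessed MES history using pandas; the column names and records are illustrative.

```python
# Sketch of deriving benchmark metrics (processing times, cycle times,
# punctuality) from a preprocessed MES history. Column names are illustrative.
import pandas as pd

history = pd.DataFrame({
    "batch_id":   ["B1", "B1", "B2"],
    "station":    ["turning", "assembly", "turning"],
    "start_time": pd.to_datetime(["2024-05-02 08:00", "2024-05-02 13:00", "2024-05-02 10:00"]),
    "end_time":   pd.to_datetime(["2024-05-02 11:00", "2024-05-02 16:00", "2024-05-02 12:30"]),
    "due_date":   pd.to_datetime(["2024-05-02 18:00", "2024-05-02 18:00", "2024-05-02 12:00"]),
})

history["proc_h"] = (history["end_time"] - history["start_time"]).dt.total_seconds() / 3600

# Mean processing time per station -> feeds the DES model's activity durations.
proc_times = history.groupby("station")["proc_h"].mean()

# Cycle time and punctuality per batch -> validation benchmarks for the model.
per_batch = history.groupby("batch_id").agg(
    start=("start_time", "min"), end=("end_time", "max"), due=("due_date", "max"))
per_batch["cycle_h"] = (per_batch["end"] - per_batch["start"]).dt.total_seconds() / 3600
per_batch["on_time"] = per_batch["end"] <= per_batch["due"]

print(proc_times)
print(per_batch[["cycle_h", "on_time"]])
```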
The connection between virtual and real counterparts is thus not only intricate because of the number of information systems involved and the large amount of heterogeneous data that must be managed, but it is also complex in that data from the same sources and recorded at different times can be used for distinct purposes, increasing the number of logical links between the systems under consideration.
This highlights how the DT cannot replace any enterprise information system. Furthermore, it highlights that thinking about how to integrate the flow of enterprise data for a DT can have positive effects on the entire enterprise IT management. Indeed,
Figure 5 shows the necessity of increased attention to dataflow management and its continuous update through aggregation and consolidation of the various data sources.
5. Results
The results presented here constitute a summary representation of the proposed DT integration framework. Please refer to
Figure 6.
Nodes (1) and (4) represent the already described information systems: the PLM, ERP, and MES. It is shown that the MES is still directly connected with the PLM and ERP, while the ERP transmits data through other nodes.
Node (2) represents the GPS (Global Planning System). It is an active decision-making hub: the production plan decided through (and with the help of) the GPS is transmitted to the floor level (3), whose advances, both automatic and manual, are recorded on the MES (4). Passing these advances to the ERP (1) closes the external cycle of the system.
Through these nodes, it is possible to define the arcs (a), (b), and (e): arc (a) represents the incoming information, specifically regarding new production orders; arc (b) represents the net requirements that need to be planned in the GPS node; arc (e) represents the production information that is funneled into the MES.
Integration with the DT (5), whose virtual counterpart will involve a DES model, requires the following interactions, described on the arcs:
Arc (c) represents the structural information needed for the boundary conditions for any type of simulator: the incoming orders, inventory levels, product master data, production cycles, bill of materials, suppliers for external processing, customer master data, assembly cycles, machinery involved in the department, and machinery specifications. These are the starting data to describe and contextualize the reference environment and begin conceptually modeling the DES model to implement.
Arc (d 2–5) represents the production planning processed by the decision-makers with the support of the GPS. Using this benchmark, the DT will monitor the production and analyze deviations.
As anticipated, arc (f 4–5) is a single link that represents two logically different flows: historical data and update data.
The historical data are those related to previous production runs and will be processed with appropriate data mining systems to derive the information necessary for the subsequent analyses, such as cycle times, throughput times, punctuality by order or product category, queue waits, and the impact of rework on the system. These data will form the basis for the factory times and will be kept up to date, so that reliable data are always available to measure the process.
Update data, on the other hand, are those that are collected by the system at every reasonable time step, and with which the DT simulation system will perform monitoring and predictive process monitoring.
While the historical data will require periodic, schedulable updating, the latency of the update-data link must be reduced as much as possible, compatibly with the data extraction and simulation times, so that timely action can be taken should the need for decision making arise.
Arc (g) represents the information gained to support the decision-maker: the simulation system thus structured will allow for an immediate and in-depth presentation of the production progress, providing monitoring and predictive monitoring and signaling in a timely manner if deviations from the initially formulated plan occur. Upon triggers from the decision-makers, the DT simulation system will propose options for dealing with delays or other disruptive events that could cause deviations from the plan (e.g., delivery delays, extraordinary machinery maintenance, or a sudden increase in defects).
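A minimal sketch of the deviation check underlying arc (g) is given below: predicted completion times from the DES run are compared with the planned dates, and an alert is raised when the expected delay exceeds a tolerance. The tolerance and the data are illustrative.

```python
# Minimal sketch of the deviation check behind arc (g): predicted completion
# times from the DES run are compared with planned dates, and a batch is
# flagged when the expected delay exceeds a tolerance. Values are illustrative.
import datetime as dt

TOLERANCE = dt.timedelta(hours=4)   # illustrative threshold before alerting

planned = {"B1": dt.datetime(2024, 5, 6, 12), "B2": dt.datetime(2024, 5, 7, 12)}
predicted = {"B1": dt.datetime(2024, 5, 6, 20), "B2": dt.datetime(2024, 5, 7, 10)}


def delay_alerts(planned, predicted, tolerance=TOLERANCE):
    """Return the batches whose predicted completion exceeds the plan by more than `tolerance`."""
    return {
        batch: predicted[batch] - planned[batch]
        for batch in planned
        if batch in predicted and predicted[batch] - planned[batch] > tolerance
    }


print(delay_alerts(planned, predicted))   # -> B1 flagged with an 8 h predicted delay
```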
Thus, this integrated system is the implementation framework that will guide the next steps: the DT, in continuous connection with the lower levels of the automation pyramid, will support top decision-makers.
6. Discussion
The presentation of this framework leads to the development of two thematic threads inherent to the DT: the use of data and the potential of such solutions for industry.
Regarding the use of data, it should be noted that the historical data from the MES, properly processed and preprocessed, will also be used to obtain the reference metrics for validating the simulation model (processing times, throughput times, and cycle times); the simulation model of the DT will then have to progressively simulate the real situation with the updated data that the MES provides [
23].
The link between the DT and big data is addressed within the Industry 4.0 literature, and several contributions manifest the possible positive synergies [
24,
25]. However, it is important to note that big data management techniques, such as data mining and process mining, are not a sufficient shield against possible human error. Indeed, operators manually enter MES data for those steps that are difficult, too sensitive, or uneconomical to automate.
A particular adaptation of the real system to ensure data quality and a continuous dataflow is necessary due to challenges that cannot be resolved through process mining alone. Relying only on historical data as the reference for the scenario analysis obtained through the DES model is inherently fragile, and certain types of data corruption, or missing data, can make the entire process unreliable. Therefore, it is crucial to establish an unambiguous standard for the quality of the required data and the collection method.
As for industry application, it is well known that, in human-managed decision making, the quality of the solution identified depends on the experience and information of the person making the decision. Employees with greater experience may more readily anticipate the consequences of their actions than those with less experience; however, regardless of the experience level, the accuracy of independently calculated estimates or the probabilistic prediction of events remains inadequate [
26].
This is compounded by the fact that disruptive events, such as extraordinary maintenance and significant changes in the order mix and quantity, make it increasingly challenging to organize demand management in a defined and unsophisticated way.
In addition, a certain lack of transparency in the management of orders (from their receipt to their fulfillment) is a problem that all systems, not just manufacturing ones, face; especially in small- and medium-sized enterprises, data are often not sufficiently digitized, integrated, coordinated, or updated.
To manage demand volatility and system dynamism, many companies are starting to invest in optimization systems; however, the problems are often evaluated locally rather than globally, which can mislead decision-makers instead of helping them, leading them to prefer locally optimal solutions over globally optimal ones that better fit the overall business goals [
9].
Although a digitization effort to integrate a DT in the company certainly has risk factors and downsides (technologies present in the company that are not yet mature enough, employees not ready for the technology transition, initial costs, and system maintenance costs), it also represents an explicit commitment to improving the company’s internal data management and updating system, which is one of the challenges of digitalization [
27].
The benefits directly attributable to the DT relate to its ability to enhance the stability of planning performance amidst unforeseen circumstances. As mentioned earlier, there is a need for decision support systems that are more transparent, more traceable, and more reliable, and that can propose solutions with sustainable impacts on the entire evaluated system, not just on a production line or a specific machine.
7. Conclusions
This study explored the first steps required for the implementation and integration of a Digital Twin (DT) in a discrete manufacturing context, with a focus on the synergy with existing information systems such as the Manufacturing Execution System (MES) and Enterprise Resource Planning (ERP).
The importance of a well-defined integration framework that supports dataflow synchronization and facilitates monitoring and control within the manufacturing department was highlighted.
It has been pointed out that data management is crucial to the success of DT implementation, as the quality and reliability of the information directly affect the reliability of the monitoring, simulations, and what-if analyses, and thus the ability to make informed decisions. Integrating the existing enterprise information systems is critical to ensure operational cohesion, enabling the DT to provide transparent and effective decision support as well as continuity of information. Thanks to its virtual representation of the reference system and its bidirectional interaction with the real world, which tends as much as possible toward real time, the DT can contribute to a more timely and informed management of the warning signals that may be identified.
In the proposed case scenario, the reality of complex production requires an explicit effort to create data-sharing, management, and processing structures. Without these, it is impossible to structure a DT with a Discrete Event Simulator model as the core of the virtual part.
The main limitation of this work lies in the fact that it is based on a particular case study in a discrete manufacturing context, which may limit the transferability of the results to other companies or industries. In addition, the work is still in the early stages of development; therefore, the conclusions may only partially reflect the complexities and challenges that arise during a full DT implementation.
To further develop the research, it is critical to focus first on structuring the simulation system, creating the infrastructure for the integration with business information systems, especially the connection with historical data, properly cleaned and preprocessed, from the MES.