Generic Multi-Layered Digital-Twin-Framework-Enabled Asset Lifecycle Management for the Sustainable Mining Industry

Abstract: In the era of digitalization, many technologies are evolving, namely, the Internet of Things (IoT), big data, cloud computing, and artificial intelligence (AI)


Introduction
Considering the competitiveness of the global market, businesses must create data-oriented interactions to satisfy sustainability, which plays a relevant role in all phases of asset lifecycle management [1]. In the early phases of a product's development, several issues need to be addressed, including the product's design, manufacture, usage, and disposal, as well as its impact on society. Each of these stages has its own specific sustainability considerations and measurements. As an area of interest in digital transformation, from manufacturing to service and operations, the Fourth Industrial Revolution (Industry 4.0) presents the promise of enhanced flexibility, higher quality, and improved productivity [2].
Every physical asset needs a digital representation conducive to fully realizing Industry 4.0's potential. To tackle difficult business problems, mirroring actual assets with digital representations can be quite valuable. One of the most desired qualities in the age of Industry 4.0 is hence the ability to deliver distinctive characteristics at scale. The fast advancement of communication and information technology has made it possible for digitalization to change how business is conducted along industrial value chains. Another term for this process is "Industry 4.0", or "the fourth industrial revolution". Its goals include highly customized and optimized manufacturing as well as enhanced automation and adaptability. The industrial sector is already moving toward more sustainable practices by suggesting a number of strategies to improve the effectiveness of these environmentally friendly activities. The digital twin (DT) is one of the most promising enabling technologies for achieving these Industry 4.0 ambitions. Having a two-way dynamic mapping between a physical thing and its digital model, which has a structure of linked parts and meta-information, a DT is a digital representation of a physical entity [3]. A framework for digital twin manufacturing is described in the ISO 23247 series as virtual representations of actual manufacturing components such as workers, goods, assets, and process specifications.
The present work aims to provide a new approach based on a multi-layered DT framework for enabling asset lifecycle management that supports sustainability in the mining industry. It answers several challenges faced at the mine, such as big data, the heterogeneous types of the gathered data, a supporting infrastructure that must comply with specific requirements (communication links, distributed IoT components, big data processing, and real-time response), the high cost of maintenance, and the complexity of the process.
The increasing demand for customized products is having an impact on all phases of the product lifecycle, which is challenging production. By using engineering and management technologies such as the digital twin, digitization creates a variety of options. This study proposes a general DT architecture framework to fully utilize DT capabilities in response to issues in the mining industry. The framework intends to accomplish sustainable mining by using RAMI 4.0, managing assets throughout their lifecycles, and creating a collaborative ecosystem that incorporates four services.
As shown in Figure 1, this paper is structured as follows: After the introduction, Section 2 provides the state of the art in terms of Industry 4.0 and the mining industry. Section 3 discusses the DT reference architecture model in Industry 4.0, alongside the integration levels of DT technology and its potential for enabling value lifecycle management and creating a Phygital (physical-digital) collaborative environment. Section 4 proposes a DT reference multi-layer approach for the mining industry as the leading contribution of this study; on top of this, a discussion section debates the features to consider when designing a DT framework architecture and the advantages of the proposed method for creating a DT for the mining industry. Section 5 concludes and outlines future work.

Research Methodology
In this research, the following methodology aims to create a digital twin framework for asset lifecycle management in the mining industry. The methodology comprises several steps. Firstly, an exploratory literature review was conducted to examine the prevalent services, concepts, architectures, and frameworks related to DTs within the context of manufacturing, including engineering, production, process, and operations. This literature review served as a foundation for the conceptualization of the proposed digital twin framework. Secondly, focus groups were organized to gather information and feedback on the theme of digitalization within the mining industry. These focus groups were composed of experts and stakeholders from the mining industry, including managers, engineers, and technicians. The information collected from these focus groups provided valuable insights into the challenges and opportunities related to digitalization within the mining industry. Thirdly, based on the findings of the literature review and focus groups, the study aimed to identify the challenges faced by the mining industry in terms of the integration of digitalization, specifically digital twin technology. Additionally, the study listed the promising benefits of using this technology for asset lifecycle management within the mining industry. Fourthly, using the information gathered in the previous steps, a conceptual framework architecture of DT-enabled asset lifecycle management was developed. This framework architecture adhered to the standards and concepts of the reference architecture model in Industry 4.0 (RAMI 4.0) and was designed to be highly flexible and adaptable to the specific conditions and challenges of the mining industry. Lastly, to validate the proposed framework, the experimental open pit mine of OCP Benguerir was selected as the case study site. The framework was implemented in the mine and stakeholders were involved throughout the process, including managers, engineers, and technicians. The results of the case study provided valuable feedback on the performance and effectiveness of the proposed digital twin framework. In conclusion, this comprehensive research methodology provides a robust and reliable approach for the development and validation of the proposed digital twin framework for asset lifecycle management within the mining industry.

The emergence of Industry 5.0 is predicated on the observation, or assumption, that Industry 4.0 is less concerned with the original principles of social justice and sustainability and more concerned with digitalization and AI-driven technologies for enhancing the efficiency and flexibility of production [4]. The DT is viewed as a crucial enabler in Industry 5.0 for developing intelligent, autonomous systems that can interact and communicate with one another, resulting in a highly connected and highly optimized manufacturing process. The DT can be used to simulate and model intricate physical systems, including each of their individual parts and interactions. The last step is to use these data to forecast future performance and pinpoint potential improvement areas. The DT also enables the remote management and control of physical systems, which lowers the demand for human involvement and boosts productivity and efficiency generally. By integrating digital and virtual solutions in multiple industries, smart features such as production prediction, energy efficiency optimization, and better maintenance scheduling are enabled; these smart features are leading the industry towards 5.0, where processes can become more autonomous on multiple levels, creating a full synergy between humans and robots co-working [5]. However, many challenges are faced when implementing these smart features, for example, the lack of skills of the integrators, because the newest technologies require time to be understood and mastered by the technicians and engineers who are responsible for this digital
transition in the industry. Data and cybersecurity also create big challenges in industrial case studies because networking in process automation is very basic compared to Internet and cloud computing networks; therefore, cyber attackers are often able to find vulnerabilities in industrial networks. As a solution to some vulnerabilities in cyber systems, novel blockchain-based components have been introduced in recent works, for example, smart meters, which are applied in different industries [6]. Other challenges such as culture change and capital investment create big obstacles, where those responsible for the industrial process are hard to convince to invest in digital transformation [7].
Industry 4.0 has become a living context where some industries have successfully implemented most of its features and are ready for a fully autonomous Industry 5.0; in other industries, however, many challenges have been recently discussed and case studies are being developed, such as integrating artificial intelligence, the Internet of Things, and machine learning. Additionally, the DT is a smart feature enabling Industry 4.0 that researchers are currently developing. Aksa [8] has covered a wide range of research topics, from the intelligent information management of complex models to building information management and the interaction of building systems, where researchers are becoming more interested in using the DT to manage their information and in developing new research lines focused on data exchange and the interoperability of building information modeling (BIM) and facility management (FM) [9]. On the other hand, Giulio [10] has discussed DT studies and applications in the fields of engineering and computer science, as well as spotting research hotspots and emerging trends that were designed and tested to help operators in both regular and emergency situations and to improve their capacity to regulate safety levels. According to Rui [11], who developed a meta systematic review of sustainability requirements of DT-based applications, the benefits of DTs as they are currently perceived are not well understood, DTs across the product lifecycle or the DT lifecycle have not received enough research, and it remains unclear how DTs can help with decision making or cost reduction. It is necessary to improve and better integrate DT technology implementation into the IoT.
As a matter of fact, DTs are now being used by 13% of companies implementing IoT projects, while a further 62% are either deploying DTs now or have plans to do so, according to Gartner. The DT market is anticipated to grow from USD 3.8 billion in 2019 to USD 35.8 billion by 2025, at a compound annual growth rate (CAGR) of 37.8%, according to a Markets and Markets report [12].
Table 1 highlights some DT applications in different industrial sectors where researchers around the world are currently developing them with multiple features such as predictive maintenance, monitoring, forecasting, diagnosis, production pace improvement, virtual reality integration, cybersecurity, and cyber-physical system integration. An excerpt of Table 1 (author, year, application, feature, contribution type):

[26], 2022, grinding mill case study, predictive maintenance based on data-driven models, case study
Allessandro [27], 2020, manufacturing production, monitoring ergonomics, case study
Jusso [28], 2021, overhead crane, accelerating production pace, case study
Toh Yen [29], 2021, shipyard, fully integrated digital thread and digital twin, framework architecture
Nabil [30], 2022, open pit mine stacker machine, full digital twin of a stacker machine, architecture

Mining Industry: The Experimental Open Pit Mine
According to a TMR analysis, the worldwide smart mining market will increase at a CAGR of 10.2% between 2021 and 2031 [31], and the mining equipment market will reach USD 185 billion by 2030, growing at a 4.1% CAGR due to rapidly rising construction activities in emerging nations (exclusive report by Acumen Research and Consulting [32]); therefore, the need to develop and pursue the digital transformation of the mining industry is clearly important. The experimental open pit mine was first built to extract mining goods and optimize output while minimizing energy usage and monitoring grid quality. The maintenance experts employed a strategy of curative and preventative maintenance. The supervisory control and data acquisition (SCADA) system was fully focused on process machining supervision and key performance indicators in production, meaning that the field was ready to implement DT solutions in open pit mines on top of the implemented solutions of Industry 3.0, namely, distributed control systems, industrial automation, and other enterprise resource planning (ERP) and manufacturing execution systems (MESs). In this context, Adila [33] has proposed an architecture to forecast and monitor energy consumption based on a research study on smart energy management as a feature of the digital open pit mine developed by Oussama et al.
[34], where the authors described how power meters can be conclusive in an automated open pit mine. Mariya [35] has developed an online diagnostic technique for a jaw crusher, a very critical piece of equipment in the mining industry, that can be fully integrated into a DT by acquiring and pre-processing the used vibration data. The same approach has been followed in the mining industry using machine learning algorithms for the diagnostics of squirrel cage induction motors using only electrical data [36,37]. In addition to determining energy consumption, data correlated with other smart sensors, such as moisture, temperature, and others, can also be used to assess the health index of the power transformer, one of the key components in all industries, which represents the condition of all machines in the cyber system and helps build a DT of all industry components [38,39].
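The idea of fusing several sensor signals into a single transformer health index can be sketched as follows. This is a minimal illustration only: the signal limits and weights below are hypothetical assumptions for demonstration, not values taken from [38,39].

```python
# Illustrative sketch: a weighted health index for a power transformer,
# combining normalized sensor readings. All limits and weights are
# hypothetical, chosen only to demonstrate the fusion idea.

def normalize(value, good, bad):
    """Map a reading to [0, 1], where 1 is fully healthy and 0 is at the bad limit."""
    score = 1.0 - (value - good) / (bad - good)
    return max(0.0, min(1.0, score))

def transformer_health_index(moisture_ppm, oil_temp_c, load_pct):
    # Hypothetical healthy/critical limits for each signal.
    scores = {
        "moisture": normalize(moisture_ppm, good=5, bad=40),
        "oil_temp": normalize(oil_temp_c, good=55, bad=105),
        "load": normalize(load_pct, good=60, bad=120),
    }
    # Hypothetical importance weights summing to 1.
    weights = {"moisture": 0.4, "oil_temp": 0.35, "load": 0.25}
    return sum(weights[k] * scores[k] for k in scores), scores

index, parts = transformer_health_index(moisture_ppm=12, oil_temp_c=70, load_pct=80)
```

A monitoring service in the DT could periodically recompute such an index from correlated sensor streams and raise maintenance alerts when it drops below a threshold.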
In this case study of the open pit mine of Benguerir, the DT and its features will focus on the fixed installations, which are divided into three essential parts, namely, the destoning unit, the screening unit, and the train loading station. The destoning unit contains belt conveyors, a stone jaw crusher system, stackers, and a wheel bucket reclaimer. The screening unit, which is fed by a 1 km belt conveyor from the first unit, has screeners and a storage system that contains a transborder and a stacker machine. The train loading station has two wheel bucket reclaimer conveyors, a positioning system, and train loading on two railways. To sum up, every station has different machines.

According to [40], the DT has experienced an impetus in academia and industry and, currently, it is recognized as a foremost method in the transformation process to Industry 4.0. A DT can be defined as a dynamic virtual representation of physical, economic, and/or social systems and the processes that are related to the system in question, enabling the tracking, adjusting, and/or predicting of its status in a real-time manner [41]. However, the DT can be considered a polysemous term since there is no consensus yet on a unique definition of this concept. The most widely accepted definition is given by NASA, which identifies the DT as an "integrated multi-physics, multi-scale, probabilistic simulation of a vehicle or system that uses the best available physical models, sensor updates, fleet history, etc., to mirror the life of its corresponding flying twin" [42]. Otherwise, there are abundant domain- and application-specific definitions which preclude a common understanding of this concept and impede its evolution. Indeed, the DT has overshadowed many domains since its first application in aerospace by NASA in 2010 to track the state of a flying spacecraft [43]. Thereafter, the DT was investigated and implemented by many sectors, including smart city, healthcare, construction, automobile, agriculture,
manufacturing, and the aviation industry, to name but a few.
In the manufacturing context, the DT is a virtual representation that mimics the real-time operation of a physical manufacturing asset (e.g., machine, production line, shop floor, product, worker), enabling decision making such as its real-time monitoring as well as predicting its future behavior, state, performance, and maintenance needs [24]. In [44], three major application scenarios of the DT were stated: the supervisory DT provides the real-time status of the counterpart physical system, which supports the decision-making process; the interactive DT transcends supervisory tasks by automatically adjusting a parameter or a set of parameters of the manufacturing asset once a disruption occurs; and the predictive DT supervises and predicts the future state of the manufacturing asset with respect to implementing the corrective measures that maintain and/or optimize the current performance. Based on these three capabilities, the DT has rejuvenated many manufacturing tasks: (1) Equipment health management: The DT allows one to enhance the reliability, availability, and safety of manufacturing systems and workers through seamless monitoring and effective maintenance decisions founded on the prognosis/diagnosis outputs of the DT. In the literature, abundant studies have leveraged the DT in equipment health management. For instance, [45] presents a model-driven approach to estimate the remaining useful life (RUL) of an equipment's component based on data acquired from the physical machine's controllers and the simulation of virtual models. Ref.
[46] presents a data-driven DT for fault diagnosis under data insufficiency which is usable in both the development and maintenance stages. At first, the digital model enables an intelligent design where potential problems can be detected and solved. At the same time, a data-driven fault diagnosis model is trained by the data generated from the physics-based model. In the second stage, the previously trained diagnosis model is adjusted using the transfer learning technique with respect to enabling timely monitoring and predictive maintenance.
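The RUL-estimation idea behind equipment health management can be sketched in a few lines: fit a degradation trend to a monitored health indicator and extrapolate to a failure threshold. This is a deliberately simple linear stand-in, not the model-driven approach of [45]; the data and threshold are hypothetical.

```python
# Minimal sketch of remaining-useful-life (RUL) estimation: least-squares
# linear fit of a degrading health indicator, extrapolated to a failure
# threshold. Data and threshold are illustrative.

def estimate_rul(times, health, failure_threshold):
    """Return time remaining until the fitted trend crosses failure_threshold,
    or None if the indicator shows no degradation (non-negative slope)."""
    n = len(times)
    mean_t = sum(times) / n
    mean_h = sum(health) / n
    cov = sum((t - mean_t) * (h - mean_h) for t, h in zip(times, health))
    var = sum((t - mean_t) ** 2 for t in times)
    slope = cov / var
    if slope >= 0:  # not degrading: no finite RUL
        return None
    intercept = mean_h - slope * mean_t
    t_fail = (failure_threshold - intercept) / slope
    return t_fail - times[-1]

# Health indicator dropping 0.05 per cycle from 1.0; failure below 0.3.
times = list(range(10))
health = [1.0 - 0.05 * t for t in times]
rul = estimate_rul(times, health, failure_threshold=0.3)
```

In a real DT, the trend model would be replaced by the physics-based or data-driven models described above, but the interface, history in, time-to-threshold out, stays the same.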
(2) Production control and optimization: With the dynamic changes and uncertainty in the manufacturing environment, the production process must be continuously monitored. Real-time data provide the DT with awareness of the current status of the manufacturing asset, providing corrective measures to optimize the overall throughput. In the literature, [47] presents a data-driven DT that dynamically optimizes controllable parameters' values to realize production control optimization in a petrochemical industry environment. Ref. [48] presents an MES-assisted DT which can react to disturbances on the shop floor in a real-time manner, enabling error state management and the reactive disassembly of assembled products once quality standards are not met.
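The core loop of such parameter optimization, evaluating candidate values of a controllable parameter against a virtual model and applying the best one, can be illustrated with a toy example. The quadratic throughput model and the parameter name are hypothetical stand-ins for a real simulation, not the method of [47].

```python
# Illustrative sketch of DT-based production control optimization: score
# candidate values of one controllable parameter against a virtual throughput
# model and pick the best. The model is a hypothetical stand-in.

def throughput_model(feed_rate):
    """Hypothetical virtual model: throughput (t/h) peaks at a moderate
    feed rate, then falls off as the line saturates."""
    return -0.8 * (feed_rate - 65) ** 2 + 900

def optimize_feed_rate(candidates):
    """Return the candidate feed rate with the highest modeled throughput."""
    return max(candidates, key=throughput_model)

best = optimize_feed_rate(candidates=range(40, 91, 5))
```

A production DT would run this evaluation continuously against live state, replacing the toy model with its calibrated virtual replica.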
(3) Production scheduling: The uncertainty in production processes subverts static production scheduling methods. The DT can dynamically elaborate and/or verify schedules once a disruption occurs on the shop floor. The literature on DTs in production scheduling remains nascent. For example, [49] proposes a DT-assisted dynamic production scheduling framework that allows for detecting disturbances based on the distance between physical and virtual models, as well as predicting the impact of disturbances, thus triggering the rescheduling to avoid hazardous and/or cascading effects, along with the performance evaluation of elaborated schedules. Ref. [50] presents data-driven DT-based dynamic scheduling that monitors and schedules tasks for robot manufacturing systems by communicating to robots the optimal path to complete a specific task.
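The disturbance-detection trigger described for [49], rescheduling when the distance between the physical and virtual state exceeds a tolerance, can be sketched as follows. Task names, progress values, and the tolerance are hypothetical.

```python
# Sketch of drift-triggered rescheduling: compare physical and virtual task
# progress (values in 0..1) and trigger a reschedule when their distance
# exceeds a tolerance. All values are illustrative.

def state_distance(physical, virtual):
    """Euclidean distance over matching task-progress entries."""
    return sum((physical[k] - virtual[k]) ** 2 for k in physical) ** 0.5

def needs_reschedule(physical, virtual, tolerance=0.15):
    return state_distance(physical, virtual) > tolerance

virtual = {"task_a": 0.50, "task_b": 0.30}   # schedule-predicted progress
on_plan = {"task_a": 0.52, "task_b": 0.31}   # small drift: keep schedule
delayed = {"task_a": 0.20, "task_b": 0.30}   # task_a stalled: reschedule
```

When the trigger fires, the framework would then evaluate candidate new schedules against the virtual model before committing one to the shop floor.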
Looking through the lens of the elementary components of a generic DT in the manufacturing sector, a DT must comprise the physical asset in question, a virtual asset, and a real-time two-way information flow between the physical and virtual assets. More specifically, a DT framework is grounded on IoT devices with respect to acquiring sensor readings from different sub-components of the physical asset, while maintaining a high-fidelity connection between IoT devices for reliable and timely information exchange. Data collected from different IoT devices and software are another key enabler of a DT; they are fundamental to monitoring the physical asset, maintaining its normal operation, and providing input to the decision model. Storage tools and big data analysis for extracting relevant information from data are also required within the DT. Artificial intelligence techniques such as machine learning are also required to predict the future behavior of the physical asset, as well as to identify efficient mitigation strategies in abnormal situations. The privacy and security of data among the various components involved in the DT must be tackled with respect to preserving data from both external and internal attacks that may tamper with sensitive data, jeopardizing the safety of the physical asset and the workers.
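The two-way information flow between these elementary components can be made concrete with a minimal sketch: sensor readings flow from the physical side to the virtual side, and a decision taken on the virtual side flows back as a command. All class and signal names are illustrative.

```python
# Minimal sketch of the physical asset / virtual asset / two-way flow triad.
# Names, the temperature signal, and the cooling command are hypothetical.

class PhysicalAsset:
    def __init__(self):
        self.temperature = 20.0
        self.cooling_on = False

    def read_sensors(self):                 # physical -> virtual direction
        return {"temperature": self.temperature}

    def apply_command(self, command):       # virtual -> physical direction
        if command == "enable_cooling":
            self.cooling_on = True

class VirtualAsset:
    def __init__(self, temp_limit=80.0):
        self.state = {}
        self.temp_limit = temp_limit

    def sync_from_physical(self, readings):
        self.state.update(readings)         # mirror the physical state

    def decide(self):
        if self.state.get("temperature", 0.0) > self.temp_limit:
            return "enable_cooling"
        return None

plant, twin = PhysicalAsset(), VirtualAsset()
plant.temperature = 95.0                    # overheating event on the plant
twin.sync_from_physical(plant.read_sensors())
command = twin.decide()
if command:
    plant.apply_command(command)
```

In a deployed DT, each arrow in this loop would pass through the IoT, storage, and security layers described above rather than direct method calls.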
DT research is growing rapidly, particularly in the manufacturing industry, where the DT is viewed as a virtual representation of a physical asset utilized for real-time monitoring, decision making, and behavior prediction. However, DT is a polysemous phrase, and there is no agreement on a single definition, which makes it difficult for the concept to evolve. A physical asset, a virtual asset, and a real-time, two-way information exchange between the physical and virtual assets are the required components of a DT in manufacturing. To build a shared knowledge of the DT and address the difficulties in its use in the industrial sector, additional study is required despite advancements in equipment health management, production control and optimization, and production scheduling. Because product lifecycles are shortening, the value chain is becoming more globalized, and pivotal choices must be made quickly, Industry 4.0 makes all business units in the manufacturing sector more complex [3].
To successfully integrate cutting-edge technology while navigating this complexity, it is essential to understand the concept of DTs. Furthermore, it is necessary to have a common definition when discussing Industry 4.0 and DTs; the reference architecture model for Industry 4.0 (RAMI 4.0) includes the key components of Industry 4.0 [3]. Using this framework, many perspectives on the DT can be organized methodically. RAMI 4.0 can assist in simplifying complex tasks into packages that are well-aligned with key Industry 4.0 characteristics [51]. A digital twin reference architecture model for Industry 4.0 is shown in Figure 2. This layered model, which describes all key facets of the DT, is made up of a coordinate system in three dimensions. The first dimension describes the product lifecycle, from the first idea to the decommissioning of the product, according to IEC 62890 (industrial-process measurement, control, and automation: lifecycle management for systems and components). The second dimension encapsulates the architecture layers, spanning seamlessly from the asset up to the business level, in which the DT layer is utilized. The third dimension represents the level of integration of the DT layer, conforming to the data flow. This method simplifies complex relationships into smaller, more manageable clusters, which include the agile value lifecycle, the Industry 4.0 architecture layers, and the DT integration hierarchy [52].

Horizontal Axis: Agile Product Lifecycle
The central axis depicted in Figure 2 is discussed in this section. The left horizontal axis depicts a continuous, iterative lifecycle of product and service value that encapsulates a set of interrelated value creation activities, ranging from prototype development to products, including design, manufacturing, logistics, sales, and services. Within this axis, types and instances must be distinguished. A type is always created with ideation. This includes preliminary design, simulation, prototype development, and testing. Once design and prototyping have been finished and the manufacturing stage is triggered, a "type" becomes an "instance".
The agile approach, as a proactive way to anticipate changes across the lifecycle, can be implemented by leveraging the capabilities of the DT (e.g., data-driven individualization) with respect to enabling self-directed and continuous learning. The agile DT value lifecycle virtually duplicates the physical world by using relevant data that describe the actual status of things, enabling incremental development. Moreover, the agile value lifecycle can predict the future by using past scenarios and observing the behavior of the twin with minimum trial-and-error cost. Hence, the value lifecycle axis has an important role in this reference model since both the agile and twinning concepts are suitable for individualization. On the other hand, the DT can be involved in requirement definition and design improvement through seamless interaction with customers, capturing their preferences for functionalities and appearance.
Vertical Axis: RAMI 4.0 Layers

The vertical axis describes the combined conceptual layered architecture enclosed by the Industry 4.0 hierarchy and the DT layer, comprising the six layers associated with RAMI 4.0, which are as follows:

• The asset layer: This corresponds to the physical entity where the DT's physical representation is located.

•
The integration layer: Run-time data and engineering data can be separated in this layer. Run-time data are produced by sensors or events and show the physical entity's current state. Typically, they are time series data. The underlying infrastructure must adapt to application-specific requirements, such as big data processing or real-time reaction, because these data are particularly dynamic. Engineering data, in contrast, are often static and do not frequently alter over time. Examples include details about a physical component inside a plant or the topology of the plant.

•
The communication layer: Mostly, industrial communication or Industrial Internet of Things (IIoT) protocols can be used. The specific protocol is determined by the demands of the application, such as real-time capabilities or publish/subscribe support. As a result, a combination of several protocols can be envisaged, essentially to ease the data flow into the upper levels.

•
The information layer: Here, the acquired data are semantically processed and related to additional context information; the core element of the information layer is the shared knowledge base, which contains context-sensitive data about resources and services. The semantics of this information are enriched by linking data to it. By using various information modeling tools, various levels of semantic expressivity can be attained. Data should ideally be kept in suitable databases in their original formats.

•
The functional layer: From an architectural perspective, the functional layer describes functions and services, along with their interactions. The functions are represented without the use of actors or actual physical implementations in applications, systems, and components, while the layer provides a framework for the horizontal combination of diverse functions and allows for the formal description of functions and services. It includes a run-time environment for applications and technical functions, as well as a modeling environment for services supporting business processes.

•
The business layer: This relates to process logic, abstract business models (rules), and assuring the quality of operations along the value chain. This layer facilitates business model mapping, value stream function integrity assurance, and process output assurance. It permits the modeling of the requirements that the process must adhere to and contains the legal and regulatory framework conditions. This layer also establishes connections between various business processes and organizations.
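The integration layer's separation of dynamic run-time data from static engineering data, described in the layer list above, can be sketched with a small data structure. The tags, asset identifiers, and record formats are hypothetical illustrations.

```python
# Sketch of the integration-layer split: run-time data kept as time series,
# engineering data kept as static asset metadata. All identifiers are
# hypothetical.

from collections import defaultdict

class IntegrationLayer:
    def __init__(self):
        self.runtime = defaultdict(list)   # tag -> [(timestamp, value), ...]
        self.engineering = {}              # asset id -> static description

    def ingest_runtime(self, tag, timestamp, value):
        """Append a dynamic sensor/event reading to the tag's time series."""
        self.runtime[tag].append((timestamp, value))

    def register_asset(self, asset_id, description):
        """Store rarely changing engineering data (topology, component specs)."""
        self.engineering[asset_id] = description

    def latest(self, tag):
        """Most recent (timestamp, value) for a tag, or None if empty."""
        return self.runtime[tag][-1] if self.runtime[tag] else None

layer = IntegrationLayer()
layer.register_asset("conveyor_01", {"length_m": 1000, "unit": "screening"})
layer.ingest_runtime("conveyor_01/motor_temp", 0, 61.2)
layer.ingest_runtime("conveyor_01/motor_temp", 1, 62.8)
```

Keeping the two stores separate lets the time-series side scale for big data and real-time access while the engineering side stays a simple, slowly changing registry.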

Data Flow Axis
This axis is defined by the data flow specifications that dictate the integration level of each digital replica (model, shadow, twin) [30]. It aims to network the different participants across hierarchical levels and maintain the interaction between the physical and the cyber part of the CPS, conducive to ensuring the flexibility of the systems, machines, and functions across all the networked entities of the RAMI 4.0 vertical axis layers. More details about this axis are discussed in the next section.
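The publish/subscribe pattern that the communication layer offers for moving data along this axis can be illustrated with a minimal in-process broker. This is only a pattern sketch, not a real IIoT stack such as MQTT or OPC UA, and the topic names are hypothetical.

```python
# Minimal in-process publish/subscribe broker illustrating the data-flow
# pattern used between DT participants. Topic names are illustrative.

class Broker:
    def __init__(self):
        self.subscribers = {}              # topic -> [callback, ...]

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        # Deliver the payload to every subscriber of the topic.
        for callback in self.subscribers.get(topic, []):
            callback(topic, payload)

broker = Broker()
received = []
broker.subscribe("mine/crusher/vibration", lambda t, p: received.append((t, p)))
broker.publish("mine/crusher/vibration", {"rms_mm_s": 4.2})
```

The decoupling shown here, publishers unaware of subscribers, is what lets new participants attach to the data flow without changing existing components.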

Digital Twin Integration Level
The simplest definition of a "digital twin" is the seamless integration of data between the physical layer and the cyber layer of the cyber-physical system (CPS). There are three levels of integration, and each of them is defined essentially by the data flow exchange between the physical plant and the digital system, regardless of whether the latter is a digital model, a digital shadow, or a DT. Figure 3 explains the differences between the digital model, shadow, and twin according to their levels of integration. For each physical asset in the cyber layer, the type of data exchange defines the digital replica of the asset [30]. This digital replica can be classified into three possible types according to its level of integration [30]. A "digital model", like simulations and mathematical models, does not have a real-time connection to or from the physical object. In turn, a "digital shadow" is a significant application for real-time monitoring, where a change in the physical thing's state causes an instant change in the state of the digital thing; this is possible with one-way real-time data communication from the physical to the digital space. At the top integration level, the digital replica is transformed into a DT when it combines the five features shown in Figure 3: connectivity, simulatability, active data acquisition, synchronization with the physical asset, and active decision applicability through a real-time link between the DT and the physical twin.

•
Connectivity is the ability to connect systems, services, or application programs.
Ideally, these connections are established without requiring many changes to the applications or the systems on which they run.
• Simulatability: The ability of the system to replicate the model and the behavior of the physical asset with the capacity to deal with all of its properties and parameters.
• Active data acquisition is the key point to feed the models with data and establish the real-time link between the two sides of the cyber-physical system (CPS).
• Synchronization ensures the sequencing of the data flow between the two components of the cyber-physical system, creating a seamless, fully synchronous, real-time exchange for the DT at its third level of integration.

•
Active decision applicability: In our case, the significant benefit of utilizing the DT replica is the bidirectional data flow exchange between the physical part and the DT of the system. This makes it possible to apply the decisions taken by the twin to the physical plant.
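The model/shadow/twin distinction above reduces to the direction of automatic data flow, which a short sketch can make explicit. The classification rule follows the levels described in the text; the function and enum names are illustrative.

```python
# Sketch of the three integration levels, distinguished only by which
# directions of data flow between the physical and digital sides are
# automatic. Names are illustrative.

from enum import Enum

class Replica(Enum):
    DIGITAL_MODEL = "manual exchange in both directions"
    DIGITAL_SHADOW = "automatic physical-to-digital, manual digital-to-physical"
    DIGITAL_TWIN = "automatic exchange in both directions"

def classify(auto_physical_to_digital, auto_digital_to_physical):
    """Map the two automatic-flow flags to an integration level."""
    if auto_physical_to_digital and auto_digital_to_physical:
        return Replica.DIGITAL_TWIN
    if auto_physical_to_digital:
        return Replica.DIGITAL_SHADOW
    return Replica.DIGITAL_MODEL
```

Under this rule, the manually fed design elements mentioned below classify as digital shadows, while the PLC-connected installations classify as full DTs.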
On top of the DT replica, there is the intelligent digital twin (IDT), which combines artificial intelligence with all of the features of a DT to create an independent and mature system [53]. Conducive to optimizing operations and continuously testing what-if scenarios, the intelligent DT can apply machine learning algorithms to the DT's models and data. This opens the door for predictive maintenance and a more adaptable and efficient production process through plug-and-produce scenarios. In applications such as open pit mines, where a lot of data are entered manually, some design elements are represented as digital shadows in this article. However, the majority of these elements are fully integrated DTs created using programmable logic controllers and automation systems.

Digital-Twin-Enabled Value Lifecycle Management
Companies frequently have to rely on new technologies to find new ways to create company value. The DT may be a key driver in maximizing the potential of the company by providing a variety of use cases. Using all data and information of the organization, the idea of "DT" technology goes beyond merely representing a real thing virtually. It is capable of providing a dynamic process information flow that may be used in a variety of situations that strongly affect company value [48].
The entire lifecycle is presented in Figure 4; the links between data increase data availability, which aims to simplify working in model-oriented and partly automated systems and leads to exploiting the opportunities that could be gained. In the area of quality, developers are able to respond more effectively to consumer behavior when using their product [2]. They can also automatically detect trends in product quality defects or even predict them.
In the area of warranty cost and services, it is worth relying on accurate product descriptions based on the DT [54]. Even after several conversions and maintenance procedures, it is necessary to document the current product configuration at any moment in real time. This enables the company to adapt its service in the best possible way.
Operating costs: When operating a product, manufacturers aim to use as few resources as possible. This is not an easy task, given how many subsystems are produced nowadays. However, to address this issue, DTs can interact with one another to negotiate the ideal production order during manufacturing. This allows additional process flexibility along with simple configuration of the system. We can benefit from DTs not only to synchronize individual machines, but also to enhance the machines' performance. Enhancing the functionality of our systems and achieving a higher level of dependability can be accomplished, for example, by collecting sensor data and conducting analysis with the use of open, networked models and intelligence.
In addition, among the opportunities associated with DT-enabled value lifecycle management are record retention and services, which become difficult with products that undergo continuous improvement and expansion, because every product depends on its configuration and updates, respectively. Therefore, this operation requires continuous tracking of every step of the product lifecycle. Fortunately, the DT provides a detailed description of the product with its current setup in the field and the history of every iteration of the product configuration.
Last come revenue growth opportunities; this step shows the chances for a rise in sales. The DT, in the first place, improves the availability of information, which is valuable because searching for information occupies a large part of working time that could be used for development. As seen, products are made and maintained in a more efficient and improved process. Finally, the DT gives us the chance to create entirely new business models. It is considerably simpler to offer product-based services with the presence of a DT. Many benefits are involved in this step: reducing time-to-market, encouraging new business model development, and reducing the cost of production.

Digital-Twin-Based Collaborative Environment
Considering the DT benefits shown in the previous section, the digital part of the CPS, the DT, is connected to the physical part. Hence, as seen in Figure 5, this constitutes a collaborative environment enabling the setup of the whole DT of the system, including product, production, process, and services. Thus, to expand the DT benefits, Figure 4 enumerates the four services that are involved, as follows:

• Engineering: Improve the efficiency of your engineering procedures so that new products may be produced with insights based on the real-world behavior of existing items, connecting the virtual models with the actual physical objects. By conducting what-if scenarios, the DT helps you to uncover product faults in early product design and development phases, which lowers engineering costs and speeds up design cycles. For instance, client usage is represented in the DT and integrated into the processes of product development and manufacturing. The DT will strengthen the bond between your organization's technical and operational teams by establishing a feedback loop throughout the lifecycle of your product.
• Production: Gain knowledge about the manufacturing processes from the DT for prognostic or diagnostic condition monitoring. For example, plant maintenance might be optimized using this knowledge. The DT's analytical capabilities will increase production effectiveness by predicting faults so that they may be corrected before they have an impact on manufacturing goals. By modifying factors throughout the production process in the DT to enhance utilization, you may simulate different plant plans. DTs assist in detecting quality trend impacts and maintaining quality standards during real manufacturing.
• Operation and process: By optimizing the information flow throughout all of your operational and servicing activities, the DT enables you to understand how to run your shop floor more effectively and efficiently. Performance evaluations may be conducted by the DT to reduce production costs, which is one of its most important capabilities. Gain insight into everything that happens to your items, and your supply chain network will be improved overall. Owners and operators, for instance, may track their fleet of cars, fleet assets, or logistical assets using DTs, as well as improve infrastructures.
• Marketing and go-to-market: Implementing DTs will improve the effectiveness of customer interaction workflows in marketing and sales. Previously, product development essentially came to a stop when a product was handed over to a client. Those times are long gone. With product feedback extending well beyond product delivery into service, complete data management is made possible. Improve your customer validation strategy and aim for top-notch digital marketing. You may be able to identify new business models for the product as a service depending on how the product is used or consumed.
As demonstrated, a DT has a wide range of applications and advantages. The information flow across all business operations is linked by digitization, which is the foundation that enables you to adapt swiftly to changing business situations. As an Industry 4.0 tool, the DT method will link and optimize your company, allowing it to operate to its fullest potential.

Digital Twin Multi-Layer Reference Architecture Framework for the Mining Industry
The proposed conceptual framework architecture presented in Figure 6 is used to characterize the design of a three-level DT that satisfies all of the features of integration shown in Figure 3, considering its traceability back to the requirements, components, and control systems of the physical asset. It includes the two CPS layers, the cyber and physical layers, and three sublayers: the data pre-processing sublayer, the edge computing sublayer, and the cloud sublayer. The details of each sublayer are discussed in the following subsections.

Physical Layer
Before entering the DT architecture, data must first be collected from the open pit mine's physical assets. Data from the mine are gathered through a variety of channels, including sensor readings, videos, photographs, the staff clocking system, electric meters, and many others. Most often, they are time series data [30]. Since these data are constantly changing, the supporting infrastructure must adhere to application-specific standards, such as big data processing or real-time response. These data can be kept in archives, such as specialized time series databases, for diagnosis and model identification. Engineering data are typically static, meaning they do not frequently change over time. Examples include information about a plant's physical structure or its topology. This information must be converted to digital form because it is typically only available in analog representations such as pipe and instrumentation diagram drawings. Even if the data are already in machine-readable form, they still need to be translated and added to the DT as contextual data. The data that are gathered must contain all of the information pertinent to the objective of the DT, such as information on the operation of the equipment, scheduling, and the production environment. In general, mine data are information received from people, machines, objects, and the environment.

Data Pre-Processing Sublayer
To employ data mining techniques, raw data must be transformed into well-formed data sets through the process of data pre-processing. Raw data are typically unformatted and insufficient. Every data analytics project's performance is directly correlated with how well the data were prepared. After data collection is complete, the data proceed through this sublayer of the DT architecture, which applies several operations: data cleaning, which comprises adding missing values or removing rows with incomplete data, reducing noise in the data, and resolving data discrepancies, and data integration, which entails reconciling data conflicts by merging data with different representations.
On the other hand, when the amount of data is high, databases may become slower, more expensive to access, and more challenging to store. Data reduction aims to provide a condensed version of the data in a data warehouse.
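The three pre-processing operations above can be sketched in a few lines. The following is a minimal Python illustration under simplifying assumptions (mean imputation for cleaning, a moving average for noise reduction, and plain downsampling for data reduction); the function names are ours:

```python
from statistics import mean

def clean(series):
    """Data cleaning: replace missing readings (None) with the mean
    of the observed values."""
    observed = [x for x in series if x is not None]
    fill = mean(observed)
    return [fill if x is None else x for x in series]

def smooth(series, window=3):
    """Noise reduction: simple trailing moving average."""
    return [mean(series[max(0, i - window + 1):i + 1])
            for i in range(len(series))]

def downsample(series, factor=2):
    """Data reduction: keep every `factor`-th sample to produce a
    condensed version of the data."""
    return series[::factor]
```

For instance, `clean([10.0, None, 12.0, 11.0])` fills the gap with the mean of the observed values (11.0), and `downsample([1, 2, 3, 4, 5])` returns `[1, 3, 5]`.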

Edge Computing Sublayer
Edge computing is processing that operates close to the data source or end user of a system, depending on where the information is coming from or going. Edge architecture enables faster processing by reducing latency and lag. Applications and programs based on the edge can function more rapidly and effectively, improving user experience and overall performance. Edge computing is a distributed information technology (IT) architecture that processes client data at the network's edge, as near to the source as is practical. The following benefits of edge computing follow from this characteristic:

• Quick data processing and analysis in real time: The edge computing approach does not upload data to a cloud computing platform, instead storing and processing it on edge devices. The rapid rise in data volume and the strain on network capacity are drawbacks of cloud computing. Edge computing offers an advantage over regular cloud computing in terms of response time and real-time performance. Since the edge computing node is located closer to the data source, it may perform computing and data storage functions locally, reducing the need for intermediate data transmission. It emphasizes being close to users and offering them superior intelligent services, which enhances data transmission performance and ensures real-time processing while decreasing wait time. In realms such as automatic driving, intelligent manufacturing, video surveillance, and other forms of location awareness, quick feedback is crucial. Edge computing offers users a range of fast-response services.
• Security: Traditional cloud computing, a centralized processing technique, demands that all data be uploaded to the cloud for unified processing. Risks such as data loss and leakage are present during this procedure; thus, security and privacy cannot be guaranteed. Account passwords, past search history, and even trade secrets can all be exposed. Edge computing can guarantee data security since it exclusively handles the tasks within its purview, relies on local processing rather than uploading to the cloud, and avoids the dangers associated with network transmission. When data are attacked, just the local data are impacted, not all data.
• Low cost, low bandwidth cost, and low energy cost: Since edge computing does not require data to be uploaded to a cloud computing facility for processing, there is a reduction in the demand on the network's bandwidth as well as a significant reduction in the energy consumption of intelligent devices at the network's edge. Edge computing is "small-scale", and, in practice, businesses can lower the cost of processing data on local hardware. Edge computing hence reduces the volume of data carried across the network, lowers transmission costs and the demand on network capacity, lowers the energy consumption of local equipment, and increases computing efficiency.
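The bandwidth saving comes from keeping the full data stream local and forwarding only what the cloud actually needs. As a hedged sketch (the band-filter policy and the `edge_filter` name are illustrative assumptions, not part of the framework), an edge node might forward only out-of-range readings:

```python
def edge_filter(readings, low, high):
    """Process readings locally at the edge; forward to the cloud only
    those outside the expected [low, high] band, cutting the volume of
    data carried across the network."""
    local_log, to_cloud = [], []
    for r in readings:
        local_log.append(r)       # the full history stays on the edge device
        if not (low <= r <= high):
            to_cloud.append(r)    # only anomalies cross the network
    return local_log, to_cloud
```

With `edge_filter([20, 21, 95, 19], 0, 50)`, all four readings are logged locally but only the anomalous `95` is transmitted, a 75% reduction in this toy case.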

Cloud Computing Sublayer
IT has advanced, and cloud computing has become a key business model for providing IT resources. With cloud computing, people and businesses can access managed and customizable IT resources such as servers, storage, and services via a shared network on demand. As a result, industrial dynamics are accelerated, established business models are disrupted, and the digital revolution is fueled. It also supplies the infrastructure that has supported key digital developments such as the Internet of Things, big data, and artificial intelligence. Overall, cloud computing offers a wide range of advantages and possibilities [30].
Located at the top of the cyber layer of the CPS is the final sublayer of the proposed DT framework, the cloud computing sublayer. It is distinguished by housing the cloud database, the main database that receives data from the layer below for storing and feeding the virtual mine and its optimization and predictive data applications, such as predictive production and maintenance scheduling, process optimization, supply chain management, and predictive shift-work scheduling. These applications will also interface with virtual models that are present locally on the edge computing sublayer.
In particular, the edge computing layer's online defect detection and online process quality control for each component and machine station would interact with the cloud layer's predictive maintenance application to create a general predictive maintenance plan for the mine. To prevent models from being lost if an edge computing application fails, models created in the edge computing layer must be duplicated in the cloud layer. Furthermore, the cloud layer is in charge of developing and refining algorithms and models at both the edge computing and cloud levels. Regular updates and optimizations of the virtual models' algorithms will take place in the cloud before being sent back to the edge, ensuring that the edge layer always has the most recent versions of both the models' algorithms and the models themselves.
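This cloud-to-edge update loop amounts to a versioned model registry in the cloud from which edge nodes pull the latest copy. A minimal sketch under our own assumptions (the `ModelRegistry` and `EdgeNode` classes are illustrative, not an interface defined by the framework):

```python
class ModelRegistry:
    """Cloud-side store of model versions; refined models are published
    here and pulled by the edge so it always runs the newest algorithms."""
    def __init__(self):
        self.models = {}  # model name -> (version, payload)

    def publish(self, name, version, payload):
        """Accept a new model only if its version is newer than the stored one."""
        current = self.models.get(name)
        if current is None or version > current[0]:
            self.models[name] = (version, payload)

    def latest(self, name):
        return self.models.get(name)

class EdgeNode:
    """Edge-side cache of models; duplicates of these also live in the
    cloud registry so nothing is lost if the edge application fails."""
    def __init__(self):
        self.cache = {}

    def sync(self, registry, name):
        """Pull the newest model version from the cloud registry."""
        entry = registry.latest(name)
        if entry is not None:
            self.cache[name] = entry
        return self.cache.get(name)
```

Publishing version 2 of a model and calling `sync` then replaces the edge's stale version 1 copy, while a late re-publish of version 1 is ignored.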
The main database located in this sublayer feeds the DT by updating data coming from the optimization and predictive data applications, exploiting the three major services that constitute the core of the DT.

Co-Simulation Service
To design ever-more-complex systems under increasing market pressure, it is critical to find new techniques that make it possible for specialists from many disciplines to interact more effectively. Using a heterogeneous-model-based approach could be a method for solving this problem. Different teams could create their models and conduct mono-disciplinary analyses, but they could also couple their models together for simulation (co-simulation), which would make it possible to study the system's overall behavior [30].
At the cyber layer, the simulation service located at the edge computing sublayer relies on the synthesized data database, which is fed by the monitoring data coming from the applications performed at this level, such as online defect diagnosis, online energy management, and online process quality control. The virtual model operates through the simulation service, which allows one to run what-if simulations of the concerned physical entity with continuous monitoring data updates. In addition, the co-simulation service at the cloud sublayer, located at the heart of the DT, allows one to constitute the virtual mine through the simulation of each model belonging to the physical twin, simultaneously with a virtual interconnection between the models; thus, the co-simulation service supports a multidisciplinary simulation that may be exploited to replicate the entire process flow of the physical side of the CPS. Therefore, a simulation of the real interactions between the models represented by the virtual mine is needed to realistically simulate the overall system, both the physical assets and the physical interactions. Consequently, the co-simulation is dynamic, which means that it can incorporate additional models and run their simulations at run-time.
Additionally, we distinguish between two types of co-simulation approaches: domain-independent and domain-specific. Domain-independent approaches can be applied regardless of the application domain. On the other hand, a variety of domain-specific co-simulation approaches exist, which are used for the simulation of a specific field; for instance, Mosaik, EPOCHS, and ADEVS are simulation tools used in the electric power grid field.

• Co-simulation based on the functional mock-up interface (FMI): The functional mock-up interface (FMI) offers the ability to interchange models between simulation programs in addition to providing a standard for co-simulation. Simulation tools that do not support FMI cannot be incorporated into the co-simulation, since all employed simulation tools are required to support FMI. Likewise, the tools only communicate at specific times while operating independently of one another in between. The data exchange and slave simulations are synchronized by a master algorithm, with FMI allowing for configurable time steps between two synchronization points. Each simulation tool in a co-simulation must be represented by a functional mock-up unit (FMU) that implements the simulation tool's interface. FMI is not suited for a dynamic co-simulation with "plug-and-simulate" features, since its master algorithm requires input from each slave simulation before beginning the simulation [55].
• Co-simulation based on high-level architecture (HLA): The United States Department of Defense created the high-level architecture (HLA) as a distributed and parallel simulation architecture federating many simulators, together with the run time infrastructure (RTI), a hub for federate coordination; a co-simulation with HLA is known as a federation. An object model template defines the information that may be shared between the federates, and an interface specification defines the interfaces between the federates and the RTI. There are also HLA rules that a simulator must adhere to in order to be considered HLA-conformant. The RTI may be thought of as the simulation master in charge of synchronizing the federates. Although HLA permits the dynamic entry of federates during run-time, a new federation agreement, which is domain- and use-case-specific, must be prepared for each federate. Several RTI implementations are available, both paid and free [56].

• Co-simulation based on the open services gateway initiative (OSGi): The open services gateway initiative (OSGi) is a Java-based framework created for building applications made up of dynamically combinable and reusable components. The implementation of the components in OSGi is isolated from other components, which decreases complexity, and the components communicate with one another via services. The framework may load, delete, swap, or update the bundles that serve as representations of the components at run-time. These bundles may be located on a single computer or spread across a number of machines. A dynamic co-simulation is made possible by the dynamic exchange of bundles. Each simulation is represented by a bundle that is linked to the simulation by a simulator coupler. The simulator coupler makes use of OPC and permits data exchange and synchronization between the simulations. Another option is to directly incorporate a model into an OSGi bundle. For switching simulators during run-time, a separate bundle is needed, in which the states of the withdrawn simulators are maintained to allow for their re-entry.
• Co-simulation based on OPC unified architecture: The OPC Foundation created OPC UA, a machine-to-machine communication standard that is service-oriented and enables the exchange of process data and their machine-readable description.
The reference implements a co-simulation with OPC UA in which each simulator is connected to a generic adapter via an interface that houses both an OPC-UA server and an OPC-UA client. This adapter uses OPC UA to connect to a central server that also has an OPC-UA server and a client. Each simulator must be registered on the central server, with the first simulator to do so becoming the simulation master and the following simulators becoming the simulation slaves. The co-simulation is coordinated and synchronized by the master. If the co-simulation master exits, another simulator assumes control of the co-simulation.
• Agent-based co-simulation: The concept of multi-agent systems was chosen to couple the simulation tools, since software agents can join and leave multi-agent systems at any time during run-time, are domain-independent, have no restrictions during any stage of the lifecycle, and have the ability to add intelligence on top of the models [56].
Each IoT component is modeled in its own simulation, possibly using multiple simulation tools, and each simulation is linked to and embodied by an agent, as shown in Figure 7. It is feasible to interchange the models at run-time by simulating each IoT component separately and linking them to agents, because the agents can enter and exit the multi-agent system at any moment, just like IoT components can enter and exit the IoT system. Through agent-to-agent communication, the agents mediate the interaction between the models. For illustration, if a heating unit model asks for the temperature value of a temperature sensor model, the agent associated with the heating unit model sends this request to every other agent, which then forwards it to its corresponding model. The models then determine for themselves whether the received message is relevant before responding appropriately. A notion of an interface has to be established in order to facilitate communication between the agents and their respective simulation tools.
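The broadcast pattern described above, where an agent forwards a request to every other agent and each model decides for itself whether to answer, can be sketched as follows. This is a minimal Python illustration under our own simplifying assumptions (models are plain dictionaries; class and method names are ours):

```python
class Agent:
    """Wraps one simulation model; can join or leave the system at run-time."""
    def __init__(self, name, model):
        self.name, self.model = name, model

    def handle(self, request):
        # The model decides for itself whether the request is relevant;
        # an irrelevant request yields None and is simply not answered.
        return self.model.get(request)

class MultiAgentSystem:
    def __init__(self):
        self.agents = []

    def join(self, agent):
        self.agents.append(agent)

    def leave(self, agent):
        self.agents.remove(agent)

    def broadcast(self, sender, request):
        """Forward a request from one agent to every other agent and
        collect the replies of the models that consider it relevant."""
        replies = {}
        for a in self.agents:
            if a is not sender:
                answer = a.handle(request)
                if answer is not None:
                    replies[a.name] = answer
        return replies
```

In the heating example, the heater's agent broadcasts `"temperature"`, only the sensor's model answers, and removing the sensor agent at run-time simply makes later broadcasts return no replies, mirroring an IoT component leaving the system.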

Operation Data Service
A DT must be able to collect a physical asset's operating data, as well as process and analyze them. This serves as the foundation for knowledge extraction, which will make the DT intelligent and enable it to perform a variety of aid activities including accurate diagnosis and prediction. Operation data collection, pre-processing, and semantic annotation serve as the basis for this. A significant difficulty in industrial implementation is the capacity to sample process data very precisely and to assign them to the correct digital replica in order to enrich the DT of a CPS with operation data.
The DT must have up-to-date asset operating data in order to appropriately represent the behavior and status of the asset. Examples include both sensor data, which are continually streamed and recorded, and control data, which define the physical component's present condition and are likewise documented throughout the asset's entire lifecycle. New orders and other company information may also be stored here. This is done through the interface for data acquisition, a structural element of the DT. A database that stores and processes this kind of data is referred to as operation data. These data might be used by both live and offline algorithms to enhance knowledge of the asset and improve the relevant models.
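The operation data service described above boils down to ingesting time-stamped sensor and control records per asset and letting live or offline algorithms query them. As a hedged sketch (the `OperationDataService` interface is a hypothetical illustration, not an interface defined by the framework):

```python
import time

class OperationDataService:
    """Stores time-stamped sensor and control records per asset so that
    both live and offline algorithms can query them later."""
    def __init__(self):
        self.records = {}  # asset id -> list of (timestamp, kind, value)

    def ingest(self, asset_id, kind, value, timestamp=None):
        """Record one observation; `kind` distinguishes sensor vs control data."""
        ts = time.time() if timestamp is None else timestamp
        self.records.setdefault(asset_id, []).append((ts, kind, value))

    def history(self, asset_id, kind=None):
        """Return all records for an asset, optionally filtered by kind."""
        rows = self.records.get(asset_id, [])
        if kind is None:
            return rows
        return [r for r in rows if r[1] == kind]
```

An offline model-improvement algorithm would call `history("truck-1", "sensor")` to retrieve only the streamed sensor readings, while a lifecycle audit would fetch the full record list.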

Synchronization Service
The anchor point method can be useful in identifying changes taking place in the physical production system and in examining the connections between the modified components within it, in order to synchronize cross-domain models of the DT.
The anchor point method consists of three principal phases: automatic change detection, relational analysis, and change management. The first phase relies on importing the control code at two different times in order to perform a comparison of the models [57]. Two versions of the system's control code are imported from a repository in the first step. These versions are saved at different times, the first immediately upon system commissioning and the second in the present day.
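The automatic change detection phase can be pictured as a diff between the two imported snapshots. A minimal sketch under our own assumptions (each snapshot is represented as a mapping from component name to its control-code version; the `detect_changes` function is our illustration, not the method's actual implementation):

```python
def detect_changes(version_at_commissioning, version_now):
    """Phase 1 of the anchor point method: compare two snapshots of the
    control code and report added, removed, and modified components."""
    old, new = version_at_commissioning, version_now
    added    = sorted(set(new) - set(old))
    removed  = sorted(set(old) - set(new))
    modified = sorted(k for k in set(old) & set(new) if old[k] != new[k])
    return {"added": added, "removed": removed, "modified": modified}
```

The resulting change sets would then feed the relational analysis phase, which examines how the modified components are connected.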

Features to Consider for Designing Digital Twin Frameworks
Digital twins have numerous benefits, including the ability to simulate real-world behavior, monitor performance, and optimize design. Within the mining industry, the use of DTs can improve the efficiency and safety of operations by enabling engineers to analyze and optimize mine designs, predict and prevent potential problems, and monitor the performance of mines in real time. There are various frameworks available for creating DTs in the mining industry, each of which may have its own set of tools and methods for modeling and simulating mine behavior, integrating data from various sources, and visualizing results. When comparing the architecture of different DT frameworks for the mining industry, it is important to consider the following factors:

• The level of detail and complexity of the models: DT frameworks may vary in the level of detail and complexity of their models, which can impact their ability to capture a wide range of physical and operational characteristics of a mine [58,59].
• The type of data sources and integration methods: Different DT frameworks may utilize different types of data sources and integration methods, such as sensors, simulations, and historical data, and may have varying levels of interoperability between these sources [60].
• The visualization and analysis tools: The quality and flexibility of the visualization and analysis tools provided by a DT framework can significantly influence its usability and effectiveness [61].
• The scalability and performance of the system: As the size and complexity of a mine increase, the scalability and performance of the DT framework may become increasingly important considerations [62].
• The cost and resources required for implementation and maintenance: The cost and resources required for implementing and maintaining a DT framework can vary greatly depending on the system's complexity and scope [63].
• Customizability and adaptability: Some DT frameworks may offer a higher degree of customizability and adaptability to meet the specific needs and requirements of a mine, while others may be more rigid in their capabilities [64].
• Ease of use and user experience: The usability of a DT framework can greatly impact its adoption and effectiveness. Some frameworks may be more intuitive and user-friendly, while others may require specialized training or expertise [65].
• Integration with other systems and platforms: A DT framework that can easily integrate with other systems and platforms used by a mine, such as enterprise resource planning (ERP) systems and asset management systems, may be more effective [66].
• Security: DT frameworks may handle sensitive data and intellectual property, so it is important to consider the security and privacy measures in place to protect this information.
• Support and maintainability: The level of support and maintenance provided by the developer or vendor of a DT framework can be an important factor in its long-term viability and reliability [66].

Proposed Digital Twin Framework Advantages
The proposed DT framework architecture aims to provide a roadmap towards the design of DT-enabled asset lifecycle management with the aim of answering the challenges faced in the mining industry. With a focus on the mining industry, it is essential to have a comprehensive understanding of the system and its interconnections. Barricelli et al. [67] surveyed the definitions of DTs and their main characteristics. The process of developing a DT is complex and involves the integration of various technologies. Leng et al. [68] studied how manufacturing systems have evolved from Industry 1.0 to the current Industry 4.0, focusing on the use of DTs in this process. They examined the design steps and the technological advancements that have taken place over time. There are many relevant conditions to consider when it comes to the mining industry, such as the collaboration of different systems, the creation of intricate models, the use of various technologies, the long distances that can cause latency and connection delays, and the large volume of heterogeneous data generated. The paragraphs below represent our contribution and discuss the integrity of the framework combining these challenges.
Steindl et al. [51] enumerate various architectures and frameworks in order to create a technology-agnostic generic DT architecture consistent with the reference architecture model Industry 4.0 (RAMI 4.0), which is the main subject of our work. Through the proposed DT framework architecture, we provide a multi-layered architecture that links the physical layer and the cyber layer, considering the three sublayers inside the cyber layer of the CPS, which are, respectively, the pre-processing sublayer, edge computing sublayer, and cloud computing sublayer. To deal with the large volume and heterogeneity of the gathered data, several techniques are presented in the pre-processing sublayer: data cleaning, data reduction, and data integration. In addition, the framework affords an architecture based on edge-cloud computing collaboration, manifested by two sublayers named, respectively, the edge computing sublayer and the cloud computing sublayer.
For real-time monitoring and online diagnosis, edge computing has potential applications due to its ability to process data closer to the source, potentially leading to faster and more efficient analysis.
However, the cloud computing sublayer can be utilized to enhance the performance of optimization and prediction algorithms due to its ability to offer scalable and flexible computing resources.This can enable the algorithms to operate more efficiently and effectively.
Optimization algorithms are used to determine the optimal solution to a problem, often by minimizing or maximizing an objective function. These algorithms can be resource-intensive and may require large amounts of data to be processed. By utilizing cloud computing resources, it is possible to increase the computational power and storage capacity as needed, allowing the optimization algorithm to operate more efficiently and potentially find better solutions.
Prediction algorithms, also known as machine learning models, are utilized to make predictions or forecasts based on data.These algorithms often require a large amount of data for training and may need to make predictions in real time.By utilizing cloud computing resources, it is possible to quickly process and analyze the data, potentially leading to more accurate predictions.
In summary, the use of the cloud computing sublayer for hosting the optimization and prediction algorithms can provide the required computational resources and allow the algorithms to operate more efficiently. Furthermore, the cloud computing sublayer also hosts the virtual mine, which represents the core of the DT framework architecture, enhanced by the operation data service that provides access to data related to the operation of a system or process. These data can be used to feed the DT, allowing it to accurately represent the current state and behavior of the physical entity.
The co-simulation service allows the simultaneous simulation of multiple models of systems or processes. This is useful for analyzing the interactions between different components of a system, or for testing the behavior of a system under different scenarios. The synchronization service ensures that data are kept up to date and consistent across different systems or processes, for example so that the DT accurately reflects the current state of the physical system, or so that the operation of multiple systems can be coordinated.
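The synchronization service's core idea, keeping the twin consistent with the newest physical-side state, can be sketched with timestamped updates. This is a toy model, not the framework's actual service; the field names are hypothetical.

```python
class SyncService:
    """Toy synchronization service: the twin accepts an update for a
    field only if its timestamp is newer than what the twin already
    holds, keeping the digital state consistent with the latest
    physical-side readings even when updates arrive out of order."""

    def __init__(self):
        self.twin = {}  # field -> (timestamp, value)

    def push(self, field, timestamp, value):
        current = self.twin.get(field)
        if current is None or timestamp > current[0]:
            self.twin[field] = (timestamp, value)
        # stale (older-timestamped) updates are silently discarded

    def state(self):
        """Current twin state, without timestamps."""
        return {f: v for f, (_, v) in self.twin.items()}
```

Production synchronization would also handle conflict resolution, clock skew, and bidirectional flow (twin-to-asset commands), but last-writer-wins on timestamps captures the consistency goal described above.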
It has been suggested that by utilizing a combination of services, the accuracy and functionality of the DT could be improved, allowing it to represent the behavior and performance of the physical system more closely.
With this DT framework architecture, we aim to suggest a generic development method for designing DT-enabled asset lifecycle management specifically for the mining industry. Hribernik et al. [69] conducted a survey of the necessary properties and requirements for a DT and then presented a roadmap towards the development of DTs that are autonomous, context-aware, and adaptive. Nevertheless, the promising benefits of using DTs in industry are still limited by the lack of DT development methods, as the authors point out in their review [70].
Several factors should be taken into account when developing a digital twin framework for the mining industry: the degree of detail and complexity of the models; the types of data sources and integration techniques; the visualization and analysis tools; scalability and performance; cost and resources; customizability and adaptability; ease of use and user experience; integration with other systems and platforms; security; and support and maintainability. With a focus on the mining sector, the suggested digital twin framework architecture offers a multi-layered framework that connects the physical layer and cyber layer, combining data cleaning, reduction, and integration techniques with edge-cloud computing cooperation. Cloud computing can improve the performance of optimization and prediction, whereas edge computing is suited to online diagnostics and real-time monitoring. Despite certain challenges, the incorporation of DTs has the potential to bring numerous benefits across a range of applications. As a result, various industries are exploring ways to adapt their operations to future trends and to evaluate the long-term viability of their business models over the asset lifecycle.

Conclusions and Future Work
Driven by the rising demand for individualized and customized products, production is becoming increasingly complex in structure. Impacting all product lifecycle phases, digitization is the key to a transformation process that can lead to many opportunities, enabling numerous engineering and management applications such as the DT. The current research aims to take full advantage of DT capabilities in response to the challenges faced in the mining industry. The goal of this paper is to propose a generic DT architecture framework for the sustainable mining industry by adopting the Reference Architecture Model Industry 4.0 (RAMI 4.0) and enabling asset lifecycle management, discussing the outcomes and potential of DT technology. This framework establishes a collaborative environment that joins the two sides of the cyber-physical system, integrating the four services involved: engineering; production; process and operation; and marketing and go-to-market.
Motivated by the benefits and opportunities enabled by the DT approach, future work will design a proof of concept of this framework by developing an industrial DT case study that facilitates the achievement of a sustainable mine, mirroring asset lifecycle management and ensuring that process consistency can be simulated and tested across the lifecycle phases.

Figure 2. Digital twin reference architecture model in Industry 4.0.

Figure 6. Proposed digital twin multi-layer architecture framework for the mining industry.

Table 1. Digital twin applications in different industrial sectors.