A Roadmap to Integrate Digital Twins for Small and Medium-Sized Enterprises

Abstract: In the last decade, Australian SMEs have steadily become more digitally engaged, but they still face barriers to fully adopting Industry 4.0 (I4.0). Among the tools that I4.0 encompasses, digital twin (DT) and digital thread (DTH) technologies hold significant interest and value. Some of the challenges are the lack of expertise in developing the communication framework required for data collection, processing, and storage; concerns about data and cyber security; lack of knowledge of the digitisation and visualisation of data; and value generation for businesses from the data. This article aims to demonstrate the feasibility of DT implementation for small and medium-sized enterprises (SMEs) by developing a framework based on simple and low-cost solutions and providing insight and guidance to overcome technological barriers. To do so, this paper first outlines the theoretical framework and its components, and subsequently discusses a simplified and generalised DT model of a real-world physical asset that demonstrates how these components function, how they are integrated and how they interact with each other. An experimental scenario is presented to transform data harvested from a resistance temperature detector sensor connected with a WAGO 750-8102 Programmable Logic Controller for data storage and analysis, predictive simulation and modelling. Our results demonstrate that sensor data can be readily integrated from Internet-of-Things (IoT) devices into enabling DT technologies, allowing users to view real-time data and key performance indicators (KPIs) in the form of a 3D model. Data from both the sensor and the 3D model are viewable in a comprehensive history log through a database. Via this technological demonstration, we provide several recommendations on software, hardware, and expertise that SMEs may adopt to assist with their DT implementations.


Introduction
Industry 4.0 (I4.0) is rapidly evolving, and it has the potential to transform collaboration and operational efficiency of assets throughout the entire product lifecycle management (PLM) phase. The focus of I4.0 is to integrate the physical product and process with information technologies and the internet for growth and sustainable competitiveness [1][2][3].

Digital Twin and Industry Sector
The sectors referred to in this article are those largely related, but not limited, to engineering and manufacturing. Major companies such as Tesla, General Electric and Siemens have adopted the concepts of I4.0 by integrating physical production and operation with digital twin (DT) models, smart technology, artificial intelligence (AI) and big data to ensure better connectivity and continuity of workflow, improving productivity, efficiency and competitiveness. In fact, the DT paradigm continues to reinvent business models, opening doors to new and additional revenue streams. For example, Tesla builds a DT model of every vehicle manufactured, collecting data from the sensors of each vehicle and integrating these data into the DT's simulation. The DT data have enabled Tesla to remotely monitor the state of each vehicle and perform maintenance via software updates [1,2]. Furthermore, the combination of AI, machine learning and big data has allowed Tesla to create a system of dense traffic data that will be used in the development of autonomous vehicles [1,2]. This is a potential new product, an upgrade of the existing product, and a new avenue for increased revenue. General Electric (GE) has embraced the DT model for its benefits of increased reliability, reduced risk, lower maintenance and faster time to value. The company claims to have saved $1.6 billion for customers through DT capabilities [3]. Rolls-Royce, an engineering company known for its motor cars, has also implemented DT technology into its turbine design, testing and maintenance [4]. However, Rolls-Royce has gone beyond simply improving turbine design and has innovated its entire business model. Whereas previously Rolls-Royce participated in the construction and selling of engines in the aviation industry, it now operates under a business model of selling 'power by the hour' [5]. Under this model Rolls-Royce maintains ownership of the engine and instead sells thrust hours to the airlines.
As production and manufacturing processes have evolved and transitioned to become increasingly digital, the concept of a DT, which is a virtual representation of the as-designed, as-planned, as-built and as-maintained physical product, is key to supporting product development [6]. A DT is commonly referred to as an 'integrated multiphysics, multi-scale, probabilistic simulation of a complex product, which functions to mirror the life of its corresponding twin' [7]. Hicks [8] defined a DT as an "appropriately synchronised body of useful information (structure, function, and behaviour) of a physical entity in virtual space, with flows of information that enable convergence between the physical and virtual states".
In the machine tool sector where accuracy in the machined geometry is sought, DT has been used for error mapping simulation with the use of three axis milling machine tools [9]. In the tool making industry, even the accurate method of wire electro discharge machining is being improved by computer simulation [10]. The authors reported that although experimental modelling of wire deformation provided a slight increase in accuracy over computer simulation, computer simulation modelling was far more practical in terms of time and cost savings.
In addition, DT models have been deployed in other sectors such as education [11], where DT has been implemented to aid electrical engineering students in diagnosing electrical faults in a controlled environment. This has given the students the opportunity to apply their learning to situations where it would otherwise not be safe to do so, while also building versatility in tackling unexpected occurrences. Another example is service engineering [12], where DT has enabled the virtual commissioning of projects such as plants, production lines, control systems and complete robot systems. DT models have also been used in healthcare [13], and with continued effort and development, medical DTs can change the way personal health is approached. With increased data for diagnostics, the optimum diagnosis and treatment can be found, which could save lives. Furthermore, DT in the agriculture and food industry has been proposed in [14], which suggests that the benefits of implementing DT in the 'Agrifood' industry range from improved quality and shelf life to cost reduction, minimised losses and improved logistics.
From our review of the reported studies discussed briefly above, it is apparent that engineering and manufacturing are the fields in which DT and its enabling technologies are most developed, with the most sophisticated DT models. These industries and companies are those with the largest resources. Industry sectors such as Agrifood and machining and tooling are considering upgrading their existing systems and investigating processes to develop systems linking the physical and digital spaces.

Barriers and Challenges in Digital Twin Transformation
The concept of having a DT that truly replicates its physical twin in practice raises practical challenges in data acquisition, system integration, data visualisation and value generation from data [15]. These challenges have been echoed by researchers, such as those in [16], who created a DT of a cutting tool using a 'tweeting' machine that sends information between a production line and other consumers. They acknowledged that modern production lines may not have this specific equipment. Furthermore, for their DT to work, they needed to modify the existing hardware and software to improve connectivity. Hicks [8] stated that the challenge of DT development is the data harvesting and data usage of the physical asset such that there is a continuous or periodic 'twinning' for the digital asset to mirror the condition of its physical counterpart. In the case of big data in [17], the question of how to prepare or process the vast amount of data harvested was raised.
Furthermore, the DT development challenges can be specific to individual circumstances. For example, He et al. [18] have raised concerns about the wireless connection, specifically signal processing in their research to integrate DT technology with surveillance systems. They noted that the current work of integrating non-intrusive sensors with existing systems is costly, difficult for monitoring the large number of targets and is associated with other issues such as low quality of raw data due to hardware imperfections.
At the disposal stage of the product lifecycle in waste electronics, Wang and Wang [19] noted the connectivity and integration issues of data after the product is sold to the end user. More specifically, they noted a gap at the end of the product lifecycle, the disposal stage, where harvested data was largely absent.

Enabling Digital Twin Technology
Rasheed et al. [20] summarised the enabling technologies that can address these challenges into five major categories, namely: 1. physics-based modelling, 2. data-driven modelling, 3. big data cybernetics, 4. infrastructure and platforms, and 5. human-machine interface. For example, data management, safety, and security can be addressed by data-driven modelling and infrastructure and platforms, specifically cryptography, blockchain and big data technologies. Signal processing, real-time communication and latency are addressed by data compression and communication technologies such as 5G and the Internet of Things (IoT). Connectivity issues and data collection after the sale of the product can be improved with the infrastructures and IoT technologies used for environmental monitoring.
As such, information and communications technology (ICT) is essential to support the DT transformation. ICT emphasises the unified communication of information through the integration of telecommunication, computers and devices, enterprise software, and storage, enabling all stakeholders (both inside and outside the companies) to access, transmit and use the required information.

Cyber-Physical System (CPS)
CPS is typically referred to as the close integration of human-made systems in the physical space with the computation, communication and control systems of the digital space [21]. This creates a network of sensors and actuators that are embedded in equipment and machine parts and connected to the internet to monitor and control physical processes and systems. This is accomplished via electronic devices and communication protocols embedded within a mechanical or electrical system intended to perform specific functions [22]. Monostori and colleagues [21] outlined several case studies of CPS, one of which has direct implications for machine tooling. The authors noted the problem of chattering at the cutting edge during the milling process. However, integrating sensors within the tool to detect chattering allowed for adjusting relevant process parameters, resulting in an improved surface quality and an increase in tool life. The authors further stated that sensors can be fitted to the clamping device and thereby allow the machine to adapt.

Computer and Communication Networks
Network communications refer to the wireless and internet technologies that serve to link machines, work products, systems and people, both within the manufacturing plant and with original equipment manufacturers (OEMs) and suppliers. For network communications to occur, protocols that define the rules of communication between hardware and software must first be established. Three industry-wide communication protocols, namely Modbus, OPC UA and MQTT, provide a path for machine-to-machine communications that can meet the emerging needs of industrial automation. These protocols were considered, and Table 1 summarises their advantages and disadvantages.
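To make the machine-to-machine layer concrete, the sketch below builds the request frame a Modbus TCP master would send to read holding registers from a device such as a PLC. This is an illustrative Python sketch using only the standard library; the helper name and register addresses are our own, not taken from any particular device or library.

```python
import struct

def modbus_read_holding_registers(transaction_id: int, unit_id: int,
                                  start_addr: int, quantity: int) -> bytes:
    """Build a Modbus TCP 'Read Holding Registers' (function 0x03) request.

    The MBAP header carries the transaction id, the protocol id (always 0),
    the length of the remaining bytes and the unit id, followed by the
    function code and the register range -- all big-endian.
    """
    pdu = struct.pack(">BHH", 0x03, start_addr, quantity)
    mbap = struct.pack(">HHHB", transaction_id, 0x0000, len(pdu) + 1, unit_id)
    return mbap + pdu

# Request two registers starting at address 0 from unit 1.
frame = modbus_read_holding_registers(1, 1, 0, 2)
print(frame.hex())  # 000100000006010300000002
```

The 12-byte frame would normally be written to a TCP socket on port 502; the point here is simply that Modbus requests are compact, fixed-layout messages, which is why the protocol suits constrained field devices.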

Visualisation and Analytics
Grafana [29] is an open-source visualisation and analytics web application that is used to create visual tools such as graphs from time-series data. This is done through the creation of dashboards that consist of multiple panels. Each panel is configured to display data in a desired way. The display options range from graphs to heatmaps, histograms and more. The data are collected from databases, many of which Grafana supports natively, such as MySQL, InfluxDB and Prometheus; Grafana has plugins for databases that lack native support. As a DT developer, the benefits of Grafana as a data visualisation tool are two-fold. Firstly, the dashboards created by Grafana automate the collection, management and viewing of data, thereby improving operational efficiency. Grafana also contains alert functions, which trigger for anticipated scenarios and can notify the user via communication platforms such as Slack. Secondly, at the management level, having information presented in a visually understandable manner aids in understanding complex scenarios and improves decision making.
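For illustration, a Grafana panel backed by a MySQL data source is driven by an ordinary SQL query. The sketch below is hypothetical (the table and column names are placeholders) and uses Grafana's built-in time macros to align rows with the dashboard's time axis and selected time range:

```sql
-- Hypothetical time-series panel query for a MySQL data source.
-- $__time() aliases the column for Grafana's time axis;
-- $__timeFilter() restricts rows to the dashboard's time range.
SELECT
  $__time(time),
  T1 AS "temperature"
FROM readings
WHERE $__timeFilter(time)
ORDER BY time
```

A panel of this kind refreshes automatically, so once the database is populated no further effort is needed to keep the view current.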

Rationale of This Article
The overarching theme of the relationship between DT and PLM is the data and the Digital Thread (DTH). At each PLM phase, data are harvested, communicated between all stakeholders, stored and accessed to enable informed managerial decisions. The DTH links information generated at each stage of the product's lifecycle in order to enhance present and future decision making [36]. It aims to be the 'authoritative data' that contains every iota of data known about the system at that point in time that is both current and complete [37]. Mondolla et al. [38] proposed the use of a supply chain DT to address a specific supply chain issue in the aircraft industry relating to the traceability and transparency of operations information. The authors presented a case study of a DT of the supply chain, enabled through blockchain technology in conjunction with additive manufacturing. The supply chain DT would serve as the historical data source providing reference and transparency. Liu et al. [39] proposed a cloud-based DT that communicates with distributed edge DTs, i.e., DTs that are at different stages of the product lifecycle, for metal additive manufacturing data management. They developed a conceptual framework that consists of a number of DTs, known as edge DTs, that each represents a segment of the product lifecycle, ranging from product design to manufacturing and quality measurement. At each stage of the product lifecycle, the edge DTs perform their designated tasks, which include device control, analysis, report generation and data storage. The edge DTs are further managed by a cloud-based DT which harvests edge data and applies advanced data analytics.
Although the individual components of the DT, e.g., sensors, computational hardware, modelling and analytical software and databases, are readily available, albeit sometimes requiring modifications, the link between them still requires some development. In other words, the DTH that collects and shares the harvested data over the product lifecycle is not yet built to the point that it is a 'one-size-fits-all' model. This applies to DT modelling as well, where each DT model must be built from the ground up; no standard method has yet been developed. The majority of sophisticated DTs can be found in larger companies such as GE and Siemens due to the fact that integrating such components poses a large challenge [40]. Small and medium-sized enterprises (SMEs) looking to undertake this digital transformation may face obstacles regarding specific steps, expectations, costs and the learning curve. According to a 2017 report by Deloitte [41], Australian SMEs are steadily becoming more digitally engaged but are still unable to maximise the potential of I4.0 tools. They face issues, barriers and limitations such as: a lack of expertise to manage complex I4.0 structures; concerns about data and cyber security; lack of appropriate digital infrastructure; and lack of knowledge of digitalisation and of how to extract valuable data to help the business and visualise those data [41,42]. However, these reasons should not deter SMEs from at least incrementally adopting DT and I4.0 frameworks. As SMEs increase their digital engagement, opportunities in revenue growth, new job creation and innovation improve as well [41].
This article, therefore, aims to develop a roadmap of DT and DTH implementation across the physical asset, sensor, database and modelling platform designed specifically for SMEs, and, subsequently, offer practical insights to help them overcome barriers with simple and low-cost solutions. This is achieved via theoretical and practical demonstrations that are generalisable to, and replicable by, SMEs, thereby showcasing the feasibility of DT and DTH adoption. In order to do so, this paper will address the following research questions:
• RQ1: How can a communication framework be designed that connects the physical sensors to the database and onto our modelling application?
• RQ2: How can a digital model be developed that is driven by physical inputs, thus simulating the present actions of its physical asset?

Methodology
In order to demonstrate the connectivity between the key elements of DT and DTH, we presented a scenario, along with tools and technologies to address the research questions. The idea was to present appropriate guidelines and approaches for SMEs on how to kick start their DT and DTH implementation.
The methodology section outlines the key decisions made regarding the individual components that make up the DT and DTH. The flow of information can be summarised in the flowchart shown in Figure 1, which reflects industry best practice. The scope of this article was divided according to the components of I4.0 and DT, namely CPS, networking, simulation modelling, and big data and analytics.

Experimental Scenario
The proposed scenario (Figure 2) presented a practical approach to transform data harvested from a resistance temperature detector (RTD) sensor connected with a WAGO 750-8102 PLC for data storage and analysis, predictive simulation and modelling. The WAGO PLC was connected with a DF9GMS Micro Servo, which was coded and driven when temperature conditions of over 20 °C were met and remained stationary otherwise. The middleware was a computing device with an OPC UA communication protocol implemented on Node-Red. The OPC UA protocol stack was essential for communicating with common PLC and industrial devices. The middleware was for conveying the data to the intended MySQL database, Grafana dashboard and 3DExperience modelling platform.
In our experimental setup, MySQL database software was installed on a second computer, which was representative of the datacentre. A dashboard software was further implemented on the database unit for visualising the data graphically in real time. We opted to use Grafana, an open-source web-browser based platform. In the setup, the middleware unit was granted database write access, while the dashboard module was granted read access.
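The access-right split described above can be expressed with standard MySQL account statements. The sketch below is illustrative; the account names, host patterns and database name mapping are our own assumptions:

```sql
-- Hypothetical accounts: the middleware may insert and read,
-- while the dashboard user is read-only.
CREATE USER 'middleware'@'%' IDENTIFIED BY '<password>';
CREATE USER 'grafana'@'%'    IDENTIFIED BY '<password>';
GRANT SELECT, INSERT ON digitaltwin.* TO 'middleware'@'%';
GRANT SELECT           ON digitaltwin.* TO 'grafana'@'%';
```

Keeping the dashboard account read-only is a cheap safeguard: a misconfigured or compromised visualisation layer cannot corrupt the historical record.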

Communication Framework between Digital Twin and Physical Asset
In order to investigate the scenario experimentally and to demonstrate how all these technologies work together, the setup was designed in the following manner:

1. Digitalise the physical asset (i.e., create a 3D model of the asset of interest)
2. Transform and set up storage of data
3. Map the physical asset Input/Output (I/O) data to the 3D model data
4. Display and analyse data in a format usable for managerial decision-making
5. Connect data bi-directionally (i.e., connect data between the physical asset and DT to flow both ways)

With the aim of covering the aforementioned steps, a communication framework that connects the physical sensors to the database and onto our modelling application was developed, as portrayed in Figure 2.
First, a DT model of the micro servo was created using applications within the 3DExperience PLM platform to replicate the actions of its physical twin. These applications were CATIA Part Design, Assembly Design, Mechanical System Design, Functional and Logical Behaviour and Dymola Behaviour Modelling. Part Design was used to model the case, shaft and fan blades. Assembly Design was used to create the engineering connections between these parts. An engineering connection in the Mechanical Systems Design was used to create the kinematics of the assembled parts. A system model that drives the kinematic behaviour of the servo was built with Dymola Behaviour Modelling. This application used the Modelica language, which is an object-oriented, equation-based language used to model the individual components of the electrical circuit, dynamics and kinematics of the servo motor. An example of Modelica modelling can be seen in Figure 3. Modelica Editor, a function of Dymola, was used to build a customised block that read input variables and wrote the present model state to different text files to be used at a later stage. The custom input block can be seen in Figure 4. Functional and Logical Design was then used to design a low-level system model, i.e., one more specific to individual components of a systematic operation, of the servo by creating a flow between each Dymola block from the input source to the electrical circuit to the kinematics.
XAMPP is a multiplatform package that contains MySQL among other server solutions; however, the scope of this article was limited to MySQL. After XAMPP was installed on the middleware computer, Apache and MySQL were activated in the XAMPP control panel.
In our experiment, two different users were created, one with read-only rights for Grafana and another with write and read rights for Node-Red and 3DExperience. A table with three columns was created within a database named "digitaltwin". The first column was an incremental integer named "ID" for tracking the number of readings. The second was a time stamp to record the date and time of the specific reading, listed under "time". The third column was a double data type for recording the temperature under the heading "T1". A sample of readings can be seen in Figure 5. A similar table named "servomodel" was created within the same database "digitaltwin" for storing the DT's state. However, its third column was a VarChar data type named "State", which recorded "STATIONARY" or "RUNNING" depending on whether the servo model was stationary or rotating, respectively.
The sensor, PLC and micro servo were wired together, followed by the software and coding setup. Figure 6 shows the physical setup of the RTD sensor, WAGO PLC and DF9GMS micro servo. A breadboard with resistors in series was wired to step down the voltage from 24 V to 5 V. The PLC itself was connected via ethernet to the middleware.
Node-Red, which is an open-source connectivity tool, was used to bridge the gaps between the PLC and the MySQL database, and between the PLC and the DT modelling software, i.e., 3DExperience. It is a flow-based editor that connects various IoT devices, services, and application programming interfaces (APIs). It operates via drag-and-drop blocks called nodes, which were wired together to create a 'flow' of data. It was installed using Node.js, which is a JavaScript runtime. Node.js and Node-Red were installed on the middleware device, and in the workspace the node packages for OPC UA and MySQL were installed.
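The two-table schema can be sketched as follows. The snippet is an illustrative stand-in that uses Python's built-in sqlite3 module in place of MySQL (the DDL is near-identical apart from auto-increment syntax); the sensor table's name was not fixed in our setup, so "readings" here is hypothetical.

```python
import sqlite3

# In-memory SQLite stand-in for the MySQL "digitaltwin" schema.
con = sqlite3.connect(":memory:")
cur = con.cursor()

# Sensor readings: incremental ID, timestamp, temperature (double).
cur.execute("""CREATE TABLE readings (
    ID   INTEGER PRIMARY KEY AUTOINCREMENT,
    time TIMESTAMP NOT NULL,
    T1   DOUBLE NOT NULL)""")

# Digital twin state: STATIONARY or RUNNING at each timestamp.
cur.execute("""CREATE TABLE servomodel (
    ID    INTEGER PRIMARY KEY AUTOINCREMENT,
    time  TIMESTAMP NOT NULL,
    State VARCHAR(16) NOT NULL)""")

cur.execute("INSERT INTO readings (time, T1) VALUES (?, ?)",
            ("2022-01-01 12:00:00", 21.4))
cur.execute("INSERT INTO servomodel (time, State) VALUES (?, ?)",
            ("2022-01-01 12:00:00", "RUNNING"))
con.commit()

print(cur.execute("SELECT ID, T1 FROM readings").fetchall())
print(cur.execute("SELECT State FROM servomodel").fetchall())
```

Because both tables share a timestamp column, the sensor history and the twin's state history can later be joined or plotted against the same time axis.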
Inject nodes extracted the temperature readings from the PLC node using the node identifier and PLC IP address, ran them through a custom function block (Figure 8), which wrote the data to the database, and then pushed them through the MySQL node. Another function block mapped the temperature data as an input for the 3DExperience system model in a text file format. A separate flow was created that wrote the current state of the 3D model to the MySQL database. The Grafana dashboard was then connected with the MySQL database, and each table of data was linked. Figure 9a highlights options for selecting data in Grafana from the database. These included the data source, data table, time column and measured parameters. Figure 9b presents Grafana's presentation options, which range from time series and graphs to stats and text. Temperature readings were presented as a time-series graph, along with the latest, maximum, minimum and average values. A gauge icon of the latest temperature reading was displayed. The state of the 3D model (either running or stationary) was displayed as text with a time series.

Digitalisation of Servo Motor
Our experiment aimed to simulate the rotational motion of the physical servo, driven by temperature readings from the RTD sensor, thus mirroring the actions of the physical servo. As such, in driving the shaft, the systems model gave attention to three components, i.e., source, electric circuit and kinematics (Figure 10). The process began with the source input with a pulse block, each rising edge triggering the coded block that read the temperature from Node-Red and output it to the electrical circuit. The electrical circuit recorded the temperature reading and converted it to a Boolean variable, which subsequently turned on the electrical switch when the variable was true, thus driving the motor. It also contained a brake that was active when the Boolean variable was false. The final block contained the kinematics of the servo motor. The kinematics were generated in 3DExperience based on the engineering connections within the created 3D model. For this case study, a simplified 3D model was used consisting of a frame, a shaft and the fan blades.
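The temperature-to-Boolean conversion described above reduces to a simple threshold rule. The following Python sketch mirrors the logic of the Dymola switch block; the function name and constant are illustrative, and the actual model implements this in Modelica.

```python
THRESHOLD_C = 20.0  # the servo runs only above this ambient temperature

def servo_state(temperature_c: float) -> str:
    """Map a temperature reading to the twin's state, mirroring the
    Boolean switch and brake in the electrical-circuit block."""
    return "RUNNING" if temperature_c > THRESHOLD_C else "STATIONARY"

print(servo_state(21.3))  # RUNNING
print(servo_state(18.7))  # STATIONARY
```

Keeping this rule in a single block means the threshold can later be changed, or replaced by a richer condition, without touching the kinematics.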

Data Harvesting and Storage
The RTD sensor, controlled by the WAGO PLC, measured the ambient temperature, which was held in the MySQL database. A sample of temperature readings can be seen in Figure 12a. Further data were extracted from the 3DExperience platform regarding the relative actions of the 3D model, which can be seen in Figure 12b. It is noted that at temperatures of greater than 20 °C, the state was RUNNING, meaning the servo model was rotating. As the temperature crossed below the 20 °C threshold, the state changed to STATIONARY.

Input/Output (I/O) Data Mapping
The temperature readings were used as the driving input for the 3D model in 3DExperience. The I/O data mapping was completed via Node-Red writing the temperature readings to a text file, which was subsequently read using Modelica's file reading node (Figure 13). A coded Modelica block (Figure 4) used this value as a real variable. In the second Node-Red flow (Figure 14), a text file generated by 3DExperience containing the current state of the 3D model, i.e., whether it was "RUNNING" or "STATIONARY", was read in Node-Red and pushed to the MySQL database.
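The text-file exchange can be sketched with two small helpers. This is an illustrative Python stand-in for the Node-Red flows and the Modelica file reader; the file names and helper names are hypothetical.

```python
import os
import tempfile

def write_temperature(path: str, value: float) -> None:
    """Middleware side: write the latest reading for the model to consume."""
    with open(path, "w") as f:
        f.write(f"{value}\n")

def read_model_state(path: str) -> str:
    """Middleware side: read the state string the modelling platform wrote."""
    with open(path) as f:
        return f.read().strip()

tmp = tempfile.mkdtemp()
temp_file = os.path.join(tmp, "temperature.txt")
state_file = os.path.join(tmp, "state.txt")

write_temperature(temp_file, 21.4)          # flow 1: sensor -> model input
with open(state_file, "w") as f:            # simulate the model's output file
    f.write("RUNNING\n")

print(read_model_state(state_file))  # RUNNING
```

File-based exchange is the least sophisticated transport available, but it requires no extra infrastructure, which is precisely why it suits a first, low-cost DT prototype.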

Data Visualisation
Grafana was used as a visualisation platform where significant information was presented in a manner that would be conducive to decision making. The dashboard was customised to present key information regarding temperature readings and DT model state. The dashboard can be seen in Figure 15.

Discussion
The experiment with the servo motor scenario described previously provided an overview of the key technologies on a small but realistic scale for SMEs to embrace the concepts of DT and DTH. We considered two main aspects of the implementation principles, namely the communication framework and digital transformation, to facilitate the transfer of knowledge from research to industry. The authors of this paper note that other methods or approaches to DT integration exist. These methods range from utilising blockchain technology to build a supply chain DT [38] to cloud-based DTs for each stage of the product lifecycle [39]. However, existing methods do not provide a simplified process for SMEs to follow and are tailored towards companies with greater resources, knowledge and experience. It is not the intention of this paper to compare the existing methods but rather to present a framework and prove its feasibility to the readers.

RQ1: Design of the Communication Framework
Earlier studies have defined a DT as an appropriately synchronised body of useful information; however, there are 'twinning' challenges in DT development [8,15]. We sought to address the implementation of ethernet-based connectivity between a physical asset and its DT model in the communication framework. The focal attribute of the physical asset (i.e., the servo motor) was its rotational motion. As such, the useful information in this situation was whether the servo motor was rotating or stationary at any given time. We demonstrated through the use of some off-the-shelf tools that it is possible to create an integrated system whereby data were acquired at the field equipment level and communicated to the cloud-based system for storage and processing, which includes data visualisation as presented in Section 3.4.
To collect data, CPS was required at the field level. In this case, the CPS was the WAGO PLC, a common instrumentation tool that can provide a range of status information including measurements, temperature and position.
Although the amount of data was neither vast nor complex, our experiment demonstrated a method of continuously collecting, processing and storing real-time data in practice. The concept of value generation from the DT development lies with the user and will be specific to their circumstances. When needed, the proposed framework can be easily expanded with new sensors, communication devices and software toolboxes to meet the unique needs of SMEs.
Upon examining the choices for communication protocol (i.e., Modbus, OPC UA and MQTT), the final decision was weighed against a set of criteria: versatility between hardware devices, scalability and security. OPC UA was the prime candidate for the following reasons. Firstly, OPC UA has industrial standardisation among hardware and software, giving it compatibility with most available tools. Essentially, much of the existing hardware and software related to industrial automation is readily compatible with OPC UA. Secondly, it has sophisticated security features in both the application layer and the transport layer of communication. We considered security a high priority and especially relevant to any industry users wanting to ensure confidentiality and security of information. Thirdly, OPC UA provided opportunities for scalability in both its data representations and its minimal resource constraints. In other words, data modelling of objects through their attributes and relationships to other objects allows users to build a larger data model as new sources of data are included and mapped, without any large overhaul of SMEs' existing communication network (i.e., all the connected hardware and software). Furthermore, since OPC UA has high compatibility, it allows devices with different capabilities to be interconnected. For example, OPC UA can be used for connecting field-level devices to control-level devices such as PLCs, as demonstrated in this paper. It can also be used at the control level for a PLC to operate in a supervisory control and data acquisition (SCADA) system at remote locations.
The considerations for SMEs may vary from our results. For a system where comprehensive data models are the key to operations, a protocol such as OPC UA may be necessary. When the available hardware is restricted in its ability to handle long communication messages and operates on a network with low bandwidth, MQTT will be more appropriate. Regarding costs, most communication protocols are open source and free; however, implementation and customisation that suit each SME requirements may depend on their resources, in-house expertise and constraints.
We used a structured database, as the RTD sensor recorded a time series of temperature measurements. These data posed minimal complexity for harvesting and minimal variety in data types, making the database easy to structure. Similarly, the data describing the present state of the 3D model could be represented as a time series. A MySQL database was chosen to meet the need for a low-cost structured database. As with other database solutions on the market, MySQL comes with the essential user authentication functionality.
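To illustrate, the following is a minimal sketch of the time-series storage step in JavaScript, assuming a hypothetical `temperature_log` table and the common Node.js pattern of a parameterised INSERT; the actual schema and column names used in the experiment may differ.

```javascript
// Hypothetical schema for the time-series temperature log:
// CREATE TABLE temperature_log (
//   id INT AUTO_INCREMENT PRIMARY KEY,
//   recorded_at DATETIME NOT NULL,
//   temperature_c DECIMAL(5,2) NOT NULL
// );

// Parameterised INSERT; placeholders also guard against SQL injection.
const INSERT_SQL =
  "INSERT INTO temperature_log (recorded_at, temperature_c) VALUES (?, ?)";

// Shape a raw sensor reading into the parameter array for the INSERT.
function toRow(reading) {
  // reading: { timestamp: milliseconds since epoch, value: degrees Celsius }
  const recordedAt = new Date(reading.timestamp)
    .toISOString().slice(0, 19).replace("T", " "); // MySQL DATETIME format
  return [recordedAt, Number(reading.value.toFixed(2))];
}

const row = toRow({ timestamp: Date.UTC(2022, 0, 1, 12, 0, 0), value: 23.4567 });
console.log(INSERT_SQL, row);
// With a MySQL driver such as the mysql2 package (not shown here), the row
// would then be passed as connection.execute(INSERT_SQL, row).
```

The formatting helper is kept separate from the database call so it can be reused and tested without a live connection.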
For SMEs, there is a large choice of data storage options. The decision begins with an understanding of what type of data are being extracted. For data that largely consist of numbers and values, a structured database holds greater value. This is especially true where SMEs have limited storage space and data processing power. In other cases, when the data are of a qualitative nature, such as images or text, or generally do not fit into a defined model, an unstructured database will be more suitable. In either case, the SME must then decide how the data will be processed and used in decision-making processes. For structured databases, as in the use case of this article, SMEs will find many readily available resources (e.g., MySQL and Grafana) to store and visualise their data. For unstructured data, SMEs will hardly find a 'one-size-fits-all' solution; instead, available solutions require fine-tuning driven mainly by costs and the nature of the data. SMEs have many options for low-cost databases, both on site and in the cloud. If an SME is looking for multi-client access, a cloud-based solution may be more appropriate. The implementation of databases, though not overly complicated, may require skills in package installation and knowledge of basic programming syntax and conversions among data types, as demonstrated in Section 3.2.
For our experiment, Node-RED was used as the software to wire together the various individual components, namely the field-equipment PLC, the MySQL database and the 3DExperience modelling software. Node-RED is built on Node.js, giving it a lightweight, event-driven I/O model efficient enough to run on cost-efficient hardware such as a Raspberry Pi. Node-RED is also user-friendly, combining a visually intuitive design of drag-and-drop blocks and connections with the ability to express logic visually rather than as textual code. Node-RED has the further advantages of being multi-client accessible, lightweight, robust and easy to access via the cloud for developers in I4.0, DT, IoT connectivity and smart devices. It also has vast programming capabilities through its large node library, including built-in support for popular IoT protocols (e.g., Modbus, OPC UA and MQTT), giving it the ability to wire field devices and automation controllers to cloud services and databases. For situations where nodes for a specific purpose are not available, function blocks written in JavaScript provide customisability. This was demonstrated in our experiment when storing data in the specified data tables.
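As a concrete illustration of this function-block customisability, the following is a minimal sketch of the body of a Node-RED function node. The register scaling (tenths of a degree Celsius) and the payload field names are illustrative assumptions, not the exact ones used in our experiment.

```javascript
// Sketch of a Node-RED function node: reshape a raw PLC reading into a
// record suitable for a downstream MySQL node. In Node-RED, this body would
// live inside a function node and simply `return msg;`.
function formatReading(msg) {
  // Assumption: the PLC publishes the RTD reading as an integer in tenths
  // of a degree Celsius (actual WAGO register scaling may differ).
  const celsius = msg.payload / 10; // e.g. 234 -> 23.4 degrees C
  msg.payload = {
    recorded_at: new Date(msg.ts || Date.now()).toISOString(),
    temperature_c: celsius,
  };
  return msg; // passed on to the next node in the flow
}

const out = formatReading({ payload: 234, ts: Date.UTC(2022, 0, 1) });
console.log(out.payload);
```

Because the transformation is a pure function of the incoming message, it can be developed and checked outside Node-RED before being pasted into a function node.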
Our communication framework also allowed an end-to-end implementation of DTH, which enables the digitisation and traceability of the flow of information up- and downstream throughout the entire PLM process. Our experimental scenario demonstrated that DTH could be readily integrated into typical manufacturing scenarios, such as machining tools, where users could view a comprehensive history log through the database and track manufactured items along their life cycles.
We acknowledge that the experiment in this study is limited in fully demonstrating the capability of DTH; however, this does not prevent DTH from expanding to encompass the entire PLM from design to disposal, integrating different stakeholders along the supply chain and enhancing collaboration, decision making and data management. SMEs that aim to extract greater value along the entirety of the product lifecycle, as noted in Sections 1 and 2, may consider integrating their supply chain into the DT and DTH and utilising cloud-based solutions with multi-client access. This would serve to build a more comprehensive DT model and DTH in which data from all stages are collected to form a single source of truth. The users of such data would be able to make well-informed decisions, adapt to unforeseeable circumstances and achieve higher productivity.
Grafana was the visualisation tool used in this article to display key temperature readings and the current state of the 3D model simulation (and, as a result, of the physical twin). Its customisable dashboard grants users the ability to project data on a cloud-based platform, monitor events and recognise opportunities for operational improvement. Similar to Node-RED, Grafana is both open source and low cost, fitting the software criteria for SMEs.

RQ2: Digitisation and Digital Transformation Strategy
The purpose of the digitisation of physical assets into 3D models was two-fold. Firstly, the 3D model itself provided a visual representation of the physical asset, in essence the face of its DT, as well as precise mathematical representations of its behaviours. Secondly, it enabled predictive capabilities in the form of simulations and adjustments of variables. The digitisation process in this study was completed in two stages: (1) the systems model, and (2) the 3D model. The systems model was a process-oriented representation in which each element consisted of a mathematical relationship between inputs and outputs. It was used to map out the physics behind the physical asset, essentially dictating how the DT would behave.
For the digitisation process, the choice of platform was important, as DT development requires a program with high functionality and versatility [31]. The software needs to be an open platform that performs as programmed while giving users the versatility to adapt it to their specific needs, since DT is not a one-size-fits-all methodology. We used one of the enabling DT technologies, 3DExperience (described in Section 1.3), for our model development. We started with the 3D model of the micro servo, which was designed in a simplified fashion focusing on representing the rotational motion. The engineering connections (i.e., constraints) established when assembling the model were used to determine the kinematic motion of the 3D model.
Then, we proceeded with the development of the systems model. It was divided into three components deemed capable of modelling the behaviour of interest. The first component created the connection between the temperature sensor data and the driving input into the electric and logic circuits. Initially, there was a significant challenge in mapping temperature data from the RTD sensor and PLC unit into the 3DExperience software, which did not have readily available connectors. After examining the Modelica library, it was noted that utility functions related to file manipulation were available. After some research into Modelica coding, a block was built that called the Read File function and output the value as a real variable to be used at the next stage. The second component contained the electrical circuit, the logic components and a brake for driving the moving parts of the servo. The third component comprised the kinematics of the servo model. These were generated directly from the 3D model; however, it is also possible to map out the kinematics directly from the Modelica library, similar to Figure 3 in Section 2.
The development of the systems model depends on the parameters the SME or user deems relevant. It is possible to build a comprehensive system model that measures the essential parameters for productivity evaluations. For example, when modelling a crankshaft, where the rotational variables are the topic of interest, the user may omit variables that are not of interest, such as vibration or temperature. Such a simplified model reduces time and costs, and may remove the need for computer-aided engineering expertise, while achieving the same outcome.
It is at this stage that predictive simulations can be achieved by adjusting variable values; for example, a user can experiment with brake variables such as the maximum normal force exerted, run the simulation and observe the resulting changes. From an engineering design point of view, our simplified 3D modelling approach may trade off a degree of accuracy compared with more complex models. However, using this approach, we achieved a model that replicates the live state of the micro servo and outputs predictive values for the parameters that are adjusted, and it did not prevent our team from creating an iterative process whereby further details could be added to the DT model in the future.
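The kind of what-if run described above can be sketched with a deliberately simple first-order model, assuming a constant Coulomb brake torque proportional to the normal force. The actual behaviour model built in 3DExperience/Modelica is considerably richer, and all numeric parameters below are illustrative.

```javascript
// Toy predictive simulation: how much rotational speed remains after one
// second of braking, for a given brake normal force? All parameter values
// (friction coefficient, radius, inertia, initial speed) are illustrative.
function simulateSpinDown(normalForce, { mu = 0.3, radius = 0.01, inertia = 1e-4,
                                         omega0 = 50, dt = 0.001, steps = 1000 } = {}) {
  let omega = omega0;                            // angular speed, rad/s
  const brakeTorque = mu * normalForce * radius; // constant Coulomb friction torque, N*m
  for (let i = 0; i < steps && omega > 0; i++) {
    // Forward-Euler step; clamp at zero because friction cannot reverse motion.
    omega = Math.max(0, omega - dt * brakeTorque / inertia);
  }
  return omega;
}

// Adjusting the brake's normal force and re-running answers the what-if
// question: a stronger brake leaves less residual speed after one second.
console.log(simulateSpinDown(1).toFixed(1));
console.log(simulateSpinDown(2).toFixed(1));
```

The same adjust-and-rerun loop applies to the full DT model; only the fidelity of the underlying equations changes.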
Where predictive accuracy is sought, parameter values can be adjusted to assess the outcomes of certain scenarios. For example, for productivity it will be far more important to measure the throughput of a process than the vibration of a machine. DT models should always leave room for adding further details, such as new parameters and key performance indicators. This allows systems engineers to generate plans to install sensors strategically to capture new information. Most digitally enabled PLM solution suites with enabling DT technologies, such as 3DExperience, can handle this through behaviour modelling applications that run simulations and present quantifiable data at all measurable instances.
The challenges outlined in the demonstration unit of this work are good representations of what most SMEs face in their DT development processes. The incorporation of temperature readings allows the DT model to interact with external circumstances, an approach that can be generalised and applied to any system that reacts dynamically to extrinsic or intrinsic parameters in real time.
SMEs at the modelling stage should investigate modelling tool options and find one that fits their purposes and budget, as this phase will be critical to the cost and time control of the whole project. Regarding the level of detail of both the systems model and the 3D model, SMEs must decide what aspects of the DT represent the desired functionality of the physical twin. An important note is that although a detailed systems model may require a deeper understanding of an asset's functionality, it also enables greater modelling accuracy and predictive simulation capabilities through variable adjustments. The outputs of such predictive simulations are entirely dependent on the inputs; therefore, SMEs must ensure data integrity when extracting data from their existing systems. For DT developers with limited knowledge of the underlying software, there is an expected learning curve that will vary with the complexity of the task.
There is potential for value generation through adopting the proposed framework. As seen in companies such as Tesla and General Electric, the opportunities created by real-time monitoring provide the basis for informed decision making, increased transparency and improved responsiveness. Real-time monitoring and informed decision making grant insights into the fundamentals of a business in areas such as supply chain management, customer relationships, marketing strategies and the company's bottom line. This may lead to a transformation of the business model, such as that accomplished by Rolls-Royce, as mentioned in the introduction, or to leaner and less wasteful manufacturing processes; both are positive results. Ultimately, the companies that choose to incorporate digital twin concepts will be those that value information obtained in a timely manner and are able to generate new revenue from it.

Future Research
Despite its effectiveness, this DT model and communication framework have limitations with respect to industrial-scale systems. For example, the simplified 3D design does not reflect the true characteristics of PLM and IoT systems, which have multiple stakeholders and users and many components, devices and data types. Such characteristics require a comprehensive analysis of communication protocols and digital transformation, and larger efforts in system integration. As such, we note several areas for future research. This article has focused primarily on building the connection between the physical asset, the digital asset and the database while giving priority to components with in-built security. SMEs and industry users may consider implementing internal safety control measures to ensure data security and intellectual property protection.
We noted throughout the development of this article that different platforms communicate data in proprietary formats, such as 3DExperience's 3DXML file format for the systems model. There are many commercial simulation platforms available, each with its own proprietary data formats; this is not limited to modelling and simulation platforms but extends to much of the hardware and software that utilise data in their operations. Given the number of platforms and data types, the standardisation of data and models is a potential field for future research.
Regarding the physical assets, data sources and data types, this article has focused on a single source of structured data; however, unstructured data may also generate business value if utilised correctly. Future DT developers may look to implement methods of incorporating unstructured data into the DTH. As different sources of data are integrated into the DTH, users will observe increased traffic and should therefore optimise and synchronise the communication systems so that data silos and communication interruptions are avoided and collaboration is improved.
We faced a challenge in integrating the individual components that make up the communication framework and found one of many viable solutions, which was suitable for the purposes of this article. In an industry setting, smoother integration between I4.0 and DT components, and their reliability and resilience, will require further dedicated analysis and evaluation. Collaboration among the various hardware and software developers would aid significantly in creating a unified digital twin platform.

Conclusions
SMEs' interest in I4.0 has grown rapidly in the last decade. This paper proposed a guideline for SMEs to adopt I4.0 technologies into their existing models in order to reap the benefits of DT and DTH.
In doing so, this paper aimed to answer two key questions regarding the design of database management and data analytics, the flow of data through sensors and networking, and digitisation and systems integration in relation to SMEs.
The communication framework was composed of multiple components, each handling data capture, conditioning, format conversion, and data transmission and aggregation. The final step was to present the data in a manner usable by management. A key challenge of this article has been integrating these components in a manner that extracts and maps the I/O data between physical assets and their digital models.
It was important to determine which data were to be collected based on their relevance to the productivity or the goal of the complete system. We also noted that aggregating data from different sensors, different sources and of different types can easily lead to errors in DT modelling, PLM and real-world decision making. As such, it was paramount to understand the parameter being measured and the format its data would take in order to avoid these errors.
The digitisation process was completed in two stages: the systems model and the 3D digital model, which served to model the behaviour and the asset, respectively. The challenges of digitisation revolved around connecting the software to external sources in such a way that it was possible to run a continuous simulation that would adapt to changing external factors. The approach taken by this article was to utilise various software with a preference for low-cost, low-code and open-source options where possible. This was accomplished with the use of the OPC UA protocol, 3DExperience, Modelica, Node-RED and MySQL.
Despite the benefits offered by DT and I4.0 technologies, there are many factors that SMEs need to be aware of. First and foremost are costs. Although we have outlined low-cost options in this article, costs in the form of software and expertise will inevitably arise due to the complexity and scale of the system. It is up to each SME to perform a cost-benefit analysis of its specific circumstances and evaluate whether such costs justify the long-term benefits. Furthermore, a patchwork integration of different platforms can create difficulties in verifying the accuracy of modelling and simulation.