Machines
  • Editorial
  • Open Access

6 August 2023

Digital Twin Certified: Employing Virtual Testing of Digital Twins in Manufacturing to Ensure Quality Products

Digital Twin Institute, University of Central Florida, Orlando, FL 32616, USA
This article belongs to the Special Issue Advances in Digital Twins for Manufacturing

Abstract

Quality products are a main focus for manufacturers. Product users only determine a product to be a quality product if it performs in operation to the user’s perceived standard. Product manufacturers need to take a product lifecycle quality (PLQ) perspective of quality and not simply focus on manufacturing quality control, which is more accurately specification control. Manufacturing is the key phase where products take their physical form. Physical quality strategies trade increasing cost for decreasing risk. The information provided by digital twins and virtual testing promises to be both low in risk and low in cost, and it has the potential to predict what the customer will experience in operation by testing products passively with data and actively with simulation to destruction. Digital Twin Certified (DTC) is proposed as the methodology for accomplishing this. DTC will be especially important for the adoption of additive manufacturing.

1. Introduction

No discussion by an executive of a product company is complete without a mention of quality. Quality is one of the most important characteristics of products today. Manufacturing companies expend a great deal of time, money, and effort in an attempt to build quality products. All other things being equal, a product that is higher in quality commands a better price than a product that is lower in quality. Below a certain threshold of quality, the price buyers will pay will drop to approximately zero.
The cost of quality (COQ) is significant to manufacturers. COQ includes both failure costs and prevention plus appraisal costs. The classical view is that COQ is a U-shaped curve, with prevention plus appraisal costs rising dramatically the closer quality gets to the 100% level. A contradictory claim is that prevention and appraisal costs are linear and that COQ does not exceed about a quarter of sales costs [,]. However, both claims put the minimum COQ at between 20% and 30% of sales.
The American Society for Quality (ASQ) states [], “Many organizations will have true quality-related costs as high as 15–20% of sales revenue, some going as high as 40% of total operations. A general rule of thumb is that costs of poor quality in a thriving company will be about 10–15% of operations.”
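The classical U-shaped COQ curve can be sketched numerically. In the toy model below, failure costs fall linearly with conformance while prevention plus appraisal costs rise steeply near 100%; the coefficients are invented for illustration (chosen so the minimum lands near the 20–30% of sales range cited above) and are not taken from the referenced studies.

```python
import numpy as np

# Toy model of the classical U-shaped cost-of-quality curve. All
# coefficients are invented for illustration; costs are expressed as a
# fraction of sales revenue.
q = np.linspace(0.85, 0.995, 1000)          # conformance level
failure = 2.0 * (1.0 - q)                   # failure costs fall as quality rises
prevention_appraisal = 0.0078 / (1.0 - q)   # rises steeply near 100% quality
coq = failure + prevention_appraisal        # total cost of quality

q_min = q[np.argmin(coq)]
print(f"minimum total COQ: {coq.min():.1%} of sales at {q_min:.1%} conformance")
```

With these assumed coefficients the minimum total COQ lands at roughly a quarter of sales, consistent with the range both claims above converge on.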
However, the issue of quality is a complicated one. While there is much talk about quality, there is also much confusion about what quality actually is. If we look to the definition of quality in the dictionary, we find that there are at least eight different definitions of quality that have very separate meanings from each other.
The source of much of the confusion is the term “quality control.” Quality control originated in manufacturing and has been exhaustively covered [,]. However, this “quality control” is actually specification control. Manufacturing Quality Control (MQC), the term this paper will use for it, is the assessment of whether the manufactured product meets its specifications.
However, simply manufacturing products to their required specifications does not ensure that a product is a quality product. A product is only a quality product if it performs in operation to the user’s perceived standard. The only true measure of quality is a posteriori, that is, at the end of the product’s useful life in the physical world. Obviously, that is far too late to be useful to a prospective buyer or user of the product.
What we need is a means of predicting that life performance. Digital Twins (DTs) are a potential capability to better predict the ability of a manufactured product to perform to the user’s perceived standard. With Additive Manufacturing (“AM”), it is only going to become more difficult to make these kinds of predictions. The remainder of the paper will address this issue.

3. Manufacturing Quality Hierarchy of Risk vs. Cost Strategies

Focusing on the manufacturing product lifecycle phase, MQC strategies are about balancing risks with the costs of preventing non-conforming products. Figure 5 shows the product quality hierarchy in terms of risk versus cost. This applies to both internal and external production.
Figure 5. Manufacturing Risk vs. Cost Hierarchy.

3.1. Acceptance

The highest risk with the lowest cost is simply to accept whatever has been produced. The assumption is that the product has been produced to the required specifications and tolerances. There is no checking with this assumption. The product is simply produced within the factory or is shipped in from a supplier, and we simply install it as part of our manufacturing process.
Clearly, there are huge potential problems with this approach. The cost of this assumption being incorrect could be astronomical not only in terms of MQC issues but also in terms of product safety.

3.2. Certification

The next approach is certification. The producer of the product goes through a certification process that attests that a specific process is followed which will, each and every time, produce products meeting the requisite MQC requirements. The certification process may be one time or on some periodic basis. The risk decreases, but the cost increases with this approach. The customer is still relying on the assumptions that the product producer will follow the process required by the certification and that following the process will result in a conforming product. Certifications are not free, so the product producer will increase the price of the product as a result.

3.3. Inspection

With an inspection approach, risk decreases and cost increases. Inspection can be performed using statistical methods or at the hundred percent level, meaning every product is inspected. However, inspections are at a superficial level, meaning they are conducted on what can be observed. External measurements can be taken, but internal measurements usually cannot. Inspection regimes are useful in decreasing the risk of nonconforming products. However, they do add expenses, sometimes substantially, in terms of labor costs and time delays.
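The statistical option mentioned above is typically an acceptance-sampling plan. As a hedged illustration (the plan parameters are assumptions, not drawn from any cited standard), the sketch below computes the probability that a lot passes a single-sampling plan: inspect n random items and accept the lot if at most c are nonconforming.

```python
from math import comb

def acceptance_probability(defect_rate: float, n: int, c: int) -> float:
    """Probability a lot is accepted: sample n items at random and accept
    if at most c of them are nonconforming (binomial model)."""
    return sum(comb(n, k) * defect_rate**k * (1 - defect_rate)**(n - k)
               for k in range(c + 1))

# Hypothetical plan: inspect 50 items per lot, accept with at most 1 defect.
for p in (0.01, 0.05, 0.10):
    print(f"true defect rate {p:.0%}: lot accepted with "
          f"probability {acceptance_probability(p, 50, 1):.2f}")
```

The residual risk the section describes is visible here: a lot with 1% defectives is almost always accepted, while one with 10% defectives is almost always rejected, and everything in between is a gamble the plan's designer must price.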

3.4. Testing

Testing can be of two types: nondestructive or destructive. Testing in manufacturing usually verifies that the manufactured part meets its requirements. It is not a test of behavior.

3.4.1. Testing—Nondestructive

Nondestructive testing (NDT) will reduce risk substantially but will also increase costs substantially. NDT can be performed by statistical sampling or at the hundred percent level. NDT can give more assurance than inspection alone. However, it is based on visualization and not behavior. NDT techniques can look internally into metal structures to determine whether there are flaws. The determination of whether those flaws render the product nonconforming is a matter of heuristics as opposed to certain information that the flaw will result in the product not performing properly.

3.4.2. Testing—Destructive

Again, risk decreases very substantially, but costs also increase substantially. Destructive testing can be conducted via statistical sampling or at the hundred percent level. While 100% destructive testing would give great confidence that the product was a conforming product in all respects, it obviously makes no sense. There is no market for completely destroyed products, except by scrap dealers. Destructive testing gives us a great deal of information about the ability of the product to withstand the forces that it is exposed to. Destructive testing gives us great confidence that the product is conforming and would perform to the expectations of the user.
Destructive testing is generally only used during the development process to ensure that the product design and engineering met its requirements. Destructive testing could be used in the first batch from a new supplier or when a supplier changes their processes. However, despite the lowering of risk, the costs increase dramatically to the extent that frequent destructive testing is rarely used.

3.5. Information

The lowest risk would be if we had completely reliable information about the product and how it performs throughout its life. If we could look into the future and follow the product throughout its life, we would eliminate all risk of the product not being conforming. If this information were available to us, it would also be the least costly approach. Outside of time travel, this approach would appear to be closed to us. However, there may be a substitute for generating equivalent information that would serve as a proxy.
That proxy would be virtual testing. That virtual testing can be of two types, NDT, which is data analysis, and destructive testing, which is simulation. While physical NDT requires that we perform some active physical operation, virtual NDT (VNDT) can be performed passively. We can use data analytics in the form of AI/ML. Using the DTA of operational products, AI/ML would indicate the set of DTI product attributes that correlated with a future failure. While this might not tell us what the causal relationship was for failure, it will indicate that we need to carry out more testing of this particular product.
Virtual destructive testing (VDT) will be the simulation of forces on the DTI. While it can test specifications, it can also test for behaviors that the product will exhibit in use. Given the required computing capability, and exploiting the fact that time is unconstrained in virtual space, VDT could virtually test a newly manufactured airplane wing with identified flaws by simulating the flying of that wing through a lifetime of flight profiles. The forces of those profiles could be increased until the wing failed. Those forces would then be tracked as the plane flew missions so that the testing limit would never be allowed to be approached.
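The test-to-failure idea can be sketched as a load ramp on a surrogate model of the DTI. The stress function below is a stand-in for a real FEA solve, and all material numbers and the stress-concentration rule are assumptions for illustration only.

```python
# Surrogate "FEA" for a DTI with a measured flaw; all numbers are invented.
def peak_stress(load_factor: float, flaw_severity: float) -> float:
    nominal_stress_mpa = 120.0 * load_factor        # stress grows with load
    concentration = 1.0 + 2.0 * flaw_severity       # amplification at the flaw
    return nominal_stress_mpa * concentration

def virtual_test_to_failure(flaw_severity: float,
                            ultimate_strength_mpa: float = 480.0,
                            step: float = 0.05) -> float:
    """Ramp the load factor until the surrogate model predicts failure."""
    load = 0.0
    while peak_stress(load + step, flaw_severity) < ultimate_strength_mpa:
        load += step
    return load

pristine = virtual_test_to_failure(flaw_severity=0.0)
flawed = virtual_test_to_failure(flaw_severity=0.3)
print(f"failure load factor: pristine {pristine:.2f}, flawed {flawed:.2f}")
```

The failure load factor found for each individual DTI is exactly the per-product testing limit the wing example describes: operational forces would be tracked against it so it is never approached.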
While the processing of data into information is not free, Figure 2 illustrates that even if virtual and physical testing cost the same today, virtual testing becomes increasingly less expensive over time compared to physical testing. Virtual testing creates its data without destroying physical products and can destroy the exact same virtual product over and over again. This means that we can reduce more risk at a significant reduction in cost, as shown in Figure 2.

4. Digital Twin Certified (DTC) for Traditional Manufacturing

There are two types of testing that can be considered DTC: data-based and simulation-based. The data-based testing would test the DTI attributes of the manufactured product against products that subsequently had substantial adverse effects or failed in operation. The simulation tests would run simulations to failures of forces on the DTIs.
While DTC can be used for all types of testing, i.e., fit and finish, tolerance stacking, CFD, rheology, etc., the focus of this paper will be on product structure. Some configurations may be susceptible to failures. The durability of a product is also dependent on its ability to endure forces without failing. The two forces it must withstand are the external forces it is subject to and the internal forces it generates. Product structure durability under these forces can be evaluated by Finite Element Analysis (FEA).
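As a minimal illustration of the structural check that FEA generalizes, the sketch below computes an axial stress and safety factor for a hypothetical bracket; the load, cross-section, and yield strength are invented values, not from the article.

```python
# A hypothetical bracket: axial stress = load / area (N per mm^2 == MPa),
# compared with an assumed yield strength. Values are illustrative only.
def safety_factor(load_n: float, area_mm2: float, yield_mpa: float) -> float:
    stress_mpa = load_n / area_mm2
    return yield_mpa / stress_mpa

sf = safety_factor(load_n=12_000, area_mm2=60.0, yield_mpa=250.0)
print(f"stress: {12_000 / 60.0:.0f} MPa, safety factor: {sf:.2f}")  # stress = 200 MPa
```

FEA performs this same stress-versus-strength comparison at every element of a meshed 3D geometry, for both the externally applied and internally generated force cases the paragraph distinguishes.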
The key elements of DTC are:
  • Capture of the data from the physical manufacture of the DTI.
  • Capture of data from DTIs of physical products in operation.
  • The virtual tests that will produce the highest level of assurance of future product performance.
  • AI/ML analysis of product configurations and their subsequent failures.
The DTC methodology would consist of the following:
  • As the product proceeded through the manufacturing process, the defined data would be captured to produce its unique DTI.
  • At selected points in the manufacturing process, the DTI would be subject to virtual testing up to and including failure.
  • This virtual testing could be repeated during all phases of product manufacturing: components, sub-assemblies, assemblies, and up to final product assembly.
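The methodology above can be sketched as a simple record flow. The names, the stub simulator, and the thresholds below are all hypothetical stand-ins: each DTI accumulates as-built measurements, and checkpoints at selected manufacturing stages record the margin between the simulated failure load and the requirement.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwinInstance:
    serial: str
    measurements: dict = field(default_factory=dict)   # as-built data
    test_results: list = field(default_factory=list)   # (stage, margin) pairs

def virtual_test(dti, stage, required_load, simulate_failure_load):
    """Record the margin between simulated failure load and the requirement."""
    margin = simulate_failure_load(dti) / required_load
    dti.test_results.append((stage, margin))
    return margin >= 1.0

# Hypothetical wing sub-assembly with a stub simulator in place of real FEA.
dti = DigitalTwinInstance(serial="WING-00042")
dti.measurements["spar_thickness_mm"] = 11.8           # captured in-process
passed = virtual_test(dti, "sub-assembly", required_load=100.0,
                      simulate_failure_load=lambda d: 160.0)
print(passed, dti.test_results)
```

The same `virtual_test` call would be repeated at the component, assembly, and final-assembly checkpoints, so each DTI carries its full margin history forward through the lifecycle.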
It is important to note that DTC does not start at the manufacturing phase. As the PLQ model showed, this is a product-centric approach and not a functionally siloed approach in terms of information. Data and information populated in one phase can be consumed in another phase of the product lifecycle.
The testing in the creation phase has historically included physically testing the product to destruction. Automobiles are crash tested. Airplanes have their wings bent until they break. NASA tested their new composite fuel tank until its destruction []. Testing to destruction gives a high level of assurance of the safety factor and indications of the most probable sites of weakness.
However, this physical destructive testing is extremely expensive in both cost and time, so manufacturing organizations are very interested in replacing these physical tests with virtual tests. The goal is to replace the destructive physical testing with destructive virtual testing, with a final physical destructive test to validate the destructive virtual testing.
The virtual testing that is devised and used in the creation phase now can be used in the build (manufacturing) phase that will give assurance that the product will perform in operation throughout its life. In addition, the creation phase of virtual and physical testing will give the development engineers indications of where the most likely structural failure points are. The virtual testing during the manufacturing phase can prioritize these likely failure points.
The testing that will come out of the creation phase is going be primarily physics-based, although data-based testing is also possible from previous generations of products if they share a high degree of similarity with the new generation. The simulations that are used will be “white box”, meaning that they are driven by formulae, equations, and causation. It would be very risky to approve designs that used even highly correlated data. Black box or even grey box methods by themselves would usually be insufficient to approve new product designs.
The testing has also been designed to verify and validate that the design requirements are being met. However, when products are in operation, failure conditions will arise that the design testing did not consider. In VNDT, AI/ML can be employed to associate that operational data with DTI data to predict configuration failures that physics-based methodologies are unable to predict or predict accurately []. While VNDT might start with the AI/ML of previous generations of products, the accuracy of VNDT will improve the more DTIs there are and the larger the DTA population grows.
The application of VDT in the manufacturing process will depend on the specific product with its estimate of potential product weakness. The general idea is that the DTI would be created by capturing geometric data as the product is manufactured. The DTI, which is the DTP 3D model modified with actual measurements, would undergo virtual FEA testing.
This testing at selected points in the manufacturing process would be to the point of failure, focusing on identified weak points. The results of each FEA test would be compared to the requirements to see if the DTI passed. This could be carried out at the component level, the assembly, and up to the entire product. As discussed above, the VDT could test not just its specification compliance, but its behavioral performance in a virtual lifetime.
As an example, given the requisite computing capability, every automobile could be crash tested as a DTI. A by-product of DTC is that this would be both a protection and a defense in product liability claims. One of the ambush tactics used by a plaintiff attorney when the product manufacturer presents their product crash testing as a defense is, “But did you crash test this particular product?” This would allow the product manufacturer to answer, “Yes.”

5. Digital Twin Certified and Additive Manufacturing (AM)

The certification of AM is the top challenge for the manufacturing industry. It is costing the industry billions of dollars because of AM complexity, the unknowns in AM processes, and the severe delays in not being able to adopt AM []. This is negating the major potential benefits of adopting AM for safety related products and general production.
The assumption in DTC for traditional or subtractive manufacturing is that the material is not a factor because it is homogeneous and undergoes minimal transformation in the manufacturing process. The material is made into its required shape by a variety of manufacturing methods: it is subtracted by cutting or milling a larger piece of material, or it is formed into the required shape by stamping or extruding. Welding is one of the few processes where material is not homogeneous, and as a result it is one of the most common failure points in subtractive manufacturing. Composites can be considered a form of additive manufacturing: while the material is homogeneous, the geometric shape is formed by layering and curing, so the issues composites have are better addressed by this section than by the previous one.
That is not the case for additive manufacturing. In additive manufacturing, the material undergoes a transformation as it is processed. The shape is created by depositing or printing layer by layer. In the case of metals, the material is transformed from powdered metal to a continuous form by creating liquid melt pools with high powered lasers that merge material horizontally and vertically.
AM has both aleatory and epistemic uncertainty in its process inputs []. As a result, there are substantial efforts to address the inherent variability of AM at the front end of production. Particularly for metals, there are a host of qualification and certification initiatives that involve standards, rules, and regulations []. While these will be helpful in reducing uncertainty, the inherent variability of AM processes requires a production backend methodology like DTC.
My first introduction of the DTC concept was for AM because current methods did not address the new issues that came with AM technology []. The key elements of DTC are the same, except that the first element, the capture of data from the physical manufacture of the DTI, will need to be much more extensive and robust. In situ monitoring will need to capture and record data on the formation of the material on a voxel-by-voxel basis. The captured data of these voxels will need to be assembled into a 3D image that can be virtually tested.
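Assembling in situ data into a testable voxel-level DTI might look like the following sketch, where each layer's monitoring frame is stacked into a 3D volume and low-quality voxels are flagged as candidate anomalies. The grid size, the random stand-in for sensor data, and the threshold are all assumptions for illustration.

```python
import numpy as np

# Assumed build: 40 layers, 64x64 voxels per layer. Random numbers stand in
# for per-voxel melt-pool quality scores from in situ monitoring.
layers, rows, cols = 40, 64, 64
rng = np.random.default_rng(0)

volume = np.empty((layers, rows, cols), dtype=np.float32)
for z in range(layers):
    # A real system would ingest camera or pyrometer data for this layer.
    volume[z] = rng.uniform(0.0, 1.0, size=(rows, cols))

# Flag voxels below an assumed quality threshold; their 3D positions feed
# the structural simulation as candidate anomalies.
anomalies = np.argwhere(volume < 0.02)
print(f"{len(anomalies)} suspect voxels out of {volume.size}")
```

The resulting anomaly coordinates are exactly what the structural simulation tests described next would consume as candidate flaw sites.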
The structural simulation tests will also need to be more extensive. Structural tests of subtractive manufacturing assume that the forces are uniformly transmitted through the material. However, that assumption will not apply to AM. There may be anomalies within the material due to inconsistent development of melt pools. As a result, forces may not be transferred uniformly. Virtual testing will need to be robust enough to determine when these anomalies will result in potential structural failures. Depending on the product, forces can be oriented differently to give a profile of where the structure is most at risk and how much force over the requirements will cause failure.
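The orientation sweep described above can be sketched with a surrogate anisotropic strength model, here assumed weaker across the build's layer direction; a real implementation would run the full FEA solve once per load orientation rather than interpolate.

```python
import math

# Surrogate anisotropic strength: in-plane strength at 0 degrees, weaker
# across-layer strength at 90 degrees. Both values are assumptions.
def failure_load(angle_deg: float,
                 in_plane: float = 500.0,
                 across_layer: float = 350.0) -> float:
    t = math.sin(math.radians(angle_deg)) ** 2
    return (1 - t) * in_plane + t * across_layer

profile = {angle: failure_load(angle) for angle in (0, 30, 60, 90)}
weakest = min(profile, key=profile.get)
print(profile, "weakest orientation:", weakest, "degrees")
```

The per-orientation failure loads form the risk profile the paragraph describes: where the structure is most at risk and how much margin over the requirement remains in each direction.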
Again, the virtual testing can be performed until structural failure occurs with each DTI. This will establish the safety factor of such failures. Currently, the determination of the nonconformance of an AM build is to capture in situ monitoring data and attempt to characterize the build to other successful and unsuccessful ones []. Virtual testing of each specific DTI should be far more reliable than correlations.
We will also be able to use data-based DTC over time. With a voxel-by-voxel DTI, we will be able to use AI/ML to assess which patterns of structural anomalies are benign and which result in future failures. Since this is unknown territory now, it would be useful to start collecting this data as soon as possible.
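The data-based assessment might eventually look like the following toy sketch, which "learns" which anomaly patterns correlate with failure from synthetic labels. A production system would instead train on accumulated DTA records of real products; the features, the labeling rule, and the nearest-centroid classifier are all purely illustrative assumptions.

```python
import numpy as np

# Synthetic stand-in for accumulated DTA records. Each DTI gets two
# features: [anomaly count, largest anomaly cluster size], both normalized.
rng = np.random.default_rng(1)
features = rng.uniform(0.0, 1.0, size=(200, 2))
# Invented ground truth: parts with a large anomaly cluster later fail.
failed = (features[:, 1] > 0.7).astype(int)

# Nearest-centroid classifier -- the simplest possible pattern learner.
centroid_ok = features[failed == 0].mean(axis=0)
centroid_bad = features[failed == 1].mean(axis=0)

def predict(x):
    return int(np.linalg.norm(x - centroid_bad) < np.linalg.norm(x - centroid_ok))

accuracy = np.mean([predict(x) == y for x, y in zip(features, failed)])
print(f"training accuracy of the toy classifier: {accuracy:.0%}")
```

Even this crude learner separates benign from failure-prone patterns once labeled examples exist, which is why the paragraph argues for collecting the data as soon as possible.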
This is an issue with the assessment of AM structures today. Concern about the ability to use AM in safety-rated products is a barrier to AM use. The attempted mitigation is to use NDT techniques such as CT scanning. However, this is only a visual inspection technique. The widespread adoption of AM could be enabled and accelerated by DTC capability.
To add more complexity to this, the development of new and unique materials is being investigated for AM. In order to increase a product’s fit for purpose, one area of research is Integrated Computational Materials Engineering (ICME) []. The intent would be to design custom materials that then would be additively manufactured. This would mean that the VDT would need to incorporate the ICME specific characteristics into its testing.

6. Research Needed

In order to realize DTC, research and advancements are needed in a number of areas.
  • The virtual testing simulations will need to mirror the physical tests. We will need a high degree of confidence that if we test the DTI structure to failure, a physical test will produce the same results within an acceptable margin of error. For AM, we will need to incorporate ICME capability. This will need to be done for all the different virtual tests that we intend to rely on. How such virtual tests can be qualified so that all parties have confidence in relying on their results is a research area of significant interest.
  • For AM, we will need in situ monitoring capabilities that will allow us to capture anomalies during the process and create DTIs that will be reliable in virtual testing.
  • There is much work needed in developing the fusion of physics-based and data-driven hybrid models.
Advances in these areas will require substantial computing capabilities to be fully realized. Based on Moore’s Law, I have been estimating that there is a tremendous amount of new computing capability that will be available in the not-too-distant future [] (I recently was an invited speaker at a researchers’ conference of a top semiconductor manufacturing company. As a provider of the machines that make semiconductors, they are not as optimistic about the continuation of Moore’s Law as it relates to transistor density. However, there is optimism that overall computing capability will still increase significantly). These estimates do not include the availability of quantum computing. We should be planning our capabilities now to take advantage of those increases.

7. Conclusions

DTC does not by itself ensure that the product will meet the perceived value the customers expect of their products. As the PLQ Model illustrates, all the manufacturer’s organizational functions need to populate and consume the appropriate data and information. Marketing needs to have accurately captured the customer’s requirements. The design and engineering function needs to produce those requirements in material, geometry, and behaviors in a virtual product that they rigorously test. The product needs to be properly supported and repaired quickly and efficiently, ideally predicting failures before they occur.
However, the critical phase in this lifecycle is manufacturing, which takes the virtual product from the creation phase and produces multiple instances of physical ones that will never be exactly the same in the physical world. Virtually testing these unique products will give us a high level of confidence that they will perform to the user’s expectations. This will be of special importance as we move from traditional or subtractive manufacturing to additive manufacturing. In fact, it may be the only way that we can have assurance that additive manufacturing is producing products that will perform to the perceived value of the customer.

Funding

This research received no external funding.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Schiffauerova, A.; Thomson, V. A review of research on cost of quality models and best practices. Int. J. Qual. Reliab. Manag. 2006, 23, 647–669. [Google Scholar] [CrossRef]
  2. Plewa, M.; Kaiser, G.; Hartmann, E. Is quality still free? Empirical evidence on quality cost in modern manufacturing. Int. J. Qual. Reliab. Manag. 2016, 33, 1270–1285. [Google Scholar] [CrossRef]
  3. ASQ. Cost of Quality (COQ). Available online: https://asq.org/quality-resources/cost-of-quality (accessed on 15 May 2023).
  4. Juran, J.M.; Godfrey, A.B. Juran’s Quality Handbook, 5th ed.; McGraw Hill: New York, NY, USA, 1999; p. 1v. [Google Scholar]
  5. Juran, J.M. A History of Managing for Quality: The Evolution, Trends, and Future Directions of Managing for Quality; ASQC Quality Press: Milwaukee, WI, USA, 1995; Volume xv, 688p. [Google Scholar]
  6. Grieves, M. Product Lifecycle Quality (PLQ): A Framework within Product Lifecycle Management (PLM) for Achieving Product Quality. Int. J. Manuf. Technol. Manag. 2010, 19, 180–190. [Google Scholar] [CrossRef]
  7. Grieves, M. Product Lifecycle Management: Driving the Next Generation of Lean Thinking; McGraw-Hill: New York, NY, USA, 2006; p. 319. [Google Scholar]
  8. Grieves, M. Virtually Perfect: Driving Innovative and Lean Products through Product Lifecycle Management; Space Coast Press: Cocoa Beach, FL, USA, 2011. [Google Scholar]
  9. Grieves, M. Don’t ‘Twin’ Digital Twins and Simulations. EE Times, 2022. Available online: https://www.eetimes.com/dont-twin-digital-twins-and-simulations/ (accessed on 8 March 2023).
  10. Grieves, M. Completing the Cycle: Using PLM Information in the Sales and Service Functions [Slides]; SME Management Forum: Troy, NY, USA, 2002. [Google Scholar]
  11. Grieves, M.; Vickers, J. Digital Twin: Mitigating Unpredictable, Undesirable Emergent Behavior in Complex Systems. In Trans-Disciplinary Perspectives on System Complexity; Kahlen, F.-J., Flumerfelt, S., Alves, A., Eds.; Springer: Berlin/Heidelberg, Germany, 2017; pp. 85–114. [Google Scholar]
  12. MIT Technology Review Insights. In The Emergent Industrial Metaverse; MIT Technology Review: Cambridge, MA, USA, 2023; Available online: https://wp.technologyreview.com/wp-content/uploads/2023/03/MITTR_Siemens_The-Emergent-Industrial-Metaverse.pdf (accessed on 8 March 2023).
  13. Grieves, M. Intelligent digital twins and the development and management of complex systems. Digital Twin 2022, 2, 8. [Google Scholar] [CrossRef]
  14. Kapusuzoglu, B.; Mahadevan, S. Information fusion and machine learning for sensitivity analysis using physics knowledge and experimental data. Reliab. Eng. Syst. Saf. 2021, 214, 107712. [Google Scholar] [CrossRef]
  15. Hughes, W.; Zhang, W.; Cerrai, D.; Bagtzoglou, A.; Wanik, D.; Anagnostou, E. A hybrid physics-based and data-driven model for power distribution system infrastructure hardening and outage simulation. Reliab. Eng. Syst. Saf. 2022, 225, 108628. [Google Scholar] [CrossRef]
  16. Guo, Y.; Klink, A.; Bartolo, P.; Guo, W.G. Digital twins for electro-physical, chemical, and photonic processes. CIRP Ann. Manuf. Technol. Forthcom. 2023. [Google Scholar] [CrossRef]
  17. NASA. NASA Engineers Break SLS Test Tank on Purpose to Test Extreme Limits. Available online: https://www.nasa.gov/exploration/systems/sls/nasa-engineers-break-sls-test-tank-on-purpose-to-test-extreme-limits.html (accessed on 16 July 2023).
  18. Vickers, J. (NASA, Washington, DC, USA). Personal Communications, May 2023.
  19. Mahadevan, S.; Nath, P.; Hu, Z. Uncertainty quantification for additive manufacturing process improvement: Recent advances. ASCE-ASME J. Risk Uncertain. Eng. Syst. Part B Mech. Eng. 2022, 8, 010801. [Google Scholar] [CrossRef]
  20. Chen, Z.; Han, C.; Gao, M.; Kandukuri, S.Y.; Zhou, K. A review on qualification and certification for metal additive manufacturing. Virtual Phys. Prototyp. 2022, 17, 382–405. [Google Scholar] [CrossRef]
  21. Grieves, M. The Future of Additive Manufacturing. Dassaut Syst. 2020. Available online: https://www.youtube.com/watch?v=jPVQfiME6AM&t=1s (accessed on 16 July 2023).
  22. Sahar, T.; Rauf, M.; Murtaza, A.; Khan, L.A.; Ayub, H.; Jameel, S.M.; Ahad, I.U. Anomaly detection in laser powder bed fusion using machine learning: A review. Results Eng. 2022, 17, 100803. [Google Scholar] [CrossRef]
  23. Wang, W.Y.; Li, J.; Liu, W.; Liu, Z.-K. Integrated computational materials engineering for advanced materials: A brief review. Comput. Mater. Sci. 2019, 158, 42–48. [Google Scholar] [CrossRef]
  24. Mackenzie, S. Dr. Michael Grieves with the Digital Twin Institute; Mackenzie, S., Ed.; Industrial Talk: Barcelona, Spain.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
