Article

PADDME—Process Analysis for Digital Development in Mechanical Engineering

1 Engineering Design, Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU), Martensstr. 9, 91058 Erlangen, Germany
2 PSW Automotive Engineering GmbH, Carl-Benz-Ring 7, 85080 Gaimersheim, Germany
* Author to whom correspondence should be addressed.
Processes 2024, 12(1), 173; https://doi.org/10.3390/pr12010173
Submission received: 19 October 2023 / Revised: 21 December 2023 / Accepted: 9 January 2024 / Published: 11 January 2024
(This article belongs to the Special Issue Modeling, Simulation, Control, and Optimization of Processes)

Abstract:
Design processes are always in motion, since more and more data-driven methods are used for various design and validation tasks. However, small and medium enterprises especially struggle with enhancing their processes with data-driven methods due to a lack of practical and easy-to-use analysis and redesign methods which can handle design process characteristics. In this paper, we present PADDME, which stands for process analysis for digital development in mechanical engineering, as a novel method that, in contrast to currently available analysis methods, considers those design process characteristics with respect to the integration of data-driven methods. Furthermore, a novel technology-readiness framework for digital engineering is introduced. Using the PADDME method, an industrial case study on introducing data-driven methods into the design and evaluation process chain is presented. The usability and novelty of the method are shown by the case study. Thus, PADDME allows a detailed capturing of current design processes and paves the way for process optimisation through data-driven methods. PADDME is a valuable method for advancing digital mechanical engineering processes in small and medium enterprises, and future work will focus on refining and expanding its application and evaluation.

Graphical Abstract

1. Introduction

In the dynamic landscape of modern product development, where interconnected systems, changing work environments, and emerging technologies shape the industry, the need for adaptation in traditional processes is paramount [1,2]. As a result, traditional product development processes need to adapt to this new reality [3]. This evolution presents challenges and opportunities for the industry, demanding a seamless integration of digital engineering and data-driven methods.
These product development processes can be viewed as a continuous problem-solving cycle, guiding the transformation from unresolved requirements to the final product [3]. However, many existing methodological approaches in this domain focus primarily on either mechanical [4,5] or mechatronic [6,7] product development processes, often overlooking the industrial perspective. Industrial product development is typically milestone-driven, emphasising deliverables and tight timeframes.
Currently, product development processes heavily rely on virtual engineering techniques, such as computer-aided design (CAD) and computer-aided engineering (CAE), which excel at processing and generating data. However, they fall short in terms of interpreting and evaluating the data. The real value lies not just in the data itself but in the knowledge it holds [8]. In the past, gathering knowledge and data was a painstaking process, involving expert interviews and elaborate experiments [9]. This data collection was systematised and automated through knowledge-based engineering (KBE).
Recently, technological barriers to data collection and analysis have diminished, aided by increased computational power. These advancements have significantly reduced the obstacles to digitalisation, opening the door to various digital applications that can support designers throughout the entire product development process [2,10]. Leveraging data-driven methods can enhance efficiency and error avoidance. Data-driven methods allow product developers to extract valuable information from diverse data sources, supporting informed and partially autonomous decision making [9].
To fully realise the potential of these new digital engineering and data-driven methods, they must be seamlessly integrated into existing development processes. These processes need to have a certain technology readiness to ensure the applicability of the new methods. This transformation not only unlocks hidden potential but also shifts virtual product development towards comprehensive digital engineering.
The primary objectives of this work are to address the following research questions:
1. How can a method for process analysis in product development be designed to facilitate cost-effective process optimisation for digital engineering?
2. In what way can the technology-readiness level for digital engineering methods be measured based on a process analysis?
While the research questions posed in this study focus on process analysis and technology readiness in digital engineering, the underlying problem extends to the broader context of enhancing efficiency, mitigating errors, and realising the full potential of comprehensive digital engineering practices. The inability of existing methodologies to adequately address these challenges highlights the importance of this research.
Within the scientific discipline of design process management, this research advances our understanding of the digital readiness of design processes and process evaluation regarding the integration of digital engineering methods, providing a foundation for future studies at the intersection of product development, digital engineering, and process management. These contributions go beyond the existing literature by addressing gaps in current methodologies, as shown in Section 2.2. Furthermore, current methods for integrating digital engineering methods mainly address the implementation work while disregarding the important process analysis part.
To achieve this, the required foundational knowledge is provided in Section 2. Subsequently, in Section 3, the purpose and scope of the work are outlined. This is followed by the introduction of the developed PADDME method and the technology-readiness framework for digital engineering in Section 4. The application of the method in an industrial context, including an evaluation, is presented in Section 5. Finally, the work concludes with a discussion of potential future research.

2. Materials and Methods

In order to be able to integrate digital engineering methods into existing product development processes, the basic concepts and methods of digital engineering are introduced. Afterwards, the fundamentals of design process management are explained.

2.1. Digital Engineering

Several definitions of digital engineering exist, for example, in [11,12,13,14]. These definitions emphasise the continuous use of digital methods and tools, where knowledge generated in later product life-cycle phases is leveraged for optimisation and development based on existing data [15]. This contribution follows the definition given by [13]. Thus, digital engineering is the consistent knowledge and information extraction, using data-driven methods, from data generated in design, testing, or operation and the usage of these insights during engineering processes. Figure 1 gives an overview of relevant terms and methods. The available methods are subsumed within the overall terms data mining and machine learning.
According to Fayyad et al. [17], data mining is the application of specific algorithms to extract patterns from data. Its main applications are regression, classification, and clustering tasks. For machine learning, however, no single clear definition exists. Samuel [18] defines it as enabling computers to learn a task without being explicitly programmed to perform it. Furthermore, adaptation to new and unknown data is possible [19]. In design processes, machine learning is defined more specifically and is used for knowledge extraction and decision making [20]. Both categories enable the support of different development tasks with individual objectives [15]. Several explicit use cases for data-driven methods have been reported in design processes [21,22], as well as overview papers on the early phases [20] or mechatronic product development [23,24].
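To make the distinction concrete, the following minimal Python sketch illustrates a regression task of the kind referenced above; the design parameters, the target quantity, and the synthetic data are purely illustrative assumptions and are not taken from the cited studies.

```python
# Minimal sketch of a data-driven regression task in design (illustrative only).
# The feature names and the data are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)
# Hypothetical design parameters: wall thickness, rib count, material index
X = rng.uniform([1.0, 2, 0], [5.0, 10, 3], size=(200, 3))
# Hypothetical target: maximum deformation taken from earlier simulations
y = 10.0 / X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.2, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
print("R^2 on held-out designs:", round(r2_score(y_test, model.predict(X_test)), 3))
```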
Two common frameworks for implementing data-driven methods are KDD (knowledge discovery in databases) [25] and CRISP-DM (cross-industry standard process for data mining) [26], which are further detailed in Appendix A.1 and Appendix A.2, respectively.
To integrate data-driven methods in established development processes, some prerequisites have to be met. Mehlstäubl et al. [27] identify 19 aspects, divided into four clusters, which can serve as levers for successful method integration. The four clusters are the management and business view, the data view, the method development view, and the method user view. From the management view, the need for integrating data-driven methods has to be present and the necessary qualifications and capacity must be provided. Data in the process must be available in a digital format and, in the optimal case, consistent through different phases. Furthermore, the data storage is highly relevant for the application of data-driven methods, since the data needs to be accessed and evaluated. For the modelling phase, the process needs to be tool-supported and the required computing power has to be provided. Lastly, the employees have to use the developed tools for the evaluation.

2.2. Design Process Management

In the industrial context, several types of processes exist, such as manufacturing, controlling, or management processes. According to ISO 9001 [28] (the standard for quality management systems), a process is defined as a set of interrelated or mutually influencing activities that uses specific inputs to achieve a predetermined result. In contrast, a business process is a sequence of tasks or activities that spans several organisational units and pursues the general business strategy [29]. Following Scheer [30], a business process is the model-like description of the functions to be carried out in a company, including their content and temporal dependencies. In the opinion of the authors, business processes are influenced by controlling and supporting tasks, which cannot directly be mapped to inputs or outputs, as visualised in Figure 2. To support or automate a process step or a whole process using data-driven methods, all influencing factors—input, output, control, and support—have to be known and digitally available.
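A possible machine-readable representation of this view is sketched below; the class and field names are our own illustrative choices and are not part of ISO 9001 or any of the cited modelling notations.

```python
# Sketch of a process step with the four influencing factors (input, output,
# control, support) and their digital availability; names are illustrative.
from dataclasses import dataclass, field

@dataclass
class Artefact:
    name: str
    digital: bool = True          # available in a digital format?

@dataclass
class ProcessStep:
    name: str
    inputs: list[Artefact] = field(default_factory=list)
    outputs: list[Artefact] = field(default_factory=list)
    controls: list[Artefact] = field(default_factory=list)   # e.g. design guidelines
    supports: list[Artefact] = field(default_factory=list)   # e.g. CAD tool, expert knowledge

    def fully_digital(self) -> bool:
        """True if every influencing factor is digitally available."""
        factors = self.inputs + self.outputs + self.controls + self.supports
        return all(a.digital for a in factors)

step = ProcessStep(
    name="Create concept CAD model",
    inputs=[Artefact("requirements list")],
    outputs=[Artefact("CAD model")],
    controls=[Artefact("design guideline", digital=False)],
    supports=[Artefact("CAD system")],
)
print(step.name, "fully digital:", step.fully_digital())
```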
Business process management (BPM) sustainably pursues the goal of effective and efficient business processes. Several methods for BPM are available in the literature and industrial practice [32]. BPM can be divided into four main areas: process organisation, controlling, management, and optimisation [33]. Since the integration of data-driven methods can be seen as a process optimisation or a business process re-engineering (BPR), the focus in the following is on those methods. BPR is a concept that is detached from previous structures and fundamentally scrutinises the processes [34]. One core idea of BPR is the support of the new processes with IT systems [35]. A less drastic approach is process optimisation, whose main goal is an increase in efficiency [33]. Process optimisation typically follows the plan–do–check–act (PDCA) cycle. The most prominent representatives of this approach are total cycle time (TCT) [36], Kaizen [37], and Six Sigma [38].
All of these optimisation approaches are based on an initial process analysis and capture to transfer real processes into process models. This is necessary since processes are not directly tangible and a model is needed to interact with them [39]. Depending on the target of the analysis, several types of models are available according to Stacey et al. [39]. Possible targets are visualisation, planning, control, and development. To transform processes into process models, process discovery is used, in which the focus lies on information capture [29]. There are different discovery models [40,41,42], which roughly follow a similar structure [43]. After an initial definition of the setting, level of detail, and boundaries, the information capture is performed. Thereafter, the gathered information is transformed into a process model and finally validated.
Process discovery is only the first step of process analysis. Subsequently, the process model has to be visualised. The two most common categories of methods are data-flow-oriented [44] and control-flow-oriented [45] methods. An in-depth analysis of all of the available approaches would go beyond the scope of this article. Therefore, they are summarised in Figure 3.
An example of a data-flow-oriented method is integration definition for function modelling (IDEF) diagrams [31]. The unified modelling language (UML) [46] provides a general-purpose modelling language. The business process model and notation (BPMN) is a graphical notation for representing business processes in a control-flow-oriented manner [47]. Event-driven process chains (EPCs) [48] are a business modelling technique for representing sequences of events and functions within a process. Based on EPCs, the architecture of integrated information systems (ARIS) was developed [30] to support different views. The lean management technique value stream mapping (VSM) is used to analyse and improve the flow of materials and information in a process and is mostly used in production contexts [49]. Data-flow-oriented methods can document data but have weaknesses regarding control and process flows. Therefore, several control-flow-oriented methods were developed; within those methods, however, data can barely be taken into account [50]. Combinations of the approaches are possible as well, to mitigate the weaknesses of individual methods [51].
A common characteristic of all the introduced approaches is that they were developed for classical processes in management or production. To deal with the characteristics of product development mentioned above, at least 23 different modelling approaches are available [52]. One of the best-known approaches is the design structure matrix (DSM), developed by Eppinger and Browning [53]. Another possible way of managing engineering activities is the engineering operating system [3]. Two excellent reviews of product development process-modelling approaches are given by Smith and Morrow [54] and Wynn and Clarkson [52]. Kossak et al. [55] compare the available modelling languages with respect to the relevant aspects and their integration, while Trauer et al. [51] present criteria for selecting a suitable method for the individual use case.
Business processes can be evaluated qualitatively and quantitatively. Qualitative process analysis asks about process capability and the presence of certain process characteristics, while quantitative evaluation measures process performance.
Probably the best-known qualitative analysis of business processes is the maturity model. An example is the 20-keys system, which allows companies to evaluate the implementation status of lean production [56]. The European Foundation for Quality Management (EFQM) model is also a qualitative analysis tool for process evaluation and is based on the total quality management movement [57]. In order to assess the current state of development in digital transformation, there are also maturity models that specifically query the state of digitalisation. Some of these maturity models record the development status of the entire company [58,59,60], whereas another specifically records the status of the processes [61].
For the quantitative evaluation of actual processes, the processes are first recorded structurally and then evaluated using ex-ante process analysis techniques. Quantitative process analysis is an important tool for assessing the plausibility of qualitative considerations through calculations, simulations, or models with quantitative figures. The particularly well-known process indicators should be mentioned as examples; they are multifunctional and indispensable for understanding company-related thinking and actions [62].
Important and relevant parameters are shown in Figure 4. These can be recorded in a process analysis in the operational or administrative environment.

Product Development Characteristics

The aforementioned processes from manufacturing, controlling, or management are mostly fixed and reproducible, with defined outputs [63]. In contrast, engineering design processes differ from those process types [64]. Gonnet et al. [65] confirm this statement and argue that design processes are mostly vaguely defined. Innovation and creativity depend essentially on the environment, the working climate, and the level of qualification of the employees. Furthermore, even within one company, product developers use different technologies and program versions in their processes. A wide variety of tools are in use, from classic CAx systems, virtual reality, and EDM/PDM systems to office and internet applications, depending on personal (or company) preference. Due to this diversity, the available possibilities are often not fully exploited. Employees are faced with the challenge of finding the most suitable system for the development task and the corresponding information, knowledge, and documents at the right time, in the right quality, and at the right place. The non-transparent flow of information also makes it difficult to grasp the current status of the development. Additionally, the following characteristics can be found:
  • Design processes are highly dynamic and creative [63].
  • Results are known in substance but are not finally defined [63,64].
  • Changing product requirements or boundary conditions [64].
  • Every design process differs, since a unique product, not existing at the beginning, is designed [66].
  • The process is highly problem-driven and generates new knowledge [67].
  • Shared information is not taken into account [3].

3. Purpose and Scope

In this section, the research gap and the novelty of the developed method are elaborated. The potentials of digital engineering are undisputed. Unfortunately, they are not yet widespread in industrial practice. One possible reason is the lack of sufficient process analysis approaches focusing on the aspects crucial for method integration and on the identification of beneficial use cases for data-driven methods [43]. Additionally, the flows of data and information in existing processes lack adequate representation in established documentation approaches [67]. Therefore, this data and information documentation is not sufficient to analyse the processes with respect to the application of data-driven methods, since data are the critical basis for these methods. The transformation crucially relies on the capture of existing processes and the analysis of their bottlenecks, as well as of beneficial use cases for data-driven methods [43]. Summarising the mentioned characteristics of design processes, established process management tools are not directly applicable to design processes [64]. In particular, the high abstraction level of established methods results in disadvantages in the documentation of information flows [67]. Often, only development results are captured, but not the development procedure [65]. Furthermore, self-developed tools, used information, and data which are archived but not further used are also not captured [43]. Lastly, most existing methods only focus on capturing the process. The decision as to whether a process is already sufficiently qualified for the integration of digital engineering methods, and where in the process these can be integrated, is based exclusively on experience. A framework for process evaluation and the subsequent identification of digital engineering potentials is missing. Since the integration of these new techniques in existing design processes offers great potential, an appropriate method is needed.
The proposed method introduces a novel approach for analysing and evaluating design processes, with several innovative aspects at its core. First and foremost, it empowers small and medium enterprises to assess their current level of digitalisation in design processes. Additionally, it pinpoints specific subtasks within the design process where data-driven methods can be instrumental in assisting designers. This diagnostic capability is particularly valuable as it helps organisations understand whether they need to make further process improvements before integrating data-driven methods effectively. This technology-readiness level outperforms the methods currently available for design processes. Given the inherent challenges related to the intangible and model-based nature of design processes [39,67], a fully holistic capturing is not possible. However, the presented approach captures not only the procedural steps but also the software tools, information sources, and archived data fragments. This comprehensive perspective includes well-established aspects such as the final results, the team members involved, and the timeframes. The method excels in providing a detailed overview of the current design processes, with a specific focus on the data generated and the corresponding data types. It generates a structured representation that meticulously documents the essential components: procedural steps, software tools, data sources, and data types. This documentation serves as a foundation for process optimisation using data-driven methods. Ultimately, the primary objective of this method is to facilitate the evaluation of the design process, emphasising its optimisation by incorporating data-driven techniques. In doing so, it addresses a central and overarching goal, making it a distinctive and valuable innovation in the field.

4. Methodological Approach

The aim of this section is to create a formal method for design process analysis, focusing on the integration of data-driven methods into these processes. Therefore, an initial use case is presented before the requirements for process analysis in design for small and medium enterprises are introduced. Afterwards, the novel analysis method PADDME is shown, followed by an explanation of the developed evaluation framework.

4.1. Use Case: Integrating Data-Driven Methods into Product Development Processes

In product development, it is crucial to prioritise data analysis and uncover hidden correlations that may not be immediately apparent. Fortunately, there are powerful evaluation methods available, as mentioned earlier. Understanding the capabilities of these methods and finding a suitable use case for initial implementation is essential. Once identified, companies can integrate these new methods into their product development process. Figure 5 provides a summary of the principal steps to transform established design processes to digital design processes.

4.2. Requirements for Process Analysis in Design Departments of Small and Medium Enterprises

Small and medium enterprises mostly focus on slightly different aspects than large companies when aiming for process optimisation. To consider those aspects, requirements for process analysis in small and medium enterprise design departments are analysed in a first step. For conventional processes, such requirements are defined by Becker, Rosemann, and von Uthmann [68] in their principles of process modelling. In addition, the design process characteristics described above have to be met.
The following requirements are defined:
  • the method must be economical [68];
  • correct representation of reality [68];
  • comparability of different captures [68];
  • consideration of knowledge, information, and data as well as their storage location [67];
  • consideration of the variability in design processes [43].
The aspect of the economic evaluation of process capture in design was analysed in more detail as part of an industrial survey. In our study, eight industrial partners were asked to take part in a survey on the platform Unipark, in which different requirements were weighed against each other in order to obtain a ranking.
The participants were asked to rank the five aspects of quality, required time, cost, quantifiability, and consistency. Rank five denotes the most important aspect, while rank one denotes the least important. The average rank of the individual aspects is visualised in Figure 6. It shows that the most important aspect is low cost, directly followed by the time required to perform the process capture. In addition to the ranking, the participants reported a high relevance of comprehensibility, easy usage, and compatibility with data protection rules.

4.3. Analysis Method—PADDME

In the following, the methodical approach PADDME—process analysis for digital development in mechanical engineering [69]—is introduced in detail. The whole process is shown in Figure 7. The proposed method represents a further development of the basic capture concept presented by Gerschütz et al. [43,69]. While the previous work focused on capturing the process, the following steps present the overall method with a focus on the analysis and evaluation of the design process and the necessary preparation steps to realise pilot design processes supported by data-driven methods.
To realise a systematic analysis of design processes with respect to digital technology transformation, a reproducible meta-model is necessary. The method consists of five successive phases, where the order is not fixed and loops and feedbacks are allowed. The five phases are:
0. Preparation;
1. Process capturing;
2. Process evaluation;
3. Potential analysis;
4. Process redesign and integration of data-driven methods.
Each phase ends with defined results. These are to be completed before moving to the next phase. In the following subsections, the individual phases are introduced and explained. For each phase, a description of the central goal and results is presented, followed by the underlying methods and, if applicable, the tools used.

4.3.1. Phase 0: Preparation

At the beginning of each PADDME process optimisation, some general preparations have to be performed. For design process optimisation, it is useful to check the goals of the company against the goals of the process optimisation for compatibility. To measure the success of the design process redesign, optimisation targets should be set according to the company goals.

Goal and Results

The central goal of this phase is the evaluation of the initial digitalisation level of the company as well as an identification of processes which are promising for digital engineering integration.

Methods

At first, an overall process structure plan should be created if none is available in the company yet. In this plan, all central processes within the company, as well as those reaching out to other organisations, are documented. Based on this process structure plan, the chosen design process can easily be set into a bigger context. Although PADDME focuses mainly on product design, processes influencing or depending on product development should be added to the structure plan as well. In addition to start and end events, all relevant organisational units have to be identified during the preparation phase. Best and Weth [42] emphasise the relevance of this step to prevent structural mistakes.
The last task of the preparation phase is to evaluate the current level of digitalisation of the company. In order to integrate data-driven methods into design processes, the company should have established several digitalisation prerequisites. Transformation models allow a description of the current digitalisation state in an efficient and comparable way. When using these models, their purpose and meaning must be explicitly emphasised in order to avoid misunderstandings. When critically examining the term transformation model, especially in the context of digital transformation, it may be more economical for companies not to aim for the highest maturity level of the model. The high investment costs for (further) digitalisation may not make economic sense for some processes due to low process volumes.

4.3.2. Phase 1: Process Capturing

In the first phase of the PADDME method, the current conditions in product development are captured. Therefore, an in-depth process capturing of the design process is executed.

Goal and Results

At the end of this phase, a process documentation of all relevant processes is available. The documentation focuses on process steps and employees as well as the used and generated data.

Methods

The methods used in this phase can be divided into two main subgoals: process capturing and process documentation.
Process capturing: The capturing is performed using the three-pillar method presented by Gerschütz et al. [43] and shown in Figure 8.
The key advantage of this approach is the separate view of the real process through the employee view and of the planned process through the management view. These two views open the opportunity to identify optimisation potentials with a target–performance comparison. The focus on the single tasks as well as on the included data enables the capture of the relevant design process aspects. Both views are captured through semi-structured interviews. This method is abstract enough to support the problem-driven and dynamic characteristics of design processes. By providing guidelines, the interviews are comparable but also allow reacting to individual particularities. During the interviews, the interviewer should ensure they capture the important aspects of the process as well as the influence of changing product requirements and boundary conditions.
Process documentation: During the process interviews, the process is visualised by the process engineer. For this purpose, a combination of the business process model and notation (BPMN) and the value-stream method is used. This combination is necessary since in classical BPMN, data and information flows are under-represented or fully missing, which is not the case in the value-stream method. The value-stream method, however, lacks sufficient process documentation. By combining both methods, all aspects of product design processes relevant to the integration of data-driven methods can be documented.
The combined method should represent all of the captured process aspects in a clear and understandable way. The developed representation is shown in Figure 9.
At the top, the established BPMN flowchart is shown. The bottom part, with the orange swim lanes, shows the value-stream part of the documentation. These swim lanes allow the modelling of different storage pools, no matter whether they are analogue, digital, or network-based. Furthermore, a distinction between data, information, and knowledge is made. Data refers to every artefact which is generated or changed during the process. Information comprises all aspects which are needed during the process, such as simulation guidelines, while the term knowledge comprises the experience of the employees. For every storage pool, one swim lane is generated. The arrows pointing from the individual process steps to a data pool symbolise the generation of data. If the arrow points from the data pool to a process step, the data are used or needed by the process step. It is important to realise that the data pools represent knowledge as well as information. For example, if an employee needs information from another employee, this would be represented as an arrow from a data fragment in the employee swim lane to the respective process step. Since the basic notation of the approach is well known, we do not explain every element in detail, but reference material (http://www.bpmb.de/images/BPMN2_0_Poster_EN.pdf (accessed on 7 January 2024)) is available.
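For readers who want to process such documentation further, the following Python sketch shows one possible data model for the combined notation, with tasks, tools, storage pools, and "generates"/"uses" arrows; the names and structure are illustrative assumptions, not the format used by the tooling described below.

```python
# Illustrative data model for the combined BPMN/value-stream documentation:
# tasks with tools, storage pools per swim lane, and data/information/knowledge
# fragments linked by "generates" and "uses" arrows. Names are our own sketch.
from dataclasses import dataclass, field
from enum import Enum

class FragmentKind(Enum):
    DATA = "data"            # artefacts generated or changed in the process
    INFORMATION = "info"     # e.g. simulation guidelines
    KNOWLEDGE = "knowledge"  # experience of the employees

@dataclass
class Fragment:
    name: str
    kind: FragmentKind
    pool: str                # storage pool / swim lane (analogue, network drive, PDM, employee, ...)
    data_format: str = ""    # e.g. "CAD (.prt)", "e-mail"
    version: int = 1

@dataclass
class Task:
    name: str
    tool: str = ""                                            # program used in this step
    uses: list[Fragment] = field(default_factory=list)        # arrows: pool -> task
    generates: list[Fragment] = field(default_factory=list)   # arrows: task -> pool

guideline = Fragment("simulation guideline", FragmentKind.INFORMATION, pool="network drive", data_format="PDF")
cad_model = Fragment("bracket model", FragmentKind.DATA, pool="PDM system", data_format="CAD (.prt)")
task = Task("Adapt bracket geometry", tool="CAD system", uses=[guideline], generates=[cad_model])
print(f"{task.name}: uses {len(task.uses)} fragment(s), generates {len(task.generates)}")
```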

Tools

The documentation and visualisation is realised in the program Camunda Modeler (https://camunda.com/de/products/camunda-platform/modeler/, (accessed on 7 January 2024)). Camunda is a BPMN modelling tool that, unlike others, is adaptable and customisable. This allows the implementation of a plug-in, enabling functions needed for the PADDME representation.
The following additions have been implemented:
  • In communication tasks, the medium of the messages can be set (e.g., e-mail, paper, or by voice).
  • In normal tasks, the used programs and tools are added to the context menu.
  • Approvals have also been realised by adapting the existing task template using custom fields for sender, receiver, and approval information.
  • The data and information fragments are extended by the data format and version. If a task has the same input and output file, the version is incremented.

Intermediate Summary

After this first phase, insight into two central aspects is generated. On the one hand, the company acquires an in-depth look into its state of digitalisation; on the other hand, potential product development processes are captured, which can be further analysed and optimised in the subsequent phases.

4.3.3. Phase 2: Process Evaluation

After the capture of the actual design processes, a detailed analysis of the data is performed in the second phase of the PADDME method.

Goal and Results

After this phase, an in-depth evaluation of the process, especially with respect to bottlenecks, is available. The goal of this evaluation is to identify bottlenecks in the design process which can be optimised with data-driven methods.

Methods

At first, the decomposition of the design process into sub-processes and iterations is performed as shown in Figure 10.
This split enables the identification of weak spots on all process levels. With this horizontal subdivision, a structured dissection is generated which fits the borders of the super-ordinate processes at all levels. To identify sub-processes, the following criteria are used [29]:
  • accrual of services, e.g., changing responsibility;
  • defined output of a sub-process;
  • distinct requirement profile or client–contractor relationship;
  • defined individual-provided resources;
  • autonomy with respect to subsequent units;
  • performance goals for specified sub-processes.
After process decomposition, every level of detail is evaluated independently to generate a digital-engineering-technology-readiness level of the process step. Since this section mainly focuses on the overall procedure, the evaluation criteria are introduced in Section 4.4.
With the catalogue of criteria created, small and medium enterprises can evaluate their design processes independently. The maturity model of the business processes for the use of data-driven methods is supported by quantitative key figures for process evaluation. The result of the evaluation is shown in a spider diagram, which is common for maturity models (see Figure 11). This provides a visual representation of the current state of digital transformation of the business processes.
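Such a spider diagram can be generated directly from the per-dimension levels; the following sketch uses matplotlib with invented example scores (not case-study results).

```python
# Minimal radar (spider) chart of the technology-readiness levels (1-5) per
# dimension; the example scores are invented placeholders, not case-study data.
import numpy as np
import matplotlib.pyplot as plt

dimensions = ["Technology", "Data", "Quality", "Organisation"]
levels = [3, 2, 4, 3]  # illustrative scores on the five-level scale

angles = np.linspace(0, 2 * np.pi, len(dimensions), endpoint=False).tolist()
# close the polygon by repeating the first value
angles += angles[:1]
values = levels + levels[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, values, marker="o")
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(dimensions)
ax.set_ylim(0, 5)
ax.set_title("Technology-readiness profile (illustrative)")
plt.savefig("readiness_profile.png", dpi=150)
```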
The identification of bottlenecks is performed using a top-down approach, from the business process level to the sub-process, process step, and work step level, as visualised in Figure 10.
The decomposition also allows a comparison of the identified process components (see Figure 12). Each sub-process or iteration loop can be evaluated in terms of its performance relative to the other sub-processes. The evaluation and identification of weaknesses is carried out by a process engineer. In this way, weaknesses can be identified and compared across the entire process flow. This gives companies the opportunity to investigate root causes and close gaps in the process flow for a successful digital transformation.

Intermediate Summary

This phase allows weak points to be identified and compared across the entire process flow using a technology-readiness-level system. Furthermore, companies have the opportunity to investigate the causes in a targeted manner in order to close gaps in the process flow for a successful digital transformation. Additionally, the influence of the design process’s inherent dynamic and changing requirements on the different levels is evaluated and, therefore, can be optimised too.

4.3.4. Phase 3: Potential Analysis

After analysing the processes in phase two, the optimisation potentials and use cases for data-driven methods are elaborated in phase three of the PADDME method, to support companies.

Goal and Results

The central goal of this phase is the identification of suitable use cases as well as data-driven methods which can be integrated into those use cases.

Methods

In the first step, the result from the previous phase is examined with regard to the applicability of data-driven methods. It is possible that companies score well in the general digital transformation assessment, but their processes are not yet developed enough to directly apply all types of data-driven methods.
Therefore, in a second step, minimum requirements with respect to the evaluation criteria are set. Those requirements have to be fulfilled by the companies in order to use data-driven methods in an economically and technologically sensible way. Using method stencils, companies can evaluate which prerequisites are needed for the individual type of data-driven method.
Four central approaches for the use of data-driven methods can be identified. For easy communication of the potentials of data-driven methods, those approaches can be formulated in a comprehensible, practical way:
1. Prediction of different values;
2. Identification of interrelationships and contexts;
3. Use of old data as a basis for new product generations;
4. Support for decisions.
With these application tasks, the data-driven methods can be classified. Based on the evaluation criteria, the necessary technological, organisational, qualitative, and data-based prerequisites can be determined, as shown in Figure 13. Companies thus find out which methods can already be used in the actual process and also have a direct comparison of the points at which the digital transformation should be advanced.
The visually presented prerequisites of the application types for data-driven methods enable companies to quickly recognise which technologies they can already use. The templates also provide good guidance for future development projects.
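Conceptually, such a method stencil can be read as a minimum readiness profile per application type; the following sketch compares a process against such profiles, where the threshold values are invented placeholders and the actual stencils are those defined in Figure 13.

```python
# Illustrative "method stencil" check: compare a process's readiness levels
# against minimum requirements per application type. The threshold values here
# are invented placeholders; the actual stencils are those shown in Figure 13.
process_levels = {"technology": 3, "data": 2, "quality": 4, "organisation": 3}

stencils = {
    "prediction of values":             {"technology": 3, "data": 4, "quality": 3, "organisation": 2},
    "identification of interrelations": {"technology": 3, "data": 3, "quality": 3, "organisation": 2},
    "reuse of old data":                {"technology": 2, "data": 3, "quality": 2, "organisation": 2},
    "decision support":                 {"technology": 3, "data": 3, "quality": 3, "organisation": 3},
}

for application, required in stencils.items():
    gaps = {dim: req - process_levels[dim]
            for dim, req in required.items() if process_levels[dim] < req}
    status = "ready" if not gaps else f"gaps: {gaps}"
    print(f"{application:35s} -> {status}")
```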
To identify explicit methods for integration, the AI4PD ontology [70] is used. Here, the captured process can be described and suitable methods are proposed. Since this has been presented in previous contributions, it is not presented in detail here. For managers and decision-makers, this approach offers a clear path to assess their organisation’s readiness for data-driven methods. By identifying the specific prerequisites and their current status, they can make informed decisions about where to invest resources and efforts. However, it is important to note that while this assessment provides a qualitative understanding of readiness, an a priori numerical assessment of the time and cost required for implementation will need further research. Nonetheless, this structured evaluation provides a solid foundation for strategic planning, helping companies harness the potential of data-driven methods in a purposeful and efficient manner.

Intermediate Summary

In this phase, application potentials and digital engineering methods are evaluated using the presented technology-readiness evaluation and the AI4PD ontology.

4.3.5. Phase 4: Process Redesign and Integration of Data-Driven Methods

The final step of the PADDME method is implementation and process redesign. Pilot projects in cooperation with comparable companies or research institutions can be particularly helpful here.

Goal and Results

After this phase, an implemented pilot process is realised and the digital engineering method is integrated into the existing process.

Methods

Unlike the previous phases, phase four of PADDME cannot provide standardised implementation support, although methods like the CRISP-DM [26] can give guidelines for some problems. Each company has its own individual competencies and weaknesses that should be taken into account during implementation and realisation. Depending on the application, outsourcing can save time and money for companies with a low level of qualification. Companies can also merely obtain the algorithm as a service and have customisation and training carried out by their own employees.
Regarding customisation, further clarification of this term should be given. To integrate data-driven methods into existing design processes, they have to be adapted to the given company infrastructures and specialities. This adaptation is referred to as customising of methods. To be exact, a data-driven method has to be seen as a black box and is not customisable by SMEs by design, since it is already implemented. Therefore, a customisation of the interfaces has to be performed. For this purpose, the digital engineering method is embedded in a program framework, as shown in Figure 14. The interfaces can be implemented in this framework.
A first interface is required to the “IT system training”, which takes over the model training. This generally requires a system with high computing power, which is either available in the company or can be purchased, for example, through cloud hosting services in the “as-a-service” model. Furthermore, the program framework must enable integration into the existing process flow and realise the most seamless integration possible into the existing process and software landscape. This requires a human–machine interface or a graphical user interface (GUI) to enable the users to operate the system, which is realised in the interface “IT system utilisation”. Digital engineering methods always work on the basis of data on both the input and output side, which is why further interfaces are necessary. The actual data evaluation or “data analytics” is carried out in the “digital engineering method” block. Input data must be identified, converted, and made available. Data handling must, therefore, be implemented. Output data must also be identified in the first step in order to define which results are to be determined. Then, the results have to be generated and presented. In addition, companies generally have data management systems in place that also have to be connected to the program framework. This ensures that the required data can be accessed.
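One way to picture this program framework is as a thin wrapper around the black-box method, with one method per interface; the following skeleton is an illustrative sketch under these assumptions, not the implementation used in the case study.

```python
# Skeleton of the program framework around a digital engineering method,
# following the interfaces described above (training system, utilisation/GUI,
# data management, data input/output). Class and method names are illustrative.
class DigitalEngineeringFramework:
    def __init__(self, method, data_management):
        self.method = method                     # the data-driven method (treated as a black box)
        self.data_management = data_management   # connection to the company's data management system

    def train(self, training_source):
        """Interface 'IT system training': fetch data and fit the model,
        e.g. on an in-house server or a cloud service."""
        X, y = self.prepare_input(training_source)
        self.method.fit(X, y)

    def prepare_input(self, source):
        """Data input: identify, convert, normalise, and provide the data."""
        raw = self.data_management.load(source)
        return raw["features"], raw["targets"]

    def predict_and_report(self, design_data):
        """Interface 'IT system utilisation': run the method, present the
        results to the user (e.g. through a GUI), and archive them."""
        result = self.method.predict(design_data)
        self.data_management.store("prediction", result)   # data output
        return result
```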

Intermediate Summary

After this phase, a customised digital engineering method is integrated into the design process and supports the identified bottlenecks. It is especially important to directly involve the employees in the early planning phase. This not only increases motivation among the staff but also ensures an exchange of information and knowledge. Especially when external providers support the integration of the methods, the experience gained from everyday processes is urgently needed.

4.4. Technology-Readiness Framework

During phase 3 of the PADDME process, an evaluation with respect to the integration of data-driven methods into product development processes is performed. Therefore, the aim is to develop criteria that enable the analysis and evaluation of digital processes in product development. The developed system is based on the evaluation model developed by Bitkom [61]. The four major dimensions of technology, data, quality, and organisation cover the main topics that are important for the further digitalisation of business processes. Since the method to be developed is based on a visualised process model and a supplementary interview questionnaire, the evaluation criteria presented by Bitkom cannot be adopted directly. Additionally, the criteria are matched to the requirements for data-driven methods presented by Mehlstäubl et al. [27] and the relevant design process aspects shown by Gerschütz et al. [43]. Various types of scales are available, as presented in Table 1.
The evaluation structure is built upon a five-level scale. Level 1 corresponds to the basic level, and the maximum degree of development is reached at level 5. In the following, the key figures that realise the technology-readiness framework for data-driven processes are defined. The qualitative self-assessment of the companies is supported by quantitative elements. For this purpose, key figures were determined that also provide a mathematical basis for individual categories. The individual technology-readiness aspects need different scale types, as shown in Table 2.
The criteria are summarised in a criteria catalogue as explained below. Companies can use these newly developed technology-readiness statements to evaluate their own level of performance.

4.4.1. Technology

A successful digital process should have a good technological environment and a well-developed IT infrastructure. The IT structure used and the way information is transferred between departments or employees are very important. All processes should be systemically linked to each other and have as few system discontinuities as possible.

Technology Basis

For a uniform digital technology basis, incoming and outgoing information from the process or from the process step should be available in digital form. On the one hand, this enables fast process handling, as data do not have to be entered or, in the worst case, transferred, and on the other hand, the available data can be used for data-driven methods. The processing of the (sub)process should take place digitally in a network for uncomplicated cooperation, for example, in the company’s own network or on the internet.

Tools

This criterion evaluates the current support with digital tools relevant for design processes. Possible tools are CAx systems as well as digital management tools like PDM systems. Automation can be used as well. The degree of automation puts the already automated process steps ($A_A$) in proportion to the total number of process steps ($A_G$). Equation (1) calculates the degree of automation $p_A$ according to IEC 60050-351 [71]. Although IEC 60050-351 refers to the electronics industry, an adaptation to the design domain is possible since similar problems are being solved.

$$p_A = \frac{A_A}{A_G} \cdot 100\% \quad (1)$$

The degree of digitalisation compares the already digitalised process steps ($A_D$) with the total number of process steps ($A_G$) to obtain the degree of digitalisation ($p_D$). The goal is to digitalise as many process steps as possible.

$$p_D = \frac{A_D}{A_G} \cdot 100\% \quad (2)$$

System Integration

All used programs and tools are integrated in a central product or simulation data management and the generated data are collected here. This allows easy access to the database for the use of data-driven methods. Complex interfaces, different data sources, and inconsistencies in the data are avoided.

Media Discontinuity

Media discontinuities make it difficult to integrate the data into data-driven processes. As soon as the medium is changed during the transfer of information within a process, for example, from e-mail to paper, a media discontinuity occurs. When searching for a suitable database, this partly analogue information cannot be taken into account without further work. Therefore, media discontinuities are captured. For the quantitative assessment, the number of media breaks ($A_M$) is put in proportion to the total number of process steps ($A_G$). In Equation (3), the ratio of media discontinuities ($p_M$) is calculated. Here, the ideal ratio is as small as possible.

$$p_M = \frac{A_M}{A_G} \quad (3)$$
Media discontinuities can also result from different data formats, for example, the CAD-to-simulation data transformation.

4.4.2. Data

A uniform database is the ideal state for the use of data-driven methods. However, this also includes extensive data collection in order to gather as much data as possible and to simplify the use of the obtained data.

Data Acquisition

All relevant data should be captured and archived automatically. In this regard, data from erroneous process runs is explicitly captured as well. Especially in simulation departments, results are often deleted after a defined time; this should be taken into account as well.

Data Transfer

Data in the process is transferred with identical methods and to the same database. Data transfer between departments is automated to reduce transfer and waiting times. The total number of transmission methods used ($A_T$) in relation to the number of process steps ($A_G$) determines the proportion of the total process ($p_T$). The result reflects the proportion of method changes in the process. The aim is to achieve the smallest possible value so that the flow of information can be documented without interruptions.

$$p_T = \frac{A_T}{A_G} \quad (4)$$

Data Provision

Existing data should be provided digitally to all departments to generate the greatest possible benefit. Especially in design departments, this could be difficult due to different programs being used. The company should also prepare the data in a visual, easily understandable form. Each department is responsible for archiving the input and output.

Data Usage

Existing data is used in the current processes, regardless of whether the use is automated or manual. In order to derive the maximum benefit from the data for the company, there should be a central interface for accessing the internal database. Other AI applications can then also access the data externally. When accessing data from external networks, data security should have the highest priority (see the point on security in the quality dimension). In a digital company, all decisions are based on data and no longer solely on experience.

4.4.3. Quality

Quality is of particular importance for successful digital processes. The following criteria in the evaluation concept focus on this aspect.

Operation

Process operation is evaluated with respect to quality and stability. Additionally, the input should be digital and used digitally as well. The focus is on digital process steps so that the complete process can be tracked digitally. Peak loads are, therefore, no problem for the digital process.

Traceability

Digital processes should be digitally traceable with respect to their current status and progress. This allows flexible reactions in case of problems.

Transfer Time

Transport, transfer, and waiting times should not exist, owing to digital transfer (e.g., to other departments such as simulation or to service providers).

Security

Legal and regulatory requirements are respected and regularly checked in audits. The security of the company's internal data is ensured by an appropriate, regularly updated firewall and well-trained IT experts.

4.4.4. Organisation

The entire organisation plays a mostly underestimated role when implementing development projects. It is important to involve and motivate all employees from the lowest level up to top management to give them the opportunity to contribute their experience and knowledge about products and processes.

Responsibility

Decisions during the process are made by people with technical responsibility. A clear assignment of who is responsible for which sub-step of the process leads to a better process flow. This is also linked to the criterion that decisions are always made by the next-highest hierarchical level. In the ideal process flow, i.e., without considering iterative loops, each department should only be assigned responsibility once.

Qualification

The digital competence of the employees is measured. A successful digital company needs a high digital competence, which needs consistent development. Employees are trained to use new technologies effectively and efficiently.

Gateways

Gateways are clearly defined between departments and supported with rules and expectations. Additionally, transfer documents enable consistent documentation.

Knowledge-Based Work

To support processes with data-driven methods, the proportion of knowledge-based work ($A_K$) with respect to routine tasks ($A_R$) is evaluated:

$$p_{kbw} = \frac{A_K}{A_R} \quad (5)$$
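Taken together, the quantitative key figures of Equations (1)–(5) can be computed directly from counts taken from the captured process model; the following sketch uses invented counts for illustration.

```python
# Key figures of the technology-readiness framework computed from process-step
# counts (Equations (1)-(5)); the counting itself is done on the captured model.
def key_figures(total_steps, automated, digitalised, media_breaks,
                transfer_methods, knowledge_steps, routine_steps):
    return {
        "degree of automation p_A [%]":     100.0 * automated / total_steps,
        "degree of digitalisation p_D [%]": 100.0 * digitalised / total_steps,
        "media discontinuities p_M":        media_breaks / total_steps,
        "transfer-method changes p_T":      transfer_methods / total_steps,
        "knowledge-based work p_kbw":       knowledge_steps / routine_steps,
    }

# Illustrative counts for a captured sub-process (not case-study data):
print(key_figures(total_steps=20, automated=4, digitalised=14, media_breaks=3,
                  transfer_methods=5, knowledge_steps=8, routine_steps=12))
```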

5. Case Study

To demonstrate the usability of the PADDME method, a case study was carried out in cooperation with PSW Automotive Engineering GmbH.

5.1. Phase 0: Preparation

In the preparation phase, the current state of digitalisation was captured and an overall process map was developed. Additionally, the current digital capabilities and optimisation goals were elaborated with the responsible process manager. In this discussion, we identified sufficient digital capabilities for the general use of data-driven methods. IT and storage capacity were available, as was general knowledge about these methods. The central company goal is to support designers in CAD work steps by providing useful information. Furthermore, a general analysis for further process digitalisation should be carried out.
During this case study, the development process chain between the disciplines of design, simulation, and testing of automotive parts was analysed, as shown in Figure 15.

5.2. Phase 1: Process Capturing

During the process-capturing phase, a total of seven interviews were conducted in June and July 2021. The two-hour interviews, with various employees from the simulation (2 interviews), testing (1 interview), design (2 interviews), and production (1 interview) departments, were performed from the employee view. In addition, a two-hour interview with the development manager provided insight into the planned process from the management view. The inclusion of the data view was not possible. Therefore, that information is missing. The interviews were carried out with two interviewers, one of them leading the interview. The other one was responsible for the simultaneous process visualisation, allowing potential errors in the documentation to be corrected during the interview. The visualisation presented in Figure 9 was used to document the recorded processes.
During the interviews, the employees could give their subjective insights on problems and optimisation potentials. A summary of the central problems is given in the following list:
  • Many iterations and change requests in design processes. This leads to long development times and many repetitive work steps.
  • High coordination requirement between departments, which results in many reconciliation meetings and a high number of iterations as well.
  • Short timeframes combined with long waiting times, for example, for simulation or test results, during the design.
  • Data retrieval from simulation to design is subject to media discontinuities, since simulation requires different data formats than design.
  • Elaborate evaluation of simulation results requires a high level of staff expertise and time, which is not always available and results in a high workload in the department.
  • Correction iterations with simulation service providers are necessary if there are errors in the simulation setup. The data check as well as the iterations cost time.
  • Testing is the bottleneck in the approval process due to long timeframes. Therefore, test results are often not available until two iteration loops later. This results in additional iteration loops being required to fix potential errors.

5.3. Phase 3: Potential Analysis

The recorded development processes were evaluated based on the criteria described in Section 4.4 during the potential analysis phase. The evaluation was performed by the first author in cooperation with the second one. Following the decomposition procedure shown in Figure 10, the evaluation was performed for the whole process, the department levels as well as the process step and task levels. Figure 16 shows the results for the whole process as well as the department results for design, simulation, and testing. Further details cannot be shown due to company rules.
Based on the evaluation results and feedback, various optimisation potentials were identified. For most data-driven methods, the prerequisites are met, especially in the design and simulation sub-processes. One approach is a reduction in iterations between the design and simulation departments, for example, through result estimations during the design phase. This could give designers insights into whether new designs have problems in standard load cases. Furthermore, plausibility checks can support the evaluation of simulation results, which also leads to a reduced workload in the department. Within the test sub-process, additional effort is needed to enable the identified digitalisation potential. This can also be seen in the test evaluation results in Figure 16. The central weakness of this process is the low digitalisation ratio, since only a few digital tools are used; the administrative aspects like traceability and data provision should be strengthened as well. After these aspects are optimised, approaches similar to those in the simulation sub-process can be applied in the testing process chain. Additionally, decision support would be helpful to evaluate whether a test is even necessary. The potential use cases were evaluated with the industry partner to identify a pilot use case that was implemented in the process redesign phase.

5.4. Phase 4: Process Redesign

The high-effort iterations between the simulation and design departments were identified as the problem with the highest potential. To solve this, a design support system was developed during the process redesign phase, in which a design is evaluated for standard load cases. Possible data-driven methods were identified using the AI4PD ontology [70]. The data-driven methods are applied to existing simulation data to predict maximum displacements at critical points using neural networks, as shown in Figure 17. The method had to be customised, as presented above. The following customisation steps were performed.

5.4.1. IT System Training

A machine learning model was generated to predict the results. For this purpose, the data of the respective load cases were read in and the model was trained on this basis. In addition, quality criteria were implemented to achieve adequate model quality. A detailed consideration of these quality criteria would go beyond the scope of this report and is not the subject of this research. In particular, overfitting or excessive validation errors were prevented.
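Since the concrete model architecture and quality criteria of the prototype are not disclosed, the training step can only be illustrated with a minimal, assumption-based Python sketch. The feature matrix, network size, and error limit below are illustrative and do not stem from the case study; the sketch shows a neural network regressor trained with early stopping and a simple validation-error check, in line with the stated goal of preventing overfitting and excessive validation errors.

# Minimal sketch of the model training step (illustrative; the actual
# architecture and quality criteria of the prototype are not published).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

def train_displacement_model(features, displacements, max_val_error):
    """Train a neural network predicting maximum displacements at critical
    points of a load case from design parameters."""
    X_train, X_val, y_train, y_val = train_test_split(
        features, displacements, test_size=0.2, random_state=42)

    scaler = StandardScaler().fit(X_train)            # normalise inputs
    model = MLPRegressor(hidden_layer_sizes=(64, 64),
                         early_stopping=True,          # guards against overfitting
                         max_iter=2000, random_state=42)
    model.fit(scaler.transform(X_train), y_train)

    # Simple quality criterion: reject models with excessive validation error.
    val_error = mean_absolute_error(y_val, model.predict(scaler.transform(X_val)))
    if val_error > max_val_error:
        raise ValueError(f"Validation error {val_error:.3f} exceeds the limit")
    return scaler, model, val_error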

5.4.2. IT System User

A user interface was implemented for the user-friendly execution of the prediction. In addition, the status of the prediction was clearly visualised via a traffic light scheme and the results were also listed in tabular form. The user can load the necessary model and check new components with just a few entries.
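As an illustration of the traffic light scheme, a predicted value can be compared to an allowable limit. The 80% warning threshold in the following sketch is an assumption chosen for demonstration and is not a value from the prototype.

# Illustrative mapping of a predicted value to the traffic-light status shown
# in the user interface; the thresholds are assumptions, not published values.
def traffic_light(predicted, allowable, warn_ratio=0.8):
    ratio = predicted / allowable
    if ratio <= warn_ratio:
        return "green"    # clearly below the allowable value
    if ratio <= 1.0:
        return "yellow"   # close to the limit, manual check recommended
    return "red"          # predicted value exceeds the limit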

5.4.3. Data Management System

In the prototype developed, the data were stored in a folder structure on a shared network drive. In this folder structure, component databases were stored in which all data belonging to the same component were saved.
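A minimal sketch of such a component-based folder structure is given below; the network path, folder names, and file names are purely illustrative.

# Sketch of the assumed folder layout on the shared network drive: one
# sub-folder per component, holding all data belonging to that component.
from pathlib import Path

NETWORK_DRIVE = Path(r"\\share\digital-engineering")   # illustrative path

def component_folder(component_id: str) -> Path:
    folder = NETWORK_DRIVE / "components" / component_id
    folder.mkdir(parents=True, exist_ok=True)
    return folder

# e.g. component_folder("bracket_A12") / "simulation_results.txt"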

5.4.4. Data Input

To create the meta-model, the raw data contained in the stored files had to be extracted. The extraction was performed by reading out the data via a Python script. The material and thickness of the individual parts, as well as the node IDs of the load application points, were derived. In addition, the simulation result data were read from the result files and transformed into text form. The data were then normalised and scaled.
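Since the original result files and the actual extraction script are not published, the following sketch only illustrates the described steps of reading nodal results from a text export and normalising them; the assumed file format (one "node_id value" pair per line) is an illustration, not the case-study format.

# Sketch of the data-input step: parse a text-converted result file and
# scale the values, mirroring the described extraction and normalisation.
import numpy as np

def read_nodal_results(path):
    """Read node IDs and result values from a text-converted result file."""
    node_ids, values = [], []
    with open(path) as fh:
        for line in fh:
            parts = line.split()
            if len(parts) == 2:
                node_ids.append(int(parts[0]))
                values.append(float(parts[1]))
    return np.array(node_ids), np.array(values)

def normalise(values):
    """Scale raw values to the range [0, 1] as input for the meta-model."""
    vmin, vmax = values.min(), values.max()
    return (values - vmin) / (vmax - vmin) if vmax > vmin else np.zeros_like(values)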

5.4.5. Data Output

The predictions were shown graphically in the user interface. Three levels of detail were available. These were the aforementioned traffic light scheme, maximum deformation and stress, as well as a plot of the predicted values as a false colour image of the component. The prediction was also saved in the file structure to enable later access.
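The following sketch illustrates how the most detailed output level, a false-colour plot of the predicted values, and the storage of a prediction for later access could be realised; the node coordinates, colour map, and file names are assumptions.

# Sketch of the output step: false-colour plot of predicted values over node
# positions and storage of the prediction next to the component data.
import numpy as np
import matplotlib.pyplot as plt

def plot_false_colour(coords, predicted, out_path):
    """coords: (n, 2) array of node positions; predicted: (n,) predicted values."""
    fig, ax = plt.subplots()
    sc = ax.scatter(coords[:, 0], coords[:, 1], c=predicted, cmap="jet", s=5)
    fig.colorbar(sc, ax=ax, label="predicted value")
    ax.set_aspect("equal")
    fig.savefig(out_path, dpi=200)
    plt.close(fig)

def save_prediction(component_folder, predicted):
    np.save(component_folder / "prediction.npy", predicted)   # later access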

5.4.6. Process

At the current stage, a prototype is available, which is realised as a standalone program. To use it, the model must be exported from the CAD program and integrated into the prediction program. For complete process customisation, a direct plug-in for the CAD program is desirable.

6. Discussion

To verify the functionality of the PADDME method, an evaluation with respect to the requirements presented in Section 4.2 was performed in cooperation with the case study partner; the results are presented in Section 6.1. After the case study discussion, a method discussion follows, analysing the PADDME method independently of the use case.

6.1. Case Study

The presented and evaluated criteria were the economics (cost and time), quality, quantifiability, and consistency, as well as easy usage, as introduced in Section 4.2. Furthermore, knowledge, information, and data should be recorded together with their storage location. The functionality of the process could be confirmed with the help of the interviews. However, existing problems with the recording of processes, as presented by [29], were also confirmed.

6.1.1. Economy

The economics of process capturing is further divided into cost and time. Time can be converted into labour cost, which is often the largest cost factor in process improvement. Therefore, the relative cost with respect to the required time has a great effect on the economy of the method, as shown in the industry study in Section 4.2. The sub-process capturing relied on interviews of about two hours per employee. Afterwards, the analysis and evaluation took an additional two to three hours per sub-process. Overall, since the amount of employee time required for the interviews was small, the sub-requirement of required time was met. Apart from the labour cost of this time, no further costs arise: there are no licence costs, and in the current state of the method there are no certification courses resulting in costs.
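As a rough, illustrative estimate (assuming the three sub-processes shown in Figure 16 and the upper bound of three hours of analysis each), the total capturing and evaluation effort in the case study amounts to approximately

t_capture ≈ 7 interviews × 2 h + 3 sub-processes × 3 h = 23 h.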

6.1.2. Quality

The subsequent check of the developed process models with the interview partners proved the high quality of the whole method. No significant errors were found. This is particularly due to the parallel process modelling and, thus, the immediate error correction during the interviews. Furthermore, after finishing the case study, a retrospective on the identification and integration of data-driven methods showed the high quality of both steps. Of course, a single case study does not allow a general statement, but it provides a first robust indication.

6.1.3. Quantifiability

Using the quantitative parameters, as well as the score system for qualitative evaluation, a quantification of the results is possible. Therefore, different processes are comparable. In particular, the quantification of the benefit of process redesigns should be evaluated in more detail.

6.1.4. Consistency

The given elements of the process visualisation enable a consistent process representation across different tasks while still allowing individual process characteristics to be considered.

6.1.5. Representation of Data, Information, and Knowledge

Data, information, and knowledge are the main basis for the application of data-driven methods. Therefore, those aspects should be captured and documented alongside the main process to allow a retrospective analysis. With the developed visualisation approach, these aspects are documented and visualised in an easily understandable and graphical way.

6.2. Method

The functionality of the PADDME process was confirmed. However, existing problems with the recording of processes, as presented by [29], were confirmed as well. The problem of distributed process knowledge is less significant due to the broad base of interview partners from all relevant departments and knowledge levels, such as senior experts, designers, and simulation and test engineers. The problems of case-based thinking and of eliciting the processes in as general a form as possible had to be counteracted during the interviews by regularly returning to the central questions.
At this point, a critical view of the requirements for the method presented by [43] is necessary. Foremost, the authors postulate a holistic process capturing, which seems rather difficult to achieve through interviews due to the known issues presented by [29]: distributed process knowledge, case-based thinking, and a lack of understanding of business modelling languages. According to the process model categories presented by [39], our claim is to develop a method with the purpose of process development.
Additionally, the data view encountered some problems in our studies. In most companies, highly dynamic processes like product development are not fully logged in a central system. This is because many tasks are rather social (like meetings or information exchange) or consist of the evolution of data versions. Current PDM systems generate logs, but these mirror the planned processes in the system and often represent only selected development states; the time between those states is not logged. Therefore, the data view has little influence for now but may become more relevant in the future. PADDME, like any process analysis method, provides insights based on a snapshot of the current state of digital engineering processes. It may not inherently capture the dynamic nature of evolving technologies, market trends, or changing organisational priorities. While PADDME focuses on digital engineering processes, it might not comprehensively address the human factors involved. People’s skills, collaboration dynamics, and adaptability are crucial components of successful digitalisation, and the method may not fully capture these elements.
An evaluation on different levels of detail is necessary, since individual situations may offer different potentials. Some companies may implement a data-driven method for a whole business process, while others do not meet the necessary requirements at this top level but do at a lower one. For example, a design department considering FE results for a first design evaluation may not have sufficient data provision, whereas a simulation department with the same task has a different starting point, since other data are available there. With this level-based evaluation, a comparison between the identified levels and sub-processes can be performed with respect to their value and strength.
The PADDME method holds practical implications that can significantly impact the efficiency, effectiveness, and overall success of digital engineering practices in product development. One key implication lies in cost-effective process optimisation, where the method facilitates the identification of bottlenecks, redundancies, and inefficiencies, allowing organisations to streamline workflows and allocate resources more efficiently. Additionally, the method enables informed decision-making by providing a deeper understanding of the product development life cycle. Practitioners can make more informed decisions based on a comprehensive analysis of relevant data, contributing to better outcomes and reduced risks.
Moreover, PADDME promotes enhanced collaboration among cross-functional teams involved in digital engineering. By visualising and analysing the entire product development process, teams can better understand their roles and dependencies, fostering a collaborative environment that enhances overall productivity. The inclusion of a framework for assessing the technology-readiness level of digital engineering methods is another practical implication. This allows organisations to gauge the maturity and applicability of their digital tools and methodologies, helping them to make informed decisions on technology adoption and implementation. PADDME supports a continuous improvement mindset by providing a structured approach to process analysis. Organisations can use the method iteratively to assess the impact of changes, implement adjustments, and monitor the outcomes, supporting an ongoing cycle of refinement for digital engineering practices.
Practical implications also extend to resource allocation and training initiatives. Organisations can identify specific skill gaps or resource needs through the PADDME analysis, allowing them to invest in targeted training programs or allocate resources strategically to enhance competencies in digital engineering. Furthermore, PADDME helps organisations to adapt to changing work environments, technological advancements, and industry trends. By understanding the current state of digital engineering processes, organisations can proactively adjust their strategies to stay aligned with the dynamic landscape of product development. Lastly, the method promotes the integration of digital engineering into the entire product development life cycle. This holistic approach ensures that digital methods are seamlessly embedded in existing processes, maximising their impact on efficiency and innovation. In summary, the PADDME method offers practical implications that empower organisations to optimise processes, make informed decisions, enhance collaboration, assess technology readiness, drive continuous improvement, and adapt to the evolving demands of modern product development. These implications can contribute to the overall success and competitiveness of organisations in the digital era.
Regarding the research question, the PADDME method allows an economic process evaluation, as shown above. Furthermore, novel key factors for potential analysis were developed and their suitability was checked in the case study. The factors are suitable for identifying bottlenecks and possible data-driven methods to integrate them into these process steps, which is an innovation compared to existing methods. Comparing PADDME to established methods such as CRISP-DM or the KDD process, it is clear that these methods excel at the actual programme development, but lack support in identifying use cases and appropriate methodologies. Therefore, PADDME can serve as an upfront framework to support the application of those well-known tools. Additionally, the phase-like character of the method allows companies to split the process evaluation or drop, for example, phase 0 if that information is already available. Iterative process optimisation is possible with this design too. Furthermore, the method focuses only on the aspects that are relevant for a digital process transformation, making it a light and easy-to-use method when compared to methods like ARIS, which is great for a high-level overview of full company processes but, on the downside, is hard to generate due to the complexity of different information layers. The transferability of the method to other domains and areas has not yet been tested in practice. Nevertheless, such functionality is expected, at least in the area of product development with the associated sub-domains.
Looking at the second research question, the development of a technology-readiness framework for digital engineering was postulated. This framework was presented in Section 4.4. Of course, many maturity models are already available, as shown in the Introduction. Nevertheless, the presented digital engineering framework focuses on the problem of digital engineering and the prerequisites and aspects that are relevant for a successful process transformation. The mix of qualitative and quantitative aspects, all available from the process-capturing step of the PADDME method, allows for an easy-to-use process evaluation and comparability across different processes and sub-processes.

7. Summary and Outlook

The digital revolution of product development processes in small and medium enterprises is crucial for the future success of this highly relevant economic sector. To realise this, an in-depth process analysis concerning mechanical engineering and data-driven methods has to be carried out. For this purpose, we introduced PADDME, a method for economic process optimisation for digital mechanical engineering in those companies. As shown before, PADDME provides an easy-to-use method for digital transformation while utilising the potential of well-known methods like CRISP-DM. Additionally, a novel framework was introduced, allowing a process evaluation regarding digital engineering applications, further potential, and use cases, as well as weaknesses of the given design process. The method was evaluated successfully in an industrial case study, which confirmed that PADDME provides a practical approach for process analysis and for the integration of data-driven methods in design processes. Future work will focus on the process redesign phase to further integrate PADDME with the established CRISP-DM process to make it even more reliable. Additionally, evaluation criteria showing the advantages of a digital process redesign will be developed in future work. Given this, an ex-ante evaluation of the benefits of digital engineering would be possible before even starting the transformation.

Author Contributions

Conceptualisation, B.G. and Y.C.; methodology, B.G.; writing—original draft preparation, B.G.; writing—review and editing, B.G., Y.C., S.G. and S.W.; visualisation, B.G.; supervision, S.G. and S.W.; funding acquisition, S.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Bayerische Forschungsstiftung (BFS), grant number AZ-1392-19.

Data Availability Statement

Data are contained within the article.

Acknowledgments

This research work is part of “FORCuDE@BEV-Bavarian research association for customised Digital Engineering for Bavarian SME’s” and was funded by the “Bayerische Forschungsstiftung (BFS)”. The authors are responsible for the content of this publication. Special thanks go to the Bayerische Forschungsstiftung (BFS) for the financial support of the whole research project. Furthermore, C. Sauer is thanked for his commitment during the realisation of the case study and B. V. M. Spießl for supporting this research with her master’s thesis.

Conflicts of Interest

Author Yvonne Consten was employed by the company PSW Automotive Engineering GmbH. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as potential conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ARIS       Architecture of Integrated Information System
BPM        Business process management
BPMN       Business process model and notation
BPR        Business process re-engineering
CAD        Computer-aided design
CAE        Computer-aided engineering
CRISP-DM   Cross-industry standard process for data mining
DSM        Design structure matrix
EDM        Enterprise data management
EFQM       European Foundation for Quality Management
EPC        Event-driven process chain
IDEF       Integration definition for function modelling
KBE        Knowledge-based engineering
KDD        Knowledge discovery in databases
PADDME     Process analysis for digital development in mechanical engineering
PDCA       Plan–do–check–act
PDM        Product data management
SME        Small and medium enterprises
TCT        Total cycle time
VSM        Value-stream mapping

Appendix A. Approaches for Integrating Data-Driven Methods

For further explanation, the detailed steps of the KDD and CRISP-DM processes are presented in the following section. Furthermore, a comparison to the presented PADDME approach is possible, due to the in-depth description.

Appendix A.1. KDD Process

The KDD process consists of the following stages [25], as shown in Figure A1:
1. Data selection: In this initial phase, relevant data are identified and selected for analysis. The dataset is chosen based on the project’s objectives, domain knowledge, and data availability. The selected data should align with the specific problem or research question at hand.
2. Data preprocessing: Once the data are selected, they undergo preprocessing to prepare them for analysis. This phase involves cleaning the data by handling missing values, correcting errors, and resolving inconsistencies. Data integration may also be performed to combine multiple datasets into a unified format. Transformation techniques, such as normalisation or aggregation, can be applied to make the data suitable for further analysis.
3. Data transformation: In this phase, the preprocessed data are transformed into a suitable representation for analysis. This typically involves converting the data into a format that can be effectively processed using data mining algorithms. Feature selection or extraction techniques may be applied to reduce the dimensionality of the dataset and capture the most relevant information.
4. Data mining: The core of the KDD process lies in the data mining phase. Here, advanced algorithms and techniques are applied to extract patterns, relationships, and insights from the transformed data. Data mining algorithms can be categorised into various types, including classification, clustering, regression, association rule mining, and more. The choice of algorithm depends on the nature of the problem and the knowledge that is desired to be extracted.
5. Pattern evaluation: Once patterns and relationships have been discovered through data mining, they need to be evaluated for their quality, significance, and usefulness. This evaluation is performed based on domain expertise, statistical measures, and evaluation metrics specific to the problem domain. Patterns that meet the desired criteria are considered valuable and can be further analysed.
Figure A1. KDD process for applying data mining, according to [25].
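To make the five stages listed above more tangible, the short Python sketch below walks through them on a generic tabular dataset; the file name, column names, and the choice of a decision tree classifier are illustrative assumptions and do not stem from the case study.

# Compact, assumption-based sketch of the KDD stages on a tabular dataset.
import pandas as pd
from sklearn.preprocessing import MinMaxScaler
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# 1. Data selection: pick the columns relevant to the question at hand.
data = pd.read_csv("development_tasks.csv")          # assumed input file
selected = data[["duration_h", "iterations", "department", "delayed"]]

# 2. Data preprocessing: handle missing values and inconsistencies.
cleaned = selected.dropna().drop_duplicates()

# 3. Data transformation: encode categories and scale numeric features.
X = pd.get_dummies(cleaned.drop(columns="delayed"))
X[X.columns] = MinMaxScaler().fit_transform(X)
y = cleaned["delayed"]

# 4. Data mining: learn patterns, here with a simple classification model.
model = DecisionTreeClassifier(max_depth=4, random_state=0)

# 5. Pattern evaluation: judge the discovered patterns, here via cross-validation.
scores = cross_val_score(model, X, y, cv=5)
print("mean accuracy:", scores.mean())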

Appendix A.2. CRISP-DM Process

The CRISP-DM process consists of six major phases [26], as shown in Figure A2:
1. Business understanding: This initial phase focuses on understanding the project objectives, requirements, and constraints from a business perspective. It involves identifying the goals of the project, defining the problem statement, and forming a clear understanding of how the project outcomes will benefit the organisation.
2. Data understanding: In this phase, data sources are identified and collected. The data are then explored to gain a comprehensive understanding of their structure, quality, and potential limitations. Data issues and challenges are addressed, and initial insights are derived to determine the feasibility of the project.
3. Data preparation: This phase involves preparing the data for analysis. It includes data cleaning, transformation, and integration to ensure data quality and consistency. Data preprocessing techniques, such as handling missing values or outliers, are applied to create a clean and reliable dataset.
4. Modelling: In this phase, various data mining and machine learning techniques are applied to build and validate models. The appropriate modelling techniques are selected based on the project objectives and the nature of the data. Iterative experimentation and model refinement are performed to achieve the desired level of accuracy and performance.
5. Evaluation: The models developed in the previous phase are evaluated against the business objectives and criteria established in the first phase. Model performance and effectiveness are assessed using appropriate evaluation metrics. This phase helps determine if the models meet the project requirements and if further improvements are needed.
6. Deployment: The final phase focuses on deploying the data mining results into the operational environment. This involves integrating the models into existing systems or processes, creating user interfaces or reports for end-users, and providing documentation and training to ensure the successful implementation and adoption of the results.
Figure A2. CRISP-DM process for implementing data-driven applications, according to [26].
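The first four CRISP-DM phases resemble the KDD sketch above; the snippet below illustrates the remaining two phases, evaluation against a business-driven error limit and deployment by persisting the accepted model for use in an operational tool. The error metric, limit, and file name are assumptions and apply to any scikit-learn regressor.

# Sketch of the CRISP-DM evaluation and deployment phases for a trained model.
from sklearn.metrics import mean_absolute_error
import joblib

def evaluate_and_deploy(model, X_test, y_test, business_limit, out_file="model.joblib"):
    error = mean_absolute_error(y_test, model.predict(X_test))
    if error > business_limit:        # evaluation phase: reject an unsuitable model
        return None
    joblib.dump(model, out_file)      # deployment phase: persist the accepted model
    return error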

References

  1. Tomiyama, T.; Lutters, E.; Stark, R.; Abramovici, M. Development Capabilities for Smart Products. CIRP Ann. 2019, 68, 727–750. [Google Scholar] [CrossRef]
  2. Stark, R.; Brandenburg, E.; Lindow, K. Characterization and Application of Assistance Systems in Digital Engineering. CIRP Ann. 2021, 70, 131–134. [Google Scholar] [CrossRef]
  3. Lunnemann, P.; Stark, R.; Wang, W.M.; Stark, R.; Manteca, P.I. Engineering Activities—Considering Value Creation from a Holistic Perspective. In Proceedings of the 2017 International Conference on Engineering, Technology and Innovation (ICE/ITMC), Madeira, Portugal, 27–29 June 2017; pp. 315–323. [Google Scholar] [CrossRef]
  4. Pahl, G.; Wallace, K.; Blessing, L.; Pahl, G. (Eds.) Engineering Design: A Systematic Approach, 3rd ed.; Springer: London, UK, 2007. [Google Scholar]
  5. VDI 2221 Blatt 1 Design of Technical Products and Systems—Model of Product Design; Beuth: Berlin, Germany, 2019.
  6. Beuth Verlag GmbH. VDI/VDE 2206:2021-11-Development of Mechatronic and Cyber-Physical Systems; Technical Report; Beuth Verlag GmbH: Berlin, Germany, 2021. [Google Scholar]
  7. Nattermann, R.; Anderl, R. The W-model—Using Systems Engineering for Adaptronics. Procedia Comput. Sci. 2013, 16, 937–946. [Google Scholar] [CrossRef]
  8. Wickel, M.; Schenkl, S.A.; Schmidt, D.M.; Hense, J.U.; Mandl, H.; Maurer, M. Knowledge Structure Maps Based on Multiple Domain Matrices. Inimpact J. Innov. Impact 2013, 5, 5–16. [Google Scholar]
  9. Montáns, F.J.; Chinesta, F.; Gomez-Bombarelli, R.; Kutz, J.N. Data-Driven Modeling and Learning in Science and Engineering. Comptes Rendus Mec. 2019, 347, 845–855. [Google Scholar] [CrossRef]
  10. Bertoni, A. Data-Driven Design in Concept Development: Systematic Review and Missed Opportunities. Proc. Des. Soc. Des. Conf. 2020, 1, 101–110. [Google Scholar] [CrossRef]
  11. Schenk, M. Digitales Engineering und virtuelle Techniken zum Planen, Testen und Betreiben technischer Systeme. In Proceedings of the 13. IFF Wissenschaftstage, Magdeburg, Germany, 15–17 June 2010. [Google Scholar]
  12. Schumann, M.; Schenk, M.; Schmucker, U.; Saake, G. Digital Engineering-Herausforderungen, Ziele Und Lösungsbeispiele. In Proceedings of the 14. IFF Wissenschaftstage 2011, Magdeburg, Germany, 28–30 June 2011. [Google Scholar]
  13. Gerschütz, B.; Sauer, C.; Kormann, A.; Wallisch, A.; Mehlstäubl, J.; Alber-Laukant, B.; Schleich, B.; Paetzold, K.; Rieg, F.; Wartzack, S. Towards Customized Digital Engineering: Herausforderungen und Potentiale bei der Anpassung von Digital Engineering Methoden Für den Produktentwicklungsprozess. In Proceedings of the Stuttgarter Symposium für Produktentwicklung 2021 (SSP 2021), Stuttgart, Germany, 20 May 2021. [Google Scholar]
  14. Duigou, J.L.; Bernard, A.; Perry, N.; Delplace, J.C. Generic PLM System for SMEs: Application to an Equipment Manufacturer. Int. J. Prod. Lifecycle Manag. 2012, 6, 51. [Google Scholar] [CrossRef]
  15. Gerschütz, B.; Schleich, B.; Wartzack, S. A Semantic Web Approach for Structuring Data-Driven Methods in the Product Development Process. In Proceedings of the DS 111: Proceedings of the 32nd Symposium Design for X, Tutzing, Germany, 27–28 September 2021. [Google Scholar] [CrossRef]
  16. StackExchange. Distinction between AI, ML, Neural Networks, Deep Learning and Data Mining. 2019. Available online: https://softwareengineering.stackexchange.com/q/366996 (accessed on 7 January 2024).
  17. Fayyad, U.; Piatetsky-Shapiro, G.; Smyth, P. From Data Mining to Knowledge Discovery in Databases. AI Mag. 1996, 17, 37. [Google Scholar] [CrossRef]
  18. Samuel, A.L. Some Studies in Machine Learning Using the Game of Checkers. IBM J. Res. Dev. 2000, 44, 206–226. [Google Scholar] [CrossRef]
  19. Kulin, M.; Kazaz, T.; De Poorter, E.; Moerman, I. A Survey on Machine Learning-Based Performance Improvement of Wireless Networks: PHY, MAC and Network Layer. Electronics 2021, 10, 318. [Google Scholar] [CrossRef]
  20. Shabestari, S.S.; Herzog, M.; Bender, B. A Survey on the Applications of Machine Learning in the Early Phases of Product Development. Proc. Des. Soc. Int. Conf. Eng. Des. 2019, 1, 2437–2446. [Google Scholar] [CrossRef]
  21. Dworschak, F.; Kügler, P.; Schleich, B.; Wartzack, S. Integrating the Mechanical Domain into Seed Approach. In Proceedings of the Design Society: International Conference on Engineering Design, ICED, Delft, The Netherlands, 5–8 August 2019; Volume 2018, pp. 2587–2596. [Google Scholar] [CrossRef]
  22. Zirngibl, C.; Schleich, B.; Wartzack, S. Robust Estimation of Clinch Joint Characteristics Based on Data-Driven Methods. Int. J. Adv. Manuf. Technol. 2023, 124, 833–845. [Google Scholar] [CrossRef]
  23. Patel, A.R.; Ramaiya, K.K.; Bhatia, C.V.; Shah, H.N.; Bhavsar, S.N. Artificial Intelligence: Prospect in Mechanical Engineering Field—A Review. In Data Science and Intelligent Applications; Kotecha, K., Piuri, V., Shah, H.N., Patel, R., Eds.; Springer: Singapore, 2021; Volume 52, pp. 267–282. [Google Scholar] [CrossRef]
  24. Gerschütz, B.; Sauer, C.; Kormann, A.; Nicklas, S.J.; Goetz, S.; Roppel, M.; Tremmel, S.; Paetzold-Byhain, K.; Wartzack, S. Digital Engineering Methods in Practical Use during Mechatronic Design Processes. Designs 2023, 7, 93. [Google Scholar] [CrossRef]
  25. Azevedo, A.; Santos, M.F. KDD, SEMMA and CRISP-DM: A Parallel Overview. In Proceedings of the IADIS European Conference on Data Mining 2008, Amsterdam, The Netherlands, 24–26 July 2008; Abraham, A., Ed.; IADIS: Lisbon, Portugal, 2008; pp. 182–185. [Google Scholar]
  26. Chapman, P.; Clinton, J.; Kerber, R.; Khabaza, T.; Reinartz, T.; Shearer, C.; Wirth, R. CRISP-DM 1.0 Step-by-Step Data Mining Guide; The CRISP-DM Consortium: Chicago, IL, USA, 2000. [Google Scholar]
  27. Mehlstäubl, J.; Nicklas, S.; Gerschütz, B.; Sprogies, N.; Schleich, B.; Lohner, T.; Wartzack, S.; Stahl, K.; Paetzold, K. Voraussetzungen Für Den Einsatz Datengetriebener Methoden in Der Produktentwicklung. In Proceedings of the DS 111: Proceedings of the 32nd Symposium Design for X. The Design Society, Virtual, 27–28 September 2021. [Google Scholar] [CrossRef]
  28. ISO 9001:2015; Quality Management Systems. ISO: Geneva, Switzerland, 2015.
  29. Dumas, M.; La Rosa, M.; Mendling, J.; Reijers, H.A. Fundamentals of Business Process Management, 2nd ed.; Springer: Berlin/Heidelberg, Germany, 2018. [Google Scholar] [CrossRef]
  30. Scheer, A.W. ARIS—Business Process Modeling; Springer: Berlin/Heidelberg, Germany, 2000. [Google Scholar] [CrossRef]
  31. Grover, V.; Kettinger, W.J. (Eds.) Process Think: Winning Perspectives for Business Change in the Information Age; Idea Group Pub: Hershey, PA, USA, 2000. [Google Scholar]
  32. Weske, M. Business Process Management: Concepts, Languages, Architectures; Springer: Berlin/Heidelberg, Germany, 2007. [Google Scholar] [CrossRef]
  33. Schmelzer, H.J.; Sesselmann, W. Geschäftsprozessmanagement in der Praxis: Kunden Zufriedenstellen, Produktivität Steigern, Wert Erhöhen, 9th ed.; Hanser: München, Germany, 2020. [Google Scholar]
  34. Grover, V.; Malhotra, M.K. Business Process Reengineering: A Tutorial on the Concept, Evolution, Method, Technology and Application. J. Oper. Manag. 1997, 15, 193–213. [Google Scholar] [CrossRef]
  35. Susanto, H.; Fang-Yie, L.; Chen, C.K. Business Process Reengineering: An ICT Approach, 1st ed.; Apple Academic Press: Oakville, ON, Canada, 2019. [Google Scholar]
  36. Mason-Jones, R.; Towill, D.R. Total Cycle Time Compression and the Agile Supply Chain. Int. J. Prod. Econ. 1999, 62, 61–73. [Google Scholar] [CrossRef]
  37. Farris, J.A.; Van Aken, E.M.; Doolen, T.L.; Worley, J. Critical Success Factors for Human Resource Outcomes in Kaizen Events: An Empirical Study. Int. J. Prod. Econ. 2009, 117, 42–65. [Google Scholar] [CrossRef]
  38. Schroeder, R.G.; Linderman, K.; Liedtke, C.; Choo, A.S. Six Sigma: Definition and Underlying Theory. J. Oper. Manag. 2008, 26, 536–554. [Google Scholar] [CrossRef]
  39. Stacey, M.; Eckert, C.; Hillerbrand, R. Process Models: Plans, Predictions, Proclamations or Prophecies? Res. Eng. Des. 2020, 31, 83–102. [Google Scholar] [CrossRef]
  40. Sharp, A.; McDermott, P. Workflow Modeling Tools for Process Improvement and Applications Development; Artech House: Boston, MA, USA, 2009. [Google Scholar]
  41. Brenner, J. Lean Administration: Verschwendung in Büros Erkennen, Analysieren und Beseitigen; Praxisreihe Qualitätswissen; Hanser: München, Germany, 2018. [Google Scholar]
  42. Best, E.; Weth, M. Geschäftsprozesse Optimieren: Der Praxisleitfaden für Erfolgreiche Reorganisation; Gabler Verlag/GWV Fachverlage GmbH: Wiesbaden, Germany, 2009. [Google Scholar]
  43. Gerschütz, B.; Spießl, B.V.M.; Schleich, B.; Wartzack, S. An Adapted Method for Design Process Capturing to Meet the Challenges of Digital Product Development. In Proceedings of the International Conference on Engineering Design (ICED21), Gothenburg, Sweden, 16–20 August 2021; Volume 1, pp. 365–374. [Google Scholar] [CrossRef]
  44. Russell, N.; ter Hofstede, A.H.M.; Edmond, D.; van der Aalst, W.M.P. Workflow Data Patterns: Identification, Representation and Tool Support. In Conceptual Modeling—ER 2005, Proceedings of the 24th International Conference on Conceptual Modeling, Klagenfurt, Austria, 24–28 October 2005; Lecture Notes in Computer Science; Delcambre, L., Kop, C., Mayr, H.C., Mylopoulos, J., Pastor, O., Eds.; Springer: Berlin/Heidelberg, Germany, 2005; pp. 353–368. [Google Scholar] [CrossRef]
  45. van der Aalst, W.; ter Hofstede, A.; Kiepuszewski, B.; Barros, A. Workflow Patterns. Distrib. Parallel Databases 2003, 14, 5–51. [Google Scholar] [CrossRef]
  46. Booch, G.; Rumbaugh, J.; Jacobson, I. The Unified Modeling Language User Guide, 2nd ed.; Addison-Wesley: Upper Saddle River, NJ, USA, 2005. [Google Scholar]
  47. Business Process Model and Notation (BPMN), Version 2.0.2. 2013. Available online: https://www.omg.org/spec/BPMN/2.0.2/PDF (accessed on 7 January 2024).
  48. Scheer, A.W.; Thomas, O.; Adam, O. Process Modeling Using Event-Driven Process Chains. In Process-Aware Information Systems; Dumas, M., Van Der Aalst, W.M.P., Ter Hofstede, A.H.M., Eds.; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2005; pp. 119–145. [Google Scholar] [CrossRef]
  49. Rother, M.; Shook, J. Learning to See: Value-Stream Mapping to Create Value and Eliminate Muda; Version 1.5, 20th Anniversary Edition; Lean Enterprise Institute: Boston, MA, USA, 2018. [Google Scholar]
  50. Meyer, A.; Smirnov, S.; Weske, M. Data in Business Processes. EMISA Forum 2011, 31, 5–31. [Google Scholar]
  51. Trauer, J.; Wöhr, F.; Eckert, C.; Kannengiesser, U.; Knippenberg, S.; Sankowski, O.; Zimmermann, M. Criteria for Selecting Design Process Modelling Approaches. Proc. Des. Soc. 2021, 1, 791–800. [Google Scholar] [CrossRef]
  52. Wynn, D.C.; Clarkson, P.J. Process Models in Design and Development. Res. Eng. Des. 2018, 29, 161–202. [Google Scholar] [CrossRef]
  53. Eppinger, S.D.; Browning, T.R. Design Structure Matrix Methods and Applications; Engineering Systems; MIT Press: Cambridge, MA, USA, 2012. [Google Scholar]
  54. Smith, R.P.; Morrow, J.A. Product Development Process Modeling. Des. Stud. 1999, 20, 237–261. [Google Scholar] [CrossRef]
  55. Kossak, F.; Illibauer, C.; Geist, V.; Natschläger, C.; Ziebermayr, T.; Freudenthaler, B.; Kopetzky, T.; Schewe, K.D. Hagenberg Business Process Modelling Method; Springer International Publishing: Cham, Switzerland, 2016. [Google Scholar] [CrossRef]
  56. Kobayashi, I. 20 Keys to Workplace Improvement; rev. and expanded ed.; Productivity Press: Portland, OR, USA, 1995. [Google Scholar]
  57. Helmold, M. Quality Excellence Models. In Virtual and Innovative Quality Management across the Value Chain: Industry Insights, Case Studies and Best Practices; Springer International Publishing: Cham, Switzerland, 2023; pp. 87–97. [Google Scholar] [CrossRef]
  58. Appelfeller, W.; Feldmann, C. Die Digitale Transformation des Unternehmens: Systematischer Leitfaden mit Zehn Elementen zur Strukturierung und Reifegradmessung; Springer Gabler: Berlin/Heidelberg, Germany, 2018. [Google Scholar] [CrossRef]
  59. North, K.; Aramburu, N.; Lorenzo, O.J. Promoting Digitally Enabled Growth in SMEs: A Framework Proposal. J. Enterp. Inf. Manag. 2019, 33, 238–262. [Google Scholar] [CrossRef]
  60. Petzolt, S.; Hölzle, K.; Kullik, O.; Gergeleit, W.; Radunski, A. Organisational digital transformation of SMEs—Development and application of a digital transformation maturity model for business model transformation. Int. J. Innov. Manag. 2022, 26, 2240017. [Google Scholar] [CrossRef]
  61. Britze, N.; Schulze, A.; Fenge, K.; Woltering, M.; Gross, M.; Menge, F.; Mucke, A.; Ensinger, A.; Keller, H.; Oldenburg, L.; et al. Reifegradmodell Digitale Geschäftsprozesse; Bitkom: Berlin, Germany, 2020. [Google Scholar]
  62. Zhu, L.; Johnsson, C.; Varisco, M.; Schiraldi, M.M. Key Performance Indicators for Manufacturing Operations Management–Gap Analysis between Process Industrial Needs and ISO 22400 Standard. Procedia Manuf. 2018, 25, 82–88. [Google Scholar] [CrossRef]
  63. Vajna, S.; Weber, C.; Zeman, K.; Hehenberger, P.; Gerhard, D.; Wartzack, S. CAx für Ingenieure: Eine Praxisbezogene Einführung; 3rd, completely revised ed.; Springer Vieweg: Berlin, Germany, 2018. [Google Scholar]
  64. Roelofsen, J.; Lindemann, U. An Approach Towards Planning Development Processes According to the Design Situation. In Modelling and Management of Engineering Processes; Heisig, P., Clarkson, P.J., Vajna, S., Eds.; Springer: London, UK, 2010; pp. 41–52. [Google Scholar] [CrossRef]
  65. Gonnet, S.; Henning, G.; Leone, H. A Model for Capturing and Representing the Engineering Design Process. Expert Syst. Appl. 2007, 33, 881–902. [Google Scholar] [CrossRef]
  66. Blessing, L.T.; Chakrabarti, A. DRM, a Design Research Methodology; Springer: London, UK, 2009. [Google Scholar] [CrossRef]
  67. Mehlstäubl, J.; Atzberger, A.; Paetzold, K. General Approach to Support Modelling of Data and Information Flows in Product Development. In Proceedings of the Balancing Innovation and Operation—The Design Society, Lyngby, Denmark, 12–14 August 2020. [Google Scholar] [CrossRef]
  68. Becker, J.; Rosemann, M.; von Uthmann, C. Guidelines of Business Process Modeling. In Business Process Management: Models, Techniques, and Empirical Studies; van der Aalst, W., Desel, J., Oberweis, A., Eds.; Springer: Berlin/Heidelberg, Germany, 2000; pp. 30–49. [Google Scholar] [CrossRef]
  69. Gerschütz, B.; Goetz, S.; Wartzack, S. Realization of the Digital Transformation in Product Development—Processes, Methods and Application. Z. Wirtsch. Fabr. 2023, 118, 163–168. [Google Scholar] [CrossRef]
  70. Gerschütz, B.; Goetz, S.; Wartzack, S. AI4PD—Towards a Standardized Interconnection of Artificial Intelligence Methods with Product Development Processes. Appl. Sci. 2023, 13, 3002. [Google Scholar] [CrossRef]
  71. International Electrotechnical Committee. DIN IEC 60050-351:2014-09, International Electrotechnical Vocabulary-Part 351: Control Technology (IEC 60050-351:2013); International Electrotechnical Committee: Geneva, Switzerland, 2014. [Google Scholar] [CrossRef]
Figure 1. Overview of terms in digital engineering, according to [16].
Figure 2. Representation of a business process, according to IDEF0 model [31].
Figure 3. Overview of process-modelling approaches.
Figure 4. Overview of quantitative process analysis parameters.
Figure 5. Use case of integrating data-driven methods into design processes.
Figure 6. Result of the requirements ranking for process analysis. Rank five is most important, rank zero least important.
Figure 7. PADDME method overview according to [69]. The five phases are introduced and explained in detail below.
Figure 8. Three-piled design process capturing method according to [43].
Figure 9. Schematic example of the visualisation approach.
Figure 10. Schematic process decomposition of a business process to sub-process and process steps and procedure for evaluating different process levels. The relevant decomposition level is highlighted.
Figure 11. Example result of a process evaluation. Five is seen as the best rating, zero as the worst.
Figure 12. Comparison of process levels and steps to identify weaknesses. Left: the analysed decomposition levels. Right: the table presents the average rating of the four dimensions, the graph represents the table graphically. Bottom right: the four (sub)process results are shown in detail.
Figure 13. Template for methods to use old data as a basis. Five is seen as the best rating, zero as the worst.
Figure 14. Interfaces for the integration of data-driven methods.
Figure 15. Process map of the processes conducted within the case study.
Figure 16. Evaluation results for the captured processes. Top: the full development process is shown. Bottom: (left) the design process, (middle) the simulation process, and (right) the test process results are visualised. Five is seen as the best rating, zero as the worst.
Figure 17. Schematic representation of the developed use case of a simulation copilot for prediction of simulation results.
Table 1. Used scale types for process evaluation.
Scale      | Level 1        | Level 2                | Level 3           | Level 4           | Level 5
Likert     | Not digital    | Mostly not digital     | Partly digital    | Mostly digital    | Fully digital
Percentage | Less than 20%  | 20–40%                 | 40–60%            | 60–80%            | More than 80%
Consent    | Not applicable | Mostly not applicable  | Partly applicable | Mostly applicable | Applicable
Table 2. Assignment of aspect scale types. The letter “x” symbolises the use of the respective scale type for the aspect.
AspectLikertPercentageConsent
Technology
Technology basisx
Tools x
System integration x
Media discontinuity x
Data
Data acquisitionx
Data transfer x
Data provisionx
Data usage x
Quality
Operation x
Traceability x
Transfer time x
Security x
Organisation
Responsibility x
Qualification x
Gateways x
Knowledge-based work x
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
