Article

Handling Preliminary Engineering Information: An Interview Study and Practical Approach for Clarifying Information Maturity

1 Department of Mechanical and Mechatronics Engineering, University of Auckland (Alumnus), Auckland 1010, New Zealand
2 Department of Integrated Design, Interdisciplinary Research Center for Smart Mobility and Logistics, King Fahd University of Petroleum and Minerals, Dhahran 31261, Saudi Arabia
* Author to whom correspondence should be addressed.
Systems 2025, 13(8), 674; https://doi.org/10.3390/systems13080674
Submission received: 15 June 2025 / Revised: 17 July 2025 / Accepted: 6 August 2025 / Published: 8 August 2025
(This article belongs to the Section Systems Engineering)

Abstract

Handling preliminary information appropriately is a critical challenge for many aspects of systems engineering design. The topic is gaining renewed visibility due to the expanding possibilities to apply AI to preliminary information to support systems design, engineering, and management. However, there are few empirical studies of the practicalities of handling immature information, and there is a lack of concretely developed, empirically evaluated, and practical approaches for clarifying information maturity levels, which are needed to ensure such information is appropriately used. This article addresses the gap, contributing new insight into how immature information is handled in industrial practice, derived from interviews with 15 engineering and product development professionals from 5 companies. Thematic analysis reveals how practitioners work with preliminary information and where they require support. A solution was developed to address the empirically identified needs. In 5 follow-up interviews, practitioner feedback on this concept demonstrator was supportive. The main result of this research, in addition to the insights into practice, is a practical maturity grid-based assessment system that can help the providers of preliminary information self-assess and communicate information maturity levels. The assessments may be stored alongside the information and may be aggregated and visualised in CAD, augmented reality, or a range of charts to make information maturity visible and hence allow it to be more deliberately considered and managed. Implications of this research include that managers should promote greater awareness and discussion of preliminary information’s maturity and should introduce structured processes to track and manage the maturity of key information as it is progressively developed. The detailed maturity grids presented in this article may provide a foundation for such processes and can be adapted for particular situations.

1. Introduction

“If only I had known how preliminary that data was… how good the data was […] Sometimes you look at it and you go: ‘It will probably be preliminary, and we can shift some of this stuff around.’ Then, you find out that it’s not.”
(Senior Development Engineer)
Practitioners in systems engineering design projects frequently provide, receive, and work with preliminary information. They must decide when such information is ready to share, how much effort to dedicate to tasks given the maturity levels of their preliminary inputs, and how much allowance to give for the possibility of preliminary information changing. At present, there is little practical support for such decisions, and they are usually made in ad hoc ways. As suggested by the remark above, which was made by a participant in the interview study described in this article, significant problems can arise if preliminary information is not handled effectively, usually manifesting as wasteful design rework.
The management of preliminary information in large-scale engineering and product development was researched intensively in the 1990s and early 2000s; to illustrate the level of interest, the seminal publication of Krishnan et al. [1] in 1997 had received more than 900 citations by the time of writing this article (Google Scholar, May 2025). Many insights were generated, but the topic has since become less prominent in research journals, while practical challenges remain. Another look at this topic is timely, because the ongoing digitalisation of engineering work in recent decades has increased the availability and flow of preliminary information in projects [2]. Moreover, while the possibility of learning from archived design information using AI is being actively researched [3,4], sensible results require the quality and maturity of the input information to be well appreciated.
This article addresses the topic of preliminary information in engineering design from a practical and empirical perspective, identifying and addressing gaps in the prior literature. We report findings from an interview study and thematic analysis across five companies and contribute a concept demonstrator showing how some newly revealed needs can be addressed. In an initial feedback study, practitioners told us that making preliminary information’s maturity visible using the approach could help them appreciate assumptions and knowledge gaps, assist communication, and inform better decisions. This is the first article to address these issues from an empirical perspective and to empirically evaluate a developed solution. The findings imply that managers should deploy simple processes and tools to raise awareness of key information’s maturity during design and development projects; we discuss opportunities for further work to explore how such awareness can support the management of preliminary information flow and use.

2. Research Framework and Article Overview

This research project was structured using the design research methodology (DRM) framework [5]. DRM emphasises the development of support for engineering design in a way that generates both scientific contribution and practical value, by ensuring traceability between research stages and between needs, research decisions, and result evaluation. DRM prescribes four research stages; this article is structured accordingly, as follows:
  • Research Clarification—is required in DRM to formulate the research objective and preliminary success criteria. In this research, this stage was carried out using a literature study, which is discussed in Section 3.
  • Descriptive Study I—is used in DRM to explore the context into which support is to be delivered and to further clarify the problem factors addressed. This research addressed the stage through an empirical study of the real-world problems of dealing with immature information, involving the thematic analysis of 13 interviews with 15 practitioners across five companies engaged in the engineering design of products and systems. The study and the research method used for this DRM phase are detailed in Section 4.
  • Prescriptive Study—is used in DRM to develop support for practitioners that addresses the identified problems. In this research, a concept demonstrator was developed for assessing information’s maturity and making it visible, with the aim of supporting the deliberate management of preliminary information flows and decisions based on them. This is described in Section 5.
  • Descriptive Study II—involves the empirical evaluation of the developed support to establish whether the success criteria are met. In this research, follow-up interviews were conducted with five of the original study participants to gather feedback on the developed support and assess it against the success criteria established earlier. The research method and results for this DRM phase are presented in Section 6, followed by a discussion of implications, limitations, and future work in Section 7 and Section 8.

3. Research Clarification: Literature Review

To recap, the aim of the research clarification stage in DRM is to “identify and refine a research problem that is both academically and practically worthwhile and realistic” [5] (p. 43). This was done by a literature study; findings are discussed in this section. First, Section 3.1 reviews how preliminary information is used in design and development and discusses some of the challenges involved. Second, Section 3.2 reviews the concept of information maturity, which is a key information attribute describing preliminary information’s degree of development. Third, Section 3.3 reviews prior approaches to enhancing the design and development process by improving awareness of preliminary information’s maturity. Section 3.4 then highlights the limited empirical work in this area. The specific gap addressed by the rest of this article is pinpointed in Section 3.5.

3.1. Preliminary Information in Design and Development

Preliminary information is ubiquitous in design and development projects [6], and particularly in concurrent, large-scale systems engineering design. It exists across many process phases, but the need to work with such information is greatest in the early, less-structured phases of design [7]. Here, the knowledge-intensive, one-off nature of design and development work means that process flexibility is needed so that skilled professionals can share their work in progress with colleagues and adapt their designs to deal with issues as they arise. For example, a draft report may be shared between colleagues before it is considered ready, perhaps with incomplete sections or draft content—the purposes for sharing it in preliminary form may include collecting input, gaining feedback, socialising the content before formal release, and allowing colleagues to consider the content as early as possible. Computer Aided Design (CAD) models of designed parts are usually shared and used to begin downstream work before they are finalised; for example, finite element analysis (FEA) will often be possible and useful before all details are added to a part, while manufacturing system design can usually start before all design details are known—the outline shape and size of a part is often sufficient for planning manufacturing machines before the precise geometry and tool paths are finalised.
The value of sharing preliminary information flexibly may be set against the need to control its flow to avoid chaos, particularly in large-scale projects. In a stage-gate process framework [8], understanding the uncertainty around preliminary information presented for gate review may help make effective go/no-go decisions, by helping practitioners identify and challenge the fitness of the information [9] (p. 37). In a set-based framework, appreciating the uncertainties inherent in preliminary information may help to clarify the ranges in which key parameters may eventually fall [10], and hence clarify the design ranges which interfacing solutions should accept. In the structured setting of large-scale concurrent engineering, many teams must be coordinated to work in parallel, which creates significant interdependency [11]. The usual practice is to involve a range of specialists in key decisions as early as possible, with the aim of improving design quality and reducing late rework, which further increases interdependency [12]. Interdependency and the dynamic nature of engineering work mean that, in many cases, all required information may not be available when a particular task needs to be done [13]. In practice, this means some people must accept immature inputs or make assumptions to get started. There is consequently often pressure to release information before it is final to prevent starvation of downstream tasks and increase the degree of task overlapping, thus accelerating project completion. The likely cost of sharing preliminary information is increased effort to deal with design rework when some of that information needs to be updated [1].
Despite the obvious ubiquity of preliminary information in project work in both flexible and structured contexts, and despite the research attention over the years, commonly used engineering tools, including most CAD and Computer Aided Engineering (CAE) software as well as general office and management tools such as productivity and planning software, do not allow for a structured expression of information maturity [6]. Product Lifecycle Management (PLM) tools like Siemens Teamcenter do allow the information’s lifecycle status to be indicated [14]—for example, a document might be flagged as in-work, under review, approved, or released—but such statuses are not fine-grained, and the tools do not assist with actually assessing the status. Another challenge is that information shared electronically lacks contextual cues, such as body language and cues gained from informal discussions, which are crucial for fully appreciating context [15]. In practice, the contextual complexities of the preliminary status of information are, therefore, often communicated through informal conversations or during meetings [9]; without documentation, there is much scope for misunderstanding the status of preliminary information received electronically from coworkers. This may result in overconfidence, and potentially rework when the preliminary information is updated, or under-confidence, which leads to unnecessarily delayed commitments and potentially suboptimal designs.

3.2. Maturity as a Key Status Attribute of Preliminary Engineering Information

Handling the flow of information in projects may be enhanced by an appreciation of its attributes, such as its format, quality, and the assumptions under which it was generated. Having established the important role of preliminary information in the systems engineering design process and some of the challenges in handling such information, this section expands on the specific attribute of preliminary information of interest in this article—its maturity.
In general, the concept of reaching maturity relates to “having attained a final or desired state” [16]. The concept is widely used in the context of maturity models, such as capability maturity models [17] and technology readiness levels [18], which have been developed in many domains to provide frameworks for assessing and improving systems, capabilities, or processes towards a perceived best-practice state, referred to as a mature state [19]. Over 150 such maturity models had been developed by 2005 [20] and likely many more since. However, the concept of information maturity has attracted far less research attention than the maturity of systems, capabilities, or processes [6].
The concept of information maturity has also attracted less attention than other information context attributes such as information quality, fidelity, and uncertainty [6]. Information maturity has been expressed using an array of terms, such as the degree of consensus [7], stability [21], completeness [22], and development [9,23] of the preliminary information. Many publications observe that the information’s maturity depends on the context and information consumers’ needs, including their ability to understand and interpret the information (see, for example, [24,25,26,27]). Information maturity is therefore a socio-technical attribute, dependent on how people perceive, share, and want to use that information. Thus, information that is mature enough for one purpose, consumer, and context may be considered immature in another situation [26]. Considering these issues, Brinkmann and Wynn [6] developed the following definition of information maturity, which is also adopted in this article:
“The maturity of preliminary information is its degree of development in relation to expectations of sufficiency for specific contexts and consumers”
[6] (p. 2)
Brinkmann and Wynn [6] show that many constituent aspects of information’s maturity—including its completeness, precision, stability, fidelity, and so on—are closely related to other information context attributes such as information quality, uncertainty, and knowledge used when generating and interpreting the information. However, unlike these other information attributes, maturity applies specifically to preliminary information in the context of its development process. Appreciating preliminary information’s maturity is essential to consider how it can be most appropriately shared and used during systems engineering design.

3.3. Practical Approaches for Handling Immature Engineering Information

A range of problems in systems design and development could be alleviated by improving the awareness of information maturity. Support approaches and concepts have been proposed to achieve this. Some are generic, intended to apply to any information, while others are specific, focusing on particular information types and contexts.
In terms of generic approaches, O’Brien and Smith [22] may be among the earliest to propose assessing and managing information maturity in systems engineering design, intending to “speed the release of designs with sufficient maturity and prevent the release of designs that have not reached the required maturity” (p. 85). They propose assessing a design’s frequency of change by monitoring CAD file updates and also asking designers to manually assess the magnitude of each update on a 1–100 scale. Clarkson and Hamilton [28] write that the maturity of preliminary information determines when specific engineering design and analysis tasks are justified. They develop a support approach in which designers assess their confidence in design parameter values on a 4-point scale so that suitable tasks can be identified at each design step. Rouibah and Caskey [7] focus on early design stages in concurrent product development, pointing out that the dynamic, iterative, and incremental nature of such processes means they should be organised around the evolving design data status rather than prescribed task sequences. They propose that an engineering design process should be organised around controlling key parameters’ hardness grades, which describe a parameter’s maturity in terms of its perceived quality and reliability and directly reflect process stages and approvals obtained. Blanco et al. [26] highlight that information maturity assessment could support engineering design in the following two ways: (1) by optimising project schedules, such that task timings and degrees of overlapping are appropriate given the maturity levels of their input information, and (2) by supporting informal communication processes, specifically, allowing earlier information sharing by establishing when preliminary information is mature enough to share and use among particular groups of colleagues. Johansson et al. [9,29] write that a common understanding of information maturity in aerospace engineering can facilitate discussions around important decisions. They develop an information maturity assessment grid comprising three maturity criteria: the inputs used to generate the information, the methods used to generate it, and the experience of the person generating it. Drémont et al. [30] propose assessing information’s maturity in terms of the following: scope for variation; performance level versus downstream task requirements; validity duration; and sensitivity of the downstream task to potential future changes in the information. They envisage these aspects being estimated and combined with a simple formula, with the result used to monitor a design’s evolution across iterations and to prioritise design issues needing attention. Zou et al. [31] suggest that maturity assessment could help store, retrieve, and reuse preliminary design information while providing a proper appreciation of its context—they propose capturing key information’s maturity in IBIS-style decision rationale charts after the approach of Bracewell et al. [32]. They also suggest a Bayesian approach could be used to integrate different aspects of maturity into a single indicator, but do not elaborate. More recently, Sinnwell et al. [33] propose that capturing information maturity can ease communication between design and manufacturing planning, and they propose doing this by assessing maturity against nine criteria. This is the most granular general assessment approach we identified.
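To illustrate the flavour of such formula-based aggregation, the following minimal sketch combines per-aspect estimates into a single indicator. The four aspect names follow the paraphrase of Drémont et al. [30] above, but the 0–1 scales, the weights, and the weighted-average formula are illustrative assumptions rather than the published method.

```python
# Minimal sketch of formula-based maturity aggregation, loosely inspired by
# Dremont et al. [30]. The aspect names, 0-1 scales, weights, and the
# weighted-average formula are illustrative assumptions, not the authors'
# published method.

ASPECT_WEIGHTS = {
    "scope_for_variation": 0.3,   # how much the value may still vary
    "performance_vs_needs": 0.3,  # adequacy for the downstream task
    "validity_duration": 0.2,     # how long the value is expected to hold
    "task_sensitivity": 0.2,      # downstream sensitivity to future changes
}

def aggregate_maturity(scores: dict[str, float]) -> float:
    """Combine per-aspect estimates (each in [0, 1]) into one indicator."""
    # Weights sum to 1, so the result also stays in [0, 1].
    return sum(ASPECT_WEIGHTS[a] * scores[a] for a in ASPECT_WEIGHTS)

# Example: a parameter value late in an iteration.
print(aggregate_maturity({
    "scope_for_variation": 0.8,
    "performance_vs_needs": 0.7,
    "validity_duration": 0.9,
    "task_sensitivity": 0.5,
}))  # -> approximately 0.73
```

Such a single indicator is convenient for monitoring a design’s evolution across iterations, at the cost of hiding which individual aspects are lagging.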
Other authors propose specific approaches for handling immature information that are tailored to particular information types and contexts. For instance, Kreimeyer et al. [34] develop an approach for optimising preliminary information transfer from design to simulation departments in an automotive manufacturer, assessing whether changes to preliminary information warrant updates to downstream tasks given the specific design features changed and considering whether the downstream task has accounted for those specific features yet. Similarly, identifying which specific parameters are shared between two disciplines helps determine which aspects of preliminary shared information should be frozen first [35]. Abualdenien and Borrmann [36] propose a method for dealing with incomplete information in building design, while building information modelling (BIM)’s established concept of levels of development provides specific guidance for assessing the maturity of different types of building elements, with standards provided by BIMForum [37]. Ebel et al. [38] show how machine learning may be used to track a CAD file’s maturity as an input for assessing project progress. Overall, the approaches discussed in this paragraph are concrete due to their focus on specific information types, but this also means they cannot offer generally applicable support for managing information maturity.

3.4. Empirical Perspectives

While the aforementioned approaches typically draw insights from knowledge of industry challenges, very few researchers have applied structured empirical methodology to the topic of preliminary information and its management. The most significant empirical study is that of Terwiesch et al. [13], who conduct an in-depth case study involving about 100 interviews in an automotive manufacturer, resulting in proposed strategies for coordinating dependent tasks depending on the precision and stability of preliminary information exchanged between them. Other examples include the work of Grebici et al. [39], who draw on case study experience to discuss how engineers need to understand uncertainty levels to develop confidence in information from peers, considering commitment levels and downstream task sensitivity, while Kreimeyer et al. [34] draw on experience in the automotive sector to examine how simulation models’ maturities evolve during product development. The maturity assessment approach of Johansson et al. [9] is based on industry workshops at Volvo Aero. In critique, the empirical aspects of these publications are presented mainly as reflections on observations, and most do not detail the use of structured empirical methodologies to analyse the collected data and reach the presented conclusions. While these authors’ contributions are valid and important, a study applying structured empirical methodology would add significant value by helping to reveal practitioner perspectives and validate the challenges faced when handling preliminary information.

3.5. Research Gap: Lack of Concretely Developed, Empirically Grounded, and Generally Applicable Solutions for Clarifying Information Maturity

Overall, the literature study reveals a consensus that preliminary information is essential in systems engineering design and that clarifying preliminary information’s maturity can provide many benefits for managing the process. Table 1 summarises these benefits and thereby establishes the practical importance of this research. However, detailed analysis revealed a significant gap: there is still no concretely developed and empirically evaluated approach to provide the required clarity. This research gap is made precise in Table 2, which compares the few generally applicable multidimensional approaches for assessing information maturity that exist in the prior literature. The contributions of Blanco et al. [26] and particularly Johansson et al. [9] represent the state of the art in this area, while the other listed publications are less comprehensive.
Table 2 highlights that while prior publications contribute significantly to establishing needs, theory, and concepts, they do not elaborate or justify key aspects of the solution concepts presented. We could not find a structured empirical evaluation of the maturity assessment concept in any publication. In other words, there are still no concrete solutions for clarifying information’s maturity, and the practical viability of the whole concept therefore remains unproven.
To address the identified gap and summarise the results of the Research Clarification stage of this research, the main research objective was clarified as follows:
Research, develop, and evaluate concrete practical support for clarifying preliminary information’s maturity levels.
The other main output of the Research Clarification phase stipulated by DRM is a definition of success criteria that are achievable and measurable within the project’s scope. In this research the success criterion was defined as follows:
Practitioners should find the approach useful in improving the clarity of information maturity levels.

4. Descriptive Study I: Practitioner Interviews and Thematic Analysis

In DRM, the Descriptive Study I stage stipulates empirical research to better appreciate the practical situation and relevance of the research, as well as to pinpoint specific problems to be addressed by a support approach [5] (p. 76). In this research, with reference to the main research objective stated above, the specific objective for this phase was to explore the reality of handling immature information in design and development. A qualitative empirical method was deemed appropriate because the objective is exploratory; because we also sought generalisable findings, an interview study involving multiple companies was selected. Ultimately, the study provided practical orientation and a starting point for addressing the main research objective stated in Section 3.5. It also confirmed the value of the research by underlining the problems faced by practitioners handling preliminary information.
Companies in New Zealand were approached to recruit study participants. The scope was broad, including any company involved in systems design or engineering design. Preference was given to companies offering interviews with multiple employees, because this would help triangulate findings. The following five companies participated:
  • Company C1: A well-established producer of large-scale agricultural sorting systems, including custom machines and modular solutions. Three of the approximately three hundred employees on site were interviewed.
  • Company C2: A startup producing support and motion equipment for film-makers and photographers. Two of the thirteen employees participated.
  • Company C3: A manufacturer of water jet propulsion systems. Five of the approximately four hundred employees participated.
  • Company C4: A long-standing company in tapware and valving equipment, designed and partially manufactured in New Zealand. Four employees participated; the total number of employees is undisclosed.
  • Company C5: A developer and manufacturer of home appliances and the largest company in the study. One employee was interviewed.
Employees from each company were approached to participate based on the suggestions of the initial contact in each case. We sought a mix of engineering and management personnel. Interviews were conducted by the first author of this paper; they were one-on-one, except in two cases where participants requested to be interviewed in pairs. A linked interview technique [42] was used, meaning the interviewer sought to follow up on insights gained from one interview in the next interview at the same company. Table 3 summarises the 13 interviews and 15 participants.
The interviews were semi-structured, in every case following the schedule below, with the conversation allowed to evolve naturally after each question, as recommended by Adams [43]:
  • Introductions were provided to set the interviewee at ease, and a preprepared script was used to explain the interview’s purpose.
  • The concept of information maturity was outlined using a preprepared script, giving the example that immature design information might be ambiguous (for example, when a concept can be interpreted in different ways) or imprecise (for example, when a numeric parameter is given as a range). The interviewee was asked for their opinion on this concept.
  • The interviewee was asked whether they encountered other aspects of information maturity in their work, and how the maturity of preliminary information was communicated during the design process.
  • The interviewee was asked to comment on how tasks in the design process were affected by immature input information, including how information maturity influenced the timing and effort dedicated to tasks.
  • The interviewee was asked whether problems could be avoided if design information maturity was more deliberately considered during the design and development process.
All interviews were audio-recorded. Thematic analysis was subsequently applied using the approach of Braun and Clarke [44].
Specifically, interviews were first transcribed manually to deepen familiarity; in total, 18 h and 32 min of recordings were transcribed verbatim by the interviewer. The transcripts were then analysed to extract verbalisations relevant to the research objective stated at the start of this section: 601 such remarks were initially identified and then narrowed to the 301 most relevant. To further assist familiarisation with the large volume of qualitative data, the remarks were coded using 10 predefined categories and examined further. In the third step, themes and sub-themes related to the research question were identified inductively by both authors through several iterations of grouping and regrouping the remarks. To be included and considered generalisable, a theme needed to be mentioned by multiple interviewees and, in all but one case, across multiple companies. The resulting thematic structure, which summarises the findings, comprises three themes and ten sub-themes, as illustrated in Figure 1. Towards internal validity, the second author reviewed the original transcripts and confirmed, after some adjustments, that the thematic summary accurately reflects what was said. Finally, towards external validity, the sub-themes were set in the context of prior research findings, as explained in the next subsections; some sub-themes are strongly supported in prior research, while others are less widely discussed. Combining the themes revealed specific needs for a practitioner support approach.
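To make the theme-inclusion criterion concrete, the following sketch shows how coded remarks could be tallied to check that a candidate theme was mentioned by multiple interviewees across multiple companies. The remark records and the helper function are hypothetical illustrations; the actual analysis was conducted qualitatively, not with a script.

```python
# Sketch of the theme-inclusion criterion: a theme is retained only if
# mentioned by multiple interviewees and (for all but one theme in the
# study) across multiple companies. Records and thresholds are
# illustrative placeholders.

from collections import defaultdict

# Each coded remark: (theme, interviewee_id, company_id).
remarks = [
    ("iteration_is_inevitable", "C3E3", "C3"),
    ("iteration_is_inevitable", "C1E3", "C1"),
    ("ad_hoc_assessment", "C4E2", "C4"),
    # ... the actual study retained 301 remarks
]

def generalisable_themes(remarks, min_interviewees=2, min_companies=2):
    """Return themes meeting the interviewee and company thresholds."""
    people = defaultdict(set)
    firms = defaultdict(set)
    for theme, interviewee, company in remarks:
        people[theme].add(interviewee)
        firms[theme].add(company)
    return [t for t in people
            if len(people[t]) >= min_interviewees
            and len(firms[t]) >= min_companies]

print(generalisable_themes(remarks))  # -> ['iteration_is_inevitable']
```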
The developed themes are discussed next, prior to discussing the revealed needs. As is usual for a thematic analysis, direct quotes from the interviews are included to support our interpretations.

4.1. The Need to Work with Immature Information Cannot Be Avoided

The first theme is that the need to work with immature design information is inevitable. Participants explained that preliminary information exists in their work contexts for a range of reasons, all of which stem from the need to iterate, make design changes, and work concurrently. The need to work with immature information is therefore not merely a symptom of a poorly organised process or of some other problem; it is a fundamental feature of design and development practice.

4.1.1. Some Information Is Immature Because Design Iteration Is Inevitable

The iterative nature of design and development means that changes to some information are inevitable and expected [45]. Designers may have no choice but to accept changeable information and begin work, recognising that adjustments will be needed later [46]. For example,
“I get told what the parameters are, but the parameters […] are subject to change, sometimes. Like the [part’s] height. There are a few—kind of—iterations of how high that needed to be…”
(C3E3)
Companies commonly try to avoid such situations by freezing selected information so that it cannot be changed in the future [47]. However, immature information cannot be reliably frozen [48]. Therefore, information updates due to changes and iterations remain unavoidable, a fact strongly emphasised by one study participant, as follows:
“Modifications happen all the time. The design freeze is still a dream. It’s not a clear design freeze as for my experience!”
(C1E3)

4.1.2. Some Information Is Immature Because It Is Carried Across Contexts

Engineering design rarely starts from a clean sheet [49]; in most situations, a new project builds on a prior product generation [50] or a platform [51]. Such design reuse is highly desirable to reduce the cost and time to produce each new product and to benefit from proven design assets [52]. However, one of the challenges entailed is the need to adapt and integrate the assets. In two separate interviews, two colleagues at company C3 explained that carry-over information is accordingly a key driver of immature information early in the design process, as follows:
“We kind of knew what the [product] looked like—we have done smaller [versions of these products] before—we spent a bunch of time researching as to what we wanted to do, but basically I said: ‘Right, I am going to make this analysis model’ and I just used past [product] geometries as a starting point.”
(C3E4)
“If I am working on the hydraulics, I don’t know what the [product] looks like, yet. However, I will try to make some rules about, looking at some old product we have got, so here is some rules about how we want to do that. I guess at that stage you don’t have all the information.”
(C3E1)

4.1.3. Some Information Is Immature Because It Was Deliberately Released Early

Many researchers have described how preliminary engineering information can be released before it is finalised to deliberately increase the overlap between design phases [1,53,54]. Our study also confirmed this practice; several participants explained that shared information may be immature due to deliberate release prior to finalisation, because it is useful even in its preliminary state, as follows:
“Rubbish information now might actually be worth more than better information in the future […] Knowledge now actually has a value. Even if it’s rough.”
(C1E1)
[Early information release is] “a matter of convincing the engineer, saying: ‘Your final solution is not the problem right now. We just need to know the minimum characteristics…’”
(C1E2)
“Once we know the outside dimensions from a casting perspective, we might not necessarily know all the machine details, so we are quite often being working on making some assumptions about: ‘OK, this is the overall size’—and then as the design evolves, the detail comes.”
(C3E2)

4.2. Practitioners Struggle to Express Information Maturity

The second theme is that practitioners struggle to articulate the concept of information maturity, despite all participants confirming that they work with preliminary information during their everyday tasks. Participants also did not report systematically assessing information’s maturity.

4.2.1. There Is No Consensus Among Practitioners on the Aspects of Information Maturity

The concept of information maturity has no clear consensus definition in the literature, as recently shown by the review study of Brinkmann and Wynn [6]. This lack of researcher consensus is reflected in mixed practitioner appreciations of the concept; in our study, no interviewee could articulate more than three different aspects of information maturity when asked to describe the concept. In many cases, the language they used while discussing the concept suggested the described aspects were not conceptually distinct. To illustrate, some participants’ attempts to articulate the maturity concept are now described.
One interviewee described mature information as being detailed, comprehensive, and clear as follows:
“The more detailed, the more comprehensive, the more clear that information is, the easier it is to make those decisions.”
(C4E3)
Another suggested that increasing the level of detail can reduce the ambiguity in preliminary information as follows:
“If you have got a lot of detail to start with, you can probably take out a lot of the ambiguity.”
(C4E1)
C4E2 suggested that maturity relates to the presence of essential elements of the information as follows:
“These are the important bits. These are the like-to-haves, and these are the we-don’t-really-care abouts.”
(C4E2)
Credibility of preliminary information’s sources (or one’s ability to verify information if the credibility is low) was also seen as an important contributor to information maturity as follows:
“There are some things that I might say: ‘Well, OK, I hear what you are saying but I need to do it myself.’ To revisit and ensure that the information is correct. Furthermore, similarly, there is information from sources where you just go: ‘OK, verbatim!’”
(C4E1)
Finally, another interviewee described information’s maturity in terms of its reliability (his word), which he decomposed into the following four levels:
“It starts at guess, then working value, confident and final.”
(C3E4)

4.2.2. Practitioners Assess Information’s Maturity in Ad-Hoc Ways

The second sub-theme is that practitioners assess specific information’s maturity when using that information, but the assessments are done in ad hoc ways. The study participants did not believe there was a need for more systematic approaches to information maturity assessment. For example,
“We have a fairly small office and so we know what everyone’s working on. So if they have only been working on it for a couple of days, you go: ‘Ha, yeah…’ If you know they’ve been working on it for three, four, five weeks, you go: ‘Yeah right, okay, they’ve done a few iterations. They’ve been around where they think it’s going to be. I am reasonably confident that they are not just going to change their mind…’”
(C4E2)
“You would just talk to people, it’s easy. ‘You are working on this? Let us get three of us in a room—hash it out!’ I don’t think that there would be any need for a kind of CAD system or PLM system to indicate what data was preliminary because the design was moving so quickly and so fluidly…”
(C3E4)
One of their colleagues, in a different interview, suggested that accurate assessments of information maturity might not be particularly important, as follows:
“We know that we are working with preliminary data at times, but we don’t necessarily have to get into the absolute detail of it. However, as long as we are directionally correct…”
(C3E2)
Two interviewees observed that assessing information’s maturity is difficult due to not appreciating the endpoint or target state of full development. For example,
“Your perception of what is good is preliminary. There is no way you could ask someone in the 1950s to tell you what quality of TV they want—because they had no idea.”
(C2E1)

4.2.3. Practitioners Represent Information’s Maturity in Ad-Hoc Ways

Thirdly, none of the five companies systematically represent and communicate information maturity—except for the concept of formal release status, which is provided in PLM solutions [14] but is not highly granular. To represent maturity in more granular ways, a range of ad hoc approaches were used by the study participants. One was to indicate immature design properties in an informal way in CAD models as follows:
“We make our CAD bright pink, if it has not applied any material to it yet […] we want to have accurate mass, so it’s bright pink—it stands out!”
(C3E1)
Another reported approach was to deliberately avoid representing immature information at all, to ensure that only sufficiently mature aspects of a design are shared:
“We almost deliberately don’t put too many dimensions on the drawing apart from the overall size, because actually, you don’t want anyone to be able to make it actually from the drawing […] providing that initial information, but being a bit vague, too.”
(C5E1)
Other participants explained that they would share preliminary information but without representing its maturity explicitly, simply presenting the immature information as though it were final and relying on contextual clues to remember or appreciate its status as follows:
“I have given it a best guess, but there is nowhere where I write: ‘This is a best guess.’ I have to keep it in my head, actually, or write it down elsewhere.”
(C4E4)
“The preliminary stuff is usually really preliminary, so it’s quite obvious that you put something in there that is unknown.”
(C2E1)
Only one company seemed to use a systematic approach to represent information maturity; namely, information’s formal release status in a stage-gate process. In this case, C3E1 explained that higher status implies widened visibility and indicates that a more thorough review process has been completed. This reflects the common practice also observed by Blanco et al. [26].

4.3. There Are Coordination Challenges When Working with Immature Information

As shown in Figure 1, the third and final main theme resulting from the empirical study is that practitioners face a range of challenges in handling immature information, which are described in the next subsections. The revealed challenges predominantly relate to coordination, that is, dealing with dependencies among people and activities that are entailed by sharing immature information, rather than making optimal decisions within specific tasks that produce or consume the information.

4.3.1. Individuals’ Willingness to Work with Preliminary Information Differs

One of the revealed coordination challenges is that individuals differ in their willingness to work with immature information. On the one hand, several interviewees expressed a preference to move forward with early, rough approximations rather than investing excessive time in analysis and information development, as the following remarks illustrate:
“Fail fast! Information now! Imperfect! …Do something! Why are you drawing that up in CAD? You can go to the workshop and hack it apart with a saw, a drill and an angle grinder. Go and do something!”
(C1E1)
“People are just having ideas all the time, but some ideas you can prove wrong in a conversation, some of them take some drawing, some of them take a quick CAD model, some of them take a prototype”
(C2E1)
The remarks above reflect the modern practice of prototyping and iterating designs as a way of testing ideas, developing technology, and learning about problems and solutions [55]. On the other hand, another interviewee observed that some engineers have limited comfort when dealing with preliminary information, as follows:
“Quite a few engineers […] really do not cope with ambiguity and unknownness […] they just want to be told what to do…”
(C5E1)
Reflecting this latter group, some participants expressed frustration with the need to work with changeable or immature input information, and they mentioned some ways of dealing with the issue, outlined as follows:
“How confident are you that this information is going to change? Furthermore, if you go: ‘Not confident at all!’ you might go: ‘Well, I will just look like I’m doing something, and then when they change it, it is not going to be too much of an effort…’”
(C4E2)
“Obviously, any further work you are doing is based on that information […] if I can see that there are obvious gaps or inconsistencies, then I will clarify those before I do any further work.”
(C4E3)
These differences may stem from personal preferences as well as the nature of the tasks the individuals are working on. They are likely to cause difficulties in deciding whether and when preliminary information is ready for sharing.

4.3.2. Intensive Communication Is Needed When Working with Preliminary Information

Collaborative engineering design is a highly social and consultative process [56]. In our empirical study, the importance of interpersonal communication to handling immature information was strongly emphasised by participants. For example,
“One of those kind of ambiguous parameters changes and everyone that it impacts needs to be consulted to have their input!”
(C3E3)
“I can’t overemphasise the importance of collaboration and communication—and knocking down the barriers from different departments…”
(C3E2)
Interviewee comments also confirmed that a recipient’s need for mature information may not match the information producer’s ability to provide it at a given point in the process, as follows:
“Typically [the manufacturing engineers] want the precision side of it, because they are going to make some very definitive choices […] to give them enough information early enough, that is always a challenge we are facing…”
(C3E2)
Again emphasising the social and consultative aspects of engineering design, participants told us that such challenges are resolved by negotiating activities around immature information, considering both the producer’s and consumer’s points of view, as follows:
“It’s a personal relationship. When I go to see our production engineers, I will say: ‘I have done this. This is preliminary information.’ ‘I want to see it!’ Or: ‘I have done this, it’s finished. We need you to do a jig to do this.’ However, sometimes it is: ‘We have done this, I need a jig to do this, but we can change a few things…’”
(C4E2)
“We would just talk it through, and he would say: ‘Can we change this? Can we change that?’ Furthermore, [I] would just write those things down and I would do those changes and would send an updated model through to him, and he would say: ‘Yes, that is good’, or ‘it needs further changes’ and then we would come up with something.”
(C4E3)

4.3.3. Dependencies Among Immature Information Contribute to Process Complexity

The items of information available in a project are interdependent, because they describe related aspects of the emerging design and because they are partially derived from one another through performing tasks [57]. Therefore, the maturity levels of different pieces of information are also interdependent [57]. The complexity arising from such interdependencies was recognised by the interviewees, for example,
“[A certain type of part] can be designed preliminary on their own, but the proper nature of [those parts] and some other parts can be analysed only in the assembly context. So the complexity of analysis grows while we are still investigating some preliminary responses.”
(C3E5)
Due to these dependencies, problems can also arise when immature information is incorporated into a design without traceability and without all stakeholders being aware as follows:
“I am not really sure what the inputs were, I am not sure what has changed between then and now. Maybe we have to do the whole job again. It’s a massive hassle.”
(C3E4)
“If only I had known how preliminary that data was…how good the data was […] sometimes you look at it and you go: ‘It will probably be preliminary, and we can shift some of this stuff around.’ Then, you find out that it’s not.”
(C4E2)
“The distinction between decisions that are made on next to no information vs. ones that are made on a lot…very little traceability of that in the system, currently. A lot of that resides within a person’s head.”
(C1E1)
When immature information is updated, the resulting changes should be propagated so that all downstream decisions are revisited and their own outputs updated if necessary [58]. However, the study revealed that the necessary propagation may not happen in practice—hence, practitioners may not recognise when the maturity of certain information is reduced due to prerequisites becoming invalid as follows:
“In a theoretically perfect world, any time a design decision is made or a value changes or anything in the design changed, the information will be propagated to the entirety, instantly, and you will be able to update and adjust for all of it—and if that decision impacts on something negatively, you will get that feedback straight away. Obviously, that does not happen.”
(C3E4)

4.3.4. Misalignment Between Information Maturity and Task Needs Contributes to Process Complexity

Finally, several participants explained problems that occur when suitably mature information is not available prior to starting a task. One is the inability to make necessary decisions:
“We know that we don’t have the information. So therefore we flounder—we meander through until either somebody makes a decision or it becomes Hobson’s choice.”
(C4E1)
Another recognised issue is that insufficiently mature input information for a task may lead to poor quality outcomes and is hence likely to increase rework as follows:
“As soon as you go and base it on guesswork, your quality is shocking! […] it’s having to…being able to get the preliminary data that is based on existing fact, rather than guesswork.”
(C4E2)
The study also confirmed that problems can be caused when information is developed further than needed by the downstream task.
“My experience with other engineers has been: ‘Ah, I finished the job here! Instead of checking it in, I think I can improve on this design […] because I have this luxury of time…’”
(C1E3)
These comments support the concept that assessing information’s maturity and comparing it to the needs of downstream tasks could help to avoid both under- and over-refinement.
Overall, the third and final theme highlights that the necessary use of immature information contributes to multiple coordination challenges in systems engineering design. Notably, no interviewee articulated the corollary that the flow and use of preliminary information could be more deliberately managed to address these challenges.

4.4. Summary of Descriptive Study I Findings: The Revealed Needs for Practitioner Support

To recap, the purpose of this phase of the research project (following the DRM framework [5]) was to confirm the practical value of the research topic, clarify the problem situation, and refine the research objective by identifying specific needs to be addressed by a support approach.
Clarifying the value of the research, every participant confirmed that they work with immature information on a daily basis and experience problems in handling such information, particularly problems of coordination. Clarifying the problem situation, we observed that all participants struggled to articulate the concept of information maturity itself. While some approaches for handling immature information were mentioned, no participant described systematically tracking and managing information’s maturity to streamline processes and avoid unnecessary rework. Participants’ perceptions of the implications of working with preliminary information focused on coordination issues rather than on decision making within individual tasks, and mainly reflected interpersonal communication and negotiation aspects. Participants did not emphasise the systemic implications of immature information flow and how it could be managed to support task planning. This reflects the fact that the companies involved in our study, while all dealing with system-level engineering design, did not exhibit the complex, very large-scale concurrent engineering contexts where managing rework and churn is a more substantial issue. As a result, participant explanations focused on their experiences as providers and recipients of immature information rather than on system-level information flow management.
Table 4 summarises the findings of the interview study and their implications for the whole research project reported in this article. As shown in the table, each finding either supports the hypothesis that information maturity levels should be clarified or indicates a specific need towards achieving the clarification.

5. Prescriptive Study: Solution Development

In DRM, the purpose of the Prescriptive Study includes deciding which problem factors to focus on and developing a support approach to address them. Because creating support is itself a form of design and development, Blessing and Chakrabarti [5] recommend a systematic design methodology: needs should be considered to identify the functions the support approach must provide; alternative concepts for each function should then be considered, leading to ideation and development of a support approach; and the solution should finally undergo initial verification by the researchers. In DRM, the purpose of initial verification is not to comprehensively evaluate the approach but to check that concepts are correctly implemented and to build confidence that the stated needs are met well enough to justify an empirical evaluation study (in this research, such a study was also done and is reported in Section 6).
In this research, as already stated, the objective was to develop support that can provide clarity on preliminary information’s maturity level. The needs identified from the interviews and thematic analysis (Section 4 and Table 4) revealed four specific functions to be provided by a support approach, as shown in Figure 2. The functions, and the development of a practical approach to realise them, are explained next.

5.1. Need 1—The Solution Must Provide Clear Definitions of Information Maturity Aspects

Practitioners in the interview study struggled to articulate the precise meaning of information maturity (Section 4.2.1). The literature review also revealed no consensus on what the dimensions for an information maturity assessment should be (Section 3.3). Without a clear and common conceptualisation, maturity levels cannot practically be clarified.
To address this need for clear conceptualisation, an integrative literature review [59] was used to develop and systematically verify a new conceptual framework unpacking the many aspects of information maturity. This part of the solution development is reported in depth in a recent article [6] and need not be repeated here. Table 5 summarises the result: a hierarchically structured taxonomy that decomposes the information maturity construct, integrating all maturity dimensions found in 109 reviewed articles from the literature on engineering design, product development, building design, information systems, and management. In overview, the taxonomy combines 18 distinct aspects of information maturity from prior research and organises them into three categories: (1) aspects concerning the maturity of preliminary information’s content, including its completeness and clarity; (2) aspects relating to the information’s context, including how well it meshes with the project context and how much it is expected to change; and (3) aspects relating to the information’s provenance, including the suitability of the assumptions made while generating it and the degree to which it has been properly generated and validated. The foundational concept of the support approach introduced in this article is that answering each of the questions set out in Table 5 provides a means for describing information’s maturity in a comprehensive and granular way that integrates conceptual insights from prior work on the topic. Readers are referred to Ref. [6] for full details and an explanation of the taxonomy’s conceptual foundations.
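To indicate the shape of this taxonomy in concrete terms, the sketch below encodes the three categories, each with aspects phrased as assessment questions. Only the aspects named above are included, and the question wordings are paraphrases; the full 18-aspect taxonomy and its exact wording are given in Table 5 and Ref. [6].

```python
# Sketch of the taxonomy's shape from Ref. [6]: three categories of
# information maturity aspects, each carrying an assessment question.
# Aspect names and question wordings are paraphrased assumptions;
# the full taxonomy in Table 5 has 18 aspects.

MATURITY_TAXONOMY = {
    "content": {
        "completeness": "Are all expected elements of the information present?",
        "clarity": "Can the information be interpreted without ambiguity?",
    },
    "context": {
        "contextual_fit": "How well does the information mesh with the project context?",
        "expected_stability": "How much is the information still expected to change?",
    },
    "provenance": {
        "assumption_suitability": "Were suitable assumptions used when generating it?",
        "validation": "Has the information been properly generated and validated?",
    },
}

# Enumerate the assessment questions, category by category.
for category, aspects in MATURITY_TAXONOMY.items():
    for aspect, question in aspects.items():
        print(f"{category} / {aspect}: {question}")
```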

5.2. Need 2—Provide a Structured Way to Elicit and Represent Information Maturity

To recap, the second theme of the interview study revealed that practitioners do not use structured, systematic, and repeatable approaches to assess or represent information maturity levels (Section 4.2). The literature review of Section 3 showed that no prior approach provides detailed solutions for these tasks, as summarised in Table 2.
Four approaches were considered to address the need for a structured, detailed, and generally applicable way to assess information maturity: (1) automatic detection of information maturity levels (like refs. [38,60,61]); (2) manual assessment based on a generic scale-based approach, such as a 1–5 scale for every maturity aspect (like refs. [26,30,33]); (3) manual representation of maturity levels in a bubble diagram (like ref. [31]); and (4) a manually completed maturity grid (like ref. [9]). The following disadvantages of options 1–3 were noted. While automatic detection is highly desirable, it is not clear how it could be achieved for facets of the taxonomy that are contextual in nature; to be applicable in this research, automated detection would also need to handle any type of information developed during a design and development project, whereas the above-listed approaches are specific to particular information types. A scale-based approach, such as assessing maturity levels on a 1–5 scale, does not indicate precisely how to determine when information reaches each maturity level. The diagram-based proposal of Zou et al. [31] is highly preliminary, untested, and would not be easy to extend to the 18 maturity attributes identified in our study. Considering these disadvantages, it was decided that maturity evaluations in the new approach would use a maturity grid, an instrument well established and proven in many other domains [62,63]. A significant practical advantage of maturity grids is that they are time- and cost-effective to complete [63], as well as flexible, because they can be completed digitally or on paper.
In overview, a maturity grid is a table that shows the aspects of maturity to be assessed (defined in the table row headers) against a number of discrete maturity levels that can be achieved for each aspect (in the table column headers). Maturity grids are distinguished from scale-based assessments by the precise definition of every maturity level for every aspect of maturity being assessed [63]—the definitions are written in the cells of the grid. Once a maturity grid is defined, a particular information item’s maturity may be assessed by working systematically through all the rows in the grid and, for each row, selecting the most appropriate descriptor for the current maturity state. Thus, a completed maturity grid provides detailed justification for the assessed maturity level, and by examining the descriptors, also indicates what developments are necessary to reach a higher level. An information producer may complete a grid to assess and express the status of information they are working on, while an information consumer may use the same system to define and communicate the maturity levels they require.
In this research, the process recommended by Maier et al. [63] informed development of new maturity grids for the specific context of preliminary engineering information. The first step was to identify the set of maturity elements to be assessed, thereby forming the row headers of the grid. The eighteen elements of our recently developed information maturity taxonomy (Table 5) were used. The second step was to define the scale for assessing each element, forming the column headers of the maturity grid. Five maturity levels were allowed for, which is common practice [63]. The third and final step was to define the maturity levels themselves for each element, thereby filling all cells of the 18 × 5 grid.
In this case, because the 18 aspects of maturity in the taxonomy apply to all types of preliminary information, they are relatively abstract. To assist interpretation, three separate maturity grids were developed to concretise the assessments for three types of preliminary information commonly encountered in system engineering design projects:
  • Documents—such as test reports or project plans.
  • CAD models—such as assembly models comprising multiple parts and subassemblies.
  • Parameter values—such as the length of a bolt or the mass of an assembly.
It was determined that the maturity of the simplest, unstructured form of information—the parameter value—cannot meaningfully be assessed against all of the 18 aspects shown in Table 5, particularly because many of the aspects concern the information’s internal structure, which a simple number does not have. This can be appreciated by reading the descriptions in Table 5.
For each of the three information types listed above, the 18 × 5 maturity descriptors were then defined. This was straightforward for numeric parameters after the non-assessable aspects had been excluded. Specific maturity level definitions were more challenging to formulate for CAD models and documents because each dimension of a maturity assessment may be interpreted in the following two ways: (1) as the proportion of the information’s elements, such as a document’s sections or a CAD assembly’s parts, that have reached a specified level of maturity; or (2) as the degree to which full maturity is reached considering all the information’s elements. The final maturity grids are carefully worded to allow flexibility for the person performing an assessment to choose the appropriate interpretation for their specific context.
To illustrate how the maturity level definitions are worded, the five possible levels of the maturity aspect Coverage for the information type CAD assembly are as follows:
  • Level 0—None, or very few of the subassemblies and parts have been created yet.
  • Level 1—Only some of the subassemblies and parts have been created.
  • Level 2—Many subassemblies and parts required for this CAD assembly are already present; a significant number remain to be added.
  • Level 3—Most subassemblies and parts are present.
  • Level 4—All necessary subassemblies and parts are present—regardless of their levels of detail and definition.
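To indicate how such a grid row might be handled digitally (the grids may be completed digitally or on paper, as noted above), the following minimal Python sketch encodes the Coverage row above as a simple mapping from level numbers to descriptors; the data structure and variable names are our own illustration, not part of the published grids:

```python
# Illustrative digital encoding of one maturity grid row:
# the aspect "Coverage" for the information type "CAD assembly".
coverage_cad_assembly = {
    0: "None, or very few of the subassemblies and parts have been created yet.",
    1: "Only some of the subassemblies and parts have been created.",
    2: ("Many subassemblies and parts required for this CAD assembly are "
        "already present; a significant number remain to be added."),
    3: "Most subassemblies and parts are present.",
    4: ("All necessary subassemblies and parts are present, regardless of "
        "their levels of detail and definition."),
}

# Assessing the aspect means selecting the level whose descriptor best
# matches the current state of the information:
assessed_coverage = 2
print(coverage_cad_assembly[assessed_coverage])
```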
The developed maturity grids are presented in full detail in Appendix A, Figure A1, Figure A2 and Figure A3; the reader is encouraged to explore their detail, as they are ready for use in practical settings.
The grids allow items of preliminary information to be scored on each of the maturity aspects shown in Table 5. The decomposition of information maturity into the rows of the maturity grids for different types of information allows for a highly granular, justified assessment and is intended to minimise subjectivity as far as possible. The result of completing the assessment for one piece of information is 18 integer values between 0 and 4, where 4 indicates full maturity on a particular aspect and 0 indicates no maturity—meaning that the information largely does not exist yet or the value is not defined yet. Parts of the assessment may also be skipped if the corresponding maturity aspect is deemed inapplicable to the specific information and context. This allows for a wide range of use cases.
The suitability of the new grids for eliciting information maturity levels was initially checked by the authors through trials on a range of cases. This provided the confidence to justify seeking practitioner feedback, which is discussed in Section 6.

5.3. Need 3—The Solution Must Combine and Summarise Information Maturity Assessments

The interview study revealed the need for support to clearly communicate information maturity levels (Table 4). To achieve this, the 18 values resulting from an assessment grid need to be aggregated into a smaller number of values to provide easily comprehensible summaries of information’s assessed maturity levels.
As shown in Table 2, most approaches in the literature that provide for multidimensional information maturity assessments do not consider how to aggregate those dimensions into a single value. The main exception is the formula of Drémont et al. [30] for calculating information maturity by combining the assessed scope for variation, the assessed level of design performance, the assessed time for which preliminary information is considered valid, and the sensitivity of the downstream task to changes in the information. Johansson et al. [29] provide a figure that suggests combining their three maturity aspects to obtain an overall maturity level, but their article gives little detail, so the aggregation method remains unclear. Zou et al. [31] suggest using a Bayesian approach to aggregate maturity aspects but do not provide an explanation or example of how this would work.
Because no clear solution exists in prior work, a range of possible approaches was considered. The simplest would be to average the 18 assessed aspects to arrive at an overall maturity level. However, averaging is a form of compensating aggregation, in which low values in some factors may be compensated for by high values in others. Compensating aggregations are usually unsuitable for combining assessments in engineering contexts because low values are often critical and should not be easily disguised [64,65]. To illustrate in the maturity assessment context, consider a situation in which a preliminary CAD model was analysed and the analysis showed that the bulk part design would not meet strength requirements. This would result in a reduction in the CAD model’s validity according to the definition given in Table 5. Logically, it should not be possible to recover the previous maturity level by ignoring the unfavourable analysis and adding more design depth, such as chamfers, to the same design. One alternative is a non-compensating aggregation, for example, combining elements of the assessment by multiplication or using the min function. Non-compensating aggregations ensure that very poor outcomes are never disguised, but a single zero renders the whole aggregated value zero, which is also not realistic for information maturity assessments. More sophisticated aggregation schemes were also considered, specifically fuzzy logic-based [66] and Bayesian network-based approaches, both of which are appealing because maturity assessments are themselves not crisp, as well as crafting a mathematical function to combine the individual assessment elements based on arguments about the relationships between the 18 maturity aspects, as in Zou et al. [31]. However, initial trials, some reported in Brinkmann [67], showed that these approaches were excessively complicated, would be difficult to explain to practitioners, and did not yield demonstrably better results than the simple approach ultimately selected, which is explained next.
To reduce the undesired disguising of extremely low values while avoiding overall maturity assessments of zero when only one element of the assessment grid is zero, and to do so in a way that is easy to explain to practitioners, the new approach uses a simple combination of compensating and non-compensating aggregation. Specifically, leveraging the hierarchical structure of the Table 5 taxonomy, the weighting scheme shown in Equation (1) is used to aggregate the three evaluations $x_1$, $x_2$, and $x_3$ within each lowest-level subcategory of Table 5 as follows:
$$x_{123} = \frac{80 \times \min(x_1, x_2, x_3) + 50 \times \operatorname{median}(x_1, x_2, x_3) + 20 \times \max(x_1, x_2, x_3)}{150} \quad \text{(1)}$$
As an illustrative example, consider the subcategory completeness in Table 5 and a situation in which an information maturity assessment yielded coverage = 2, depth = 3, and readiness = 1. In this case, $x_{\text{completeness}} = [80 \times \min(2,3,1) + 50 \times \operatorname{median}(2,3,1) + 20 \times \max(2,3,1)]/150 = [80 \times 1 + 50 \times 2 + 20 \times 3]/150 = 1.6$. The resulting values for each pair of subcategories shown in Table 5 are then aggregated using a similar 80-20 scheme, in which the lower value of the pair is weighted at 80 and the higher value at 20. Thus, for example, results for completeness and clarity are aggregated to arrive at a result for content. Finally, content, context, and provenance values are aggregated using the 80-50-20 scheme described above to arrive at an overall maturity assessment for the information.
This scheme is justified as follows. The lowest maturity levels are most heavily weighted so that potential design problem areas have the greatest impact on the final aggregated value, for the reasons explained above. The 80-50-20 and 80-20 weightings reflect the so-called 80-20 rule familiar to most practitioners. The aggregation approach was initially tested on a range of hypothetical cases, including those with extreme differences among assessed maturity levels. It generated intuitively reasonable results. Finally, the suitability of the selected weighting factors was evaluated using sensitivity analysis. This showed that changing the weighting scheme from 80-50-20 to 90-50-10 or 70-50-30 yielded less than 2.5% change in the aggregated maturity value for the example given in the next subsection; this was deemed insignificant given that the assessments being aggregated are themselves experiential and, hence, not precise. Additionally, the empirical feedback on the approach (discussed in Section 6) did not reveal practitioner concerns about the aggregation method. If concerns were raised in a particular application context, the analytic hierarchy process [68] could be used to arrive at weightings for that context with input from domain experts.
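To make the aggregation concrete, the following minimal Python sketch, written for this article’s exposition, implements Equation (1) and the pairwise 80-20 scheme; the completeness scores reproduce the worked example above, while all other scores, and the function names, are hypothetical illustrations:

```python
from statistics import median

def aggregate_triplet(x1, x2, x3, weights=(80, 50, 20)):
    """Equation (1): the lowest score is weighted most heavily so that weak
    aspects dominate the result, while higher scores still contribute."""
    w_min, w_med, w_max = weights
    scores = (x1, x2, x3)
    return (w_min * min(scores)
            + w_med * median(scores)
            + w_max * max(scores)) / sum(weights)

def aggregate_pair(a, b, weights=(80, 20)):
    """80-20 scheme for pairs of subcategories: the lower value is weighted
    at 80 and the higher value at 20."""
    w_low, w_high = weights
    return (w_low * min(a, b) + w_high * max(a, b)) / sum(weights)

# Worked example from the text: coverage = 2, depth = 3, readiness = 1.
completeness = aggregate_triplet(2, 3, 1)
assert abs(completeness - 1.6) < 1e-9

# Hypothetical scores for the remaining subcategories, for illustration only.
clarity = aggregate_triplet(3, 3, 2)
content = aggregate_pair(completeness, clarity)
context = aggregate_pair(aggregate_triplet(2, 2, 1), aggregate_triplet(3, 2, 2))
provenance = aggregate_pair(aggregate_triplet(4, 3, 3), aggregate_triplet(2, 3, 2))

# The same 80-50-20 scheme is reused at the top level of the taxonomy.
overall = aggregate_triplet(content, context, provenance)

# Sensitivity check in the spirit described above: vary the triplet
# weighting (shown here for the top level only).
for w in [(90, 50, 10), (70, 50, 30)]:
    print(w, aggregate_triplet(content, context, provenance, weights=w))
```

Note that the same triplet function serves both the lowest-level subcategories and the final content, context, and provenance aggregation, reflecting the reuse of the 80-50-20 scheme described above.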
Overall, the aggregation method explained in this subsection was deemed to function well enough to seek practitioner feedback on the entire approach, while having the significant practical advantage of simplicity.

5.4. Need 4—Provide Visualisations to Clearly Communicate Information Maturity Levels

The identified need for support to clearly communicate information maturity levels (Table 4) is additionally addressed in the concept demonstrator by visualising raw and aggregated assessment results. The prior maturity assessment approaches from the literature, summarised in Table 2, were examined to consider how results were visualised. Very little was found, with the topic either not discussed at all (in refs. [22,30,31]) or maturity levels being visualised as basic bar charts (in refs. [9,26]). Three new visualisation concepts were therefore ideated to illustrate how information maturity assessments could address envisaged use cases. The focus was placed on supporting common communication and coordination tasks because coordination based on immature information had been revealed as particularly challenging in the interview study (Section 4.3.3).
  • The first visualisation was developed to highlight which maturity aspects are lagging behind others and hence guide discussions about how the maturity of key information could be improved. Called the Information Maturity Wheel, this visualisation combines detail and overview of a single maturity assessment, as shown in Figure 3. For example, Figure 3 indicates that the overall maturity level of the information concerned is medium-high, with the lagging aspects being crispness, stability, and readiness.
  • The second visualisation was developed to aid discussions of information maturity in a design review setting as well as aid communication of design progress. Overall maturity levels of subsystems are indicated using colours in a CAD model as shown in Figure 4 (left). The same information may be overlaid onto a tangible design prototype using augmented reality as shown in Figure 4 (right); this example was created using Vuforia Studio and trialled using a Formula SAE car design.
  • The third visualisation was developed to assist tracking of progress on iterative engineering design tasks, where traditional milestone tracking and earned value tracking can be difficult to apply due to a lack of measurable deliverables prior to design completion [69]. The visualisation summarises a series of consecutive maturity assessments in terms of the number of the 18 aspects that reached each of the five maturity levels at the time of each assessment (Figure 5); a minimal sketch of how such a chart can be produced is given after this list. Tracking these charts for key design subsystems could indicate where targeted action may be needed on subsystems whose maturity progress is consistently lagging behind the others.
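As an indication of how readily the third visualisation can be produced, the following minimal Python/matplotlib sketch plots hypothetical assessment data in the stacked form of Figure 5; it is a sketch under assumed data, not the tool used to generate that figure:

```python
# Minimal sketch of the progress-tracking chart (cf. Figure 5), assuming each
# assessment is a list of 18 aspect scores between 0 and 4. Data are hypothetical.
import matplotlib.pyplot as plt

assessments = [
    [0] * 10 + [1] * 6 + [2] * 2,            # early assessment
    [0] * 4 + [1] * 6 + [2] * 6 + [3] * 2,   # mid-project assessment
    [1] * 2 + [2] * 6 + [3] * 8 + [4] * 2,   # later assessment
]

fig, ax = plt.subplots()
bottom = [0] * len(assessments)
for level in range(5):
    # Count how many of the 18 aspects sit at this level in each assessment.
    counts = [a.count(level) for a in assessments]
    ax.bar(range(len(assessments)), counts, bottom=bottom, label=f"Level {level}")
    bottom = [b + c for b, c in zip(bottom, counts)]
ax.set_xlabel("Assessment number")
ax.set_ylabel("Number of the 18 aspects at each maturity level")
ax.set_xticks(range(len(assessments)))
ax.legend(title="Maturity level")
plt.show()
```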

5.5. Summary of Prescriptive Study: Research Contributions of the Developed Solution

To recap, the concept demonstrator presented in this section addresses needs resulting from the empirical study. It is grounded in a conceptual framework that decomposes information maturity into well-defined constituent elements, provides detailed maturity grids to concretise the maturity assessment concept for three distinct types of commonly encountered engineering design information, aggregates assessment components into overall information maturity scores, and visualises assessment results to make information maturity visible to practitioners so that the use of preliminary information can be more deliberately managed. The presented approach thereby addresses the research gap and limitations of prior work summarised in Table 2.

6. Descriptive Study II: Practitioner Feedback

The final stage of DRM is Descriptive Study II. According to Blessing and Chakrabarti [5] (p. 36), its objectives include: (1) to determine whether research goals were adequately addressed, that is, whether the success criteria are satisfied by the developed approach; (2) to inform improvement of the approach; and (3) to suggest how it could be introduced into practice. These authors recommend a multistage evaluation, including a basic verification that the support technically meets needs prior to assessing its practical impact against the success factors. Blessing and Chakrabarti [5] recognise that the ideal evaluation may not be achievable within the scope of many research projects.
In this research, the initial verification was done as explained in Section 5; the assessment against success factors was then approached through a second interview study with reference to the success criterion defined in Section 3.5. Five practitioners from the earlier study agreed to participate; see Table 6. They were introduced to the concept demonstrator’s functionality using a prerecorded video presentation, which summarised the three visualisations described in this article along with an explanation of the maturity grids and assessment process.1 Then, they were asked the following questions:
  • What are your thoughts about the concept that you were just shown?
  • Could you imagine a system like this would be useful to you or to others? Would you use it? If yes: mainly because of what? What for? What are the advantages? If no: why not?
  • What aspects of the concept did you like best and why?
  • What aspects of the concept do you think could be improved and why/how?
As before, all interviews were audio-recorded and manually transcribed for familiarisation and analysis. Comments deemed insightful were extracted and mapped against the three objectives of Blessing and Chakrabarti [5] mentioned at the start of this section. The next subsections summarise the findings for each objective.

6.1. Determining Whether the Success Criterion Is Satisfied

To recap from Section 3.5, the success criterion for this research was that practitioners should find the approach useful to improve clarity of information maturity levels. Most interviewees’ feedback was supportive and emphasised the value of the approach to make information maturity more visible:
“This would absolutely be something that would be beneficial for us […] this checklist to work through and being able to quickly score where each component, for example, is and then having a visual representation where all the parts are. That would be useful; I can see how I would use that pretty much straight away!”
(C4E4)
“The visualisation would be a great way to keep reminding ourselves: That bit of data is not that good! So you need to be looking out for that problem…”
(C3E1)
“Engineers tend to focus on the detail and go down the rabbit hole, so there has to be someone who has that high-level view […] Have we thought about that? Or the assumption on this detail, how do we know that is accurate? So, [the support approach is] a tool to make it visible to everyone, not just the project manager or this high-level-thinker…”
(C3E1)
“I can see it being good for clarifying what you understand and what you don’t […] it could be good for that, particularly your CAD diagram showing colours…”
(C5E1)
C1E1, the director from the agricultural sorting systems manufacturer, was also enthusiastic—emphasising the value of structuring the treatment of information maturity and suggesting that the resulting visibility could provide for more informed decision making, as follows:
“I like the fact that there is a—sort of a—structured approach.”
(C1E1)
“I would love to see these in use, and how quickly people can pick up the intent and apply to different scenarios. The allowance for gut feel—made explicit and captured by way of ‘provenance’—is great too.”
(C1E1)
“It also democratises the ability to make high-level decisions that matter, the ability to manage a process effectively and efficiently—and it does not rely on some kind of magical personal traits of a senior manager…”
(C1E1)
C5E1, the senior specialist in product development from the home appliances manufacturer, suggested the approach could be particularly useful at early design stages when the need to handle immature information is greatest, as follows:
“It could be very useful at the start of a project. However, it’s a tool, and like all tools, I can see it to be used for good or evil […] it might be useful working by yourself as a reminder of what you know—and what you don’t know—at the start.”
(C5E1)
The opinion of the fifth interviewee, C2E1, differed from the others. He was reluctant to consider a systematic approach to information maturity assessment, referring to his company’s simple process and fast-paced environment. Recall from Table 3 that this participant was the director of a startup with only 13 employees.
“Way too complicated from my point of view. Essentially, when we look at the process, we only have three stages, and we rely on people to fill in the gaps.”
(C2E1)
“We are not facing the same problems over and over again, because the industry is always young […] So that is why [the approach] does not apply as much to us.”
(C2E1)
Overall, the four participants from the larger companies all saw value in the concept demonstrator, with emphasis on how it can clarify information maturity, especially through visual representations, which is expected to help avoid oversights and support communication and decision making. The importance of this insight into how maturity assessments can add value led to its inclusion in the title of this article. The startup company director did not see value in the approach, and it is unlikely to be useful in a small company context—except perhaps to support communication with external parties.

6.2. Identifying Improvement Opportunities

The main improvement area, mentioned by several participants, was predictably to reduce the overhead of applying the approach as follows:
“One of the challenges, and I got that from every manager I ever had, is the time overhead for managing things with a process.”
(C1E1)
“…the usual challenge of finding time to use. I think there are generally other checklists for different subjects that are kind of good practice to use, but then you get pushed to a deadline to achieve and milestones, so you have to cut corners.”
(C4E4)
“How can you speed up the overhead of assessing the maturity?”
(C3E1)
The interviewees’ choices of words do not indicate that the effort is perceived to be highly excessive, but rather that, like any support method, the new approach should be made as quick and easy to use as possible. This reflects many researchers’ observations on the desirable properties of design methods in general [70,71]. We considered whether the number of assessment items in the grids could be reduced to make assessments quicker. A simplified version of the grids was generated in which each triplet of bottom-level maturity attributes was merged into one. For example, considering Figure A1, the coverage, depth, and readiness rows were combined into a single row for directly assessing completeness, and so on for the rest of the grid. While reducing assessment effort, the summarised grids had clear disadvantages: they are more difficult to interpret, offer lower assessment precision than the full grids and, once completed for a particular piece of information, are less useful as checklists for what needs to be done to reach full maturity. Future research could therefore focus on reducing application effort for the full-scale grids and on more comprehensively establishing the value of information maturity management to help justify application. Suggestions are provided in Section 8.2.

6.3. Identifying Insights for Introduction into Practice

The evaluation study participants shared reflections bearing on the approach’s potential introduction into practice. In particular, introducing the approach would require suitable processes and responsibilities, could benefit from elaboration of the use cases to clarify when the approach should be used and to convince practitioners of the value of entering the information, and could also benefit from considering how reported maturity levels might be manipulated.
C3E1, a senior engineer from the water jet propulsion systems manufacturer, focused on the need for assigned responsibility to keep information maturity assessments up to date as a design evolves:
“It made me think of the process of how to keep it up to date. Someone would need to—on a weekly basis or similar—go through all the parts.”
(C3E1)
Another identified implementation issue was the need for people to appreciate the value and limitations of the approach, which could be addressed in future research by further clarification of use cases, as follows:
“Managers need to understand the process, the limitations, […] when it is appropriate to deploy and when it may not be…”
(C1E1)
Finally, C5E1 observed that company politics and individual decisions regarding how to express individual maturity levels could affect the result of assessments, and hence, their value, as follows:
“I could see someone who was doing a good job of guarding their team’s interest but not a good job of looking after the overall project, saying—clicking all the buttons to make their part go green—and you cannot change any of the things of this—when in actual fact you could!”
(C5E1)

6.4. Summary of Descriptive Study II

The final DRM phase showed that participants in larger companies perceived value in the concept demonstrator. The phase also revealed opportunities to improve the approach, particularly to reduce application effort, and insights that could help bring the approach into practice.

7. Implications

7.1. Summary of Practical Findings and Implications

The first practical finding from this research is that, although preliminary information is ubiquitous in design and development processes, practitioners find it difficult to articulate the concept of information maturity clearly and do not systematically assess, represent or manage key information’s maturity. This causes a range of problems including floundering, downstream errors, and rework when information’s suitability for purpose is not well appreciated. The practical implication is that managers should promote greater awareness and discussion of preliminary information’s maturity. This article sets the scene for more research to explore how information maturity can be effectively managed.
The second practical finding is that practitioners in companies with established processes agree that structured maturity assessments and visualisation tools could provide transparency and help to inform decisions. The practical implication is that managers of the iterative design phases in engineering projects should consider introducing structured processes to track and manage the maturity of key information as it is progressively developed. We recommend that the maturity grids presented in this article be considered as the basis for such processes. The grids are ready to use in a practical context, while the aggregation method and the two simpler visualisations are straightforward to implement using spreadsheet software. If customisation is desired for a specific context, this may be achieved by further concretising the maturity level descriptors within the grid cells while retaining the overall structures and assessment criteria of the presented grids. For example, considering the top-left cell of Figure A1, the descriptor “All necessary sections, figures, tables and concepts exist in the document” may be replaced with a list of specific sections required for a particular document.
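As an indication of the spreadsheet implementation mentioned above: if, for instance, the three aspect scores of one subcategory were held in cells B2:B4, Equation (1) could be reproduced with a formula along the lines of =(80*MIN(B2:B4)+50*MEDIAN(B2:B4)+20*MAX(B2:B4))/150, with the cell references adapted to the layout of the particular sheet.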
The third practical finding is that, despite recognising the likely benefits of structured information maturity assessments, practitioners have concerns about the effort and time required. Effort may be minimised by keeping assessments simple and applying them selectively where there is particular concern about the proper handling of immature information. Completing the one-sheet maturity assessments developed during this research requires no more than 10–15 min once one is familiar with the task; in problem situations, this effort could easily be outweighed by the benefits of reflecting in depth on key information’s maturity, making it visible and hence allowing it to be actively managed. Further work may also seek to reduce the application effort and quantify the benefits; some suggestions are given in Section 8.2.

7.2. Further Work Implications: Towards Applications for Preliminary Information Management

As already mentioned, the main objective of the developed approach is to make information maturity levels visible. This section presents some future work suggestions to explore how the heightened visibility could help ensure the proper flow and use of preliminary information, particularly in large-scale concurrent engineering where updates to such information are a common cause of churn and project delays [72]. A range of research methods could be used, of which many are collected in the recent review of Escudero-Mancebo et al. [73].
The first suggestion is to explore how targeted information maturity assessments could inform decisions on when preliminary information is ready for release to downstream tasks and, similarly, when it is necessary to update those tasks to account for changes in upstream information. The seminal work of Krishnan et al. [72] previously discussed task overlapping strategies but did not offer a detailed practical approach to actually determine the status of preliminary information; the new approach presented here could potentially help address this gap. To explore this possibility, a case study approach could be used to examine the specific details of particular task overlapping situations, and the new maturity grids could be used to characterise the evolution of preliminary information as it is progressively developed by upstream tasks, as well as to characterise the evolving information needs of the downstream tasks. Some of the challenges of case study research in engineering companies and methods that could be used are discussed by Ahmed [74].
The second future work suggestion is to explore how information maturity assessments could be used to account for the entailed uncertainties when preliminary information is used in design decisions. For example, appropriate allocation of margin can reduce the likelihood that decisions need to be revisited as the information is progressively finalised [75]; further research could examine how the appropriate amount of margin relates to assessed information maturity levels. One potential application for large-scale projects would be informing the appropriate allocation of margin in interfaces to reduce the propagation of changes among coupled subsystems, hence reducing the number of variation orders that need to be handled [76]. The work of Hamraz et al. [77] could provide a starting point. Similarly, clarity on preliminary information’s maturity could inform sensitivity analysis of key decisions or assist in focusing the application of uncertainty-aware design methods such as robust design [78]. For all the suggestions presented in this paragraph, conceptual work is initially needed to explore how the information maturity concept might be used to generate quantitative estimates of uncertainty entailed by preliminary information.
Thirdly, observing that difficulties in tracing decisions made based on preliminary information were highlighted by the study participants (Section 4.3.3), investigating the potential incorporation of information maturity assessments in PLM processes to enhance traceability could be of practical value. To explore this, a case study in a large company that relies heavily on PLM to coordinate information flows would be informative.
Finally, our trials suggested that assessing information maturity levels has much intrinsic value—doing an assessment raises awareness of the maturity of work-in-progress and prompts reflection on one’s ability to assess the maturity level and, hence, on one’s ability to judge how much more work is really required to finalise the information. These benefits are likely to help with planning and tracking of personal work, which can be particularly difficult for inexperienced people and when doing unfamiliar tasks. Therefore, the possibility to use information maturity assessments as a reflection tool could also be explored in future work. This could be particularly useful in educational settings, where research has shown that novice designers and engineers tend to be less effective at gathering, processing, and characterising information relevant to a task than their more experienced counterparts [79,80]. By prompting consideration of information’s suitability for purpose, the approach presented in this article could be used to help students develop more deliberate approaches for seeking and using information when addressing design tasks. At the same time, it should be recognised that challenges faced by design practitioners differ from those faced by students [81], so the industrially derived insights and support approach presented in this article might need to be modified or simplified to be applicable in an educational context. The possibility to use the approach in education could be explored through a controlled study with student groups doing design projects in a university setting. Specifically, some students could be asked to use the developed information maturity grids to characterise their evolving work at key points in their project, and later interviewed to collect their opinions on the value of the approach in helping to structure their work.

8. Concluding Remarks

8.1. Summary of Contributions

To recap, this article has contributed an improved appreciation of practitioners’ challenges when handling preliminary information and a practical approach for clarifying information maturity that can help address those challenges. Although approaches to clarify information maturity have previously been proposed in the design and development research literature, as shown in Table 2, no concretely developed and empirically evaluated approaches ready to support practitioners were found prior to this research. While this article is primarily a practical and empirical contribution, contributions to developing the theory of preliminary information maturity assessment were also made as part of addressing the research gap, as shown in Table 7.

8.2. Limitations and Outlook

Some limitations may be mentioned. The first concerns the effort required to apply the developed approach. As already indicated, a typical information maturity assessment should take 10–15 min, which should not be excessive if targeted to address recognised problem areas. In future, effort reduction might be achieved in several ways, allowing for broader application—we recommend investigating how to fully or partially automate assessments, for which AI processing of preliminary information might be particularly useful. However, there are challenges to overcome; during this research, initial trials in which ChatGPT-4o was instructed to assess the maturity of a preliminary document using the method explained in this article yielded appealing but superficial results and highlighted that a deeper appreciation of information’s specific content and context may be needed by an AI tool if it is to adequately assess maturity. Apart from automating assessments, a structured approach could also be envisaged to determine which maturity aspects are most relevant to a particular context, thereby cutting down the number of aspects needing to be assessed while retaining high resolution and specificity of assessment. A third suggestion is that the workflow for aggregating completed assessments and producing visualisations could be automated to reduce effort—one interviewee envisaged a system in which individual engineers could regularly update key assessments in a common spreadsheet so the overview visualisations could be updated automatically.
The second limitation is that the empirical studies discussed in this article focused on a single country and small-to-medium enterprises. This limits the scope of the findings (for example, applicability to large-scale projects in multinational companies and highly regulated industries is not proven) but also broadens earlier research into design information maturity, which was mainly set in large-scale concurrent engineering contexts. The developed approach is relatively generic and could benefit collaborative work beyond engineering design and development; therefore, further studies could investigate its applicability in other contexts such as software and construction. The number of interviews was also relatively limited. Although there is no clear consensus on the number of interviews required to achieve saturation in thematic analysis, the highly cited analysis of Guest et al. [82] found that 12 interviews are likely to be sufficient, although this depends on the study context. Our study exceeds this number, with 13 interviews in the first part of the study and another 5 in the evaluation study. Nevertheless, further research could conduct more interviews across more companies, particularly larger companies, and analyse them using the same method to strengthen and expand the empirical findings. In particular, the evaluation study should be a focus of attention, since that part of the work involved only five interviews.
Finally, while this research demonstrated that practitioners believe the approach would work and that clarifying information maturity would be useful, we have not yet definitively established usefulness. In other words, we have not established that making information maturity visible would lead to the ultimate benefits expected by prior researchers, which are summarised in Table 1. Nevertheless, the initial industrial evaluation presented here was an essential step to establish the practical value of clarifying preliminary information’s maturity. Since the study participants generally agreed on the main themes described in this article and on the potential value of the proposed solution, the most productive next step for this research is thought to be a trial application of the assessment grids and visualisations in a practical setting, which would yield insights including the effort required to apply the approach and any rigidity it introduces. The trial application would ideally be done in a company context and follow the development of specific information over an extended period. For this, an action research approach combining methods such as document analysis and practitioner interviews is recommended.

Author Contributions

Conceptualisation, J.T.B. and D.C.W.; methodology, J.T.B. and D.C.W.; software, J.T.B.; validation, J.T.B. and D.C.W.; investigation, J.T.B.; writing—original draft preparation, J.T.B. and D.C.W.; writing—review and editing, J.T.B. and D.C.W.; visualisation, J.T.B. and D.C.W. Elements of this article were substantially further developed from the content of J.T.B.’s PhD dissertation [67], which was supervised by D.C.W. All authors have read and agreed to the published version of the manuscript.

Funding

Initial stages of this research received funding from the New Zealand Manufacturing and Design Network. The open access publication received financial support from King Fahd University of Petroleum and Minerals.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the UAHPEC Ethics Committee of the University of Auckland (protocol 021145, date of approval 7 May 2018).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data (interview transcripts) are unavailable due to ethical requirements.

Acknowledgments

The authors are grateful to all participants in the interview study reported in this article, as well as the anonymous peer reviewers whose remarks helped to improve this article.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Appendix A

Figure A1. Maturity grid for assessing preliminary document maturity. Zoom in for full details.
Figure A2. Maturity grid for assessing preliminary CAD assembly maturity. Zoom in for full details.
Figure A3. Maturity grid for assessing preliminary parameter value maturity. Zoom in for full details.

Note

1. Table 5 and Figure A1, Figure A2 and Figure A3, as well as the presented maturity visualisations, have been adjusted to reflect the latest version of the information maturity taxonomy, which is fully detailed in Brinkmann and Wynn [6]. In reality, the evaluation was done before these refinements. However, because the evaluation study focused on the overall concept of the information maturity approach rather than the detailed wording of the grids, the differences are not significant for the findings presented. Therefore, the most up-to-date versions of the taxonomy and maturity grids are presented in this article.

References

  1. Krishnan, V.; Eppinger, S.D.; Whitney, D.E. A Model-Based Framework to Overlap Product Development Activities. Manag. Sci. 1997, 43, 437–451.
  2. Verma, D. (Ed.) Systems Engineering for the Digital Age: Practitioner Perspectives; John Wiley & Sons: Hoboken, NJ, USA, 2024.
  3. Jiao, R.; Commuri, S.; Panchal, J.; Milisavljevic-Syed, J.; Allen, J.K.; Mistree, F.; Schaefer, D. Design engineering in the age of Industry 4.0. J. Mech. Des. 2021, 143, 070801.
  4. Cantamessa, M.; Montagna, F.; Altavilla, S.; Casagrande-Seretti, A. Data-driven design: The new challenges of digitalization on product design and development. Des. Sci. 2020, 6, e27.
  5. Blessing, L.T.M.; Chakrabarti, A. DRM, a Design Research Methodology; Springer: London, UK, 2009.
  6. Brinkmann, J.T.; Wynn, D.C. Aspects of information maturity in design and development. Res. Eng. Des. 2025, 36, 8.
  7. Rouibah, K.; Caskey, K. A workflow system for the management of inter-company collaborative engineering processes. J. Eng. Des. 2003, 14, 273–293.
  8. Cooper, R.G. Stage-gate systems: A new tool for managing new products. Bus. Horizons 1990, 33, 44–54.
  9. Johansson, C.; Hicks, B.; Larsson, A.C.; Bertoni, M. Knowledge Maturity as a Means to Support Decision Making during Product-Service Systems Development Projects in the Aerospace Sector. Proj. Manag. J. 2011, 42, 32–50.
  10. Toche, B.; Pellerin, R.; Fortin, C. Set-based design: A review and new directions. Des. Sci. 2020, 6, e18.
  11. Eppinger, S.D.; Whitney, D.E.; Smith, R.P.; Gebala, D.A. A model-based method for organizing tasks in product development. Res. Eng. Des. 1994, 6, 1–13.
  12. Prasad, B. Concurrent Engineering Fundamentals: Integrated Product and Process Organization; Prentice Hall: Englewood Cliffs, NJ, USA, 1996; Volume 1.
  13. Terwiesch, C.; Loch, C.H.; De Meyer, A. Exchanging Preliminary Information in Concurrent Engineering: Alternative Coordination Strategies. Organ. Sci. 2002, 13, 402–419.
  14. Stark, J. Product Lifecycle Management (Volume 1): 21st Century Paradigm for Product Realisation, 5th ed.; Springer: Cham, Switzerland, 2022.
  15. Brun, E.; Steinar Saetre, A.; Gjelsvik, M. Classification of ambiguity in new product development projects. Eur. J. Innov. Manag. 2009, 12, 62–85.
  16. Merriam-Webster. Mature (Adjective). Available online: https://www.merriam-webster.com/dictionary/mature#h1 (accessed on 14 June 2025).
  17. CMMI. CMMI for Development, Version 1.3; Technical Report CMU/SEI-2010-TR-033; Carnegie-Mellon University: Pittsburgh, PA, USA, 2010.
  18. Mankins, J.C. Technology Readiness Levels; Technical Report; Advanced Concepts Office, Office of Space Access and Technology, NASA: Greenbelt, MD, USA, 1995.
  19. Maier, A.M.; Eckert, C.M.; Clarkson, P.J. Identifying requirements for communication support: A maturity grid-inspired approach. Expert Syst. Appl. 2006, 31, 663–672.
  20. de Bruin, T.; Rosemann, M.; Freeze, R.; Kulkarni, U. Understanding the main phases of developing a maturity assessment model. In Proceedings of the 16th Australasian Conference on Information Systems (ACIS 2005), Sydney, Australia, 30 November–2 December 2005.
  21. Helms, R. Framework for releasing preliminary information in product development. Adv. Eng. Inform. 2004, 18, 231–240.
  22. O’Brien, C.; Smith, S.J.E. Design maturity assessment for concurrent engineering co-ordination. Eur. J. Oper. Res. 1995, 41, 311–320.
  23. Grebici, K. La Maturité de l’Information et le Processus de Conception Collaborative. Ph.D. Thesis, Institut National Polytechnique de Grenoble, Grenoble, France, 2008.
  24. Saint-Marc, L.; Callot, M.; Reyterou, C.; Moly, M.; Girard, P.; Deschamps, J.C. Toward a data maturity evaluation in collaborative design processes. In DS 32: Proceedings of DESIGN 2004, the 8th International Design Conference, Dubrovnik, Croatia, 18–21 May 2004; Marjanovic, D., Ed.; The Design Society: Copenhagen, Denmark, 2004; pp. 69–76.
  25. Goh, Y.M.; Booker, J.D.; McMahon, C.A. A Framework for the Handling of Uncertainty in Engineering Knowledge Management to Aid Product Development. In DS 35: Proceedings of ICED 05, the 15th International Conference on Engineering Design, Melbourne, Australia, 15–18 August 2005; Samuel, A., Lewis, W., Eds.; The Design Society: Copenhagen, Denmark, 2005; pp. 363–364.
  26. Blanco, E.; Grebici, K.; Rieu, D. A unified framework to manage information maturity in design process. Int. J. Prod. Dev. 2007, 4, 255–279.
  27. Fleche, D.; Bluntzer, J.B.; Al Khatib, A.; Mahdjoub, M.; Sagot, J.C. Collaborative project: Evolution of computer-aided design data completeness as management information. Concurr. Eng. 2017, 25, 212–228.
  28. Clarkson, P.J.; Hamilton, J.R. ‘Signposting’, A Parameter-driven Task-based Model of the Design Process. Res. Eng. Des. 2000, 12, 18–38.
  29. Johansson, C.; Larsson, A.; Larsson, T.; Isaksson, O. Gated maturity assessment—Supporting Gate Review Decisions with Knowledge Maturity Assessment. In Proceedings of the CIRP Design Conference, Enschede, The Netherlands, 7–9 April 2008; Lutters, D., Miedema, J., Eds.; University of Twente: Enschede, The Netherlands, 2008.
  30. Drémont, N.; Troussier, N.; Whitfield, R.I.; Duffy, A. A Meta-Model for Knowledge Representation Integrating Maturity for Decision Making in Engineering Design. In Product Lifecycle Management for Society; Bernard, A., Rivest, L., Dutta, D., Eds.; Springer: Berlin/Heidelberg, Germany, 2013; pp. 385–395.
  31. Zou, R.; Flanagan, R.; Jewell, C.; Tang, L. An exploratory study of information maturity in construction and developing a decision-making model. In Proceedings of the 29th Annual ARCOM Conference, Reading, UK, 2–4 September 2013; Smith, S.D., Ahiaga-Dagbui, D.D., Eds.; Association of Researchers in Construction Management: London, UK, 2013; pp. 69–80.
  32. Bracewell, R.; Wallace, K.; Moss, M.; Knott, D. Capturing design rationale. Comput.-Aided Des. 2009, 41, 173–186.
  33. Sinnwell, C.; Siedler, C.; Aurich, J.C. Maturity model for product development information. Procedia CIRP 2019, 79, 557–562.
  34. Kreimeyer, M.; Deubzer, F.; Herfeld, U.; Lindemann, U. Strategies Towards Maturity of Product Information Objects to Manage Concurrent Engineering Processes. In Proceedings of the TMCE 2008, Seventh International Symposium on Tools and Methods of Competitive Engineering, Izmir, Turkey, 21–25 April 2008.
  35. Emami, N. Untangling parameters: A formalized framework for identifying overlapping design parameters between two disciplines for creating an interdisciplinary parametric model. Adv. Eng. Inform. 2019, 42, 100943.
  36. Abualdenien, J.; Borrmann, A. A meta-model approach for formal specification and consistent management of multi-LOD building models. Adv. Eng. Inform. 2019, 40, 135–153.
  37. BIMForum. Level of Development (LOD) Specification Part I & Commentary: For Building Information Models and Data; Technical Report; BIMForum: Stockholm, Sweden, 2018.
  38. Ebel, H.; Riedelsheimer, T.; Stark, R. Enabling automated engineering’s project progress measurement by using data flow models and digital twins. Int. J. Eng. Bus. Manag. 2021, 13, 1–10.
  39. Grebici, K.; Goh, Y.M.; Zhao, S.; Blanco, E.; McMahon, C. Information maturity approach for the handling of uncertainty within a collaborative design team. In Proceedings of the 2007 11th International Conference on Computer Supported Cooperative Work in Design, Melbourne, Australia, 26–28 April 2007; pp. 280–285.
  40. Terwiesch, C.; Loch, C.H. Collaborative Prototyping and the Pricing of Custom-Designed Products. Manag. Sci. 2004, 50, 145–158.
  41. Besharati, B.; Azarm, S.; Kannan, P. A decision support system for product design selection: A generalized purchase modeling approach. Decis. Support Syst. 2006, 42, 333–350.
  42. Padgett, D.K. Qualitative Methods in Social Work Research; Sage: Thousand Oaks, CA, USA, 2016; Volume 3.
  43. Adams, W.C. Conducting semi-structured interviews. In Handbook of Practical Program Evaluation, 4th ed.; Newcomer, K.E., Hatry, H.P., Wholey, J.S., Eds.; Wiley: Hoboken, NJ, USA, 2015; pp. 492–505.
  44. Braun, V.; Clarke, V. Using thematic analysis in psychology. Qual. Res. Psychol. 2006, 3, 77–101.
  45. Yassine, A.; Joglekar, N.; Braha, D.; Eppinger, S.D.; Whitney, D. Information hiding in product development: The design churn effect. Res. Eng. Des. 2003, 14, 145–161.
  46. Wynn, D.C.; Eckert, C.M. Perspectives on iteration in design and development. Res. Eng. Des. 2016, 28, 153–184.
  47. Lim, J.; Kang, S.; Lee, J.; Hong, Y.S. An effective design freeze strategy for establishing design rules in modular product development. J. Eng. Des. 2024, 35, 483–503.
  48. Eger, T.; Eckert, C.M.; Clarkson, P.J. The role of design freeze in product development. In DS 35: Proceedings of ICED 05, the 15th International Conference on Engineering Design, Melbourne, Australia, 15–18 August 2005; Samuel, A., Lewis, W., Eds.; The Design Society: Copenhagen, Denmark, 2005.
  49. Pahl, G.; Beitz, W.; Feldhusen, J.; Grote, K.H. Engineering Design, 3rd ed.; Springer: London, UK, 2007.
  50. Albers, A.; Rapp, S.; Heitger, N.; Wattenberg, F.; Bursac, N. Reference Products in PGE—Product Generation Engineering: Analyzing Challenges Based on the System Hierarchy. Procedia CIRP 2018, 70, 469–474.
  51. Krause, D.; Gebhardt, N. Methodical Development of Modular Product Families: Developing High Product Diversity in a Manageable Way; Springer: Berlin/Heidelberg, Germany, 2023.
  52. Wong, F.S.; Wynn, D.C. M-ARM: An automated systematic approach for generating new variant design options from an existing product family. Res. Eng. Des. 2024, 35, 389–408.
  53. Loch, C.H.; Terwiesch, C. Communication and Uncertainty in Concurrent Engineering. Manag. Sci. 1998, 44, 1032–1048.
  54. Bhuiyan, N.; Gerwin, D.; Thomson, V. Simulation of the New Product Development Process for Performance Improvement. Manag. Sci. 2004, 50, 1690–1703.
  55. Camburn, B.; Viswanathan, V.; Linsey, J.; Anderson, D.; Jensen, D.; Crawford, R.; Otto, K.; Wood, K. Design prototyping methods: State of the art in strategies, techniques, and guidelines. Des. Sci. 2017, 3, e13.
  56. Bucciarelli, L.L. An ethnographic perspective on engineering design. Des. Stud. 2002, 23, 219–231.
  57. Wynn, D.C.; Grebici, K.; Clarkson, P.J. Modelling the evolution of uncertainty levels during design. Int. J. Interact. Des. Manuf. 2011, 5, 187–202.
  58. Brahma, A.; Wynn, D.C. Concepts of change propagation analysis in engineering design. Res. Eng. Des. 2023, 34, 117–151.
  59. Torraco, R.J. Writing integrative literature reviews: Using the past and present to explore the future. Hum. Resour. Dev. Rev. 2016, 15, 404–428.
  60. Abualdenien, J.; Borrmann, A. Vagueness visualization in building models across different design stages. Adv. Eng. Inform. 2020, 45, 101107.
  61. Abualdenien, J.; Borrmann, A. Ensemble-learning approach for the classification of Levels Of Geometry (LOG) of building elements. Adv. Eng. Inform. 2022, 51, 101497.
  62. Wendler, R. The maturity of maturity model research: A systematic mapping study. Inf. Softw. Technol. 2012, 54, 1317–1339.
  63. Maier, A.M.; Moultrie, J.; Clarkson, P.J. Assessing Organizational Capabilities: Reviewing and Guiding the Development of Maturity Grids. IEEE Trans. Eng. Manag. 2012, 59, 138–159.
  64. Scott, M.J.; Antonsson, E.K. Aggregation functions for engineering design trade-offs. Eur. J. Oper. Res. 1998, 99, 253–264.
  65. Scott, M.J.; Antonsson, E.K. Compensation and weights for trade-offs in engineering design: Beyond the weighted sum. J. Mech. Des. 2005, 127, 1045–1055.
  66. Zadeh, L.A. Fuzzy sets. Inf. Control 1965, 8, 338–353.
  67. Brinkmann, J.T. Maturity of Design Information in Engineering Design. Ph.D. Thesis, University of Auckland, Auckland, New Zealand, 2021.
  68. Saaty, T.L. The analytic hierarchy process (AHP). J. Oper. Res. Soc. 1980, 41, 1073–1076.
  69. Wynn, D.; Clarkson, P. Design project planning, monitoring and re-planning through process simulation. In DS 58-1: Proceedings of ICED 09, the 17th International Conference on Engineering Design, Vol. 1: Design Processes, Palo Alto, CA, USA, 24–27 August 2009; Norell Bergendahl, M., Grimheden, M., Leifer, L., Skogstad, P., Lindemann, U., Eds.; The Design Society: Copenhagen, Denmark, 2009; pp. 25–36.
  70. Wallace, K. Transferring design methods into practice. In The Future of Design Methodology; Birkhofer, H., Ed.; Springer: London, UK, 2011; pp. 239–248.
  71. Gericke, K.; Eckert, C.; Stacey, M. What do we need to say about a design method? In DS 87-7: Proceedings of the 21st International Conference on Engineering Design (ICED 17), Vol. 7: Design Theory and Research Methodology, Vancouver, BC, Canada, 21–25 August 2017; Maier, A., Škec, S., Kim, H., Kokkolaras, M., Oehmen, J., Fadel, G., Salustri, F., der Loos, M.V., Eds.; The Design Society: Copenhagen, Denmark, 2017; pp. 101–110.
  72. Krishnan, V.; Eppinger, S.D.; Whitney, D.E. Accelerating product development by the exchange of preliminary product design information. J. Mech. Des. 1995, 117, 491–498.
  73. Escudero-Mancebo, D.; Fernández-Villalobos, N.; Martín-Llorente, Ó.; Martínez-Monés, A. Research methods in engineering design: A synthesis of recent studies using a systematic literature review. Res. Eng. Des. 2023, 34, 221–256.
  74. Ahmed, S. Empirical research in engineering practice. J. Des. Res. 2007, 6, 359–380.
  75. Brahma, A.; Ferguson, S.; Eckert, C.; Isaksson, O. Margins in design – review of related concepts and methods. J. Eng. Des. 2024, 35, 1193–1226.
  76. Jarratt, T.A.W.; Eckert, C.M.; Caldwell, N.H.M.; Clarkson, P.J. Engineering change: An overview and perspective on the literature. Res. Eng. Des. 2010, 22, 103–124.
  77. Hamraz, B.; Hisarciklilar, O.; Rahmani, K.; Wynn, D.C.; Thomson, V.; Clarkson, P.J. Change prediction using interface data. Concurr. Eng. 2013, 21, 141–154.
  78. Arvidsson, M.; Gremyr, I. Principles of robust design methodology. Qual. Reliab. Eng. Int. 2008, 24, 23–35.
  79. Ennis, C.W., Jr.; Gyeszly, S.W. Protocol analysis of the engineering systems design process. Res. Eng. Des. 1991, 3, 15–22.
  80. Atman, C.J.; Adams, R.S.; Cardella, M.E.; Turns, J.; Mosborg, S.; Saleem, J. Engineering design processes: A comparison of students and expert practitioners. J. Eng. Educ. 2007, 96, 359–379.
  81. Winarno, N.; Rusdiana, D.; Samsudin, A.; Susilowati, E.; Ahmad, N.J.; Afifah, R.M.A. Synthesizing results from empirical research on engineering design process in science education: A systematic literature review. Eurasia J. Math. Sci. Technol. Educ. 2020, 16, em1912.
  82. Guest, G.; Bunce, A.; Johnson, L. How many interviews are enough? An experiment with data saturation and variability. Field Methods 2006, 18, 59–82.
Figure 1. Themes resulting from the interview analysis provide insight into the reality of handling immature information. Numbers indicate the article sections in which themes are discussed.
Figure 2. Empirical findings led to needs for support and to the functions of a support approach that provides clarity on information maturity levels.
Figure 3. The Information Maturity Wheel visualises the detail of a single information maturity assessment to communicate information’s maturity level and pinpoint aspects requiring improvement. Colours on the outer rings indicate aggregated maturity levels. The hierarchical structure and definitions reflect the taxonomy of Table 5.
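To make the aggregation behind the wheel concrete, the minimal sketch below rolls the 18 leaf-level facet scores of Table 5 up the taxonomy's three-level hierarchy and maps each aggregated level to a traffic-light colour as used in Figure 4. The grouping mirrors Table 5, but the choice of min() as the aggregation function and the colour thresholds are illustrative assumptions only; weighted or fuzzy aggregation (cf. [64,65,66]) could be substituted.

```python
# A minimal sketch of hierarchical maturity aggregation.
# Grouping mirrors Table 5; min() as the roll-up function and the
# traffic-light thresholds below are illustrative assumptions.

TAXONOMY = {
    "Content": {
        "Completeness": ["Coverage", "Depth", "Readiness"],
        "Clarity": ["Nomenclature", "Conciseness", "Crispness"],
    },
    "Context": {
        "Consistency": ["Compatibility", "Coherence", "Understandability"],
        "Dynamism": ["Convergedness", "Stability", "Comprehendedness"],
    },
    "Provenance": {
        "Prerequisites": ["Groundedness", "Traceability", "Trustedness"],
        "Process": ["Suitability", "Rigour", "Validity"],
    },
}

def aggregate(scores: dict[str, int]) -> dict[str, int]:
    """Roll 18 facet scores (1-5) up to aspect-level and top-level maturity."""
    result = dict(scores)
    for top, aspects in TAXONOMY.items():
        for aspect, facets in aspects.items():
            result[aspect] = min(result[f] for f in facets)  # conservative roll-up
        result[top] = min(result[a] for a in aspects)
    result["Overall"] = min(result[t] for t in TAXONOMY)
    return result

def traffic_light(level: int) -> str:
    """Map an aggregated maturity level to a display colour (assumed thresholds)."""
    return "green" if level >= 4 else "amber" if level >= 3 else "red"
```

Stored alongside the information itself, assessments aggregated this way can drive the wheel's outer rings or any of the other visualisations discussed below.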
Figure 4. (left) Visualising the aggregated design maturity of product subsystems using a traffic light colour scheme overlaid on a CAD model to support design reviews and communication. (right) Visualising the design maturity of subsystems in an evolving design by augmented reality overlay on an earlier product generation.
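A sketch of how such an overlay might be driven, assuming per-subsystem aggregated maturity levels are already available: each level is mapped to a colour and applied to the matching geometry. The set_colour call and FakeModel class are hypothetical stand-ins, since the concept demonstrator is not tied to any particular CAD or AR platform.

```python
class FakeModel:
    """Hypothetical stand-in for a CAD/AR API exposing per-subsystem colouring."""
    def set_colour(self, subsystem: str, rgb: tuple[int, int, int]) -> None:
        print(f"{subsystem} -> RGB{rgb}")

COLOURS = {"red": (200, 30, 30), "amber": (230, 160, 0), "green": (30, 160, 60)}

def overlay(model, maturities: dict[str, int]) -> None:
    """Apply a traffic-light maturity overlay (thresholds assumed, as above)."""
    for subsystem, level in maturities.items():
        name = "green" if level >= 4 else "amber" if level >= 3 else "red"
        model.set_colour(subsystem, COLOURS[name])

overlay(FakeModel(), {"Chassis": 4, "Drivetrain": 2, "Electronics": 3})  # sample data
```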
Figure 5. Visualising the results of successive information maturity assessments to assist design progress tracking. The colours represent how many of the 18 facets have reached each maturity level at each point in time. As a process moves forward, the green areas representing high and full maturity may be expected to progressively cover more of the plot.
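The data behind such a chart is straightforward to assemble: for each assessment date, count how many of the 18 facets sit at each of the five maturity levels. A minimal sketch, assuming assessments are stored as facet-to-level mappings (the sample data is hypothetical):

```python
from collections import Counter

LEVELS = [1, 2, 3, 4, 5]  # low ... full maturity (assumed numeric coding)

def level_counts(history: dict[str, dict[str, int]]) -> dict[str, list[int]]:
    """For each assessment date, count how many facets sit at each maturity level."""
    return {date: [Counter(scores.values())[lvl] for lvl in LEVELS]
            for date, scores in history.items()}

# Hypothetical successive assessments (18 facets each in practice; 3 shown):
history = {
    "2021-03-01": {"Coverage": 2, "Depth": 1, "Stability": 3},
    "2021-04-01": {"Coverage": 4, "Depth": 3, "Stability": 4},
}
print(level_counts(history))
# {'2021-03-01': [1, 1, 1, 0, 0], '2021-04-01': [0, 0, 1, 2, 0]}
# Plotting these counts as stacked areas over time reproduces the style of Figure 5.
```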
Table 1. Value of this research—the benefits of clarifying preliminary information’s maturity.
Some Benefits of Clarifying Information’s Maturity Levels in Systems Engineering Design
Prevent excessive refinement before information handover [22]
Prevent information release before sufficient maturity is reached [22]
Sequence tasks and decisions to reflect maturity evolution and hence reduce design iterations [40]
Determine when changes to immature information need to be propagated across departments [34]
Consider reliability of immature information when making decisions at stage gates [9]
Provide clarity on preliminary information’s maturity when retrieved from databases [26,31]
Determine when evolving information is mature enough to justify specific tasks [28]
Organise processes dynamically around evolving information, rather than unrealistic task sequences [7]
Monitor design evolution across iterations and highlight which parameters need attention at each step [30]
Systematically account for risks when evaluating immature design alternatives [41]
Communicate preliminary information effectively between design and manufacturing [33]
Table 2. Pinpointing the research gap—a lack of empirically grounded, comprehensively developed, and empirically evaluated solutions for clarifying preliminary information’s maturity. Contributions are sequenced chronologically, and the research gap is summarised in the final row.
Ref. | Justification of Need or Requirements | Maturity Aspects Defined for Assessment | Assessment Approach and Maturity Levels Proposed | Aggregation and Visualisation | Evaluation of Proposed Approach
O'Brien and Smith [22] (short journal article) | Theoretical argument | 2 aspects proposed without justification | Manual and automatic suggested (little detail) | No aggregation or visualisation | Not discussed
Drémont et al. [30] (conference paper) | Theoretical argument | 3 aspects proposed without justification | Manual on qualitative scale and design perf. calcs. (little detail) | Formula defined to aggregate 3 aspects; no visualisation discussed | Not discussed
Zou et al. [31] (conference paper) | Theoretical argument | 11 aspects shown on diagram without definition, explanation or justification | Assessing arguments' supporting information | Bayesian network aggregation suggested; no detail; no visualisation discussed | Not discussed
Blanco et al. [26] (full journal article) | Mainly theory; design office observations mentioned, method not detailed | 4 aspects proposed without justification | Select 4-point scale for each of 4 aspects | No aggregation. Tool visualises assessed maturities vs. required levels | Not discussed
Johansson et al. [9] (full journal article) | 4 workshops at Volvo Aero and 1 in VIVACE project; method not detailed | 3 aspects developed in the workshops | Method not detailed. 3 × 5 maturity grid inspired by Technology Readiness Levels | No aggregation. Graph shows info. maturities vs. stage gate criteria | Expanded hypothetical example
Sinnwell et al. [33] (conference paper) | Theoretical argument | 9 aspects identified from the literature | Select 5-point scale for each of 9 aspects | No aggregation or visualisation | 1-paragraph hypothetical example
GAP | No structured empirical study establishes the need and specific requirements | Typically little justification of maturity aspects selected; no consensus | Most approaches described in very little detail; most not developed beyond concepts | Aggregation and visualisation of assessment results barely treated | No solution has been empirically evaluated to establish concept viability
Table 3. Participants in the interview study.
Code | Interviewee's Job Title | Date of Interview | Duration
C1-E1 | Director, R&D Project Management | 6 July 2018 | 1:28 h
C1-E2 | Operation Program Manager | 12 July 2018 | 2:08 h *
C1-E3 | Senior Mechanical Design Engineer | 12 July 2018 | 2:08 h *
C2-E1 | Director | 18 July 2018 | 1:46 h *
C2-E2 | Industrial Designer | 18 July 2018 | 1:46 h *
C3-E1 | Senior Engineer | 27 July 2018 | 1:32 h
C3-E2 | Technical Project Manager | 27 July 2018 | 1:21 h
C3-E3 | Engineer in Hydrodynamics R&D | 27 July 2018 | 1:06 h
C3-E4 | Senior Mechanical Engineer | 27 July 2018 | 0:51 h
C3-E5 | Principal Mechanical Engineer, FEA | 27 July 2018 | 1:16 h
C4-E1 | Group Standards and Technology Manager | 30 July 2018 | 1:31 h
C4-E2 | Senior Development Engineer | 30 July 2018 | 1:30 h
C4-E3 | Senior Technology Development Engineer | 1 August 2018 | 1:40 h
C4-E4 | Product Development Engineer | 1 August 2018 | 0:59 h
C5-E1 | Senior Specialist Product Development | 3 October 2018 | 1:24 h
* Indicates multiple participants per interview
Table 4. Summary of empirical findings and implications from the Descriptive Study I phase.
Empirical Findings About Current Practice | Implications and Revealed Needs for Support
Section 4.1.1: Information is immature due to design iteration | Confirms that immature information is ubiquitous
Section 4.1.1: Information is immature due to carry-over assets | Confirms that immature information is ubiquitous
Section 4.1.1: Info. is immature due to deliberate early release | Confirms that immature information is ubiquitous
Section 4.2.1: Maturity concept is unclear to practitioners | Need clear definition of maturity and its aspects
Section 4.2.2: Practitioners make ad hoc maturity judgments | Need systematic, repeatable maturity assessments
Section 4.2.3: Ad hoc (or no) representation of maturity | Need for practical maturity representations
Section 4.3.1: Practitioners want to work with immature info. | Need to support communication of maturity levels
Section 4.3.1: Practitioners want clarity on maturity levels | Need to support communication of maturity levels
Section 4.3.2: Maturity is appreciated through consultation | Need to support communication of maturity levels
Section 4.3.2: Decisions under immature info. are negotiated | Need to support communication of maturity levels
Section 4.3.3: Immature info. affects decisions downstream | Reinforces value of clarifying information maturity
Section 4.3.3: Indirect users often unaware of info. maturity | Reinforces value of clarifying information maturity
Section 4.3.3: Updates to prelim. info. often not propagated | Reinforces value of clarifying information maturity
Section 4.3.4: Low maturity causes floundering (no decisions) | Reinforces value of clarifying information maturity
Section 4.3.4: Low maturity causes guesswork (hence rework) | Reinforces value of clarifying information maturity
Section 4.3.4: Some engineers over-refine info. (hence delays) | Reinforces value of clarifying information maturity
Table 5. Taxonomy decomposing the maturity of preliminary engineering information into eighteen aspects and associated questions, structured in a three-level hierarchy. Adapted from Brinkmann and Wynn [6], whose original work presents the taxonomy's conceptual foundation.
Aspect of Maturity | Description/Key Question
Content | Aspects of information maturity associated with the information itself
Completeness | To what extent is everything included and done?
• Coverage | To what degree is everything in place that is known to be needed?
• Depth | To what degree are the expected detail and conceptual depth present?
• Readiness | What is the believed degree of progress towards maturity?
Clarity | How clear and precise is the information?
• Nomenclature | How adequately are units, concepts, and terminology defined and used?
• Conciseness | How adequately are waffle and unnecessary repetition minimised?
• Crispness | How precise, specific, and lacking vagueness is the information?
Context | Aspects of maturity associated with how information meshes and changes
Consistency | How adequately does the information mesh within itself and with its surroundings?
• Compatibility | How adequately does the information mesh with its environment?
• Coherence | How adequately do elements of the information mesh with each other?
• Understandability | How adequately can the meaning of the information be appreciated?
Dynamism | How much is the information expected to evolve?
• Convergedness | How significant are potential changes?
• Stability | How frequent or likely are changes to the information expected to be?
• Comprehendedness | How well understood is the current state in relation to the mature state?
Provenance | Aspects of information maturity arising from the origins of the information
Prerequisites | How adequate were the inputs used to generate the information?
• Groundedness | Were required prerequisites available; how (in)significant were assumptions?
• Traceability | How adequately can the prerequisites be traced?
• Trustedness | How trusted is the information given its prerequisites?
Process | How thorough and suitable were the generation and validation processes?
• Suitability | How suitable were the generation and validation approaches?
• Rigour | How thorough were the generation and validation processes?
• Validity | How correct and suitable has the information been confirmed to be?
Bold italic text, plain italic text and bullet points indicate the taxonomy's three hierarchical levels.
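One way the taxonomy could be operationalised for self-assessment is to prompt the information provider facet by facet with the key questions above. The sketch below condenses the question wording into a plain console loop; both the wording and the loop are illustrative assumptions, not the concept demonstrator's actual interface, which is based on maturity grids.

```python
# Illustrative self-assessment prompts derived from Table 5 (condensed wording).
FACET_QUESTIONS = {
    "Coverage": "Is everything in place that is known to be needed?",
    "Depth": "Are the expected detail and conceptual depth present?",
    "Readiness": "What is the believed degree of progress towards maturity?",
    "Nomenclature": "Are units, concepts, and terminology defined and used adequately?",
    "Conciseness": "Are waffle and unnecessary repetition minimised?",
    "Crispness": "How precise, specific, and free of vagueness is the information?",
    "Compatibility": "Does the information mesh with its environment?",
    "Coherence": "Do elements of the information mesh with each other?",
    "Understandability": "Can the meaning of the information be appreciated?",
    "Convergedness": "How significant are potential changes?",
    "Stability": "How frequent or likely are changes expected to be?",
    "Comprehendedness": "How well understood is the current state vs. the mature state?",
    "Groundedness": "Were prerequisites available; how significant were assumptions?",
    "Traceability": "Can the prerequisites be traced adequately?",
    "Trustedness": "How trusted is the information given its prerequisites?",
    "Suitability": "Were the generation and validation approaches suitable?",
    "Rigour": "How thorough were the generation and validation processes?",
    "Validity": "How correct has the information been confirmed to be?",
}

def self_assess() -> dict[str, int]:
    """Prompt for a 1-5 rating per facet (a toy stand-in for a maturity grid UI)."""
    scores = {}
    for facet, question in FACET_QUESTIONS.items():
        scores[facet] = int(input(f"{facet}: {question} (1-5): "))
    return scores
```

The resulting scores dictionary is exactly the input format assumed by the aggregation and charting sketches accompanying Figures 3 and 5.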
Table 6. Participants in the second interview study.
Participant Code | Interview Date | Interview Duration
C1-E1 | 3 March 2021 | 52 min
C2-E1 | 29 March 2021 | 16 min
C3-E1 | 5 March 2021 | 34 min
C4-E4 | 22 March 2021 | 18 min
C5-E1 | 19 March 2021 | 31 min
Table 7. Summary of how this article’s contributions address all elements of the research gap established in Table 2.
Research Gap | Contribution of This Article
No structured empirical study confirms the need and establishes specific requirements for support. | Thematic analysis of 13 interviews across 5 companies was used to elicit practitioner perspectives on information maturity and problems faced, to confirm that support is needed, and to develop specific requirements.
Typically little justification of the maturity aspects selected for assessment. | The new approach is grounded in a detailed conceptual framework comprising 18 maturity aspects developed by in-depth literature study.
Most assessment approaches are proposed at a high level with little detail. | Three 18 × 5 maturity grids are provided in full detail and are ready for use.
Aggregation and visualisation of assessment results are barely treated. | The new approach shows how to aggregate and visualise maturity assessments to yield insight for managing engineering work.
No solution has been empirically evaluated to establish concept viability. | A concept demonstrator was evaluated by interview study across 5 companies; feedback was encouraging.