Systematic Review

Digital Engineering: A Systematic Literature Review of Strategies, Components, and Implementation Challenges

Industrial & Systems Engineering and Engineering Management, The University of Alabama in Huntsville, Huntsville, AL 35899, USA
* Author to whom correspondence should be addressed.
Systems 2025, 13(12), 1046; https://doi.org/10.3390/systems13121046
Submission received: 18 October 2025 / Revised: 15 November 2025 / Accepted: 18 November 2025 / Published: 21 November 2025
(This article belongs to the Special Issue Digital Engineering: Transformational Tools and Strategies)

Abstract

Digital Engineering (DE) is redefining systems engineering through data-driven methods, digital models, and integrated tools that support lifecycle decision-making. Since the release of the 2018 U.S. Department of Defense (DoD) Digital Engineering Strategy, DE has gained traction across government, industry, and academia. This systematic literature review examines 56 peer-reviewed publications from 1995 to 2024 to synthesize current research on DE and assess its maturity as a discipline within systems engineering. The review is guided by five research questions focused on DE definitions, reported motivations and benefits, core components, implementation challenges and solutions, and alignment with the five strategic goals of the DoD DE Strategy. The analysis reveals that while DE is widely discussed, its definitions and applications vary, and many studies lack detailed methodologies or implementation guidance. Furthermore, alignment with the DoD’s strategic goals is often implied but not explicitly addressed. Findings highlight the need for standardized frameworks, improved integration strategies, and evidence-based evaluation of DE effectiveness. This work contributes to systems engineering by identifying research gaps and offering a consolidated view of how DE is evolving across sectors. The insights are intended to inform practitioners, researchers, and policymakers as they implement and refine DE practices in complex system developments.

1. Introduction

Digital Engineering (DE) is reshaping how industries design, develop, and manage complex systems. An integrated approach that leverages digital models, data-driven decision-making, and advanced simulation techniques, DE has emerged as a transformative practice across many industries [1]. However, despite its growing significance, DE’s definition, scope, and implementation vary across industries and the academic literature [2]. Since the release of the 2018 U.S. Department of Defense (DoD) Digital Engineering Strategy [3], DE has increasingly influenced government, industry, and academic domains. This research provides a comprehensive assessment of the current state of DE by systematically reviewing its core components, technologies, strategies, adoption challenges, industry applications, and the influence of the 2018 DoD DE Strategy.
Despite the growing body of work surrounding DE, relatively few systematic literature reviews (SLRs) have synthesized the field comprehensively. Existing reviews often focus on individual aspects and components, such as Model-Based Systems Engineering (MBSE) [4,5], digital twin applications in manufacturing [6,7,8,9,10], or the digital thread [11,12]. For instance, some studies explore DE within the aerospace [13], construction [14], or manufacturing [15,16] sectors but rarely examine its cross-industry diffusion or alignment with national strategies such as the DoD’s DE Strategy. One of the most recent SLRs on DE investigated the current state of digitalization practices and proposed a conceptual DE framework to support DoD acquisition processes, pointing out that DE definitions remain fragmented and unstandardized [17]. Another review addresses DE in terms of its current development, implementation, main challenges, and future directions [2]. While these reviews contribute valuable insights, they do not systematically assess the degree to which DE research aligns with the DoD’s five strategic goals or trace the evolution of terminology and best practices over time. This gap highlights the need for a structured SLR that not only catalogs DE tools and methods but also examines the motivations, challenges, and strategic influences shaping the field. By doing so, this study advances understanding of DE in three ways. First, it isolates DE as an explicit construct, distinguishing it from adjacent concepts such as MBSE, digital twins, and the digital thread, and clarifies how authors define and bound the term in practice. Second, it combines component- and benefits-focused synthesis with an implementation lens that surfaces recurring barriers and enabling conditions reported across sectors. Third, by using the 2018 DoD Digital Engineering Strategy as an interpretive framework, it shows which aspects of DE practice are accumulating evidence and which remain largely aspirational, translating a high-level policy document into an empirically grounded research and practice agenda. Therefore, this SLR is structured around three main research objectives:
  • Develop a shared understanding of DE concepts, methodologies, and implementation practices.
  • Identify existing studies, frameworks, challenges, and best practices related to DE knowledge and skills.
  • Evaluate the extent to which the five strategic goals outlined in the 2018 DoD Digital Engineering Strategy have influenced academic literature, technological advancements, and organizational practices in Digital Engineering.
As DE continues to expand, various frameworks, methodologies, and technologies have emerged, shaping its implementation across industries. The first objective therefore explores how DE is defined in the literature, how its definition has evolved, what motivations for DE adoption are reported, and which key DE components are frequently discussed as essential enablers. Organizations must navigate technical, organizational, workforce, financial, and regulatory barriers to implement DE strategies effectively. The second objective is to categorize and quantify these challenges while identifying best practices that have enabled successful DE adoption across sectors. The last objective assesses the impact of the 2018 DoD Digital Engineering Strategy on academic research and industry practices, to determine whether the strategy has significantly shaped DE research, methodologies, and best practices. To address these objectives, we structured this research around five research questions (RQ):
  • RQ1: How is DE defined in the literature?
  • RQ2: What are the motivations and benefits, stated in the literature, to apply DE?
  • RQ3: What DE components (e.g., digital twins, MBSE, digital threads, authoritative source of truth) are most mentioned when talking about DE?
  • RQ4: What are the key challenges in implementing DE, and what solutions have been proposed?
  • RQ5: To what extent do academic publications explicitly reference or align with the five strategic goals of the 2018 DoD Digital Engineering Strategy?
By systematically reviewing the existing literature, this study seeks to clarify the current state of DE holistically and to highlight areas that require future research. The findings of this paper contribute to a deeper understanding of DE’s role in modern engineering practice and provide insights to guide future applications. The remainder of this paper is structured as follows: Section 2 provides background on current DE research; Section 3 describes the methodology of the systematic literature review; Section 4 presents the results; Section 5 discusses the results; and Section 6 presents concluding thoughts.

2. Background

Technological advancement, increased system complexity, and growing demands for speed and agility have reshaped engineering practices across sectors [18]. Organizations such as the DoD have responded to these challenges by investing in research, digital innovation, and workforce development. The DoD has traditionally followed a linear, document-centric engineering approach in which the large volumes of data underpinning activities and decisions are stored separately, in disjointed and static forms, across organizations, tools, and environments [19,20]. These legacy methods struggle to meet the demands posed by exponential technological growth, increasing complexity, and the need for rapid access to information. To modernize these practices, the DoD launched its DE initiative to support lifecycle activities and promote a culture of innovation, experimentation, and efficiency [3]. DE is defined as “an integrated digital approach that uses authoritative sources of systems’ data and models as a continuum across disciplines to support lifecycle activities from concept through disposal” [21]. This approach incorporates existing model-based principles, such as model-based engineering (MBE), MBSE, the digital thread, and the digital twin [20,21].
The DoD 2018 Digital Engineering Strategy outlines five strategic goals for DE to promote continuous engineering activities, centralized access to current and verified data, adoption of emerging tools, collaborative digital environments, and strategic workforce transition [20]. The first goal is to formalize the development, integration, and use of digital models, encouraging the use of consistent modeling practices across the system lifecycle to replace traditional document-centric processes. The second goal focuses on establishing an authoritative source of truth (ASoT), which is a centralized, validated repository of models and data to ensure all stakeholders work from the same accurate information. Third, the strategy aims to improve engineering efficiency and effectiveness by leveraging automation, model reuse, and digital collaboration to accelerate development timelines and enhance system quality. The fourth goal is to establish a supporting Digital Engineering ecosystem, which involves building the necessary infrastructure, including tools, data environments, and standards, to enable widespread adoption of Digital Engineering practices. Lastly, the strategy emphasizes the need to transform the workforce, advocating for training, education, and cultural shifts to equip engineers with the skills and mindset required to operate in a Digital Engineering environment. These goals are not exclusive to defense and are equally relevant to modern enterprises adopting new technologies while upskilling their workforce. Despite its promise, implementing DE at scale, whether in government or industry, comes with both technical and cultural challenges [22,23].
DE encompasses a suite of interrelated concepts that work together to improve the efficiency, traceability, and integration of complex systems throughout their lifecycle. Among the most critical of these concepts are MBSE, digital twins, simulations, digital threads, and ASoT. MBSE is defined as the “formalized application of modeling to support system requirements, design, analysis, verification, and validation activities” throughout the system lifecycle [24,25]. It forms the methodological foundation of DE, replacing document-based systems engineering with model-centric practices that improve consistency and communication across engineering teams. Building on this foundation, the concept of the digital twin enhances real-time system visibility. A digital twin is a virtual representation of a physical asset, process, or system that mirrors its real-world counterpart’s behavior and communication, enabling real-time monitoring and predictive analysis [26]. Digital twins allow engineers to simulate and assess system behavior under varying conditions before physical deployment, reducing risk and accelerating decision-making. Supporting both MBSE and digital twins is the digital thread, which provides the connectivity and continuity needed across lifecycle stages [12]. To maintain consistency and data integrity within such a complex ecosystem, DE requires an ASoT, which ensures that models and data remain accurate and trustworthy across teams and disciplines [27].

3. Methodology

This research follows an SLR approach to ensure a holistic and unbiased assessment of the existing body of work on DE. An SLR is designed to minimize bias by adhering to a predefined search strategy, inclusion and exclusion criteria, and systematic procedures that allow for consistent identification, evaluation, and synthesis of relevant literature [28]. The methodology for this review aligns with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework [29,30], which provides a widely recognized set of guidelines for transparent reporting. The structured SLR methodology developed for this research is summarized in Figure 1, and the corresponding PRISMA flow diagram outlining the selection process is provided in Figure 2, with a completed PRISMA checklist provided in the Supplementary Materials. This SLR process and data extraction were conducted as part of the ISE 439/539 Digital Engineering course offered in Spring 2025 at the University of Alabama in Huntsville, aligning with the pedagogical strategy for integrating DE into industrial and systems engineering education [31].
In step 1, a comprehensive literature search was conducted across five academic databases: IEEE Xplore, ScienceDirect, Aerospace Research Central (ARC), INCOSE, and the Association for Computing Machinery (ACM). These databases were selected due to their broad coverage of engineering, systems engineering, aerospace, and computing research. The thoroughness of an SLR search largely depends on the search strings employed [32,33,34]. Therefore, to ensure conceptual clarity and comparability across studies, the exact core search string used in each database was the phrase “Digital Engineering,” applied to the article title field. This decision was intended to capture literature that frames Digital Engineering as a central construct, rather than tangentially related concepts such as MBSE, digital twins, or digital threads, which, while integral to DE, could otherwise dilute the scope of analysis. We acknowledge that this choice narrows the corpus and that future work could extend the search to title, abstract, and keyword fields to assess broader interpretations of Digital Engineering. The database search was performed on 3 October 2024. The search initially identified 75 publications. After duplicate removal, 71 unique titles remained. Only English-language journal articles, conference papers, and forum publications were considered.
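For illustration, the sketch below shows one way the title-keyed duplicate removal in this step could be implemented. It is a minimal example assuming each database provides a CSV export with a "title" column; the file names and normalization rule are hypothetical rather than part of the review protocol.

```python
import csv

def normalize(title: str) -> str:
    # Collapse case, punctuation, and runs of whitespace so the same paper
    # exported from two databases maps to one key.
    cleaned = "".join(ch.lower() if ch.isalnum() else " " for ch in title)
    return " ".join(cleaned.split())

def merge_exports(paths: list[str]) -> list[dict]:
    # Merge hypothetical CSV exports, keeping the first record seen per title.
    seen: dict[str, dict] = {}
    for path in paths:
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                seen.setdefault(normalize(row["title"]), row)
    return list(seen.values())

if __name__ == "__main__":
    exports = ["ieee.csv", "sciencedirect.csv", "arc.csv", "incose.csv", "acm.csv"]
    print(len(merge_exports(exports)), "unique titles after duplicate removal")
```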
In step 2, a two-stage approach was taken. First, titles and abstracts were screened for relevance. Second, full-text reviews were performed to confirm alignment with the inclusion criteria. The screening criteria were as follows: only publications explicitly presenting an application, framework, or implementation of DE were retained, while conceptual discussions without practical application, literature reviews, surveys, or studies with only tangential reference to DE were excluded. Applying these criteria resulted in a refined set of 56 relevant publications. The detailed dataset extracted from all studies included in this SLR is provided in the Supplementary Materials.
In step 3, the in-depth review was carried out and a structured data extraction process was applied. The systematic extraction and coding of study data were conducted by a team of 13 reviewers under instructor supervision. Each of the 56 papers was first reviewed independently by two reviewers, who extracted information using a standardized form that captured bibliographic details, study definitions, methods, and key findings. A third reviewer then performed a cross-check for each paper by reconciling discrepancies between the two initial reviews, completing any missing items, and entering a consolidated final record into the shared dataset. Extracted data were aligned with the defined research questions and included both quantitative and qualitative information, such as study objectives, key contributions, identified challenges, implementation approaches, reported benefits, and proposed solutions. Reviewer groups were subsequently assigned to individual research questions and used the reconciled dataset to conduct the analytic synthesis, including categorization, frequency counts, visualizations, and narrative results. Because coding was finalized through reconciliation and consensus rather than through independent application of a fixed codebook, formal inter-rater reliability statistics were not computed. We acknowledge this as a limitation and note that consensus-based reconciliation served as the primary mechanism for ensuring coding consistency.
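A minimal sketch of the reconciliation step described above is given below. It assumes each reviewer's extraction is stored as a field-to-value mapping; the field names and the rule for flagging disagreements are illustrative and do not reproduce the actual extraction form.

```python
from dataclasses import dataclass, field

FIELDS = ["definition", "methods", "key_findings", "challenges", "benefits"]  # illustrative

@dataclass
class Reconciliation:
    paper_id: str
    final: dict = field(default_factory=dict)
    conflicts: list = field(default_factory=list)

def reconcile(paper_id: str, review_a: dict, review_b: dict) -> Reconciliation:
    # Agreement is carried forward, gaps are filled from whichever review
    # supplied the item, and disagreements are flagged for the third
    # reviewer to resolve by consensus.
    rec = Reconciliation(paper_id)
    for f in FIELDS:
        a, b = review_a.get(f), review_b.get(f)
        if a == b and a is not None:
            rec.final[f] = a
        elif a is None or b is None:
            rec.final[f] = a if a is not None else b
        else:
            rec.conflicts.append(f)
    return rec
```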
This review is constrained by the selection of five databases and the restriction to English-language publications, which may omit relevant works published in other outlets or languages. Additionally, as the database search was conducted in October 2024, emerging DE research published thereafter is not reflected in the present analysis. To maintain conceptual clarity and ensure that the selected studies explicitly addressed DE as a defined discipline, the search strategy limited results to papers containing the phrase “Digital Engineering” in the title. This approach helped differentiate DE-focused research from broader digital transformation or MBSE studies that only partially overlap with DE. However, this deliberate narrowing also constitutes a limitation, as it may bias the sample toward defense-oriented publications, where the term is more commonly used, and exclude other relevant works that contribute to DE principles under different terminologies. Another limitation of this review is that the analysis relied on frequency counts to identify recurring concepts and themes across the corpus. This approach treats all mentions equally and does not distinguish between brief references and detailed empirical examinations. We adopted this method to provide a descriptive overview suited to the heterogeneous nature of the publications, which span conceptual, methodological, and applied studies. Because the depth and purpose of discussions varied substantially across papers, implementing a weighted or depth-based scoring system would have required introducing subjective judgments beyond the scope of this review.

4. Results

Fifty-six publications were subjected to full-text assessment to address RQ1–RQ5. Figure 3 illustrates the distribution of publications over time, categorized by type (conference papers, journal articles, and forums). The graph reveals a sharp increase in conference papers beginning in 2019, immediately following the release of the 2018 DoD Digital Engineering Strategy, suggesting that the strategy significantly influenced academic and professional interest in DE. Since the search criteria specifically targeted papers with “Digital Engineering” in the title, this temporal shift in output is consistent with the DoD Strategy catalyzing scholarly engagement with the topic.

4.1. RQ1: DE Definitions

This section presents the findings related to RQ1: “How is DE defined in the literature?”. To better understand how DE is conceptualized across the literature, we analyzed the specific sources cited when definitions were provided. Out of the 56 reviewed papers, only 23 offered an explicit definition of DE. The distribution of references is summarized in Table 1.
The analysis reveals that nearly 57% of all definitions (13 of 23) cited sources originating from the U.S. DoD, particularly the Digital Engineering Strategy and the Defense Acquisition University (DAU) Glossary. These definitions typically emphasize a lifecycle perspective of DE, integrating authoritative digital models and data across engineering disciplines. For example, the DoD Strategy defines DE as: “An integrated digital approach that uses authoritative sources of system data and models as a continuum across disciplines to support lifecycle activities from concept through disposal” [20]. The DAU [21], in contrast, uses the definition from DoD Instruction 5000.97 Digital Engineering [20], which defines DE as: “A means of using and integrating digital models and the underlying data to support the development, test and evaluation, and sustainment of a system”. The consistency in referencing the DoD Strategy highlights its strong influence, especially in government, defense, and systems engineering literature, and indicates that the DoD has played a central role in shaping how DE is understood in academic and industry circles.
Meanwhile, 34.8% of the definitions (8 of 23) were provided without citing any source, indicating that authors either formulated their own understanding or assumed the concept was well enough known not to require citation. Table 2 summarizes the definitions found in the literature for which no source was provided. This lack of citation may reflect conceptual ambiguity or inconsistency across the field. Only two papers (8.7%) referenced other academic sources, such as [35,36], which expand the view of DE to include workforce competencies and broad engineering applications, respectively.
Model-Based Engineering is the most dominant underlying concept, appearing in three of the eight unsourced definitions (including one duplicate). Several definitions attempt to articulate DE as an ecosystem of tools, data, and processes, highlighting its transformative effect on systems engineering and decision-making. Some definitions [37,39] take a conceptual or aspirational tone, focusing on cultural shifts and new ways of thinking, rather than technical specifics. The lack of citations with these detailed definitions suggests that authors are attempting to contribute to the ongoing conceptualization of DE, potentially adding original definitions based on the context of their study.

4.2. RQ2: Motivations to Apply DE

This section presents the findings related to RQ2: “What are the motivations and benefits, stated in the literature, to apply DE?” and builds upon an analysis of case studies presented in the reviewed literature.
To understand the practical application of DE, we examined how case studies were presented across the 56 reviewed papers. For this review, a case study was defined as an applied example where a DE framework, methodology, or toolset was implemented to produce results and inform conclusions. As shown in Figure 4, among the 56 papers, 32 (57%) included such case studies, while 24 (43%) did not. This suggests that although more than half of the literature grounds its findings in practical applications, a significant portion remains primarily conceptual or theoretical, reflecting an ongoing need to strengthen empirical validation in the DE domain. Among the 32 case study papers, we further distinguished between “general DE” applications and those focusing on specific “subsets of DE”. General DE case studies (12 papers, 37.5%) approached DE holistically, often demonstrating organization-wide integration, cultural adoption, or lifecycle benefits without anchoring the analysis to one specific methodology. In contrast, the majority of case studies (20 papers, 62.5%) emphasized specific DE subsets, such as MBSE, digital threads, and digital twins. These papers highlighted the tangible advantages of subset adoption, such as improved traceability, real-time monitoring, and lifecycle data continuity. This indicates that while broad DE frameworks provide valuable conceptual direction, research interest has leaned toward showcasing the benefits of targeted DE technologies that can yield immediate organizational gains.
We further analyzed the industry context of the case studies, differentiating between those associated with general DE and DE subsets. As illustrated in Figure 5, the most frequently represented sectors were defense and aerospace, which reflect the origins of many DE frameworks and tools. The aerospace sector, including aviation, has embraced DE for its ability to enhance safety-critical system design and reduce development costs. Other industries, such as automotive, healthcare, and manufacturing, were represented but to a lesser extent, suggesting that DE adoption outside defense and aerospace remains an emerging frontier.
Given the varying scope of DE adoption, we also categorized the specific subsets emphasized within the subset-focused case studies. These findings are summarized in Table 3. Notably, 35% of subset case studies centered on MBSE, highlighting the central use of modeling in DE. Other commonly cited subsets included the following: ASOT, simulation, digital threads, and digital twins. Less frequently mentioned subsets included augmented reality, ontology-based modeling, and System of Systems (SoS)/End-to-End (E2E) modeling. The higher number of subset-focused studies highlights an ongoing preference for showcasing the impact of modular, implementable DE technologies rather than organization-wide DE transformations. This could reflect both the complexity of broad DE adoption and the tendency of researchers to focus on more bounded, measurable interventions.
Among the 56 papers reviewed, 47 (83.9%) explicitly discussed motivations or benefits associated with adopting DE, as illustrated in Figure 6. These statements were identified throughout various sections of the papers, including introductions, discussions, and conclusions. The remaining nine papers (16.1%) made no mention of motivations or benefits for DE adoption.
To better interpret the underlying drivers for DE adoption, a thematic categorization of the stated motivations was conducted. This analysis involved keyword searches and context reviews to ensure consistent classification. It is important to note that many papers cited multiple motivations; therefore, the frequencies presented represent non-exclusive counts across themes.
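The sketch below illustrates this non-exclusive counting, assuming each paper's relevant text is available as a string. The theme keywords are abbreviated examples rather than the full coding scheme, and in the actual analysis every keyword match was confirmed by context review.

```python
from collections import Counter

# Abbreviated, illustrative keyword lists; substring stems such as
# "interoperab" match several word forms.
THEMES = {
    "efficiency": ["efficiency", "productivity", "cycle time"],
    "integration": ["integration", "interoperab"],
    "cost reduction": ["cost reduction", "cost saving", "affordab"],
    "collaboration": ["collaborat"],
    "decision-making": ["decision"],
}

def tag_themes(text: str) -> set[str]:
    lowered = text.lower()
    return {theme for theme, kws in THEMES.items() if any(k in lowered for k in kws)}

def theme_frequencies(papers: dict[str, str]) -> Counter:
    # Non-exclusive counts: a paper citing several motivations increments
    # every matching theme, so totals exceed the number of papers.
    counts = Counter()
    for text in papers.values():
        counts.update(tag_themes(text))
    return counts
```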
Figure 7 presents the frequency of each thematic category identified. The most commonly cited motivation was efficiency (17 papers), frequently tied to analytics-driven management and performance metrics (e.g., [1,40,42]), followed by integration (12 papers), often emphasizing interoperable data/model exchange and semantic pipelines (e.g., [35,44,61]), and cost reduction (11 papers), including explicit cost models and reductions in test/program expenses (e.g., [13,62,63]). Collaboration and improved decision-making were each cited in 10 papers (e.g., [56]). Less frequently mentioned categories included data management, automation, and accuracy, with validation and verification appearing in only one paper [37].
This dual analysis of case study content and motivations highlights that industries such as defense and aerospace are not only leading DE implementation but also articulate clear rationales for adoption. The dominance of MBSE as a DE subset reinforces the field’s reliance on models as central artifacts. These findings suggest that understanding and emphasizing specific DE subsets may enhance adoption by aligning digital strategies with industry-specific goals. Moreover, identifying motivation patterns helps stakeholders prioritize DE initiatives: for example, a defense organization aiming to increase efficiency can look to DE as a validated means of achieving this goal, as supported by the existing body of literature. Categorizing industries by their stated motivators and by the DE subsets examined in their case studies likewise clarifies where implementation effort should be focused; the prominence of MBSE, the most common subset, underscores the central role of models in DE.

4.3. RQ3: Most Common DE Components

This section addresses RQ3: “What DE components (e.g., digital twins, MBSE, digital threads, authoritative source of truth) are most mentioned when talking about DE?”. We examine how specific DE components are discussed, adopted, and implemented in the literature, aiming to identify which components receive the most attention. Identifying heavily emphasized components can help future research explore root causes, whether due to ease of implementation, industry preference, or alignment with specific use cases. It also raises the question of whether some industries may be underutilizing the full spectrum of DE tools.
Figure 8 presents the frequency of DE component mentions across the reviewed papers. If a paper referenced multiple components, each was counted. The most frequently mentioned components were simulation (39) and MBSE (37), consistent with simulation curricula and road-mapping efforts that foreground simulation’s role in DE [36,63] and with SysML/MBSE practice in contemporary workflows [35,64]. Other notable mentions included authoritative source of truth (ASoT) (27) and digital thread (24), reflected in work on reusable/enterprise DE environments and hub/pathfinder infrastructures that emphasize authoritative shared data and End-to-End modeling [47,60,65]. Digital twin, ontology, and AI collectively matched simulation in total mentions, supported by twin-focused methodology [50], semantic/ontology pipelines for integration [44,66], and AI applications within DE workflows [48].
We also analyzed DE component mentions by industry (Figure 9). When a paper addressed multiple industries, all were included. The defense sector was the most represented across the dataset, with MBSE leading its DE mentions (12), followed by simulation (10), ASoT (9), and digital thread (8). The aerospace sector, often overlapping with defense, showed similar trends, particularly in MBSE and simulation. Simulation was frequently employed to analyze model-generated data, further reinforcing its importance in mission-critical domains. This overlap suggests that MBSE’s prominence may be driven in part by its established use in both aerospace and defense.
Figure 10 highlights the most mentioned DE tools, with papers often contributing multiple mentions. In practice, SysML, Cameo Systems Modeler, MagicDraw, and DOORS all align under the category of MBSE tools and languages. Taken together, they represent the strongest and most consistently referenced set of tools, underscoring that MBSE is the most widely used and prominent subcomponent within Digital Engineering. This consolidation explains why MBSE-related tools dominate the landscape, particularly in defense and aerospace, where systems modeling and requirements management are foundational.
Software with only one mention was excluded from the figure for clarity. The dominance of Cameo reflects its central role in DE toolchains across high-technology sectors.
Figure 11 illustrates how DE component mentions have evolved over time. Each component was counted if mentioned in a paper, regardless of co-occurrence. A clear increase in mentions follows the release of the 2018 DoD Digital Engineering Strategy, suggesting a strong influence on scholarly discourse. Mentions of MBSE, ASoT, ontology, and digital threads grow steadily over the years, with the most notable growth occurring in the five years post-release and a marked spike in MBSE and simulation mentions in 2023. This spike coincides, however, with a general increase in publications featuring “Digital Engineering” in their titles during that year. Given that the selection criterion was papers with “Digital Engineering” in the title, the 2018 DoD Digital Engineering Strategy appears to have had its largest effect five years after its release. To better quantify the direct impact of the DoD Strategy, future work should compare these data with the number of papers that explicitly reference the 2018 Strategy document.

4.4. RQ4: Challenges and Solutions in Implementing DE

This section presents the results from RQ4: “What are the key challenges in implementing DE, and what solutions have been proposed?”. To systematize the analysis, a coding protocol was employed to review each publication and classify reported implementation challenges into four core categories: Technical, Cultural, Financial, and Regulatory. A barrier was coded as present if the narrative of the study explicitly or implicitly mapped onto the definitions in Table 4.
The papers were then categorized across four main industrial sectors: defense, aerospace, manufacturing, and education, with an additional “general” category for papers that presented cross-industry frameworks or non-specific application domains. To ensure clarity, a set of inclusion criteria was developed to guide the categorization of papers into these sectors. These criteria were based on explicit sector application, institutional affiliation, or the contextual framing of the DE challenge in the paper. Some papers spanned multiple sectors, particularly those at the intersection of aerospace and defense; in those cases, papers were double counted to preserve the sectoral analysis. This sectoral aggregation provided a lens through which to view not only the prevalence of different barriers within each domain but also the contexts in which they arise. For example, sectors like defense and aerospace had multiple overlapping barriers due to security requirements, highly regulated environments, and legacy infrastructure. Manufacturing and education had a different barrier profile due to operational agility, workforce training needs, and digital infrastructure maturity.
By presenting the data in this way, we can see not only the frequency of each barrier category but also its prominence in each sector. This shows how challenges like cultural resistance or regulatory complexity manifest differently depending on the institutional priorities and structural configuration of a sector. These patterns are key to designing sector-specific, implementation-ready mitigation strategies. The proportion of each barrier category by sector is shown in Figure 12.
To characterize solution strategies, proposed solutions were first tagged with keywords generated from term-frequency analysis and the papers’ associated sectors were recorded. These keywords were then consolidated into a 15-category classification schema (Table 5), enabling frequency-based mapping of solution adoption across industries and clarifying which forms of solutions each sector most often requires.
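As a sketch of the keyword-generation step, the snippet below ranks candidate terms by frequency across the proposed-solution text. The stop-word list and cutoffs are illustrative choices, and the consolidation of candidates into the 15 categories of Table 5 was performed manually.

```python
import re
from collections import Counter

STOPWORDS = {"the", "and", "that", "with", "for", "this", "from", "are"}  # illustrative

def candidate_keywords(solution_texts: list[str], top_n: int = 30) -> list[str]:
    # Unigram term-frequency analysis over all proposed-solution text; the
    # most frequent content words become candidate tags for manual
    # consolidation into the category schema.
    counts = Counter()
    for text in solution_texts:
        for token in re.findall(r"[a-z]+", text.lower()):
            if len(token) > 3 and token not in STOPWORDS:
                counts[token] += 1
    return [term for term, _ in counts.most_common(top_n)]
```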
Among these, Standardization and Metrics emerged as the most frequently cited solution category with 15 mentions, highlighting its role as a foundational pillar in the Digital Engineering ecosystem. This solution type encompasses developing and adopting standardized metrics taxonomies, data interfaces, and compliance with domain-specific frameworks. The widespread citation of this category illustrates a collective acknowledgment that interoperability, data consistency, and validation protocols are essential to DE integration. Without a shared vocabulary or set of metrics, collaboration across multidisciplinary teams becomes fractured, delaying system integration and increasing lifecycle costs.
Closely trailing behind, Governance Mechanisms (14 mentions) and Simulation and Digital Twins (13 mentions) were also highly cited. Governance Mechanisms emphasize structural oversight, leadership engagement, and quality assurance practices that facilitate model traceability, configuration control, and institutional readiness. Meanwhile, Simulation and Digital Twins offer real-time virtual feedback loops, enabling validation of system behavior under varying conditions before physical prototyping. These categories reflect a shared emphasis on codified infrastructure, process consistency, strategic oversight, and high-fidelity digital validation tools that together form the backbone of scalable, sustainable DE adoption strategies.
These trends are further visualized in Figure 13, which presents the frequency of each solution category across the reviewed literature. The bar chart reveals not only which strategies are most prominent but also illustrates the breadth of approaches being considered to support DE adoption.
Each of the 56 papers was examined to determine whether it proposed a new methodology. For this analysis, a paper was considered to have proposed a new methodology if it introduced an original concept, framework, or structured process to advance DE in practice. As shown in Figure 14, 33 papers (58.9%) met this criterion, and the remaining 23 papers (41.1%) offered empirical results, discussed challenges and benefits, or evaluated existing tools without proposing a new methodology.
This finding indicates a robust level of conceptual development in the field, with over half of the literature reviewed contributing new approaches to DE implementation. The relatively high frequency of novel proposals reflects the field’s emergent and evolving nature, where frameworks are still being shaped to meet the unique demands of modern engineering systems. Even among papers that did not propose a methodology, many contributed valuable insights into recurring problems, adoption barriers, or sector-specific considerations, suggesting that the literature continues to play a critical role in identifying areas for improvement.
Given the DoD’s visible role in DE-related policy and programmatic initiatives, particularly through its release of the 2018 Digital Engineering Strategy, this review also examined the extent to which the proposed methodologies may have been influenced by that document. Only explicit references to the strategy’s goals or focal areas were considered valid indicators of influence. Although some papers may have incorporated principles aligned with the strategy, only those with clear, documented references were included in this analysis to ensure analytical rigor.
As shown in Figure 15, 12 of the 33 methodology papers (36.4%) explicitly cited the 2018 DoD Digital Engineering Strategy as a basis for their framework development [13,27,42,43,52,63,69,70,71,73,77,81]. This suggests that the strategy has become an important reference point for a portion of the academic dialog, while many methodological contributions continue to develop independently of explicit DoD guidance. The remaining 21 papers (63.6%) either did not reference the strategy or pursued alternative conceptual directions.
Notably, an additional four papers, despite not proposing any new methodology, still explicitly referenced the 2018 Strategy, indicating that its influence extends beyond conceptual contributions and informs broader discussions on DE implementation. These papers often leveraged the strategy as a benchmark or motivator for research, even when their focus was evaluative or thematic rather than prescriptive.

Challenges were classified by sector and thematically coded across recurring categories, including technical, organizational, cultural, and process-specific issues. Solutions were then analyzed according to their frequency, thematic focus, and sectoral relevance.
Of the 56 studies, 26 papers discussed barriers specific to the defense sector [1,13,27,36,39,40,41,42,43,45,52,60,63,64,65,68,69,73,74,75,76,77,81,82,83,84], 21 addressed barriers in non-defense sectors [16,37,38,44,47,48,49,50,53,54,57,58,59,61,67,70,78,79,80,85,86], and 9 papers did not mention barriers at all [40,41,45,52,60,68,74,76,83]. Figure 16 shows this distribution of barriers.
Papers were categorized into sectors based on their stated focus. Defense papers were identified through direct references to DoD priorities or organizations, while non-defense papers discussed sectors such as aerospace, manufacturing, or education. In cases where a paper addressed multiple sectors, defense was given precedence to avoid double-counting. This classification enabled a more accurate understanding of how barriers differ by sector.
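A minimal sketch of this precedence rule is shown below, assuming each paper carries a set of sector labels assigned during screening; the label strings are illustrative.

```python
def classify_sector(sectors: set[str]) -> str:
    # Defense takes precedence when a paper spans multiple sectors, so each
    # paper contributes to exactly one of the two barrier groups.
    if "defense" in sectors:
        return "defense"
    return "non-defense" if sectors else "unspecified"

assert classify_sector({"aerospace", "defense"}) == "defense"
assert classify_sector({"manufacturing"}) == "non-defense"
```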
Barriers varied in frequency and type between sectors. Figure 17 and Figure 18 illustrate the most frequently cited barriers within each sector.
The most common challenge across both sectors was a lack of interoperability between tools, systems, and platforms. In the defense sector, this was mentioned in 10 of the 26 papers [27,39,60,64,68,73,75,76,82,83], and in the non-defense sector in 9 of 21 papers [13,38,48,49,53,58,59,85,86]. This is a significant problem given the reliance on legacy systems and the many software tools used across the lifecycle. Interoperability issues were often related to integrating different digital tools, managing proprietary data formats, and transferring models between platforms. Notably, this accounted for 42.9% of all barrier mentions in the non-defense sector and 38.5% in the defense sector.
Another major barrier was cultural and organizational resistance to change, including inadequate training, lack of leadership emphasis, and difficulty transitioning personnel from traditional workflows to digital environments. This was mentioned in 7 of the 26 defense papers [1,36,40,43,52,63,76] and 4 of the 21 non-defense papers [16,61,85,86]. In these cases, the challenge was not the technology itself but the workforce’s readiness and skepticism about DE adoption. These papers noted that without sufficient cultural alignment and communication, even well-designed DE frameworks will fail. This accounted for 26.9% of the barriers in the defense sector and 19% in the non-defense sector. A third common barrier was the lack of a standard approach to DE, mentioned in 10 defense papers (38.5%) [13,36,39,40,41,42,43,69,76,81] and in 3 non-defense papers (14.3%) [38,67,70]. Several papers noted that the lack of cross-industry consensus on DE best practices creates confusion during implementation, and with rapidly evolving technology, approaches considered state of the art yesterday can become outdated today, making standardization even harder.
A smaller but still significant challenge was the difficulty of moving from paper-based documentation to fully digital environments. This paper-to-digital barrier was mentioned in three defense papers [13,69,81] and two non-defense papers [49,58]. Moving from static documentation to model-based systems is not just a technological upgrade; it is a process change that requires retraining staff and addressing concerns about digital reliability. Several papers noted that clients or stakeholders often prefer traditional documents and may resist a change that feels unfamiliar or unnecessary.
Finally, challenges in model curation and configuration management were also mentioned. In the defense sector, four papers mentioned configuration management [40,45,57,64] and seven mentioned model curation [27,42,60,73,77,81,84]. In non-defense papers, configuration management was mentioned once [61] and model curation in six papers [16,47,50,53,70,78]. These issues centered on version control, data consistency, and resolving conflicting model interpretations across departments. The findings suggest these challenges are more acute in defense-related projects, which have long development timelines and complex organizational structures.

4.5. RQ5: Influence of the 2018 DoD DE Strategy

This section addresses RQ5: “To what extent do academic publications explicitly reference or align with the five strategic goals of the 2018 DoD Digital Engineering Strategy?”. Specifically, it examines how the DoD Strategy has influenced the development of research frameworks, methodologies, and best practices in DE. Investigating the extent to which academic publications refer to or align with the 2018 DoD Digital Engineering Strategy is important because the strategy represents one of the earliest and most comprehensive efforts to formalize the vision of DE, specifically targeting the DoD. By analyzing how the five strategic goals are reflected in scholarly work, we can assess whether academic research is supporting, extending, or diverging from national priorities. This alignment matters not only for advancing theory but also for ensuring that research outputs remain relevant to real-world implementation challenges, such as workforce transformation, technological innovation, and infrastructure development. Identifying which goals receive the most academic attention, and which remain underexplored, provides valuable insight into where research is reinforcing policy initiatives and where gaps exist that may hinder broader adoption of DE practices.
Out of the 56 reviewed papers, 20 (36%) were identified as explicitly or implicitly referring to the DoD DE Strategy. Among these, explicit references were most frequent for Goals 2 and 1, followed closely by Goals 3 and 5. In contrast, Goal 4 was the least represented. Figure 19 shows the breakdown of explicit and implicit mentions of each goal. The distribution suggests that research has gravitated toward methodological and cultural dimensions of DE, areas where academia can directly contribute through conceptual development, case studies, and workforce training. The lower emphasis on infrastructure-related goals likely reflects barriers such as resource intensity, reliance on large-scale organizational investment, and limited accessibility of enterprise-level digital ecosystems within academic settings. Overall, the findings indicate that while the DoD Strategy has shaped research directions, particularly in advancing model-centric approaches and workforce transformation, its influence has been uneven across the five goals. This highlights both the alignment of academia with certain strategic priorities and the gaps where further engagement is needed.
Figure 20 presents a co-occurrence map of the five strategy goals, illustrating how frequently they appear together within the literature. A continuous color gradient encodes the frequency with which two goals appear together, ranging from low intensity (infrequent co-occurrence) to high intensity (frequent co-occurrence). The analysis highlights a particularly strong triad among Goals 1, 2, and 5, where studies combine model-driven decision support with authoritative model/data backbones and the workforce and culture needed to sustain them. This triad reflects an integrated view of DE: technical advances in modeling and data integration are seen as inseparable from the human and organizational capacity required to enable them. Importantly, no evidence of trade-offs or conflicts between goals was identified, suggesting that the DoD Strategy is conceptualized in the literature as a cohesive framework rather than a set of competing imperatives. Industry mirrors this: of the 20 papers that referenced the DoD Strategy, 18 are defense-focused and 2 generalize beyond defense. Furthermore, the dominance of MBSE is striking: 18 of 20 papers explicitly link their implementation of the DoD Strategy to MBSE, 1 uses MBE terminology without the MBSE label, and only 1 makes no reference to MBSE.
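The sketch below shows how such a co-occurrence map can be computed, assuming each paper has been reduced to the set of strategy goals (1–5) it references; the example records are hypothetical.

```python
import itertools

def goal_cooccurrence(papers: list[set[int]]) -> dict[tuple[int, int], int]:
    # Count how often each unordered pair of goals is referenced together
    # in the same paper; these counts drive the color gradient in the map.
    counts = {pair: 0 for pair in itertools.combinations(range(1, 6), 2)}
    for goals in papers:
        for pair in itertools.combinations(sorted(goals), 2):
            counts[pair] += 1
    return counts

# Hypothetical records: each set lists the goals one paper references.
example = [{1, 2, 5}, {1, 2}, {2, 5}, {1, 2, 3, 5}]
print(goal_cooccurrence(example)[(1, 2)])  # -> 3
```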
When broken down by goal:
  • Goal 1 aligns with MBSE, digital surrogates and model curation/application, reflecting its centrality in formalizing modeling practices [13,39,42,43,52,60,63,69,70,71,73,74,76,77,81,84].
  • Goal 2 extends into data integration, information standardization and unified repositories, highlighting the infrastructural backbone required for effective DE [1,13,27,36,39,42,43,52,68,69,70,71,73,76,77,81,84].
  • Goal 3 is tied to model-based testing, digital surrogates, and advanced analytics [1,13,27,39,42,43,52,58,63,69,70,71,76,77,81,84].
  • Goal 4 appears less frequently and is often expressed as toolchains and acquisition infrastructure [13,39,42,43,58,60,69,70,71,73].
  • Goal 5 shows up in competency frameworks, curriculum, and organizational change initiatives [1,36,39,42,43,58,60,69,70,71,76,81,84].
Overall, the co-occurrence patterns reveal that DE research aligns most strongly with methodological, data, and workforce dimensions of the DoD Strategy, while infrastructure-focused work remains relatively limited. This indicates that while academic efforts are reinforcing much of the DoD Strategy’s intent, certain goals, particularly Goal 4, may require greater attention to fully realize the integrated vision of DE.
In addition to analyzing research themes and strategy alignment, it was also of interest to examine funding sources, as funding often shapes the scope, orientation, and applicability of research. Understanding who sponsors DE research provides insight into the drivers of methodological development and highlights the sectors most invested in operationalizing DE principles. Across the 56 reviewed papers, 43% reported funding support. Figure 21 shows the overall distribution of these sources, with non-U.S. government entities (29%) representing the largest share, followed by the Department of Defense (21%), the Systems Engineering Research Center (21%), U.S. defense contractors (13%), universities (8%), and smaller contributions from U.S. non-defense entities (4%) and other companies (4%).
When stratifying by strategy mention, however, the funding landscape shifts noticeably. Among papers without explicit reference to the DoD DE Strategy, non-U.S. government sponsors dominate (41%) (Figure 22). In contrast, for papers that do reference the strategy, funding is concentrated almost entirely within defense-linked channels: DoD (57%) [36,63,76,81], Systems Engineering Research Center (SERC) (29%) [1,77], and defense contractors (14%) [27] (Figure 23). Taken together, this distribution reveals that 86% of strategy-referencing papers are directly tied to DoD or SERC funding, and 100% are defense-related when contractors are included. These patterns indicate a strong association between DoD Strategy citation and defense-linked funding, although they do not by themselves demonstrate that the DoD Strategy guidelines caused these funding distributions.
A final analytical lens considers how the DoD DE Strategy goals are used rhetorically versus operationally. Most papers that mention the strategy invoke the goals as evidence or background context to frame claims, while a smaller subset presents solution frameworks or methods that directly implement one or more goals. A residual group integrates both approaches; Figure 24 shows this breakdown.
  • Evidence-oriented usage is seen in papers that cite the goals primarily to establish motivation or justification [27,35,36,39,41,43,52,58,67,68,69,70,71,72,76,77,81,84,85].
  • Solution-oriented usage appears in studies that develop architectures, processes, or methods aligned with specific goals [55,56,60,64].
  • Hybrid usage is found in a small set of papers that both motivate with and operationalize the goals [13,41,42,48,73,86].
Read alongside MBSE’s dominance and the limited emphasis on Goal 4, this distribution suggests an implementation gap. The DoD Strategy is widely cited to legitimize research directions, yet far fewer studies advance goal-driven architectures, toolchains, or infrastructure that operationalize the strategy in practice, particularly in the infrastructure/environment dimension [13,39,42,67,73]. In summary, MBSE remains the principal mechanism for aligning with Goals 1 and 2; Goal 5 is often framed as the organizational prerequisite for long-term adoption; and Goal 4 is underdeveloped compared to its importance.

5. Discussion

Across the five research questions, several common themes emerge that characterize the current state of DE. Together, the findings illustrate a field strongly shaped by defense priorities, conceptually ambitious but operationally immature, and unevenly distributed across sectors. In interpreting these findings, it is important to recall that the corpus is intentionally restricted to studies that include the phrase “Digital Engineering” in the title. This focus made it possible to treat DE as an explicit construct and to analyze how authors define and bound the term in practice, but it also narrows the lens through which the field is viewed. Some organizations and sectors may implement practices that are consistent with DE principles under other labels, such as MBSE, digital transformation, or model-based digital thread, and those efforts are not captured here. As a result, the patterns reported in this review likely overrepresent DE-labeled, often defense-oriented, work, and generalizations to the broader digitalization landscape should be made with caution.
The analysis of DE definitions demonstrates the heavy reliance on the U.S. DoD DE Strategy and Defense Acquisition University materials as primary definitional anchors. While these provide necessary structure and legitimacy to the discipline, they also confine DE’s conceptual scope within the defense and aerospace ecosystem. The lack of cited definitions in roughly one-third of the papers that defined DE, and the limited presence of cross-sector or international references, indicate an opportunity for broader, consensus-driven frameworks. Expanding the definitional base to include non-defense and international perspectives is essential to establish a globally coherent understanding of DE.
The analysis of DE motivations and components indicates that efficiency, integration, and cost reduction are the most frequently cited drivers, with MBSE and simulation emerging as the dominant technological enablers. It was particularly notable how often DE is framed almost synonymously with MBSE or simulation. This pattern suggests a prevailing misconception in the literature, treating DE as an extension or rebranding of model-based approaches rather than as a broader, integrative paradigm that connects data, models, people, and processes across the entire lifecycle. While MBSE and simulation are essential components, they represent only part of the DE ecosystem, which also includes data governance, digital threads, authoritative sources of truth, and organizational transformation. This conceptual narrowing highlights the need for the DE community to communicate more clearly that DE is a system-level strategy, not a single methodology or toolset.
Despite these conceptual limitations, the literature demonstrates a strong belief in DE’s transformative potential. However, much of the current work remains descriptive rather than evaluative, emphasizing anticipated benefits over measurable evidence. This underscores an urgent need for empirical research that quantifies DE’s impact on lifecycle cost, schedule adherence, and decision quality.
Implementation challenges are remarkably consistent across sectors, centered around interoperability limitations, organizational cultural resistance, and a lack of standardization. These issues manifest differently depending on context: defense and aerospace organizations struggle with legacy systems and compliance constraints, whereas manufacturing and education sectors emphasize agility, workforce training, and return on investment. Despite these barriers, the literature demonstrates a growing number of proposed frameworks, signaling an active and evolving research landscape. Yet, the persistent gap between conceptual innovation and operational realization suggests that DE remains in an early phase of adoption, defined more by its promise than by systematically validated results.
Further analysis suggests that the 2018 DoD Digital Engineering Strategy has become an important reference point in academic dialog, particularly in defense-oriented work, but its influence is neither universal nor uniform across the five strategic goals. Only 20 of the 56 papers reference the DoD Strategy, and 12 of the 33 methodology papers (36.4%) cite it as a basis for framework development. Where it is referenced, goals related to modeling practices (Goal 1), authoritative data and model management (Goal 2), and workforce transformation (Goal 5) dominate the literature, whereas infrastructure development (Goal 4) receives limited attention. This pattern indicates that academic research frequently leverages the DoD Strategy guidelines to frame motivations and methodological choices in model-centric and workforce domains, while having less visibility into, or access to, the large-scale enterprise ecosystems needed to study infrastructure integration and digital continuity in depth.
From the perspective of Goal 4, the findings suggest that DE is still largely conceptualized as a modeling and data problem rather than as an infrastructure and acquisition-integration problem. Very few studies describe how DE artifacts connect to enterprise software environments, contract deliverables, or configuration-controlled acquisition baselines. This points to an implementation gap between the DoD Strategy's infrastructure aspirations and the current research focus, as well as to the need for more actionable guidance for organizations seeking to operationalize Goal 4.
Notably, the DoD Strategy goals are most often invoked to justify research directions and far less often used as the backbone of end-to-end (E2E) operations that connect environments, governance, model curation, cybersecurity, and workforce change. The near-term opportunity lies in publishing the “plumbing” of DE operationalization: defining service boundaries for the ASoT, establishing model curation cadence and governance roles, and identifying minimal standardized outcome measures that enable cross-study comparison and benchmarking.
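To make this concrete, consider a minimal sketch of what such published “plumbing” could look like. The Python below is purely illustrative: every class, field, and value is a hypothetical assumption rather than a construct reported in the reviewed studies. It simply shows how an ASoT service boundary and a standardized outcome measure might be declared explicitly enough to support cross-study comparison.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class ASoTServiceBoundary:
    """Hypothetical declaration of what an ASoT service owns and exposes."""
    service_name: str
    owned_artifact_types: List[str]   # e.g., ["SysML models", "requirements"]
    exposed_interfaces: List[str]     # e.g., ["REST query", "model export"]
    curation_cadence_days: int        # how often curated models are reviewed
    accountable_role: str             # governance role, e.g., "model curator"

@dataclass
class OutcomeMeasure:
    """Hypothetical standardized outcome record for benchmarking DE adoption."""
    program_id: str
    measure: str                      # e.g., "design rework hours"
    value: float
    baseline: float                   # pre-DE value of the same measure
    unit: str
    collected_on: date = field(default_factory=date.today)

    def relative_change(self) -> float:
        """Percent change against the pre-DE baseline (baseline must be nonzero)."""
        return 100.0 * (self.value - self.baseline) / self.baseline

# Example: a program reporting a reduction in rework hours after DE adoption.
m = OutcomeMeasure("program-x", "design rework hours", 620.0, 800.0, "hours")
print(f"{m.relative_change():.1f}%")  # -> -22.5%
```

If studies consistently reported even this much structure, the minimal standardized outcome measures called for above would become directly comparable across programs and sectors.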
Sectoral differences and funding asymmetries further reinforce these patterns. Defense and aerospace studies focus on compliance, configuration management, and curated repositories supported by dedicated funding streams, whereas non-defense sectors prioritize interoperability and faster time-to-value within tighter capital constraints. When the DoD Strategy is cited, funding overwhelmingly originates from defense-linked sources, which appear to facilitate deeper investments in governance and infrastructure. The shared language and structure introduced through defense-oriented initiatives have likely advanced DE maturity, yet they also risk overfitting solutions to one sector’s governance and acquisition context.
Beyond its technical and methodological contributions, the findings of this review highlight the role of DE as a foundational enabler of digital transformation in engineering organizations. DE provides the model-centric, data-driven infrastructure that supports shifts in how engineering work is coordinated, communicated, and governed. By establishing authoritative digital artifacts, continuous data flows, and integrated lifecycle models, DE helps organizations transition from document-oriented processes toward collaborative, model-based practices. These technical shifts have meaningful organizational implications: they facilitate cross-functional alignment, enable faster decision-making, and promote a culture of transparency and traceability. As organizations adopt DE practices, they also confront necessary cultural and procedural changes, such as redefining roles, restructuring workflows, and developing new competencies around digital artifacts. In this way, DE functions not only as a set of tools but as the technical engine that accelerates and stabilizes broader digital transformation efforts. Recognizing DE’s dual role, as both a methodological framework and a catalyst for organizational change, provides a more comprehensive understanding of its impact and reinforces its strategic importance in contemporary engineering practice.
Taken together, the findings from RQ1–RQ5 suggest two priorities for the DE community. First, broaden the reference points beyond defense, incorporating sector-neutral standards, metrics, and exemplars so that guidance travels across industries. Second, shift from proposing frameworks to reporting operational implementations. Doing so would close the rhetoric-practice gap; show where DE works, how well, and under what conditions; and accelerate the field's maturity from concept-forward claims to evidence-based practice.

6. Conclusions and Future Research

This systematic literature review consolidates three decades of research on DE, tracing its evolution from conceptual foundations to emerging operational frameworks. By examining DE definitions, motivations, components, and challenges, and by assessing how the 2018 U.S. DoD Digital Engineering Strategy is reflected in 56 peer-reviewed studies, this work provides an integrated understanding of how DE has evolved as both a discipline and a practice.
While DE is widely recognized as a transformative enabler of lifecycle integration, a noticeable portion of the literature approaches DE primarily through the lenses of MBSE and simulation. This focus is understandable given their foundational roles in model-centric engineering; however, it also reflects a tendency to narrow the interpretation of DE's broader intent. In practice, DE encompasses far more than modeling; it integrates data management, digital threads, authoritative sources of truth, automation, and organizational change. Emphasizing this wider systems perspective can help the community move toward a more comprehensive and balanced understanding of DE as a unifying engineering paradigm rather than a collection of specific methods or tools.
The review also highlights the uneven distribution of DE maturity across sectors. Defense and aerospace continue to dominate, benefiting from structured governance frameworks and sustained funding, while manufacturing, education, and other domains remain in exploratory stages. This imbalance underscores the importance of developing cross-sector frameworks that adapt DE principles to different operational realities rather than replicating defense-specific solutions.
Across the corpus, empirical evaluations of DE in operational settings remain scarce relative to conceptual frameworks, reference architectures, and position papers. As a result, many of the reported benefits of DE are supported more by reasoned argument than by systematic evidence. Future work should therefore move beyond proposing new frameworks and taxonomies toward studies that design, implement, and rigorously evaluate DE solutions in real programs, including comparative analyses against conventional practice and quantification of impacts on lifecycle cost, schedule performance, quality, and risk. Specifically, future work should: (i) develop sector-neutral DE maturity models that account for differences in scale, risk tolerance, and digital infrastructure; (ii) advance interoperability standards and open data ecosystems, ensuring that DE implementations remain scalable and vendor-agnostic; (iii) conduct empirical case studies of programs that attempt to embed DE assets into acquisition and sustainment processes, particularly in relation to enterprise toolchains and configuration-controlled baselines; (iv) develop and evaluate reference architectures and integration patterns that link MBSE environments, configuration management systems, and enterprise data services; (v) create methods and metrics for assessing integration maturity and digital continuity across the lifecycle (a minimal scoring sketch follows this paragraph); and (vi) invest in workforce and education initiatives that strengthen DE competencies across disciplines, bridging the gap between conceptual understanding and practical application.
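As one concrete illustration of item (v), the sketch below computes a simple weighted integration-maturity score. The dimensions, weights, and 0–5 levels are illustrative assumptions for discussion, not a validated maturity model drawn from the reviewed literature.

```python
# Illustrative only: dimensions, weights, and the 0-5 scale are assumptions.
MATURITY_DIMENSIONS = {
    "model_integration": 0.30,      # MBSE environments linked to analysis tools
    "data_continuity": 0.25,        # digital thread coverage across the lifecycle
    "configuration_control": 0.20,  # DE artifacts under baseline management
    "infrastructure": 0.15,         # enterprise toolchains and data services
    "workforce": 0.10,              # DE competencies across disciplines
}

def integration_maturity(levels: dict) -> float:
    """Weighted score on a 0-5 scale from per-dimension levels (each 0-5)."""
    for dim, level in levels.items():
        if dim not in MATURITY_DIMENSIONS:
            raise ValueError(f"unknown dimension: {dim}")
        if not 0 <= level <= 5:
            raise ValueError(f"level out of range for {dim}: {level}")
    return sum(w * levels.get(d, 0) for d, w in MATURITY_DIMENSIONS.items())

# Example: a program strong in modeling but weak in infrastructure (Goal 4).
score = integration_maturity({
    "model_integration": 4, "data_continuity": 3,
    "configuration_control": 3, "infrastructure": 1, "workforce": 2,
})
print(round(score, 2))  # -> 2.9
```

Even a toy score of this kind, if its dimensions were standardized across studies, would make the cross-sector maturity comparisons discussed above tractable.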
By addressing these priorities, DE can evolve from a sector-specific strategy to a globally recognized engineering paradigm that unites digital models, authoritative data, and integrated decision-making across the full system lifecycle. Ultimately, realizing this vision will require collaboration across academia, industry, and government to align conceptual progress with operational maturity, transforming DE from a promising idea into a proven, trusted, and scalable practice.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/systems13121046/s1: a PRISMA checklist confirming that the PRISMA methodology for conducting systematic literature reviews was followed, and the detailed dataset extracted from all studies included in this systematic literature review, provided for transparency.

Author Contributions

Conceptualization, A.W.; methodology, A.W.; formal analysis, A.W. and L.W.; investigation, A.W.; data curation, L.W.; writing—original draft preparation, A.W. and L.W.; writing—review and editing, A.W. and L.W.; supervision, A.W.; project administration, A.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

This review was not registered. Reasonable requests for the materials used in this review will be considered by the authors.

Acknowledgments

This work benefited from the contributions of students enrolled in the ISE 439/539 Digital Engineering course in Spring 2025 at the University of Alabama in Huntsville, who supported the systematic literature review and data extraction processes. The following students participated in these efforts: Celeste Adair, Oluyinka Joseph Adedokun, Joseph Cox, Levi Geller, Rayden Gray, Justin Hilpert, Elizabeth Kacerek, Jessica Landberg, Josh Miller, Marques David Proctor, Beatriz Samur-Zuniga, Jonathan Spuhl, Tony Stoneback, and Landon Womack.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
DE: Digital Engineering
DoD: Department of Defense
SLR: Systematic Literature Review
MBSE: Model-Based Systems Engineering
RQ: Research Question
MBE: Model-Based Engineering
ASoT: Authoritative Source of Truth
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
ISE: Industrial and Systems Engineering
IEEE: Institute of Electrical and Electronics Engineers
ARC: Aerospace Research Central
INCOSE: International Council on Systems Engineering
ACM: Association for Computing Machinery
DAU: Defense Acquisition University
SoS: System of Systems
E2E: End-to-End
SysML: Systems Modeling Language
AI: Artificial Intelligence
CAD: Computer-Aided Design
OML: Ontological Modeling Language
STK: Systems Tool Kit
CODE: Cyber Oriented Digital Engineering
DECMAF: Digital Engineering Capability Maturity Assessment Framework
DEFII: Digital Engineering Framework for Interoperability and Integration
CoEDT: Co-Evolutionary Digital Twins
EPHM: Engineering Project Health Management
DFE: Digital Factory Economics
SE: Systems Engineering
MBTE: Model-Based Testing Environment
ECSS: European Cooperation for Space Standardization
BIM: Building Information Modeling
DECF: Digital Engineering Competency Framework
QMU: Quantification of Margins and Uncertainties
AR: Augmented Reality
BPMN: Business Process Model and Notation
CDE: Common Data Environment
SERC: Systems Engineering Research Center

References

1. Henderson, K.; McDermott, T.; Van Aken, E.; Salado, A. Towards Developing Metrics to Evaluate Digital Engineering. Syst. Eng. 2023, 26, 3–31.
2. Tao, F.; Ma, X.; Liu, W.; Zhang, C. Digital Engineering: State-of-the-Art and Perspectives. Digit. Eng. 2024, 1, 100007.
3. Zimmerman, P.; Gilbert, T.; Salvatore, F. Digital Engineering Transformation across the Department of Defense. J. Def. Model. Simul. Appl. Methodol. Technol. 2019, 16, 325–338.
4. Ma, J.; Wang, G.; Lu, J.; Vangheluwe, H.; Kiritsis, D.; Yan, Y. Systematic Literature Review of MBSE Tool-Chains. Appl. Sci. 2022, 12, 3431.
5. Wach, P.; Topcu, T.G.; Jung, S.; Sandman, B.; Kulkarni, A.U.; Salado, A. A Systematic Literature Review on the Mathematical Underpinning of Model-based Systems Engineering. Syst. Eng. 2025, 28, 134–153.
6. Atalay, M.; Murat, U.; Oksuz, B.; Parlaktuna, A.M.; Pisirir, E.; Testik, M.C. Digital Twins in Manufacturing: Systematic Literature Review for Physical–Digital Layer Categorization and Future Research Directions. Int. J. Comput. Integr. Manuf. 2022, 35, 679–705.
7. Semeraro, C.; Lezoche, M.; Panetto, H.; Dassisti, M. Digital Twin Paradigm: A Systematic Literature Review. Comput. Ind. 2021, 130, 103469.
8. Wooley, A.; Silva, D.F.; Bitencourt, J. When Is a Simulation a Digital Twin? A Systematic Literature Review. Manuf. Lett. 2023, 35, 940–951.
9. Bitencourt, J.; Wooley, A.; Harris, G. Verification and Validation of Digital Twins: A Systematic Literature Review for Manufacturing Applications. Int. J. Prod. Res. 2025, 63, 342–370.
10. Wooley, A.; Bitencourt, J.; Silva, D. Bridging the Gap between Discrete Event Simulation and Digital Twin: A Manufacturing Case Study. Manuf. Lett. 2025, 44, 1274–1284.
11. Abdel-Aty, T.A.; Negri, E. Conceptualizing the Digital Thread for Smart Manufacturing: A Systematic Literature Review. J. Intell. Manuf. 2024, 35, 3629–3653.
12. Zhang, Q.; Liu, J.; Chen, X. A Literature Review of the Digital Thread: Definition, Key Technologies, and Applications. Systems 2024, 12, 70.
13. Kraft, E.M. Transforming Ground and Flight Testing through Digital Engineering. In Proceedings of the AIAA Scitech 2020 Forum, Orlando, FL, USA, 6–10 January 2020; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2020.
14. Golizadeh, H.; Hon, C.K.H.; Drogemuller, R.; Reza Hosseini, M. Digital Engineering Potential in Addressing Causes of Construction Accidents. Autom. Constr. 2018, 95, 284–295.
15. Lee, Y.-C.; Bae, H.-R. An Effective Quality Assurance for Small Quantity Batch Manufactured Products with Digital Engineering. Int. J. Precis. Eng. Manuf. 2012, 13, 1805–1811.
16. Barata, J.; Cardoso, J.C.S.; Cunha, P.R. Mass Customization and Mass Personalization Meet at the Crossroads of Industry 4.0: A Case of Augmented Digital Engineering. Syst. Eng. 2023, 26, 715–727.
17. Waugh, S.M. Determining a Digital Engineering Framework: A Systematic Review of What and How to Digitalize. Am. J. Manag. 2021, 21, 51–66.
18. Baldwin, K. Model-Based Systems Engineering: Enabling the Digital Engineering Practice in the Department of Defense. Get. It Right Q. Newsl. Mission Assur. 2017, 7, 5–7.
19. Noguchi, R.A.; Wheaton, M.J.; Martin, J.N. Digital Engineering Strategy to Enable Enterprise Systems Engineering. INCOSE Int. Symp. 2020, 30, 1727–1741.
20. Office of the Deputy Assistant Secretary of Defense (Systems Engineering). Department of Defense Digital Engineering Strategy; U.S. Department of Defense: Arlington, VA, USA, 2018.
21. Hagan, G. Glossary of Defense Acquisition Acronyms & Terms; Defense Acquisition University: Fort Belvoir, VA, USA, 2009.
22. Hale, J.P.; Zimmerman, P.; Kukkala, G.; Guerrero, J.; Kobryn, P.; Puchek, B.; Bisconti, M.; Baldwin, C.; Mulpuri, M. Digital Model-Based Engineering: Expectations, Prerequisites, and Challenges of Infusion; NASA, Marshall Space Flight Center: Huntsville, AL, USA, 2017.
23. McDermott, T.; Henderson, K.; Van Aken, E.; Salado, A.; Bradley, J. Measuring Systems Engineering Progress Using Digital Engineering. In Systems Engineering for the Digital Age; Verma, D., Ed.; Wiley: Hoboken, NJ, USA, 2023; pp. 137–147. ISBN 978-1-394-20328-4.
24. Hart, L. Introduction to Model-Based Systems Engineering (MBSE) and SysML. In Proceedings of the Delaware Valley INCOSE Chapter Meeting, Online, 30 July 2015.
25. INCOSE Technical Operations. Systems Engineering Vision 2020, Version 2.03; INCOSE-TP-2004-004-02; International Council on Systems Engineering: Seattle, WA, USA, 2007.
26. Grieves, M.W. Digital Twins: Past, Present, and Future. In The Digital Twin; Crespi, N., Drobot, A.T., Minerva, R., Eds.; Springer International Publishing: Cham, Switzerland, 2023; pp. 97–121. ISBN 978-3-031-21342-7.
27. Allison, D.L.; Cribb, M.W.; McCarthy, T.; LaRowe, R. Authoritative Sources of Truth and Consistency in Digital Engineering. In Proceedings of the AIAA SCITECH 2023 Forum, National Harbor, MD, USA, 23–27 January 2023; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2023.
28. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G.; The PRISMA Group. Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement. PLoS Med. 2009, 6, e1000097.
29. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 Statement: An Updated Guideline for Reporting Systematic Reviews. BMJ 2021, 372, n71.
30. Liberati, A.; Altman, D.G.; Tetzlaff, J.; Mulrow, C.; Gøtzsche, P.C.; Ioannidis, J.P.A.; Clarke, M.; Devereaux, P.J.; Kleijnen, J.; Moher, D. The PRISMA Statement for Reporting Systematic Reviews and Meta-Analyses of Studies That Evaluate Health Care Interventions: Explanation and Elaboration. J. Clin. Epidemiol. 2009, 62, b2700.
31. Wooley, A. Designing a Digital Engineering Curriculum for Manufacturing Education: Industry-Aligned Course Development. Manuf. Lett. 2025, 46, 34–36.
32. Wanyama, S.B.; McQuaid, R.W.; Kittler, M. Where You Search Determines What You Find: The Effects of Bibliographic Databases on Systematic Reviews. Int. J. Soc. Res. Methodol. 2022, 25, 409–422.
33. Mourão, E.; Pimentel, J.F.; Murta, L.; Kalinowski, M.; Mendes, E.; Wohlin, C. On the Performance of Hybrid Search Strategies for Systematic Literature Reviews in Software Engineering. Inf. Softw. Technol. 2020, 123, 106294.
34. Niazi, M. Do Systematic Literature Reviews Outperform Informal Literature Reviews in the Software Engineering Domain? An Initial Case Study. Arab. J. Sci. Eng. 2015, 40, 845–855.
35. Giachetti, R.E.; Vaneman, W. Requirements for a System Model in the Context of Digital Engineering. In Proceedings of the 2021 IEEE International Systems Conference (SysCon), Vancouver, BC, Canada, 15 April–15 May 2021; IEEE: Vancouver, BC, Canada, 2021; pp. 1–7.
36. See Tao, H.Y.; Hutchison, N.; Clifford, M.; Kerr, G.; Beling, P.; Sherburne, T.; Wach, P.; Long, D.; Arndt, C.; Verma, D.; et al. Challenges and Opportunities in the Digital Engineering Simulation Curriculum Development. In Proceedings of the 2023 IEEE International Systems Conference (SysCon), Vancouver, BC, Canada, 17–20 April 2023; IEEE: Vancouver, BC, Canada, 2023; pp. 1–7.
37. Güdemann, M.; Kegel, S.; Ortmeier, F.; Poenicke, O.; Richter, K. SysML in Digital Engineering. In Proceedings of the First International Workshop on Digital Engineering, Magdeburg, Germany, 14 June 2010; ACM: Magdeburg, Germany, 2010; pp. 1–8.
38. Platunina, G.P.; Salutina, T.Y.; Frank, I.A. Megatrends of Digital Engineering Technologies: Analysis of the Model of Integral Assessment of the State and Potential of Digital Development in the Conditions of the Fourth Industrial Revolution. In Proceedings of the 2023 Intelligent Technologies and Electronic Devices in Vehicle and Road Transport Complex (TIRVED), Moscow, Russia, 15–17 November 2023; IEEE: Moscow, Russia, 2023; pp. 1–4.
39. Campagna, J.M.; Bhada, S.V. A Capability Maturity Assessment Framework for Creating High Value Digital Engineering Opportunities. In Proceedings of the 2021 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Melbourne, Australia, 17–21 October 2021; IEEE: Melbourne, Australia, 2021; pp. 2542–2548.
40. Beshears, R.; Long, J. Digital Engineering Transformation for Reliability/Maintainability. In Proceedings of the 2024 Annual Reliability and Maintainability Symposium (RAMS), Albuquerque, NM, USA, 22–25 January 2024; IEEE: Albuquerque, NM, USA, 2024; pp. 1–5.
41. Baker, A.; Pepe, K.; Hutchison, N.; Tao, H.Y.S.; Peak, R.; Blackburn, M.; Khan, R.; Whitcomb, C. Enabling the Digital Transformation of the Workforce: A Digital Engineering Competency Framework. In Proceedings of the 2021 IEEE International Systems Conference (SysCon), Virtual, 15 April–15 May 2021; pp. 1–8.
42. Kraft, E.M. Digital Engineering Enabled Systems Engineering Performance Measures. In Proceedings of the AIAA Scitech 2020 Forum, Orlando, FL, USA, 6–10 January 2020; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2020.
43. Kraft, E.M. Value-Creating Decision Analytics in a Lifecycle Digital Engineering Environment. In Proceedings of the AIAA SciTech Forum, San Diego, CA, USA, 7–11 January 2019.
44. Dunbar, D.; Hagedorn, T.; Blackburn, M.; Dzielski, J.; Hespelt, S.; Kruse, B.; Verma, D.; Yu, Z. Driving Digital Engineering Integration and Interoperability through Semantic Integration of Models with Ontologies. Syst. Eng. 2023, 26, 365–378.
45. Beshears, R.; Crate, S. Digital Engineering Approach for Test (Manufacturing/Depot). In Proceedings of the 2023 IEEE AUTOTESTCON, National Harbor, MD, USA, 28–31 August 2023; IEEE: National Harbor, MD, USA, 2023; pp. 1–5.
46. Snider, C.; Gopsill, J.A.; Jones, S.L.; Emanuel, L.; Hicks, B.J. Engineering Project Health Management: A Computational Approach for Project Management Support Through Analytics of Digital Engineering Activity. IEEE Trans. Eng. Manag. 2019, 66, 325–336.
47. Verhoef, M.; Gerené, S.; Vorobiev, A.; Smiechowski, N.; Jahnke, S.; Knippschild, J.; Weikert, S.; Becker, M.; Paquay, S.; Vogt, J.P.H.; et al. Digital Engineering Hub Pathfinder. In Proceedings of the 2021 ACM/IEEE International Conference on Model Driven Engineering Languages and Systems Companion (MODELS-C), Virtual, 10–15 October 2021; pp. 467–476.
48. Dunbar, D.; Vierlboeck, M.; Blackburn, M. Use of Natural Language Processing in Digital Engineering Context to Aid Tagging of Model. In Proceedings of the 2023 IEEE International Systems Conference (SysCon), Vancouver, BC, Canada, 17–20 April 2023; IEEE: Vancouver, BC, Canada, 2023; pp. 1–8.
49. Regli, W.C.; Kopena, J.B.; Grauer, M. On the Long-Term Retention of Geometry-Centric Digital Engineering Artifacts. Comput.-Aided Des. 2011, 43, 820–837.
50. Tong, X.; Bao, J.; Tao, F. Co-Evolutionary Digital Twins: A Multidimensional Dynamic Approach to Digital Engineering. Adv. Eng. Inform. 2024, 61, 102554.
51. Calvert, M.E. Digital Engineering and U.S. Army Air Vehicle Technical Description Reports. In Proceedings of the AIAA Scitech 2020 Forum, Orlando, FL, USA, 6–10 January 2020; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2020.
52. Brown, G.; Jain, A. Model-Based Test Engineering—Increasing the Value Test Provides in the Wide World of Digital Engineering. In Proceedings of the 2023 IEEE AUTOTESTCON, National Harbor, MD, USA, 28–31 August 2023; IEEE: National Harbor, MD, USA, 2023; pp. 1–5.
53. Sherry, J. Digital Engineering Enables Multinational Input on Bergen's Light Rail Extension, Norway. Proc. Inst. Civ. Eng.—Civ. Eng. 2018, 171, 49–56.
54. Hussain, Z.I.; Sivarajah, U.; Hussain, N. The Role of a Digital Engineering Platform in Appropriating the Creation of New Work-Related Mind-Set and Organisational Discourse in a Large Multi-National Company. Int. J. Inf. Manag. 2019, 48, 218–225.
55. Zhou, L. Research on Digital Engineering Modeling System under Computer Virtual Reality Technology. In Proceedings of the 2023 IEEE 3rd International Conference on Power, Electronics and Computer Applications (ICPECA), Shenyang, China, 29–31 January 2023; IEEE: Shenyang, China, 2023; pp. 887–891.
56. LeVine, M.J.; Chell, B.; Grogan, P.T. Leveraging a Digital Engineering Testbed to Explore Mission Resilience for New Observing Strategies. In Proceedings of the AIAA SCITECH 2023 Forum, National Harbor, MD, USA, 23–27 January 2023; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2023.
57. Reinhart, C.M.; Huffman, J.E. The Application of Digital Engineering at a Tier 1 Supplier and the Challenges of Integrating throughout the Supply Chain (OEM to Tier 1 Supplier and Below). In Proceedings of the AIAA AVIATION 2023 Forum, San Diego, CA, USA, 12–16 June 2023; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2023.
58. Duque, E.P.; Morton, S.A.; Wissink, A.M.; Stone, S.W.; Spotz, W.; Caraway, D.J.; Legensky, S.M. Summary of the CFD 2030 Integration Committee Invited Panel on Physics Based Model Improvement and Uncertainty Quantification for the Digital Engineering Transformation. In Proceedings of the AIAA SCITECH 2023 Forum, National Harbor, MD, USA, 23–27 January 2023; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2023.
59. Capodieci, A.; Mainetti, L.; Alem, L. An Innovative Approach to Digital Engineering Services Delivery: An Application in Maintenance. In Proceedings of the 2015 11th International Conference on Innovations in Information Technology (IIT), Dubai, United Arab Emirates, 1–3 November 2015; IEEE: Dubai, United Arab Emirates, 2015; pp. 342–349.
60. Dahmann, J.; Khaw, A.; Biloiu, I.; Jacobs, R.; Kim, C.; Thompson, C. Digital Engineering of Large Scale System of Systems: End-to-End (E2E) Modeling and Analysis Environment. In Proceedings of the 2021 16th International Conference of System of Systems Engineering (SoSE), Västerås, Sweden, 14–18 June 2021; IEEE: Västerås, Sweden, 2021; pp. 120–125.
61. Gopalakrishnan, S.; Hartman, N.W.; Sangid, M.D. A Digital Engineering Framework to Facilitate Automated Data Exchange between Geometric Inspection and Structural Analysis. Adv. Eng. Softw. 2023, 183, 103498.
62. Luquin, V.; Lai, A.S.; Latkin, A.W.; Hashii, W.N. Safe and Efficient Flight Test Execution through Digital Engineering: High-Fidelity Loads Regression Prediction Generation. In Proceedings of the AIAA AVIATION 2021 FORUM, Virtual Event, 2–6 August 2021; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2021.
63. Tao, H.Y.S.; Hutchison, N.; Beling, P.; Arndt, C.; Blackburn, M.R.; Sherburne, T.; Wach, P.; Long, D.; Verma, D.; McDermott, T.A. Initial Development of a Roadmap for Digital Engineering Simulations Curriculum. In Proceedings of the 2022 IEEE International Systems Conference (SysCon), Montreal, QC, Canada, 25–28 April 2022; IEEE: Montreal, QC, Canada, 2022; pp. 1–7.
64. Maier, M.W. Adapting the Hatley-Pirbhai Method for the Era of SysML and Digital Engineering. In Proceedings of the 2022 IEEE Aerospace Conference (AERO), Big Sky, MT, USA, 5–12 March 2022; IEEE: Big Sky, MT, USA, 2022; pp. 1–12.
65. Pennock, M.J.; Driscoll, G.I.; Dahmann, J.S.; Adams, M. Enabling Mission Engineering through a Reusable Digital Engineering Environment. In Proceedings of the 2022 IEEE International Systems Conference (SysCon), Montreal, QC, Canada, 25–28 April 2022; IEEE: Montreal, QC, Canada, 2022; pp. 1–8.
66. Orellana, D.; Mandrick, W. The Ontology of Systems Engineering: Towards a Computational Digital Engineering Semantic Framework. Procedia Comput. Sci. 2019, 153, 268–276.
67. Volkmann, J.W.; Westkämper, E. Cost Model for Digital Engineering Tools. Procedia CIRP 2013, 7, 676–681.
68. Mitola, J.; Prys, M. Cyber Oriented Digital Engineering. Syst. Eng. 2024, 27, 109–119.
69. Kraft, E.M. dTES—A Digital Engineering Enabled T&E Strategy. In Proceedings of the AIAA SCITECH 2022 Forum, San Diego, CA, USA, 3–7 January 2022; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2022.
70. Wheaton, J.S.; Herber, D.R. Seamless Digital Engineering: A Grand Challenge Driven by Needs. In Proceedings of the AIAA SCITECH 2024 Forum, Orlando, FL, USA, 8–12 January 2024; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2024.
71. Martin, J.N.; Noguchi, R.A.; Minnichelli, R.J.; Wheaton, M.J. Implementing Enterprise Systems Engineering Enabled by the Digital Engineering Approach. In Proceedings of the ASCEND 2020, Online, 16–18 November 2020.
72. Perzylo, A.; Kessler, I.; Profanter, S.; Rickert, M. Toward a Knowledge-Based Data Backbone for Seamless Digital Engineering in Smart Factories. In Proceedings of the 2020 25th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), Vienna, Austria, 8–11 September 2020; IEEE: Vienna, Austria, 2020; pp. 164–171.
73. McDermott, T.; Collopy, P.; Nadolski, M.; Paredis, C. The Future Exchange of Digital Engineering Data and Models: An Enterprise Systems Analysis. Procedia Comput. Sci. 2019, 153, 260–267.
74. Miller, M.E.; Spatz, E. A Taxonomy of Metrics for Human Representations in Digital Engineering. Syst. Eng. 2024, 27, 315–325.
75. Taylor, N.J. Digital Engineering: Recognizing and Honing Our 6th Sense with Respect to Physical Modelling. In Proceedings of the AIAA SCITECH 2023 Forum, National Harbor, MD, USA, 23–27 January 2023; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2023.
76. Baker, A.; Pepe, K.; Hutchison, N.; Blackburn, M.; Khan, R.; Peak, R.; Wade, J.; Whitcomb, C. Preparing the Acquisition Workforce: A Digital Engineering Competency Framework. In Proceedings of the 2020 IEEE International Systems Conference (SysCon), Montreal, QC, Canada, 24–27 August 2020; IEEE: Montreal, QC, Canada, 2020; pp. 1–6.
77. Rhodes, D.H. Model Curation: Requisite Leadership and Practice in Digital Engineering Enterprises. Procedia Comput. Sci. 2019, 153, 233–241.
78. Chen, Y. Improvement of the Digital Engineering Quality Control System by Dynamic Programming Method. In Proceedings of the 2024 Second International Conference on Data Science and Information System (ICDSIS), Hassan, India, 17–18 May 2024; IEEE: Hassan, India, 2024; pp. 1–5.
79. Zuo, X.; Huang, J. Intelligent Algorithms Optimize the Level of Digital Engineering Quality Management. In Proceedings of the 2024 Second International Conference on Data Science and Information System (ICDSIS), Hassan, India, 17–18 May 2024; IEEE: Hassan, India, 2024; pp. 1–5.
80. Ye, P. Global Analysis of Quality Problems in Digital Engineering by Whale Algorithm. In Proceedings of the 2024 IEEE 13th International Conference on Communication Systems and Network Technologies (CSNT), Jabalpur, India, 6–7 April 2024; IEEE: Jabalpur, India, 2024; pp. 1151–1156.
81. Graves, R.E. Digital Engineering Influences on Formation Flying Technology Development. In Proceedings of the AIAA Scitech 2020 Forum, Orlando, FL, USA, 6–10 January 2020; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2020.
82. Hane Hagström, M.; Bergsjö, D.; Wahrén, H. Barriers from A Socio-Technical Perspective to Implement Digitalisation in Industrial Engineering Processes—A Literature Review. Proc. Des. Soc. 2023, 3, 737–746.
83. Gkrimpizi, T.; Peristeras, V.; Magnisalis, I. Classification of Barriers to Digital Transformation in Higher Education Institutions: Systematic Literature Review. Educ. Sci. 2023, 13, 746.
84. Mannepalli, R.Y. Digital Engineering Is Too Important to Be Left Alone to MBSE Practitioners. In Proceedings of the AIAA AVIATION 2023 Forum, San Diego, CA, USA, 12–16 June 2023; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2023.
85. Stark, R.; Brandenburg, E.; Lindow, K. Characterization and Application of Assistance Systems in Digital Engineering. CIRP Ann. 2021, 70, 131–134.
86. Honoré-Livermore, E.; Birkeland, R.; Bakken, S.; Garrett, J.L.; Haskins, C. Digital Engineering Development in an Academic CubeSat Project. J. Aerosp. Inf. Syst. 2022, 19, 649–660.
Figure 1. Systematic literature review steps.
Figure 2. PRISMA diagram.
Figure 3. Digital Engineering publications by year.
Figure 4. Categorization for general DE and subset DE case studies.
Figure 5. Industry context of the case studies.
Figure 6. Percentage of papers discussing motivations for applying DE.
Figure 7. Thematic categorization of motivations and benefits for applying DE.
Figure 8. Frequency of DE components discussed in the literature.
Figure 9. Industry-specific adoption of DE components.
Figure 10. Frequency of tools used in DE.
Figure 11. Evolution of DE components mentioned.
Figure 12. Proportional distribution of barriers to DE implementation by sectors.
Figure 13. Frequency of solution categories.
Figure 14. Percent of papers proposing a new methodology/framework.
Figure 15. Percent of proposed frameworks influenced by the 2018 DoD Strategy.
Figure 16. Percent of papers that cite barriers per industry.
Figure 17. The most prevalent barriers found among the defense sector.
Figure 18. The most prevalent barriers found among the non-defense sector.
Figure 19. Explicit mentions of each goal found in relevant papers.
Figure 20. Heatmap of co-occurrence of goals.
Figure 21. Funding categories for all papers mentioning funding source.
Figure 22. Funding categories for implicit mentions of goals.
Figure 23. Funding categories for explicit mentions of goals.
Figure 24. DoD DE Strategy goals rhetorically versus operationally.
Table 1. Frequency of cited sources for DE definitions.
Definition Source | Frequency | % of Definitions
DoD (DoD DE Strategy [20] and DAU Glossary [21]) | 13 | 56.5%
No Reference Provided | 8 | 34.8%
Other Sources [35,36] | 2 | 8.7%
Total | 23 | 100%
Table 2. Definitions of Digital Engineering provided without reporting references.
Definition ID | Definition Used with No Source Provided
1 [37] | “Digital Engineering is a relatively new discipline, which aims at minimizing friction losses, when different disciplines meet each other.”
2 [38] | “Digital Engineering is a modern methodology that uses digital technologies and tools to integrate, optimize, and manage complex systems, processes, and data.”
3 [39] | “Digital Engineering is the term being used to represent the powerful digital transformation effects of new transdisciplinary innovations and hyper-connectivity that are occurring in engineering environments.”
4 [40] | “A digital environment is an integrated ecosystem of data interaction which fosters model information interaction and utilization across the product lifecycle.”
5 [41] | “Digital Engineering updates the systems engineering practices to take full advantage of computational technology, modeling, data analytics, and data sciences.”
6 [13,42] | “Digital Engineering, a key enabler for taking full advantage of the rapidly expanding, global digital revolution, is the application of model-based engineering to create and apply enduring authoritative truth sources to improve engineering practice and decision-making under risk.”
7 [43] | “Digital Engineering is articulated in this paper as the application of model-based, enduring, authoritative truth sources to improve engineering practice and decision analytics through a digital ecosystem to transform the culture and the work of creating value for product development and support.”
Table 3. Subset of DE case studies.
Subset | Mentioned | Reference
MBSE | 10 | [1,38,44,45,46,47,48,49,50,51]
ASoT | 5 | [27,47,52,53,54]
Simulation | 4 | [36,55,56,57]
Digital Thread | 4 | [45,47,51,54]
Digital Twin | 3 | [46,50,58]
Augmented Reality | 1 | [59]
Ontology | 1 | [1]
SoS, E2E Modeling | 1 | [60]
Table 4. Category requirements for classifying challenges.
Category | Key Barriers
Technical | Interoperability, data integration, lack of standardization
Cultural | Organizational resistance, workforce skill gaps, and inadequate training
Financial | High initial investment and unclear return on investment
Regulatory | Data security and compliance constraints
Table 5. DE solutions.
Solutions | Keywords | Papers
Digital Engineering Frameworks and Models | CODE Framework, DECMAF, DEFII Framework, CoEDT Framework, Strategic DE Framework, EPHM, Digital Factory Economics (DFE), Seamless DE System of Systems | [39,43,44,50,67,68,69,70,71]
Ontology and Semantic Integration | DEFII Ontology-Aligned Data, OWL Ontologies, Semantic Web, Computable SE Ontology (OWL-RDF), Lexicon Governance, Ontology Metadata, Digital Archives | [35,40,44,49,66,72,73]
Standardization and Metrics | Standardized Metrics Taxonomy, Standardized Data Interfaces, MBTE, IEEE Standards, ECSS Standards, BIM-based Standards, Spatial Awareness Standards, Surrogate and Reduced-order Models, Workflow Automation | [1,35,45,47,49,52,53,58,74,75]
Assessment, Capability, and Competency Development | Digital Readiness, DECMAF, DECF, Workforce Training, Project Health Management, Competency Assessments, DFE | [14,38,39,41,43,46,67,71,76]
Digital Threads and ASoT | Digital Thread Integration, ASoT, Model-Centric Test and Evaluation, Configuration Management, Traceability, Legacy Systems Integration | [27,43,44,47,51,52,57,69]
Governance and Leadership | Governance Structures, Model Curation, Leadership Roles, Change Management, Project Health Management, Quality Assurance, Enterprise Collaboration | [27,38,46,54,57,69,71,73,77]
Machine Learning and Optimization | Dynamic Programming, Intelligent Algorithms, Optimization Schemes, Whale Algorithm, Machine Learning, QMU, Bayesian Methods, Surrogate Modeling, Performance Metrics Optimization | [13,44,78,79,80]
Simulation and Digital Twins | SysML Multi-layered Architectures, Simulation Training Environments, Digital Twin Integration, Agent-Based Simulation, Living Digital Textbook, Technology Infusion, Uncertainty Quantification, Surrogate and Reduced-order Models | [36,50,58,63,65,81]
Human Integration and Visualization | Human-Centric Metrics, AR Integration, BPMN Workflows, Human-in-the-Loop Validation, Clash Detection Visualization | [27,53,59,74]
Modular Reuse and Interoperability | Lightweight Modular Modeling, Model Data Reuse, SysML Architecture Models, Central Data Hub, Reusable Seed Models, CAD Integration, Troubleshooting Model Reuse, Multi-layered DE Environment | [45,47,50,52,53,60,65,67]
Training and Workforce Knowledge | Workforce Development, Competency Training, Knowledge Management, Curriculum Reviews, Platform Training, Educational Modules, Strategic Digital Training Programs | [14,35,36,38,41,54,71,76]
Cybersecurity and Infrastructure | Zero Trust Architecture, Hybrid Cloud, Hardware Cyber Hardening, Cyber Resilience Modules | [63,66]
Data Management and Archival | Data Governance, Data Archives, CDE, Standardized Libraries, Workflow Dependencies | [47,49,52,53]
Decision Analytics and Resource Optimization | Real-Time Analytics, Resource Allocation, Cost-Benefit Analysis, Intelligent Optimization, Agent-Based Simulation, Digital Surrogate Truth Sources | [43,46,67,79,80,81]
Spatial and Discretization Methods | Spatial Awareness, Discretization Error Control, Parametric Digital Models | [63,75]
