Article

Decision Analysis Data Model for Digital Engineering Decision Management

1 Department of Industrial Engineering, University of Arkansas, Fayetteville, AR 72701, USA
2 Edwardson School of Industrial Engineering, Purdue University, West Lafayette, IN 47907, USA
3 Deloitte Consulting LLP, Fredericksburg, VA 22408, USA
4 Deloitte Consulting LLP, Tullahoma, TN 37388, USA
5 SAIC, Stockholm, NJ 07460, USA
6 Deloitte Consulting LLP, Miami, FL 33133, USA
7 Deloitte Consulting LLP, Fairfax, VA 22031, USA
* Author to whom correspondence should be addressed.
Systems 2025, 13(7), 596; https://doi.org/10.3390/systems13070596
Submission received: 16 May 2025 / Revised: 19 June 2025 / Accepted: 4 July 2025 / Published: 17 July 2025

Abstract

Decision management is the systems engineering life cycle process for making program/system decisions. The purpose of the decision management process is: “…to provide a structured, analytical framework for objectively identifying, characterizing and evaluating a set of alternatives for a decision at any point in the life cycle and select the most beneficial course of action”. Systems engineers and systems analysts need to inform decisions in a digital engineering environment. This paper describes a Decision Analysis Data Model (DADM) developed in model-based systems engineering software to provide the process, methods, models, and data to support decision management. DADM can support digital engineering for waterfall, spiral, and agile development processes. This paper describes the decision management processes and provides the definition of the data elements. DADM is based on ISO/IEC/IEEE 15288, the INCOSE SE Handbook, the SE Body of Knowledge, the Data Management Body of Knowledge, systems engineering textbooks, and journal articles. The DADM was developed to establish a decision management process and data definitions that organizations and programs can tailor for their system life cycles and processes. The DADM can also be used to assess organizational processes and decision quality.

1. Introduction

To address the problem of the lack of a widely available and reusable model-based decision support system for systems engineers, the INCOSE Decision Analysis Working Group developed a Decision Analysis Data Model (DADM) in Model-Based Systems Engineering software that provides decision management guidance to support multi-factored decisions, such as design comparisons or trade studies, while leveraging a model-based environment to improve how those decisions are analyzed and communicated. This data model was developed using the decision management methodology defined in the INCOSE Systems Engineering Handbook [1], and defines the steps in the decision management process and identifies the data exchanged between those steps. The model can accelerate trade-off analyses, increase consistency, and support the documentation of decision outcomes in a digital model, enabling collaboration in a digital ecosystem in all life cycle stages.
The importance of this work is amplified by the INCOSE Systems Engineering Vision 2035 [2], which outlines several challenges that must be addressed to achieve the vision for the future state of systems engineering, including the following:
  • Enable trusted collaboration and interactions through a digital ecosystem;
  • Provide analytical frameworks for managing the lifecycle of complex systems;
  • Widely adopt reuse practices.

1.1. Review of Research on Decision Analysis Data in Systems Engineering

The need for systems engineers to obtain a comprehensive set of data to support system engineering decision making has been documented and reinforced across many decades.
  • MIL-STD 499B [3], published in 1994, describes a requirement for a decision data base, which is “a repository of information used and generated by the systems engineering process, at the appropriate level of detail. The intent of the decision data base is that, when properly structured, it provides access to the technical information, decisions, and rationale that describe the current state of system development and its evolution”.
  • The NASA Systems Engineering Handbook was initially published as NASA/SP-6105 in 1995. The 2007 revision describes a need to obtain a comprehensive set of data to support decision making while the initial 1995 publication does not. The most recent version of the NASA handbook [4], published in 2019, states, “Once the technical team recommends an alternative to a NASA decision-maker (e.g., a NASA board, forum, or panel), all decision analysis information should be documented. The team should produce a report to document all major recommendations to serve as a backup to any presentation materials used. The important characteristic of the report is the content, which fully documents the decision needed, assessments done, recommendations, and decision finally made.” In addition to prescribing the need to document decision analysis information, the NASA Systems Engineering Handbook prescribes that the process must be risk-informed, which may include both qualitative and quantitative techniques.
  • The INCOSE Systems Engineering Handbook was initially published in 1997. Version 4 of the Handbook, published in 2015, was the first version to describe the need to obtain a comprehensive set of data to support decision making. The INCOSE Systems Engineering Handbook [1], published in 2023, states “Decisions should be documented using digital engineering artifacts. Reports that include the analysis, decisions, and rationale are important for historical traceability and future decisions”. The INCOSE Systems Engineering Handbook prescribes that the process must identify uncertainties and conduct probabilistic analysis.

1.2. Review of Research on Architectural Patterns, Reference Models, and Reference Architectures for Engineering Decisions

Bass, Clements, and Kazman [5] provide a framework for developing software that can be applied to develop a Decision Analysis Data Model to capture digital engineering artifacts that document decisions. They define three architectural structures:
  • An architectural pattern is a description of element and relation types together with a set of constraints on how they may be used. A pattern can be thought of as a set of constraints on an architecture—on the element types and their patterns of interaction—and these constraints define a set or family of architectures that satisfy them. One of the most useful aspects of patterns is that they exhibit known quality attributes.
  • A reference model is a division of functionality together with data flow between the pieces. A reference model is a standard decomposition of a known problem into parts that cooperatively solve the problem. Reference models are a characteristic of mature domains.
  • A reference architecture is a reference model mapped into software elements (that cooperatively implement the functionality defined in the reference model) and the data flows between them. Whereas a reference model divides the functionality, a reference architecture is the mapping of that functionality onto a system decomposition.
There are many examples of architectural patterns, reference models, and reference architectures that have contributed to the advancement of digital engineering. Some important ones are shown in Table 1 in historical order.
Early efforts in digital engineering focused on computer-aided manufacturing. The U.S. Air Force Program for Integrated Computer-Aided Manufacturing (ICAM) developed a series of models to increase manufacturing productivity through systematic application of computer technology [6]. Three of these models were (1) IDEF0, a functional model; (2) IDEF1, an information or data model; and (3) IDEF2, a dynamics model. The National Institute of Standards and Technology eventually released standards for IDEF0 and IDEF1 that were later withdrawn [7,8].
The Purdue Enterprise Reference Architecture (PERA) was a reference model for enterprise architecture, developed by Theodore J. Williams and members of the Industry-Purdue University Consortium for Computer Integrated Manufacturing [9]. The model allowed end users, integrators, and vendors to share information across five levels in an enterprise:
  • Level 0—The physical manufacturing process.
  • Level 1—Intelligent devices in the manufacturing process such as process sensors, analyzers, actuators, and related instrumentation.
  • Level 2—Control systems for the manufacturing process such as real-time controls and software, human–machine interfaces, and supervisory control and data acquisition (SCADA) software.
  • Level 3—Manufacturing workflow management such as batch management; manufacturing execution/operations management systems (MES/MOMS); laboratory, maintenance, and plant performance management systems; and data collection and related middleware.
  • Level 4—Business logistics management such as enterprise resource planning (ERP), which establishes the plant production schedule, material use, shipping, and inventory levels.
The PERA was adopted and promulgated globally as a series of industry standards, known as ISA-95, that has had a significant impact on automation in the industrial supply chain worldwide [10,11,12,13,14,15,16,17].
Early efforts in reference models for systems engineering were summarized by Long in 1995 and included the Functional Flow Block Diagram (1989), the Data Flow Diagram (1979), the N2 Chart (1968), the IDEF0 Diagram (1989), and the Behavior Diagram (1968) [18].
The Unified Modeling Language (UML) provides system architects, software engineers, and software developers with tools for analysis, design, and implementation of software-based systems as well as for modeling business and similar processes [19]. UML has three diagram types [26]:
  • Structure, which includes Class, Component, Composite Structure, Deployment, Object, and Package diagrams;
  • Behavior, which includes Activity, State machine, and Use Case diagrams;
  • Interaction, which includes Collaboration—Communication, Interaction Overview, Sequence, and Timing diagrams.
Business Process Model and Notation (BPMN) has become the de facto standard for business process diagrams [19]. It is used for designing, managing, and implementing business processes, and it is precise enough to be translated into executable software systems. It uses a flowchart-like notation that is independent of any implementation environment. The four basic elements of BPMN are flow objects, which represent events, activities, and gateways; connecting objects, which represent sequence flows, message flows, and associations; swim lanes, which are a mechanism for organizing and categorizing activities; and artifacts, which represent data objects, groups, and annotations.
Mendonza and Fitch developed the Decision Network Framework to link business, technology, and design choices [21,22]. It provides decision networks that can be used to set both the business vision and product concept for any product, system, or solution. It also has a decision-centric information model of the critical decisions and models the relationships between decision data, the requirements that drive these decisions, and the consequences that flow from them. Risk and opportunity identification and analysis are included in the framework, but the framework does not prescribe quantitative, probability-based methods for risk and opportunity identification and analysis. This framework was implemented in software and marketed as a proprietary product named Decision Driven® Design (US Trademark 75771618) from 2003 to 2024.
Systems Modeling Language (SysML) is a general-purpose modeling language for model-based systems engineering (MBSE) [23]. It provides the capability to represent a system using the three UML diagram types and adds a fourth diagram type [26]:
  • Structure, which includes Class—renamed to be Block, Package, and Parametric (new) diagrams and eliminates Component, Composite Structure, Deployment, and Object diagrams;
  • Behavior, which includes Activity (modified), State Machine, and Use Case diagrams;
  • Interaction, which includes Sequence diagrams and eliminates Collaboration—Communication, Interaction Overview, and Timing diagrams; and
  • Requirements (new), which includes the Requirement (new) diagram.
SysML has been adopted widely within the systems engineering community—a search for the term “SysML” in the 154 papers that were presented at the 2024 INCOSE International Symposium yielded 34 papers that included the term at least once [27].
Decision Model and Notation (DMN) [24] is a standard approach for describing and modeling repeatable operational decisions within organizations to ensure that decision models are interchangeable across organizations. The standard includes three main elements:
  • Decision Requirements Diagrams, which show how the elements of decision-making are linked into a dependency network;
  • Decision tables, which represent how each decision in such a network can be made;
  • Business context for decisions, such as the roles of organizations or the impact on performance metrics.
DMN is designed to work alongside BPMN and the Case Management Model and Notation (CMMN) [28], providing a mechanism to model the decision-making associated with processes and cases. The decisions that can be modeled in DMN are deterministic decision responses to deterministic inputs, i.e., they are prescriptive models like programming flowcharts. DMN includes examples where risk, affordability, and eligibility are incorporated via qualitative categorizations that serve as deterministic inputs to the decision prescriptions.
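As a concrete, purely illustrative example of such deterministic decision logic, the following Python sketch encodes a small decision table whose qualitative risk, affordability, and eligibility categories map to a prescribed outcome under a first-hit policy. The rule set and category names are invented for illustration and are not drawn from the DMN specification or from DADM.

```python
# Minimal sketch of a DMN-style decision table: deterministic rules map
# qualitative inputs to a prescribed outcome. The rules and categories
# below are illustrative only.
RULES = [
    # (risk_category, affordability, eligibility) -> prescribed decision
    (("low",    "affordable",   "eligible"), "approve"),
    (("medium", "affordable",   "eligible"), "approve with review"),
    (("high",   "affordable",   "eligible"), "refer to decision authority"),
    (("low",    "unaffordable", "eligible"), "defer"),
]

def decide(risk_category, affordability, eligibility, default="reject"):
    """Return the outcome of the first matching rule (a 'first hit' policy)."""
    key = (risk_category, affordability, eligibility)
    for condition, outcome in RULES:
        if condition == key:
            return outcome
    return default

print(decide("medium", "affordable", "eligible"))  # -> approve with review
```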
The Agile Systems Engineering Life Cycle Model (ASELCM) MBSE Pattern [25] accounts for information that is produced and consumed by systems engineering and other life cycle management processes. The ASELCM MBSE Pattern uses information models, not just process models, and it emphasizes learning in the presence of uncertainty. The information models in the MBSE pattern are intended to be built using composable knowledge. Davis states, “Composability is the capability to select and assemble components in various combinations to satisfy specific user requirements meaningfully. In modeling and simulation, the components in question are themselves models and simulations…. It is also useful to think of composability as a property of the models rather than of the particular programs.” [29].
Specking et al. use MBSE with Set-Based Design for Unmanned Aeronautical Vehicles [30]. Fang et al. use architecture design space generation with MBSE [31].

1.3. Main Aim of DADM and Principal Conclusions

The main aim of DADM was to develop a model that has the following characteristics:
  • Enables trusted collaboration and interactions through a digital ecosystem;
  • Is deployable and configurable for multiple decision domains;
  • Provides an analytical framework for decision making across the lifecycle stages of complex systems;
  • Will be widely adopted;
  • Enables traceability and reuse of analysis, decisions, and rationale for decisions;
  • Incorporates guidelines for identifying uncertainties and conducting probabilistic analysis and for documenting the rationale and results;
  • Provides information models built using composable knowledge and process models that emphasize learning in the presence of uncertainty;
  • Can be tailored to agile development for DevOps.
To meet these aims, the DADM was developed using composable SysML activity diagrams for process modeling and block definition diagrams for information modeling. The Magic System of Systems Architect was selected to implement the model, as it is designed for trusted digital collaboration, supports traceability and reuse, allows guidelines and documentation to be captured, and can be tailored.

2. Methods

The methods used in developing DADM follow the data modeling approach of the International Data Management Association (DAMA) and were informed by the INCOSE Agile Systems Engineering Life Cycle Model (ASELCM). Both DAMA and ASELCM shaped the approach described below.

2.1. Data Modeling

According to DAMA, Data Management is a wide-ranging set of activities, which includes the abilities to make consistent decisions about how to get strategic value from data [32]. These activities are organized into the Data Management Framework, which identifies ten (10) categories of data management activities, which interact with an organization’s data governance to inform the data, information, and content lifecycle. These ten (10) data management activities begin with the definition of data architecture and design using data models. “Data modeling is the process of discovering, analyzing, and scoping data requirements, and then representing and communicating these data requirements in a precise form called the data model. This process is iterative and involves conceptual, logical, and physical models” [32].
The Decision Analysis Data Model mapped the inputs and outputs of the Decision Management processes defined in ISO/IEC/IEEE 15288 [33], the INCOSE Systems Engineering Handbook [1], and the SEBoK [34] to identify the high-level ‘decision concepts’ needed to execute the high-level Decision Management process. From there, the conceptual data model was decomposed and traced to an implementation-agnostic logical model, which defines data-driven processes and the data needs connecting them. Finally, implementation-specific physical data models would be tailored from the logical data model on a project-by-project basis.

2.1.1. Conceptual Data Model (CDM)

“A conceptual data model captures the high-level data requirements as a collection of related concepts. It contains only the basic and critical business entities within a given realm and functions.” [32]. In this regard, the conceptual DADM provides an executive-friendly definition of the key concepts that apply to business needs for their decision making. For example, the concept of a decision should include definitions of the decision itself, including the decision-maker(s), the alternatives being considered, and the values against which those alternatives will be evaluated. A summary of the DADM’s conceptual model is provided in Section 3.1.

2.1.2. Logical Data Model (LDM)

A CDM is too high-level to be implemented. The CDM defines needs without mapping those needs to data solutions. “A logical data model (LDM) captures the detailed data requirements within the scope of the CDM.” [32]. For example, whereas our CDM defines the decision concept as including a decision-maker, a set of alternatives, and a set of values, our LDM maps those to specific data points to include a decision authority, courses of action, and objectives. As the LDM was further refined, those data needs also included properties such as data attributes and domains, and the structure of data began to take shape without discussing additional properties of the data structure, such as format and validation rules, which are reserved for the physical data model. A summary of the DADM’s logical data model is captured in Section 3.2.
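As a purely illustrative sketch of this conceptual-to-logical refinement, the decision concept can be decomposed into a decision authority, courses of action, and objectives. The class and attribute names below are our own shorthand for illustration, not normative DADM data elements.

```python
# Hypothetical sketch of refining the conceptual decision concept into
# logical data points: decision authority, courses of action, objectives.
# Names are illustrative, not normative DADM data elements.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Objective:                     # logical refinement of a conceptual "value"
    name: str
    value_measure: str               # e.g., a direct, constructed measure

@dataclass
class CourseOfAction:                # logical refinement of a conceptual "alternative"
    name: str
    description: str = ""

@dataclass
class Decision:                      # conceptual "decision" with its logical data points
    statement: str
    decision_authority: str          # refinement of the conceptual "decision-maker"
    courses_of_action: List[CourseOfAction] = field(default_factory=list)
    objectives: List[Objective] = field(default_factory=list)

decision = Decision(
    statement="Select the UAV sensor payload",
    decision_authority="Program Manager",
    courses_of_action=[CourseOfAction("EO/IR payload A"), CourseOfAction("EO/IR payload B")],
    objectives=[Objective("Maximize detection distance",
                          "Kilometers at 50% detection probability")],
)
```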

2.1.3. Physical Data Model

“Logical data models require modifications and adaptations in order to have the resulting design perform well within storage applications.” [32]. Therefore, the implementation specifics for any given solution are defined in the Physical Data Model for that individual system. Note that the purpose of the Decision Analysis Data Model is to provide a reference for implementation for systems engineers and developers. Therefore, the DADM does not include a Physical Data Model.

2.2. Agile Systems Engineering Life Cycle Model

The INCOSE Agile Systems Engineering Life Cycle Model (ASELCM) has three major systems (Figure 1). The INCOSE ASELCM is a model of learning patterns that describes how systems improve through three interconnected levels: (1) the Target System; (2) the Lifecycle Domain System; and (3) the System of Innovation. System 1 of the ASELCM represents an engineered system of interest (and its components) about which development and operations decisions are being made. Note that many different instances of System 1 may be present over time. Examples include aircraft, automobiles, telephones, satellites, software systems, data centers, and health care delivery systems. System 2 is the life cycle domain system, which is the system within which different instantiations of System 1 will exist during their life cycle. This includes any system that directly manages the life cycle of an instance of a target system during its development, production, integration, maintenance, and operations. If System 1 is managed as a program over its life cycle, then System 2 can be thought of as a program office responsible for the life cycle stages of System 1 (and the program’s associated processes). Different instances of System 2 can occur, e.g., one instance could be a program office responsible for development, production, and integration, and another instance could be a program office responsible for maintenance and operations. System 3 is the system of innovation that includes System 1 and System 2, and that is additionally responsible for managing the life cycles of instances of any System 2. System 3 develops, deploys, and manages System 2 work processes and evaluates them for improvements. System 3 can be thought of as the enterprise innovation system, e.g., an organization that develops, produces, and integrates many diverse kinds of systems, or a user that operates, integrates, and maintains many different systems. The System of Innovation produces better lifecycle processes, and better lifecycle processes are implemented to produce better target systems.
Figure 2 shows how DADM can be embedded in the ASELCM to support digital engineering. The System 3 enterprise-level system of innovation can deploy an instance of the DADM logical model to a System 2 program management entity within the enterprise. The System 2 life cycle manager is responsible for using the logical process model to develop a physical model of the DADM artifacts. Physical models can be implemented using a combination of various methods such as model-based systems engineering software, product lifecycle management software, and relational databases. In theory, even paper-based documentation libraries could be used to implement a physical model; however, the purpose of DADM is to enable a fully digital implementation of the physical model. The physical models can be seen as structured templates that are populated with the actual data for a specific life cycle decision that is being made for a specific System 1 instance.

2.3. Data Model Validation

From a design perspective, data models are validated by mapping inputs and outputs through an organization’s business processes, by assuring traceability across the conceptual, logical, and physical models, by verifying that the models’ activities and data conform to best practices documented in the literature, and by independent review of the model by subject matter experts. The business processes for the DADM are the Decision Management processes defined in ISO/IEC/IEEE 15288 [33], the INCOSE Systems Engineering Handbook [1], and the SEBoK [34]. Traceability for the DADM is achieved by decomposing the conceptual data model activities and data in a traceable manner to the logical model. Validating that the models’ activities and data conform to best practices is done by following relevant standards, bodies of knowledge, systems engineering textbooks, and journal articles in developing the models. Conformance to best practices is demonstrated in this article by the extensive citations in Section 3 and Appendix A. DADM has been reviewed by the INCOSE Impactful Products Committee and approved for release in the INCOSE Systems Engineering Laboratory [36].
Validation of the feasibility and effectiveness of the DADM model will be achieved by seeking government and industry users who will deploy the DADM MBSE implementation and provide feedback on its use in their organizations.

3. Results—The Decision Analysis Data Model

This section describes the DADM conceptual and logical data model for each of the ten steps in the Decision Management Process. Section 3.1 describes the ten steps in the process model and provides illustrative data artifacts for each step. For each step, we describe the purpose and some of the systems engineering methods used to create decision management data. We also provide tailoring and reuse guidance for each step. Section 3.2 provides the Logical Process Model Activity Diagram and illustrates key concepts for each of the ten steps. The Activity Diagrams provide the activity flow and the information flow to develop the decision data elements. Appendix A has definitions of all the decision data elements, and Appendix B has a Block Definition Diagram (BDD) for each of the ten steps that shows the relationships between the data elements. In addition, we illustrate the major data artifacts and provide references to the systems engineering and decision analysis literature.

3.1. DADM Conceptual Data Modeling

The foundation of DADM can be summarized via a data flow model, Figure 3, that identifies a sequence of decision management steps executed in the numerical order shown, without skipping any of the steps. The data flow model shows the inputs to the first step, the outputs from the last step, and arrows indicating the general flow of data between the steps. The figure lists the illustrative data artifacts generated for each step in the process. Each step describes an important activity and includes data artifacts used to inform the decision management team. A more detailed discussion of the data and the steps is provided in the subsequent sections. The data flow and the data artifacts can be reused, with appropriate modifications, in subsequent system life cycle stages.
In this section, we define the step name, describe its purpose, list alternative methods in the systems engineering literature, and provide guidance for tailoring each step to your digital engineering implementation.
Table 2 provides the name of each of the ten steps, the purpose of each step, and common methods used in other systems engineering decision models and the system engineering literature.
Table 3 provides important guidance on reuse and on tailoring each step of the decision management process for your organization (System 2) and your system (System 1). The quality of each step should be validated with the decision-maker(s) and key stakeholders. We do not recommend skipping any step, but your organization may choose to combine steps. For example, the Systems Decision Process [37] uses Problem Definition (Steps 1 and 2), Solution Design (Steps 3 and 7), Decision Making (Steps 5 and 8), and Solution Implementation (Steps 9 and 10).

3.2. Logical Data Modeling

3.2.1. Step 1: Frame Decision

Spetzler, Winter, and Meyer define the decision frame as a collection of artifacts that answer the question: “What problem (or opportunity) are we addressing?” This comprises three components: (1) our purpose in making the decision; (2) the scope, what will be included and excluded; and (3) our perspective, including our point of view, how we want to approach the decision, what conversations will be needed, and with whom [38]. Figure 4 uses an activity diagram to depict the logical process for defining a decision frame.
The frame decision process is described by the 8 steps (1.1 to 1.8) shown in Figure 4. The process uses inputs from past decisions and from subject matter experts (SMEs). The process ends with a well-formed decision frame. The first step is to 1.1 identify the stakeholders that need to be included in the decision at hand. Every decision is potentially unique and may require different stakeholders. By using interviews, surveys, and facilitated brainstorming sessions, the stakeholders’ needs are captured to help 1.2 set the decision context and 1.3 develop the vision that will define the decision purpose and help keep the decision analysis efforts focused. In addition to the context and vision, it is also important to 1.4 raise issues that may have been identified from the stakeholders, SME knowledge, or past decisions and to 1.5 categorize issues. The issues, stakeholder needs, context, and vision are used to help identify uncertainty, risk, and/or opportunity. This information is used to 1.6 develop and modify a decision hierarchy. It is also used to 1.7 develop/modify influence diagrams. The context and vision for the decision are combined with the decision hierarchy and influence diagram, resulting in the 1.8 decision frame.
As a result of performing the activities in the 1. Frame the Decision process, there are several data artifacts to capture and manage. The remainder of this section provides guidance on best practices for how to develop these data artifacts.
For 1.1 identify the stakeholders, a generic list of stakeholders to be considered by subject matter experts when identifying stakeholders for the decision is as follows [37]:
  • Decision authority. The stakeholder(s) with ultimate decision gate authority to approve and implement a system solution.
  • Client. The person(s) or organization(s) that solicit systems decision support for a project or program.
  • Owner. The person(s) or organization(s) responsible for proper and purposeful system operation.
  • User. The person(s) or organization accountable for proper and purposeful system operation.
  • Consumer. The person(s) or organization(s) that realizes direct or indirect benefits from the products or services provided by the system or its behavior.
  • Collaborator. The person(s) or organization(s) external to the system boundary that are virtually or physically connected to the system, its behaviors, its input, or output and who realize benefits, costs, and/or risks resulting from the connection.
According to the Systems Engineering Handbook [1], an external view of a system must introduce elements that specifically do not belong to the system but do interact with the system. This collection of elements is called the system environment and is a result of the 1.2 set the decision context process activity [1]. A generic list of environmental factors to be considered when setting the decision context is as follows [37]:
  • Technological. Availability of “off-the-shelf” technology. Technical, cost, and schedule risks of innovative technologies.
  • Economic. Economic factors such as a constrained program budget within which to complete a system development and impacts on the economic environment within which the system operates.
  • Political. Political factors such as external interest groups impacted by the system and governmental bodies that provide financing and approval.
  • Legal. National, regional and community legal requirements such as safety and emissions regulations, codes, and standards.
  • Health and Safety. The impact of a system on the health and safety of humans and other living beings.
  • Social. The impact on the way that humans interact.
  • Security. Securing the system, its products, and its services from harm by potential threats.
  • Ecological. Positive and negative impacts of the system on a wide range of ecological elements.
  • Cultural. Considerations that constrain the form and function of the system.
  • Historical. Impact on historical landmarks and facilities.
  • Moral/ethical. Moral or ethical issues that arise in response to the question: “Are we doing the right thing here?”
  • Organizational. Positive and negative impacts on the organization that is charged with making the decision.
  • Emotional. Personal preferences or emotional issues of decision makers or key stakeholders regarding the system of interest or potential system solutions.
According to Parnell and Tani, the 1.3 develop the vision process activity results in a vision statement that answers three questions [39]:
  • What are we going to do?
  • Why are we doing this?
  • How will we know that we have succeeded?
The vision statement refers to the process of making the decision, not the consequences of the decision on the cost, performance, and schedule of the system that is being engineered. So, the third question (“How will we know that we have succeeded?”) should be understood to mean success at the time the decision is taken, not when the final impact of the decision on the engineered system is known.
In the 1.4 raise issues process activity, issue raising brings to light the perspectives held by the stakeholders that can be used to view the decision situation. A well-formed set of issues considers the vision statement and the environmental factors that are related to the decision context. The next step is to categorize the issues into four groups as follows [39]:
  • Decisions. Issues suggesting choices that can be made as part of the decision.
  • Uncertainties. Issues suggesting uncertainties that should be considered when making the decision.
  • Values. Issues that refer to the measures with which decision alternatives should be compared.
  • Other. Issues not belonging to the other categories, such as those referring to the decision process itself.
The 1.5 categorize issues process activity and 1.6 develop and or modify a decision hierarchy provide the basis for defining the decision hierarchy, the uncertainties to be considered and analyzed, and the measures used in evaluating the decision alternatives.
Typically, the categorized issues are interrelated, and their interrelationships can be captured in 1.7 develop/modify influence diagrams, which is a network representation for decision modeling where the nodes correspond to variables that can be decisions, uncertain quantities, deterministic functions, or values [40]. Figure 5 captures the semantics and syntax of an influence diagram. Semantically, the boxes represent decisions, the ovals represent uncertain quantities known as chance nodes, the double oval represents a calculated quantity, the rounded rectangle represents a value measure, and the arrows represent different influences. The meaning of the arrows is determined syntactically:
  • An arrow that enters a decision represents information that is available at the time the decision is taken.
  • An arrow leading into a chance node represents a probabilistic dependency (or relevance) on the node from which the arrow is drawn.
  • An arrow leading into a calculated or value node represents a functional relevance to the receiving node from which the arrow is drawn. In this instance, the receiving node is calculated as a function of all the nodes that have arcs leading into the receiving node.
Syntactical rules of influence diagrams are:
  • No loops: It must not be possible to return to any node in the diagram by following arrows in the direction that they point.
  • One value measure: There must be one final value node that has no arrows leading out of it and that represents the single value measure or objective that is maximized to determine the optimal set of decisions.
  • No forgetting: Any information known when a decision is made is assumed to be known when a subsequent decision is made.
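Two of these syntactic rules can be checked mechanically on a digital representation of the diagram. The sketch below is an illustration only, using a simple adjacency-list encoding rather than any particular modeling tool's data format; the node names are hypothetical.

```python
# Illustrative checks of the "no loops" and "one value measure" rules on an
# adjacency-list encoding of an influence diagram. Node names are hypothetical.
nodes = {                                   # node -> kind
    "Energy technology": "decision",
    "Dwell time": "chance",
    "Mission value": "value",
}
arrows = [("Energy technology", "Dwell time"), ("Dwell time", "Mission value")]

def has_cycle(nodes, arrows):
    """Depth-first search for a cycle, which would violate the 'no loops' rule."""
    successors = {n: [] for n in nodes}
    for a, b in arrows:
        successors[a].append(b)
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {n: WHITE for n in nodes}
    def visit(n):
        color[n] = GRAY
        for m in successors[n]:
            if color[m] == GRAY or (color[m] == WHITE and visit(m)):
                return True
        color[n] = BLACK
        return False
    return any(visit(n) for n in nodes if color[n] == WHITE)

def single_value_sink(nodes, arrows):
    """Exactly one value node with no outgoing arrows ('one value measure' rule)."""
    sources = {a for a, _ in arrows}
    value_sinks = [n for n, kind in nodes.items() if kind == "value" and n not in sources]
    return len(value_sinks) == 1

assert not has_cycle(nodes, arrows)
assert single_value_sink(nodes, arrows)
```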
The decision hierarchy captures three levels to the decisions identified by categorizing the issues. The top level includes those choices that are out of scope because they are assumed to have been made already. The middle level of the decision hierarchy contains those choices that are within the scope of the decision under consideration. The lowest level of the decision hierarchy has choices that can be deferred until later or delegated to others.
The 1.8 decision frame process activity combines the context and vision for the decision with the decision hierarchy and influence diagram resulting in the decision frame. As defined in Figure 5, the decision frame has three components: (1) Purpose—what the decision is trying to accomplish; (2) Perspective—the point of view about this decision, consideration of other ways to approach it, how others might approach it; and the (3) Scope—what to include and exclude in the decision.

3.2.2. Step 2: Structure Objectives and Measures

We believe that it is a systems decision-making best practice to identify the decision objectives and measures used to evaluate attainment of the objectives before determining the alternatives [42]. Parnell et al. [43] provide a detailed discussion of the challenges and best practices for structuring objectives and measures.
Figure 6 provides the Logical Model activity diagram for Step 2, Structure Objectives and Measures. The decision frame provides important input information to 2.1 Understand Values, which guides 2.2 Identify Decision Objectives, 2.3 Develop Value Measures, and 2.4 Structure Objectives. Identifying objectives is more art than science. The major challenges are (1) identifying a full set of objectives and measures, (2) differentiating fundamental (i.e., what we care about achieving) and means (i.e., how we can achieve) objectives, and (3) structuring a comprehensive set of fundamental objectives and measures for validation by the decision maker(s) and stakeholders and use in the evaluation of alternatives [43].
The value measures and structured objectives are represented as a value hierarchy (also called objectives hierarchy or value tree) which is a graphical tool for structuring objectives and measures [42]. Value hierarchies are easier to understand and validate when they are logically structured [43]. For systems with multiple functions, the objectives can be structured by time sequenced functions. The value hierarchy is the output of this process step. Figure 7 provides an example of a value hierarchy organized by functions. If functions are not appropriate, the objectives can be structured by criteria.
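A value hierarchy organized by functions can be captured as simple nested data in a digital model. The sketch below is illustrative only; the functions, objectives, and measures are hypothetical and are not the contents of Figure 7.

```python
# Illustrative value hierarchy: function -> objectives -> value measures.
# The entries are hypothetical, not the contents of Figure 7.
value_hierarchy = {
    "Deploy system": {
        "Minimize setup time": "Minutes from arrival to first operation",
        "Maximize transportability": "Constructed 1-5 transportability scale",
    },
    "Perform mission": {
        "Maximize detection distance": "Kilometers at 50% detection probability",
        "Maximize endurance": "Hours on station",
    },
    "Sustain system": {
        "Minimize maintenance burden": "Maintenance hours per operating hour",
    },
}

for function, objectives in value_hierarchy.items():
    print(function)
    for objective, measure in objectives.items():
        print(f"  {objective}: {measure}")
```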
Decision makers and stakeholders are interested in four dimensions of value: performance of the system, cost of the system, the system schedule, and external impacts [44]. The best practice is to develop an objectives hierarchy for the performance and external impacts of the system, a financial model or a system life cycle cost model, and a system development schedule to be used to evaluate the alternatives developed in the next step in the logical model. Private companies may also develop a net present value model to assess the potential financial benefits of a new product(s) or service(s) that the system will provide to their customers.
Kirkwood identified two useful dimensions for value measures: alignment with the objective and type of measure [45]. Alignment with the objective can be direct or proxy. A direct measure focuses on the objective itself, e.g., the percentage of time the system is available as a measure of system availability. A proxy measures only part of the objective, e.g., reliability is a proxy for availability. The type of measure can be natural or constructed. Natural measures are commonly used, such as dollars. A constructed measure scale must be developed, such as a five-star scale for automobile safety. To be useful, constructed measures require clear definition of the measurement scales.
One measure should be used for each objective. Table 4 provides preferences for the four types of value measures [43]. Priorities 1 and 4 are obvious. We prefer direct and constructed to proxy and natural for two reasons. First, alignment with the objective is more important than the type of scale. Second, one direct, constructed measure can replace many natural and proxy measures.
The value hierarchy provides the qualitative value model. Developing the quantitative value model is described in 6.0 Evaluate Alternatives under activity 6.2, Determine Value Function and Weight for each Value Measure.

3.2.3. Step 3: Generate Creative Alternatives

Figure 8 uses an activity diagram to depict the logical process for generating creative alternatives. There are five process activities to accomplish Step 3, Generate Creative Alternatives. The five activities are performed across three phases: a preparation phase, a divergent phase, and a convergent phase. Our decision can only be as good as the best alternative we identify. The divergent phase is important to make sure that we identify and explore the full range of potential solutions. The convergent phase is important since we usually only have the resources to evaluate a limited number of alternatives.
The Preparation Phase uses the decision frame and value hierarchy as inputs to the 3.1 Communicate Context and Values process activity. It also takes SME knowledge as input to support the 3.2 Structure Decision process activity. In the Divergent Phase, risk and opportunity, along with decision, knowledge, record, and course of action, are used as inputs to support the 3.3 Identify Potential Architecture Options process activity, which provides options as output. In the Convergent Phase, constraints, risk, opportunity, and course of action are inputs to the 3.4 Evaluate Qualitative Value Space process activity, which provides courses of action as outputs. If the alternatives are not sufficiently diverse, they are modified and updated as new courses of action to consider for further evaluation. The output of this step is a set of selected alternatives and a list of dismissed alternatives.
The preparation phase ensures that alternatives are generated that are within the scope of the decision problem that is described in the decision frame and the alternatives provide the most potential value following two principles of Value-Focused Thinking [42]:
  • Specifying the meaning of the fundamental objective by defining value measures can suggest alternatives.
  • Identifying means that contribute to the achievement of the value measures can also indicate potential alternatives.
Figure 9 depicts how the divergent phase results in creating many ideas and how the convergent phase requires organized, analytic thinking to identify a smaller number of well-crafted, feasible alternatives that span the decision space.
Table 5 presents the assessment of the alternative generation techniques by identifying the life cycle stage appropriate for their use and their potential contributions in the divergent and convergent phases of alternative generation [44]. Several representative structured methods among the many available are brainstorming, Reversal, SCAMPER, Six Thinking Hats, Gallery, and Force Field. All the techniques are dependent on the expertise, diversity, and creativity of individual team members, on the effectiveness of the team leadership, and on the processes used.

3.2.4. Step 4: Identify Areas of Uncertainty

Figure 10 uses an activity diagram to depict the logical process for identifying areas of uncertainty. The decision frame, SME knowledge, the value hierarchy, the courses of action, and the decision record provide important inputs to the 4.1 to 4.6 Identify Uncertainty process activities. There are many sources of uncertainty in system development that impact the risk areas. Table 6 provides a mapping of the major sources of uncertainty to the areas of uncertainty. Table 6 is illustrative and is neither an exhaustive list nor a checklist. Additional sources of uncertainty and issues are provided in the Department of Defense Risk, Issue, and Opportunity (RIO) Management Guide for Defense Acquisition Programs [47] and in ISO/IEC/IEEE 16085:2021 [48]. Any of these sources may have been included in the 4.7 Evaluate Previous Areas of Uncertainty process activity. The 4.8 Compile List of Uncertainties process activity provides the complete list of uncertainties to be considered for this decision.

3.2.5. Step 5: Plan Evaluation Methods

Figure 11 uses an activity diagram to depict the logical process for planning evaluation methods for each measure of each alternative.
The inputs to the 12 process activities in Step 5, Plan Evaluation Methods, include Uncertainty, Decision Frame, Knowledge, Value Hierarchy, and Course of Action. Uncertainty is considered for each objective and option being evaluated. The Decision Frame is used to support 5.1 Review Previous Systems Analysis Plan and to 5.10 Look at the Quality of Information Gathered.
The Decision Frame and Domain Knowledge are used to 5.2 Define Fidelity Needed. Domain knowledge is used to 5.3 Review Models and Simulations and to provide knowledge to 5.11 Determine the Evaluation Methods that the models and simulations will need to support.
For each entry in the value hierarchy, and each option, the evaluation method is determined based on 5.4 Review Cost Objectives, 5.5 Review Performance Objectives, 5.6 Review Schedule Objectives, and 5.7 Review External Impact Objectives. There should be measures for each objective in the value hierarchy.
The courses of action in the 5.8 Review Process Alternatives and 5.9 Review Technology Alternatives activities are used in 5.11 Determine the Evaluation Method for Each Option.
We use the inputs of Uncertainty, Decision Frame, Fidelity, Knowledge, Objectives, and Options in 5.11 Determine the Evaluation Method for all Value Measures and Options to establish the evaluation approach for each option. We then use these approaches and each course of action in 5.12 Develop Process for Data, Models, and Simulations to develop the Evaluation Plan.
5.11 Determine the Evaluation Method for all Value Measures and Options process activity is the goal for Step 5, Plan Evaluation Methods. To accomplish this, determining the evaluation methods for all the value measures and costs and the courses of action can be represented as an influence diagram (Figure 12). The missions, scenarios, threats, and environments used for the evaluation are defined in the Frame the Decision step. Because of the “no forgetting” rule for influence diagrams, the selected items are treated as sources of uncertainty for the downstream decisions of selecting objectives and measures, courses of action, and the data analytics, modeling, simulations, test data, and operational data to be used for evaluation. These choices all lead to calculating quantified measures that are uncertain due to these measures being functions of uncertainties. The quantified measures are used to calculate the value achieved and the cost incurred for each course of action that is being evaluated.
Assessment flow diagrams may be used to trace functional performance objectives to design decisions and modeling elements. Figure 13 is an example of such a diagram, which can help individual subject matter experts understand how their expertise fits into the larger evaluation picture, where inputs to feed their model will be coming from and where their outputs will be used. The lead systems engineer/systems analyst can use these diagrams to organize, manage, and track COA evaluation activities. By showing the pedigree of the data used in the decision support model, these diagrams also give stakeholders confidence that the evaluation rests on a solid quantitative foundation.
Specifically for the Unmanned Aeronautical Vehicle example in Figure 13, the functional performance objectives outline what the system must achieve, including:
  • Mobility & Survivability: Be transportable, maneuverable, and capable of enduring missions while avoiding detection.
  • Detection Capabilities: Detect ground vehicles and humans both day and night.
  • Efficiency Metrics: Optimize life cycle costs, airframe weight, travel time, scanning time, platform endurance, and blue airspace deconfliction.
Modeling elements and design decisions influence how the objectives are met. They include:
  • Design Decisions: Airframe type, wingspan, engine type, sensor payload, EO/IR resolution and field of view, and operating altitude.
  • Modeling Techniques: Use of multiresolution models and detection models (EO and IR).
Calculation relationships visually map how these key parameters contribute to system costs and performance metrics.
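One lightweight way to record such an assessment flow digitally is a mapping from each value measure to the model that produces it and the design decisions that model consumes. The sketch below is illustrative; the measure, model, and input names loosely follow the UAV example but are not taken from Figure 13.

```python
# Illustrative assessment-flow mapping: each value measure is traced to the
# model that computes it and the design decisions / inputs the model needs.
# Names loosely follow the UAV example and are not taken from Figure 13.
assessment_flow = {
    "Detection distance (km)": {
        "model": "EO/IR detection model",
        "inputs": ["sensor payload", "EO/IR resolution and field of view", "operating altitude"],
    },
    "Platform endurance (h)": {
        "model": "Multiresolution performance model",
        "inputs": ["airframe type", "wingspan", "engine type"],
    },
    "Life cycle cost (USD k)": {
        "model": "Life cycle cost model",
        "inputs": ["airframe type", "sensor payload", "engine type"],
    },
}

# Trace each measure back through its model to the design decisions it depends on.
for measure, spec in assessment_flow.items():
    print(f"{measure} <- {spec['model']} <- {', '.join(spec['inputs'])}")
```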
Figure 12 indicates that uncertainties are dependent on the physical means (course of action). An example of how uncertainties are dependent on the choice of the course of action is shown in Table 7. The three courses of action have three different energy sources: (1) a Li-Ion Battery, (2) a Li-S Battery, and (3) JP-8 kerosene fuel. The uncertainty in the time that a UAV can dwell at an area of interest is dependent on the choice of energy technology and the models to predict the dwell time would be different. There also are two different payload data links: (1) fixed-antenna VHF and (2) an electronically steered phased array Ka band. Also, the uncertainty in the time that a UAV needs to send ISR data to the command station is dependent on the choice of technology and the models to predict the transmission time would be different.
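To illustrate the point that the uncertainty model itself depends on the selected course of action, the following sketch assigns a different dwell-time distribution to each energy source. The distributions and parameters are invented for illustration and are not engineering data for the technologies in Table 7.

```python
# Illustration of course-of-action-dependent uncertainty (cf. Table 7): the
# dwell-time model differs by energy source. Distributions and parameters
# are invented for illustration, not engineering data.
import numpy as np

rng = np.random.default_rng(2)

def dwell_time_hours(energy_source, size=10_000):
    if energy_source == "Li-Ion battery":
        return rng.triangular(1.0, 1.5, 2.0, size)   # narrow: well-characterized technology
    if energy_source == "Li-S battery":
        return rng.triangular(1.0, 2.0, 3.5, size)   # wider: less mature technology
    if energy_source == "JP-8 fuel":
        return rng.normal(6.0, 0.8, size)            # separate fuel-burn model
    raise ValueError(f"unknown energy source: {energy_source}")

for coa in ("Li-Ion battery", "Li-S battery", "JP-8 fuel"):
    samples = dwell_time_hours(coa)
    print(f"{coa:>14}: p10={np.percentile(samples, 10):.1f} h, "
          f"p90={np.percentile(samples, 90):.1f} h")
```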
Types of models that might be used for evaluation include descriptive models, predictive models, and prescriptive models [51]. Descriptive models explain relationships between previously observed states. Examples of descriptive models are catalog descriptions of COTS items that describe cost and performance as a function of the attributes of the items, e.g., pricing of cloud services as a function of processing throughput, software functionality, and data storage being purchased. Predictive models attempt to forecast future actions and resulting states, frequently employing forecast probabilities, seasonal trends, or even subject matter expert opinions (i.e., educated guesses). Prescriptive models seek COAs that, given an initial state, lead to the best anticipated ending state. Examples of prescriptive models are optimization algorithms used to simulate behavior of threats to the system of interest when evaluating system survivability.
According to Kempf [52], attributes of the analytic tools that come into play when selecting them for use in evaluation are as follows:
  • Analytical Specificity or Breadth;
  • Access to Data;
  • Execution Performance;
  • Visualization Capability;
  • Data Scientist Skillset;
  • Vendor Pricing;
  • Team Budget;
  • Sharing and Collaboration.

3.2.6. Step 6: Evaluate Alternatives

The Assessment Flow Diagram provides the data for the evaluation of alternatives. Figure 14 uses an activity diagram to depict the logical process for the evaluate alternatives activity.
Activity 6.1 Communicate Evaluation Processes to Participants communicates the assessment flow diagram plan to the SMEs and stakeholders who will perform the analysis to assess each value measure for each alternative. In Activity 6.2, when there are multiple objectives and a value measure for each objective, the most common value model used to evaluate the alternatives is the additive value model shown in Equations (1) and (2) [37,42,45,50], which calculates the value of each alternative. The single-dimensional value functions convert each alternative’s score on each measure to a dimensionless common value. The swing weights provide the relative value of the swing in the value measure from its lowest level to its highest level. The above references provide methods for assessing the value functions and the swing weights.
v(x) = \sum_{i=1}^{n} w_i v_i(x_i)    (1)
where, for a set of value measure scores given by vector x:
i = 1 to n is the index of the value measure;
x_i is the alternative’s score on the ith value measure (from activity 6.4 or 6.5);
v_i(x_i) is the single-dimensional value (y-axis) of the score x_i (x-axis);
w_i is the swing weight of the ith value measure;
v(x) is the alternative’s value;
and the swing weights are normalized:
\sum_{i=1}^{n} w_i = 1    (2)
Activity 6.2 develops the value functions and swing weights for each value measure. The Assessment Flow Diagram provides the data for the evaluation of alternatives (Activities 6.4 and 6.5). Activity 6.6 uses these data and Equation (1) to calculate the overall value for each alternative, v(x). Since value measures are aligned with fundamental objectives (and not means objectives), there is no index for the alternative in the two equations: the single-dimensional value functions and the swing weights do not depend on the alternatives (i.e., the means).
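A direct implementation of Equations (1) and (2) is sketched below. The value measures, swing weights, piecewise-linear value functions, and scores are hypothetical and are used only to show the calculation; they are not the study's data.

```python
# Sketch of the additive value model of Equations (1) and (2).
# Measures, swing weights, value functions, and scores are hypothetical.
import numpy as np

# Swing weights, normalized so they sum to 1 (Equation (2)).
raw_weights = {"detection_distance": 40, "endurance": 35, "cost_risk": 25}
total = sum(raw_weights.values())
weights = {m: w / total for m, w in raw_weights.items()}

# Single-dimensional value functions as piecewise-linear curves (score -> 0-100 value).
value_curves = {
    "detection_distance": ([0, 5, 10],   [0, 60, 100]),   # km
    "endurance":          ([2, 6, 12],   [0, 70, 100]),   # hours
    "cost_risk":          ([0, 50, 100], [100, 40, 0]),   # constructed scale
}

def single_dimensional_value(measure, score):
    xs, vs = value_curves[measure]
    return float(np.interp(score, xs, vs))

def additive_value(scores):
    """Equation (1): v(x) = sum_i w_i * v_i(x_i)."""
    return sum(weights[m] * single_dimensional_value(m, x) for m, x in scores.items())

alternative_scores = {"detection_distance": 8.0, "endurance": 7.0, "cost_risk": 30}
print(round(additive_value(alternative_scores), 1))
```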
MacCalman et al. [53] analyzed the conceptual design for future Army Infantry squad technologies to enhance Infantry squad effectiveness. An Infantry squad is a nine-person unit that consists of a squad leader and two teams of four. The Infantry squad system is the collection of integrated technologies that consists of soldiers, sensors, weapons, exoskeletons, body armor, radios, unmanned aerial vehicles (UAVs), and robots. The analysis used 6 design concepts and 13 value measures. We use this example to illustrate DADM Steps 6, 7, and 8. Figure 15 provides the value component chart [53] for the deterministic value of each alternative. The chart shows the value contribution based on the value model and the estimated value measure performance. The ideal alternative assumes the best value for each value measure, and the size of each value measure’s contribution is determined by the swing weights. The hypothetical best alternative is obtained by taking the best alternative score for each value measure. This chart shows that the alternatives are all improvements over the baseline and that Performance is close to the Hypothetical Best Alternative.
Figure 16 provides the cost vs. value chart [53] for this analysis. This figure shows that Sustainable and Survivable are dominated by Attack and LongRange, respectively. The non-dominated alternatives have a range of value from about 62 to 78 for a life cycle cost of USD 38 to 82k.
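Deterministic dominance in a cost-versus-value tradespace can be screened directly from the evaluation results. The sketch below uses hypothetical cost and value numbers (not the study's results) and flags any alternative that is dominated by one costing no more and delivering at least as much value.

```python
# Sketch of deterministic dominance screening on a cost-vs-value tradespace.
# Cost (USD k) and value numbers are hypothetical, not the study's results.
alternatives = {
    "Baseline":    {"cost": 30, "value": 45},
    "Sustainable": {"cost": 55, "value": 60},
    "Attack":      {"cost": 50, "value": 66},
    "Survivable":  {"cost": 80, "value": 72},
    "LongRange":   {"cost": 75, "value": 76},
    "Performance": {"cost": 82, "value": 78},
}

def dominated(name):
    """True if another alternative costs no more, has at least as much value,
    and is strictly better on at least one of the two."""
    a = alternatives[name]
    for other, b in alternatives.items():
        if other == name:
            continue
        if (b["cost"] <= a["cost"] and b["value"] >= a["value"]
                and (b["cost"] < a["cost"] or b["value"] > a["value"])):
            return True
    return False

non_dominated = [n for n in alternatives if not dominated(n)]
print(non_dominated)  # with these toy numbers, Sustainable and Survivable drop out
```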
Section 3.2.7 provides probabilistic analysis and illustrates how it could be used to improve the alternatives. Section 3.2.8 will use the figures in Section 3.2.6 and Section 3.2.7 to describe the trade-offs between the alternatives.

3.2.7. Step 7: Improve Alternatives

Figure 17 uses an activity diagram to depict the logical process for improving alternatives.
According to the Department of Defense Risk, Issue, and Opportunity (RIO) Management Guide for Defense Acquisition Programs [47]:
  • A risk is a potential future event or condition that may have a negative effect on achieving program objectives for cost, schedule, and performance. A risk is defined by (1) the likelihood that an undesired event or condition will occur and (2) the consequences, impact, or severity of the undesired event, were it to occur. An issue is an event or condition with negative effect that has occurred (such as a realized risk) or could occur that should be addressed.
  • An opportunity offers potential future benefits to the program’s cost, schedule, or performance baseline.
It should be noted that the negative effects of risks and the benefits of opportunities are the result of outcomes that are uncertain events.
A value-based quantitative approach to identifying risks and opportunities is described by MacCalman, Parnell, and Savage [53] for the system design problem that involves the development of new infantry squad technologies. After evaluating the alternatives, a stochastic Pareto chart of cost versus value can be created as shown in Figure 18, which is a bivariate box-and-whisker plot for each alternative’s cost and value. The boxes (the thick lines) along each axis represent the second and third quartiles of the output data from the cost and value models used for evaluating the courses of action, while the lines (whiskers) represent the first and fourth quartiles. Based on the stochastic Pareto chart, Sustainable is stochastically dominated by all other alternatives and can be eliminated from consideration. The first step in improving the non-dominated alternatives is to identify the risks and opportunities based on the stochastic Pareto chart. LongRange has a large downside risk for low value, and Defendable may have opportunities for cost savings. Pursuing a cost-effective risk treatment for LongRange would be an improvement to LongRange. Similarly, investigation of the source of cost uncertainty for Survivable and LongRange may yield an opportunity to reduce the cost uncertainty.
To gain further understanding of the risks and opportunities, cumulative distribution function (CDF) charts can be developed using the output data from the cost and value models used for evaluating the courses of action. Figure 19 shows a CDF chart with six S-curves that represent the uncertain value outcomes of the alternatives. Sustainable is stochastically dominated by the other five alternatives because its CDF is to the left of the others; likewise, Attack is stochastically dominated by Survivable, Defendable, and Performance. The Performance and Defendable alternatives stochastically dominate the remaining alternatives because their S-curves are positioned completely to the right of the others. The value risk profiles of Survivable and LongRange cross, indicating that there is no clear winner between the two; we may want to accept a higher cost by choosing Survivable to mitigate the risk associated with LongRange. The CDF chart tells us that there is a significant chance that Survivable will outperform LongRange and that Survivable has less risk due to its steeper risk profile.
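The dominance relationships read off the S-curves in Figure 19 can also be checked numerically: for a value measure where more is better, alternative A first-order stochastically dominates alternative B if A's empirical CDF is at or below B's at every value level. The sketch below performs this check on synthetic samples; the distributions are assumptions for illustration, not the data behind Figure 19.

```python
import random

def cdf_at(samples, x):
    """Empirical CDF: fraction of samples less than or equal to x."""
    return sum(1 for s in samples if s <= x) / len(samples)

def stochastically_dominates(a, b, grid_points=200):
    """True if A first-order stochastically dominates B on a value measure
    (more is better): F_A(x) <= F_B(x) at every checked value level."""
    lo, hi = min(a + b), max(a + b)
    grid = [lo + i * (hi - lo) / grid_points for i in range(grid_points + 1)]
    return all(cdf_at(a, x) <= cdf_at(b, x) for x in grid)

# Synthetic value distributions for two hypothetical courses of action.
random.seed(2)
coa_high = [random.gauss(75, 5) for _ in range(2_000)]
coa_low = [random.gauss(55, 5) for _ in range(2_000)]
print(stochastically_dominates(coa_high, coa_low))  # expect True
print(stochastically_dominates(coa_low, coa_high))  # expect False
```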
Tornado diagrams allow us to compare the relative importance of each uncertain input variable. Figure 20 provides tornado charts of the stochastic value and cost outputs for the LongRange and Survivable courses of action. The bars are sorted so that the longest bars are at the top; sorting the bars in this way makes the diagram look like a tornado. The left end of each bar is the average of the output from the value or cost model when the input is fixed at its 30th percentile value, and the right end of each bar is the average of the output when the input is fixed at its 70th percentile value. In Figure 20, the input variables for value are the value measures, and the input variables for cost are the cost components. In addition to the tornado charts, Figure 20 traces the value measures to the system functions; the functions to the subsystems that perform them; and the subsystems to the components that are inputs to the cost model. From the tornado charts, it can be determined that the radio drives the value risk for the LongRange course of action while the soldier sensor drives its cost uncertainty. The protective suit drives the value risk for the Survivable course of action, while the soldier sensor drives its cost uncertainty. These insights clarify what drives risk and uncertainty when improving the alternatives. For example, the risk of the LongRange course of action might be reduced by investing more resources in improving the radio's range, security, and bandwidth features. For both courses of action, there may be an opportunity to reduce the cost and its uncertainty by pursuing an off-the-shelf lantern that performs adequately in terms of Beyond Line-of-Sight detection and Detection Distance.
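Each tornado bar is produced by fixing one uncertain input at a low and a high percentile while the remaining inputs keep their sampled values, and averaging the model output in each case; the bar spans the two averages. The sketch below illustrates this for a generic two-input model; the model, input names, and sample counts are placeholders, and the 30th/70th percentile settings simply mirror the description above rather than the exact procedure of [53].

```python
import random
import statistics

def tornado_bar(model, samples, fixed_input, low_pct=0.30, high_pct=0.70):
    """Return the (low, high) ends of one tornado bar.

    `samples` is a list of dicts of sampled input values; `fixed_input` is
    held at its low/high percentile while the other inputs keep their
    sampled values, and the model output is averaged over all samples.
    """
    values = sorted(s[fixed_input] for s in samples)
    low_val = values[int(low_pct * (len(values) - 1))]
    high_val = values[int(high_pct * (len(values) - 1))]
    def mean_with(fixed_value):
        return statistics.mean(
            model({**s, fixed_input: fixed_value}) for s in samples)
    return mean_with(low_val), mean_with(high_val)

# Placeholder model: output depends more strongly on 'radio' than 'sensor',
# so the 'radio' bar will be longer, as in the narrative above.
model = lambda x: 0.7 * x["radio"] + 0.3 * x["sensor"]
random.seed(3)
samples = [{"radio": random.uniform(40, 100), "sensor": random.uniform(40, 100)}
           for _ in range(5_000)]
for name in ("radio", "sensor"):
    print(name, tornado_bar(model, samples, name))
```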
Additional information on developing risk and opportunity treatments is provided in the Department of Defense Risk, Issue, and Opportunity (RIO) Management Guide for Defense Acquisition Programs [47] and in ISO/IEC/IEEE 16085:2021 [48].

3.2.8. Step 8: Assess Trade-Offs

Figure 21 uses an activity diagram to depict the logical process model for Assess Trade-offs. This step uses the analyses provided in the two previous steps.
The deterministic tradespace is provided in the value component chart in Figure 15 and the cost vs. value chart in Figure 16. The cost vs. value chart shows that Sustainable and Survivable are dominated by Attack and LongRange, respectively. The non-dominated alternatives have a range of value from about 62 to 78 for a life cycle cost of approximately USD 38k to 82k.
The probabilistic tradespace is shown in Figure 18 and Figure 19. These figures show that there is more variation in value than in life cycle cost. In addition, there is significant variation in the value of the Performance alternative. Also, LongRange does not dominate Survivable. These insights reinforce the importance of performing probabilistic analysis.

3.2.9. Step 9: Communicate Recommendations and Implementation Plan

Figure 22 uses an activity diagram to depict the logical process model for Communicate Recommendations and Implementation Plan.
Table 8 provides the rationale for including each of the data elements in the Communicate Recommendations and Implementation Plan process. It is important to remember that major stakeholders and decision makers are busy, and project managers/systems analysts/systems engineers must provide clear and concise communications [54]. In addition, organizational guidance and individual leader preferences should be considered in developing decision-making documents and presentations.

3.2.10. Step 10: Make Decision

Major decisions usually involve major stakeholders and decision-makers at multiple decision levels. DADM provides several digital artifacts that support the decision-making process. The most common artifacts are shown in Figure 23, the activity diagram for the logical process model for Make Decision. Of course, any DADM data artifact is digitally available for use in the decision-making process.
It is important to remember that major stakeholders and decision makers are very busy, and project managers/systems analysts/systems engineers must provide clear and concise communications [53,54]. In addition, organizational guidance and individual leader preferences should be considered in developing decision-making documents and presentations. The data elements in the Make Decision Process that should be communicated to provide the rationale are as follows (a minimal data-structure sketch follows the list):
  • Decision Hierarchy: Decision-makers are busy. It is important to identify the major decisions that have been made, the decision(s) to be made and major future decisions. It may be necessary to review the decision frame.
  • Value Hierarchy: The system objectives include performance, cost, and schedule. The value hierarchy includes the performance objectives and value measures.
  • Major Requirements: Decision-makers may want to verify that the recommended COA meets the major system requirements.
  • Courses of Action (COAs): The decision-makers and key stakeholders need to understand the COAs that were identified and evaluated. It is common for decision-makers or key stakeholders to have preferences for one or more COAs. They may distrust the systems analysis or decline to decide if their preferred COA is not included.
  • Tradespace: The decision-makers and key stakeholders are usually interested in knowing the COA tradespace, the dominated COAs and the trade-offs between the non-dominated COAs. Decision-makers and key stakeholders typically want to know why their preferred COAs are not recommended.
  • Uncertainty: Decision-makers and key stakeholders know that system development involves many uncertainties (See Step 4). They will want to understand the major remaining uncertainties that create future risks and opportunities.
  • Decision (Selected COA): The recommended COA, the rationale for the decision, and the estimated cost, schedule, and performance of that COA.
  • (Decision) Rationale: A clear and concise summary of the non-dominated COAs and the rationale for the recommended COA.
  • (Implementation) Plan: The implementation plan usually includes the key program personnel, responsibilities, milestones, major risks, and risk reduction activities.
  • (Decision) Record: The decision record should include the decision made and the rationale for that decision. The record may include other DADM data based on organizational and program requirements.
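As the purely illustrative sketch referenced above, the dataclass below groups the data elements listed here into a single digital decision record; the class and field names are our own placeholders and do not represent a published DADM schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DecisionRecord:
    """Illustrative container for the Make Decision data elements listed
    above; field names are placeholders, not a published DADM schema."""
    decision_statement: str
    decision_hierarchy: List[str]       # decisions made, at hand, and deferred
    value_hierarchy: List[str]          # objectives and their value measures
    major_requirements: List[str]
    courses_of_action: List[str]
    tradespace_summary: str             # dominated vs. non-dominated COAs
    remaining_uncertainties: List[str]  # residual risks and opportunities
    selected_coa: str
    rationale: str
    implementation_plan: str
    approvals: List[str] = field(default_factory=list)

# Hypothetical content for illustration only.
record = DecisionRecord(
    decision_statement="Select the preferred course of action for System X",
    decision_hierarchy=["Capability need (made)", "COA selection (now)",
                        "Production quantity (later)"],
    value_hierarchy=["Objective 1: value measures A, B",
                     "Objective 2: value measure C"],
    major_requirements=["Requirement R1", "Requirement R2"],
    courses_of_action=["COA-1", "COA-2", "COA-3"],
    tradespace_summary="COA-2 and COA-3 are non-dominated on cost vs. value",
    remaining_uncertainties=["Technology maturity of subsystem S"],
    selected_coa="COA-3",
    rationale="Highest value among the non-dominated COAs at acceptable cost",
    implementation_plan="Milestones, responsibilities, risk reduction actions",
)
print(record.selected_coa, "-", record.rationale)
```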

4. Discussion

The DADM can be used in the Concept Stage and reused to inform life cycle decisions throughout the system life cycle. In Table 9, we use the Generic Life Cycle [1,33] to identify decision opportunities to improve system value that are commonly encountered throughout a system's life cycle. Many of these decisions would benefit from a DADM that integrates the data produced by performance, value, cost, and schedule models in forms that are meaningful to the decision makers and stakeholders. The table also lists the types of data that would be available in DADM for each life cycle stage.
Looking ahead, the DADM has a clear path for incremental development through a series of targeted future activities. The next step is the development of example implementations and case studies. By piloting DADM in real-world contexts—such as system acquisitions, technology selection, or lifecycle planning—we intend to demonstrate its practical value, collect lessons learned, and gather specific feedback to further refine the model. These case studies will help build a stronger evidence base for DADM’s utility and support broader adoption by offering concrete examples and guidance for potential users.
We will continue iterating on the model alongside this effort, aiming for the release of DADM v2.0 in late 2025. This version will include a physical data model, shaped by insights from pilot use cases, and will serve as a detailed reference for organizations evaluating DADM. Maintaining this feedback-driven refinement process will help ensure the model continues to meet the needs of the systems engineering community.
Technical integration is another near-term opportunity. Aligning DADM with SysML v2 and BPMN will promote interoperability with contemporary MBSE tools and strengthen traceability from requirements through decision analytics. Another active area of exploration is the incorporation of artificial intelligence (AI)—especially Large Language Models (LLMs)—to assist with data ingestion, recommendation generation, and automation of alternative evaluation or risk assessment. A longer-term goal is to enable auditable, AI-enabled decision support by capturing rationale and potential bias, thereby supporting regulatory and ethical compliance.
In parallel, pursuing formal standardization remains important. As we validate and improve DADM through pilot projects, workshops, and collaboration with industry, steps toward a standards designation will encourage wider, more consistent adoption across organizations. Through these targeted activities, DADM will continue to advance toward enabling data-driven, transparent, and auditable decision-making in MBSE and digital engineering environments.

Author Contributions

Conceptualization, D.C., G.S.P. and C.R.K.; methodology, D.C., C.R.K., G.S.P. and F.S.; software, J.S. and C.N.; validation, C.R.K.; formal analysis, G.S.P., C.R.K. and D.C.; investigation, G.S.P. and C.R.K.; resources, D.C. and J.S.; data curation, C.N.; writing—original draft preparation, G.S.P., C.R.K. and D.C.; writing—review and editing, F.S. and J.S.; visualization, J.S., C.N. and D.C.; supervision, D.C., G.S.P. and C.R.K.; project administration, J.S., S.D. and F.S.; conference funding acquisition, F.S. All authors have read and agreed to the published version of the manuscript.

Funding

INCOSE (www.incose.org) Technical Operations provided travel funding to present DADM at two professional conferences.

Data Availability Statement

The data presented in this study are available on request from the corresponding author due to restrictions related to membership access to the INCOSE Teamwork Cloud environment.

Acknowledgments

We would like to acknowledge the advice of several INCOSE members, including William Miller, INCOSE Technical Operations, the INCOSE Impactful Products Committee, the SE Lab managers, and the members of the Decision Analysis Working Group who provided important comments on DADM during its development.

Conflicts of Interest

Authors Devon Clark, Jared Smith, Chiemeke Nwobodo and Sheena Davis were employed by Deloitte Consulting LLP. Author Frank Salvatore was employed by SAIC. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Abbreviations

The following acronyms are used in this manuscript:
CDF: Cumulative Distribution Function
COA: Course of Action
DADM: Decision Analysis Data Model
DAMA: Data Management Association
DAWG: INCOSE Decision Analysis Working Group
DMBoK: Data Management Body of Knowledge
INCOSE: International Council on Systems Engineering
ISO: International Organization for Standardization
MBSE: Model-Based Systems Engineering
MOE: Measure of Effectiveness
MOP: Measure of Performance
SEBoK: Systems Engineering Body of Knowledge
SH: Stakeholder
SME: Subject Matter Expert
SysML: Systems Modeling Language

Appendix A. DADM Definitions

We used the following sources for the DADM definitions, in priority order: ISO standards, e.g., ISO/IEC/IEEE 15288-2023 [33]; handbooks, e.g., the INCOSE Systems Engineering Handbook [1]; bodies of knowledge, e.g., the SEBoK [34] and the PMBOK Guide [56]; systems engineering textbooks, e.g., [55]; refereed papers; and the Merriam-Webster Dictionary [57].
Table A1. DADM Definitions.
Analysis: A detailed examination of anything complex in order to understand its elements, essential features, or their relationships [57].
Approach: The taking of preliminary steps toward a particular purpose [57].
Assumption: A factor in the planning process that is considered to be true, real, or certain, without proof or demonstration [56].
Baseline: An agreed-to description of the attributes of a product at a point in time, which serves as the basis for defining change [1].
Component: A constituent part [57].
Conceptual: The highest level of data organization that outlines entities and relationships without implementation details [32].
Conceptual Data Model: Provides a framework for understanding data requirements, expressed in terms of entities and relationships to identify necessary representations [32].
Conceptual Process: An abstract representation outlining the high-level steps and functions involved in a given process [34].
Configuration Management: A discipline applying technical and administrative direction and surveillance to: identify and document the functional and physical characteristics of a configuration item, control changes to those characteristics, record and report change processing and implementation status, and verify compliance with specified requirements [34].
Constraint: A limiting factor that affects the execution of a project, program, portfolio, or process [56].
Context: An external view of a system must introduce elements that specifically do not belong to the system but do interact with the system. This collection of elements is called the system environment or context [1].
Cost: Cost is a key system attribute. Life Cycle Cost is defined as the total cost of a system over its entire life [1].
Course of Action: A means available to the decision maker by which the objectives may be attained [58].
Current Condition: A state of physical fitness or readiness for use [57].
Data: System element that can be implemented to fulfill system requirements [1].
Data Element: The smallest units of data that have meaning within a context, typically representing a single fact or attribute [32].
Data Model: A conceptual representation of the data structures that are required by a database, defining how data is connected, stored, and retrieved [32].
Decision: An irrevocable allocation of resources [59].
Decision Authority: The stakeholder(s) with ultimate decision gate authority to approve and implement a system solution [37].
Decision Frame: A collection of artifacts that answer the question: "What problem (or opportunity) are we addressing?" It is comprised of three components: (1) our purpose in making the decision; (2) the scope, what will be included and excluded; and (3) our perspective, including our point of view, how we want to approach the decision, what conversations will be needed, and with whom [38].
Decision Hierarchy: The specific decisions at hand are divided into three groups: (1) those that have already been made, which are taken as given; (2) those that need to be focused on now; and (3) those that can be made either later or separately [38].
Decision Management: The methods and processes involved in making informed and effective decisions regarding actions and strategies [1].
Descriptive Data: What happened, e.g., historical data [60].
Dimension of Performance: A quantitative measure characterizing a physical or functional attribute relating to the execution of a process, function, or activity [1].
Dimensions of Value: The three most common dimensions of value are performance, financial/cost, and schedule [54].
Evaluation: Determination of the value, nature, character, or quality of something or someone [57].
External Impacts: Influences or effects originating outside a specific system or entity, which can be positive, negative, or neutral, and significantly impact the behavior and outcomes of the system [61].
Function: A function is defined by the transformation of input flows to output flows, with defined performance [34].
Influence Diagram: An influence diagram is a network representation for probabilistic and decision analysis models. The nodes correspond to variables which can be constants, uncertain quantities, decisions, or objectives [40].
Information: Knowledge obtained from investigation, study, or instruction [57].
Integration: A process that combines system elements to form complete or partial system configurations in order to create a product specified in the system requirements [34].
Interface: A shared boundary between two systems or system elements, defined by functional characteristics, common physical characteristics, common physical interconnection characteristics, signal characteristics, or other characteristics [1].
Issue: A current condition or situation that may have an impact on the project objectives [56].
Iteration: The action or process of repeating [57].
Knowledge: A mixture of experience, values and beliefs, contextual information, intuition, and insight that people use to make sense of new experiences and information [56].
Life Cycle: The stages through which a system progresses from conception through retirement [33].
Life Cycle Cost: Total cost of a system over its entire life [1].
Lifecycle Stage: A period within the lifecycle of an entity that relates to the state of its description or realization [1].
Logical: A framework detailing how data elements will relate and interact, focusing on attributes and data integrity without physical storage considerations [32].
Logical Data Model: An expanded version of the conceptual model detailing entities, attributes, and relationships while remaining agnostic to physical storage [32].
Logical Process: A detailed specification of the sequence and interactions of activities in a process without concern for its implementation [34].
MBSE: Model-Based Systems Engineering, an approach that uses models to support systems engineering processes [34].
Measure: A variable to which a value is assigned as the result of the process of experimentally obtaining one or more quantity values that can reasonably be attributed to a quantity [62].
Method: A way, technique, or process of or for doing something [57].
Model: A physical, mathematical, or otherwise logical representation of a system, entity, phenomenon, or process [63].
Model-based: An approach that uses models as the primary means of communication and information exchange in systems development [34].
Modularity: Degree to which a system or computer program is composed of discrete components such that a change to one component has minimal impact on other components [34].
Objective: Something toward which work is to be directed, a strategic position to be attained, a purpose to be achieved, a result to be obtained, a product to be produced, or a service to be performed [56].
Opportunity: Potential future benefits to the program's cost, schedule, and/or performance baseline [47].
Performance: A quantitative measure characterizing a physical or functional attribute relating to the execution of a process, function, activity, or task [1].
Perspective: A mental view or prospect [57].
Physical: Specifies the actual implementation of the data model in a database, including tables, columns, data types, and constraints [32].
Physical Data Model: Translates the logical data model into a physical database schema, specifying how data is stored and accessed, including database tables and field specifications [47].
Plan: A proposed means of accomplishing something [56].
Predictive Data: Data that predicts future probabilities and trends, finds relationships in data [64].
Prescriptive Data: Identifies and evaluates potentially new ways to operate, targets business objectives, balances all constraints [64].
Process: A set of interrelated or interacting activities that transforms inputs into outputs [33].
Process Option: A set of interrelated or interacting activities that transforms inputs into outputs [1].
Processes: Sets of related tasks that lead to a specific outcome, often organized in a sequential manner [33].
Purpose: What the system is for, and why the different stakeholders are willing to participate in the system lifecycle [65].
Rationale: An explanation of controlling principles of opinion, belief, practice, or phenomena [57].
Redundancy: In fault tolerance, the presence of auxiliary components in a system to perform the same or similar functions as other elements for the purpose of preventing or recovering from failures [34].
Requirement: Statement that identifies a product or process operational, functional, or design characteristic or constraint, which is unambiguous, testable or measurable, and necessary for product or process acceptability [1].
Responsibility: Something for which one is responsible [57].
Risk: A potential future event or condition that may have a negative effect on achieving program objectives for cost, schedule, and performance. Risks are defined by (1) the probability (greater than 0, less than 1) of an undesired event or condition and (2) the consequences, impact, or severity of the undesired event, were it to occur [47].
Scalability: The ability of something, especially a computer system, to adapt to increased demands [34].
Scenario: A set of actions/functions representing the dynamic of exchanges between the functions allowing the system to achieve a mission or a service [34].
Schedule: A procedural plan that indicates the time and sequence of each operation [57].
Scope: The goals, processes, resources, facilities, activities, outputs and outcomes an organization is responsible for [34].
Simulation: A model that behaves or operates like a given system when provided a set of controlled inputs [66].
Stakeholder: A stakeholder in an organization is any group or individual who can affect or is affected by the achievement of the organization's objectives [67].
Stakeholder Concern: Matter of interest or importance to a stakeholder [33].
Stakeholder Need: Needs are well-formed textual statements of expectations for an entity stated in a structured, natural language from the perspective of what the stakeholders need the entity to do, in a specific operational environment, communicated at a level of abstraction appropriate to the level at which the entity exists [68].
Stakeholder Values: Values of a stakeholder; see Value and Stakeholder.
Story: An account of incidents or events [57].
Strategy: An adaptation or complex of adaptations (as of behavior, metabolism, or structure) that serves or appears to serve an important function in achieving evolutionary success [57].
Subsystem: A system that is part of a larger system [57].
SysML: Systems Modeling Language, a visual modeling language used to specify, analyze, and design systems [34].
System: A system is an arrangement of parts or elements that together exhibit behavior or meaning that the individual constituents do not [34].
System Architecture: The fundamental concepts or properties of an entity in its environment and governing principles of the realization and evolution of the entity and its related life cycle processes [1].
System Element: A member of a set of elements that constitutes a system. A system element is a discrete part of a system that can be implemented to fulfill specified requirements. A system element can be hardware, software, data, humans, processes (e.g., processes for providing service to users), procedures (e.g., operator instructions), facilities, materials, and naturally occurring entities (e.g., water, organisms, minerals), or any combination [34].
System Environment: The surroundings (natural or man-made) in which the system-of-interest is utilized and supported; or in which the system is being developed, produced or retired [1].
Technology Option: Technology is "a manner of accomplishing a task especially using technical processes, methods, or knowledge"; option is "an alternative course of action" [57].
Tools: (1) A software product that provides support for software and system life cycle processes [69]; (2) something tangible, such as a template or software program, used in performing an activity to produce a product or result [56].
Trade-off: A situation where one must balance benefits against costs or risks in decision-making processes [57].
Trade-off: Decision-making actions that select from various alternatives based on net benefit to stakeholders, e.g., to make a trade-off [1].
Tradespace: A tradespace is a multidimensional space that defines the context for the decision, defines bounds to the region of interest, and enables Pareto optimal solutions for complex, multiple stakeholder decisions [55].
Uncertainty: The state, even partial, of deficiency of information related to understanding or knowledge of an event, its consequence, or likelihood [33].
Use Case: Description of the behavioral requirements of a system and its interaction with a user [34].
Validation: Confirmation, through the provision of objective evidence, that the requirements for a specific intended use or application have been fulfilled [1].
Value: Regard that something is held to deserve; the importance, worth, or usefulness of something to somebody [70].
Value Function: A function that converts a value measure to a normalized value [54].
Value Hierarchy: The functional value hierarchy is a combination of the functional hierarchy from systems engineering and the objectives hierarchy from decision analysis [43].
Value Measure: A metric that measures the attainment of an objective that we care about [54].
Value Weight: The swing weight assigned to a value measure in the additive value model [54].
Verification: Confirmation, through the provision of objective evidence, that specified requirements have been fulfilled [1].
Vision Statement: A vision statement is an effective way to make sure that there is agreement on the purpose of the decision among the decision makers and key stakeholders. The vision statement answers three questions: What are we going to do? Why are we doing this? How will we know that we have succeeded? [39]

Appendix B. Block Definition Diagrams for the Logical Process Model

This appendix provides a Block Definition Diagram (BDD) for each of the ten steps of the Logical Process Model, showing the relationships between the data elements.
Figure A1. Logical Data Model: Block Definition Diagram for Step 1, Frame Decision.
Figure A2. Logical Data Model: Block Definition Diagram for Step 2, Structure Objectives and Measures.
Figure A3. Logical Data Model: Block Definition Diagram for Step 3, Generate Creative Alternatives.
Figure A4. Logical Data Model: Block Definition Diagram for Step 4, Identify Sources of Uncertainty.
Figure A5. Logical Data Model: Block Definition Diagram for Step 5, Plan Evaluation Methods.
Figure A6. Block Definition Diagram for Evaluate Alternatives.
Figure A7. Logical Data Model: Block Definition Diagram for Step 7, Improve Alternatives.
Figure A8. Logical Data Model: Block Definition Diagram for Step 8, Assess Trade-offs.
Figure A9. Logical Data Model: Block Definition Diagram for Step 9, Communicate Recommendations and Implementation Plan.
Figure A10. Logical Data Model: Block Definition Diagram for Step 10, Make Decision.

References

  1. Walden, D.D.; Shortell, T.M.; Roedler, G.J.; Delicado, B.A.; Mornas, O.; Yew-Seng, Y.; Endler, D. (Eds.) Systems Engineering Handbook: A Guide for System Life Cycle Processes and Activities, 5th ed.; INCOSE Systems Engineering Handbook; Wiley: Hoboken, NJ, USA, 2023; ISBN 978-1-119-81430-6. [Google Scholar]
  2. Systems Engineering Vision 2035; International Council on Systems Engineering: San Diego, CA, USA, 2021.
  3. MIL-STD-499B; Systems Engineering. Department of Defense: Washington, DC, USA, 1994.
  4. 6.8 Decision Analysis. In NASA Systems Engineering Handbook; NASA: Washington, DC, USA, 2019.
  5. Bass, L.; Clements, P.C.; Kazman, R. Software Architecture in Practice, 2nd ed.; SEI series in software engineering; Addison-Wesley: Boston, MA, USA, 2003; ISBN 9786612433177. [Google Scholar]
  6. Integrated Computer Aided Manufacturing (ICAM) Architecture; Materials Laboratory (AFWAL/MLTC), AF Wright Aeronautical Laboratories (AFSC): Wright-Patterson AFB, OH, USA, 1981.
  7. FIPS PUB 183; Integration Definition for Function Modeling (IDEFO). National Institute of Standards and Technology: Washington, DC, USA, 1993.
  8. FIPS PUB 184; Integration Definition for Function Modeling (IDEF1). National Institute of Standards and Technology: Washington, DC, USA, 1993.
  9. Williams, T.J. The Purdue Enterprise Reference Architecture. Comput. Ind. 1994, 24, 141–158. [Google Scholar] [CrossRef]
  10. ANSI/ISA-95.00.01-2025; Enterprise-Control System Integration—Part 1: Models and Terminology. International Society of Automation: Durham, NC, USA, 2025.
  11. ANSI/ISA-95.00.02-2018; Enterprise-Control System Integration—Part 2: Objects and Attributes for Enterprise-Control System Integration. International Society of Automation: Durham, NC, USA, 2018.
  12. ANSI/ISA-95.00.03-2013; Enterprise-Control System Integration—Part 3: Activity Models of Manufacturing Operations Management. International Society of Automation: Durham, NC, USA, 2013.
  13. ANSI/ISA-95.00.04-2018; Enterprise-Control System Integration—Part 4: Objects and Attributes for Manufacturing Operations Management Integration. International Society of Automation: Durham, NC, USA, 2018.
  14. ANSI/ISA-95.00.05-2018; Enterprise-Control System Integration—Part 5: Business-to-Manufacturing Transactions. International Society of Automation: Durham, NC, USA, 2018.
  15. ANSI/ISA-95.00.06-2014; Enterprise-Control System Integration—Part 6: Messaging Service Model. International Society of Automation: Durham, NC, USA, 2014.
  16. ANSI/ISA-95.00.07-2017; Enterprise-Control System Integration—Part 7: Alias Service Model. International Society of Automation: Durham, NC, USA, 2017.
  17. ANSI/ISA-95.00.08-2020; Enterprise-Control System Integration—Part 8: Information Exchange Profiles. International Society of Automation: Durham, NC, USA, 2020.
  18. Long, J.E. Relationships Between Common Graphical Representations Used in System Engineering; International Council on Systems Engineering: San Diego, CA, USA, 1995; Volume 5, pp. 973–979. [Google Scholar]
  19. Unified Modeling Language Version 2.5.1. Available online: https://www.omg.org/spec/UML/2.5.1 (accessed on 1 July 2025).
  20. Business Process Model and Notation (BPMN) Version 2.0.2. Available online: https://www.omg.org/spec/BPMN (accessed on 2 May 2025).
  21. Fitch, J.A. A Decision Network Framework for Vehicle Systems Engineering. SAE Tech. Pap. 2005. [Google Scholar] [CrossRef]
  22. Mendoza, P.; Fitch, J. Integrating System Models around Decisions. INCOSE Int. Symp. 2013, 23, 213–227. [Google Scholar] [CrossRef]
  23. OMG Systems Modeling Language (OMG SysML) Version 1.7. Available online: https://www.omg.org/spec/SysML/1.7/ (accessed on 14 April 2025).
  24. Decision Model and Notation Version 1.5. Available online: https://www.omg.org/spec/DMN/1.5 (accessed on 14 April 2025).
  25. Schindel, B.; Dove, R. Introduction to the Agile Systems Engineering Life Cycle MBSE Pattern. INCOSE Int. Symp. 2016, 26, 725–742. [Google Scholar] [CrossRef]
  26. Buede, D.M.; Miller, W.D. The Engineering Design of Systems: Models and Methods, 3rd ed.; Wiley: Hoboken, NJ, USA, 2016; ISBN 978-1-119-02790-4. [Google Scholar]
  27. INCOSE International Symposium: Volume 34, Issue 1. Available online: https://incose.onlinelibrary.wiley.com/toc/23345837/2024/34/1 (accessed on 16 April 2025).
  28. Case Management Model and Notation (CMMN) Version 1.1. Available online: https://www.omg.org/spec/CMMN/1.1 (accessed on 16 April 2025).
  29. Davis, P. Appendix C: Composability. In Defense Modeling, Simulation, and Analysis: Meeting the Challenge; The National Academies Press: Washington, DC, USA, 2006; pp. 74–80. ISBN 9786610604517. [Google Scholar]
  30. Specking, E.; Parnell, G.S.; Pohl, E.; Buchanan, R. Article Early Design Space Exploration with Model-Based System Engineering and Set-Based Design. Systems 2018, 6, 45. [Google Scholar] [CrossRef]
  31. Fang, Z.; Zhao, X.; Li, F. Architecture Design Space Generation via Decision Pattern-Guided Department of Defense Architecture Framework Modeling. Systems 2024, 12, 336. [Google Scholar] [CrossRef]
  32. Data Management Association. Data Management Body of Knowledge (DMBOK), 2nd ed.; Technics Publications: Basking Ridge, NJ, USA, 2017; ISBN 978-1-63462-247-9. [Google Scholar]
  33. ISO/IEC/IEEE International Standard 15288-2023; Systems and Software Engineering—System Life Cycle Processes. IEEE: New York, NY, USA, 2023; ISBN 1-5044-9665-5.
  34. SEBoK Editorial Board. The Guide to the Systems Engineering Body of Knowledge (SEBoK); v 2.11; The Trustees of the Stevens Institute of Technology: Hoboken, NJ, USA, 2024. [Google Scholar]
  35. Pinon Fischer, O.J.; Matlik, J.F.; Schindel, W.D.; French, M.O.; Kabir, M.H.; Ganguli, J.S.; Hardwick, M.; Arnold, S.M.; Byar, A.D.; Lewe, J.-H.; et al. Digital Twin: Reference Model, Realizations, and Recommendations. Insight 2022, 25, 50–55. [Google Scholar] [CrossRef]
  36. SYSTEM ENGINEERING LABORATORY. 2025. Available online: https://www.incose.org/learn/se-laboratory (accessed on 3 June 2025).
  37. Driscoll, P.J.; Parnell, G.S.; Henderson, D.L. Decision Making in Systems Engineering and Management, 3rd ed.; John Wiley & Sons: Hoboken, NJ, USA, 2022; ISBN 978-1-119-90140-2. [Google Scholar]
  38. Spetzler, C.S.; Winter, H.; Meyer, J. Decision Quality: Value Creation from Better Business Decisions; Wiley: Hoboken, NJ, USA, 2016; ISBN 1-119-14469-8. [Google Scholar]
  39. Parnell, G.S.; Tani, S.N. Frame the Decision Opportunity. In Handbook of Decision Analysis; Parnell, G.S., Bresnick, T.A., Tani, S.N., Johnson, E.R., Eds.; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2025. [Google Scholar]
  40. Shachter, R.D. Probabilistic Inference and Influence Diagrams. Oper. Res. 1988, 36, 589–604. [Google Scholar] [CrossRef]
  41. Instructor Companion Site. Available online: https://bcs.wiley.com/he-bcs/Books?action=resource&bcsId=10076&itemId=111902790X&resourceId=40177 (accessed on 28 March 2025).
  42. Keeney, R.L. Value-Focused Thinking: A Path to Creative Decisionmaking; Harvard University Press: Cambridge, MA, USA, 1992; ISBN 978-0-674-03940-7. [Google Scholar]
  43. Parnell, G.S.; Bresnick, T.A.; Johnson, E.R. Craft the Decision Objectives and Value Measures. In Handbook of Decision Analysis; Parnell, G.S., Bresnick, T.A., Tani, S.N., Johnson, E.R., Eds.; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2013; pp. 127–148. [Google Scholar]
  44. Kenley, C.R.; Whitcomb, C.; Parnell, G.S. Developing and Evaluating Alternatives. In Trade-Off Analytics; Parnell, G.S., Ed.; Wiley: New York, NY, USA, 2017; pp. 257–295. [Google Scholar]
  45. Kirkwood, C.W. Strategic Decision Analysis: Multiobjective Decision Analysis with Spreadsheets; Duxbury Press: Belmont, CA, USA, 1997. [Google Scholar]
  46. Bresnick, T.A.; Parnell, G.S. Decision Analysis Soft Skills. In Handbook of Decision Analysis; Parnell, G.S., Bresnick, T.A., Tani, S.N., Johnson, E.R., Eds.; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2025. [Google Scholar]
  47. Office of the Executive Director for Systems Engineering and Architecture, Office of the Under Secretary of Defense for Research and Engineering. Department of Defense Risk, Issue, and Opportunity (RIO) Management Guide for Defense Acquisition Programs; Washington, DC, USA, 2023. Available online: https://www.dau.edu/sites/default/files/2023-10/DoD%20RIO%20Management%20Guide%2009-2023.pdf (accessed on 26 March 2025).
  48. ISO/IEC/IEEE 16085:2021; Systems and Software Engineering—Life Cycle Processes—Risk Management. ISO: Geneva, Switzerland, 2021.
  49. Parnell, G.; Shallcross, N.; Pohl, E.; Phillips, M. Role of Decision Analysis in MBSE. In Handbook of Model-Based Systems Engineering; Madni, A., Augustine, N., Seivers, M., Eds.; Springer Nature: Cham, Switzerland, 2023; pp. 119–150. [Google Scholar]
  50. Cilli, M.; Parnell, G.S. Understanding Decision Management. In Trade-Off Analytics: Creating and Exploring the System Tradespace; Parnell, G.S., Ed.; John Wiley & Sons Inc.: Hoboken, NJ, USA, 2017; pp. 155–231. ISBN 978-1-119-23755-6. [Google Scholar]
  51. Brown, G.G. Modeling. In INFORMS Analytics Body of Knowledge; Cochran, J.J., Ed.; John Wiley and Sons, Inc.: Hoboken, NJ, USA, 2018; pp. 155–230. ISBN 978-1-119-50591-4. [Google Scholar]
  52. Kempf, K.G. Getting Started with Analytics. In INFORMS Analytics Body of Knowledge; Cochran, J.J., Ed.; John Wiley and Sons, Inc.: Hoboken, NJ, USA, 2018; pp. 31–48. ISBN 978-1-119-50591-4. [Google Scholar]
  53. MacCalman, A.D.; Parnell, G.S.; Savage, S. An Integrated Model for Trade-Off Analysis. In Trade-Off Analytics: Creating and Exploring the System Tradespace; Parnell, G.S., Ed.; John Wiley & Sons Inc.: Hoboken, NJ, USA, 2017; pp. 297–336. ISBN 978-1-119-23755-6. [Google Scholar]
  54. Parnell, G.S.; Bresnick, T.A.; Tani, S.N.; Johnson, E.R. (Eds.) Handbook of Decision Analysis, 1st ed.; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2025. [Google Scholar]
  55. Parnell, G.S. Trade-Off Analytics: Creating and Exploring the System Tradespace; John Wiley & Sons Inc.: Hoboken, NJ, USA, 2017; ISBN 978-1-119-23755-6. [Google Scholar]
  56. Project Management Institute. A Guide to the Project Management Body of Knowledge (PMBOK® Guide), 7th ed.; Project Management Institute: Newtown Square, PA, USA, 2021; ISBN 978-1-62825-664-2. [Google Scholar]
  57. Merriam-Webster Dictionary. Available online: https://www.merriam-webster.com/ (accessed on 26 March 2025).
  58. Heylighen, F.; Joslyn, C.; Turchin, V. Course of Action. Available online: http://pespmc1.vub.ac.be/ASC/COURSE_ACTIO.html (accessed on 26 March 2025).
  59. Howard, R.A. The Foundations of Decision Analysis. IEEE Trans. Syst. Sci. Cybern. 1968, 4, 211–219. [Google Scholar] [CrossRef]
  60. Keenan, P.T.; Owen, J.H.; Schumacher, K. Introduction to Analytics. In INFORMS Analytics Body of Knowledge; Cochran, J.J., Ed.; John Wiley and Sons, Inc.: Hoboken, NJ, USA, 2018; pp. 1–30. ISBN 978-1-119-50591-4. [Google Scholar]
  61. External Impacts. Available online: https://www.google.com/search?q=external+impacts (accessed on 26 March 2025).
  62. ISO/IEC Guide 99:2007; International Vocabulary of Metrology—Basic and General Concepts and Associated Terms (VIM). ISO: Geneva, Switzerland, 2007.
  63. DoD 5000.59-M; DoD Modeling and Simulation (M&S) Glossary. US Department of Defense: Washington, DC, USA, 1998.
  64. Cochran, J.J. (Ed.) INFORMS Analytics Body of Knowledge; Wiley series in operations research and management science; Wiley: Hoboken, NJ, USA, 2019; ISBN 1-119-50592-5. [Google Scholar]
  65. Blockley, D.I.; Godfrey, P. Doing It Differently: Systems for Rethinking Construction; Thomas Telford: London, UK, 2000; ISBN 0-7277-2748-6. [Google Scholar]
  66. ISO/IEC/IEEE 24765:2017; Systems and Software Engineering—Vocabulary. ISO: Geneva, Switzerland, 2017.
  67. Freeman, R.E. Strategic Management: A Stakeholder Approach; Pitman series in business and public policy; Pitman: Boston, MA, USA, 1984; ISBN 0-273-01913-9. [Google Scholar]
  68. Ryan, M.; Wheatcraft, L. Guide to Writing Requirements, 4th ed.; International Council on Systems Engineering: San Diego, CA, USA, 2023; ISBN 978-1-93707-05-4. [Google Scholar]
  69. ISO/IEC 15474-1:2002; Information Technology—CDIF Framework. ISO: Geneva, Switzerland, 2002.
  70. Martin, J.N. Overview of an Emerging Standard on Architecture Evaluation—ISO/IEC 42030. INCOSE Int. Symp. 2017, 27, 1139–1156. [Google Scholar] [CrossRef]
Figure 1. Top-Level Agile System Engineering Life Cycle Model [25].
Figure 2. DADM is a deployed generic model (pattern) to be applied for making decisions about an engineered system [35].
Figure 3. DADM data flow model and illustrative data artifacts. Some artifacts are updated in subsequent steps.
Figure 4. Logical process model: activity diagram for Step 1, frame decision.
Figure 5. Generic influence diagram (Adapted from Buede and Miller [41]).
Figure 6. Logical process model: activity diagram for Step 2, structure objectives and measures.
Figure 7. Illustrative Value Hierarchy.
Figure 8. Logical process model: activity diagram for Step 3, generate creative alternatives.
Figure 9. Two phases of alternative development (adapted from Bresnick and Parnell [46]).
Figure 10. Logical process model: activity diagram for Step 4, identify sources of uncertainty.
Figure 11. Logical process model: activity diagram for Step 5, plan evaluation methods.
Figure 12. Influence diagram for plan evaluation methods.
Figure 13. Assessment flow diagram for a hypothetical Unmanned Aeronautical Vehicle [49].
Figure 14. Logical process model: Step 6, evaluate alternatives.
Figure 15. Value component chart [53].
Figure 16. Cost vs. value chart [53].
Figure 17. Logical process model: activity diagram for Step 7, improve alternatives.
Figure 18. Stochastic Pareto chart for the assault scenario [53].
Figure 19. Cumulative distribution of value for the assault scenario [53].
Figure 20. Value and cost tornado charts for the Long Range and Survivable courses of action [53].
Figure 21. Logical process model: activity diagram for Step 8, Assess Trade-offs.
Figure 22. Logical process model: activity diagram for Step 9, Communicate Recommendations, and Implementation Plan.
Figure 23. Logical process model: activity diagram for Step 10, Make Decision.
Table 1. Patterns, reference models, and reference architectures in engineering.
1981: Integrated Computer-Aided Manufacturing (ICAM) Architecture [6]
1993: Integration Definition for Functional Modeling (IDEF0) [7]
1993: Integration Definition for Information Modeling (IDEF1) [8]
1994: Purdue Enterprise Reference Architecture [9]
1995: ISA-95: Enterprise-Control System Integration [10,11,12,13,14,15,16,17]
1995: Relationships Between Common Graphical Representations Used in System Engineering [18]
1997: Unified Modeling Language (UML) [19]
2004: Business Process Model and Notation [20]
2005: Decision Network Framework [21,22]
2007: Systems Modeling Language (SysML) [23]
2015: Decision Model and Notation [24]
2016: Agile Systems Engineering Life Cycle Model (ASELCM) MBSE Pattern [25]
Table 2. DADM purpose and alternative methods for each DADM conceptual model process step.
StepPurposeCommon Methods
1.0 Frame DecisionIdentify the purpose and scope of the decision in the context of the system life cycle. Problem Definition, Opportunity Definition, Influence Diagram, Stakeholder Analysis, Use Cases, Scenarios, Systems Thinking
2.0 Structure Objectives and MeasuresIdentify the decision objectives and the measures that will be used to assess achievement of the objectives before we develop the alternatives. Value Hierarchy, Objectives Hierarchy, Value Tree, Requirements Analysis
3.0 Generate Creative AlternativesDevelop creative alternatives that span the decision space of feasible alternatives including innovative technologies and new production, service, and delivery processes.Zwicky’s Morphological Box, Structured Creativity Techniques, Optimization, Set-Based Design, Pugh Method, TRIZ
4.0 Identify Areas of UncertaintyIdentify uncertainties to understand potential risks and opportunities and allow decision-makers to evaluate choices more accurately.Risk and Opportunity Analysis, Influence Diagram, Scenario Analysis
5.0 Plan Evaluation MethodsPlan evaluation methods to identify the data, models, simulations, process flow, resources, and time required to perform the alternative assessments in a way that reduces bias and enhances the reliability of the analysis.Assessment Flow Diagram, Systems Engineering Management Plan
6.0 Evaluate AlternativesImplement the alternative evaluation plan to assess the performance, value, cost, and schedule of the alternatives.Deterministic Analysis, Probabilistic Analysis, Portfolio Analysis, Benefit Cost Analysis, Modeling, Simulation, and Analysis, Optimization, Systems Analysis, Risk Analysis, Mission Analysis, Life Cycle Cost Analysis, Scheduling
7.0 Improve AlternativesUse the evaluation information to improve the alternatives by increasing value and reducing risk. Value-Focused Thinking, Risk Analysis, Risk Management, Systems Thinking, Systems Analysis, Optimization
8.0 Assess Trade-offsAssess the value trade-offs including performance, value, cost, and schedule trade-offs. Tradespace, Trade-off Analysis, Systems Analysis, Pareto Analysis, Cost as an Independent Variable
9.0 Communicate Recommendations and Implementation PlanCommunicate the trade-offs, recommendation(s), and implementation plan to the system decision maker(s) for their decision.System Decision, Solution Implementation, Implementation Schedule
10. Make the DecisionMake the decision and document the rationale.Decision Record
Table 3. DADM tailoring and reuse guidance.
StepTailoring and Reuse GuidanceImpact of Not Doing
1.0 Frame DecisionThe decision(s) should be clearly identified in each life cycle stage or decision point. It is usually helpful to begin with the decisions made in the previous life cycle stage and identify revised and additional decisions. The system may be developed for the wrong problem, not take advantage of important opportunities, or not address stakeholder concerns.
2.0 Structure Objectives and MeasuresThe value hierarchy should be updated based on the current information in each stage. New objectives and measures may be needed based on latest information about the problem and the system.The alternative design and evaluation will not focus on the decisionmaker and stakeholder objectives. This may lead to rework and schedule delays.
3.0 Generate Creative AlternativesIt is important to generate creative alternatives that span the decision space. The alternatives may include previous alternatives and new alternatives to increase value and/or reduce risk. The decision can only be as good as the best alternative. Poor alternatives lead to poor decisions.
4.0 Identify Areas of UncertaintyThe sources of uncertainty will vary in each life cycle stage. Some uncertainties will be resolved, and new risks and opportunities will be identified. Systems are designed for years and sometimes decades. There are many uncertainties. If they are not considered, rework will be required.
5.0 Plan Evaluation MethodsAs better information becomes available on the system alternatives in the life cycle, higher fidelity evaluation methods can be used. The lack of an alternative evaluation plan to obtain appropriate fidelity evaluation methods for the life cycle stage can lead to rework and schedule delays.
6.0 Evaluate AlternativesThe alternative evaluations should be updated based on new alternatives, new uncertainties, improved evaluation methods, and updated data.Poor evaluation can lead to rework, schedule delays, or system cancelation.
7.0 Improve AlternativesIt is important to improve the alternatives using the information from the alternative evaluations. Value gaps can be identified and filled. Risk can be mitigated, and new opportunities can be addressed. Not improving the alternatives before the decision can result in less value, more risk, and rework.
8.0 Assess Trade-offsOnce the alternatives have been improved, the next step is to update the value, performance, cost, and schedule trade-offs. The impacts on other systems and stakeholders should also be considered.Unless there is one alternative that dominates all the others, there will be trade-offs between the alternatives that decision makers may not consider.
9.0 Communicate Recommendations and ImplementationThe systems engineers (system analysts) should communicate the trade-offs, recommendation(s), and implementation plan to the system decision maker(s) for their decision using the organization's recommended decision process. Without clear and concise communication of the recommendations, the best decision may not be made or be actionable.
10.0 Make the DecisionRecord the decision and rationale using organizational procedures.Without records, we lose traceability of past decisions and rationale.
Table 4. Preference for types of value measure [43].
Type / Direct / Proxy
Natural / 1 / 3
Constructed / 2 / 4
Table 5. Assessment of alternative development techniques [44].
Technique [44]Recommended Life Cycle StageDivergent PhaseConvergent Phase
Optimization
methods
All stagesUse constraints to define the decision spaceUse optimization to identify the best alternatives.
Set-Based Design [45]Concept and designUse sets to define the decision spaceEliminate the poorer performing sets.
Structured
Creativity Methods
All stagesDevelop many ideasConvergence after creativity.
Zwicky Morphological BoxAll stagesDevelop many ideas (can be function-based)Do not consider unreasonable or physically unachievable alternatives.
Pugh MethodConcept and
design stages
Nominal group technique for individual reflection Improves concepts in group setting by attacking the negatives and improving the positives, which can lead to hybrid conceptsAllows for eliminating worst concepts, especially after they have been studied for positive features that might be incorporated into other concepts.
TRIZConcept and design stagesDefines the problem by a set of functional requirements. Models the problem by defining a set of resources that play a role in achieving the functional requirements or detract from achieving the functional requirementsFocuses search for alternative to a set of solutions based on historical patent data. Identifies which solutions among a set of 76 standard solutions, or high-level concepts, can be applied to meet each requirement. Uses 40 inventive and 4 separation principles to further reduce the number of alternatives.
Table 6. Mapping sources of uncertainty to areas of uncertainty.
Sources of
Uncertainty
Illustrative
Issues
4.1
SH
4.2
Performance
4.3 Schedule4.4 Cost4.5 Other4.6 COAs
Business/
Mission
Changes in markets/threats,
vision, and strategies
XXXX X
Context and
Environment
Economic disruptions (e.g., recession), disruptive technologies, adverse publicity, changes to law or policiesXXXX X
ManagementOrganization culture, investment management experience and expertise, mature baselining (technical, cost, schedule) and
cost estimating processes.
XXXXX
RequirementsUnclear or changing user needsXXXX X
TechnologicalNew technologies and integration with systems and infrastructure XXX X
Project
Management
Resource availability, changing need dates, concurrent development, design processes, manufacturing processes, production capabilities, and supply chain issues XXXXX
Human FactorsUser acceptance, resistance to changeXXXXXX
Operations and LogisticsAffordability, system sustainability, reliability, security, maintenance, supply chain X XXX
Table 7. Three small unmanned aerial vehicle courses of action [50].
Buzzard ICardinal IRobin I
SubsystemComponentDesign ChoiceDesign ChoiceDesign Choice
Air Vehicle
Propulsion SystemElectric 300 WElectric 300 WPiston 2.5 HP
Energy SourceLi-Ion BatteryLi-S BatteryJP-8
Prop Size and Location18″ Rear20″ Rear26″ Front
Wingspan5′6′8′
Wing ConfigurationCanardConventionalConventional
Fin ConfigurationInverted VTwin BoomH Tail
ActuatorsElectromagneticElectromagneticHydraulic
Airframe MaterialGraphite EpoxyGraphite EpoxyFiberglass Epoxy
AutopilotSemi-AutoRemotely PilotedRemotely Piloted
Launch MechanismHandHandTensioned Line
Landing MechanismBellyBellyNet
ISR Collecting Payload
Sensor ActuationFixedFixedPan-tilt
EO Imager4 MP4 MP8 MP
IR Imager320 × 240 MWIR320 × 240 MWIR1280 × 720 MWIR and
LWIR cooled
Communication Links
Command and Control LinkFixed VHFFixed VHFFixed VHF
Payload Data LinkFixed VHFFixed VHFElect. Steered Phased Array Ka
Ground Elements
AntennaDipoleDipoleDipole
ComputerLaptopSmartphoneLaptop
User Input DeviceKeyboardJoystickKeyboard
PowerBattery + SpareBattery + SpareBattery + Gen.
Table 8. Communication recommendations and implementation plan.
Data Element | Rationale
Knowledge | The foundation for using the DADM to inform a major decision at any life cycle stage of the system of interest is the knowledge of SMEs and SHs.
Information | One of the major purposes of a development program is to identify and obtain the information required to make a sound investment decision.
Decision Frame | Begin with a review and update of the decision frame. Decision-makers are busy, so it is important to identify the major decisions that have been made, the purpose of the current decision(s), the decision(s) to be made, and the major future decisions. The need for the system should be reverified.
Courses of Action (COAs) | The decision-makers and key stakeholders need to understand the COAs that were identified and evaluated. It is common for decision-makers or key stakeholders to prefer one or more COAs; they may distrust the systems analysis, or decline to decide, if their preferred COAs are not included.
Trade-offs | The analysis in Step 8.0 provides the basis for defining the tradespace and identifying the trade-offs. The decision-makers and key stakeholders are usually interested in the trade-offs between the non-dominated COAs and typically want to know why their preferred COAs are not included in the analysis (a minimal non-dominance sketch follows this table).
Determine Decision Insights | The decision insights include the following:
  • The tradespace;
  • The dominated and non-dominated COAs;
  • The estimated cost, schedule, and performance of each non-dominated COA;
  • A comparison of each non-dominated COA’s potential to satisfy the major requirements;
  • The trade-offs between the non-dominated COAs.
Recommended COA | The key data for the recommended COA are the estimated cost, schedule, and performance, together with a comparison of the recommended COA’s potential to satisfy the major requirements.
Communicate Decision Story to Stakeholders | A clear and concise summary of the non-dominated COAs and the rationale for the recommended COA compared to the other non-dominated alternatives.
Implementation Plan | The implementation plan usually includes the recommended COA, decision story, key program milestones, major risks, and risk-reduction roles, responsibilities, and activities.
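The Trade-offs and Determine Decision Insights rows above refer to dominated and non-dominated COAs in the tradespace. As a point of reference, the sketch below shows one common way to separate them when each COA is summarized by a cost (lower is better) and a performance score (higher is better); the COA names and values are hypothetical, and real trade studies typically involve more criteria.

```python
# Hypothetical COA summaries: (cost in $M, performance score).
# Lower cost and higher performance are preferred.
coa_scores = {
    "COA-A": (12.0, 0.80),
    "COA-B": (10.0, 0.78),
    "COA-C": (11.0, 0.70),  # dominated by COA-B (costs more, performs worse)
}

def dominates(x, y):
    """True if x is at least as good as y on both criteria and strictly better on one."""
    (cx, px), (cy, py) = x, y
    return cx <= cy and px >= py and (cx < cy or px > py)

def non_dominated(scores):
    """Return the COAs not dominated by any other COA (the Pareto set)."""
    return [name for name, s in scores.items()
            if not any(dominates(t, s) for other, t in scores.items() if other != name)]

print(non_dominated(coa_scores))  # expected: ['COA-A', 'COA-B']
```

Additional criteria can be handled by extending the tuple comparison; the point is simply that dominated COAs can be filtered automatically before the trade-offs among the remaining COAs are communicated.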
Table 9. Illustrative decisions and data availability throughout the System Life Cycle (adapted from [1]).
Life Cycle Stage: Concept Stage
Illustrative Decisions:
Assess Technology Opportunity/Initial Business Case
  • Of all the potential system concepts or capabilities that could incorporate the emerging technology of interest, do any offer a potentially compelling and achievable market opportunity?
  • Which should be pursued, when, and in what order?
Inform, Generate, and Refine a Concept
  • What requirements should be included?
  • What needs to be accomplished and what can be traded away to achieve it within anticipated cost and schedule constraints?
  • How should requirements be expressed such that they are focused, yet flexible?
  • How can the set of requirements be demonstrated to be sufficiently compelling while at the same time achievable within anticipated cost and schedule constraints?
Create Solution Class Alternatives and Select Preferred COA
  • After considering the system-level consequences of the sum of solution class alternatives across the full set of stakeholder values (to include cost and schedule), which solution class alternative should be pursued?
Data Availability: Descriptive data on existing systems; predictive data using low-fidelity models for new concepts; predictive data on new technologies and concepts using low-fidelity models.

Life Cycle Stage: Development Stage
Illustrative Decisions:
Select/Define System Elements
  • After considering the system-level consequences of the system element design choices across the full set of stakeholder values (to include cost and schedule), which system-element alternatives should be pursued?
  • Make-or-buy decisions for the system, subsystems, and elements.
Select/Design Verification and Validation Methods
  • Is prototyping warranted?
  • What verification and validation methods should be performed (test, demonstration, analysis/simulation, inspection)?
  • What are the verification and validation plans?
Data Availability: Descriptive data on existing system elements; predictive data on new system elements and the system using high-fidelity models; development test data in later parts of the stage.

Life Cycle Stage: Production Stage
Illustrative Decisions:
Develop Production Plans
  • What is the target production rate?
  • To what extent will low-rate initial production be used?
  • What is the ramp-up plan?
  • What production processes will be used?
  • Is the system still affordable?
Data Availability: Initial operational test and evaluation data; descriptive production quality data.

Life Cycle Stage: Utilization and Support Stages
Illustrative Decisions:
Utilization decisions
  • What are the best operations concepts for the resources available?
Support decisions
  • What is the maintenance strategy?
  • What is the logistics concept?
  • What is the preventive-maintenance plan?
  • What is the corrective-maintenance plan?
  • What is the spare-parts plan?
  • Is the system still affordable?
Data Availability: Operational and logistics descriptive data; predictive data using high-fidelity operational and logistics models.

Life Cycle Stage: Retirement Stage
Illustrative Decisions:
Retirement Plan
  • When is it time to retire the system?
  • How will disposal of the system materials be accomplished?
Data Availability: Operational and logistics descriptive data; predictive data using high-fidelity operational and logistics models.
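In a digital decision-management environment, the stage-to-decision and stage-to-data-availability relationships in Table 9 can be captured as reference data that analysts query when framing a decision. The sketch below is a minimal illustration covering two of the stages; the structure and field names are assumptions rather than DADM definitions.

```python
# Minimal illustration: two Table 9 stage entries captured as reference data.
# Field names and structure are assumptions for illustration only.
life_cycle_data = {
    "Concept Stage": {
        "illustrative_decisions": ["Assess Technology Opportunity/Initial Business Case",
                                   "Inform, Generate, and Refine a Concept"],
        "data_availability": ["Descriptive data on existing systems",
                              "Predictive data using low-fidelity models for new concepts"],
    },
    "Production Stage": {
        "illustrative_decisions": ["Develop Production Plans"],
        "data_availability": ["Initial operational test and evaluation data",
                              "Descriptive production quality data"],
    },
}

def expected_data(stage: str) -> list[str]:
    """Return the data sources expected to be available at a given life cycle stage."""
    return life_cycle_data.get(stage, {}).get("data_availability", [])

print(expected_data("Production Stage"))
```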
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
