Article

Manufacturing Readiness Assessment Technique for Defense Systems Development Using a Cybersecurity Evaluation Method

1 Department of Industrial and Management Engineering, Kyonggi University, Suwon-si 16225, Republic of Korea
2 Department of Computer Science, Kyonggi University, Suwon-si 16227, Republic of Korea
* Author to whom correspondence should be addressed.
Systems 2025, 13(9), 738; https://doi.org/10.3390/systems13090738
Submission received: 23 July 2025 / Revised: 11 August 2025 / Accepted: 20 August 2025 / Published: 25 August 2025
(This article belongs to the Special Issue Data-Driven Analysis of Industrial Systems Using AI)

Abstract

Weapon systems have transitioned from hardware-centered designs to software-driven platforms, introducing new cybersecurity risks, including software manipulation and cyberattacks. To address these challenges, this study proposes an improved manufacturing readiness level assessment (MRLA) method that integrates cybersecurity capabilities into the evaluation process to address the gaps in hardware-focused practices in South Korea. Based on the MITRE Adversarial Tactics, Techniques, and Common Knowledge (ATT&CK) and Defensive Cybersecurity (D3FEND) frameworks, this study identified security requirements, assessed vulnerabilities, and constructed exploratory testing scenarios using defense trees. These methods evaluate system resilience, the effectiveness of security controls, and response capabilities under diverse attack scenarios. The proposed MRLA approach incorporates cyberattack and defense scenarios that may occur in operational environments. This approach was validated through a case study involving unmanned vehicle systems, where the modified MRLA successfully identified and mitigated critical cybersecurity threats. Consequently, the target operational mode summary/mission profile of a weapon system can be revised based on practical considerations, enhancing cybersecurity assessments and thereby improving the operational readiness of weapon systems through scenario-based, realistic evaluation frameworks. The findings of this study demonstrate the practical utility of incorporating cybersecurity evaluations into the MRLA, contributing to more resilient defense systems.

1. Introduction

Recently, software technologies embedded in products have advanced significantly, providing a wide range of functionalities. These technologies have been adopted in weapon systems, where many functions are now software-controlled. Unlike past hardware-centric weapon systems, modern weapon systems are controlled by software, which introduces risks that were absent in hardware-centric systems. The possibility that malicious attacks by adversarial forces could forge or manipulate software and cause significant damage is now a tangible threat rather than future speculation, as reflected in the increasing focus on cyberthreats across various defense sectors [1]. Therefore, it is essential to develop risk management techniques that account for the unique characteristics of software in weapon systems and to integrate them into new quality assurance procedures that effectively address potential risks in the cyber domain.
South Korea faces a unique international environment due to its geopolitical location: it borders North Korea and is surrounded by China, Russia, and Japan, with the United States across the Pacific Ocean. These nations possess some of the most potent military capabilities globally and are actively developing and integrating cyberattack and defense capabilities (hereinafter "cybersecurity capabilities") into their weapon systems [2]. They have implemented lifecycle software development methodologies based on quality assurance, integrated cybersecurity capabilities into their weapon systems, and ensured enhanced information security. Although South Korean research on quality assurance techniques for hardware is well established, research on quality assurance methods that address the cybersecurity capabilities of weapon system software remains limited.
Despite modern weapon systems becoming increasingly reliant on software and cybersecurity, South Korea's manufacturing readiness level assessment (MRLA) framework remains largely hardware-focused. Of the 64 evaluation criteria, only two address software maturity, and none explicitly assess cybersecurity resilience [3]. This limits the ability to evaluate the readiness of software-intensive defense systems against evolving threats. Recent cases abroad demonstrate mission failures resulting from software manipulation, unauthorized access, and denial-of-service attacks on control components. Without integrating threat-based cybersecurity evaluations into the MRLA framework, systems may enter production with undetected vulnerabilities. Closing this gap is essential to ensure that future defense systems can resist adversarial tactics while meeting readiness requirements. To achieve this, a lifecycle evaluation methodology is proposed that integrates the assessment and management of quality attributes related to cybersecurity capabilities into the existing MRLA framework.
The structure of this paper is organized as follows: Section 2 compares the weapon system acquisition procedures of the US with South Korean procedures from the perspective of the MRLA to explore the concepts and application cases of attack/defense trees (AT/DT). Section 3 details the evaluation methodology. Section 4 presents illustrative examples. Finally, Section 5 concludes and offers directions for future research.

2. Literature Review

2.1. Acquisition Process of a Weapon System with Respect to MRLA: Comparison Between the US and South Korea

The US military has developed and operated various weapon systems. To manage these systems efficiently, an adaptive acquisition framework tailored to the specific characteristics of each weapon-system development program is employed. The framework is categorized into urgent capability acquisition, middle tier of acquisition, and major capability acquisition (MCA), depending on program characteristics. The development of weapon systems within this framework involves various evaluations, and the evaluation results determine the transition to the next development phase. Strict evaluations are conducted at milestones A, B, and C, and only after successfully passing these assessments can the development process proceed to the next phase. The MRLA is one of several evaluation methods associated with these milestones. The manufacturing readiness level (MRL) quantifies manufacturing maturity on a 10-level scale, and the MRLA evaluates the MRL. Initially introduced by the US Army, the MRLA assesses manufacturing maturity during the acquisition stage and helps prevent schedule delays, cost overruns, and quality degradation caused by immature manufacturing processes in the research and development (R&D) phase. In South Korea, the MRLA has been implemented since 2012, following the establishment of the Defense Acquisition Program Administration.
Table 1 summarizes the differences between MRLA approaches in the US and South Korea. As explained previously, the applied framework in the US varies, depending on the characteristics of the development program within the Department of Defense [4]. By contrast, South Korea manages the weapon system development process by dividing it into four defined periods: preliminary research, exploratory development, system development, and mass production. This approach is similar to the MCA framework employed in the US; however, notable differences exist. Under the US MCA framework, critical decisions are made at milestones A, B, and C. At each milestone, an MRLA is conducted to evaluate the level of MRL achievement, and the results are used as evidence for decision-making regarding the continuation of the development process. Conversely, South Korea, reflecting its characteristic government-led development approach, employs a structured evaluation process focused on achieving MRL 8 to determine with certainty whether a product qualifies to proceed to the mass production stage. Additionally, in the US, the adaptive framework allows for multiple evaluations of the MRL based on characteristics of the acquisition program, with the evaluation criteria adjusted for different maturity levels. In contrast, South Korea conducts MRL evaluations at the conclusion of the system development phase, before transitioning to the mass production stage. These evaluations apply to weapon system development programs with total budgets exceeding 20 billion KRW (approximately 15.4 million USD), particularly those deemed at risk of schedule delays or cost overruns [3]. The primary focus of these evaluations is to ensure weapon-system quality, which ultimately facilitates the production of high-quality systems during the mass production stage.

2.2. Risk Assessment Model Based on the MITRE Framework

To evaluate cyber capabilities, the MITRE framework was employed to model cyberthreats and develop corresponding threat scenarios, which were then applied to the MRLA. This approach involves identifying and analyzing specific cyberthreats to the weapon system as a foundation for conducting the MRLA. Relevant prior studies utilizing the MITRE framework are reviewed below. Straub [5] proposed a model integrating the concepts of AT/DT under the MITRE Adversarial Tactics, Techniques, and Common Knowledge (ATT&CK) framework to evaluate various cyberattack scenarios. This model facilitates analysis of system security posture and provides methodologies for designing defense strategies against cyberattacks. The main objective is to establish a comprehensive defense strategy based on the STRIDE framework [6] and the cyber kill chain model [7]. However, the proposed threat-modeling approach does not support systematic consideration during the development phase and lacks direct alignment with standards, such as the National Institute of Standards and Technology Special Publication (NIST SP 800-53) [8]. Manocha [9] proposed a risk assessment system based on the MITRE ATT&CK matrix to evaluate enterprise security. This system quantifies risk by analyzing the results of continuous security testing, focusing on cyberattack tactics, techniques, and procedures (hereafter referred to as “TTP”). This study aids organizations in conducting comprehensive evaluations of cyberthreats to enable effective threat mitigation strategies. However, the proposed risk assessment system primarily emphasizes security validation in the later stages of the process. Consequently, these findings underscore the need for proactive threat analysis during the initial phases of the development lifecycle of a system. Chen [10] systematically investigated multiple cyberattack techniques using the MITRE ATT&CK framework and analyzed their potential impacts on enterprise systems. 
Analyzing these techniques provides actionable insights for developing defense strategies and evaluating organizational security risks. However, the diverse attack scenarios that arise in commercial systems differ fundamentally from those associated with software embedded in weapon systems. Therefore, developing attack and defense methods optimized for embedded systems remains essential. Khan [11] proposed a risk-based security impact analysis framework using the MITRE ATT&CK to analyze potential security risks within systems. Through this process, system vulnerabilities are identified, and various attack vectors used by adversaries are analyzed to establish effective defense strategies. However, the approach of this study has limitations, as the results vary depending on the cyberthreat scenario. Moreover, this approach does not incorporate standards such as NIST 800-53. Ahmed [12] proposed a methodology using the MITRE ATT&CK framework for risk assessment of potential cyberattacks in cyber-physical systems (CPS). Diverse attack scenarios within industrial control systems (ICS) were analyzed to propose appropriate defense strategies. However, the scope of this study was limited to ICS, which restricts its generalizability to CPS. Additionally, the proposed defense strategies and risk assessments lacked quantitative data and practical validation, creating uncertainties regarding their effectiveness and applicability in operational environments. Svilicic [13] proposed a risk management model that integrates the MITRE ATT&CK and defensive cybersecurity (D3FEND) frameworks to assess cybersecurity risks in the maritime industry. This study developed a strategic evaluation model based on cyberthreat scenarios for maritime systems, considering both attack and defense perspectives. Furthermore, the approach is valuable because it applies to mission-critical systems, such as embedded software in weapon systems. However, this study did not incorporate cyberthreat scenarios into the development phase.
Therefore, this study performs a scenario-based threat analysis, considering cyber capabilities at each stage of weapon system development, to ensure quality assurance. An MRLA evaluation method incorporating a cyber capability assessment is proposed based on the analysis results.

3. Quality Engineering Approach for Considering Cybersecurity Capabilities in an MRLA Evaluation Based on Systems Engineering

3.1. Overview

Figure 1 outlines a new assessment method for ensuring the quality of cybersecurity capabilities during weapon system development phases in South Korea. The phases related to cybersecurity capabilities include the system requirements review (SRR), system functional review (SFR), software specification review (SSR), preliminary design review (PDR), critical design review (CDR), test readiness review (TRR), design and operation test (DT/OT), and MRLA.
The SRR ensures that requirements are consistently and accurately reflected as system requirements for weapon system development. This review verifies whether the project can proceed to the system design phase within the allocated budget, development schedule, and acceptable risk levels. Additionally, the requirements for weapon-system cybersecurity capabilities are identified in the SRR. The SFR ensures that overall system requirements are consistently and accurately reflected as functional requirements in the system specifications. This review verifies that the system can proceed to the preliminary design phase within its budget, schedule, and acceptable risk levels. This phase also examines the functional elements that must be included in the cybersecurity capabilities. The SSR examines software specifications, focusing on the requirements related to the cybersecurity capabilities associated with AT/DT design. Although this phase is not traditionally addressed in the systems engineering of existing weapon systems, it utilizes software engineering processes to standardize the cybersecurity capabilities of software within weapon systems. The PDR verifies whether the system requirements and functional system requirements are accurately reflected in the development specifications that satisfy the preliminary design. This review establishes the allocated baseline and assesses the feasibility of proceeding to the detailed design phase. Additionally, the foundational design of cybersecurity capabilities for integration into a weapon system is examined. The CDR verifies whether the system and functional requirements are correctly reflected in the detailed design of the initial product specifications. This review establishes the initial product baseline and evaluates the feasibility of proceeding to the prototype development, system integration, and testing phases. This phase also involves examining the detailed design of the cybersecurity capabilities. 
The TRR verifies whether the test plan, including the test objectives, methods, procedures, scope, personnel, resources, and safety considerations, is ready to validate compliance with the system requirements. This review gauges the readiness to proceed to the test and evaluation phases. This phase also reviews AT/DT-based exploratory testing methods that can be conducted, which consider real-world operational scenarios to evaluate cybersecurity capabilities. In this study, DT involves design testing, whereas OT focuses on verifying the outcomes and effectiveness of the implemented cybersecurity capabilities during operational use. The modified MRLA evaluates manufacturing maturity by specifically assessing cybersecurity capabilities and evaluating productivity, including government quality assurance. If issues are identified during the MRLA process, then system development, security monitoring, attack response and mitigation, and updates to cybersecurity elements are triggered, and a return to the SRR follows. The final stage of this framework involves the ‘AT/DT-based Cybersecurity Capability Test’, which includes a red-team exercise in which experienced personnel simulate advanced hardware and software attacks in order to validate system resilience prior to production approval. To evaluate cybersecurity capabilities, a review must be conducted at each system development stage, as shown in Figure 2.

3.2. Cybersecurity Evaluation Method for Weapon Systems in MRLA

To assess the cybersecurity capabilities of a specific weapon system, it is necessary to consider a software engineering-based development process within traditional systems engineering. This process demands the identification, analysis, and integration of cybersecurity elements. In the development of weapon systems, the process involves identifying requirements related to cyberthreats and defense, assessing the feasibility of architectural applications, and verifying the results of incorporating cyber capabilities into software developed for weapon systems.

3.2.1. Identifying Cybersecurity Requirements for AT/DT

At this stage, following the SSR and SFR, a draft of the software requirements specification (SRS) is prepared that considers the cybersecurity threats and countermeasure technologies identified in the SSR. While reviewing the cybersecurity capabilities of a weapon system, cyberthreats should be modeled for each domain, considering the AT/DT configuration based on the MITRE ATT&CK and D3FEND frameworks. This study applies the MITRE ATT&CK framework systematically at each MRLA evaluation stage. During the SFR stage, ATT&CK tactics such as Initial Access and Execution are referenced to identify potential entry points and malicious execution vectors. At the SFR and SSR stages, these tactics and techniques are translated into specific functional and software requirements, ensuring that defensive measures are incorporated into the design from the outset. During the PDR and CDR stages, each selected technique is cross-checked against the MITRE D3FEND mitigation strategies and the relevant NIST SP 800-53 Rev. 5 control items to ensure the design can withstand the mapped adversary behaviors. Finally, during the TRR and MRLA stages, the mapped TTPs form the basis for constructing evaluation scenarios and determining the effectiveness of implemented controls under realistic attack conditions. The AT/DT configuration consists of attack and defense techniques used to identify security threats and derive defense strategies. This consideration is an initial step for defense companies developing weapon systems. For example, key security threats and corresponding mitigation technologies in major areas include firmware modification and denial of service (DoS) attacks on the weapon system, which are countered by secure flash and secure onboard communication [14,15,16]. While these studies provide a comprehensive framework for automotive software architecture, their design principles differ from the architectural constraints of the U.S. Department of Defense (DoD), particularly with regard to mission assurance, security accreditation, and operational resilience. Further technical analysis is required to address these differences and improve interoperability in defense applications, a topic that will be explored in future studies. Eavesdropping attacks on communication systems are mitigated by IPsec and TLS. Information corruption in the backend infrastructure is addressed through unified threat management systems, which integrate firewalls, intrusion detection systems (IDSs), and intrusion prevention systems (IPSs) with access control mechanisms. These procedures play a crucial role in the MRLA phase by evaluating whether security threats and mitigation technologies have been appropriately applied and by specifying the cybersecurity capabilities of each system.
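The translation of identified threats into draft SRS entries can be illustrated with a minimal sketch. This is not the authors' tooling: the technique IDs are real ATT&CK identifiers, but the requirement wording and the `SRS-SEC-*` numbering scheme are hypothetical examples.

```python
# Sketch (illustrative only): tracing MITRE ATT&CK (tactic, technique) pairs
# identified during requirements reviews into numbered draft SRS entries.
from dataclasses import dataclass

@dataclass
class SrsRequirement:
    req_id: str
    attack_technique: str   # MITRE ATT&CK technique ID
    tactic: str             # ATT&CK tactic the technique belongs to
    text: str               # draft requirement wording (hypothetical)

# Threats identified during review, expressed as ATT&CK (tactic, technique) pairs.
identified_threats = [
    ("Initial Access", "T1190"),   # Exploit Public-Facing Application
    ("Execution",      "T1059"),   # Command and Scripting Interpreter
]

# Illustrative countermeasure wording per technique (assumption, not from the paper).
countermeasures = {
    "T1190": "The system shall validate and authenticate all external interface inputs.",
    "T1059": "The system shall restrict script execution to signed, allow-listed binaries.",
}

def draft_srs(threats):
    """Translate each identified threat into a numbered draft SRS requirement."""
    return [
        SrsRequirement(f"SRS-SEC-{i:03d}", tech, tactic, countermeasures[tech])
        for i, (tactic, tech) in enumerate(threats, start=1)
    ]

for r in draft_srs(identified_threats):
    print(r.req_id, r.attack_technique, "-", r.text)
```

In practice, each entry would also carry the D3FEND countermeasure and NIST SP 800-53 control mapped at the later PDR/CDR stages.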

3.2.2. AT/DT Configuration-Based Design Plan (Procedure)

The AT/DT construction process occurred during the PDR/CDR phases based on a preliminary requirements analysis of the development company. First, the AT hierarchically represents attack objectives and the means to achieve them. Similarly, DT defines security measures that can mitigate attack techniques within the AT by mapping them to appropriate technologies by referencing the D3FEND framework. To validate each AT/DT configuration based on the MITRE ATT&CK and D3FEND frameworks, cybersecurity experts must collaborate with developers to examine security threats and related mitigation techniques using the respective matrices as references. These results involve visualizing security threats and mitigation techniques in a tree-based format and incorporating relevant technologies as requirements into the SRS during the PDR phase. These elements were utilized to design a scenario for exploratory testing after the CDR phase. First, the construction and validation of an AT-based checklist involve mapping NIST SP 800-53 Rev. 5 to the MITRE framework to evaluate confidentiality (C), integrity (I), and availability (A), which are critical steps in achieving comprehensive cybersecurity capabilities. Additionally, the MITRE-based attack technique names corresponding to these classifications are listed, while appropriate common vulnerabilities and exposures (CVEs) are identified and applied by the operating system (OS). Subsequently, the development and validation of a DT-based checklist involves applying the harden, detect, isolate, deceive, and evict criteria to review the defense techniques that are used to respond to threat elements at each level. The risk associated with each classified threat technique is specified by mapping the defense techniques from MITRE D3FEND to the mitigation techniques from MITRE ATT&CK. Each defense element must be identified and organized using the Control IDs within MITRE D3FEND. 
Recently, MITRE D3FEND issued a list [reference] linked to the NIST 800-53 control items to enable continuous updates and management. Based on this list, a correlation analysis table has been created to map the D3FEND IDs, ATT&CK IDs, and NIST 800-53 control items, thus allowing for careful review and analysis.
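The correlation-analysis table described above can be sketched as a simple data structure. The two rows shown are illustrative examples (the D3FEND and ATT&CK IDs are real identifiers, but this is not an authoritative extract of the MITRE/NIST mapping):

```python
# Sketch of a correlation-analysis table linking D3FEND countermeasure IDs to
# the ATT&CK techniques they counter and NIST SP 800-53 Rev. 5 control items.
correlation_table = [
    # (D3FEND ID, ATT&CK technique IDs countered, NIST SP 800-53 controls)
    ("D3-NTA", ["T1046", "T1071"], ["SI-4"]),   # Network Traffic Analysis
    ("D3-MFA", ["T1078"],          ["IA-2"]),   # Multi-factor Authentication
]

def controls_for_technique(table, attack_id):
    """Return NIST controls reachable from an ATT&CK technique via D3FEND."""
    controls = set()
    for d3fend_id, attack_ids, nist_controls in table:
        if attack_id in attack_ids:
            controls.update(nist_controls)
    return sorted(controls)

print(controls_for_technique(correlation_table, "T1078"))  # -> ['IA-2']
```

A table of this shape supports the continuous update and review process described above, since new D3FEND or NIST mappings only add rows.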

3.2.3. Review of the AT/DT-Based Cyber Test Scenario Configuration

The attack scenarios for the weapon system were structured step by step, as shown in Figure 3. First, the target is defined to establish the root node. Strategic scenarios are designed by adding intermediate and subordinate nodes. Detailed attack methods are developed based on tactical techniques and procedures, and the relationships between nodes are analyzed to derive various attack paths and variations. Security risks are assessed and prioritized using frameworks such as NIST 800-53, ATT&CK IDs, the CIA triad, and CVEs. Finally, the validity of the scenario is reviewed through expert consensus and analyses of the latest MITRE and CVE to enhance overall security.
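The step-by-step construction above (root node, intermediate nodes, path derivation) can be sketched as follows. Node names and risk scores are illustrative placeholders, not values from the paper's case study:

```python
# Minimal attack-tree sketch: define the root (target), attach intermediate and
# leaf nodes, then enumerate root-to-leaf paths as candidate attack scenarios.
class ATNode:
    def __init__(self, name, risk=0.0, children=None):
        self.name = name
        self.risk = risk          # e.g., a CVSS-derived score for leaf techniques
        self.children = children or []

def attack_paths(node, prefix=()):
    """Enumerate root-to-leaf paths; each path is one candidate attack scenario."""
    path = prefix + (node.name,)
    if not node.children:
        yield path
    for child in node.children:
        yield from attack_paths(child, path)

# Illustrative tree for an unmanned-vehicle deactivation objective.
root = ATNode("Disable unmanned vehicle", children=[
    ATNode("Compromise ECU", children=[ATNode("Firmware modification", risk=8.1)]),
    ATNode("Disrupt command link", children=[ATNode("DoS on receiver", risk=7.5)]),
])

for p in attack_paths(root):
    print(" -> ".join(p))
```

The enumerated paths are then prioritized using the NIST 800-53, ATT&CK ID, CIA, and CVE criteria listed above.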
The defense scenario begins by defining the defense objectives and establishing a root node. Defense techniques and strategies are drawn from analyzing attack TTPs. Subtrees and nodes are subsequently defined to formulate detailed defense strategies, which are combined to construct a complete DT. The correlation between MITRE ATT&CK mitigation techniques and defense technologies is analyzed to refine and categorize the defense methods. The scenario validity is evaluated using frameworks such as NIST SP 800-53, followed by expert reviews and additional refinements, if necessary. This process enhances the cybersecurity capabilities of weapon systems.
This phase acts as a preliminary cybersecurity capability-testing procedure for the modified MRLA evaluation and verifies the appropriateness of the attack and defense scenarios. This is a required step before the MRLA evaluation and involves the construction of an AT/DT based on TTPs. Vulnerabilities are assessed, and priorities are determined accordingly. Finally, defense strategies are developed using the visualization method for attack scenarios outlined in the Appendix A. The results are analyzed to evaluate compliance with security requirements and provide recommendations for improvement, if necessary. This step-by-step application of ATT&CK ensures that threat identification, defense mapping and validation are consistently aligned with the MRLA process. This enables the evaluation to capture manufacturing maturity and cybersecurity resilience without altering the structural flow of existing MRLA assessments.

3.2.4. System Engineering with the Modified MRLA: AT/DT-Based Cybersecurity Capability Test

As previously mentioned, the system MRLA evaluation is conducted primarily to ensure the manufacturability of the weapon system. The existing 64 evaluation items are assessed. In addition to addressing the evolving operational environment of weapon systems, new items to evaluate cybersecurity capabilities have been included. The proportion of these additional items relative to existing items that ensure manufacturability must be established during a preliminary review. Furthermore, the AT/DT-based Cybersecurity Capability Test involves a dedicated hardware/software red team from the Ministry of Defense to carry out controlled penetration testing of physical and logical components. This ensures that any latent weaknesses are identified and mitigated before full-scale manufacturing begins. The considerations for cybersecurity capability evaluations based on AT/DT are presented in Appendix B.

4. Case Study

This section presents a case study that integrates the construction and validation of AT/DT-based exploratory test scenarios after the TRR, as well as the MRLA process, following the procedures presented in Figure 2. The following methods, based on the mapping between attack and defense scenarios, were used to construct the attack/defense scenarios for exploratory testing.
When conducting exploratory testing through case studies, it is essential to structure attacks and defense scenarios to align with each stage concurrently. Using the “attack/defense tree for the intentional deactivation of unmanned vehicles based on malicious behavior” as an example, the analysis proceeds step by step, according to the AT/DT criteria.
A defense company must identify realistic and feasible attack and defense scenarios within a weapon system and analyze the corresponding security threats and mitigation technologies at each stage. Based on this analysis, the AT/DT was constructed, and an exploratory testing plan using the AT/DT framework was proposed. The defense scenarios can be summarized step-by-step as follows.

4.1. Identifying Cybersecurity Requirements for AT/DT

At this stage, attack and defense objectives are defined. This process involves setting the root node of the AT/DT scenario and establishing defense objectives for the remote control and communication of an unmanned vehicle. Security threats and technologies of the unmanned vehicle, along with its corresponding security risks, are identified.
The evaluation method in Table 2 was applied during the SRR phase to analyze the security vulnerabilities of the system accurately. This method is used to assess whether the potential security threats to each component of an unmanned vehicle have been identified to enable an effective response to a realistic attack. Additionally, the method was used during the SSR phase to design security controls and defense strategies for unmanned vehicles effectively. This method was used to assess whether the necessary defense elements for mitigation, as outlined in the MITRE ATT&CK, were identified from a defense perspective.
The evaluation items in Table 2 were derived from the example in Table 3, and the details are as follows. Cybersecurity vulnerabilities can arise in electronic control unit (ECU) components of a weapon system, command receivers within a communication system, and interfaces within the backend infrastructure. First, the MITRE ATT&CK framework is used to map ECU vulnerabilities and attack paths, and it is subsequently determined whether associated security technologies (technique IDs) can be identified. Next, the technologies related to network protocol breaches are recognized for the command receiver. In the TRR phase, it is essential to identify the elements required to establish effective technology-based security measures, which are as follows.
First, the MITRE ATT&CK framework serves as a knowledge base for systematically analyzing attacker TTPs. Using this framework, security vulnerabilities can be accurately analyzed and mapped to realistic attack scenarios, such as the security vulnerabilities of an ECU, for security testing. This process identifies and evaluates potential threats to each component of an unmanned vehicle.
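This per-component identification step can be sketched as a simple completeness check: every identified threat should carry an ATT&CK technique mapping before the TRR phase. The component and threat entries below are illustrative examples, not the paper's full list (T1542, Pre-OS Boot, is a real ATT&CK technique):

```python
# Sketch of a Table 2-style check: flag component threats that still lack a
# MITRE ATT&CK technique mapping, so gaps are resolved before the TRR phase.
component_threats = {
    "ECU":              {"firmware modification": "T1542"},   # Pre-OS Boot
    "command receiver": {"network protocol breach": None},    # not yet mapped
}

def unmapped(threats_by_component):
    """List (component, threat) pairs still lacking an ATT&CK technique ID."""
    return [(comp, threat)
            for comp, mapping in threats_by_component.items()
            for threat, technique in mapping.items()
            if technique is None]

print(unmapped(component_threats))  # gaps to close before testing
```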

4.2. Procedure for an AT/DT Configuration-Based Design Plan

This stage analyzes attack TTPs and derives the corresponding defense technologies, as outlined in Table 4.
First, attack TTPs must be analyzed to identify appropriate defense technologies and strategies. Subsequently, a DT subtree is constructed. Similarly, the construction of a DT involves combining derived subtrees to design a comprehensive DT to enable a systematic framework for defense logic against attacks. This process includes analyzing the correlations between the mitigation and defense techniques of MITRE ATT&CK to refine and detail defense methods and strategies, and its application results in well-defined defense measures. Moreover, the DT was evaluated and refined as necessary using the NIST SP 800-53, ATT&CK IDs, and MITRE D3FEND Control IDs.
The next stage maps CVEs to the confidentiality (C), integrity (I), and availability (A) of the AT, as well as reviewing their relevance to the OS. Table 5, similar to Table 4, presents the final steps of this study. The tables are combined to provide a comprehensive view, and the related CVEs for each OS are identified and defined accordingly. The following section outlines the defense paths for the described attack scenarios, including the results of detailed technique analysis.
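The CVE-to-CIA annotation step can be sketched as follows. The impact flags are illustrative, and the CVE identifiers are deliberately left as placeholders rather than real entries:

```python
# Sketch: annotate each attack-tree leaf with the CIA attributes it impacts and
# OS-specific CVE identifiers. CVE IDs below are placeholders, not real CVEs.
at_leaf_annotations = {
    "Firmware modification": {
        "impacts": {"C": False, "I": True, "A": True},
        "cves": {"embedded-linux": ["CVE-XXXX-0001"]},  # placeholder ID
    },
    "DoS on receiver": {
        "impacts": {"C": False, "I": False, "A": True},
        "cves": {"rtos": ["CVE-XXXX-0002"]},            # placeholder ID
    },
}

def leaves_impacting(annotations, attribute):
    """Select attack-tree leaves that affect a given CIA attribute ('C'/'I'/'A')."""
    return sorted(name for name, a in annotations.items() if a["impacts"][attribute])

print(leaves_impacting(at_leaf_annotations, "A"))
```

Grouping leaves by impacted attribute in this way supports the C/I/A prioritization verified during the PDR/CDR evaluation.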
The generalized evaluation metrics of the DT (mapped to the MITRE D3FEND framework) are listed in Table 6.
The final step is presented in Table 7, where the defense tree is restructured into a detailed tabular format. The defense logic is completed by mapping mitigations to the techniques corresponding to each tactic in MITRE ATT&CK. In addition, as shown in Table 6, the mapping between MITRE D3FEND and NIST SP 800-53 Rev. 5 is presented in tabular form.
Finally, as shown in Table 7, mapping MITRE D3FEND and NIST SP 800-53 Rev. 5 within the DT completes the defense measures and their evaluation against attacks. NIST recently mapped MITRE D3FEND to NIST SP 800-53 Rev. 5 and conducted a detailed analysis; this mapping is therefore continuously tracked and updated.
The evaluation method in Table 8 is used during the PDR/CDR phases to assess whether the relevant security threats have been adequately identified by mapping the attack tree to unmanned-vehicle vulnerabilities. For example, the security controls identified in the SSR phase are mapped to the MITRE ATT&CK framework and assessed for reasonableness using the proposed AT/DT model. During this process, the priority order of C, I, and A is verified. The method also evaluates whether appropriate mitigation and defense strategies have been established for the DT components based on the MITRE D3FEND framework.
Therefore, this design procedure establishes a systematic security architecture centered on AT/DT by quantitatively analyzing threats and ensuring alignment with technical controls. The key insights from this case study are as follows.
  • Attack and defense tree designs confirmed that component-specific attack scenarios and corresponding defense strategies can be systematically derived.
  • Mapping across MITRE ATT&CK, D3FEND, and NIST SP 800-53 Rev. 5 demonstrates the feasibility of aligning security requirements with applicable controls.
  • This case illustrates the practical potential of establishing a security design process that spans from vulnerability-based threat analysis to the evaluation of mitigation measures.

4.3. Review of the AT/DT-Based Cyber Test Scenario Configuration

At this stage, the DT corresponding to the previous attack tree is analyzed. First, the defense logic counters attacks involving physical access and social engineering techniques that target remote operational support. Figure 4 displays the DT created by analyzing an AT under the assumption of a cyberattack on an unmanned vehicle.
For clarity, the structure of the defense tree in Figure 4 can be interpreted in six layers:
  • LAYER 1—Entry Points: Represents initial access attempts through physical access or social engineering targeting remote operational units.
  • LAYER 2—Threat Categorization: Groups potential threats, including privilege escalation, malware injection, and command hijacking, into distinct types.
  • LAYER 3—Response Strategy Mapping: Associates each threat category with defensive actions defined in the MITRE D3FEND framework.
  • LAYER 4—Policy and Control Alignment: Aligns defense techniques with NIST SP 800-53 Rev. 5 security controls to ensure policy compliance.
  • LAYER 5—Implementation Logic: Converts each mapped defense strategy into specific technical implementations (e.g., multi-factor authentication, access restriction, file analysis) to form connected defense paths.
  • LAYER 6—Effectiveness Evaluation: Assesses defense performance using metrics such as detection, response, and recovery times, and incorporates feedback for improvement.
This configuration is not limited to a single scenario; multiple scenarios can be constructed in various forms. Specifically, the defense scenarios are built on common attack scenarios. Because these scenarios rest on an understanding of the attack tactics against the system and a classification of suitable attack techniques, they are intuitive and valid. Each attack scenario must set clear objectives and specific steps that incorporate diverse techniques to form connected paths; this process also checks whether the scenario is mapped to the MITRE ATT&CK framework and the NIST SP 800-53 security controls, as well as whether it addresses the latest CVEs and RTOS vulnerabilities. Likewise, each defense scenario must set clear objectives and steps that employ diverse techniques to create logically connected paths and must align with MITRE D3FEND. Moreover, evaluating the effectiveness of the defense techniques and verifying their proper mapping to the NIST SP 800-53 security controls is crucial. This evaluation method is applied during the TRR phase.
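The mapping checks described above can be approximated programmatically. The sketch below assumes hypothetical lookup tables (`ATTACK_TO_DEFENSE`, `DEFENSE_TO_CONTROL`); in practice these would be populated from the published ATT&CK-to-D3FEND and D3FEND-to-SP 800-53 mappings. T1078 (Valid Accounts) and T1204 (User Execution) are real ATT&CK technique IDs; T9999 is a deliberately unmapped placeholder.

```python
# Hypothetical mapping tables; a real evaluation would draw on the published
# ATT&CK -> D3FEND and D3FEND -> NIST SP 800-53 mappings.
ATTACK_TO_DEFENSE = {"T1078": ["D3-MFA"], "T1204": ["D3-FA"]}
DEFENSE_TO_CONTROL = {"D3-MFA": ["IA-2"], "D3-FA": ["SI-3"]}

def check_scenario(steps):
    """Verify every attack step in a scenario has a connected defense path:
    ATT&CK technique -> D3FEND technique -> NIST SP 800-53 control."""
    gaps = []
    for technique in steps:
        defenses = ATTACK_TO_DEFENSE.get(technique, [])
        controls = [c for d in defenses for c in DEFENSE_TO_CONTROL.get(d, [])]
        if not defenses or not controls:
            gaps.append(technique)  # no logically connected path exists
    return gaps

scenario = ["T1078", "T1204", "T9999"]   # T9999: deliberately unmapped step
print(check_scenario(scenario))          # ['T9999']
```

Any step returned by `check_scenario` marks a break in the scenario's defense path that must be resolved before the TRR-phase evaluation.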
Therefore, this procedure demonstrates the practical applicability of scenario-based security capability evaluation and establishes a foundation for a systematic test configuration. The key insights from this case study are as follows.
  • Attack and defense scenarios were constructed with clearly defined objectives and procedures, and their validity was verified through alignment with the MITRE ATT&CK and D3FEND frameworks as well as the NIST SP 800-53 security controls.
  • Incorporating the latest CVEs and RTOS vulnerabilities, the scenarios enhanced practical relevance and demonstrated an ability to construct logically connected paths with multiple attack vectors and corresponding defense strategies.
  • This case highlights the applicability of cyber-test configurations based on attack–defense scenarios, supporting verification of security capabilities during the TRR phase.
Although expert consensus was used to review and refine the attack and defense scenarios, the process was grounded in quantitative references. Each scenario was mapped to the relevant MITRE ATT&CK techniques, D3FEND mitigations, CWE entries, CVE identifiers, and NIST SP 800-series security controls. This mapping provided measurable indicators, such as severity ratings, exploit likelihood scores, and mitigation coverage percentages. Expert assessments were then employed to interpret these indicators within the operational context of the evaluated system, ensuring that subjective judgment was supported and constrained by standardized, verifiable data. This hybrid approach enabled a structured, repeatable, and semi-quantitative evaluation process, reducing potential bias and enhancing the credibility of scenario validation.
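For illustration, two of the measurable indicators mentioned above (mitigation coverage percentage and mean CVE severity) could be computed as follows. The function names, technique IDs, and scores are hypothetical examples, not data from the case study, although T1566 (Phishing) and T1486 (Data Encrypted for Impact) are real ATT&CK identifiers.

```python
def mitigation_coverage(attack_steps, mitigated_steps):
    """Percentage of scenario attack steps with a verified mitigation."""
    covered = set(mitigated_steps) & set(attack_steps)
    return 100.0 * len(covered) / len(attack_steps)

def mean_severity(cvss_scores):
    """Average CVSS base score over the CVEs mapped into a scenario."""
    return sum(cvss_scores) / len(cvss_scores)

# Illustrative scenario: four attack steps, three with verified mitigations.
steps = ["T1078", "T1204", "T1566", "T1486"]
print(mitigation_coverage(steps, ["T1078", "T1204", "T1566"]))  # 75.0
print(round(mean_severity([9.8, 7.5, 6.1]), 1))                 # 7.8
```

Indicators of this kind give the expert panel a numeric baseline against which their contextual judgments can be checked.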

4.4. System Engineering with the Modified MRLA: AT/DT-Based Cybersecurity Capability Test

In this phase, both the MRLA preliminary review and the MRLA itself are performed. During the preliminary review, the details in Table 9 are examined, including the defense success rate, detection and response times, system resilience, effectiveness of security controls, appropriateness of risk response plans, and compliance with security policies. Once evaluation items such as those in Table 10 are confirmed, the following evaluation is performed. As part of the validation process, red-team exercises replicate realistic adversarial conditions within operational constraints, providing high-confidence evidence that the implemented defenses are effective while also addressing residual vulnerabilities prior to deployment.
Regarding the attack–defense rate, the five AT/DT scenarios mitigated more than 60% of attacks, with an average detection time of 30 s and an average response time of 45 s. Because these values satisfy the weapon system development requirements, item No. 65 in Table 11 passed. According to Table 10 and Table 11, the scenario-based evaluation produced an average recovery time of 3 min; however, because errors were discovered during testing, item No. 66 in Table 11 failed. Defense testing revealed that adjustments had been made to specific regulations, but only to add new wording reflecting new technologies; therefore, item No. 67 in Table 11 passed.
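The pass/fail logic for item No. 65 can be sketched as a simple threshold check. Only the 60% mitigation floor is stated above; the detection- and response-time limits shown as defaults below are assumed placeholders, since the actual requirement values are program-specific.

```python
def evaluate_item_65(mitigation_rate, detect_s, respond_s,
                     min_rate=60.0, max_detect_s=60, max_respond_s=60):
    """Pass/fail check for MRLA item No. 65.

    min_rate reflects the 60% mitigation requirement; the time limits
    are illustrative placeholders, not published thresholds.
    """
    return (mitigation_rate >= min_rate
            and detect_s <= max_detect_s
            and respond_s <= max_respond_s)

# Averages reported for the five AT/DT scenarios in the case study:
print(evaluate_item_65(mitigation_rate=62.0, detect_s=30, respond_s=45))  # True
```

Items No. 66 and 67 would follow the same pattern with resilience and policy-compliance criteria substituted for the rate and timing thresholds.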
Therefore, the final decision on the MRLA results of the unmanned vehicle was made by considering the evaluation results of 64 items regarding producibility and three items regarding cybersecurity capabilities.
Thus, the modified MRLA procedure integrates AT/DT-based testing to provide a comprehensive framework for both quantitative and qualitative evaluations of the security capabilities. The key insights from this case study are as follows.
  • The case study confirmed that realistic threat identification and the establishment of defense strategies for unmanned vehicle components are achievable through the application of the MITRE ATT&CK and D3FEND frameworks.
  • The cybersecurity evaluation items were systematically defined for each phase, demonstrating the applicability of threat-based security designs throughout the lifecycle.
  • This case illustrates the practical validity of cybersecurity requirements identification by linking theoretical frameworks with implementable technical security controls.
Although this work’s case study focused on a single unmanned ground vehicle, the underlying MRLA–AT/DT methodology is inherently quantitative. Leveraging the MITRE ATT&CK and D3FEND frameworks enables each identified TTP to be systematically mapped to CWE entries, CVE records and relevant NIST SP 800-series controls. This produces objective and measurable evaluation items. These items include vulnerability severity scores (e.g., CVSS ratings), mitigation coverage percentages and compliance rates with established security control baselines, among others. Aggregating these quantitative indicators enables reproducible cybersecurity readiness assessments within the MRLA process, ensuring threat detection and mitigation effectiveness can be verified numerically. While this paper presents the methodology through a specific platform example, the framework can be expanded to diverse weapon systems where larger datasets will support comprehensive, cross-platform statistical validation in future work.
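One simple way to aggregate such indicators into a single readiness figure is a weighted average, sketched below. The indicator names, values, and weights are illustrative assumptions, not values prescribed by the methodology.

```python
def readiness_score(indicators, weights):
    """Weighted aggregate of normalized cybersecurity indicators (0-100)."""
    total_w = sum(weights.values())
    return sum(indicators[k] * w for k, w in weights.items()) / total_w

# Hypothetical normalized indicators for one platform (all on a 0-100 scale).
indicators = {"mitigation_coverage": 75.0,   # % of TTPs with verified defenses
              "control_compliance": 90.0,    # % of mapped SP 800-53 controls met
              "severity_margin": 60.0}       # inverse-normalized CVSS exposure
weights = {"mitigation_coverage": 0.5, "control_compliance": 0.3,
           "severity_margin": 0.2}
print(round(readiness_score(indicators, weights), 1))  # 76.5
```

In a cross-platform study, scores of this form would give the comparable, reproducible measurements that the statistical validation described above requires.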
Beyond the primary case study involving an unmanned ground vehicle, the proposed MRLA–AT/DT methodology has several inherent characteristics that allow it to be adapted to various other weapon systems. Firstly, the threat and defense mapping process, which is based on the MITRE ATT&CK and D3FEND frameworks and aligned with the controls in NIST SP 800-53 Rev. 5, is platform-independent. It can therefore be applied to common subsystems, such as communication interfaces, control software and backend infrastructure, across multiple domains. Secondly, this process can be standardized and integrated into MRLA stages—from SRR through to TRR—regardless of platform type, requiring only the selection of system-specific TTPs and vulnerabilities. Thirdly, the defense strategies derived in this study target modules commonly found in weapon systems, enabling direct reuse in contexts such as unmanned surface vehicles, airborne platforms and sensor-based surveillance systems. Finally, the performance improvement metrics observed in the case study, including reduced detection time, an increased mitigation rate and fewer exploitable vulnerabilities, are derived from a closed-loop evaluation process and are thus likely to be reproducible on other platforms. Together, these attributes demonstrate the generalizability and practical applicability of the proposed methodology beyond the scope of the initial case study.

5. Conclusions

This study proposes an evaluation methodology for the quality assurance of cybersecurity capabilities in countries where weapon systems are developed under government leadership. The MRLA technique is applied directly prior to full-rate production to assess the overall producibility of the system, including the quality of its cybersecurity capabilities. In South Korea, passing the MRLA is mandatory for the transition to full-rate production, as it serves as a critical control point for system development; this phase is also the final evaluation before the weapon system is deployed. Although it was developed within South Korea's defense acquisition process, the MRLA-AT/DT methodology is based on global standards such as MITRE ATT&CK, D3FEND, CWE, CVE, and the NIST SP 800 series, which makes it adaptable to various national systems. Thanks to its modular design, the core MRLA evaluation process remains unchanged, with only the compliance mapping tables and threat databases adjusted for local regulations and environments. This ensures its applicability to frameworks such as NATO STANAGs or U.S. Department of Defense (DoD) instructions with minimal modification. Because the existing MRLA technique focuses on the hardware characteristics of the system and therefore overlooks cybersecurity capabilities, this study introduces new evaluation items. The MITRE ATT&CK and D3FEND frameworks are used to identify security requirements and effectively assess security vulnerabilities in the early stages of weapon system development, enabling the derivation of security threats and corresponding countermeasures for each software component of the weapon system. Exploratory testing is structured around a comprehensive defense tree covering various attack scenarios, with a focus on evaluating the effectiveness of security controls alongside the resilience and response capabilities of the system.
In summary, the MRLA results serve as important data for assessing the overall appropriateness of the security responses and cybersecurity capabilities of weapon systems. This study demonstrates the validity of the exploratory testing and evaluation process designed to prepare for the various security threats encountered during field operations. The results suggest that ongoing enhancement and investigation are needed to strengthen the security of weapon systems and to develop comprehensive responses to cyberthreats.
The proposed MRLA–AT/DT methodology was designed with integration into large-scale defense acquisition programs in mind. Its reliance on internationally recognized standards, such as the MITRE ATT&CK framework, the D3FEND framework, the NIST SP 800-series of controls and the MIL-STD manufacturing evaluation procedures, ensures compatibility with existing bureaucratic frameworks, including those managed by the Defense Acquisition Program Administration (DAPA) in South Korea. To facilitate adoption, the methodology can be embedded into current MRLA evaluation checkpoints without introducing disruptive procedural changes. Resource allocation can be optimized through phased implementation, beginning with high-priority subsystems, while personnel training can be incorporated into existing cybersecurity and manufacturing readiness certification programs. Additionally, potential resistance to process changes can be mitigated by demonstrating early successes through pilot applications and by providing clear evidence of the cost–benefit to stakeholders. This approach ensures that the proposed methodology is technically, administratively, and operationally robust and feasible on a national scale.
Future research should extend AT/DT-based exploratory test scenarios to various weapon systems to verify their versatility. Additionally, future research should focus on introducing new cyberthreat analysis tools to enhance the diversity of evaluations, including the construction of live simulation environments to test actual attack and defense scenarios. Such efforts should aim to strengthen the effectiveness of security response strategies while ensuring the appropriateness of security controls by aligning them with international standards.
From the MRLA perspective, the evaluation criteria should be continually refined, and integration with AT/DT-based scenarios should be used to enhance the effectiveness of security evaluations. During the MRLA process, updating indicators based on the emerging cyberthreats and attack techniques is crucial, as is continuously exploring improvements to strengthen system resilience and the effectiveness of security controls. Furthermore, tackling legal and ethical issues as well as developing strategies to respond to changes in regulations and policies will help fulfill social responsibilities related to technological advancement. These research directions will contribute to the systematic improvement of the security and resilience of weapon systems.

Author Contributions

Conceptualization, S.-I.S. and D.K.; methodology, D.K.; validation, S.-I.S. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by Kyonggi University Research Grant 2023.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
MRA	Manufacturing Readiness Assessment
MRLA	Manufacturing Readiness Level Assessment
AT/DT	Attack/Defense Trees
MCA	Major Capability Acquisition
MRL	Manufacturing Readiness Level
R&D	Research and Development
ATT&CK	Adversarial Tactics, Techniques, and Common Knowledge
ICS	Industrial Control Systems
CPS	Cyber-Physical Systems
D3FEND	Defensive Cybersecurity
TTP	Tactics, Techniques, and Procedures
SRR	System Requirements Review
SFR	System Functional Review
SSR	Software Specification Review
PDR	Preliminary Design Review
CDR	Critical Design Review
TRR	Test Readiness Review
DT/OT	Design/Operation Tests
DoS	Denial of Service
CVE	Common Vulnerabilities and Exposures
CWE	Common Weakness Enumeration
RTOS	Real-Time Operating System
ECU	Electronic Control Unit
OS	Operating System

Appendix A

The composition strategy of the attack/defense tree (AT/DT) is as follows. The attack tree, initially introduced by Schneier, has been utilized as a vulnerability-analysis tool in projects such as Network on Wheels and E-safety Vehicle Intrusion-Protected Applications (EVITA) [17]. EVITA, an FP7 project of the European Union, focused on automotive cybersecurity and applied attack tree-based threat modeling and defense structuring to vehicle ECUs. It thus serves as a representative case demonstrating the practical applicability and structural effectiveness of the AT/DT methodology in real-world automotive systems.
In the context of AT/DT modeling, an attack tree defines the “attack goal” as the root node. It systematically expands into sub-goals in a hierarchical structure to achieve the “attack goal.” By contrast, the defense tree comprises cybersecurity strategies, countermeasures, and their interrelations. The D3FEND framework systematizes these defense strategies and integrates domain knowledge models to support effective responses.
Figure A1 presents a generalized example of an AT/DT configured with “AND/OR” logic structures. The connections between the nodes illustrate how attacks and defensive measures interact. The “AND” nodes require multiple conditions to be satisfied simultaneously to proceed, whereas the “OR” nodes allow progression if at least one condition is met. These properties define the key components of a tree. The top-level goal is denoted as the “attack/defense goal,” whereas asset-specific attacks and defenses are expressed via concrete methods. Diamond symbols represent undeveloped events requiring further analysis.
Figure A1. Scenario-based AT/DT tree.
The AT/DT mapping links ATT&CK techniques with D3FEND techniques. This mapping allows the attack events of the MITRE ATT&CK framework to correspond with the defensive methods of the MITRE D3FEND framework, facilitating research, analyses, and responses. Each ATT&CK ID is mapped to the relevant D3FEND techniques, providing a knowledge base from which the associated defense measures for mitigating attacks can be explored.
Overall, AT/DT is systematically structured with diverse conditions and techniques to support reliable security analysis through interconnected attacks and defenses. Attack trees assess severity and likelihood, whereas defense trees evaluate effectiveness and reliability.
Figure A1 shows the DT derived from analyzing an AT, assuming a cyberattack on an unmanned vehicle. To improve the clarity and interpretability of Figure A1, the key nodes used throughout the AT/DT structure are briefly described.
  • AG/DG (attack/defense goal): The root of the tree represents the ultimate goal of the attacker or objective to be defended.
  • AO/DO (attack/defense objective): Intermediate targets that must be reached to achieve the overall goal.
  • AM/DM (attack/defense method): Specific techniques used to accomplish the attack or defense objectives; these map to MITRE ATT&CK and D3FEND techniques.
  • AA1/AD1 and AA2/AD2 (actions): Concrete actions taken to execute the methods, allowing multiple branches within a scenario.
  • GT1–GT3 (goal trees): Subtrees representing separate or parallel attack–defense scenario flows.
These abbreviations are integral to the AT/DT hierarchy and clarify scenario logic through AND/OR relationships.
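The AND/OR semantics described above can be made concrete with a small bottom-up evaluator. This is a generic sketch, not the authors' tool; the dictionary-based tree encoding and the leaf names are hypothetical.

```python
def node_feasible(node, achieved):
    """Evaluate an AND/OR tree bottom-up.

    `node` is a dict: leaves are {"leaf": name}; internal nodes are
    {"gate": "AND"|"OR", "children": [...]}. `achieved` is the set of
    leaf actions the attacker (or defender) can actually realize.
    """
    if "leaf" in node:
        return node["leaf"] in achieved
    results = [node_feasible(c, achieved) for c in node["children"]]
    # AND gates need every child satisfied; OR gates need at least one.
    return all(results) if node["gate"] == "AND" else any(results)

# Illustrative goal: physical access AND (firmware flash OR CAN injection).
tree = {"gate": "AND", "children": [
    {"leaf": "physical_access"},
    {"gate": "OR", "children": [{"leaf": "flash_firmware"},
                                {"leaf": "inject_can"}]},
]}
print(node_feasible(tree, {"physical_access", "inject_can"}))   # True
print(node_feasible(tree, {"flash_firmware"}))                  # False
```

The same traversal applied to a defense tree identifies which combination of implemented countermeasures suffices to satisfy the defense goal.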

Appendix B

Cybersecurity capabilities are defined differently for each weapon system. The items in Table A1 can be either directly applied to MRLA evaluations or selectively combined and adjusted as required. In the preliminary review phase, the evaluation items were refined and finalized; through this process, the evaluation items for Category 10, cybersecurity capabilities (No. 65 and beyond), were established. Examples of the evaluation items are provided in Table A2. The items selected during the preliminary review, together with the previously established items, were used to perform the MRLA.
Table A1. AT/DT-based cybersecurity capability test item.
Category | Content | Evaluation Method
Defense rate | Evaluate how often attacks succeed during testing. | Calculate the attack success rate for each scenario and analyze the successful and unsuccessful attack paths to identify key vulnerabilities.
Detection and response time | Evaluate how long it takes for the security system to detect and respond to an attack. | Measure detection and response times by analyzing logs and alert messages from the security system after an attack scenario is executed.
System resiliency | Evaluate the ability of the system to return to a healthy state after an attack. | Evaluate how quickly the system returns to normal operations after the attack is over, and analyze problems in the recovery process.
Effectiveness of security controls | Evaluate how well existing security controls block attacks. | Observe the operation of security controls during the attack scenario and analyze the attack vectors that were effectively blocked versus those that failed.
Adequacy of risk response plans | Evaluate whether a risk response plan has been developed based on the test results. | Analyze and evaluate whether a response plan has been developed for the vulnerabilities discovered, the feasibility of the plan, and the status of implementation within the organization.
Compliance with security policies | Evaluate whether the defense testing process complies with existing security policies and regulations. | Review defensive testing scenarios and procedures for alignment with the security policies and applicable regulations of the organization, and adjust the test plan if necessary.
Table A2. MRLA items.
Category | # | MRL 8 Standard Metrics
1. Technical and industry capabilities | 1 | Are technical risks documented, and are plans and actions in place to reduce them?
| 2 | Have you identified single/monopoly/overseas sources of supply, assessed their reliability, and established alternatives?
| 3 | By monitoring and analyzing your suppliers, do you have a plan in place for any delays to the overall business timeline due to supplier issues?
| 4 | Have you developed a contingency plan in case of business delays or disruptions?
| 5 | Has the supplier analysis considered the production capacity of each company?
| 6 | Is there a domestic industrial base with experience in producing similar components, or are there plans to expand new production facilities to produce the components?
| 7 | Has the environmental impact of the manufacturing process been analyzed and a plan of action developed?
| 8 | Has the company analyzed the potential for joint development and co-production?
2. Design | 9 | Are production and manufacturing experts formally involved in shape control, and is this formalized and practiced?
| 10 | Have the individual product designs, including parts, components, and assemblies, been verified to meet the overall system design specification?
| 11 | Was the design completed on schedule?
| 12 | Have critical processes been identified and manufacturing process flowcharts defined?
| 13 | Has a technical acceptance criterion (acceptance criteria for checking physical/functional shape) been established and practiced?
| 14 | Has the key technology been demonstrated?
| 15 | Has the system development process verified that the key design characteristics can be achieved through the manufacturing process?
| 16 | Are geometry controls and geometry control history systematically managed and reflected in the technical data?
| 17 | Do the prime contractor and subcontractors have procedures and systems for shape tracking and control?
| 18 | Has the process design for manufacturing the product been completed and feasibility analyzed?
| 19 | Has the production process, including components produced by subcontractors, been stabilized and proven to be reliable?
| 20 | Are the requirements for firmware and software documented?
| 21 | Have all production and test equipment and facilities been designed and verified?
3. Expenses and funding | 22 | (For projects where the guidelines for scientific business management practice are applied) Has the forecast of production costs been reviewed and analyzed to meet the target cost?
| 23 | Are cost guidelines reflected in the design process, and are target values for design cost and manufacturing cost set?
| 24 | Are design cost targets for subsystems established and evaluated, and are costs traceable to the WBS?
| 25 | Does the prime contractor have cost-tracking procedures in place and cost-control metrics for subcontractors and suppliers?
4. Materials | 26 | Have the materials used been validated during the system development phase?
| 27 | Have low-risk substitutes been considered in case of shortages of applied raw materials and their impact on performance?
| 28 | Are discontinued parts analyzed and acted upon, and is a material acquisition plan in place for mass production?
| 29 | Are procedures for part standardization in place and operational?
| 30 | Has the need to use new materials been identified, and an action plan developed based on material cost, delivery time, production capacity, etc.?
| 31 | Has the department that manages subcontracts developed and implemented a vendor management plan to ensure smooth procurement of parts/materials?
| 32 | Has a bill of materials (M-BOM) been completed for production?
| 33 | Have the issues of storage and handling of materials been considered, and storage and handling procedures put in place?
| 34 | Has the environmental impact of materials been analyzed and an action plan developed?
| 35 | Has a material management procedure and system been established and operated, including a parts management plan?
| 36 | Have you completed an analysis of long lead time items and developed a plan to acquire long lead time items for the first production?
5. Process capabilities and process control | 37 | Is manufacturing technology acquisition for new and similar processes completed and validated?
| 38 | Are processes that require manufacturing process changes when transitioning from development to mass production identified and validated?
| 39 | Are process variables defined for key process control?
| 40 | Have factors that can cause schedule delays been identified and improved through simulation?
| 41 | Has software for simulating the production process been identified and demonstrated?
| 42 | Has yield data been collected and analyzed during system development?
| 43 | Have facilities and test equipment been validated through system development?
6. Quality | 44 | Are activities for continuous process and quality improvement being performed?
| 45 | Are procedures for quality improvement in place and operationalized?
| 46 | Are checkpoints defined in the manufacturing process?
| 47 | Has a quality management plan for suppliers been established and implemented during the system development phase?
| 48 | Is the quality assurance history documented and utilized in the system development phase?
| 49 | Are calibration/inspection procedures for measuring equipment in place and applied?
| 50 | Are procedures in place to incorporate failure information from the development phase into the process, and have they been operationalized during the system development phase?
| 51 | Are statistical process control (SPC) activities performed?
7. Workforce | 52 | Has the organization identified the manpower required to carry out the project and established a plan to expand it?
| 53 | Has the company identified the specialized technical personnel required, and has a plan to secure them been established?
| 54 | Is there a program for training personnel in production, quality, etc.?
8. Facilities | 55 | Has the production facility been verified?
| 56 | Has the production capacity of the facility been analyzed, and necessary measures taken to achieve the required production rate for mass production?
| 57 | Are activities being conducted to increase the capacity of existing facilities?
| 58 | Has the company identified whether government-owned facilities/equipment, etc., are needed for production and taken measures?
9. Manufacturing planning and scheduling | 59 | Has a production plan and management method been established and operated to enable low-rate initial production?
| 60 | Has the risk management plan been updated?
| 61 | Are all necessary resources specified in the production plan?
| 62 | Is a plan for improving the maintenance performance of manufacturing equipment established and managed?
| 63 | Has the need for special equipment or test equipment been analyzed?
| 64 | Is the development of manufacturing standards, technical documentation, and manufacturing documentation required for production complete?
10. Cybersecurity capabilities | 65 | Are the attack mitigation rate and the time to detect and respond to attacks adequate?
| 66 | Is the system resilient to attacks, and is the effectiveness of security controls adequate?
| 67 | Does the defense testing process comply with existing security policies and regulations?

References

  1. Yi, C.; Kim, Y. Security testing for naval ship combat system software. IEEE Access 2021, 9, 66839–66851.
  2. Oh, S.; Cho, S.; Seo, Y. Harnessing ICT-enabled warfare: A comprehensive review on South Korea's military meta power. IEEE Access 2024, 12, 46379–46400.
  3. Defense Acquisition Program Administration. Regulations on the Processing of Technology Readiness Level and Manufacturing Readiness Level Evaluations; Defense Acquisition Program Administration: Gwacheon-si, Republic of Korea, 2024.
  4. Department of Defense. Manufacturing Readiness Level (MRL) Deskbook; Department of Defense: Arlington County, VA, USA, 2022.
  5. Straub, J. Modeling attack, defense and threat trees and the Cyber Kill Chain, ATT&CK and STRIDE frameworks as blackboard architecture networks. In Proceedings of the 2020 IEEE International Conference on Smart Cloud, Washington, DC, USA, 6–8 November 2020.
  6. Abuabed, Z.; Alsadeh, A.; Taweel, A. STRIDE threat model-based framework for assessing the vulnerabilities of modern vehicles. Comput. Secur. 2023, 133, 103391.
  7. Naik, N.; Jenkins, P.; Grace, P.; Song, J. Comparing attack models for IT systems: Lockheed Martin's Cyber Kill Chain, MITRE ATT&CK Framework and Diamond Model. In Proceedings of the 2022 IEEE International Symposium on Systems Engineering (ISSE), Vienna, Austria, 24–26 October 2022.
  8. National Institute of Standards and Technology (NIST). Security and Privacy Controls for Information Systems and Organizations; SP 800-53, Rev. 5; NIST: Gaithersburg, MD, USA, 2020.
  9. Manocha, H.; Srivastava, A.; Verma, C.; Gupta, R.; Bansal, B. Security assessment rating framework for enterprises using MITRE ATT&CK matrix. arXiv 2021, arXiv:2108.06559.
  10. Chen, R.; Li, Z.; Han, W.; Zhang, J. A survey of attack techniques based on MITRE ATT&CK Enterprise Matrix. In Network Simulation and Evaluation (NSE 2023), Proceedings of the Second International Conference, NSE 2023, Shenzhen, China, 22–24 November 2023; Communications in Computer and Information Science; Springer: Singapore, 2023; Volume 2064.
  11. Khan, M.S.; Siddiqui, S.; Ferens, K. A cognitive and concurrent cyber kill chain model. In Computer and Network Security Essentials; Springer: Cham, Switzerland, 2017; pp. 585–602.
  12. Ahmed, A.; Gkioulos, V. Assessing cyber risks in cyber-physical systems using the MITRE ATT&CK Framework. ACM Trans. Priv. Secur. 2023, 26, 1–33.
  13. Svilicic, B.; Junzo, K.; Rooks, M.; Yano, Y. Maritime cyber risk management: An experimental ship assessment. J. Navig. 2019, 72, 1108–1120.
  14. AUTOSAR. Requirements on Secure Onboard Communication; AUTOSAR: Munich, Germany, 2019.
  15. AUTOSAR. Specification of Key Manager; AUTOSAR: Munich, Germany, 2019.
  16. AUTOSAR. Specification of Secure Onboard Communication AUTOSAR CP R19-11; AUTOSAR: Munich, Germany, 2019.
  17. EVITA Project. EU FP7 (2008–2011). Available online: https://www.evita-project.org (accessed on 21 July 2025).
Figure 1. MRLA using cybersecurity evaluation method of weapon systems.
Figure 2. Systems engineering considerations for each stage of cybersecurity capabilities.
Figure 3. Scenario-based AT/DT tree.
Figure 4. Defense tree for protecting against intentional deactivation of autonomous tactical vehicles.
Table 1. Definition of MRL in the US and South Korea.

| MRL | US | South Korea |
|---|---|---|
| 1–3 | Criteria address manufacturing maturity and risks, beginning with pre-systems acquisition | A basic level of analyzing manufacturing issues, manufacturing concepts, or feasibility to achieve the objectives of the weapon system development project, or verifying the manufacturing concept |
| 4 | Criteria continue through the selection of a solution | Preliminary research level, in which manufacturability is reviewed |
| 5–6 | Criteria address manufacturing maturation of the needed technologies through early prototypes of components or subsystems/systems, culminating in a preliminary design | Exploratory development level, involving review of manufacturing capabilities for prototype production and assessment of the manufacturability of critical technologies or components |
| 7 | Criteria continue by providing metrics for an increased capability to produce systems, subsystems, or components in a production-representative environment, leading to a critical design review | Completion stage of system development; a phase to confirm feasibility for initial production, conducting a manufacturing maturity assessment based on level 8 |
| 8 | Criteria encompass proving the manufacturing process, procedure, and techniques on the designated "pilot line" | — |
| 9 | Once a decision is made to begin LRIP, the focus is on meeting quality, throughput, and rate to enable the transition to FRP | Entry level for initial mass production |
| 10 | The final MRL measures aspects of lean practices and continuous improvement for systems in production | Mass production phase |
Table 2. Security threats and security technologies for unmanned vehicles.

| Area | Components | Security Threats | Security Risk | Security Technologies |
|---|---|---|---|---|
| Weapon system | ECU | ECU disable | C(H), I(M), A(H) | Hardware-based process isolation, restricted I/O ports |
| Weapon system | Immobilizer | DoS attack on immobilizer start permissions | C(M), I(M), A(H) | |
| Communication system | Control command receiver | Execution of unauthorized commands | C(H), I(H), A(M) | Signature-based detection, multi-factor authentication, and command verification |
| Communication system | Internal communication | DoS attack | C(M), I(M), A(H) | Behavior-based detection, IPS, and firewall rules |
| Backend infrastructure | Interface system | Malware injection | C(H), I(H), A(H) | Anomaly detection, memory analysis, and network traffic analysis |
Table 3. Evaluation criteria.

| Domain | Evaluation Criteria | Evaluation Methods |
|---|---|---|
| Weapon system | Has the security vulnerability of the ECU been identified? | Review whether it is possible to map ECU vulnerabilities and attack paths using the MITRE ATT&CK framework and generate related technical scenarios. |
| Weapon system | Has the latest technology related to ECU security vulnerabilities been identified? | Check whether it is possible to map the latest attack techniques using the MITRE ATT&CK framework. |
| Weapon system | Is it possible to establish security controls for the ECU? | Review whether there are applicable security control items for the ECU using the MITRE ATT&CK and D3FEND frameworks based on NIST SP 800-53 Rev. 5. |
| … | … | … |
| Communication system | Has the security vulnerability of the control command receiver been identified? | Check whether it is possible to map network protocol vulnerabilities using the MITRE ATT&CK framework. |
| Communication system | Does the control command receiver's security protocol respond to the latest threats? | Review whether the security technology related to protocol communication threats or cryptographic module application can be mapped using the MITRE ATT&CK framework. |
| Communication system | Is data integrity verification for the control command receiver possible? | Check whether it is possible to map measures for maintaining data integrity using the MITRE D3FEND framework. |
| … | … | … |
Table 4. Final evaluation metrics for 'intentional deactivation attack on autonomous tactical vehicles based on malicious behavior'.

Attack (ATT&CK) tree nodes, Levels 3–6:
- [AM1] Intrusion into Remote Operation Support Units (parent: A01/A02)
- [AM11] Obtaining Unauthorized Physical Access (parent: AM1)
- [AM21] Malware Injection (parent: AM12)
- [AA1] Key-based attack: illegal acquisition, modification, or destruction (parent: AM22)

Evaluation metrics, mapped based on NIST SP 800-53 Rev. 5/MITRE ATT&CK:

| Control ID | Control Name | Mapping Type | Technique ID | Priority Order (C, I, A) |
|---|---|---|---|---|
| AC-16 | Security and privacy attributes | mitigates | T1550.001 | C(H), I(L), A(L) |
| CA-7 | Continuous monitoring | mitigates | T1204.001/T1204.002/T1204.003 | C(H), A(M), I(L) |
| CA-8 | Penetration testing | mitigates | T1204.003 | C(H), A(M), I(L) |
| CM-2 | Baseline configuration | mitigates | T1204.001/T1204.002/T1204.003 | C(H), A(M), I(L) |
| RA-5 | Vulnerability monitoring and scanning | mitigates | T1204.003 | C(H), A(M), I(L) |
| … | … | … | … | … |
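The attack-tree rows of Table 4 can be represented programmatically when automating the evaluation. The sketch below is a hypothetical data structure (not the authors' implementation): each node keeps its parent links together with the NIST SP 800-53 controls, ATT&CK technique IDs, and C/I/A priorities mapped to it; the node IDs and values are taken from the table.

```python
from dataclasses import dataclass, field

@dataclass
class AttackNode:
    """One node of the attack tree with its framework mappings (sketch)."""
    node_id: str                                  # e.g., "AM1"
    name: str
    parents: list                                 # parent node IDs
    controls: list = field(default_factory=list)  # NIST SP 800-53 control IDs
    techniques: list = field(default_factory=list)  # MITRE ATT&CK technique IDs
    cia: dict = field(default_factory=dict)       # C/I/A priority, e.g., {"C": "H"}

# Nodes taken from Table 4.
root = AttackNode("AM1", "Intrusion into Remote Operation Support Units",
                  parents=["A01", "A02"])
leaf = AttackNode("AA1",
                  "Key-based attack: illegal acquisition, modification, or destruction",
                  parents=["AM22"],
                  controls=["AC-16"], techniques=["T1550.001"],
                  cia={"C": "H", "I": "L", "A": "L"})

print(leaf.controls[0], leaf.techniques[0])  # AC-16 T1550.001
```

Traversing such nodes from leaf to root reproduces the attack path that an evaluation scenario must cover.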
Table 5. OS-specific CVE vulnerability mapping for 'intentional deactivation attack on unmanned vehicles based on malicious behavior'.

Attack tree, Levels 3–6; evaluation metrics mapped based on NIST SP 800-53 Rev. 5/MITRE ATT&CK:

| Control ID | Technique ID | Priority Order (C, I, A) | Technique Name | General CVE Vulnerabilities | RTOS Vulnerabilities |
|---|---|---|---|---|---|
| AC-16 | T1550.001 | C(H), I(L), A(L) | Stealing application access tokens | CVE-2018-15801, CVE-2019-5625, CVE-2022-39222, CVE-2022-46382 | x |
| AC-2 | T1052 | C(H), I(M), A(L) | Application access tokens | CVE-2023-3497, CVE-2022-3312, CVE-2021-4122, CVE-2022-47578 | CVE-2021-36133 |
| CA-7 | T1204.001/T1204.002/T1204.003 | C(H), A(M), I(L) | Malicious link/malicious file/malicious image | CVE-2019-16009, CVE-2019-1838, CVE-2019-15287, CVE-2019-1772 | CVE-2021-31566, CVE-2022-35260, CVE-2017-1000100, CVE-2018-18439 |
| CA-8 | T1204.003 | C(H), A(M), I(L) | Malicious link/malicious file/malicious image | CVE-2016-8867, CVE-2021-35497, CVE-2022-23584, CVE-2022-20829 | CVE-2018-18439 |
| … | … | … | … | … | … |
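A lookup like Table 5 can be queried to decide whether a scenario needs RTOS-specific test cases. The following is a minimal sketch using two rows from the table; the assumption (labeled as such) is that an "x" in the RTOS column means no RTOS-specific CVE was identified.

```python
# Assumed interpretation: an empty list stands for the "x" entries in Table 5,
# i.e., no RTOS-specific CVE was identified for that technique.
cve_map = {
    "T1550.001": {"general": ["CVE-2018-15801", "CVE-2019-5625",
                              "CVE-2022-39222", "CVE-2022-46382"],
                  "rtos": []},
    "T1204.003": {"general": ["CVE-2016-8867", "CVE-2021-35497",
                              "CVE-2022-23584", "CVE-2022-20829"],
                  "rtos": ["CVE-2018-18439"]},
}

def rtos_exposed(technique_id):
    """True if the technique has at least one RTOS-specific CVE mapped to it."""
    return bool(cve_map.get(technique_id, {}).get("rtos"))

print(rtos_exposed("T1204.003"), rtos_exposed("T1550.001"))  # True False
```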
Table 6. MITRE D3FEND framework mapping for 'intentional deactivation attacks on unmanned vehicles based on malicious actions'.

Defense (D3FEND) tree nodes, Levels 3–6:
- [DM1] Defense against intrusion into remote operation support units (parent: D01/D02)
- [DM11] Credential hardening (obtaining unauthorized physical access) (parent: DM1)
- [DM12] Unauthorized privilege escalation (unauthorized remote control acquisition of remote operation support units) (parent: DM1)
- [DM23] File analysis (malware injection) (parent: DM12)

Evaluation metrics, mapped based on MITRE D3FEND:

| Control ID | Control Name | ATT&CK ID/ATT&CK Mitigation |
|---|---|---|
| D3-DTP | Policy Enforcement: Domain Trust Policy | M1015/Active Directory Configuration |
| D3-SPP | Policy Enforcement: Strong Password Policy | M1026/Privileged Account Management; M1027/Password Policies |
| D3-UAP | Policy Enforcement: User Account Privileges | M1015/Active Directory Configuration |
| D3-DA | File Analysis: Dynamic Analysis | M1026/Privileged Account Management; M1047/Audit; M1048/Application Isolation and Sandboxing |
| D3-FH | File Analysis: File Hash | M1049/Antivirus/Antimalware |
| … | … | … |
Table 7. MITRE D3FEND to NIST SP 800-53 Rev. 5 framework mapping for 'intentional deactivation attacks on unmanned vehicles based on malicious actions'.

Defense tree, Levels 3–6; evaluation metrics mapped from MITRE D3FEND to the NIST SP 800-53 Rev. 5 framework:

| Control ID | ATT&CK ID/ATT&CK Mitigation | Contents | NIST Control ID | NIST Control Name |
|---|---|---|---|---|
| D3-DTP | M1015/Active Directory Configuration | Modify domain configurations to limit trust between domains | AC-4 | Information flow enforcement |
| D3-UAP | M1015/Active Directory Configuration | Restrict resource access for user accounts | AC-2 | Account management |
| D3-MFA | M1032/Multi-factor Authentication | Require two or more pieces of evidence to authenticate users | AC-2, IA-2 | Account management; identification and authentication (organizational users) |
| D3-OTP | M1027/Password Policies | One-time passwords are valid for only one user authentication | — | — |
| … | … | … | … | … |
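The chain in Table 7 (D3FEND technique → ATT&CK mitigation → NIST SP 800-53 Rev. 5 control) can be resolved mechanically once the rows are tabulated. The sketch below uses the rows shown above; it is an illustrative query helper, not part of the published method.

```python
# Each row: (D3FEND ID, ATT&CK mitigation, NIST SP 800-53 Rev. 5 control IDs),
# copied from Table 7. D3-OTP has no direct control mapping in the table.
rows = [
    ("D3-DTP", "M1015", ["AC-4"]),
    ("D3-UAP", "M1015", ["AC-2"]),
    ("D3-MFA", "M1032", ["AC-2", "IA-2"]),
    ("D3-OTP", "M1027", []),
]

def controls_for(d3fend_id):
    """Collect the NIST control IDs reachable from one D3FEND technique."""
    return [c for d, _, cs in rows if d == d3fend_id for c in cs]

print(controls_for("D3-MFA"))  # ['AC-2', 'IA-2']
```

Note that the mitigation column alone is not a unique key: M1015 maps to AC-4 via D3-DTP but to AC-2 via D3-UAP, so the lookup must be keyed by the D3FEND ID.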
Table 8. Examples of inspection items.

| Domain | Evaluation Item | Evaluation Method |
|---|---|---|
| Offensive tree for autonomous vehicles | Does the offensive tree align with the MITRE ATT&CK framework and scenarios/TTPs? | Verify whether the offensive tree is aligned with scenarios and TTPs through MITRE ATT&CK mapping. |
| Offensive tree for autonomous vehicles | Is the offensive tree appropriately mapped to NIST SP 800-53? | Analyze the NIST SP 800-53 mapping results to assess whether the offensive tree is properly linked to the corresponding regulation. |
| Offensive tree for autonomous vehicles | Are the impacts on confidentiality (C), integrity (I), and availability (A) clearly reflected? | Check whether the C, I, A priority is properly reflected in the offensive scenarios. |
| Offensive tree for autonomous vehicles | Are vulnerabilities in the CVE and RTOS lists adequately reflected? | Compare the CVE database and RTOS vulnerability lists to determine whether the offensive scenarios reflect these vulnerabilities. |
| … | … | … |
| Defensive tree for autonomous vehicles | Does the defensive strategy align with the MITRE D3FEND framework? | Verify whether the defensive strategy aligns with MITRE D3FEND mapping. |
| Defensive tree for autonomous vehicles | Does the defensive strategy include effective mitigation measures? | Compare the defensive strategy against the NIST 800-53 criteria to assess the inclusion of effective mitigation measures. |
| Defensive tree for autonomous vehicles | Is the defensive strategy appropriately mapped to NIST SP 800-53? | Analyze the NIST SP 800-53 mapping results to assess whether the defensive strategy is properly linked to the corresponding regulation. |
| Defensive tree for autonomous vehicles | Does the defensive strategy adequately address vulnerabilities from CVE and RTOS lists? | Compare the CVE database and RTOS vulnerability lists to check whether the defensive strategy properly addresses these vulnerabilities. |
| … | … | … |
Table 9. Example of inspection items.

| Domain | Evaluation Item | Evaluation Method |
|---|---|---|
| Construction of attack scenarios | Specificity and clarity of attack objectives | Evaluate whether the final objectives of the attack scenario and the steps to achieve them are clear and specific. |
| Construction of attack scenarios | Diversity and complexity of attack techniques | Analyze the diversity and complexity of techniques used in the attack scenario to assess the realism of the scenario. |
| Construction of attack scenarios | Completeness and connectivity of the attack path | Evaluate whether each step of the attack is logically connected and whether the overall path is well constructed. |
| Construction of attack scenarios | Mapping to the MITRE ATT&CK framework | Review whether the attack scenario is properly mapped to the MITRE ATT&CK framework. |
| Construction of attack scenarios | Mapping to NIST SP 800-53 security controls | Verify whether each stage of the attack scenario is consistently mapped to the NIST SP 800-53 Rev. 5 security control items. |
| Construction of attack scenarios | Application of CVE and RTOS vulnerabilities | Assess whether the latest CVE and RTOS vulnerabilities are appropriately mapped within the attack scenario. |
| Construction of defense scenarios | Specificity and clarity of defense objectives | Evaluate whether the final objectives of the defense scenario and the steps to achieve them are clear and specific. |
| Construction of defense scenarios | Diversity and complexity of defense techniques | Assess the diversity and complexity of the defense techniques used in the scenario to determine the ability to respond to various attacks. |
| Construction of defense scenarios | Completeness and connectivity of the defense path | Evaluate whether each step of the defense scenario is logically connected and whether the overall path is well constructed. |
| Construction of defense scenarios | Mapping to the MITRE D3FEND framework | Review whether the defense scenario is properly mapped to the MITRE D3FEND framework. |
| Construction of defense scenarios | Effectiveness of mitigation techniques | Assess whether the mitigation techniques defined in the defense scenario operate effectively in real environments. |
| Construction of defense scenarios | Mapping to NIST SP 800-53 security controls | Verify whether each stage of the defense scenario is consistently mapped to the NIST SP 800-53 Rev. 5 security control items. |
Table 10. Standard evaluation items for MRA.

| Evaluation Item | Description | Evaluation Method | Evaluation Results |
|---|---|---|---|
| Attack success rate | Assesses how often an attack succeeds during defense testing. | Calculate the success rate for each attack scenario and analyze major vulnerabilities. | Out of five scenarios, three attacks succeeded (60% success rate). |
| Detection and response time | Evaluates the time it takes for the security system of the autonomous vehicle to detect and respond to an attack. | Measure the time by analyzing logs and alert messages from the security system after executing the attack scenario. | Average detection time: 30 s; average response time: 45 s. |
| System resilience | Assesses the ability of the autonomous vehicle to return to a normal state after an attack. | Measure system recovery time after the attack and analyze any issues in the recovery process. | Average recovery time: 3 min, with some errors occurring during the recovery process. |
| Effectiveness of security controls | Evaluates how well existing security controls block attacks. | Observe the operation of security controls during attack testing and analyze both successfully blocked and failed attacks. | 2 out of 5 attacks were blocked by security controls (40% blocking rate). |
| Vulnerability exposure | Assesses the number and severity of identified vulnerabilities during testing. | Analyze the final vulnerability results based on the severity and scores of CWE and CVSS mapped to NIST SP 800-53 Rev. 5. | 6 major vulnerabilities identified, 2 of which were rated as severe. |
| Appropriateness of risk response plan | Evaluates whether the risk response plan is appropriately established based on test results. | Review whether a response plan for vulnerabilities is established and assess its feasibility. | Response plan for vulnerabilities was established and rated as feasible. |
| Compliance with security policies | Evaluates whether the defense testing process complies with existing security policies and regulations. | Review whether the test scenarios and procedures conform to security policies. | Test conducted in accordance with security policies, with some adjustments needed for certain regulations. |
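The quantitative items in Table 10 reduce to simple ratios over the executed scenarios. A minimal sketch of that computation, using the example results reported in the table (5 scenarios, 3 successful attacks, 2 blocked by controls), follows; the function name is illustrative.

```python
def rate_percent(count, total):
    """Share of scenarios matching a condition, as a percentage."""
    return 100.0 * count / total

# Example results from Table 10.
n_scenarios = 5
attack_success_rate = rate_percent(3, n_scenarios)  # 60.0
blocking_rate = rate_percent(2, n_scenarios)        # 40.0

# Timing results (seconds) from the same example.
avg_detection_s, avg_response_s, avg_recovery_s = 30, 45, 180

print(f"attack success rate: {attack_success_rate:.0f}%")  # 60%
print(f"blocking rate: {blocking_rate:.0f}%")              # 40%
```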
Table 11. Results of MRLA.

| No. | Evaluation Item | Pass Status |
|---|---|---|
| 65 | Are the attack mitigation rate and the time to detect and respond to those attacks adequate? | [v] Yes / [ ] No |
| 66 | Is the system resilient to attacks, and is the effectiveness of security controls adequate? | [v] Yes / [ ] No |
| 67 | Does the defense testing process comply with existing security policies and regulations? | [v] Yes / [ ] No |
Sung, S.-I.; Kim, D. Manufacturing Readiness Assessment Technique for Defense Systems Development Using a Cybersecurity Evaluation Method. Systems 2025, 13, 738. https://doi.org/10.3390/systems13090738