Article

Integrating the CRA into the IoT Lifecycle: Challenges, Strategies, and Best Practices

by Miguel Ángel Ortega Velázquez 1,2,*, Iris Cuevas Martínez 2 and Antonio J. Jara 2

1 Department of Information and Communication Technologies, Universidad Politécnica de Cartagena (UPCT), 30202 Cartagena, Spain
2 Libelium Lab, 30562 Murcia, Spain
* Author to whom correspondence should be addressed.
Information 2025, 16(12), 1017; https://doi.org/10.3390/info16121017
Submission received: 9 October 2025 / Revised: 13 November 2025 / Accepted: 17 November 2025 / Published: 22 November 2025
(This article belongs to the Special Issue Cyber Security in IoT)

Abstract

The European Union's Cyber Resilience Act (CRA) introduces a complex set of binding lifecycle security obligations, presenting a significant compliance challenge for the Internet of Things (IoT) industry. This study addresses this challenge by developing a comprehensive CRA mapping framework specifically tailored to the IoT sector. Its first contribution is a detailed lifecycle-based checklist that translates the regulation's legal mandates into an actionable blueprint for manufacturers. Beyond the checklist itself, the paper contributes a transparent two-phase methodology. The first phase provides a structured pipeline to translate dense legal text (from CRA Articles 13–14 and its annexes) into atomic, testable engineering requirements. The second phase builds a quantitative rating tree using the Analytic Hierarchy Process (AHP) to weigh these requirements, providing a consistent and evidence-based scoring rubric. By synthesizing the complex regulatory landscape and the technical state of the art, this paper operationalizes the CRA's requirements for governance, secure design, vulnerability management, and conformity assessment. The framework is validated in the TRUEDATA case, yielding a weighted readiness score and a sensitivity analysis that underpin the reliability of the findings.

Graphical Abstract

1. Introduction

The European Union’s Cyber Resilience Act (Regulation (EU) 2024/2847, or the CRA) officially came into force on 10 December 2024 following its signing on 23 October and publication on 20 November of that year [1]. While its main provisions will become fully applicable from 11 December 2027, it is crucial to recognize that the regulation introduces a phased timeline; for instance, the mandatory reporting of security vulnerabilities will take effect as early as September 2026. The act establishes a baseline for cybersecurity for all products with digital elements (PDEs) placed on the market across the European Economic Area, with particular relevance for the Internet of Things (IoT). The intent is clear: security must be an intrinsic property of connected products throughout their entire lifecycle, not merely an afterthought.
The expansion of the IoT has been striking. Figures often cited—based on Statista's series—show around 11.7 billion IoT devices in 2020 and a projection of roughly 30.9 billion by 2025 [2]; more recent market trackers (e.g., IoT Analytics) revise the 2025 expectation downward to ∼27 billion [3]. This scale of deployment across domestic, industrial, and urban environments widens the attack surface, increases users' exposure to risk, and stretches the limits of existing security arrangements.
Before the CRA, regulation of cybersecurity in connected environments was anchored in instruments such as the Network and Information Systems Directive (NIS) (EU 2016/1148) [4], the General Data Protection Regulation (GDPR) (EU 2016/679) [5], and the EU Cybersecurity Act (Regulation (EU) 2019/881) [6]. While these laid important foundations—risk management, resilience, and trust services—they did not impose detailed enforceable obligations for security by design and by default, mandate secure automatic updates, or set strict timelines for reporting vulnerabilities and incidents. Closely related developments include the Radio Equipment Directive (RED) [7] and its cybersecurity delegated act [8], as well as the European Common Criteria-based certification scheme (EUCC) [9].
The CRA introduces a step change. It requires risk assessment from the earliest design stages, continuous vulnerability testing, secure automatic update mechanisms, the generation and maintenance of a machine-readable Software Bill of Materials (SBOM), and time-bound notifications of vulnerabilities and incidents to competent authorities and end-users. In consequence, practical guidance is needed that goes beyond legal exposition.
This study proposes a CRA compliance framework: a pragmatic methodology to guide IoT manufacturers from initial assessment to full compliance. To this end, this work makes three contributions. First, it introduces a transparent legal-to-engineering methodology that converts CRA provisions into atomic lifecycle-mapped requirements with end-to-end traceability. Second, it operationalizes a quantitative rating tree, deriving defensible domain weights via the AHP and verifying judgment consistency. Third, it provides a reproducible evidence rubric and aggregation formula that yield auditable readiness scores, demonstrated in the TRUEDATA case. Based on these contributions, this article pursues the following objectives:
  • To analyze the essential requirements of the Cyber Resilience Act (CRA), positioning the CRA within the wider European regulatory fabric (including NIS2, the Radio Equipment Directive—RED [7], the GDPR, and the EU Cybersecurity Act/EUCC) and highlighting what materially changes for IoT manufacturers.
  • To formalize and detail a two-phase methodology. Phase 1 consists of a systematic process for extracting, normalizing, and mapping legal requirements from the CRA onto the Development, Security, and Operations (DevSecOps) lifecycle. Phase 2 introduces a quantitative assessment model, using the Analytic Hierarchy Process (AHP), to derive consistent risk-informed weights for compliance domains and enable a structured evaluation of readiness.
  • To demonstrate practical feasibility through a site-agnostic case study in the TRUEDATA project, including a rating-tree evaluation and a balanced radar profile of control effectiveness.
  • To identify the principal technical and organizational challenges for manufacturers and to discuss the lessons learned, common pitfalls, and improvement pathways for organizations seeking readiness ahead of 2027 while outlining areas where further standardization and research are needed.
The article is organized as follows. Section 2 reviews the European regulatory framework. Section 3 surveys the literature and industrial practice. Section 4 presents the two-phase methodology for operationalizing the CRA. Section 5 sets out the practical implications and best practices. Section 6 reports the TRUEDATA case study and project-wide scoring. Section 7 discusses the open challenges and lessons learned. Section 8 concludes with recommendations.

2. Regulatory Framework

The Cyber Resilience Act (CRA)—Regulation (EU) 2024/2847—stands as the central pillar of the European Union’s new product-cybersecurity paradigm. Applicable from 11 December 2027, it establishes a horizontal framework for all products with digital elements (PDEs). Its core mandate is to ensure cybersecurity throughout the product lifecycle by imposing essential security requirements, vulnerability management and disclosure obligations, secure update mechanisms, and detailed technical documentation. A key mechanism is the presumption of conformity: products certified under specified EU schemes (such as the EUCC) may be deemed compliant with relevant CRA requirements [1,9].
To help navigate this complex legislative landscape, Figure 1 provides a visual timeline. It maps the key milestones of the CRA’s implementation against the broader context of related EU cybersecurity and product safety regulations.
To situate the CRA, it must be read within a multi-layered ecosystem of complementary regulations. NIS2—applicable from 18 October 2024—sets cybersecurity and incident-reporting rules for ‘essential’ and ‘important’ entities and is the demand-side counterpart to the CRA’s supply-side focus [10]. RED and its cybersecurity delegated act—applicable from 1 August 2025—activate essential requirements for radio equipment (including much of the IoT), mandating network protection, privacy safeguards, and fraud prevention; harmonized standards under the RED provide foundational technical building blocks [7,8,11]. The EU Cybersecurity Act creates the union’s certification framework under which the EUCC is the first horizontal scheme based on Common Criteria; the CRA leverages this framework for presumption of conformity [6,9]. The General Product Safety Regulation (GPSR)—applicable from 13 December 2024—provides a horizontal safety net for consumer products and reinforces market-surveillance mechanisms on which the CRA also relies [12].
Beyond this core, other horizontal and vertical instruments interlock with the CRA. The GDPR enforces privacy by design, complementing the CRA’s security-by-design mandate. The Data Act governs fair access to and use of data from connected products and presupposes secure access pathways. The Artificial Intelligence (AI) Act coordinates with the CRA to avoid duplication for high-risk AI systems, aligning robustness and cybersecurity requirements [5,13,14]. In digital trust and connectivity, electronic IDentification, Authentication and trust Services (eIDAS) 2 underpins trusted identities and services (e.g., signatures and seals), while the European Electronic Communications Code (EECC) sets security expectations for underlying communication networks [15,16]. Sectorally, the Digital Operational Resilience Act (DORA) mandates digital operational resilience in finance; the Critical Entities Resilience (CER) Directive ensures the resilience of critical entities; and both the Medical Devices Regulation (MDR) and the Machinery Regulation incorporate cybersecurity for their product domains, with the CRA providing a horizontal minimum [17,18,19,20].
This layered structure—summarized in Table 1—presents a comparative view of the principal EU instruments related to the CRA, detailing their scope, core focus, key obligations for the IoT, and regulatory approach. It clarifies the complementary roles of each instrument: horizontal regulations such as the CRA and GDPR establish universal baselines for product security and data protection; vertical instruments such as the RED target specific product categories with tailored requirements; and NIS2 adopts a sectoral perspective for operators of essential services. Certification under the Cybersecurity Act (e.g., EUCC) offers a formal route to demonstrate conformity. Within this architecture, the CRA is the first horizontal regulation to define detailed cybersecurity obligations for connected products in the EU market. The next section turns to the state of the art in cyber resilience for the IoT, spanning academic and industrial practice.

3. Related Work

The evolution of the IoT has generated a growing volume of heterogeneous devices operating under power, capacity, and connectivity constraints, which poses new cybersecurity challenges that the regulatory framework—and the CRA specifically—must address. The existing literature reveals a multi-faceted approach, ranging from deep technical analyses of firmware and software components to regulatory mappings, socio-technical studies, and initial industry experiences. This section synthesizes these diverse contributions to build a comprehensive picture of the current landscape, critically analyzes the most relevant technical standards and methodologies (ETSI EN 303 645 [21], IEC 62443 [22], NIST IR 8259 [23], Secure Software Development Framework (SSDF), and OWASP Software Assurance Maturity Model (SAMM)/Building Security In Maturity Model (BSIMM)), and contrasts them with the existing literature to identify the gap this paper aims to fill.
From a technical standpoint, a significant body of research has focused on the foundational challenges of firmware security, which is central to the CRA’s objectives. The work of Butun et al. [24] provides a comprehensive catalogue of over one hundred attacks, highlighting the breadth of the threat landscape. This is complemented by extensive surveys from Bakhshi et al. [25] and Haq et al. [26], which offer a holistic view of the firmware environment, covering architecture, extraction techniques, and vulnerability analysis frameworks. A key takeaway from these studies is the complexity manufacturers face in securing the core of their products. Delving deeper, Feng et al. [27] describe specific analysis methods like emulation, static analysis, and fuzzing, concluding that hybrid solutions are necessary to handle the diverse architectures of IoT devices—a conclusion that directly informs the technical feasibility of complying with the CRA’s mandate to ship products free of known exploitable vulnerabilities.
A second critical technical area is ensuring the integrity and transparency of software components, a cornerstone of the CRA through its mandate for a Software Bill of Materials (SBOM). The importance of an SBOM is explored from a business perspective by Kloeg et al. [28], who identify the stakeholder-specific risks and benefits that drive or inhibit adoption. Their finding that System Integrators and Software Vendors are the most likely drivers highlights the complex ecosystem dynamics that will shape CRA compliance. Nocera et al. [29] provide empirical evidence of this slow adoption, showing that many SBOMs in open-source projects are incomplete or non-compliant. This research grounds the CRA’s SBOM requirement in reality, showing that, while the mandate is clear, its practical and effective implementation remains a significant challenge.
Beyond the device itself, ensuring its integrity in a networked environment is crucial. The work by Ankergård et al. [30] and the comprehensive survey by Ambrosin et al. [31] are highly relevant. These studies review protocols designed to remotely verify the software integrity of single or entire networks of devices, offering potential technical solutions for manufacturers to meet the CRA’s implicit requirement for post-market monitoring and to ensure that devices remain secure throughout their lifecycle.
On the regulatory and socio-technical front, several studies have analyzed the CRA’s place within the existing legal framework. Ruohonen et al. [32] conduct a systematic cross-analysis, confirming the CRA’s role in filling specific gaps left by previous legislation. Shaffique [33] takes a more critical stance, underscoring the ambiguity of key regulatory terms like “limited attack surface.” He argues that, without harmonized technical specifications from the European Union Agency for Cybersecurity (ENISA), often based on standards such as ETSI EN 303 645 or IEC 62443, these concepts lack the necessary concreteness for effective implementation. This critique is echoed by Vescovi [34], who argues that regulatory effectiveness depends on creating bidirectional channels of trust with consumers, suggesting that mere compliance is insufficient without a focus on user experience. A critical and often overlooked issue is addressed by van ’t Schip [35], who investigates the problem of “orphan devices” when a manufacturer ceases operations. This work highlights a significant gap in the current legislation, including the CRA, which assumes manufacturer longevity for providing security updates, pointing to the need for solutions like interoperability and open-source software to ensure long-term device resilience.
Finally, a few studies provide insights from early industrial experiences. The work of Jara et al. [36] describes the TRUEDATA project, demonstrating the technical feasibility of meeting CRA-like requirements in industrial settings. In contrast, the survey by Risto et al. [37] reveals significant organizational challenges faced by heavy machinery manufacturers, including immature secure development processes and difficulties in producing SBOMs. This contrast illustrates the critical gap between what is technically possible and what is organizationally practical for many companies.
While the existing literature provides a solid foundation by exploring technical solutions, analyzing regulatory overlaps, and highlighting industrial challenges, a significant gap remains. The existing landscape of security standards and models, while comprehensive, is often fragmented by scope and focus. As summarized in Table 2, these standards provide vital distinct pieces of the compliance puzzle.
At the organizational level, ISO/IEC 27001:2022 is the pre-eminent standard for an Information Security Management System (ISMS). Its focus is holistic and process-driven, auditing the organization’s capacity to manage information security. However, its scope is a key limitation for the CRA: ISO 27001 is a high-level voluntary management standard, not a product-level engineering guide. It does not, by itself, provide a direct translation of the CRA’s specific legal articles (e.g., on SBOMs or vulnerability timelines) into verifiable technical requirements for a given product with digital elements (PDE).
At the product level, ETSI EN 303 645 has become a key baseline for consumer IoT security, defining what constitutes a secure product by specifying core provisions, such as the prohibition of universal default passwords and the need for a vulnerability disclosure policy [21]. While its principles are foundational, it remains a high-level list of goals, not a methodology for auditable integration throughout the DevSecOps lifecycle.
Other frameworks complement these: rigorous secure development lifecycle (SDL) frameworks, such as IEC 62443 and the NIST SSDF, provide the process for implementation, while organizational maturity models like OWASP SAMM and BSIMM offer the means to measure corporate capability. However, none of these, in isolation, offers a complete legally traceable methodology for operationalizing and quantifying compliance with the CRA’s specific binding obligations.
However, most studies focus on either a deep technical problem (like remote attestation or SBOM generation) or a high-level legal or industrial analysis. Few, if any, bridge this gap by formalizing a methodology for translating the CRA’s high-level legal obligations into a comprehensive, actionable, and lifecycle-based compliance model. This requires engaging with two specific academic domains: requirement engineering from legal texts and quantitative cybersecurity assessment.

3.1. Systematic Approaches to Regulatory Requirement Engineering

Translating legal regulations into verifiable software requirements is a well-established challenge in requirement engineering [39]. Legal texts are often characterized by ambiguity, complexity, and cross-references, making direct extraction an error-prone process [40]. To address this, researchers have proposed systematic methods for acquiring and specifying legal requirements. One of the most prominent is the Frame-Based Requirements Analysis Method (FBRAM), which uses a domain-independent upper ontology, a markup language, and a regulatory document model to systematically parse legal text into structured requirement frames [41]. This approach provides a high degree of rigor and traceability. Other research has explored the use of Natural Language Processing (NLP) and machine learning to semi-automate the extraction of key information, such as obligations, permissions, and prohibitions, from legal documents [42]. While these formal methods offer significant academic rigor, they can be heavyweight and require specialized expertise, potentially limiting their direct adoption in agile engineering environments. This paper proposes a more pragmatic, although less formalized, approach inspired by these principles. It uses a structured semantic annotation process tailored to the CRA, focusing on extracting actionable engineering tasks and mapping them directly onto a DevSecOps lifecycle, thus filling a niche between pure legal informatics and ad hoc industrial practice.

3.2. Quantitative Models for Cybersecurity Assessment

Once requirements are defined, assessing compliance requires a measurement framework. Traditional qualitative assessments, which categorize risk as ‘high’, ‘medium’, or ‘low’, are often criticized for their subjectivity and lack of a clear basis for cost–benefit analysis [43]. This has led to the development of quantitative risk assessment frameworks that assign numerical values to risk components to provide a more objective basis for decision-making [44]. A key challenge in quantitative assessment is determining the relative importance, or weight, of different security controls or compliance domains. Multi-Criteria Decision-Making (MCDM) methods are well-suited for this task. The Analytic Hierarchy Process (AHP), in particular, is a widely used technique in cybersecurity for deriving priorities and weights based on expert judgment [45]. The AHP uses pairwise comparisons to establish the relative importance of criteria and includes a mechanism to check the consistency of the judgments, thereby adding a layer of validation to the weighting process [46]. By grounding the assessment model in the AHP, this work moves beyond simple arbitrary weighting schemes and adopts a recognized academically validated method for decision support.
Beyond these established models, an emerging frontier of research integrates AI- and graph-based methods for modeling resilience and compliance. For instance, the work by Zhu et al. (2025) on AI-driven hypergraph neural networks, whilst applied to energy economics, demonstrates a methodology for modeling higher-order interactions in complex multi-factor systems [47]. This approach is conceptually relevant for modeling dynamic risk propagation in critical applications, such as the water or petrochemical infrastructures covered by the CRA’s scope. Similarly, the study by Cai (2025) on robust semantic noise for secure communication networks provides a model for enhancing data-level resilience [48]. This research on protecting the semantic integrity of transmissions, particularly for sensitive data, aligns with the CRA’s essential requirements for data confidentiality and integrity, informing future adaptive monitoring frameworks. These advanced modeling approaches represent a key future direction, one that requires a foundational legally traceable framework—such as the one proposed in this paper—to build upon.
This paper aims to fill this void by providing precisely that: a clear, detailed checklist that operationalizes the CRA’s requirements across the entire product lifecycle. It creates a unified actionable framework that synthesizes these disparate elements through the specific legal lens of the CRA. The proposed checklist does not seek to reinvent security principles but to provide a pragmatic, legally grounded, and lifecycle-oriented translation layer that bridges the critical gap between technical best practice and legal obligation. This contribution is designed to serve as a practical first-step guide for manufacturers navigating the complexities of the new regulatory landscape.

4. A Formal Methodology for Operationalizing the CRA

The entry into force of the CRA demands a structured and repeatable approach to compliance. Ad hoc interpretations or simple checklists risk being incomplete, inconsistent, and difficult to defend during an audit. This section presents a formal two-phase methodological framework designed to systematically translate the legal text of the CRA into a quantifiable lifecycle-based compliance model. Phase 1 details the process of requirement distillation and mapping, while Phase 2 describes the development of a quantitative assessment model using the Analytic Hierarchy Process (AHP). A high-level visual summary of this entire two-phase methodology is presented in Figure 2.

4.1. Phase 1: Systematic Requirement Distillation and Lifecycle Mapping

This phase addresses the challenge of transforming dense legal prose into a set of clear, atomic, and actionable engineering and governance requirements. It follows a four-step process, summarized in Table 3, designed to ensure traceability from the original legal article to the final checklist item.
  • Step 1: Corpus Definition and Scoping. The process begins by defining the legal corpus for analysis. For the CRA, the primary sources of manufacturer obligations are Articles 13 (“Obligations of the manufacturer”) and 14 (“Reporting obligations of the manufacturer”), along with the detailed essential requirements specified in Annex I (“Essential cybersecurity requirements”), the technical documentation requirements in Annex VII, and the user information requirements in Annex II. These sections constitute the core set of legal mandates that must be translated.
  • Step 2: Semantic Annotation and Requirement Extraction. A structured reading of the corpus is performed to identify and extract normative statements. Inspired by legal text analysis frameworks [49], this involves a manual semantic annotation process to tag key components of each obligation:
    • Subject: The entity responsible for the action (e.g., ‘the manufacturer’).
    • Modal Verb: The word indicating the nature of the obligation (e.g., ‘shall’ or ‘must’).
    • Action: The core activity required (e.g., ‘ensure by design’, ‘draw up’, or ‘provide security updates’).
    • Object: The target of the action (e.g., ‘that products are placed on the market without any known exploitable vulnerabilities’, ‘the EU Declaration of Conformity’, or ‘for a period of at least five years’).
    This structured extraction process deconstructs complex sentences into their fundamental components, forming the basis for clear requirement statements.
  • Step 3: Requirement Normalization. The extracted components are then normalized into a consistent atomic format. Each distinct obligation is rephrased as a standalone requirement, typically following a ‘Subject–Action–Object’ structure (e.g., “The manufacturer shall provide security updates for a defined support period”). This step ensures that each checklist item is unambiguous, verifiable, and represents a single testable condition, which is a best practice in requirement engineering [49].
  • Step 4: Lifecycle Phase Traceability. Finally, each normalized requirement is mapped to one or more phases of the DevSecOps lifecycle, as depicted in Figure 3. This mapping is performed using a rubric based on the primary activity type associated with the requirement. For example:
    • Plan/Design: Requirements related to risk assessment, threat modeling, and defining security features by design.
    • Code/Build: Requirements related to secure coding practices, managing third-party components, and generating SBOMs.
    • Test/Release: Requirements related to vulnerability testing, ensuring no known exploitable vulnerabilities are present at release, and preparing user documentation.
    • Operate/Monitor: Requirements related to post-market surveillance, vulnerability handling, and incident reporting.
    • Governance (Cross-Cutting): Requirements related to establishing policies (e.g., CVD), defining support periods, and maintaining technical documentation.
    This traceability ensures that compliance activities are integrated directly into the existing engineering workflow rather than being treated as a separate after-the-fact process.
To provide a concrete example of this process, consider the legal text from Article 13(17) [1]:
“Manufacturers shall designate a single point of contact to enable users to communicate directly and rapidly with them, including in order to facilitate reporting on vulnerabilities of the product with digital elements.”
Applying the four-step method:
  • Step 1 (Corpus): The text is identified as a core manufacturer obligation under Article 13.
  • Step 2 (Extraction): The primary obligation is extracted: subject: ‘manufacturers’, modal: ‘shall’, action: ‘designate’, object: ‘a single point of contact’, purpose: ‘to enable users to communicate... [and] report... vulnerabilities’.
  • Step 3 (Normalization): This is normalized into an atomic requirement: “The manufacturer shall designate a single point of contact for user communication and vulnerability reporting.”
  • Step 4 (Mapping): This requirement relates to foundational cross-cutting governance rather than a specific technical phase. It is therefore mapped to the ‘governance and scope (pre-design)’ phase (see Appendix A, Table A1).
Furthermore, to ensure the reproducibility and validity of this translation, the requirement extraction and lifecycle mapping for the entire corpus were conducted by a panel of two authors independently. These initial mappings were then compared, and any discrepancies were adjudicated by a third author to reach a unified consensus. This iterative expert review process was employed to mitigate subjective interpretation and to validate the final set of requirements presented in Appendix A.
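To make this phase's output machine-readable, the sketch below (in Python) captures the Article 13(17) example as a structured requirement record that preserves the annotated components, the normalized statement, and the lifecycle mapping. The AtomicRequirement and LifecyclePhase names, their fields, and the normalized() helper are illustrative assumptions for representing this paper's method, not part of the CRA or of any existing tool.

```python
from dataclasses import dataclass, field
from enum import Enum

class LifecyclePhase(Enum):
    """DevSecOps phases used as mapping targets in Step 4 (labels follow Section 4.1)."""
    GOVERNANCE = "governance and scope (pre-design)"
    PLAN_DESIGN = "plan/design"
    CODE_BUILD = "code/build"
    TEST_RELEASE = "test/release"
    OPERATE_MONITOR = "operate/monitor"

@dataclass
class AtomicRequirement:
    """One normalized, testable requirement traced back to its legal source (Steps 2-3)."""
    source: str                                   # e.g., "CRA Art. 13(17)"
    subject: str                                  # annotated Subject
    modal: str                                    # annotated Modal Verb
    action: str                                   # annotated Action
    obj: str                                      # annotated Object
    phases: list[LifecyclePhase] = field(default_factory=list)

    def normalized(self) -> str:
        # Step 3: rephrase as a standalone 'Subject-Action-Object' statement.
        return f"{self.subject} {self.modal} {self.action} {self.obj}."

# Worked example from Article 13(17), as extracted in Steps 1-2 above.
req = AtomicRequirement(
    source="CRA Art. 13(17)",
    subject="The manufacturer",
    modal="shall",
    action="designate",
    obj="a single point of contact for user communication and vulnerability reporting",
    phases=[LifecyclePhase.GOVERNANCE],
)
print(req.normalized())
```

Keeping requirements in such a form gives end-to-end traceability from the legal source to each checklist item and lets the lifecycle mapping be queried or exported alongside the evidence gathered in Phase 2.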

4.2. Phase 2: A Quantitative Compliance Assessment Model Using AHP

This phase formalizes the ‘rating-tree’ and weighting methodology, creating a quantitative model to assess compliance readiness. It provides a structured defensible basis for scoring and prioritization, moving beyond subjective evaluation.
Step 1: Hierarchy Construction. The normalized requirements from Phase 1 are grouped into a logical hierarchy. This structure, often referred to as a rating tree, organizes the granular checklist items into broader compliance domains. For this study, the requirements were categorized into primary domains, such as ‘secure by design and development’, ‘vulnerability management and patching’, and ‘SBOM and supply chain’. This hierarchical structure forms the foundation for the AHP analysis.
Step 2: Deriving Domain Weights via the Analytic Hierarchy Process (AHP). To determine the relative importance of each compliance domain, the Analytic Hierarchy Process (AHP) was employed [50]. The AHP is a structured technique for organizing and analyzing complex decisions based on mathematics and psychology.
The process involves the following:
  • Pairwise Comparison: The high-level compliance domains are compared against each other in pairs. The pairwise comparison was performed by an expert panel composed of the three authors. The selection criterion for this panel was their direct combined domain expertise, which spans (i) cybersecurity standardization and IoT architecture, (ii) industrial OT/IT system engineering, and (iii) regulatory compliance and risk management. This cross-functional expertise was considered essential to provide a balanced judgment on the relative risks across technical, operational, and governance domains. The fundamental question guiding the judgment was “Which of these two domains, if deficient, poses a greater systemic risk to the overall cybersecurity and compliance posture of a product with digital elements under the CRA?”
  • Judgment Scale: Judgments are made using Saaty’s fundamental 1–9 scale, where 1 indicates equal importance and 9 indicates that one element is extremely more important than the other.
  • Matrix Construction and Priority Vector Calculation: The judgments are used to populate a pairwise comparison matrix, from which the principal eigenvector is calculated to derive the priority vector, representing the normalized weights for each domain.
  • Consistency Check: A key advantage of the AHP is its ability to measure the consistency of judgments. The Consistency Index (CI) and Consistency Ratio (CR) are calculated. A CR value of less than 0.10 is generally considered to indicate an acceptable level of consistency, ensuring the derived weights are not random or contradictory [50].
This process transforms subjective expert opinion into a set of mathematically consistent and validated weights.
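To illustrate the calculation behind Step 2, the following minimal sketch derives the priority vector via the principal eigenvector and computes the Consistency Index and Consistency Ratio with NumPy. The 3×3 matrix is a hypothetical example on Saaty's 1–9 scale and does not reproduce the eight-domain comparison matrix of Table 5; the ahp_weights helper is likewise an illustrative implementation, not an existing library call.

```python
import numpy as np

# Random Index values (Saaty) used in the consistency ratio, indexed by matrix size n.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_weights(matrix: np.ndarray) -> tuple[np.ndarray, float]:
    """Return (priority weights, consistency ratio) for a pairwise comparison matrix."""
    n = matrix.shape[0]
    eigvals, eigvecs = np.linalg.eig(matrix)
    k = int(np.argmax(eigvals.real))
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                      # normalized priority vector
    lambda_max = eigvals[k].real
    ci = (lambda_max - n) / (n - 1)               # Consistency Index
    cr = ci / RI[n] if RI[n] > 0 else 0.0         # Consistency Ratio (CR < 0.10 is acceptable)
    return weights, cr

# Hypothetical 3-domain judgments on Saaty's 1-9 scale (not the paper's Table 5):
# domain A judged moderately more important than B (3) and strongly more important than C (5).
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 2.0],
                     [1/5, 1/2, 1.0]])

w, cr = ahp_weights(pairwise)
print("weights:", np.round(w, 3), " CR:", round(cr, 3))
```

For matrices of this size, the row geometric-mean approximation yields nearly identical weights; the eigenvector form is shown because it matches the consistency check described above.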
Step 3: Scoring Rubric and Aggregation. With the domain weights established, a scoring rubric is defined to assess the implementation status of each individual checklist item. A simple, clear scale is used as follows:
  • 0: Not Implemented: No evidence of the control or process exists.
  • 1: Partially Implemented: The control or process is ad hoc, incomplete, or not consistently applied.
  • 2: Fully Implemented: The control or process is formally defined, consistently applied, and supported by evidence.
The final project-wide compliance score is calculated using a weighted aggregation formula:
$$\text{Compliance Score} = \sum_{i=1}^{n} (W_i \times S_i)$$
where $n$ is the number of compliance domains, $W_i$ is the AHP-derived weight of domain $i$, and $S_i$ is the average score of all checklist items within that domain. This formula produces a single quantifiable measure of overall compliance readiness, directly traceable to both the legal text and the risk-based domain weights.
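A minimal sketch of this aggregation is shown below, assuming hypothetical domain weights and 0–2 rubric scores rather than the study's actual values; the domain names are taken from the hierarchy in Step 1.

```python
# Step 3 sketch: 0/1/2 rubric scores per checklist item, aggregated with AHP-derived weights.
# The weights and item scores below are illustrative, not the values derived in this study.
domains = {
    "secure by design and development": {"weight": 0.50, "item_scores": [2, 1, 2, 2]},
    "vulnerability management and patching": {"weight": 0.30, "item_scores": [1, 1, 0]},
    "SBOM and supply chain": {"weight": 0.20, "item_scores": [0, 1]},
}

def compliance_score(domains: dict) -> float:
    """Weighted aggregation: sum over domains of W_i * S_i, where S_i is the mean item score."""
    total = 0.0
    for d in domains.values():
        s_i = sum(d["item_scores"]) / len(d["item_scores"])
        total += d["weight"] * s_i
    return total

print(round(compliance_score(domains), 2))  # result is on the 0-2 rubric scale
```

In Section 6, the same aggregation is applied with the AHP-derived weights of Table 5 and domain scores reported on a 0–10 scale.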

5. Practical Implications of the CRA in the IoT

The entry into force of the CRA not only demands compliance with new requirements but also drives a comprehensive transformation that spans product engineering, internal governance, and auditing. The main technical and organizational implications are outlined below, indicating how to close regulatory gaps and illustrating challenges and solutions through real-world experiences. A concise summary of recommended practices is provided in Table 4.
To translate the principle of security by design into concrete actions, threat-modeling activities should be embedded from the outset to map critical assets, attack vectors, and realistic exploitation scenarios. These outputs must become visible design requirements in the development backlog and feed automated tests in an expanded Continuous Integration/Continuous Deployment (CI/CD) pipeline—combining static code analysis, symbolic fuzzing, and dynamic firmware testing in emulation environments (Narrowband Internet of Things (NB-IoT); 4G, Long Range (LoRa))—as captured in Table 4 (secure design; automated testing).
In parallel, the CRA mandates machine-readable SBOMs and secure automatic over-the-air (OTA) update mechanisms. Every dependency change should trigger SBOM generation (CycloneDX or SPDX), license validation, and Common Vulnerabilities and Exposures (CVE) queries to spot known vulnerabilities. The OTA channel must use encrypted and digitally signed protocols, include a clear user opt-out, and be complemented by collective remote attestation (e.g., Scanning, Analysis, Response and Assessment (SARA)) to verify the integrity of devices in production (see Table 4, SBOM and traceability; patch delivery).
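One possible automation of the dependency-change trigger described above is sketched below: it reads a CycloneDX JSON SBOM and queries the public OSV.dev vulnerability API for each component's package URL, failing the pipeline if any advisory is reported. The file name, the gate-on-any-advisory policy, and the absence of severity filtering are simplifying assumptions; a production pipeline would typically combine dedicated scanners, curated feeds, and VEX statements.

```python
import json
import urllib.request

OSV_QUERY_URL = "https://api.osv.dev/v1/query"  # public OSV.dev vulnerability API

def known_vulnerabilities(purl: str) -> list[str]:
    """Query OSV for advisories affecting a component identified by its package URL (purl)."""
    body = json.dumps({"package": {"purl": purl}}).encode()
    req = urllib.request.Request(OSV_QUERY_URL, data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        result = json.load(resp)
    return [v["id"] for v in result.get("vulns", [])]

def gate_on_sbom(sbom_path: str) -> int:
    """Naive CI gate sketch: return non-zero if any SBOM component has a known advisory."""
    with open(sbom_path) as f:
        sbom = json.load(f)  # CycloneDX JSON: components carry an optional 'purl' field
    findings = {}
    for component in sbom.get("components", []):
        purl = component.get("purl")
        if purl:
            vulns = known_vulnerabilities(purl)
            if vulns:
                findings[purl] = vulns
    for purl, vulns in findings.items():
        print(f"{purl}: {', '.join(vulns)}")
    return 1 if findings else 0

if __name__ == "__main__":
    raise SystemExit(gate_on_sbom("sbom.cyclonedx.json"))  # hypothetical file name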
At the organizational level, the CRA expects a Cybersecurity Management System (CSMS) that defines roles, responsibilities, and workflows. Under a Development, Security, and Operations (DevSecOps) operating model, Development, Quality Assurance (QA), Operations, and Compliance should review Key Performance Indicators (KPIs) regularly (e.g., vulnerability counts, mean time to remediation, SBOM coverage, and attestation results). A coordinated vulnerability disclosure (CVD) policy is essential. This policy must define the public-facing process for security researchers to report issues and the internal workflows for meeting the CRA’s strict 24 h alert and 72 h initial reporting deadlines. Using standardized formats, such as Structured Threat Information eXpression (STIX) or Common Vulnerability Reporting Framework (CVRF), is recommended to ensure interoperability (Table 4, coordinated vulnerability disclosure (CVD); DevSecOps Governance).
Document analysis reveals gaps in the precision of essential concepts: terms such as “without exploitable vulnerabilities” or “limited attack surface” lack detailed technical specifications. To bridge these gaps, manufacturers should maintain internal audit guides—grounded in standards like IEC 62443 or ETSI EN 303 645—that specify which fuzzing techniques apply to each device category, how to measure surface reduction on interfaces, and what criteria validate an SBOM.
Real-world experiences echo these points. In Libelium’s TRUEDATA project, integrating Open Mobile Alliance Lightweight Machine-to-Machine (OMA LwM2M) probes with a Security Information and Event Management (SIEM) system (Elasticsearch/Logstash/Grafana) enabled monitoring of tens of thousands of devices, event correlation, and automated responses, albeit with ingestion bottlenecks that required pipeline and parser optimization. Likewise, industrial equipment manufacturers acknowledge that, without up-to-date SBOMs and agile notification processes, regulatory timelines may be missed, increasing operational and reputational risk.
To synthesize these practices into a coherent operational model, Figure 3 presents a visual map of the DevSecOps lifecycle tailored to CRA compliance. This workflow illustrates how the lifecycle phases and their corresponding actions—detailed further in Appendix A—integrate into a continuous cycle of development, security, and operations.
To maintain the narrative flow of the analysis whilst still providing a detailed actionable tool for manufacturers, the complete breakdown of this checklist is presented in Appendix A. This appendix details the specific actions, expected evidence, frequency, and direct references to the CRA articles, serving as a granular implementation guide.
As detailed in Appendix A, the lifecycle framework is broken down into several key phases, each with specific obligations. The governance and scope phase is foundational (see Table A1): it begins with a formal PDE classification that determines the conformity-assessment route, the establishment of a CSMS (policies, KPIs, and review cadence), the public statement of the security support period, and the publication of a CVD policy with a single point of contact.
The Secure Design and Development phase operationalizes security by design (see Table A2). Risk is treated as a living artifact, re-evaluated at each milestone. Threat-modeling frameworks such as STRIDE (Spoofing, Tampering, Repudiation, Information disclosure, Denial of service, and Elevation of privilege) and LINDDUN (Linkability, Identifiability, Non-repudiation, Detectability, Disclosure of information, Unawareness, and Non-compliance) translate risks into actionable user stories and acceptance criteria. Technical priorities include hardening interfaces to minimize attack surface, enforcing least-privilege defaults, protecting integrity (secure boot, signing, and anti-rollback), and applying data minimization and CIA (Confidentiality, Integrity, and Availability) controls—alongside due diligence on third-party and open-source software.
Vulnerability Management and Support sustain resilience over time (see Table A3). A formal Product Security Incident Response Team (PSIRT) process handles reports under defined SLAs; proactive monitoring of National Vulnerability Database (NVD)/GitHub Security Advisories (GHSAs) feeds triage and patching; CI/CD gates block releases with known exploitable vulnerabilities; OTA updates are automatic by default, securely delivered, and supported for the declared period; every build ships with a machine-readable SBOM (and, where relevant, VEX) and clear user communications.
The CRA also imposes strict timelines for the Notification of Exploited Vulnerabilities and Incidents (see Table A4), including a 24 h early warning to CSIRTs/ENISA, a 72 h initial notification, and a final report after remediation—together with timely information for affected users.
Post-market obligations ensure continuity of compliance (see Table A5): manufacturers must uphold the support period, retain technical documentation (including the EU Declaration of Conformity (DoC)), ensure series-production conformity, cooperate with Market Surveillance Authorities, and plan End-of-Life/End-of-Support with appropriate user guidance.
Finally, conformity assessment and CE marking are the gateway to the market (see Table A6), culminating in the EU Declaration of Conformity and CE marking. Clear and accessible user information (see Table A7)—from secure installation and updating instructions to contact points and SBOM/VEX access—completes the lifecycle.
By following this structured approach, the CRA’s practical implications become a manageable roadmap that supports both regulatory compliance and genuine cyber resilience across the IoT ecosystem.

6. TRUEDATA Real Experience: CRA in Practice

This section demonstrates the practical application and validates the feasibility of the proposed two-phase compliance framework. The framework is applied to the TRUEDATA project to produce a single project-wide (site-agnostic) score and readiness profile.
The TRUEDATA project was selected for this validation exercise precisely because it represents a high-stakes, complex industrial (OT) environment rather than a simpler consumer IoT product. It is a cybersecurity initiative, funded by INCIBE (Instituto Nacional de Ciberseguridad, Spain) within the European Union recovery and resilience framework, aimed at strengthening the resilience of critical water infrastructures (drinking-water treatment plants, wastewater facilities, desalination plants, pumping stations, and dams). Its scope—protecting essential assets against threats targeting SCADA (Supervisory Control And Data Acquisition) systems, Programmable Logic Controllers (PLCs), and Operational Technology (OT)/Information Technology (IT) networks—provides a rigorous test case. The rationale is that a framework proven sufficient for this demanding critical-infrastructure context should be highly adaptable to other IoT domains. As an applied R&D project, its stated goal is to advance a solution from Technology Readiness Level (TRL) 4 to TRL 9, focusing on integrating IoT, AI, and blockchain technologies for active protection and operational resilience. The analysis applies the full framework detailed in Section 4. First, the checklist derived from Phase 1 (see Appendix A) was used to assess the project's current state. Second, the quantitative model from Phase 2 was used to calculate a weighted compliance score.
Following the methodology in Section 4.2, an AHP analysis was conducted to derive the weights for the eight primary compliance domains used in the assessment. The pairwise comparison matrix, based on the expert judgment of the authors, is presented in Table 5.
Furthermore, to validate the stability of these priorities beyond the consistency check (CR = 0.014), a sensitivity analysis was performed [51]. This involved simulating ‘what-if’ scenarios by altering the judgments of the highest-weighted criterion (‘secure by design’) [52]. Even with a 20% variation in its principal judgments, the criterion remained the top-ranked domain. When propagated to the final assessment, this variation altered the aggregated score (detailed later in this section) by less than ±4%, demonstrating that the model’s overall assessment is robust and not unduly sensitive to minor variations in expert judgment.
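A minimal sketch of such a what-if sweep is shown below: it scales the pairwise judgments of one criterion by ±20%, recomputes the weights (here via the row geometric-mean approximation of the AHP), and checks whether the top-ranked domain changes. The 3×3 matrix, the approx_weights helper, and the perturb_criterion helper are illustrative assumptions and do not reproduce the matrix of Table 5.

```python
import numpy as np

def approx_weights(matrix: np.ndarray) -> np.ndarray:
    """Row geometric-mean approximation of AHP priority weights (sufficient for a what-if sweep)."""
    gm = np.prod(matrix, axis=1) ** (1.0 / matrix.shape[0])
    return gm / gm.sum()

def perturb_criterion(matrix: np.ndarray, idx: int, factor: float) -> np.ndarray:
    """Scale criterion `idx`'s judgments against all others by `factor`, preserving reciprocity."""
    m = matrix.astype(float).copy()
    for j in range(m.shape[0]):
        if j != idx:
            m[idx, j] *= factor
            m[j, idx] = 1.0 / m[idx, j]
    return m

# Hypothetical 3-domain matrix (not the paper's Table 5); criterion 0 stands in for 'secure by design'.
base = np.array([[1.0, 3.0, 5.0],
                 [1/3, 1.0, 2.0],
                 [1/5, 1/2, 1.0]])

for factor in (0.8, 1.0, 1.2):  # ±20% variation in the principal judgments
    w = approx_weights(perturb_criterion(base, idx=0, factor=factor))
    top_unchanged = np.argmax(w) == 0
    print(f"factor={factor}: weights={np.round(w, 3)}, top criterion unchanged={top_unchanged}")
```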
The resulting weights clearly prioritize ‘secure by design’ (32.2%) and ‘vulnerability management’ (21.9%) as the most critical domains for overall compliance, reflecting their central role in the CRA’s lifecycle security philosophy. Figure 4 visualizes the resulting readiness profile against a pilot-ready target.
To provide context for this analysis, the system’s high-level technical architecture is depicted in Figure 5. This diagram illustrates the flow of data from ingestion at the operational level through the processing and detection components.
TRUEDATA performs strongly in technical execution because its end-to-end data path is explicit, instrumented, and secure by design. The architecture is built on a foundation of robust technologies, including a sophisticated Extract, Transform, Load (ETL) pipeline orchestrated in Node-RED for data validation, AI models such as Graph Neural Networks (GNNs) for OT anomaly detection, and a Snort-based industrial Intrusion Detection System (IDS) for network monitoring. This technical maturity justifies high marks in categories related to monitoring, data integrity, and secure design principles.
Conversely, several scores remain in the medium- or high-risk band, primarily due to deliberate gaps in governance and lifecycle management. These areas, including formalized logging, incident response, vulnerability management, and especially SBOM generation and coordinated vulnerability disclosure, were not prioritized in the initial R&D phase. This reflects a strategic trade-off, common in technology-driven projects, where demonstrating the feasibility of the core AI engine and advancing its TRL took precedence over establishing mature operational and compliance scaffolding.
Applying the aggregation formula from Section 4.2, the weighted domain scores yield a project-wide score of ≈6.70/10, characterized as partially compliant with medium residual risk. The calculation is performed by multiplying the AHP-derived weight of each domain by its assessed score (derived from the evidence in Table 6) and summing the results:
$$\text{Score} = \underbrace{0.322 \times 8.0}_{\text{Design}} + \underbrace{0.219 \times 5.5}_{\text{Vuln. Mgmt}} + \underbrace{0.143 \times 9.2}_{\text{Monitoring}} + \underbrace{0.088 \times 3.5}_{\text{SBOM}} + \underbrace{0.088 \times 6.8}_{\text{Incident}} + \underbrace{0.053 \times 3.5}_{\text{Supply Chain}} + \underbrace{0.053 \times 6.0}_{\text{Logging}} + \underbrace{0.034 \times 5.0}_{\text{Post-Market}} = 6.68 \;(\text{rounded to } 6.7)$$
Here, the Design score of 8.0 is the average of Data Integrity (8.7), Secure-by-Default (7.8), and Access Control (7.5); the Post-Market score of 5.0 is the average of Updates (7.0) and CVD (3.0); the SBOM and Supply Chain terms share the SBOM/supply score of 3.5; the remaining domain scores (Vulnerability Management, Monitoring, Incident, Logging) are assessed directly.
This aggregated score of ≈6.70 is the direct result of applying the AHP-derived weights (Table 5) to the domain-specific scores justified in the subsequent Table 6. The rationale for this ‘medium-risk’ score becomes clear upon analysis: whilst the project scored highly in technically mature domains (e.g., 9.2/10 for ‘monitoring’ and 8.0/10 for ‘design’), the overall score is quantitatively impacted by significant high-risk gaps in key governance areas. Specifically, the low scores for ‘SBOM and supply chain’ (3.5/10) and ‘coordinated vulnerability disclosure’ (3.0/10)—both fundamental CRA requirements—demonstrate how procedural and documentary gaps directly reduce the measured readiness despite the system’s technical robustness. The gap to the pilot-ready target, which requires closing these specific gaps, is illustrated in Figure 4.
To provide a granular justification for the assigned scores, Table 6 maps specific project evidence and quantitative performance metrics to each category of the checklist.
To close these identified gaps and move from partial compliance to a pilot-ready state, the following measurable improvements are proposed. These actions demonstrate the framework’s utility as a decision-support tool, enabling the prioritization of remediation efforts based on the domains with the highest weights and lowest scores.
(a) SBOM/VEX in CI/CD. Integrate automated SBOM generation in SPDX or CycloneDX format and vulnerability scanning into the CI/CD pipeline. The primary KPI will be SBOM Component Coverage, with a target of achieving >95% coverage.
(b) CVD program. Establish and publish a formal CVD policy compliant with standards such as RFC 2350 and security.txt. The key performance metric will be Mean Time to Acknowledge (MTTA), with a target of <48 h.
(c) Centralized logging and alerting. Complete the integration of a centralized logging solution with a defined data retention policy (minimum 12 months) to support forensic analysis.
(d) Vulnerability management at scale. Implement periodic automated container image and OS scanning within the CI/CD pipeline, with a target of deploying patches for critical vulnerabilities (Common Vulnerability Scoring System (CVSS) > 9.0) within 14 days.
(e) Lifecycle and conformity pack. Codify the product support period and create a reproducible conformity dossier containing all evidence required by Annex VII of the CRA.
These actions are consistent with the CRA lifecycle checklist and materially raise the weakest categories, lifting the weighted score into the ∼7.7–7.9 range and nudging residual risk towards medium–low (cf. Figure 4).

7. Open Challenges and Lessons Learned

The practical application of the CRA across a vast and varied IoT ecosystem reveals several open challenges. While the regulation sets a necessary horizontal baseline, real success depends on tackling technical complexity, organizational inertia, and market pressures. A first challenge is the meaning of key terms. Phrases such as “appropriate level of cybersecurity”, “limited attack surface”, and “without known exploitable vulnerabilities” are central to the CRA, yet they lack precise harmonized technical definitions. Without clear guidance from the ENISA or standardization bodies, manufacturers must interpret these requirements on their own. This can lead to uneven security across the market and uncertainty during conformity assessment.
Supply chain complexity is another major hurdle. The duty to produce a comprehensive machine-readable Software Bill of Materials (SBOM) is a strong step towards transparency, but its value depends on accurate and complete inputs from upstream suppliers, especially for open-source software. Creating and maintaining high-quality SBOMs for deep dependency trees is still a difficult engineering and operational task.
Long-term support and vulnerability management also carry significant costs, particularly for SMEs. Committing to a support period of at least five years, with timely security updates, requires investment in people, processes, and infrastructure. For low-cost, high-volume devices, this model may be hard to sustain, risking the appearance of orphan devices if a manufacturer leaves the market or ends a product line. This exposes a legislative gap as the CRA’s effectiveness assumes manufacturer continuity. Finally, moving towards DevSecOps may be the most demanding organizational shift. Embedding security into every phase of development means breaking down silos between development, operations, and security. This is not only about tools; it also requires leadership, continuous training, and clear governance.
Furthermore, it is important to acknowledge the methodological limitations inherent in the AHP-based weighting process itself [53]. While the AHP provides a structured and mathematically consistent framework, its outputs are fundamentally reliant on the subjective judgments of the expert panel [50]. In this study, the panel was composed of the authors; whilst possessing domain expertise, this introduces a potential source of implicit bias. A larger, more diverse panel of external industry and regulatory experts could certainly yield different domain weights. Moreover, the AHP methodology is subject to well-documented academic critiques, such as the potential for ‘rank reversal’, where the introduction or deletion of a criterion can alter the priority of existing ones [54], and the artificial constraint of the 1–9 Saaty scale, which can make it difficult to represent exceptionally large differences in importance [55]. Although our consistency check (CR < 0.10) provided assurance against contradictory judgments, these inherent limitations underscore that the resulting weights should be viewed as a validated starting point for risk prioritization, not as an immutable ground truth.
Experience from the TRUEDATA project suggests a practical approach: treat the CRA not as a checklist but as the governance layer for Zero-Trust engineering in both OT and the IoT. This mindset turns regulatory requirements into a driver for sound security architecture. In practice, it is advisable to start small and improve step by step, focusing first on one high-impact phase rather than attempting a full overhaul. Mirroring OT traffic early provides ground-truth data to train anomaly-detection models and to define a baseline of normal behavior. For operational resilience, keep key inference at the edge so that monitoring and alerts continue even if cloud links fail. Lastly, formalize what matters: publish a public coordinated vulnerability disclosure (CVD) policy (e.g., aligned with RFC 2350), generate complete SBOMs to ensure supply-chain transparency, and run a structured vulnerability-management process. This is “Next-Gen Zero Trust” in practice: explicit continuous verification of device behavior, least-privilege access enforced by policy, and cryptographic integrity for the data that runs the operation.
Looking beyond these immediate lessons, the identified challenges of complexity and continuous oversight point toward a future of automated compliance. The difficulty in managing complex dynamic supply chains, for instance, may be mitigated by future AI-driven compliance systems. Such platforms could utilize advanced graph-based models, like hypergraph neural networks, to map high-order dependencies in a product’s SBOM and model dynamic risk propagation [47]. In parallel, semantic reasoning engines, drawing on research into semantic-level security, could enable monitoring that ensures the meaning and integrity of data transmissions are protected, fulfilling CRA mandates in a way that simple encryption cannot [48]. Moreover, the challenge of continuous post-market evaluation could be met by self-adaptive digital twins. This would allow manufacturers to evolve from static point-in-time assessments to a continuous evidenced-based evaluation of CRA readiness, providing a quantifiable and testable metric for requirements such as ‘limited attack surface’.

8. Conclusions and Recommendations

This paper has addressed the critical gap between the Cyber Resilience Act’s legal mandates and auditable engineering practice. We introduced and validated a formal two-phase framework that operationalizes the act, providing a transparent methodology for systematic requirement distillation and a quantitative AHP-based model for assessing compliance readiness. This contribution provides a tangible lifecycle-based model that shifts the paradigm from general guidance to binding actionable compliance.
First, the systematic requirement distillation process provides a clear and traceable path from the legal articles of the CRA to concrete lifecycle-mapped engineering and governance tasks. This transforms abstract legal mandates into an actionable compliance blueprint. Second, the AHP-based quantitative assessment model offers a rigorous and consistent method for evaluating compliance readiness. By deriving risk-informed weights for key compliance domains, it moves beyond subjective qualitative assessments and provides a defensible basis for prioritizing investments and tracking progress over time. The value of this framework lies not only in the specific CRA checklist produced but in the adaptable methodology that can be applied to other current and future cybersecurity regulations.
To respond effectively, manufacturers should begin with a rigorous internal maturity assessment against the regulation’s requirements, identifying gaps from threat modeling through the automated generation of machine-readable SBOMs. On the basis of this diagnosis, an actionable roadmap should follow. This plan ought to cover the product backlog—adding security-focused user stories—and extend the CI/CD pipeline to include static analysis, symbolic fuzzing, and dynamic testing in emulated environments. It should also automate SBOM creation and deliver a digitally signed and encrypted OTA channel, ideally complemented by collective remote attestation to verify device integrity in the field at regular intervals.
Compliance, however, goes beyond technology: it requires a clear cultural and organizational shift. A Cybersecurity Management System (CSMS) aligned with a DevSecOps model is key to fostering collaboration among Development, QA, Operations, and Compliance. Teams should share and act on common metrics—such as mean time to remediation, SBOM coverage, and attestation success rates—and refine their approach continuously. Formalizing a coordinated vulnerability disclosure policy, using STIX/CVRF formats and meeting the CRA’s 24 h early-warning and 72 h initial-report deadlines, is essential for both regulatory adherence and building trust. In parallel, internal guidelines based on international standards (e.g., IEC 62443 and ETSI EN 303 645) should translate ambiguous terms like “limited attack surface” into clear audit criteria, making self-assessment and voluntary certification (EUCC) more practical and reliable.
Looking ahead, further research should build upon the methodology presented here. Important technical challenges remain, including lighter-weight cybersecurity tooling for low-power devices and interoperability models for SBOMs and patches that can sustain the resilience of “orphan devices” if a manufacturer ceases operations. Future work should also focus on refining the framework, for instance by expanding the AHP model with more granular criteria or exploring NLP-based automation for the requirement extraction phase [56].
Moreover, the framework’s adaptability to other domains is a key area for development. For industrial control (OT) environments, the methodology is directly applicable; the TRUEDATA case study, with its focus on SCADA and PLCs, already validates this. Future work in this area could focus on creating harmonized models that map CRA obligations against established standards, such as IEC 62443-4-1 [22]. The application to artificial intelligence (AI) systems, which also fall under the CRA’s scope [1,14], presents a more complex but equally viable adaptation. The two-phase methodology remains valid, but the content of the ‘secure-by-design’ domain would need to be expanded. This would involve addressing AI-specific vulnerabilities (e.g., data poisoning and adversarial attacks) and adapting the concept of an SBOM to include AI artifacts, such as training data and model parameters, in line with emerging ‘Artificial Intelligence Bill of Materials (AI-BOM)’ concepts [57].
Furthermore, to broaden the empirical grounding of this work, a follow-on study would be beneficial, applying the framework to a more diverse portfolio of common IoT products (e.g., consumer smart devices) to validate its generalizability beyond the high-complexity industrial domain.
Additional work on dynamic attack surfaces in edge and multi-hub networks—together with the use of artificial intelligence for advanced event correlation and automated response orchestration—will help to anticipate and mitigate emerging threats.
Ultimately, close collaboration with ENISA and standardization bodies will be vital to pilot, validate, and refine the technical standards that clarify the CRA’s more ambiguous mandates, enabling European industry to build a truly resilient and trustworthy Internet of Things.

Author Contributions

Formal analysis, M.Á.O.V.; investigation (including the mapping to TRUEDATA and to CRA in practice), I.C.M.; validation (IoT evolution against EU Cyber Resilience Act requirements), A.J.J.; writing—original draft preparation, M.Á.O.V.; writing—review and editing, A.J.J.; funding acquisition, A.J.J. All authors have read and agreed to the published version of the manuscript.

Funding

This work was funded by Fundación Séneca—Agency of Science and Technology of the Region of Murcia (Spain), grant 22771/FPI/24, and co-funded by Libelium. It also received co-funding from the Spanish National Cybersecurity Institute S.M.E., M.P., S.A. (INCIBE) under Pre-Commercial Public Procurement call CPP3-CPP001/23, Challenge CPP3_R6 “Security Operations Center (SOC) for critical sector” within Spain’s Recovery, Transformation and Resilience Plan—funded by the European Union—NextGenerationEU. The views and opinions expressed are those of the authors and do not necessarily reflect those of the co-funding entity.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article; further inquiries can be directed to the corresponding author.

Acknowledgments

The authors wish to acknowledge the use of generative AI models (ChatGPT (https://www.chatgpt.com) and Gemini (https://gemini.google.com)) as support tools for refining language and improving the clarity of the manuscript.

Conflicts of Interest

Authors Miguel Ángel Ortega Velázquez, Iris Cuevas Martínez, and Antonio J. Jara were employed by the company Libelium Lab. The authors declare that this study received funding from INCIBE. The authors declare no other commercial or financial relationships that could be construed as a potential conflict of interest.

Appendix A. CRA Lifecycle Checklist for IoT Products

Table A1. CRA lifecycle checklist for IoT products—(A0) governance and scope (pre-design).
Action | Expected Evidence | When | CRA Ref.
Determine if the product is a PDE and classify it (default/Important I–II/Critical) | Scope and classification report | Project start | Arts. 7–8; Annexes III–IV
Establish CSMS (policies, KPIs, reviews) and functional owners | CSMS policy; annual review plan | Start + annual review | Annex I (Parts I–II)
Set support period aligned with useful life (at least 5 years unless a shorter lifetime is justified) | Public support statement | Before launch | Art. 13(8)–(9)
Publish CVD policy and enable a single point of contact | CVD page; security@ alias; Pretty Good Privacy (PGP) key | Permanent | Annex I (Part II(5)–(6)); Art. 13(17)
Table A2. CRA lifecycle checklist—(A) secure design and development.
Action | Expected Evidence | Frequency | CRA Ref.
Design, develop, and produce with security proportional to risk | Risk assessment; risk-to-requirement matrix | Per major version/significant changes | Annex I (Part I(1))
Execute risk assessment in all phases (planning→maintenance) | Assessment records per phase/change | Per milestone and upon changes | Art. 13(2)–(3)
Translate threat modeling into requirements and acceptance criteria | STRIDE/LINDDUN report; security user stories | Per design iteration | Annex I (Part I(1))
Minimize attack surface (interfaces, ports, services, permissions) | Interface inventory; hardening baseline | Each release | Annex I (Part I(2)(j))
Security by default (no generic credentials; least privilege; deny by default) | Credentials policy; factory/reset tests | Each release | Annex I (Part I(2)(b))
Protect integrity (secure boot, signing, anti-rollback, key management) | Chain of trust; Public Key Infrastructure (PKI); downgrade tests | Each release | Annex I (Part I(2)(f),(2)(k))
Design logging/monitoring with user opt-out | Telemetry plan; opt-out User Experience (UX); tests | Each release | Annex I (Part I(2)(l))
Apply data minimization and CIA controls | Data and controls specification | Design → verification | Annex I (Part I(2)(e),(2)(g),(2)(h))
Enable secure deletion and factory reset | Wipe procedure; persistence tests | Release + annual audit | Annex I (Part I(2)(m))
Due diligence on third parties/Open-Source Software (OSS) (licenses, maintenance, CVE) | Third-party assessment; OSS policy | Upon integration/update | Art. 13(5)–(6)
Table A3. CRA lifecycle checklist—(B) vulnerability management and support.
Action | Expected Evidence | Frequency | CRA Ref.
Operate a formal Product Security Incident Response Team (PSIRT) | PSIRT procedure; ticket flow; Service Level Agreements (SLAs) | Continuous | Annex I (Part II)
Monitor and address known vulnerabilities (CVE, advisories) | NVD/GHSA alerts; triage and patching records | Weekly/continuous | Annex I (Part II(2),(4),(6))
Block releases with known exploitable vulnerabilities | CI/CD gate (NVD/CVE); closed Static/Dynamic Application Security Testing (SAST/DAST)/Fuzzing | Each release build | Annex I (Part I(2)(a))
Provide automatic updates by default with clear opt-out | OTA policy; opt-out User Interface/User Experience (UI/UX); user manual | Each release | Annex I (Part I(2)(c))
Deliver updates securely (encryption, signatures, authentication) | Signing pipeline; on-device validation | Each release | Annex I (Part II(7))
Maintain security updates throughout the declared support period (≥5 years unless a shorter product lifetime is duly justified) | Update policy; release history | Policy → annual review | Art. 13(9)
Generate and maintain a machine-readable SBOM per version | CycloneDX/SPDX SBOM; change control; Vulnerability Exploitability eXchange (VEX) | Each build/hotfix | Annex I (Part II(1)); Art. 31; Annex VII
Inform users of updates and mitigation measures | Release notes; notices; Really Simple Syndication (RSS)/Application Programming Interface (API) | Each release | Annex I (Part II(4),(8))
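The “deliver updates securely” and “on-device validation” rows above can be illustrated with a minimal signing and verification sketch. It assumes the third-party `cryptography` package and Ed25519 keys purely as an example of tooling; a production pipeline would add payload encryption, anti-rollback metadata, and hardware-backed key storage.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Build side: sign the firmware image with the vendor's private key.
signing_key = Ed25519PrivateKey.generate()
firmware = b"firmware-image-bytes"          # placeholder payload
signature = signing_key.sign(firmware)

# Device side: verify with the embedded public key before installing.
public_key = signing_key.public_key()
try:
    public_key.verify(signature, firmware)
    print("Update accepted: signature valid.")
except InvalidSignature:
    print("Update rejected: signature check failed.")
```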
Table A4. CRA lifecycle checklist—(C) notification of exploited vulnerabilities and incidents.
Action | Expected Evidence | Deadline/Frequency | CRA Ref.
Send early warning to Computer Security Incident Response Teams (CSIRTs)/European Union Agency for Cybersecurity (ENISA) for exploited vulnerability | Submission procedure and record | ≤24 h | Art. 14(2)(a)
Submit initial notification with general information | Official template; acknowledgment | ≤72 h | Art. 14(2)(b), 14(4)(b)
Issue final report after correction/mitigation | Final report; measures; timeline | ≤14 days from fix | Art. 14(2)(c)
Manage serious incidents; inform affected users | Incident Response (IR) runbook; user communications | 24 h/72 h/≤1 month | Art. 14(3)–(4)
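As a small worked example of the Table A4 deadlines, the sketch below derives the Article 14 notification due dates from the moment the manufacturer becomes aware of an actively exploited vulnerability, anchoring the fourteen-day final report to the availability of a fix as the table indicates. The timestamps are illustrative.

```python
from datetime import datetime, timedelta, timezone

def cra_notification_deadlines(awareness: datetime, fix_available: datetime) -> dict:
    """Return Art. 14 reporting due dates for an actively exploited vulnerability."""
    return {
        "early_warning_due": awareness + timedelta(hours=24),
        "initial_notification_due": awareness + timedelta(hours=72),
        "final_report_due": fix_available + timedelta(days=14),
    }

if __name__ == "__main__":
    aware = datetime(2026, 9, 15, 10, 0, tzinfo=timezone.utc)   # illustrative
    fixed = datetime(2026, 9, 20, 18, 0, tzinfo=timezone.utc)   # illustrative
    for label, due in cra_notification_deadlines(aware, fixed).items():
        print(f"{label}: {due.isoformat()}")
```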
Table A5. CRA lifecycle checklist—(D) manufacturer’s obligations (post-market).
Action | Expected Evidence | Frequency | CRA Ref.
Define and communicate the support period | Datasheet; website; manual | Launch → annual review | Art. 13(8), 13(19)
Keep technical documentation and EU Declaration of Conformity (DoC) | Conformity repository; version control | Continuous | Art. 13(13); Art. 31; Annex VII
Ensure conformity of series production | Control plan; QA records | Per batch/version | Art. 13(14)
Cooperate with Market Surveillance Authorities | Response procedure and records | As required | Arts. 52–60
Plan End-of-Life/End-of-Support (EoL/EoS) with communication and mitigations | EoL plan; notices; decommissioning guides | Before EoS | Art. 13(19); Annex II
Table A6. CRA lifecycle checklist—(E) conformity assessment and CE (Conformité Européenne) marking.
Action | Expected Evidence | When/Frequency | CRA Ref.
Perform conformity assessment of product and processes | Technical file (risks, tests, SBOM, support) | Pre-market | Art. 32; Annex VIII; Annex VII
Correctly classify the product (default/Important I–II/Critical) | Classification report and justification | Pre-market | Arts. 7–8; Annexes III–IV
Select assessment route (Module A or Notified Body) | Decision, contracts and reports | Pre-market | Art. 32; Annex VIII
(If applicable) Obtain EUCC or other certification for presumption | Certificate and scope | Pre/Post | Art. 27(8)–(9)
Draw up EU Declaration of Conformity and make it accessible | Signed DoC; URL | Pre-market | Art. 28; Annexes V–VI
Affix CE marking consistent with the DoC | Labeling check; photographic evidence | Pre-market | Arts. 29–30
Table A7. CRA lifecycle checklist—(F) user information.
Action | Expected Evidence | When | CRA Ref.
Publish contact details and a security point of contact | Manual; label; support website | Pre-market | Art. 13(16)–(18); Annex II
Provide clear instructions for installation, use, decommissioning, updating | Manual; quick start guide; security section | Pre-market | Annex II
State the support period and access to SBOM/VEX (where made available to users) | Manual/website; SBOM/VEX link | Pre-market | Annex II; Art. 13(8)

References

  1. European Parliament and the Council. Regulation (EU) 2024/2847 on Horizontal Cybersecurity Requirements for Products with Digital Elements (Cyber Resilience Act). Available online: https://eur-lex.europa.eu/eli/reg/2024/2847/oj (accessed on 12 September 2025).
  2. Statista Research Department. Internet of Things (IoT) Connected Devices Installed Base Worldwide from 2015 to 2025 (In Billions). Available online: https://www.statista.com/statistics/471264/iot-number-of-connected-devices-worldwide/ (accessed on 10 September 2025).
  3. IoT Analytics. State of IoT 2024: Number of Connected IoT Devices. Available online: https://iot-analytics.com/numberconnected-iot-devices/ (accessed on 5 October 2025).
  4. European Parliament and the Council. Directive (EU) 2016/1148 Concerning Measures for a High Common Level of Security of Network and Information Systems Across the Union (NIS). Available online: https://eur-lex.europa.eu/eli/dir/2016/1148/oj (accessed on 22 August 2025).
  5. European Parliament and the Council. Regulation (EU) 2016/679 (General Data Protection Regulation). Available online: https://eur-lex.europa.eu/eli/reg/2016/679/oj (accessed on 15 September 2025).
  6. European Parliament and the Council. Regulation (EU) 2019/881 on ENISA and on Information and Communications Technology Cybersecurity Certification (Cybersecurity Act). Available online: https://eur-lex.europa.eu/eli/reg/2019/881/oj (accessed on 28 August 2025).
  7. European Parliament and the Council. Directive 2014/53/EU on the Harmonisation of the Laws of the Member States Relating to the Making Available on the Market of Radio Equipment (RED). Available online: https://eur-lex.europa.eu/eli/dir/2014/53/oj (accessed on 2 September 2025).
  8. European Commission. Commission Delegated Regulation (EU) 2022/30 supplementing Directive 2014/53/EU with regard to essential requirements in Article 3(3)(d)(e)(f). Available online: https://eur-lex.europa.eu/eli/reg_del/2022/30/oj (accessed on 12 September 2025).
  9. European Commission. Commission Implementing Regulation (EU) 2024/482 Laying Down Rules for the Adoption of the European Common Criteria-Based Cybersecurity Certification Scheme (EUCC). Available online: https://eur-lex.europa.eu/eli/reg_impl/2024/482/oj (accessed on 18 August 2025).
  10. European Parliament and the Council. Directive (EU) 2022/2555 on Measures for a High Common Level of Cybersecurity Across the Union (NIS2). Available online: https://eur-lex.europa.eu/eli/dir/2022/2555/oj (accessed on 25 September 2025).
  11. European Commission. Commission Delegated Regulation (EU) 2023/2444 Amending Delegated Regulation (EU) 2022/30 as Regards the Date of Application of the Essential Requirements for Radio Equipment and Correcting That Regulation. Available online: https://eur-lex.europa.eu/eli/reg_del/2023/2444/oj (accessed on 30 August 2025).
  12. European Parliament and the Council. Regulation (EU) 2023/988 on General Product Safety (GPSR). Available online: https://eur-lex.europa.eu/eli/reg/2023/988/oj (accessed on 14 September 2025).
  13. European Parliament and the Council. Regulation (EU) 2023/2854 on Harmonised Rules on Fair Access to and Use of Data (Data Act). Available online: https://eur-lex.europa.eu/eli/reg/2023/2854/oj (accessed on 1 October 2025).
  14. European Parliament and the Council. Regulation (EU) 2024/1689 Laying Down Harmonised Rules on Artificial Intelligence (AI Act). Available online: https://eur-lex.europa.eu/eli/reg/2024/1689/oj (accessed on 8 September 2025).
  15. European Parliament and the Council. Regulation (EU) 2024/1183 on a Framework for a European Digital Identity (eIDAS 2). Available online: https://eur-lex.europa.eu/eli/reg/2024/1183/oj (accessed on 20 August 2025).
  16. European Parliament and the Council. Directive (EU) 2018/1972 Establishing the European Electronic Communications Code (EECC). Available online: https://eur-lex.europa.eu/eli/dir/2018/1972/oj (accessed on 29 September 2025).
  17. European Parliament and the Council. Regulation (EU) 2022/2554 on Digital Operational Resilience for the Financial Sector (DORA). Available online: https://eur-lex.europa.eu/eli/reg/2022/2554/oj (accessed on 4 September 2025).
  18. European Parliament and the Council. Directive (EU) 2022/2557 on the Resilience of Critical Entities (CER). Available online: https://eur-lex.europa.eu/eli/dir/2022/2557/oj (accessed on 6 October 2025).
  19. European Parliament and the Council. Regulation (EU) 2017/745 on Medical Devices (MDR). Available online: https://eur-lex.europa.eu/eli/reg/2017/745/oj (accessed on 11 August 2025).
  20. European Parliament and the Council. Regulation (EU) 2023/1230 on Machinery (Machinery Regulation). Available online: https://eur-lex.europa.eu/eli/reg/2023/1230/oj (accessed on 19 September 2025).
  21. ETSI EN 303 645 V2.1.1; Cyber Security for Consumer Internet of Things: Baseline Requirements. European Telecommunications Standards Institute: Sophia Antipolis, France, 2020. Available online: https://www.etsi.org/deliver/etsi_en/303600_303699/303645/02.01.01_60/en_303645v020101p.pdf (accessed on 16 November 2025).
  22. IEC 62443-4-1:2018; Security for Industrial Automation and Control Systems—Part 4-1: Secure Product Development Lifecycle Requirements. International Electrotechnical Commission (IEC): Geneva, Switzerland, 2018. Available online: https://webstore.iec.ch/publication/33615 (accessed on 16 November 2025).
  23. NIST IR 8259; Foundational Cybersecurity Activities for IoT Device Manufacturers. National Institute of Standards and Technology: Gaithersburg, MD, USA, 2020. [CrossRef]
  24. Butun, I.; Österberg, P.; Song, H. Security of the Internet of Things: Vulnerabilities, Attacks, and Countermeasures. IEEE Commun. Surv. Tutor. 2020, 22, 616–644. [Google Scholar] [CrossRef]
  25. Bakhshi, T.; Ghita, B.; Kuzminykh, I. A Review of IoT Firmware Vulnerabilities and Auditing Techniques. Sensors 2024, 24, 708. [Google Scholar] [CrossRef] [PubMed]
  26. Haq, S.U.; Singh, Y.; Sharma, A.; Gupta, R.; Gupta, D. A survey on IoT and embedded device firmware security: Architecture, extraction techniques, and vulnerability analysis frameworks. Discov. Internet Things 2023, 3, 17. [Google Scholar] [CrossRef]
  27. Feng, X.; Zhu, X.; Han, Q.-L.; Zhou, W.; Wen, S.; Xiang, Y. Detecting Vulnerability on IoT Device Firmware: A Survey. IEEE/CAA J. Autom. Sin. 2023, 10, 25–41. [Google Scholar] [CrossRef]
  28. Kloeg, B.; Ding, A.Y.; Pellegrom, S.; Zhauniarovich, Y. Charting the Path to SBOM Adoption: A Business Stakeholder-Centric Approach. In Proceedings of the 19th ACM Asia Conference on Computer and Communications Security (ASIACCS ’24), Singapore, 1–5 July 2024; ACM: New York, NY, USA, 2024; pp. 1770–1783. [Google Scholar] [CrossRef]
  29. Nocera, S.; Romano, S.; Penta, M.D.; Francese, R.; Scanniello, G. On the adoption of software bill of materials in open-source software projects. J. Syst. Softw. 2025, 230, 112540. [Google Scholar] [CrossRef]
  30. Ankergård, S.F.J.J.; Dushku, E.; Dragoni, N. State-of-the-Art Software-Based Remote Attestation: Opportunities and Open Issues for Internet of Things. Sensors 2021, 21, 1598. [Google Scholar] [CrossRef]
  31. Ambrosin, M.; Conti, M.; Lazzeretti, R.; Rabbani, M.; Ranise, S. Collective Remote Attestation at the Internet of Things Scale: State-of-the-art and Future Challenges. IEEE Commun. Surv. Tutor. 2020, 22, 2447–2461. Available online: https://ieeexplore.ieee.org/document/9139454 (accessed on 2 October 2025). [CrossRef]
  32. Ruohonen, J.; Hjerppe, K.; Kang, E.-Y. A Mapping Analysis of Requirements Between the CRA and the GDPR. arXiv 2025, arXiv:2503.01816. [Google Scholar] [CrossRef]
  33. Shaffique, M.R. Cyber Resilience Act 2022: A silver bullet for cybersecurity of IoT devices or a shot in the dark? Comput. Law Secur. Rev. 2024, 54, 106009. [Google Scholar] [CrossRef]
  34. Vescovi, C. Complexity of IoT technologies: European regulations in progress and patterns of customer communication. Media Laws 2023, 2, 299–321. Available online: https://www.medialaws.eu/rivista/complexity-of-iot-technologies-europeanregulations-in-progress-and-patterns-of-customer-communication/ (accessed on 16 September 2025).
  35. van’t Schip, M. The Internet of Forgotten Things: European cybersecurity regulation and the cessation of Internet of Things manufacturers. Comput. Law Secur. Rev. 2025, 57, 106152. [Google Scholar] [CrossRef]
  36. Jara, A.J.; Martinez, I.C.; Sanchez, J.S. CyberSecurity Resilience Act (CRA) in Practice for IoT Devices: Getting Ready for the NIS2. In Proceedings of the 2024 IEEE Smart Cities Futures Summit (SCFC), Marrakech, Morocco, 29–31 May 2024; IEEE: New York, NY, USA, 2024; pp. 56–60. [Google Scholar] [CrossRef]
  37. Risto, R.; Sethi, M.; Katara, M. Effects of the Cyber Resilience Act (CRA) on Industrial Equipment Manufacturing Companies. In Availability, Reliability and Security; Coppens, B., Volckaert, B., Naessens, V., De Sutter, B., Eds.; Springer Nature: Cham, Switzerland, 2025; pp. 213–229. [Google Scholar] [CrossRef]
  38. ISO/IEC 27001:2022; Information Security, Cybersecurity and Privacy Protection—Information Security Management Systems—Requirements. International Organization for Standardization: Geneva, Switzerland, 2022. Available online: https://www.iso.org/standard/82875.html (accessed on 16 November 2025).
  39. Kempe, E.; Massey, A. Perspectives on Regulatory Compliance in Software Engineering. In Proceedings of the 2021 IEEE 29th International Requirements Engineering Conference (RE); IEEE: Piscataway, NJ, USA, 2021; pp. 46–57. Available online: https://ieeexplore.ieee.org/document/9604591 (accessed on 24 September 2025).
  40. Faria, J.P.; Goulão, M.; Paiva, A.R.; Paiva, R.P. Systematic Mapping Study on Requirements Engineering for Regulatory Compliance of Software Systems. arXiv 2024, arXiv:2411.01940. [Google Scholar] [CrossRef]
  41. Breaux, T.D.; Anton, A.I. A Systematic Method for Acquiring Regulatory Requirements: A Frame-Based Approach. IEEE Trans. Softw. Eng. 2008, 34, 606–621. Available online: https://www.researchgate.net/publication/228336940_A_Systematic_Method_for_Acquiring_Regulatory_Requirements_A_Frame-Based_Approach (accessed on 24 September 2025).
  42. Zadgaonkar, A.V.; Agrawal, A.J. An overview of information extraction techniques for legal document analysis and processing. Int. J. Electr. Comput. Eng. (IJECE) 2021, 11, 5450–5457. Available online: https://www.researchgate.net/publication/356684610_An_overview_of_information_extraction_techniques_for_legal_document_analysis_and_processing (accessed on 24 September 2025). [CrossRef]
  43. UpGuard. IT Security Risk Assessment Methodology: Qualitative vs. Quantitative. UpGuard Blog. 2024. Available online: https://www.upguard.com/blog/risk-assessment-methodology (accessed on 24 September 2025).
  44. CyberSaint. Level Up Your Cybersecurity: A Guide to Quantitative Risk Frameworks. CyberSaint Blog. 2024. Available online: https://www.cybersaint.io/blog/guide-to-quantitative-risk-frameworks (accessed on 5 October 2025).
  45. Moreira, F.R.; Canedo, E.D.; Nunes, R.R.; Serrano, A.L.M.; Abbas, C.; Pereira Junior, M.L.; de Mendonça, F.L. Cybersecurity Risk Assessment Through Analytic Hierarchy Process: Integrating Multicriteria and Sensitivity Analysis. In Proceedings of the 27th International Conference on Enterprise Information Systems (ICEIS), Porto, Portugal, 4–6 April 2025; SCITEPRESS: Setúbal, Portugal, 2025. Available online: https://www.researchgate.net/publication/390643114_Cybersecurity_Risk_Assessment_Through_Analytic_Hierarchy_Process_Integrating_Multicriteria_and_Sensitivity_Analysis. (accessed on 24 September 2025).
  46. Zaburko, J.; Szulżyk-Cieplak, J. Information security risk assessment using the AHP method. IOP Conf. Ser. Mater. Sci. Eng. 2019, 710, 012036. Available online: https://www.researchgate.net/publication/338050116_Information_security_risk_assessment_using_the_AHP_method (accessed on 24 September 2025). [CrossRef]
  47. Zhu, P.; Chen, X.; Zhang, Z.; Li, P.; Cheng, X.; Dai, Y. AI-driven hypergraph neural network for predicting gasoline price trends. Energy Econ. 2025, 151, 108895. [Google Scholar] [CrossRef]
  48. Cai, Y. Robust and adaptive semantic noise for complex secure communication networks. Phys. Commun. 2025, 72, 102763. [Google Scholar] [CrossRef]
  49. Breaux, T.D.; Antón, A.I. Analyzing Regulatory Rules for Ambiguity and Conflict. In Proceedings of the 15th IEEE International Requirements Engineering Conference (RE’07), New Delhi, India, 15–19 October 2007; pp. 175–184. [Google Scholar] [CrossRef]
  50. Saaty, T.L. How to make a decision: The analytic hierarchy process. Eur. J. Oper. Res. 1990, 48, 9–26. [Google Scholar] [CrossRef]
  51. Saaty, T.L.; Vargas, L.G. Models, Methods, Concepts & Applications of the Analytic Hierarchy Process; Springer: New York, NY, USA, 2001. [Google Scholar] [CrossRef]
  52. Adams, W. ANP Row Sensitivity and the Resulting Influence Analysis. In Proceedings of the International Symposium on the Analytic Hierarchy Process, Washington, DC, USA, 19 June–2 July 2014; Available online: https://www.isahp.org/uploads/p744093.pdf (accessed on 1 October 2025).
  53. Whitaker, R. Criticisms of the Analytic Hierarchy Process: Why AHP Is Not a Reliable Method for Complex Decisions. Math. Comput. Model. 2007, 46, 948–961. [Google Scholar] [CrossRef]
  54. Tu, J.; Liu, Y. Analytic hierarchy process rank reversals: Causes and solutions. J. Enterp. Inf. Manag. 2023, 346, 1785–1809. [Google Scholar] [CrossRef]
  55. Velasquez, M.; Hester, P.T. An Analysis of Multi-Criteria Decision Making Methods. Int. J. Oper. Res. 2013, 10, 56–66. Available online: https://www.iied.org/sites/default/files/pdfs/2022-02/20781G.pdf (accessed on 17 September 2025).
  56. Soavi, M.; Paja, E.; Giorgini, P. From Legal Contracts to Formal Specifications: A Systematic Literature Review. SN Comput. Sci. 2022, 3, 345. Available online: https://link.springer.com/article/10.1007/s42979-022-01228-4 (accessed on 24 September 2025). [CrossRef]
  57. OpenSSF. SBOMs in the Era of the CRA: Toward a Unified and Actionable Framework. OpenSSF Blog. October 2025. Available online: https://openssf.org/blog/2025/10/22/sboms-in-the-era-of-the-cra-toward-a-unified-and-actionable-framework/ (accessed on 23 October 2025).
Figure 1. A visual timeline of the Cyber Resilience Act (CRA), showing its phased implementation alongside key dates for related EU regulations, such as NIS2, RED, and the GDPR.
Figure 2. A visual summary of the two-phase methodology. Phase 1 (top) details the systematic distillation of legal text into the lifecycle checklist. Phase 2 (bottom) shows how this checklist is used to construct the AHP-based rating tree for quantitative assessment.
Figure 3. A visual model of the DevSecOps lifecycle adapted for CRA compliance, mapping key regulatory actions to each phase of development and operations.
Figure 4. TRUEDATA CRA readiness profile (project-wide). The purple area shows the current baseline; the green ring indicates the pilot-ready target across the assessed categories.
Figure 5. High-level technical architecture of the TRUEDATA solution, showing the data flow from OT/IT network ingestion to the AI-driven detection engine.
Table 1. Comparative overview of key EU instruments related to the CRA.
Regulation | Primary Scope | Core Focus | Key Obligations for IoT | Approach
Cyber Resilience Act (CRA) | All Products with Digital Elements (PDEs) | Intrinsic product security across lifecycle | Security by design; vulnerability management; SBOM; secure updates | Horizontal and mandatory
NIS2 Directive | Essential/important operators | Resilience of networks and systems | Supply-chain security; risk management; incident reporting | Sectoral and mandatory
GDPR | Processing of personal data | Data protection and privacy | Privacy by design; DPIAs; breach notification | Horizontal and mandatory
Radio Equipment Directive (RED) | Radio equipment | Safety and essential requirements | Network protection; privacy safeguards; anti-fraud controls | Vertical and mandatory
AI Act | AI systems | Risk-based regulation of AI | Robustness and security for high-risk AI (often embedded in IoT) | Horizontal and mandatory
Data Act | Data from connected products | Fair access and use of data | Secure data access by design for IoT devices | Horizontal and mandatory
EU Cybersecurity Act (CSA) | ICT products, services, processes | EU certification framework | Certification schemes with assurance levels; presumption support | Horizontal and voluntary framework
Table 2. Comparative analysis of security frameworks and their relation to the CRA compliance checklist.
Framework/Standard | Primary Scope | Nature | Core Focus | Key Limitation for CRA Compliance
ISO/IEC 27001:2022 [38] | Organizational Information Security | Voluntary Management Standard (ISMS) | Top-down, process-driven risk management for the entire organization (the ‘Who’ and ‘Why’) | Not a product-specific or lifecycle-based framework; does not translate legal articles into specific engineering requirements.
ETSI EN 303 645 | Consumer IoT | Voluntary Standard | Product security principles (the ‘What’) | Lacks detailed lifecycle process and specific CRA procedural mapping.
NIST IR 8259A/B | General IoT (US focus) | Voluntary Guidance | Product and manufacturer capabilities | Not a lifecycle framework; not mapped to EU legal requirements.
IEC 62443-4-1/4-2 | Industrial/OT Systems | Voluntary Standard (basis for harmonized European standards (hENs)) | Process and product technical rigor (the ‘How’) | Domain-specific and potentially too heavyweight for all IoT; lacks direct mapping to CRA articles.
NIST SSDF | General Software | Voluntary Guidance | Catalogue of process best practices | A high-level collection of practices, not an auditable compliance framework.
OWASP SAMM/BSIMM | Organizational | Voluntary Maturity Model | Internal process and culture improvement (the ‘Who’) | Measures capability, not legal conformity; does not produce specific CRA compliance artifacts.
This Paper’s Checklist | All PDEs in EU Market | CRA Compliance Tool | Regulatory and lifecycle obligations (the ‘Proof’) | N/A (this is the proposed solution).
Table 3. The systematic requirement distillation and lifecycle mapping process.
Step | Description | Input | Output
1. Corpus Definition | Identify and select the relevant legal articles and annexes containing manufacturer obligations. | The full text of the CRA Regulation. | A scoped set of articles and annexes (e.g., Art. 13, 14; Annex I, II, VII).
2. Requirement Extraction | Perform a structured reading to identify and tag semantic components (Subject, Modal, Action, Object) of each obligation. | The scoped legal text. | A set of annotated phrases and clauses.
3. Requirement Normalization | Rephrase each extracted obligation into a consistent, atomic, and verifiable requirement statement. | Annotated phrases. | A list of normalized requirements.
4. Lifecycle Traceability | Map each normalized requirement to the relevant phase(s) of the DevSecOps lifecycle based on its primary activity type. | Normalized requirements. | A lifecycle-mapped checklist of requirements.
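To show how the Phase 1 outputs of Table 3 can be captured in machine-readable form, the following sketch models a single extracted obligation with its tagged semantic components, normalized statement, and lifecycle mapping. The example requirement text and phase names are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class CRARequirement:
    """One atomic, lifecycle-mapped requirement distilled from the CRA text."""
    source: str                 # legal reference, e.g., "Annex I, Part II(1)"
    subject: str                # tagged semantic components (Step 2)
    modal: str
    action: str
    obj: str
    normalized: str             # atomic, verifiable statement (Step 3)
    lifecycle_phases: list[str] = field(default_factory=list)  # Step 4

req = CRARequirement(
    source="Annex I, Part II(1)",
    subject="manufacturer",
    modal="shall",
    action="draw up",
    obj="a software bill of materials",
    normalized="The manufacturer generates a machine-readable SBOM for each "
               "released version of the product.",
    lifecycle_phases=["build", "release"],
)
print(req.normalized, "->", req.lifecycle_phases)
```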
Table 4. Best practices across key domains aligned with CRA obligations.
Domain | Key Action | CRA Requirement
Secure Design | Perform threat modeling in the specification phase; include security-by-design requirements in the backlog | Annex I (Part I)
Automated Testing | Integrate symbolic fuzzing, static analysis, and dynamic analysis into CI/CD; link test results to the SBOM for patch prioritization | Annex I (Part II(3)); Art. 31; Annex VII
SBOM and Traceability Management | Generate/validate an SBOM in each build (CycloneDX/SPDX); monitor critical dependencies and alert on CVEs | Annex I (Part II(1)); Art. 31; Annex VII
Patch Delivery (OTA) | Provide an OTA channel with end-to-end encryption and signatures; include periodic remote attestation | Annex I (Part II(7)–(8))
Coordinated Vulnerability Disclosure | Define a CVD policy (STIX/CVRF); set 24 h internal alert and 72 h initial report targets | Annex I (Part II(5)–(6)); Art. 13(17); Art. 14
Monitoring and Response | Integrate LwM2M probes with a SIEM; configure anomaly detection and automated response playbooks | Annex I (Part I(2)(l))
DevSecOps Governance | Establish a CSMS aligned with IEC 62443 with KPIs (Mean Time To Remediation (MTTR), SBOM coverage, etc.); implement Dev ↔ Sec ↔ Ops feedback loops | Art. 13; Annex I (Part II)
Documentation and Training | Maintain internal guides translating ambiguous terms into audit criteria; run “shift-left” training | Annex II; Annex VII
Table 5. AHP pairwise comparison matrix for CRA compliance domains. The principal eigenvalue is λmax = 8.139, resulting in a CI of 0.020 and a CR of 0.014. As CR < 0.10, the judgments are consistent. Headers are abbreviated for fit (e.g., Design = secure by design).
Domain | Design | Vuln. Mngmt | Reqs. | SBOM | Incident | Supply Chn | Logging | Post-Mkt
Design | 1 | 2 | 3 | 4 | 4 | 5 | 5 | 6
Vuln. Mngmt | 1/2 | 1 | 2 | 3 | 3 | 4 | 4 | 5
Reqs. | 1/3 | 1/2 | 1 | 2 | 2 | 3 | 3 | 4
SBOM | 1/4 | 1/3 | 1/2 | 1 | 1 | 2 | 2 | 3
Incident | 1/4 | 1/3 | 1/2 | 1 | 1 | 2 | 2 | 3
Supply Chn | 1/5 | 1/4 | 1/3 | 1/2 | 1/2 | 1 | 1 | 2
Logging | 1/5 | 1/4 | 1/3 | 1/2 | 1/2 | 1 | 1 | 2
Post-Mkt | 1/6 | 1/5 | 1/4 | 1/3 | 1/3 | 1/2 | 1/2 | 1
Weight | 0.322 | 0.219 | 0.143 | 0.088 | 0.088 | 0.053 | 0.053 | 0.034
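The weights and consistency figures in Table 5 can be reproduced (up to rounding) from the pairwise matrix itself. The sketch below, assuming NumPy is available, extracts the principal eigenvector, normalizes it into the domain weights, and computes CI and CR using Saaty’s random index of 1.41 for an 8 × 8 matrix.

```python
import numpy as np

# Pairwise comparison matrix from Table 5 (rows/columns in the same order).
A = np.array([
    [1,   2,   3,   4,   4,   5,   5,   6],
    [1/2, 1,   2,   3,   3,   4,   4,   5],
    [1/3, 1/2, 1,   2,   2,   3,   3,   4],
    [1/4, 1/3, 1/2, 1,   1,   2,   2,   3],
    [1/4, 1/3, 1/2, 1,   1,   2,   2,   3],
    [1/5, 1/4, 1/3, 1/2, 1/2, 1,   1,   2],
    [1/5, 1/4, 1/3, 1/2, 1/2, 1,   1,   2],
    [1/6, 1/5, 1/4, 1/3, 1/3, 1/2, 1/2, 1],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # index of the principal eigenvalue
lambda_max = eigvals[k].real
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()                        # normalized priority vector

n = A.shape[0]
ci = (lambda_max - n) / (n - 1)              # consistency index
cr = ci / 1.41                               # Saaty random index for n = 8

print("lambda_max ~", round(lambda_max, 3))  # ~ 8.14
print("weights ~", np.round(weights, 3))     # ~ [0.322 0.219 0.143 ... 0.034]
print("CI ~", round(ci, 3), "CR ~", round(cr, 3))
```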
Table 6. TRUEDATA CRA readiness score justification.
Category | Score | Justification and Evidence | Checklist Ref.
Monitoring and detection | 9.2/10 | Excellent. A dual-domain approach: for OT, GNN models benchmarked on SWAT/WADI achieve a 0.92 F1-score and a 1.28% False Positive Rate (FPR); for IT, a Snort IDS with Industrial Control System (ICS) rules uses an automated pipeline to rotate hourly packet-capture (PCAP) files to cloud storage. A low-latency pipeline enables a Mean Time To Detect (MTTD) of ~1 s. | Table A2
Data integrity and traceability | 8.7/10 | Very good. A Node-RED ETL pipeline performs data cleaning, ontology-based validation, and semantic enrichment for high-fidelity telemetry. Blockchain integration is planned for immutable, auditable event logs. | Table A2
Secure-by-default | 7.8/10 | Good. Enforces security by default using Docker containers (Docker, Inc., Palo Alto, CA, USA) for process isolation and secure configurations, minimizing the initial attack surface. | Table A2
Access control and authentication | 7.5/10 | Good. Implements robust controls such as Transport Layer Security (TLS)/Secure Sockets Layer (SSL) for encrypted communications, JSON Web Tokens (JWT) for authentication, and a formal Role-Based Access Control (RBAC) model to enforce least privilege. | Table A2
Updates and lifecycle | 7.0/10 | Good. The Docker architecture simplifies updates. The score is lowered by the lack of a formal product support period and a dedicated OTA update policy, which are pending formalization. | Table A3 and Table A5
Incident response and reporting | 6.8/10 | Medium. Incident response is implicit via dashboards and automation. The score is lowered by the lack of documented escalation paths, a dedicated PSIRT, and regulatory-reporting workflows. | Table A3 and Table A4
Logging and observability | 6.0/10 | Medium. Basic logging is in place, but a centralized solution (e.g., Fluentd/Kibana) with defined retention/alert policies is pending, limiting deep forensic and KPI tracking capabilities. | Table A2
Vulnerability management | 5.5/10 | Medium–low. Lacks a recurring automated CVE/VEX workflow tied to container scanning and risk-based SLAs. Proactive vulnerability monitoring is not yet a continuous process. | Table A3
SBOM and supply chain | 3.5/10 | Low. No automated generation of machine-readable SBOMs (e.g., SPDX/CycloneDX) for each build. This is a significant governance gap impacting supply-chain transparency. | Table A3
Coordinated vulnerability disclosure | 3.0/10 | Low. A formal CVD policy, a public security contact (e.g., security.txt), and a Computer Emergency Response Team (CERT)-aligned intake process have not been published. This is a baseline requirement treated as a high-risk gap. | Table A1
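Finally, a weighted readiness score of the kind reported for TRUEDATA can be obtained by combining per-category scores with AHP-derived weights. The sketch below is a generic aggregation only: the pairing of Table 6 categories with Table 5 weights shown here is an illustrative assumption, not the exact mapping used in the case study.

```python
def weighted_readiness(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of per-category scores (0-10), with weights re-normalized."""
    total_w = sum(weights[c] for c in scores)
    return sum(scores[c] * weights[c] for c in scores) / total_w

# Illustrative subset: category -> score from Table 6, with assumed AHP-style weights.
scores = {"Secure-by-default": 7.8, "Vulnerability management": 5.5,
          "SBOM and supply chain": 3.5, "Incident response and reporting": 6.8}
weights = {"Secure-by-default": 0.322, "Vulnerability management": 0.219,
           "SBOM and supply chain": 0.088, "Incident response and reporting": 0.088}

print(f"Weighted readiness ~ {weighted_readiness(scores, weights):.2f} / 10")
```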
