Article

Development and Validation of a Digital Maturity Gap Analysis Toolkit: Alpha and Beta Testing

1 Sustainable Infrastructure Research and Innovation Group, Department of Civil, Structural and Environmental Engineering, Munster Technological University, T12 P928 Cork, Ireland
2 Sustainable Infrastructure Research and Innovation Group, School of Building and Civil Engineering, Munster Technological University, T12 P928 Cork, Ireland
* Author to whom correspondence should be addressed.
Buildings 2026, 16(7), 1305; https://doi.org/10.3390/buildings16071305
Submission received: 6 February 2026 / Revised: 13 March 2026 / Accepted: 22 March 2026 / Published: 25 March 2026
(This article belongs to the Section Construction Management, and Computers & Digitization)

Abstract

Digitalisation is transforming organisational practices, making digital readiness essential for strategic planning. However, customised digital maturity tools for the Irish Architecture, Engineering, Construction, and Operations (AECO) sector remain limited. This paper presents the development and validation of a Digital Maturity Gap Analysis Toolkit (DMGAT) for the Irish AECO sector. The toolkit assesses digital maturity across three dimensions—people, process and culture; technology; and policy and governance—covering 16 sub-dimensions and 69 assessment questions. Unlike existing tools such as the BIM Maturity Matrix, VDC BIM Scorecard, and Maturity Scan, the DMGAT uniquely integrates ISO 19650 maturity stages with a comprehensive maturity level matrix across three key dimensions, offering a customised, industry-specific assessment for the Irish AECO sector that combines structured benchmarking with actionable gap analysis. The toolkit supports gap analysis by comparing an organisation’s current maturity profile with the detailed descriptors of higher maturity levels (maturity level matrix), thereby enabling prioritised and context-specific improvement planning rather than pursuit of a uniform maximum level. The study uses a mixed-methods approach within a Design Science Research (DSR) framework, developing the tool across six phases: literature review; defining dimensions and key performance indicators (KPIs); prototype development; testing; refining and finalisation; and deployment for practical application and empirical evaluation within real organisational contexts in the Irish AECO sector, demonstrating its use as an operational diagnostic and learning tool. Alpha testing by the organisational research team led to structural refinements, including the maturity stages, KPIs, and maturity level matrix. Beta testing with 20 Irish AECO organisations confirmed the toolkit’s relevance, scope, and coverage. Participants highlighted its clarity and industry alignment, while suggesting minor improvements in wording, visuals, and support materials. This study concludes that the DMGAT is a useful resource for informed decision-making and digital innovation in the Irish AECO sector.

1. Introduction

Digital transformation is essential for businesses of all sizes, reflecting a rapid evolution in corporate environments as processes and practices are adapted to effectively integrate new technologies [1]. The process involves implementing digital infrastructure to accommodate the evolving requirements of stakeholders, staff, and clients [2]. The integration of cloud computing, big data, social media, and mobile internet drives this transition [3], pushing businesses to innovate their models [4]. Digitisation enhances connectivity and manageability, drives industry through information acquisition, and provides competitive advantages, while organisations that neglect it risk reduced profitability [5,6]. Digital transformation in the AECO sector is also increasingly characterised by data-driven monitoring and precise early warning of critical infrastructure assets. Recent studies on bridge tower and cable performance monitoring demonstrate how integrated sensing, data fusion, and predictive warning methods can enhance decision-making, safety, and operational responsiveness in infrastructure management [7,8].
Digital maturity assessment tools evaluate competencies and determine the capacity to adopt technological innovations. These tools help organisations to identify gaps and formulate BIM implementation strategies to enhance digital maturity [9]. Digital maturity assessment tools are crucial for defining the conceptual scope, development trajectory, and milestones of the target systems. These frameworks have been adapted to assess digital transformations in retail, internet services, and manufacturing [10]. Similarly, these tools support AECO enterprises by evaluating organisational operations through the assessment of digital competency maturity levels. As the construction industry continues to evolve, adopting digital maturity assessment tools becomes essential for companies aiming to remain competitive. However, AECO stakeholders may face challenges in selecting suitable assessment tools [11].
In response to digital evolution, Ireland’s government initiated a phased BIM implementation plan in January 2024, mandating digital delivery integration in AECO projects. These mandates aim to improve planning, design, construction, and operation by focusing on efficiency, quality, cost, and sustainability [12,13,14,15]. McKenna et al. [16] analysed digital adoption practices across eight countries: Denmark, Finland, the Netherlands, Germany, the UK, New Zealand, Australia, and Singapore. The study found that governments lead BIM implementation initiatives with industry partners but face challenges in measuring adoption and sector fragmentation, particularly among SMEs. The study [16] confirms that Ireland’s Build Digital Project aligns with international best practices and recommends developing business cases, exemplar projects, and measurement methodologies. The authors previously conducted a comprehensive review [17,18] of existing digital maturity assessment tools, identifying several gaps, including a lack of comprehensive assessment, inadequate integration of key components, and a lack of fully integrated dimensional evaluation across areas such as people, process, culture, policy, governance, and skills. The study proposed a conceptual framework to address these shortcomings.
This paper outlines the development and validation of a Digital Maturity Gap Analysis Toolkit (DMGAT), based on the framework proposed in the preceding study [18]. This Microsoft Excel-based tool is designed to support Irish AECO organisations in assessing their digital maturity. The toolkit evaluates digital maturity using three primary dimensions divided into 16 sub-dimensions and 69 assessment questions. It uses a five-level maturity scale aligned with ISO 19650 stages [19,20], ranging from manual procedures to full BIM integration. By providing dimension-specific results and sub-dimension scores, the toolkit highlights gaps in organisational digital maturity, guiding improvement priorities. The toolkit’s development process includes conceptual framework development based on the literature review, prototype design, pilot testing, and feedback integration. Alpha testing involved an in-house organisational research team to ensure functionality and resolve usability issues. Beta testing was conducted with industry stakeholders to refine the toolkit based on feedback provided by practitioners. This approach ensured that the toolkit effectively addressed the needs of industry stakeholders, facilitating the efficient assessment of digital maturity and offering actionable insights. The findings aim to establish a foundation for advancing digital transformation within the Irish AECO sector, enabling enterprises to achieve greater digital integration and competitiveness.

2. Background and Research Context

2.1. Context

Digital transformation in the construction industry involves the adoption of technologies such as cloud computing, the Internet of Things (IoT), and Building Information Modelling (BIM) to digitise a wide range of organisational and project processes [21,22]. Recent studies indicate that the AECO sector is in a transitional phase of digital advancement, with increasing uptake of BIM, cloud-based platforms, data-driven workflows, and other Construction 4.0 technologies, although the pace and depth of implementation remain uneven across organisations and contexts [23,24].
This shift requires identifying both progress and persistent shortcomings across different stages of adoption, as well as assessing the efficiency and goal achievement of digital transformation initiatives [9]. Although implementation has expanded, the sector continues to face challenges related to fragmented practices, inconsistent strategic direction, and varying levels of organisational digital maturity, reinforcing the need for structured and context-sensitive maturity assessment frameworks and tools [24,25].
Digital maturity assessment tools assist in determining the requirements and difficulties of digitalisation across several operational domains in the construction industry. Production and other internal processes are more directly influenced by management decisions, rendering them more receptive to the effects of digital transformation [26]. Strategic alliances and shared management between firms are crucial for operations such as logistics, which occur at the intersection of an organisation and its business partners. Consequently, the cooperation of all parties and their interactions are essential for successful digital transformation [27,28]. To address these complexities, it is recommended to customise digital maturity assessments at the organisational level and for specific operational areas [29]. This method ensures successful implementation across all processes and enables a detailed assessment of transformation initiatives in line with strategic goals. Industry stakeholders often embrace this customised approach, which offers a comprehensive framework for assessing digital maturity and adopting digital transformation [30].

2.2. Approaches to Maturity Model Development

Maturity models have led to the development of maturity assessment tools, crucial for evaluating organisational management competence and enhancing critical success factors [31]. These models are particularly beneficial for digital transformation leaders and managers, enabling them to identify areas for improvement within the organisation [32]. Angreani et al. [33] highlighted three critical purposes of maturity models: auditing and benchmarking measures; tracking progress from current assessments to goals; and identifying strengths, weaknesses, and opportunities. Lasrado et al. [34] suggested four primary methods for developing maturity models, which can be categorised as follows:
Conceptual: These models are based on a theoretical framework for identifying dimensions. For inclusion in this category, a robust theoretical basis is essential, rather than mere reference to previous maturity models.
Qualitative: This category includes models that primarily use a qualitative empirical approach to determine dimensions and levels.
Quantitative: Models that employ a quantitative empirical approach to establish dimensions and maturity levels come under this category.
Derivative: This category includes models that predominantly derive from the existing literature on maturity models, adapting it to address appropriate domain-specific issues without relying heavily on robust theoretical or empirical foundations. Additionally, it encompasses models developed with a focus on practical application for industry.
The integration of qualitative and quantitative methods, leading to a mixed-methods approach, is more commonly employed in the development of maturity models [35]. Typically, a literature review is followed by the development of a conceptual framework and prototype, which is then validated and tested through focus groups and/or interviews before the measuring tool is operationalised. Schumacher et al. [36] established common structural characteristics of a maturity model:
  • Maturity levels typically range from 1 (lowest) to 5 (highest), with some models including Level 0.
  • Maturity assessment areas range from 3 to 16.
  • Assessment modes can involve self-assessment or external audits.
  • Representation methods include numerical data, which are often visualised using radar charts in terms of percentages.
Two primary representation approaches, staged and continuous, are common in model development [32,36]. The staged model emphasises organisational maturity through five levels, making it ideal for structured growth and benchmarking, while the continuous model enables targeted improvements in specific process areas using capability levels, offering greater flexibility. Staged representation requires all areas to achieve a minimum rating to advance maturity levels, while continuous representation evaluates each area independently to align with organisational priorities [37]. The continuous model enhances individual process areas through capability levels 0 to 5, whereas the staged model evaluates organisational maturity across maturity levels 1 to 5 [38]. Both models share components like process areas, goals, and practices, yet they differ in scope. Capability levels assess specific area performance, while maturity levels measure overall organisational maturity. Figure 1 shows a graphical representation of these approaches.
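To make the distinction concrete, the following minimal sketch contrasts the two representations; the process areas, ratings, and thresholding rule are hypothetical illustrations of the logic described above, not part of any published model.

# Staged vs. continuous maturity representation (illustrative Python sketch).
# Process areas and ratings are hypothetical examples.
capability = {"design": 4, "logistics": 2, "procurement": 3}  # continuous: level 0-5 per area

def staged_maturity(ratings):
    # Staged: overall maturity is capped by the weakest area, since every
    # area must reach a minimum rating before the level can advance.
    return min(ratings.values())

def continuous_profile(ratings):
    # Continuous: each area keeps its own capability level, so improvement
    # can be prioritised area by area.
    return sorted(ratings.items(), key=lambda item: item[1])

print(staged_maturity(capability))    # 2 -> organisation-wide maturity level
print(continuous_profile(capability)) # [('logistics', 2), ('procurement', 3), ('design', 4)]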

2.3. Digital Transformation and Maturity Assessment Tools

Digital transformation, driven by digitisation and digitalisation, is influenced by Industry 4.0 and has been accelerated by the COVID-19 pandemic [39,40,41]. Digital transformation influences organisations differently based on sector and supply chain position [30]. Companies with advanced digitisation show greater crisis resilience, relying on project success and enhancing project management capabilities through digital adoption [41]. Studies [32,42,43,44] showed that business-to-consumer (B2C) organisations experience faster effects of digital transformation compared to business-to-business (B2B) organisations. These dynamics affect the benchmark outcomes of digital maturity assessment tools.
Maturity assessment tools are crucial in the digital transformation processes across various industries, with a particular emphasis on BIM implementation within the construction sector [29,44,45,46,47,48,49,50]. These tools play a significant role in identifying organisational gaps, highlighting weaknesses and strengths, guiding organisations on effective digital adoption, and formulating improvement roadmaps [13,45]. Additionally, these tools are essential in assessing an organisation’s innovation strategies [46,47], determining their readiness to adopt innovative technologies, and outlining steps toward achieving advanced maturity levels [48]. Furthermore, these tools assist organisations in benchmarking and managing their digital transformation initiatives [42,49]. Previous studies [36,50] highlighted that the majority of digital maturity assessment tools employ graded levels to evaluate areas such as technology, strategy, and organisation. Certain tools focused on specific domains, including value creation processes [51], knowledge-intensive business processes [52], and logistics [29,33,42,49]. Similarly, Hein et al. [53] identified technology, strategy, policy, organisational management, processes, and basic organisational culture as primary focus areas, while customers, skills training, financial resources, and corporate digital culture were noted as underemphasised areas [33,54]. Gollhardt et al. [55] and Wagire et al. [10] developed tools shaped for manufacturing and Industry 4.0, designed to assist enterprises in identifying gaps and improvement pathways. Borštnar and Pucihar [56] developed a tool specifically for SMEs, addressing their resource constraints, whereas Zhao et al. [57] and Smits et al. [58] concentrated on tools for risk assessment and BIM adoption. Liang et al. [59] developed a tool to advance BIM technology across projects, companies, and industries. Wernicke et al. [60] proposed a tool to assess digital maturity in construction operations for long-term improvements, while Perera et al. [24] developed a tool for evaluating digital maturity in the Australian design and construction sectors. Some tools take a comprehensive approach rather than targeting specific sectors [61]. Aras and Gülçin [62] analysed how existing tools are integrated into digital transformation processes to inform the development of a tool for business functions. Tools designed to assess digital maturity gaps enable enterprises to better adapt to digital environments [63]. Prescriptive and modular maturity tools with priority-weighted components guide improvement strategies [64]. However, gaps persist, as most tools focus on a specific domain such as design or operation, and they lack construction industry customisation and adoption of international standards [19,24,62,63]. Developing digital maturity assessment tools faces challenges such as limited empirical validation, the assumption of linear digital transformation, and the neglect of industry-specific needs [2,32]. Despite these challenges, existing tools such as the BIM Maturity Matrix [65] and Maturity Scan [66] help organisations reach desired maturity levels. Similarly, the scope of the Capability Maturity Model has expanded into fields like human resource management [52] and technological innovation [53].
Assessment scope varies among tools, with some focusing on internal capabilities [67], while other tools enable cross-organisational comparisons, such as the Capability Maturity Model [38,52,53] and BIM Cloud Score [68], or assessments at multiple levels, including individual, team, and organisational levels, such as the BIM Maturity Matrix [65] and BIM Excellence [69]. Tools use different evaluation metrics, including capability maturity scales [38,52], compliance quantification (VDC BIM Scorecard) [70,71], and dimensional assessments (Organisational BIM Assessment Profile) [72]. Each tool has strengths, such as real-time collaboration (BIM Cloud Score) [68] and comprehensive coverage of maturity aspects (VDC BIM Scorecard) [70]. Despite these strengths, some tools have limitations, including issues of subjectivity (NBIMS-CMM) [38,52,53], reliance on self-reporting (BIMe) [69], and a lack of documentation of practical use (Organisational BIM Assessment Profile) [72].
Recent initiatives have broadened the scope and evaluation methodologies of maturity tools. Ullah et al. [18] conducted a systematic review of 20 existing digital maturity assessment tools, offering a systematic guide for the development of maturity assessment tools with a comprehensive approach. The study analysed key characteristics of existing tools, including the assessment matrix, accessibility, limitations, and the time required for assessment completion. It proposed a framework for a comprehensive digital maturity gap analysis toolkit, offering an integrated assessment for construction organisations. Maturity assessment tools facilitate systematic improvements and guide organisations in their digital transformation efforts. However, assessment tools designed for SMEs and local authorities remain limited in regional contexts such as Ireland. To address these limitations, this paper introduces the development and validation of a DMGAT for local authorities and construction organisations within the Irish construction sector. The DMGAT uses three dimensions, divided into 16 sub-dimensions and 69 assessment questions, to identify gaps in organisational digital maturity and suggest improvements to achieve higher maturity levels, thereby supporting organisations in maintaining competitiveness.

2.4. Testing Regimes in the Development of Maturity Assessment Tools

Digital maturity assessment tools are increasingly critical to steering digital transformation within the AECO sector [73] and have the potential to greatly enhance operational efficiency, notably reducing document approval time by 75% in construction SMEs [62]. The development and validation of these tools typically involve literature reviews, expert interviews, and case studies [74,75]. Previous studies [64,76,77] used case studies to validate maturity tools. The number of participants in testing and validation varies widely, from 5 interviews [74] to 24 companies [73], or even more across different sectors [13]. Some studies employ larger sample sizes, such as 17 public sector organisations [45], or industry-wide surveys and interviews [61]. However, validating these tools remains a significant challenge, with only a few being empirically evaluated across various sectors [13]. Iterative feedback is crucial in ensuring that KPIs effectively assess digital maturity across different dimensions and refine scoring methods [78]. The toolkit’s clarity and effectiveness are underscored by validation through testing phases, leading to improvements that better meet industry requirements [63,78].

3. Methodology

3.1. Approach to Toolkit Development

This study adopts a mixed-methods approach embedded within a Design Science Research (DSR) framework to develop and validate the DMGAT. Mixed-methods research integrates qualitative and quantitative approaches to strengthen tool development and validation [79], while DSR supports systematic artefact creation and evaluation to address real-world problems [80].
The DMGAT was developed through six phases, as shown in Figure 2. Phase 1 employed a systematic literature review to analyse existing digital maturity toolkits and identify research gaps. Phase 2 involved conceptual framework development based on the review findings, focusing on the formulation of assessment questions, dimensions, and sub-dimensions, and on operationalising conceptual constructs into measurable indicators for content validity. In Phase 3, a pilot version of the DMGAT was developed utilising the conceptual framework to integrate qualitative framework components with quantitative scoring criteria and maturity levels, using a mixed-methods approach [79]. Phase 4 comprised iterative testing: alpha testing using quantitative analysis to refine framework consistency, followed by beta testing with AECO stakeholders to collect feedback on usability, relevance, and practical applicability, following DSR principles [80]. Phase 5 synthesised results and feedback to finalise a validated toolkit suitable for practical deployment. In Phase 6, the validated DMGAT will be deployed for practical application and empirical evaluation within real organisational contexts in the Irish AECO sector, demonstrating its use as an operational diagnostic and learning tool.
The overall methodology of this research adopts a mixed-methods approach aligned with DSR principles through artefact creation (the DMGAT), iterative evaluation cycles (alpha and beta tests), and practical relevance (engagement with industry stakeholders). This approach ensures that the toolkit effectively addresses both operational and strategic needs [35,79,81] and contributes to the advancement of theoretical knowledge while providing a practically applicable assessment tool.

3.2. Framework and Prototype of the Toolkit

A systematic review [18] preceded this study and served as its conceptual foundation. It focused on existing tools to identify gaps and position the DMGAT within current knowledge. Five tools, namely the Maturity Scan, BIM Maturity Matrix, BIM Compass, Interactive Capability Maturity Model, and National BIM Standards Capability Maturity Model, were identified as most appropriate to inform development of a digital maturity gap analysis toolkit for the Irish AECO sector. The DMGAT evaluates organisational digital maturity across three dimensions: people, process, and culture, with six sub-dimensions and 23 assessment questions (AQs); technology, with four sub-dimensions and 15 AQs; and policy and governance, with six sub-dimensions and 31 AQs, as shown in Figure 3. The toolkit’s assessment methodology appraises maturity levels across digital maturity aspects. By addressing these dimensions, the framework provides a comprehensive view of digital capabilities, supporting organisations to identify improvement areas and formulate strategies to enhance digital maturity. A complete traceability matrix linking assessment questions (AQ1–AQ69) to the relevant ISO 19650 reference areas and principles is provided in Supplementary File S1.
The distribution of assessment questions across the three dimensions was not determined arbitrarily but emerged from the structured literature synthesis and comparative analysis reported in the preceding systematic review [18], followed by expert-led refinement during toolkit development. Recurring maturity constructs identified across existing digital maturity assessment tools were extracted, coded, and clustered into broader organisational, technological, and governance-related domains. This process showed that the people, process, and culture and policy and governance dimensions encompassed a wider range of interrelated constructs, including organisational readiness, collaboration, leadership, standards, compliance, and strategic alignment, therefore requiring broader question coverage. In contrast, the technology dimension addressed a more bounded set of capabilities related to digital tools, infrastructure, and technology use, resulting in a comparatively smaller number of assessment questions. Accordingly, the allocation of 23, 15, and 31 questions across the three dimensions was intended to ensure comprehensive construct coverage rather than equal numerical weighting. Overall, construct development and preliminary validation were informed through structured literature synthesis and expert validation.
The prototype toolkit, developed in accordance with the framework presented in Figure 3, constitutes a substantial advancement in the practical application of the research findings. As illustrated in Figure 4, this Microsoft Excel-based toolkit comprises multiple tabs specifically designed to address individual aspects of the framework. The initial page of the toolkit serves as the primary guide and dashboard, enabling users to navigate seamlessly to any section with a single click. The next tab offers a comprehensive overview of the toolkit’s full functionality, encompassing its purpose, development rationale, operational instructions, usage guidelines, and scoring criteria. The following tab details the maturity level matrix, which outlines key performance indicators and their correlation with maturity levels. The self-assessment is organised into the subsequent three tabs, representing the dimensions of people, process, and culture; technology; and policy and governance, respectively. Each dimension contains sub-dimensions with specific questions related to those sub-dimensions. Each question is rated on a 1-to-5 Likert scale, corresponding to the five maturity levels. The next three tabs display the organisation’s digital maturity results for the various sub-dimensions across the three main dimensions in individual tabs, as shown in Figure 5. The results are represented as the organisation’s percentage digital maturity in each area. This visualisation aids in identifying areas requiring increased focus to enhance successful BIM implementation. Additionally, the tool represents the digital maturity stages of these sub-dimensions within the context of the three primary dimensions. The final tab provides a detailed overview of the organisation’s current status across the three main dimensions, displaying the percentage of digital maturity for each dimension and indicating its current maturity stage, as shown in Figure 5. The toolkit incorporates components that facilitate data collection and analysis through assessment questions, ultimately presenting the results of organisational digital maturity.
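To illustrate the scoring logic described above, the following sketch derives percentage maturity per sub-dimension from 1-to-5 Likert responses; the sub-dimension names, response values, and normalisation formula are assumptions for illustration only, not the toolkit’s Excel implementation.

# Hypothetical sketch of the percentage-maturity calculation described above.
LEVELS = ("Initial", "Planned", "Defined", "Managed", "Optimised")  # levels 1-5

responses = {  # hypothetical Likert scores (1-5) per assessment question
    "leadership": [3, 4, 2],
    "collaboration": [2, 3],
    "training": [4, 4, 5],
}

def percent_maturity(scores):
    # Assumed normalisation: map the mean 1-5 score onto a 0-100% scale.
    mean = sum(scores) / len(scores)
    return round((mean - 1) / 4 * 100, 1)

for sub_dimension, scores in responses.items():
    mean = sum(scores) / len(scores)
    stage = LEVELS[round(mean) - 1]  # nearest maturity level for the sub-dimension
    print(f"{sub_dimension}: {percent_maturity(scores)}% ({stage})")

all_scores = [s for scores in responses.values() for s in scores]
print(f"dimension overall: {percent_maturity(all_scores)}%")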

3.3. Approach to Toolkit Validation

The development and validation of the DMGAT followed a structured approach to ensure its reliability, usability, and effectiveness for organisations in the Irish construction sector. The fourth phase of toolkit development, which is the main focus of this paper, evaluated various aspects of the prototype through comprehensive alpha and beta testing. Alpha testing and beta testing were designed to serve distinct validation purposes within the DSR process. Alpha testing focused on evaluating the prototype’s internal logic, structural coherence, core functionality, interface design, and alignment with ISO 19650 maturity stages. This phase was carried out internally by organisational research team experts with extensive experience in both academic and industrial digital domains. Technical issues related to the prototype’s maturity stages, maturity level matrix, visualisation, and refinement of assessment questions were identified and subsequently addressed through iterative feedback loops, leading to an improved version of the tool. While this approach enabled efficient internal refinement of the artefact, it may also have introduced research bias, as the alpha testers were closely connected to the development process. The improved toolkit then underwent beta testing with external AECO industry stakeholders to assess its real-world applicability, usability, and effectiveness. This phase aimed to evaluate participants’ perceptions of the toolkit’s functionality and alignment with industry needs, while also helping to mitigate the potential bias associated with internal alpha testing. The evaluation process involved interactive assessment sessions, each lasting approximately 90 min. These sessions were designed to gather feedback on the toolkit’s effectiveness and ability to address real-world challenges in the industry. The data collection methods included individual assessment sessions, as well as the administration of questionnaires and observational studies conducted during the various phases of toolkit development. A standardised questionnaire, structured around the following four areas, was employed to gather the feedback:
  • General Perception: obtaining initial impressions about the toolkit, including its perceived relevance and potential impact.
  • Usability, User Experience, and Support: evaluating the toolkit interface, ease of navigation, and any required resources for optimal functionality.
  • Relevance and Coverage: assessing the comprehensiveness of the toolkit in addressing digital maturity aspects and their alignment with industry standards.
  • Key Observations and Recommendations: documenting specific insights, suggestions, and recommendations for improving the tool.
A mixed-methods approach was employed to analyse the feedback, which was subsequently integrated into the prototype to further refine the toolkit. The systematic validation process will conclude with the dissemination of the final version of the toolkit to Irish AECO stakeholders, thereby ensuring its continued relevance to the industry and its effectiveness in guiding construction organisations toward successful digital transformation.

3.4. Ethical Considerations

Ethical approval for this study was obtained from the Human Research Ethics Committee (HREC) of Munster Technological University prior to conducting the alpha and beta testing stages involving human participants. The approved study was titled “Development and Validation of Digital Maturity Gap Analysis Toolkit (DMGAT) for Construction SMEs” and was granted approval under reference MTU-HREC-MR-24-023-A on 11 June 2024 (Supplementary File S2).
The ethical approval covered the engagement of industry practitioners participating in the toolkit validation process, including both the alpha and beta testing phases, which involved structured feedback sessions and one-to-one assessment exercises. All participants were informed about the purpose of the research, participation was voluntary, and feedback was collected in accordance with the ethical guidelines outlined by the university’s Human Research Ethics Committee.

4. Prototype Validation and Feedback Analysis

4.1. Alpha Testing

The validation process was designed to evaluate the toolkit’s performance against established criteria, its usability for the intended users, and its capacity to deliver consistent and accurate results. Within this process, alpha testing and beta testing served different purposes. Alpha testing focused on validating the prototype’s internal logic, structural framework, core functionalities, design interface, and alignment with ISO 19650 standards before broader external validation and deployment. This phase involved a focused assessment by a research team comprising experts from the research-hosting institute and industry project partners, including a project lead, digital built consultant, public sector client, and developer and specialist contractor, who systematically evaluated the prototype’s core functionalities and interface. These experts possessed extensive academic qualifications and industrial experience, particularly in digital transformation and BIM implementation within the construction sector. Their backgrounds encompassed both theoretical research and practical industry experience, enabling them to critically assess the toolkit’s functionalities with a comprehensive understanding of current industry standards and academic rigour. This dual expertise provided a robust foundation for validating and refining key components such as the toolkit’s structural framework, dimensional parameters, and assessment methodologies, thereby ensuring that optimisations were both practically relevant and theoretically sound. However, as the alpha testing was conducted internally by individuals closely connected to the project, this phase may have introduced a degree of research bias. For this reason, beta testing was subsequently undertaken with external AECO stakeholders to evaluate the toolkit’s practical relevance and applicability under real-world conditions. The alpha testers performed practical evaluations to detect bugs, usability challenges, and performance limitations. Their detailed feedback provided critical insights that guided targeted refinements before the broader user testing phase. The insights obtained facilitated the identification of critical areas for enhancement in functionality, user interface design, and additional features, thereby laying the groundwork for beta testing with broader industry stakeholders. The key revisions are summarised as follows.
Transition to ISO 19650 Maturity Stages: The Bew and Richards maturity levels [82] were replaced with ISO 19650 maturity stages for alignment with international standards. This new model introduced progressive BIM implementation stages, allowing organisations to evaluate their status and plan advancements.
Updating Maturity Levels: The maturity levels were updated to initial, planned, defined, managed, and optimised, aligning them with a five-point Likert scale [83]. This approach allows organisations to evaluate their digital maturity by assigning levels to each aspect, corresponding to different maturity stages. These levels act as measurable indicators of maturity progression, enabling organisations to assess their current maturity stage and pinpoint areas for improvement.
Introducing Maturity Level Matrix: A maturity level matrix (Table 1) was developed to align maturity levels with sub-dimensions. The matrix details levels for assessing organisational maturity, with each level corresponding to specific benchmarks. This enables organisations to measure their current position and improvement targets while systematically tracking digitisation progress (see the illustrative sketch after this list of revisions).
Revised Assessment Questions: Assessment questions were updated to improve clarity and alignment with maturity levels, providing specific guidance for evaluating digital transformation practices. These updates help organisations measure maturity levels accurately, while facilitating progression through framework stages.
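To show how a maturity level matrix can drive gap analysis, the brief sketch below pairs each level with a benchmark descriptor and returns the next target for an organisation’s current level; the sub-dimension and descriptors are invented placeholders, not the content of Table 1.

# Gap-analysis sketch over a maturity level matrix (placeholder descriptors).
MATRIX = {
    "information_management": {
        1: "Initial: ad hoc, manual information handling",
        2: "Planned: basic conventions agreed but inconsistently applied",
        3: "Defined: documented workflows in routine use",
        4: "Managed: measured, audited information exchanges",
        5: "Optimised: continuously improved, data-driven processes",
    },
}

def next_target(sub_dimension, current_level):
    # Return the descriptor of the next maturity level, mirroring the
    # matrix-based comparison of a current profile against higher levels.
    levels = MATRIX[sub_dimension]
    if current_level >= max(levels):
        return "Already at the highest defined level"
    return levels[current_level + 1]

print(next_target("information_management", 2))
# -> "Defined: documented workflows in routine use"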
The alpha test provided significant benefits by identifying key areas for enhancement, including functionality, usability, and alignment with industry standards. This feedback enabled focused and precise refinements that improved the toolkit’s clarity, relevance, and robustness, ensuring it effectively meets AECO sector stakeholder needs and better supports systematic digital maturity evaluation in the sector. The alpha testing results confirmed the toolkit’s alignment with AECO sector stakeholders’ needs and international standards. The refinement process was subsequently expanded to include a broader range of stakeholders during the beta testing phase, as presented in Section 4.2.

4.2. Beta Testing

4.2.1. Context

Beta testing involved a comprehensive evaluation process with participants from the Irish AECO sector. The beta testing phase was designed primarily to confirm the suitability of the toolkit for industry application; assess the coverage of relevant digital maturity dimensions, sub-dimensions, and assessment questions; evaluate usability across both SMEs and larger organisations; and collect detailed practitioner feedback to support further refinement of the tool. To achieve this, a cohort of 20 participants, representing various organisations within the sector, was invited through targeted email invitations to participate in the evaluation. As participation was voluntary and required in-person, one-to-one assessment sessions, the sample may reflect a degree of self-selection bias, potentially attracting practitioners who were already more digitally engaged or interested in BIM-related practices. Nevertheless, the objective of the beta testing phase was to evaluate the toolkit’s operational validity, usability, and practical applicability in organisational settings, rather than to derive statistically representative conclusions about the digital maturity of the wider AECO sector. The testing methodology involved individual interactive sessions with all participants, each lasting approximately 90 min. These one-on-one assessment sessions led to a comprehensive assessment of the toolkit’s features and functionalities. Participants were encouraged to engage with the toolkit directly, simulating real-world applications and scenarios relevant to their professional roles. Throughout the sessions, participants were asked to provide feedback on their experiences, focusing on aspects such as user-friendliness, the adequacy of maturity aspects, practical applicability, and potential areas for improvement. Feedback was collected through a structured questionnaire, ensuring consistent and comparable data across all participants. The combination of interactive testing and structured feedback collection facilitated a thorough evaluation of the toolkit’s potential impact on the Irish AECO sector.

4.2.2. Participants

In the construction sector, organisations are classified into Tiers 1 through 4, based on their position within the supply chain [54]. Tier 1 organisations act as principal contractors or enterprises employing over 250 individuals. Tier 2 organisations are large subcontractors or enterprises employing 100–249 individuals. Tier 3 comprises small organisations with 50–99 individuals, while Tier 4 organisations are micro enterprises with fewer than 50 employees. The DMGAT was tested with 20 participants from various organisations within the Irish AECO sector, with a focus on Tier 1 organisations that are likely to be industry leaders in digital adoption. Although the beta testing involved 20 participants, each assessment session was conducted in person, with the researcher personally visiting the participating organisations to administer the toolkit and collect detailed and effective feedback. This approach was adopted to ensure consistency, depth of engagement, and accuracy in evaluating the toolkit’s practical usability and relevance across different organisational contexts.
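Restating this classification compactly, the sketch below maps headcount to tier; the thresholds follow the definitions above, while the function itself is only an illustrative convenience.

def supply_chain_tier(employees: int) -> int:
    # Thresholds follow the tier definitions in the text [54]; the text gives
    # "over 250" for Tier 1 and 100-249 for Tier 2, so 250+ is treated as Tier 1.
    if employees >= 250:
        return 1  # principal contractors / large enterprises
    if employees >= 100:
        return 2  # large subcontractors
    if employees >= 50:
        return 3  # small organisations
    return 4      # micro enterprises

print(supply_chain_tier(120))  # -> 2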
The inclusion of SMEs (Tier 3 and Tier 4), representing 20% of the sample, was intentional to ensure that the beta testing captured the perspectives of organisations facing distinct digital maturity challenges, including limited resources, lower levels of in-house digital expertise, and slower technology adoption. Their participation provided important insight into the barriers and support needs of smaller organisations, which differ from those of larger firms considered as digital leaders in the industry. At the same time, the stronger representation of Tier 1 organisations reflected their more established role as digital adoption leaders within the Irish AECO sector, making their feedback particularly valuable for evaluating the toolkit’s structure, maturity logic, and benchmarking potential. This sampling approach therefore balanced advanced digital maturity perspectives with the practical realities of smaller firms, thereby improving the relevance and applicability of the DMGAT across the wider Irish AECO sector. The beta testing sample included one participant from a Tier 4 (micro-enterprise) organisation, representing 5% of the total sample. Although this inclusion is valuable in demonstrating that the toolkit can be applied across different organisational scales, the representation of micro-enterprises is too limited to support statistically meaningful conclusions for this category. Therefore, findings related to Tier 4 organisations should be interpreted as exploratory only and should not be generalised to the wider SME or micro-enterprise sector.
The participant profiles, as shown in Figure 6, include information on their experience and involvement within the AECO sector, both digitally and generally. These participants were selected from a range of organisations, spanning Tier 1 to Tier 4, operating in various roles within the Irish AECO sector, including clients, consultants, contractors, and developers, as shown in Figure 7. These organisations play distinct roles in project execution and decision-making processes. Consultants primarily provide expert advice and strategic planning, while clients focus on project requirements and outcomes. Contractors and developers are responsible for the actual development and construction activities on site. This diversity ensures a comprehensive evaluation of the toolkit’s effectiveness and relevance to industry needs. Figure 7 illustrates the distribution of participants based on their roles within the industry. The sample comprises various roles, including BIM coordinators, architectural technologists, project leads, estimators or quantity surveyors, BIM managers, BIM leaders, professional bodies, project managers, and interior architects. This distribution highlights the diverse representation of roles among participants, reflecting a broad spectrum of operations within the AECO sector. Participants shared diverse perspectives on the toolkit’s suitability. Feedback was collected in four main areas: the overall perception of the toolkit’s relevance and impact, an evaluation of its usability and the support required, an assessment of its relevance and scope, and key observations and suggestions for improvement. This structured approach ensured detailed and consistent feedback, facilitating a balanced evaluation of the toolkit. The iterative development process was informed by the findings from this phase, enhancing the toolkit’s capacity to meet the various needs of professionals in the Irish AECO sector.

4.3. Results and Key Findings

4.3.1. General Perception

The overall perception of the DMGAT among the 20 participants was highly positive, with respondents emphasising its clarity, structure, and relevance. Participants praised the toolkit’s user-friendly design and its ability to reflect organisational digital maturity. A Tier 1 participant valued its comprehensiveness, while Tier 3 and 4 participants highlighted its clear results and potential as a learning resource. The toolkit’s alignment with ISO 19650 standards reinforced its credibility.
Figure 8 illustrates participants’ perceptions across initial impressions, organisational benefits, and role-specific benefits. A five-point Likert scale from “Poor” (1) to “Excellent” (5) was adopted for its common use in digital maturity and organisational assessments, ease of respondent interpretation, and suitability for ordinal data analysis [83]. Initial evaluations were positive, with the toolkit predominantly rated as “Excellent” or “Good.” Regarding organisational value, most respondents rated it as either “Excellent” or “Good,” with a minority assigning an “Average” or “Fair” rating. For role-specific benefits, ratings were consistently “Excellent” or “Good,” reflecting strong perceived personal value. Users noted that clearer guidance and customisable options would improve the toolkit’s applicability.
Qualitative feedback revealed that the toolkit was well received in multiple areas, with the highest level of enthusiasm observed at the initial impression and maintained through the recognition of its perceived benefits. The drop in “Excellent” ratings from the initial impression to the assessment of applied benefits, as shown in Figure 8, highlights the need for refinement in adaptability across various organisational sizes and in customisation to specific roles. The positive feedback affirms the toolkit’s design and its alignment with ISO 19650 standards, building user trust. The DMGAT has demonstrated reliability and industry relevance, with the potential to enhance digital maturity in both large companies and SMEs. Its reception is particularly significant within the Irish AECO sector, as it aligns with the sector’s ongoing efforts in digital adoption.
From an analytical perspective, the pattern in Figure 8 suggests that participants responded more strongly to the toolkit’s conceptual clarity than to its immediate applied value within all organisational settings. This distinction is important because it indicates that the DMGAT has already achieved one of the study’s core objectives, namely the development of a credible and understandable assessment tool, but that it still requires refinement to improve contextual fit across different organisational tiers and professional functions. The contrast between strong initial impressions and slightly more moderated ratings for organisational and role-related benefits also implies that respondents were able to distinguish between general acceptance of the tool and its practical utility in their own working environment, which strengthens the validity of the feedback obtained.

4.3.2. Usability, User Experience and Support

Figure 9 presents feedback on navigation, user experience, and support features, using colour codes to represent five-point Likert-type scale ratings. For support requirements, one-third of the participants rated the toolkit as an “Excellent” or “Good” resource requiring no additional support, while others rated it “Average” or “Fair,” suggesting that instructional and guidance videos would be helpful. Some participants believed substantial government support is necessary, while others indicated that limited support, such as incentives for SMEs, would be sufficient. Regarding user-friendliness, half of the participants rated the toolkit as “Excellent” and the other half found it “Good” in terms of ease of learning, clarity of instructions and labels, logical layout and design, and suitability for non-expert users. For ease of navigation, including structure, hierarchy, tab flow, forward/backward movement, progress indicators, and section locating, the majority rated it “Excellent” while a small portion of participants rated it “Good.” Although the navigation ratings are high, the split between “Excellent” and “Good” ratings indicates room for improvement.
The usability findings show a clear contrast between the technical accessibility of the toolkit and the level of external support perceived to be necessary for wider adoption. While navigation and user-friendliness were rated very positively, support-related responses were more varied, indicating that ease of use alone may not be sufficient to ensure implementation across all organisational contexts. This suggests that the toolkit itself is functionally accessible, but its uptake may still depend on organisational readiness, digital capability, and the availability of supporting resources, particularly for smaller firms with limited internal expertise. In relation to the study objectives, these findings are important because they demonstrate that the DMGAT is operationally usable, while also highlighting that implementation strategies may need to extend beyond interface design to include guidance materials, onboarding support, and possibly sector-level enabling mechanisms.
These findings confirm that while the DMGAT is user-friendly and well structured, external support will be key to maximise its impact. The varied perspectives on government support reflect diversity within the Irish AECO sector, where resource-rich Tier 1 firms easily adopt such tools, while SMEs face challenges. This highlights the need to enhance technical usability and address structural conditions to promote widespread adoption. In Ireland, where SMEs dominate the construction industry, government support is crucial for the toolkit’s accessibility and implementation across organisations for successful BIM adoption.

4.3.3. Relevance and Coverage

The participants rated the DMGAT’s relevance and coverage of maturity aspects highly, as shown in Figure 10. The toolkit effectively addressed digital transformation challenges and evaluated relevant digital maturity aspects. In assessing the toolkit’s relevance and coverage, four components were considered: participant satisfaction with metrics, alignment with digital transformation challenges, assessment of relevant aspects, and the absence of any maturity aspects. Most respondents rated the toolkit’s performance metrics capabilities as “Excellent” or “Good,” indicating consistent satisfaction. In terms of relevant aspect assessment, most respondents described the toolkit as “Excellent” or “Good,” while a small portion considered it “Average,” suggesting the need for role-specific evaluations.
The toolkit received high ratings from participants for its alignment with digital transformation challenges. Participants rated it “Excellent” or “Good,” highlighting its suitability for addressing such challenges. Responses indicated a generally positive perception of maturity aspect coverage, with many ratings classified as “Excellent” or “Good.” However, a notable portion rated coverage as “Average” or “Fair,” highlighting opportunities for improvement such as customised assessments for clients, consultants, contractors, and asset management teams. Overall, the feedback confirms the toolkit’s effectiveness while indicating potential for customisation.
The findings reveal high satisfaction with current performance measurement systems, comprehensive coverage of maturity aspects, and strong alignment of the DMGAT with digital transformation challenges, alongside areas for improvement. In the Irish AECO sector, BIM adoption remains limited among SMEs, while larger organisations lead in this domain [84,85]. This deviation highlights the challenge of designing a framework that accommodates both digitally advanced Tier 1 organisations and resource-constrained smaller firms. The toolkit’s alignment with industry standards enhances legitimacy, though participants call for role-specific questions to address stakeholders’ individual needs. Analytically, these findings suggest that the DMGAT is already robust in its overall strategic framing and coverage of digital maturity domains but requires greater sensitivity to stakeholder-specific and organisationally differentiated application. The mixed responses do not indicate weakness in the framework itself; rather, they point to the need for further refinement in tailoring the tool to the priorities of different user groups. This has direct implications for future development of the toolkit and supports the study objectives by demonstrating both its broad sectoral relevance and the importance of more nuanced maturity assessment across organisational tiers.

4.4. Key Observations and Recommendations

4.4.1. Initial Impression of the Tool

The initial impression feedback indicates a strong endorsement of the tool, along with some key observations from the participants. Users noted its “very user-friendly interface,” with “clear and concise questions,” and “easy-to-follow instructions,” highlighting that “results are graphically clear, effective and easy to understand.” The tool’s structure was observed to be “well-structured and aligned with ISO-19650,” allowing users to “clearly identify gaps from visuals” and “easily understand the questions and clear sections.” A frequent positive theme observed was its ability to “reinforce understanding of the company’s basic maturity level and provide valuable insights for improvement.” Users also noted that it “covers significant aspects of BIM maturity,” and evaluated it as an “excellent starting point for any organisation that wants to see where they are on their own BIM journey.” A participant shared their organisation’s experience that “they had previously worked with a paid BIM consultant to obtain similar outcomes,” highlighting the free availability and impressive capabilities of this tool. Another participant from a Tier 1 organisation was “initially sceptical about the BIM training section, but it eventually revealed the gaps in the organisation’s BIM maturity” that the participant had anticipated. Additionally, respondents specifically recommended enhancing the tool by “maintaining a user-friendly interface with clear and concise questions, aligning marketing materials with ISO-19650 standards, emphasising actionable training and improvement insights, and ensuring visuals enhancement to clearly identify the gaps”. The feedback emphasises the significance of promoting its professional, well-explained, and friendly design by highlighting its strengths in clarity, industry alignment, and actionable outputs to maximise organisational impact.

4.4.2. Navigation Through the Tool

The participant feedback regarding the tool’s navigation was positive. A participant from a Tier 2 organisation described it as a “simple enough and well-organised spreadsheet with all the necessary guidance available”. The user interface was described as “easy to navigate,” with “clearly articulated and well-structured questions” enhancing the intuitive use of the tool. Similarly, it was observed that “the details provided for each level” and “easy-to-follow instructions,” combined with a “well-structured and engaging interface,” further facilitate smooth user navigation. Specific features like “colour coding for sections”, “prompting of questions with level details,” and “scores dropdown” were identified as key improvements for enhanced readability and visual presentation. Further recommendations from users included prioritising its “online availability in the future to facilitate wider industry stakeholders” to enhance accessibility, maintaining “colour coding for sections” to ensure visual clarity, and continuing to refine the “general layout and design” to sustain user engagement. The feedback underscores the tool’s capability in balancing simplicity with functionality, encouraging developers to maintain its “uncomplex, well-defined questions and answers” while also addressing the demand for broader access. A participant provided overall feedback expressing that they were “very impressed by the clear guidance, logically separated tabs, and the clarity of results presented in pre-established graphics”.

4.4.3. User Friendliness of the Tool

The tool’s user-friendliness was frequently highlighted by participants, with specific strengths noted such as its “logical layout,” “coherently structured format,” and “precisely formulated questions with clearly defined levels.” Users noted the tool’s “Microsoft Excel-based” design as “intuitive and accessible for all users.” Participant feedback highlighted the tool’s highly intuitive and user-friendly nature, with remarks that it was “highly user-friendly with clear explanations and an intuitive interface,” making the tool “self-explanatory” with an “accessible design” that allows users to “answer the questions clearly for completing the assessment.” Additionally, it was observed that “answering the questions requires detailed knowledge of BIM, which is appropriate for BIM leaders in an organisation.” Respondents encouraged developers to “maintain the Microsoft Excel-based interface” for its familiarity and high accessibility for both large and small organisations. The consistent acclaim for its clean layout and ease of use strengthens its standing as an effective tool, though users recommended some future updates and improvements.

4.4.4. Assessment Coverage of Relevant Aspects

The toolkit’s comprehensive assessment of relevant aspects received strong approval from participants. A participant from a Tier 1 organisation described the toolkit as “extremely well aligned” with current digital transformation challenges, while another participant from a Tier 2 organisation noted its “clear language” and conformity with ISO-19650 standards. However, including role-specific assessment and refining the toolkit’s scope were identified as areas for improvement. Similarly, a participant from a Tier 1 public sector organisation observed the assessment questions’ relevance, describing them as “excellent, thought-provoking, and relevant.” A BIM coordinator highlighted persistent challenges in “training or hiring suitable and qualified staff” and recommended focusing on “individual skills assessment” to enhance usability. Similarly, a participant from a Tier 4 organisation suggested “making technology-related questions more discipline-specific” to ensure relevance across different roles. Another participant highlighted that “limited adoption of BIM by small and medium-sized enterprises (SMEs) complicates effective benchmarking and underscores the necessity for customisation” to address challenges specific to the industry and organisational scale. A Tier 1 participant highlighted the “automatic updates of answers for each question as you move” as a “stroke of genius,” eliminating the need to navigate back and forth between tabs to understand level definitions. To further enhance the toolkit, respondents suggested expanding assessment criteria, updating technology-related questions to align with current industry practices, and offering additional support for SMEs to bridge BIM adoption gaps. The tool was recognised for its consistent performance, well-organised structure, and adherence to established industry standards.

4.4.5. Support and Resources Required

Several participants noted that the toolkit requires no additional support, while others suggested valuable improvements including comprehensive guidelines, developing a maturity level matrix, and providing guidance videos to improve usability for non-expert users. The tool’s user-friendly Microsoft Excel-based design was valued for its practicality and seamless integration into daily workflows. Participants highlighted the tool’s intuitive design and noted that “software knowledge is required to complete” the assessment, although it remains “pretty much self-explanatory,” with “guidance and explanations provided.” Moreover, the tool was acknowledged for meeting stakeholder needs based on accurate result generation and gap identification. Several users identified challenges related to digital adoption, particularly among SMEs with limited resources. The feedback highlights the tool’s reduced reliance on external assistance while also indicating opportunities for targeted support and functional collaboration to address the varied contexts of users. Collectively, these insights highlight a balanced approach combining user-centred design and adaptive support structures to drive successful adoption.
In summary, most participants provided positive feedback, while several offered positive comments accompanied by suggestions and recommendations, as outlined in Section 4.4. Addressing these suggestions will ensure the toolkit’s continued relevance and effectiveness in meeting industry needs. Furthermore, the feedback highlighted the toolkit’s credibility and acceptance, while also emphasising that its long-term success will depend on addressing user-specific refinements. In the context of Ireland, where governmental policy increasingly promotes digital transformation, it is essential to ensure the toolkit’s availability and adaptability, alongside the implementation of targeted support mechanisms, to maximise the impact of DMGAT across the sector.

5. Discussion

5.1. Comparison of Alpha and Beta Testing

The alpha and beta testing phases fulfilled distinct but complementary roles in validating the DMGAT. Alpha testing focused on design integrity, standards alignment, and assessment clarity. It delivered four structural upgrades that reinforced all later results:
  • Transition from the Bew and Richards maturity levels [82] to ISO 19650 maturity stages [19,20].
  • Development of a maturity level matrix to link assessment questions to sub-dimensions and maturity levels.
  • Refining and clarifying assessment questions to enhance their relevance and consistency.
  • Enhancing the scoring and visualisation approach to ensure that outputs are both interpretable and traceable.
These changes established a stable, standard-compliant tool before external testing, ensuring that subsequent feedback relates to the tool’s suitability and capability for practice rather than to basic design flaws.
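To make the second and fourth of these upgrades concrete, the following minimal sketch illustrates how a maturity level matrix can link assessment questions to sub-dimensions and aggregate answers into traceable sub-dimension scores. It is written in Python purely for exposition: the question identifiers, mappings, and simple averaging rule are illustrative assumptions, not the scoring logic of the published Excel-based DMGAT.

```python
# Illustrative sketch only: the DMGAT itself is an Excel workbook, and the
# question IDs, sub-dimension mapping, and aggregation rule below are
# assumptions made for exposition, not the published implementation.
from statistics import mean

# Hypothetical slice of a maturity level matrix: each assessment question
# maps to one sub-dimension within a dimension.
QUESTION_MAP = {
    "AQ1": ("People, Process, and Culture", "BIM Training and Education"),
    "AQ2": ("People, Process, and Culture", "BIM Training and Education"),
    "AQ3": ("Technology", "BIM Technology"),
    "AQ4": ("Policy and Governance", "BIM Data Management"),
}

def sub_dimension_scores(answers: dict) -> dict:
    """Aggregate per-question answers (Levels 1-5) into sub-dimension scores."""
    grouped = {}
    for question, level in answers.items():
        grouped.setdefault(QUESTION_MAP[question], []).append(level)
    return {key: round(mean(levels), 1) for key, levels in grouped.items()}

answers = {"AQ1": 2, "AQ2": 3, "AQ3": 4, "AQ4": 1}
for (dimension, sub_dimension), score in sub_dimension_scores(answers).items():
    print(f"{dimension} / {sub_dimension}: Level {score}")
```

Because every answer is keyed to a sub-dimension, any reported score can be traced back to the questions that produced it, which is the traceability property the alpha-stage upgrades sought.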
In contrast, beta testing translated the same construct into real organisational contexts across 20 Irish AECO organisations (Tier 1: 65%; Tier 2: 15%; Tier 3: 15%; Tier 4: 5%), revealing how well the tool performs under operational limitations and varied digital readiness. Quantitatively, the findings showed a very strong first-use response and sustained positivity when participants considered practical applications. Initial impressions were consistently favourable, with most participants rating the experience highly. Similarly, perceived benefits were positive at both the organisational and individual levels, reflecting broad approval across different user perspectives. The toolkit’s suitability for performance metrics and alignment with digital transformation challenges was well regarded. However, perceptions of maturity aspects were varied, suggesting capacity for improvement. Support expectations highlighted sector diversity, with responses ranging from no support needed to requests for substantial facilitation, especially among SMEs. The beta testing phase focused on identifying practical limitations and areas for improvement from the end users’ perspective. Alpha testing led to initial reliability and usability impressions, with a limited participant group, whereas beta testing engaged a broader audience to validate functionality, industry relevance, maturity aspect coverage, and support needed under real-world conditions. The progression from alpha to beta testing allowed for refinement based on early positive responses along with suggestions for improvement, followed by more detailed evaluation of practical application, coverage, and resource expectations.
Collectively, the two testing phases clarify their respective contributions. The alpha test confirmed adherence to standards, validated the construct, and ensured precise measurement through alignment of ISO 19650 staging, the maturity matrix, refined questions, and maturity level scoring. The beta test confirmed high usability and practical value in the AECO industry, while highlighting that strong first impressions softened from “Excellent” to “Good” when respondents evaluated the practical benefits. This was notably the case for organisation-level benefits and breadth of maturity aspects in contexts that demand role- or discipline-specificity. Qualitative feedback converged on concrete development directions that built on the alpha test foundations: role-specific modules and discipline-sensitive items (especially in the technology section), dynamic visualisation, and concise onboarding materials (instructional video and guidance). The adoption challenges faced by smaller firms are mainly due to structural factors rather than interface issues; therefore, tool improvements should be complemented by sector-level support, such as personalised guidance and incentives, that addresses the specific needs of SMEs. In summary, alpha testing confirmed the robustness of the framework, while beta testing evaluated its suitability and applicability in real-world contexts. Further improvements should prioritise accessibility for broad audiences, including role- and discipline-specific questions, as well as SME-oriented support pathways, so that the consistently positive perceptions documented in the beta test translate into uniformly “Excellent” value in day-to-day use across various organisational contexts.

5.2. Implications for Digital Maturity Assessments in the AECO Sector

The combined evidence from alpha and beta testing of the DMGAT highlights the role of structured digital maturity assessments for AECO organisations. The DMGAT is aligned with the needs of AECO stakeholders by incorporating the ISO 19650 maturity stages and employing a maturity level matrix to link key performance indicators to maturity levels. This design improves analytic traceability and allows organisations to connect performance indicators to actionable priorities rather than static labels.
While the DMGAT primarily assesses an organisation’s current digital maturity, it also facilitates gap analysis through the maturity level matrix, which defines the characteristics and requirements associated with Levels 1 to 5 across the assessed dimensions. By comparing current maturity results against the descriptors of subsequent levels, organisations can identify specific gaps in people, processes, technology, and governance, and they can prioritise realistic improvement actions according to their context and readiness. In this way, the toolkit supports staged and evidence-based digital progression rather than encouraging organisations to pursue Level 5 as a universal endpoint.
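As a minimal illustration of this gap-analysis logic, the sketch below compares current sub-dimension levels against organisation-chosen target levels and reports the descriptor of the next level to be reached. The data structures, target values, and descriptor selection are hypothetical, and the descriptor text is abridged from Table 1; the published toolkit performs this comparison within its spreadsheet rather than in code.

```python
# Minimal gap-analysis sketch, assuming sub-dimension scores on the 1-5 scale
# used by the DMGAT. Targets are chosen per organisation according to its
# context and readiness, rather than defaulting everything to Level 5.
LEVEL_DESCRIPTORS = {
    # Abridged from Table 1; a full implementation would cover all 16 sub-dimensions.
    ("Technology", "BIM Technology"): {
        3: "BIM technology integrates into practice, with interoperability considered.",
        4: "Advanced BIM technologies enable high interoperability and customisation.",
    },
}

def gap_report(current: dict, targets: dict) -> list:
    """Return one line per sub-dimension whose current level is below its target."""
    lines = []
    for key, target_level in targets.items():
        gap = target_level - current.get(key, 1)  # assume Level 1 if unassessed
        if gap > 0:
            descriptor = LEVEL_DESCRIPTORS.get(key, {}).get(target_level, "")
            lines.append(f"{key[1]}: {gap} level(s) below target. Target state: {descriptor}")
    return lines

current = {("Technology", "BIM Technology"): 2}
targets = {("Technology", "BIM Technology"): 3}  # context-specific, not Level 5
print("\n".join(gap_report(current, targets)))
```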
Beta testing confirmed that maturity tools are diagnostic and can drive organisational change. Larger firms use assessments to guide investments, benchmark standards, and develop competencies. For SMEs, adoption depends on ease of use, resources, and institutional support. Without these, maturity assessments risk widening the digital adoption gap between advanced organisations and resource-limited ones. Thus, coupling the tool with customised guidance and adoption pathways, especially for SMEs, enhances its value.
Technically, results showed strong expectations for the assessment process to be digital by design. Calls for interactive dashboards, real-time updates, and web deployment suggest replacing paper or spreadsheet workflows with scalable online platforms featuring dynamic visuals, automated scoring, and role-aware interfaces. This shift reduces friction, enables organisation and sector-level analytics, and speeds up decision-making by multi-disciplinary teams.
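To indicate the direction participants describe, the following minimal sketch places automated scoring behind a simple web endpoint. Flask is used purely as an example stack, and the endpoint name, payload shape, and naive averaging rule are assumptions for exposition; the published toolkit is spreadsheet-based, and no web version currently exists.

```python
# Hypothetical sketch of a web-deployed scoring service; not the DMGAT's
# actual architecture. Requires: pip install flask
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.post("/assess")
def assess():
    # Expects a body such as {"answers": {"AQ1": 2, "AQ2": 4}} with levels on the 1-5 scale.
    answers = request.get_json()["answers"]
    score = round(sum(answers.values()) / len(answers), 1)  # naive overall level for illustration
    return jsonify({"overall_level": score, "n_items": len(answers)})

if __name__ == "__main__":
    app.run(debug=True)
```

A production version would layer the maturity level matrix, per-dimension analytics, and role-aware views on top of such an endpoint, which is what the requested dashboards imply.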
Participants noted that integrated guides and instructional videos would improve usability, especially beyond Tier 1 organisations. Feedback also revealed wide variation in maturity aspects, indicating that role- and discipline-specific customisation is essential for relevance to a range of stakeholders. The technology section must employ language appropriate to the various disciplines involved, including clients, consultants, contractors, and asset managers, each requiring distinct perspectives. Similarly, extending the framework to include individual competency assessment alongside organisational evaluation would provide a more integrated digital readiness view, linking people, processes, culture, technology, policy, and governance in one measurement system.
Support barriers for smaller firms identified in beta testing are structural rather than interface related. While some respondents noted that the DMGAT required little support, many emphasised the need for government facilitation for SMEs. In Ireland’s AECO sector, where digitalisation policy momentum is strong, SMEs’ adoption remains uneven, making public support critical [86].
Internationally, public sector bodies often lead BIM adoption via central hubs and measurement frameworks [16,85]. In Ireland, the DMGAT could play a similar role as it is aligned with international standards and further enhanced by a maturity grid. Large-scale implementation with support could establish a shared language and evidence base, facilitating exemplary projects, standardised reporting, and targeted support for lagging areas [27,28,86]. Accordingly, a robust DMGAT serves as a valuable support for organisations in identifying areas needing improvement and advancing their digital transformation journey. In summary, the key impacts are as follows:
  • Maintaining transparent assessment through clearly defined maturity stages and a maturity grid ensures consistency and clarity in evaluating digital maturity.
  • Ensuring the tool is robust, role-aware, and supportive by incorporating accessibility features, dynamic visuals, concise onboarding, discipline-sensitive language, and individual competency assessments enhances relevance, usability, and adoption across the AECO stakeholders.
  • Embedding adoption within a policy environment that realistically addresses SMEs’ challenges and incorporates longitudinal data for continuous improvement supports equitable digital transformation and bridges adoption gaps.
These impacts highlight that digital maturity tools must be adaptable, comprehensive, and supported by targeted facilitation to effectively drive digital transformation and bridge adoption gaps. Together, they position the DMGAT as a practical tool that not only measures maturity but actively facilitates digital advancement in the AECO sector.

5.3. Future Development Pathways for the Digital Maturity Gap Analysis Toolkit

The feedback obtained from alpha and beta testing serves as a roadmap for enhancing the toolkit by identifying areas for improvement, thereby making it a more effective resource. This is particularly crucial within the Irish AECO sector, where digital transformation is prioritised, yet significant variations in resource availability exist between Tier 1 organisations and SMEs. Equally important is the inclusion of role-specific evaluation to ensure that clients, consultants, contractors, and asset managers can engage with the toolkit in ways that reflect their operational realities. Expanding the toolkit to incorporate individual competency assessment alongside organisational-level evaluation could provide a more holistic view of digital readiness, addressing the human dimension of transformation in addition to organisational processes. Future enhancements could also strengthen visualisation and colour-coded interfaces to simplify interpretation. Beyond technical improvements, providing clear guidance will be crucial for supporting organisations in translating assessment outcomes into practical actions. Finally, aggregating assessment results across dimensions offers an opportunity to generate industry-wide insights, thereby positioning the DMGAT not only as an assessment tool but also as a valuable resource for organisations to identify gaps and enhance their digital maturity level.

6. Conclusions

This study presents the development and validation of a DMGAT through comprehensive alpha and beta testing. The toolkit assesses organisational digital maturity across three integrated dimensions—people, process, and culture; technology; and policy and governance—divided into 16 sub-dimensions and 69 assessment questions. A mixed-methods approach within DSR principles was employed to systematically develop and validate the DMGAT artefact for real-world application.
Alpha testing with experts from the in-house organisational research team led to structural improvements, enhanced usability, and alignment with international standards by adopting the ISO 19650 maturity stages. A maturity level matrix was developed to link sub-dimensions with maturity levels. Feedback from alpha testing significantly improved the toolkit, resulting in a more user-friendly and clearer interface.
The improved toolkit underwent beta testing with 20 participants from organisations of varying sizes within the Irish AECO sector, encompassing a range of roles within the industry. Feedback highlighted the toolkit’s clarity, user-friendliness, adequate coverage of maturity aspects, alignment with current digital transformation challenges, and broad applicability across industry stakeholders. Participants valued the graphical representations for identifying digital maturity gaps. The toolkit was well received for its simplicity, clear structure, ease of use, and quick completion. Beta testing yielded valuable insights to improve the DMGAT. Participants suggested adding role-specific questions, improving question clarity, enhancing visuals, and updating the technology section questions to meet the needs of various industry stakeholders.
Overall, the DMGAT showed strong potential as a valuable tool for evaluating and advancing digital maturity in the Irish AECO sector.

Limitations and Recommendations for Future Research

The limitations of this study are primarily associated with the sample size and composition utilised during the beta testing phase. The beta testing involved participants from 20 organisations across the Irish AECO sector, with a stronger representation of larger and more digitally advanced organisations. These organisations were selected because of their leadership roles in digital transformation and their capacity to provide informed feedback; however, the relatively small sample size limits the statistical power and broader generalisability of the findings across the sector. This constraint was also influenced by the in-person nature of the assessment process, as the researcher personally visited participating organisations to administer the toolkit and collect detailed feedback. In addition, as participation was voluntary and recruitment was conducted through email invitations, the sample may reflect a degree of self-selection bias, potentially attracting participants who were already more digitally engaged or interested in BIM-related practices, while digitally reluctant stakeholders may have been underrepresented. This may have influenced the feedback towards more favourable evaluations of the toolkit. Nevertheless, this phase was intended primarily to confirm the toolkit’s suitability for industry application, assess the relevance and coverage of its dimensions, sub-dimensions, and assessment questions, evaluate usability across different organisational contexts, and gather detailed practitioner feedback for further refinement.
A further limitation relates to the alpha testing phase, which was conducted by experts from within the research team and associated project partners rather than by fully independent non-developer participants. Although this approach enabled informed and efficient refinement of the toolkit during early-stage development, it may have introduced research bias in the evaluation of the artefact’s internal logic, structure, and functionality. This limitation was partially addressed through beta testing with external AECO stakeholders.
A key limitation of the beta testing sample is the underrepresentation of Tier 4 (micro-enterprise) organisations, with only one participant included in this category. While this participant provided useful insight into the toolkit’s applicability in smaller organisational contexts, the sample is not sufficient to draw reliable or generalisable conclusions about SMEs, particularly micro-enterprises. The findings relating to this category should therefore be interpreted as exploratory rather than representative.
In addition, although the framework and question distribution were developed through structured literature synthesis and expert validation, formal psychometric testing of construct validity and internal consistency was not undertaken in this study. As a result, the balance and robustness of the toolkit would benefit from further statistical validation in future work.
Based on these limitations, future research should prioritise the following directions:
Future studies should involve a larger and more diverse sample, with stronger representation from SMEs and micro-enterprises, to improve the robustness and generalisability of the findings and enable more meaningful cross-tier comparisons.
Future research should incorporate more independent alpha-stage reviewers and apply formal psychometric testing, such as factor analysis and reliability assessment, to strengthen methodological rigour and confirm the internal consistency and construct validity of the toolkit (a minimal sketch of one such reliability check follows this list).
Future development should prioritise the introduction of role-specific modules and more context-sensitive assessment pathways so that the toolkit can better reflect the differing priorities of clients, consultants, contractors, and asset management teams across varied organisational settings.
Future work should prioritise the transition of the toolkit into an interactive online platform to support easier administration, dynamic visualisation of results, wider accessibility, and more scalable industry application. This development should also incorporate explicit target-setting functionality, allowing organisations to define desired maturity states according to their size, role, project context, and strategic priorities.
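As a concrete illustration of the recommended reliability assessment, the short sketch below computes Cronbach’s alpha [83] for a matrix of item responses. The response matrix is fabricated for demonstration only; a real validation study would use collected DMGAT responses and typically a dedicated statistics package.

```python
# Hedged sketch of one recommended psychometric check: Cronbach's alpha for
# internal consistency. The demo data are fabricated for illustration.
from statistics import pvariance

def cronbach_alpha(responses: list) -> float:
    """responses: one row per respondent, one column per assessment item."""
    k = len(responses[0])                      # number of items
    items = list(zip(*responses))              # column-wise item scores
    item_var = sum(pvariance(col) for col in items)
    total_var = pvariance([sum(row) for row in responses])
    # alpha = (k / (k - 1)) * (1 - sum of item variances / variance of totals)
    return (k / (k - 1)) * (1 - item_var / total_var)

# Four hypothetical respondents answering three Likert-style items (1-5).
demo = [[4, 4, 5], [3, 3, 4], [2, 3, 2], [5, 4, 5]]
print(f"alpha = {cronbach_alpha(demo):.2f}")  # values above ~0.7 are conventionally acceptable
```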

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/buildings16071305/s1, Supplementary Table S1: Traceability Matrix Linking DMGAT Assessment Questions (AQ1–AQ69) to ISO 19650. Supplementary Table S1 provides detail on the assessment questions used in the DMGAT toolkit. The matrix identifies, for each assessment question, the associated sub-dimension and toolkit dimension, together with the primary ISO 19650 reference and the related ISO principle or concept used during toolkit development. Where a question operationalises a broader information management idea rather than a single verbatim requirement, the mapping is presented at the most defensible part- or requirement-area level to avoid overstating one-to-one clause equivalence. Supplementary File S2 is the approved outcome letter HREC-MR-24-023 for ethical approval from the hosting institute.

Author Contributions

Conceptualisation, R.U., J.H., T.M. and A.F.; methodology, R.U., J.H. and T.M.; validation, R.U.; formal analysis, R.U.; investigation, R.U., J.H., T.M. and A.F.; resources, J.H. and T.M.; data curation, R.U., J.H. and T.M.; writing—original draft preparation, R.U.; writing—review and editing, J.H., T.M., A.F., M.O. and S.C.; visualisation, J.H., T.M. and M.O.; supervision, J.H., T.M., M.O. and S.C.; funding acquisition, J.H. and T.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research project is funded by the Build Digital Project, supported by the Department of Public Expenditure, National Development Plan Delivery and Reform, Ireland.

Data Availability Statement

The original contributions presented in the study are included in the article. Further inquiries can be directed to the corresponding author.

Acknowledgments

The research team would like to thank all the experts for their participation in this study and sharing their valuable insights. The authors would like to acknowledge the support from the Build Digital Project, funded by the Department of Public Expenditure, National Development Plan Delivery and Reform, Ireland.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
BIM: Building Information Modelling
DT: Digital Transformation
AECO: Architecture, Engineering, Construction, and Operations
SMEs: Small and Medium Enterprises
KPIs: Key Performance Indicators
DMGAT: Digital Maturity Gap Analysis Toolkit
DSR: Design Science Research
ISO: International Organisation for Standardisation
IoT: Internet of Things

References

1. Vial, G. Understanding Digital Transformation: A Review and a Research Agenda. In Managing Digital Transformation; Routledge: London, UK, 2021.
2. Remane, G.; Hanelt, A.; Wiesboeck, F.; Kolbe, L.M. Digital Maturity in Traditional Industries—An Exploratory Analysis. In Proceedings of the ECIS, Guimarães, Portugal, 5–10 June 2017; p. 10.
3. Bharadwaj, A.; El Sawy, O.A.; Pavlou, P.A.; Venkatraman, N. Digital Business Strategy: Toward a Next Generation of Insights. MIS Q. 2013, 37, 471–482.
4. Fitzgerald, M.; Kruschwitz, N.; Bonnet, D.; Welch, M. Embracing Digital Technology: A New Strategic Imperative. MIT Sloan Manag. Rev. 2014, 55, 1.
5. Lee, M.; Yun, J.J.; Pyka, A.; Won, D.; Kodama, F.; Schiuma, G.; Park, H.; Jeon, J.; Park, K.; Jung, K. How to Respond to the Fourth Industrial Revolution, or the Second Information Technology Revolution? Dynamic New Combinations between Technology, Market, and Society through Open Innovation. J. Open Innov. Technol. Mark. Complex. 2018, 4, 21.
6. Yeow, A.; Soh, C.; Hansen, R. Aligning with New Digital Strategy: A Dynamic Capabilities Approach. J. Strateg. Inf. Syst. 2018, 27, 43–58.
7. Shi, Y.; Wang, Y.; Wang, L.-N.; Wang, W.-N.; Yang, T.-Y. Bridge Tower Warning Method Based on Improved Multi-Rate Fusion. Buildings 2025, 15, 2733.
8. Shi, Y.; Wang, Y.; Wang, L.-N.; Wang, W.-N.; Yang, T.-Y. Bridge Cable Performance Warning Method Based on Temperature and Displacement Monitoring Data. Buildings 2025, 15, 2342.
9. Wu, C.; Xu, B.; Mao, C.; Li, X. Overview of BIM Maturity Measurement Tools. J. Inf. Technol. Constr. 2017, 22, 34–62.
10. Wagire, A.A.; Joshi, R.; Rathore, A.P.S.; Jain, R. Development of Maturity Model for Assessing the Implementation of Industry 4.0: Learning from Theory and Practice. Prod. Plan. Control 2021, 32, 603–622.
11. Mutis, I.; Mehraj, I. Cloud BIM Governance Framework for Implementation in Construction Firms. Pract. Period. Struct. Des. Constr. 2022, 27, 04021074.
12. McAuley, B.; Hore, A.; West, R.; Kassem, M.; Kuang, S. Ireland’s BIM Macro Adoption Study: Establishing Ireland’s BIM Maturity. In Proceedings of the 3rd CitA BIM Gathering, Dublin, Ireland, 23–24 November 2017.
13. Gökalp, E.; Martinez, V. Digital Transformation Maturity Assessment: Development of the Digital Transformation Capability Maturity Model. Int. J. Prod. Res. 2022, 60, 6282–6302.
14. Brodny, J.; Tutak, M. Digitalization of Small and Medium-Sized Enterprises and Economic Growth: Evidence for the EU-27 Countries. J. Open Innov. Technol. Mark. Complex. 2022, 8, 67.
15. Rashidian, S.; Drogemuller, R.; Omrani, S.; Banakar, F. A Review of the Interrelationships and Characteristics of Building Information Modeling, Integrated Project Delivery and Lean Construction Maturity Models. Smart Sustain. Built Environ. 2023, 13, 584–608.
16. McKenna, T.; Lamon, D.; Murphy, R.; Carroll, S.; Collins, J.; Concannon, R.; Horgan, E.; Otreba, M. International Best Practice in Digital Adoption Within the AEC Sector; Build Digital Project—Project Ireland 2040; Munster Technological University: Cork, Ireland, 2023. Available online: https://www.builddigitalproject.ie/international-best-practice-in-digital-construction-adoption-report (accessed on 23 May 2025).
17. Ullah, R.; Harrington, J.; Farea, A.; Otreba, M.; Carroll, S.; McKenna, T. Navigating Digital Maturity: Analysis of Digital Maturity Assessment Tools. In Proceedings of the Civil Engineering Research Ireland (CERI) Conference; University of Galway: Galway, Ireland, 2024.
18. Ullah, R.; Harrington, J.; Farea, A.; Otreba, M.; Carroll, S.; McKenna, T. Digital Maturity Assessment Tools for the Construction Industry: A PRISMA-ScR Scoping Review. Buildings 2026, 16, 239.
19. Abanda, F.H.; Balu, B.; Adukpo, S.E.; Akintola, A. Decoding ISO 19650 Through Process Modelling for Information Management and Stakeholder Communication in BIM. Buildings 2025, 15, 431.
20. ISO 19650-1:2018; Organization and Digitization of Information about Buildings and Civil Engineering Works, Including Building Information Modelling (BIM)—Information Management Using Building Information Modelling. ISO: Geneva, Switzerland, 2018. Available online: https://www.iso.org/standard/68078.html (accessed on 27 January 2026).
21. Lin, C.; Hu, Z.-Z.; Yang, C.; Deng, Y.-C.; Zheng, W.; Lin, J.-R. Maturity Assessment of Intelligent Construction Management. Buildings 2022, 12, 1742.
22. Adekunle, S.A.; Aigbavboa, C.O.; Ejohwomu, O.; Adekunle, E.A.; Thwala, W.D. Digital Transformation in the Construction Industry: A Bibliometric Review. J. Eng. Des. Technol. 2024, 22, 130–158.
23. Rinchen, S.; Banihashemi, S.; Alkilani, S. Driving Digital Transformation in Construction: Strategic Insights into Building Information Modelling Adoption in Developing Countries. Proj. Leadersh. Soc. 2024, 5, 100138.
24. Perera, S.; Jin, X.; Das, P.; Gunasekara, K.; Samaratunga, M. A Strategic Framework for Digital Maturity of Design and Construction through a Systematic Review and Application. J. Ind. Inf. Integr. 2023, 31, 100413.
25. Jäkel, J.-I.; Fischerkeller, F.; Oberhoff, T.; Klemt-Albert, K. Development of a Maturity Model for the Digital Transformation of Companies in the Context of Construction Industry 4.0. J. Inf. Technol. Constr. 2024, 29, 778–809.
26. Di Vaio, A.; Latif, B.; Gunarathne, N.; Gupta, M.; D’Adamo, I. Digitalization and Artificial Knowledge for Accountability in SCM: A Systematic Literature Review. J. Enterp. Inf. Manag. 2024, 37, 606–672.
27. Farea, A.; Otreba, M.; Ullah, R.; McKenna, T.; Carroll, S.; Harrington, J. Digital Transformation of the AEC Industry: A Review of BIM Implementation Toolkits from Leading Countries. Smart Sustain. Built Environ. 2026, 1–24.
28. Farea, A.; Otreba, M.; Ullah, R.; McKenna, T.; Carroll, S.; Harrington, J. Macro BIM Adoption: Global Initiatives and Strategies; University of Galway: Galway, Ireland, 2024.
29. Schilling, L.; Seuring, S. Sustainable Value Creation through Information Technology-Enabled Supply Chains in Emerging Markets. Int. J. Logist. Manag. 2022, 33, 1001–1016.
30. Tubis, A.A. Digital Maturity Assessment Model for the Organizational and Process Dimensions. Sustainability 2023, 15, 15122.
31. Looy, A.V. Business Process Maturity: A Comparative Study on a Sample of Business Process Maturity Models; Springer Science & Business Media: Berlin, Germany, 2014.
32. Carrijo, P.; Alturas, B.; Pedrosa, I. Similarities and Differences Between Digital Transformation Maturity Models: A Literature Review. In Intelligent Systems in Digital Transformation; Kahraman, C., Haktanır, E., Eds.; Lecture Notes in Networks and Systems; Springer International Publishing: Cham, Switzerland, 2023; Volume 549, pp. 33–52.
33. Angreani, L.S.; Vijaya, A.; Wicaksono, H. Systematic Literature Review of Industry 4.0 Maturity Model for Manufacturing and Logistics Sectors. Procedia Manuf. 2020, 52, 337–343.
34. Lasrado, L.A.; Vatrapu, R.; Andersen, K. Maturity Models Development in IS Research: A Literature Review. IRIS Sel. Pap. Inf. Syst. Res. Semin. Scand. 2015, 6, 6. Available online: http://aisel.aisnet.org/cgi/viewcontent.cgi?article=1005&context=iris2015 (accessed on 10 May 2025).
35. Ishtiaq, M. Book Review: Creswell, J.W. (2014). Research Design: Qualitative, Quantitative and Mixed Methods Approaches (4th ed.). Thousand Oaks, CA: Sage. Engl. Lang. Teach. 2019, 12, 40.
36. Schumacher, A.; Erol, S.; Sihn, W. A Maturity Model for Assessing Industry 4.0 Readiness and Maturity of Manufacturing Enterprises. Procedia CIRP 2016, 52, 161–166.
37. Kosieradzka, A.; Smagowicz, J. The Concept of a Pilot Study for the Verification of a Maturity Model in Public Crisis Management. Zesz. Nauk. Politech. Poznańskiej Ser. Organ. Zarządzanie 2020, 77, 127–143.
38. SEI. CMMI for Development, Version 1.2; Carnegie Mellon Software Engineering Institute: Pittsburgh, PA, USA, 2006. Available online: https://www.inf.ufsc.br/~ricardo.silva/download/cmmi/CMMI%20for%20Development%20Version%201-2%2006tr008.pdf (accessed on 13 May 2025).
39. Gray, J.; Rumpe, B. Models for the Digital Transformation. Softw. Syst. Model. 2017, 16, 307–308.
40. Almeida, F.; Santos, J.D.; Monteiro, J.A. The Challenges and Opportunities in the Digitalization of Companies in a Post-COVID-19 World. IEEE Eng. Manag. Rev. 2020, 48, 97–103.
41. Amankwah-Amoah, J.; Khan, Z.; Wood, G.; Knight, G. COVID-19 and Digitalization: The Great Acceleration. J. Bus. Res. 2021, 136, 602–611.
42. Werner-Lewandowska, K.; Kosacka-Olejnik, M. Logistics 4.0 Maturity in Service Industry: Empirical Research Results. Procedia Manuf. 2019, 38, 1058–1065.
43. Mashalah, H.A.; Hassini, E.; Gunasekaran, A.; Bhatt, D. The Impact of Digital Transformation on Supply Chains through E-Commerce: Literature Review and a Conceptual Framework. Transp. Res. Part E Logist. Transp. Rev. 2022, 165, 102837.
44. Jing, H.; Fan, Y. Digital Transformation, Supply Chain Integration and Supply Chain Performance: Evidence from Chinese Manufacturing Listed Firms. Sage Open 2024, 14, 21582440241281616.
45. Ustaoğlu, N. A Maturity Model for Digital Transformation. Master’s Thesis, Sabancı University, Istanbul, Turkey, 2019.
46. Yarmohammadian, M.H.; Tavakoli, N.; Shams, A.; Hatampour, F. Evaluation of Organizational Maturity Based on People Capacity Maturity Model in Medical Record Wards of Iranian Hospitals. J. Educ. Health Promot. 2014, 3, 54.
47. Igartua, J.I.; Retegi, J.; Ganzarain, J. IM2, a Maturity Model for Innovation in SMEs. Dir. Organ. 2018, 64, 42–49.
48. Wiesner, S.; Gaiardelli, P.; Gritti, N.; Oberti, G. Maturity Models for Digitalization in Manufacturing—Applicability for SMEs. In Advances in Production Management Systems. Smart Manufacturing for Industry 4.0; Moon, I., Lee, G.M., Park, J., Kiritsis, D., Von Cieminski, G., Eds.; IFIP Advances in Information and Communication Technology; Springer International Publishing: Cham, Switzerland, 2018; Volume 536, pp. 81–88.
49. Oleśków-Szłapka, J.; Wojciechowski, H.; Domański, R.; Pawłowski, G. Logistics 4.0 Maturity Levels Assessed Based on GDM (Grey Decision Model) and Artificial Intelligence in Logistics 4.0—Trends and Future Perspective. Procedia Manuf. 2019, 39, 1734–1742.
50. Zoubek, M.; Šimon, M. Methodology for Evaluating the Readiness of Internal Logistics Processes for Industry 4.0. In Proceedings of the IOP Conference Series: Materials Science and Engineering; IOP Publishing: Istanbul, Turkey, 2020; Volume 947, p. 012005.
51. Tonelli, F.; Demartini, M.; Loleo, A.; Testa, C. A Novel Methodology for Manufacturing Firms Value Modeling and Mapping to Improve Operational Performance in the Industry 4.0 Era. Procedia CIRP 2016, 57, 122–127.
52. Jochem, R.; Geers, D.; Heinze, P. Maturity Measurement of Knowledge-Intensive Business Processes. TQM J. 2011, 23, 377–387.
53. Hein-Pensel, F.; Winkler, H.; Brückner, A.; Wölke, M.; Jabs, I.; Mayan, I.J.; Kirschenbaum, A.; Friedrich, J.; Zinke-Wehlmann, C. Maturity Assessment for Industry 5.0: A Review of Existing Maturity Models. J. Manuf. Syst. 2023, 66, 200–210.
54. Hellweg, F.; Lechtenberg, S.; Hellingrath, B.; Thomé, A.M.T. Literature Review on Maturity Models for Digital Supply Chains. Braz. J. Oper. Prod. Manag. 2021, 18, 1–12.
55. Gollhardt, T.; Halsbenning, S.; Hermann, A.; Karsakova, A.; Becker, J. Development of a Digital Transformation Maturity Model for IT Companies. In Proceedings of the 2020 IEEE 22nd Conference on Business Informatics (CBI); IEEE: New York, NY, USA, 2020; Volume 1, pp. 94–103.
56. Kljajić Borštnar, M.; Pucihar, A. Multi-Attribute Assessment of Digital Maturity of SMEs. Electronics 2021, 10, 885.
57. Zhao, X.; Hwang, B.-G.; Low, S.P. Developing Fuzzy Enterprise Risk Management Maturity Model for Construction Firms. J. Constr. Eng. Manag. 2013, 139, 1179–1189.
58. Smits, W.; Van Buiten, M.; Hartmann, T. Yield-to-BIM: Impacts of BIM Maturity on Project Performance. Build. Res. Inf. 2017, 45, 336–346.
59. Liang, C.; Lu, W.; Rowlinson, S.; Zhang, X. Development of a Multifunctional BIM Maturity Model. J. Constr. Eng. Manag. 2016, 142, 06016003.
60. Wernicke, B.; Stehn, L.; Sezer, A.A.; Thunberg, M. Introduction of a Digital Maturity Assessment Framework for Construction Site Operations. Int. J. Constr. Manag. 2023, 23, 898–908.
61. Kumar, M.; Taj, M.G.; Zevalov, A.; Bhetwal, R.; Kanani, K.V.; Rahman, M. Maturity Model Taxonomy for Digital Transformation. In Proceedings of the 2023 IEEE 12th International Conference on Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications (IDAACS); IEEE: New York, NY, USA, 2023; Volume 1, pp. 288–296.
62. Aras, A.; Büyüközkan, G. Digital Transformation Journey Guidance: A Holistic Digital Maturity Model Based on a Systematic Literature Review. Systems 2023, 11, 213.
63. Babkin, A.V.; Shkarupeta, E.V.; Gileva, T.A.; Polozhentseva, J.S.; Chen, L. Methodology for Assessing Digital Maturity Gaps in Industrial Enterprises. Mod. Innov. Razvit. 2022, 13, 443–458.
64. Kırmızı, M.; Kocaoglu, B. Digital Transformation Maturity Model Development Framework Based on Design Science: Case Studies in Manufacturing Industry. J. Manuf. Technol. Manag. 2022, 33, 1319–1346.
65. Succar, B. Building Information Modelling Maturity Matrix. In Handbook of Research on Building Information Modeling and Construction Informatics: Concepts and Technologies; IGI Global: Hershey, PA, USA, 2010; pp. 65–103.
66. EU Commission. Maturity Scan | Digitalisation of Construction SMEs. Available online: https://digital-construction.ec.europa.eu/maturity-scan (accessed on 13 May 2025).
67. NBIMS. Capability Maturity Model, National BIM Standard—United States. Available online: https://nibs.org/wp-content/uploads/2025/04/NBIMS-US_V3_4.2_COBie_Annex_B.pdf (accessed on 13 May 2025).
68. Du, J.; Liu, R.; Issa, R.R.A. BIM Cloud Score: Benchmarking BIM Performance. J. Constr. Eng. Manag. 2014, 140, 04014054.
69. BIMe. Available online: https://bimexcellence.org/ (accessed on 13 May 2025).
70. Kam, C. The VDC Scorecard: Formulation and Validation. Available online: https://stacks.stanford.edu/file/druid:xd249sp3509/WP135.pdf (accessed on 13 May 2025).
71. Kam, C.; Song, M.H.; Senaratna, D. VDC Scorecard: Formulation, Application, and Validation. J. Constr. Eng. Manag. 2017, 143, 04016100.
72. Alankarage, S.; Chileshe, N.; Samaraweera, A.; Rameezdeen, R.; Edwards, D.J. Organisational BIM Maturity Models and Their Applications: A Systematic Literature Review. Archit. Eng. Des. Manag. 2023, 19, 567–585.
73. Siebelink, S.; Voordijk, J.T.; Adriaanse, A. Developing and Testing a Tool to Evaluate BIM Maturity: Sectoral Analysis in the Dutch Construction Industry. J. Constr. Eng. Manag. 2018, 144, 05018007.
74. Brooks, A.; Krebs, L.; Paulsen, B. Beta-Testing a Requirements Analysis Tool. ACM SIGSOFT 2014, 39, 1–6.
75. De Bruin, T.; Rosemann, M.; Freeze, R.; Kaulkarni, U. Understanding the Main Phases of Developing a Maturity Assessment Model. In Australasian Conference on Information Systems (ACIS); Bunker, D., Campbell, B., Underwood, J., Eds.; Australasian Chapter of the Association for Information Systems: Atlanta, GA, USA, 2005; pp. 8–19.
76. Kocaoglu, B.; Kirmizi, M. Prescriptive Digital Transformation Maturity Model: A Development and Validation Study. Kybernetes 2024, 54, 2662–2705.
77. Laaber, F.; Florack, A.; Koch, T.; Hubert, M. Digital Maturity: Development and Validation of the Digital Maturity Inventory (DIMI). Comput. Hum. Behav. 2023, 143, 107709.
78. Voss, M.; Jaspert, D.; Ahlfeld, C.; Sucke, L. Developing a Digital Maturity Model for the Sales Processes of Industrial Projects. J. Pers. Sell. Sales Manag. 2024, 44, 7–28.
79. Creswell, J.W.; Clark, V.L.P. Designing and Conducting Mixed Methods Research; SAGE Publications: Thousand Oaks, CA, USA, 2017. Available online: https://collegepublishing.sagepub.com/products/designing-and-conducting-mixed-methods-research-3-241842 (accessed on 17 January 2026).
80. Hevner, A.R.; March, S.T.; Park, J.; Ram, S. Design Science in Information Systems Research. Manag. Inf. Syst. Q. 2004, 28, 75.
81. Peffers, K.; Tuunanen, T.; Rothenberger, M.A.; Chatterjee, S. A Design Science Research Methodology for Information Systems Research. J. Manag. Inf. Syst. 2007, 24, 45–77.
82. Bew, M. Bew-Richards BIM Maturity Model. In Proceedings of the BuildingSMART Construct IT Autumn Members Meeting, Brighton, UK, 2008.
83. Adeniran, A.O. Application of Likert Scale’s Type and Cronbach’s Alpha Analysis in an Airport Perception Study. Sch. J. Appl. Sci. Res. 2019, 2, 1–5.
84. McAuley, B.; Hore, A.V.; West, R.P. The Irish Construction Industry’s State of Readiness for a BIM Mandate in 2020. In Proceedings of the Civil Engineering Research in Ireland 2020 Conference, Cork, Ireland, 27–28 August 2020.
85. McAuley, B.; West, R.P.; Hore, A.V. Digital Construction and BIM Research in Ireland 2016–2020. In Proceedings of the 5th CitA BIM Gathering, Online, 21–23 September 2021.
86. Farea, A.; Munir, M.; Ullah, R.; Otreba, M.; Carroll, S.; Harrington, J. Comparison of the Barriers to BIM Adoption and Digital Transformation within the Construction Industry of Pakistan and Ireland. In Proceedings of the 2023 IEEE 22nd International Conference on Trust, Security and Privacy in Computing and Communications (TrustCom), Exeter, UK, 1–3 November 2023; pp. 2394–2399.
Figure 1. Continuous and staged model representation.
Figure 2. Approach to development of a digital maturity gap analysis toolkit.
Figure 3. Overview of the digital maturity gap analysis toolkit framework.
Figure 4. Prototype of digital maturity gap analysis toolkit—guidance and assessment tabs.
Figure 5. Prototype of digital maturity gap analysis toolkit—maturity result representation.
Figure 6. Participant details: role and experience in the AECO industry.
Figure 7. Overview of organisational context and participant roles.
Figure 8. Participants’ general perception of the tool.
Figure 9. Users’ insights on tool usability, experience, and support/resources needed.
Figure 10. Users’ perception of the relevance and coverage of the tool.
Table 1. Maturity level matrix.

| Maturity Level | Level 1—Initial | Level 2—Planned | Level 3—Defined | Level 4—Managed | Level 5—Optimised |
|---|---|---|---|---|---|
| Description | Processes are unstructured, and success depends on individual efforts. Lack of formal standards and protocols. | Processes are planned and documented, with repeatable success. Basic BIM standards may be recognised. | Processes are defined, standardised, and documented. BIM practices are integrated into workflows. | Processes are measured and controlled. BIM analytics inform decision-making and continuous improvement. | Continuous improvement of BIM methodologies drives innovation and business intelligence. |

| Dimension | Sub-Dimension | Level 1—Initial | Level 2—Planned | Level 3—Defined | Level 4—Managed | Level 5—Optimised |
|---|---|---|---|---|---|---|
| People, Process, and Culture | BIM Training and Education | Minimal awareness of BIM; training is inconsistent or non-existent. | Occasional BIM training sessions; growing awareness of BIM’s importance. | Regular training programs; increasing proficiency in BIM. | Advanced training; specialised BIM roles developed. | Continuous learning culture; thought leadership in BIM education. |
| | BIM Awareness | Limited understanding of BIM benefits; sporadic use without formal strategy. | Recognising BIM benefits; discussions on implementation are in place. | Widespread understanding of BIM’s strategic benefits. | Full organisational awareness; BIM is a core business strategy. | BIM is central to the organisational ethos and culture. |
| | BIM Competency | Low competency with basic BIM functions. | Developing competencies for key personnel; increasing familiarity with BIM tools. | Broadly improved BIM competency; standard use of BIM tools in projects. | High competency; innovation in BIM use is encouraged. | Recognised expertise in BIM; reference point for best practices. |
| | BIM Change Management and Adaptability | Resistance to change; BIM considered as a burden. | Planned approach and openness to change; discussions on managing transitions. | Structured change management practices embrace adaptive methodologies. | Proactive change management; agile and innovative. | Excels at predicting change; trendsetter in BIM methodologies. |
| | BIM Leadership and Support | Leadership is unaware of or unconvinced by BIM advantages. | Few leaders support BIM projects. | BIM is part of the leadership agenda. | Leadership actively drives BIM initiatives. | Inspirational BIM leadership. |
| | Collaborative Culture | Collaboration is incidental, not systemic. | Pockets of collaboration emerging. | Collaborative culture is fostered and encouraged. | Strong culture of collaboration exists. | Collaboration transcends organisational boundaries. |
| Technology | BIM Certification (EN ISO 19650) | Unfamiliar with the standard. | Consideration for certification; some employees may be certified. | Certification achieved for key individuals or the organisation. | Continuous improvement beyond certification. | Certification emphasises leading-edge practices and continuous improvement. |
| | BIM Experience | Little to no practical experience; BIM not integrated into projects. | Initial projects undertaken; experiential learning in progress. | Consistent application of BIM; sharing of best practices. | Extensive experience across successful projects. | Recognised for innovative BIM applications. |
| | BIM Usage for Data Sharing, Collaboration, and Integration | Infrequent, manual data sharing; limited collaboration. | Basic digital tools and early collaboration stages. | Effective sharing and collaboration; smoother integration. | Streamlined and automated workflows. | Leveraged for predictive analysis and integrated delivery. |
| | BIM Technology | BIM technology usage is ad hoc, uncoordinated, and driven by individual preferences. | BIM technologies are selected more strategically based on specific project needs. | BIM technology integrates into practice, with interoperability considered. | Advanced BIM technologies enable high interoperability and customisation. | BIM tools and integrated solutions support enterprise-wide digital innovation. |
| Policy and Governance | BIM Policy and Strategy | No or little familiarity with a defined policy or strategy. | Initial strategy development; draft policies. | Policies aligned with business objectives. | Fully integrated strategy with regular updates. | Dynamic strategy evolving with the industry. |
| | BIM Data Management | Unstructured data management; frequent data loss. | Beginning to organise and structure data. | Established procedures; central repositories. | Advanced systems with analytics. | Fully integrated system with business intelligence. |
| | BIM Standards and Protocols | No or minimal adherence to standards; non-standardised processes. | Early adoption of standards; internal protocols in development. | Full compliance with standards. | Continuous refinement and optimisation. | Industry-setting standards shared broadly. |
| | BIM Legal and Contractual Aspects | Contracts overlook BIM requirements; unclear risks. | Initial incorporation of BIM in contracts. | Clearly outlined roles and responsibilities. | Proactive risk management practices. | Advanced legal and contractual innovation. |
| | BIM Integration with Existing Processes | No or minimal integration. | Some project-specific efforts. | Good integration. | BIM enables innovation. | Integral to all business processes. |
| | Cyber Security | No or minimal focus on cyber security. | Basic practices with growing awareness. | Integrated strategies tailored to BIM processes. | Advanced proactive measures. | State-of-the-art security with improvement. |

