Article

Modelling Project Control System Effectiveness in Saudi Arabian Construction Project Delivery

School of Architecture, Building, and Civil Engineering, Loughborough University, Loughborough LE11 3TU, UK
*
Author to whom correspondence should be addressed.
Buildings 2025, 15(18), 3426; https://doi.org/10.3390/buildings15183426
Submission received: 18 August 2025 / Revised: 18 September 2025 / Accepted: 19 September 2025 / Published: 22 September 2025
(This article belongs to the Section Construction Management, and Computers & Digitization)

Abstract

Persistent cost overruns, schedule delays, and weak control mechanisms continue to hinder construction project delivery in Saudi Arabia, where 64% of projects exceeded their planned time and 53% experienced cost overruns. Although project control systems (PCSs) have received increasing attention, existing research lacks an empirically grounded and theory-informed framework explaining how project control system determinants (PCSDs) influence performance. This study addresses this gap by developing and testing an Input–Process–Output (IPO) model linking organisational, human, and technological inputs with operational control stages and project outcomes. Data were collected from 222 completed construction projects in Saudi Arabia using a cross-sectional survey of professionals directly involved in their delivery. Partial Least Squares Structural Equation Modelling (PLS-SEM) was applied to test hypothesised relationships, supported by Importance–Performance Map Analysis (IPMA) to identify high-impact but underperforming areas. Seventeen of twenty hypotheses were supported, highlighting the dominant role of post-operational controls, the significant indirect influence of in-operational controls, and the strongest total effects, those of organisational factors on project performance through control processes. The IPMA results identified leadership and team capacity, estimation accuracy, stakeholder integration, PMO engagement, audits, knowledge management, and corrective scheduling actions as priority areas for improvement. This study provides the first empirical test of a multi-dimensional PCS effectiveness model in the region, contributing both to the academic literature and practical efforts aimed at improving project delivery outcomes in alignment with national development goals, such as Saudi Vision 2030.

1. Introduction

Construction projects worldwide are inherently complex and subject to significant uncertainty, often due to broad scopes, evolving designs, multiple stakeholder interfaces, and dynamic environments [1,2]. These conditions often result in persistent cost overruns and schedule delays. Global evidence underscores the scale of the challenge. Flyvbjerg et al. [3], in a study of 285 projects across 20 countries, found that 9 out of 10 experienced cost overruns. Similarly, KPMG [4] reported that fewer than one-third of projects were delivered within budget, and only one-quarter were delivered on schedule. The Construction Industry Institute (CII) reached the same conclusion: out of nearly 1000 projects assessed, only 53 achieved their original cost and time targets [5,6]. These findings highlight the urgent global need for more effective project control systems (PCSs).
PCSs are broadly defined as structured frameworks that integrate people, processes, and tools to monitor progress, identify deviations, and implement corrective actions that keep projects aligned with cost, schedule, and scope objectives [7,8]. When effectively implemented, PCSs have been shown to reduce delays by up to 15% and deliver cost savings of more than 10% [9,10]. Conversely, weak or inconsistent control practices frequently result in poor performance [11,12].
These challenges are especially pronounced in Saudi Arabia, where construction activity is expanding rapidly under Vision 2030. Despite heavy investment and greater emphasis on project management, PCS implementation remains largely ineffective, contributing to widespread underperformance. Studies indicate that around 70% of large projects experience schedule delays [1], while more than 80% report cost overruns [13]. Similarly, according to [14,15], the Ministry of Municipal and Rural Affairs (MOMRA) reported in 2017 that 75% of its projects failed to meet their initial goal. More recent analyses confirm this trend, with studies of projects completed between 2012 and 2022 reporting delays in nearly all cases and cost overruns in the majority [16]. Collectively, these deficiencies pose significant risks to the timely and cost-effective delivery of infrastructure projects central to the Kingdom’s development agenda.
Although the importance of PCSs is widely recognised, much of the existing research remains fragmented. Prior studies have tended to focus narrowly on operational tools, such as earned value management or scheduling techniques, while overlooking how broader organisational, technological, and human determinants interact with control mechanisms to shape project outcomes [7,17,18]. Moreover, PCS effectiveness goes beyond monitoring intensity and procedural compliance. Scholars highlight the need for a balanced approach that integrates behavioural control, outcome control, and trust-based mechanisms, particularly in complex, multi-stakeholder environments [19,20]. However, few frameworks capture these interconnections, and empirical validation remains limited.
To address this gap, this study builds on Alotaibi et al. [7]’s classification of project control system determinants (PCSDs) and applies the Input–Process–Output (IPO) model as its theoretical foundation. The IPO framework conceptualises projects as open systems [21,22,23,24] in which inputs (organisational, human, and technological conditions) shape processes (operational controls) that ultimately influence outputs (project performance). This approach, widely applied in team and systems research, provides both conceptual clarity and practical value in analysing causal relationships in construction management [25,26,27].
Accordingly, in this study, a unified PCS framework is developed and empirically tested using data from 222 projects completed in Saudi Arabia. The model examines both direct and indirect effects of PCSDs on project performance and employs Partial Least Squares Structural Equation Modelling (PLS-SEM) to test interrelationships, complemented by Importance–Performance Map Analysis (IPMA) to identify high-impact but underperforming areas. The contributions of this study are threefold: (i) theoretical, by delivering the first empirically validated multi-dimensional PCS framework in the region and refining influence pathways across control stages; (ii) methodological, by applying PLS-SEM and IPMA in combination to capture both direct/indirect effects and priority areas, supported by sequential mediation analysis; and (iii) contextual and practical, by offering sector-specific insights into delay and cost overrun drivers and providing a practical roadmap for managers and policymakers to enhance PCS effectiveness in alignment with Vision 2030. Figure 1 provides a structured overview of these contributions, distinguishing theoretical, methodological, and contextual/practical dimensions.
The study addresses the following research questions:
  • RQ1: How do organisational, human, and technological PCS determinants influence operational control determinants in construction projects?
  • RQ2: How do operational control determinants influence project performance in construction projects?
  • RQ3: To what extent do operational control determinants mediate the relationship between organisational, human, and technological PCS determinants and project performance?
  • RQ4: Which areas should be prioritised to enhance PCS effectiveness in Saudi construction project delivery?
The remainder of this paper is structured as follows: Section 2 and Section 3 review the relevant literature and outline the conceptual model and hypotheses. Section 4 describes the research design, data collection, and analysis approach. Section 5 presents the results, followed by a discussion of key findings in Section 6. Finally, Section 7 and Section 8 contain limitations and directions for future research and the study conclusions, respectively.

2. Literature Review

2.1. Overview of Project Control Systems in Construction

A PCS is a structured mechanism that integrates resources, people, and processes to monitor project performance, identify deviations, and implement corrective actions, ensuring delivery within the defined time, cost, and scope constraints [7,8]. PCS implementation typically follows four core steps: performance planning, actual progress measurement, deviation analysis and reporting, and corrective actions [28,29]. The system supports timely evidence-based decision-making by providing accurate and up-to-date project information. Consequently, data reliability, control processes, and stakeholder engagement form essential components of an effective PCS.
In construction projects characterised by complexity, uncertainty, and multi-stakeholder coordination, the importance of robust control mechanisms becomes especially pronounced [2,30]. PCSs provide structured oversight throughout the project life cycle, enhancing management capabilities across planning, execution, and closing stages. Empirical studies have demonstrated that effective control systems can significantly reduce delays, mitigate cost overruns, and improve alignment with project objectives [8,12,18,31]. Conversely, poor project performance, including delays, cost overruns, and scope creep, is often associated with inadequate or poorly implemented PCS practices [12,32].
In the Saudi Arabian construction sector, the need for effective PCSs is particularly urgent. Studies show that approximately 70% of public projects experience schedule delays, while nearly 80% face cost overruns [33,34]. Case-level investigations reinforce this trend: Mahamid [35] found that all 55 projects studied between 2011 and 2015 exceeded planned durations by an average of 58%, and Aldossari [16] reported that 95% of 27 projects completed between 2012 and 2022 experienced average delays of 160%, with 80% also facing cost overruns. These figures mirror global evidence, where project delays and cost overruns are widespread. For example, in the UK and US, 85 of 113 projects experienced delays, and 53 reported cost overruns [36]; in Hong Kong, 89% of 57 projects recorded cost overruns averaging 46% [37]; in Malaysia, 55% of 359 projects exceeded their budgets [38]; and in Ghana, 70% of 45 projects faced delays averaging 17 months, and 52% experienced cost overruns [39]. These persistent issues, both locally and globally, underscore the necessity of strengthening PCS practices, particularly in Saudi Arabia, where effective control is critical to achieving the infrastructure ambitions of Vision 2030. A precise analysis of how control mechanisms function and interact is thus essential to improving performance outcomes.
The effectiveness of PCSs varies widely, often influenced by contextual factors across organisational, human, technological, and operational domains [7,40,41]. However, a closer examination of the literature reveals an imbalance in emphasis: while operational aspects receive the most attention (36%), organisational (25%), technological (17%), and human (4%) factors remain comparatively underexplored [7]. Furthermore, few studies examine how these dimensions interact, particularly in the complex real-world environments of construction projects [7]. Although operational control mechanisms are widely referenced in the literature, many studies overlook how these are shaped or constrained by organisational, human, and technological contexts. As a result, the literature offers a fragmented rather than holistic view of the PCS as a system operating under real-world project conditions. Few studies examine how pre-operational baselines, in-operational tracking, and post-operational reporting and corrective mechanisms function in coordination. Even fewer investigate how these control processes interact with team behaviours, digital tools, or governance structures throughout the PCS control lifecycle, limiting clarity on how integrated control systems influence project outcomes.
From human perspectives, studies have shown that project team behaviours, leadership, and team competencies influence control implementation [8,32,42]. However, these are rarely treated as core components of a PCS. Similarly, organisational constraints, such as the absence of standardised control processes, continue to undermine implementation efforts [40], suggesting a need for more integrated governance approaches [5,8,20]. On the technological side, tools such as data analytics, Building Information Modelling (BIM), and real-time monitoring systems are increasingly adopted in project control, although their integration into PCS-related decision-making frameworks remains inconsistent [17,18,43]. For instance, Waqar et al. [44] showed that BIM implementation enhances construction project success through accurate estimation, resource allocation, risk management, facility integration, and real-time monitoring. More importantly, when integrated with a Common Data Environment (CDE), BIM operates not just as a digital tool, but as a structured methodology that integrates planning, estimation, and control functions within a unified data environment [45,46,47]. Yet, despite this potential, prior studies have not fully explained how such technological factors influence operational control stages within a unified PCS framework, a gap addressed in this study.
Crucially, the literature does not consistently establish whether and which of these organisational, human, and technological factors enhance operational control or improve the overall performance outcomes. This fragmented outlook reflects a broader lack of empirical evaluation regarding how organisational, human, technological, and operational determinants interact and collectively shape the overall effectiveness of PCSs in construction project delivery. This multidimensional perspective is especially absent in the context of Saudi Arabia. Existing studies tend to address isolated challenges or generic best practices, offering little in terms of tailored frameworks or empirical insights that reflect the unique institutional and operational realities of the Kingdom’s construction sector.

2.2. Review of the Existing Empirical Framework and Models of PCS

While interest in PCSs has grown, many empirical models remain fragmented, lacking a unified view of control effectiveness. Prior studies have commonly addressed PCSs through isolated lenses, focusing either on tools such as earned value or scheduling techniques [7,17,18]. This division has limited the development of integrated frameworks capable of explaining how organisational, human, technological, and operational determinants interact throughout project execution.
For instance, Olawale and Sun [48] identified inhibitors of time and cost control in the UK construction industry, including risk, uncertainty, and planning deficiencies. However, their model primarily focused on in-process challenges, without addressing upstream enablers, their interrelationships, or their links to project performance. Le and Sutrisna [32] introduced a project cost control framework that included process maturity and enabling factors but did not test their direct or mediating effects on project performance. Similarly, Jawad et al. [8] examined six PCS maturity elements in the Saudi petroleum and chemical industry yet did not explore inter-construct effects or their cumulative influence on project outcomes. Their study reaffirmed the relevance of earned value, governance programmes, and resource planning in improving schedule and cost control; however, again, the relationships among constructs were treated independently.
Other relevant studies, such as [40], underscored the importance of technical expertise, clear responsibilities, and structured breakdown systems but did not assess how these elements interacted to affect project performance. Orgut et al. [5] identified 15 project control success factors, covering scope, cost, risk, communication, and audit processes. While comprehensive in coverage, their framework lacked analytical testing of interdependencies or performance impacts. Likewise, Yean et al. [12] found that factors such as schedule flexibility and project manager competence influenced the cost and quality outcomes, but their analysis focused on direct associations without accounting for interaction or mediation effects across control phases.
Moreover, few studies have adopted theoretical models to guide PCS development, and even among those proposing integrated frameworks, structured approaches such as the IPO model have rarely been applied. The IPO framework is particularly suited to this study as it provides a systematic means of linking organisational, human, and technological inputs to multi-stage control processes and, ultimately, to project performance outcomes. Likewise, PLS-SEM and IPMA were chosen not because of their limited prior use in PCS research, but because they offer the most appropriate tools for modelling complex, multi-construct relationships, examining sequential mediation, and translating analytical results into targeted improvement strategies. In line with this rationale, the present study develops and empirically tests a unified PCS model grounded in the IPO framework and building on the framework developed by [7] (Figure 2).
The model integrates empirical evidence on organisational, human, technological, and operational determinants across all stages of control. It advances prior research by (1) modelling inter-construct relationships, (2) testing both direct and indirect effects on project performance through PLS-SEM, and (3) applying IPMA to identify priority areas for improvement. This integrated approach provides a theory-informed and context-specific contribution that addresses longstanding limitations in PCS research and offers actionable insights for strengthening project control in construction delivery.

3. Conceptual Model and Hypotheses

3.1. Conceptual Model Overview

This study adopts the IPO model to develop an integrated model for evaluating PCS effectiveness within the construction sector. The IPO model provides a robust theoretical foundation for examining how inputs, comprising organisational, human, and technological determinants, shape the operational control determinants (processes), which, in turn, influence project performance (outputs). In this framework, determinants represent high-level construct groupings, each comprising multiple underlying factors. This layered approach enables a comprehensive analysis of both the individual and cumulative impact of control determinants across all stages of the execution phase.
The conceptual model is grounded in a study conducted by [7], which identified, synthesised, and categorised empirical evidence on key PCSDs. While the prior study offered a classification of these factors, it did not empirically test their relationships or structure them within an integrated model. The current study addresses this gap by developing and empirically testing a model that integrates and captures the interdependencies between input conditions and control processes, as well as their ultimate effect on project outcomes. The model comprises three main components:
  • Input determinants: Organisational (OPCSD), human (HPCSD), and technological (TPCSD) factors that collectively shape the environment for PCS implementation.
  • Process constructs: Operational control mechanisms structured into four interrelated subcomponents, namely pre-operational (Pre-OCD), in-operational (In-OCD), post-operational (Post-OCD), and uncertainty-related control (UOCD), reflecting key functional stages of the project control cycle.
  • Output construct: Project performance (PP), measured through adherence to time, cost, and scope targets, representing the direct outcome of effective control processes.
Figure 3 illustrates the full conceptual structure and the hypothesised pathways among these constructs. The model recognises both direct effects (e.g., from operational controls to project performance) and indirect relationships (e.g., from inputs through processes to outputs). Importantly, the model includes sequential links among the control mechanisms themselves, such as the progression from pre-operational to in- and post-operational control, in line with recognised control cycles in project control concepts. Additionally, uncertainty control is modelled as a cross-cutting factor that spans all operational stages and directly influences project outcomes. By applying this structured model, the study enables a detailed investigation of both the individual and combined effects of PCS determinants on performance, offering a cohesive explanation of how control systems function within the complex environment of construction project delivery.

3.2. Hypothesis Development

Analysing how PCSDs influence construction project performance requires establishing a structured set of hypotheses grounded in theory-informed logic and supported by empirical evidence. This study draws on the IPO model to examine both direct and indirect relationships between input constructs (organisational, human, and technological determinants), process constructs (operational control mechanisms), and output outcomes (project performance). The following hypotheses are formulated to examine these relationships, beginning with the influence of organisational conditions on PCS operational effectiveness (see the summary in Table 1). This section elaborates on the core constructs incorporated in the model and presents the hypotheses formulated to examine their interrelationships. Each construct is theoretically grounded in the study conducted by [7] and operationalised to reflect its specific function within the PCS framework.

3.2.1. H1: Positive Influence of Organisational PCS Determinants on Operational and Human Determinants

Organisational PCS determinants are expected to shape all phases of operational control by providing executive support, governance, and stakeholder alignment. These elements are consistent with project governance theory, which emphasises the significance of coordinated stakeholder processes, leadership commitment, and clear accountability structures to facilitate decision-making and control throughout the project lifecycle [19,49,50,51]. A well-structured organisation creates the conditions for effective pre-control planning, real-time monitoring, post-project corrections, and uncertainty management. Previous studies have confirmed that top management support, oversight mechanisms, and inter-stakeholder coordination significantly influence operational elements of PCS implementation and efficiency [7,8,30]. In addition, organisational structure can affect internal communication, team performance, and even the integration of technological tools, suggesting that it may also influence the human dimension of PCSs.
H1a. 
Organisational PCS determinants positively influence pre-operational control determinants.
H1b. 
Organisational PCS determinants positively influence in-operational control determinants.
H1c. 
Organisational PCS determinants positively influence post-operational control determinants.
H1d. 
Organisational PCS determinants positively influence uncertainty operational control determinants.
H1e. 
Organisational PCS determinants positively influence human PCS determinants.

3.2.2. H2: Positive Influence of Human PCS Determinants on Operational and Technological Determinants

Human PCS determinants are expected to play a pivotal role in translating organisational strategy into operational control practices. These determinants include leadership, team skills, and collaborative behaviour, all of which directly support the implementation and execution of PCS mechanisms. When project teams possess relevant skills and exhibit strong collaboration, they are better able to manage planning, performance tracking, reporting, and risk mitigation processes [18,40]. These components are consistent with the Behavioural Theory of the Firm [52,53,54], which emphasises the importance of decision-making under bounded rationality, organisational learning, and group dynamics, which are particularly pertinent in complex project environments. Furthermore, team competence affects how well organisations adopt and leverage digital tools for real-time monitoring, analytics, and predictive modelling [7,12].
Human capabilities are also critical for the uptake and effective use of technology. Skilled and adaptive teams are more likely to engage with digital tools, align them with control needs, and integrate them into project workflows. As such, human PCS determinants are hypothesised to not only affect each operational control stage but also influence the technological environment in which PCSs operate.
H2a. 
Human PCS determinants positively influence pre-operational control determinants.
H2b. 
Human PCS determinants positively influence in-operational control determinants.
H2c. 
Human PCS determinants positively influence post-operational control determinants.
H2d. 
Human PCS determinants positively influence uncertainty operational control determinants.
H2e. 
Human PCS determinants positively influence technological PCS determinants.

3.2.3. H3: Positive Influence of Technological PCS Determinants on Operational Control Determinants

Technological PCS determinants refer to the digital capabilities, platforms, and methodologies that support automation, tracking, and real-time responsiveness in project control systems. These include project dashboards, predictive analytics, real-time monitoring technologies, and BIM [7]. The effective use of these technologies is essential for managing all stages of control, from planning and scheduling to execution monitoring, reporting, and risk response. Importantly, BIM integration is inherently tied to the CDE, which provides a project-specific information backbone that ensures that modelling outputs and project data are consistently collected, stored, and shared across stakeholders [46,55]. In this study, BIM is recognised not only as a modelling and decision-support tool but also, when combined with CDE, as a methodological framework that structures information flows, modelling, planning, and control activities across the PCS lifecycle [45,46,47]. The CDE sustains BIM’s functionality by enhancing process integrity, collaboration, and responsiveness in PCS implementation [46].
Within the Technology–Organisation–Environment (TOE) framework [56,57,58] and the Resource-Based View (RBV) [59], technological competency is conceptualised as a strategic resource. Well-integrated systems such as BIM–CDE platforms, predictive analytics, and digital dashboards improve project visibility, accuracy, and responsiveness by enabling accurate baselining, proactive deviation detection, and structured feedback loops [7,60]. Methodologically, BIM–CDE supports (i) pre-operational control through model-based planning, scheduling, and reliable estimation [44,61,62]; (ii) in- and post-operational control through real-time progress tracking, 4D/5D visualisations, deviation alerts, digital as-built records, knowledge capture, and optimised resource allocation [45,63,64]; and (iii) uncertainty-oriented control through scenario modelling, risk simulations, and proactive and adaptive decision-making [44,61,63]. For example, 4D BIM enhances schedule predictability by visually forecasting potential delays [63,65], while the integration of earned duration metrics with BIM enables managers to visualise float and address deviations before they escalate [66,67]. Similarly, UAVs, photogrammetry, and AI-driven dashboards support progress monitoring and forecasting accuracy [68], though their benefits remain limited when applied in isolation. By contrast, an integrated PCS supported by BIM–CDE, predictive analytics, and digital dashboards provides a methodological framework that surpasses conventional tools such as earned value management, which often lag in detecting real-time deviations [69].
Accordingly, this study hypothesises that technological PCS determinants positively influence each of the four operational control mechanisms (Pre-, In-, Post-, and UOCD).
H3a. 
Technological PCS determinants positively influence pre-operational control determinants.
H3b. 
Technological PCS determinants positively influence in-operational control determinants.
H3c. 
Technological PCS determinants positively influence post-operational control determinants.
H3d. 
Technological PCS determinants positively influence uncertainty operational control determinants.

3.2.4. H4–H7: Positive Influence of Operational Control Determinants on Project Performance

The core function of PCSs is to enhance project performance through structured planning, continuous monitoring, corrective actions, and uncertainty management. In this study, performance is assessed using the widely adopted Iron Triangle framework, encompassing cost, time, and scope metrics [70,71]. The effectiveness of PCSs is thus reflected in how well these three dimensions are maintained throughout project delivery.
Each operational control determinant plays a distinct yet interrelated role in achieving these outcomes, detailed as follows: Pre-operational controls, such as detailed planning and scheduling, lay the foundation for scope clarity and baseline alignment [32,40]. In-operational controls allow for real-time tracking and performance analysis, helping to minimise deviation during execution [32,41]. Post-operational controls provide feedback and correction mechanisms, ensuring project performance stability. Uncertainty control acts as a continuous overlay, equipping projects to proactively manage risks and change across all control stages [7,8]. Together, these controls form a comprehensive system for performance optimisation.
H4. 
Pre-operational control determinants positively influence project performance.
H5. 
In-operational control determinants positively influence project performance.
H6. 
Post-operational control determinants positively influence project performance.
H7. 
Uncertainty operational control determinants positively influence project performance.

3.2.5. H8–H9: Sequential Nature of Operational Control

In effective project control systems, the pre-, in-, and post-operational phases are not isolated stages but form a continuous interdependent cycle. Pre-operational control activities, such as planning and scheduling, provide the baseline conditions that inform real-time execution monitoring [32]. In turn, insights gathered during in-operational tracking are essential for accurate post-project reporting and implementing corrective actions [29,32,70]. Without this sequence, PCS mechanisms become fragmented, reactive, and less capable of supporting cost, schedule, and scope stability [18,40]. This framework clearly corresponds with the project control paradigm, emphasising control as a feedback-oriented process that assesses actual performance against established objectives and initiates corrective measures upon the occurrence of deviations [30,70,72]. It also utilises Project Operations Management Theory, which perceives project delivery as a sequence of operational decisions and control points designed to optimise performance across the project’s life cycle [73,74,75]. Therefore, this study models these operational stages as a cascading process in which earlier phases directly support the effectiveness of subsequent ones.
H8. 
Pre-operational control determinants positively influence in-operational control determinants (In-OCDs).
H9. 
In-operational control determinants positively influence post-operational control determinants (Post-OCDs).
Drawing on these patterns, the present study hypothesises that organisational, human, and technological inputs, mediated by layered operational controls, positively influence project performance across cost, schedule, and scope dimensions. These relationships are tested through a structured IPO model, with specific sub-hypotheses examining both direct and mediated effects. The next section outlines the methodology adopted to operationalise these hypotheses and empirically test the proposed model within the Saudi construction context.

4. Methodology

4.1. Research Design

This study adopted a quantitative and cross-sectional research design to evaluate the effectiveness of PCSs in construction project delivery, as shown in Figure 4. The approach is deductive, using hypotheses derived from a structured literature review to test the interrelationships between PCSDs and project performance outcomes in the Saudi construction sector. This design is well suited to examining real-world practices in which multiple interrelated constructs influence project outcomes. A survey method was employed to gather data from professionals involved in completed construction projects, enabling the collection of perspectives that are otherwise difficult to directly observe [76,77].
The IPO framework underpins the conceptual model, separating input factors (organisational, human, and technological determinants), process mechanisms (operational control determinants), and output performance indicators (cost, time, and scope). To empirically test this model, PLS-SEM was selected due to its robustness in handling small to moderate sample sizes, tolerance for non-normal data, and the ability to model complex relationships including mediation and latent constructs [78,79]. The method supports both explanatory and predictive purposes, making it well suited for this study, which primarily aims to explain the interrelationships among PCS determinants while also assessing their predictive relevance for project performance. The integration of IPMA further enhances the managerial relevance of the findings by highlighting key priority areas. This systematic methodological approach ensures a rigorous and context-sensitive analysis of PCS effectiveness in the Saudi construction sector.

4.2. Instrument Development

The measurement instrument in this study was designed to empirically investigate the relationships between PCSDs and project performance in completed construction projects in Saudi Arabia. Its development was grounded in [7], which consolidated empirical evidence from PCS research and identified a comprehensive set of determinants relevant to construction project delivery. This prior work provided the theoretical foundation for the conceptual model by grouping PCSDs into four overarching dimensions: organisational, human, technological, and operational. In this framework, determinants represent high-level construct groupings, each comprising multiple underlying factors. Each determinant was further divided into sub-determinants, resulting in 59 measurable indicators to support a structured empirical evaluation (see Table 2, Table 3 and Table 4).
The questionnaire was developed from these indicators following established survey design principles of neutrality, clarity, and consistency [146,147,148]. A closed-ended design was adopted to facilitate quantitative analysis, with items formulated from the PCSDs identified by [7] and complemented by validated measures from previous studies to ensure alignment with the research framework. As recommended by [149] and Hair et al. [150], constructs and indicators were refined into concise, measurable statements. This approach aligns with guidance from Mackenzie et al. [151] and Wilson [152] on linking conceptual domains to observable indicators. All items were rated on a five-point Likert scale (1 = very low, 5 = very high), a widely used format in construction management research for its reliability, interpretability, and suitability for statistical analysis, originally developed by Likert in 1932 [153]. It enables respondents to express their level of evaluation, agreement or experience, thereby facilitating the collection of detailed and statistically valuable data.
The questionnaire comprised three main sections. The first collected descriptive data on the respondent and their chosen project, including both planned and actual duration and cost. The second assessed project performance using cost, schedule, and scope as outcome criteria. The third measured the implementation of the 13 PCSDs in the selected project, with each determinant assessed through multiple short, context-specific statements. The survey was designed for completion within 15–20 min, with neutral and consistently structured questions to reduce bias and maintain engagement.
Instrument validation followed a two-phase process. First, four academic and industry experts, each with over 11 years’ experience in project and construction management, reviewed the draft questionnaire for its alignment with the conceptual framework, clarity, and structure, as recommended by [154]. Their feedback was used to refine the survey’s coherence and precision. Second, a pilot study was conducted with 34 construction professionals, including 31 project managers and 3 contract managers. The sample was diverse: 68% were from the public sector, 56% had experience in residential projects, 53% worked in consultancy roles, and over 68% had more than two decades of experience (see Appendix A).
Pilot feedback indicated high clarity: 79% rated the instructions as very clear, 76% confirmed that the vocabulary was widely understood in the sector, and 91% found all questions relevant to the research topic. The average completion time was 14 min. Although the questionnaire was deemed well-structured and aligned with the study objectives, minor adjustments were made to further streamline instructions and wording without compromising data quality. The refined and validated instrument was then deployed for large-scale data collection, enabling robust assessment of PCS implementation and its impact on actual project performance.

4.3. Sample and Data Collection

This study employed a non-probability sampling approach, combining purposive and snowball techniques to target professionals involved in completed construction projects across Saudi Arabia. These methods were selected due to the absence of a comprehensive sampling frame for the target population and the limited time available for data collection. Initially, purposive sampling was used to identify construction organisations and projects most likely to yield relevant and informed participation. To ensure sectoral diversity, data were collected from residential, industrial, service, and infrastructure projects. Within each sector, two of the largest organisations (by project volume and workforce size) were contacted, as these entities typically manage a wide range of complex projects across the country and employ professionals with substantial experience in project control implementation.
The study was conducted in accordance with established ethical standards to protect the participants’ rights, dignity, and confidentiality. Participation was entirely voluntary, and the respondents received an information sheet outlining the study’s purpose, their rights, and options for withdrawal. Informed consent was obtained before data collection. No personal or sensitive identifiers were gathered, and all responses were anonymised. Data were stored securely, used solely for academic purposes, and accessible only to the research team. The questionnaire was designed to avoid intrusive content, and clear instructions were provided to minimise ambiguity.
Questionnaires were disseminated online through self-administration, with organisations asked to circulate them internally to eligible personnel. To ensure the responses reflected real project conditions rather than abstract perceptions, each respondent was instructed to select one completed project in which they had been directly involved, prioritising the most recent and largest to provide an accurate evaluation of PCS practices and associated performance outcomes. This approach grounded the data in actual delivery experiences across residential, industrial, service, and infrastructure projects, thereby enhancing the applied relevance of the findings.
Sample adequacy was assessed using established criteria for PLS-SEM. Following [150,155], the minimum required sample size was 130, based on the maximum number of structural paths leading to a single construct. A second calculation using the inverse square root method of Kock et al. [156] confirmed a minimum threshold of 155 cases for statistical validity at a 5% significance level and a minimum expected path coefficient of 0.2. A total of 222 valid responses were obtained and included in the analysis, exceeding both recommended thresholds. To assess potential non-response bias, the time-trend extrapolation method proposed by Armstrong and Overton [157] was applied by comparing early (first 25%) and late (last 25%) respondents across key demographic and model variables using independent-samples t-tests. No statistically significant differences were found (p > 0.05), suggesting that non-response bias was unlikely to have influenced the results. The final sample size was therefore deemed sufficient to support model estimation, hypothesis testing, and subgroup reliability.
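For illustration, a minimal sketch of the inverse square root calculation is given below, assuming the constant of approximately 2.486 commonly associated with the 5% significance level in this method; with a minimum expected path coefficient of 0.2, it reproduces the 155-case threshold reported above.

```python
import math

def min_sample_inverse_sqrt(p_min: float, z: float = 2.486) -> int:
    """Minimum PLS-SEM sample size via the inverse square root method:
    n_min > (z / |p_min|)**2, with z ~ 2.486 at the 5% significance level."""
    return math.ceil((z / abs(p_min)) ** 2)

# Minimum expected path coefficient of 0.2 at the 5% significance level
print(min_sample_inverse_sqrt(0.2))  # 155, matching the threshold reported above
```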

4.4. Data Analysis Procedure

PLS-SEM was selected for this study due to its ability to analyse complex, high-dimensional data, evaluate relationships among latent variables, and handle skewed distributions effectively [79]. Unlike covariance-based SEM, PLS-SEM is variance-oriented and imposes fewer restrictions on data normality, making it highly suitable for practice-based fields such as construction [78]. In construction management research, it offers advantages for examining both direct and indirect effects, as well as mediation, allowing for a structured and comprehensive evaluation of interrelationships among PCSDs and their influence on project performance [79,158]. In addition, PLS-SEM integrates effectively with IPMA, enabling the identification of constructs and indicators that are both influential and underperforming [159].
The analysis followed the two-stage procedure outlined by Hair et al. [150,155]. The first stage involved evaluating the reflective measurement model to ensure the reliability and validity of the constructs. Indicator reliability was evaluated through factor loadings, where values of 0.70 or higher indicate that over 50% of the variance in the observed variable is explained by the underlying construct [150]. Internal consistency was assessed using Cronbach’s Alpha and Composite Reliability (CR), with 0.70 as the minimum acceptable threshold [150,160]. Convergent validity was confirmed by calculating the Average Variance Extracted (AVE), with values of at least 0.50 indicating that the construct explains more than half of the variance in its indicators [150]. Discriminant validity was assessed using the heterotrait–monotrait (HTMT) ratio, with values below 0.85 for distinct constructs and below 0.90 for closely related constructs considered acceptable [150,161].
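As a simple illustration of two of these criteria, the sketch below applies the standard CR and AVE formulas to a set of hypothetical standardised loadings; the loadings and resulting values are illustrative only and are not taken from this study's data.

```python
import numpy as np

def composite_reliability(loadings) -> float:
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    l = np.asarray(loadings, dtype=float)
    return float(l.sum() ** 2 / (l.sum() ** 2 + (1 - l ** 2).sum()))

def average_variance_extracted(loadings) -> float:
    """AVE = mean of the squared standardised loadings."""
    l = np.asarray(loadings, dtype=float)
    return float((l ** 2).mean())

# Hypothetical standardised loadings for one reflective construct
loads = [0.78, 0.82, 0.85, 0.74]
print(round(composite_reliability(loads), 3))       # ~0.875 (above the 0.70 threshold)
print(round(average_variance_extracted(loads), 3))  # ~0.638 (above the 0.50 threshold)
```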
The second stage involved evaluating the structural model following the six-step process recommended by Hair et al. [150,155]. Model fit was assessed using the Standardised Root Mean Square Residual (SRMR), with values below 0.08 indicating adequate fit [155,162]. Variance Inflation Factor (VIF) values were checked for collinearity, with values below 5 suggesting no multicollinearity issues [163]. Path significance was tested using a bootstrapping procedure with 10,000 resamples; paths with t-values greater than 1.96 and p < 0.05 were deemed significant [150,164]. Explanatory power (R2) was used to measure the variance explained in each endogenous construct, while predictive relevance (Q2) was evaluated using PLSpredict, with values above zero indicating predictive relevance [150,155,165]. Effect size (f2) was also assessed, with values of 0.02, 0.15, and 0.35 interpreted as small, medium, and large effects, respectively [166,167].
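The effect-size criterion can likewise be illustrated with a short sketch computing Cohen's f2 from the R2 of an endogenous construct estimated with and without a given predictor; the R2 values used here are hypothetical.

```python
def cohens_f2(r2_included: float, r2_excluded: float) -> float:
    """f2 = (R2_included - R2_excluded) / (1 - R2_included)."""
    return (r2_included - r2_excluded) / (1 - r2_included)

def interpret_f2(f2: float) -> str:
    """Thresholds of 0.02, 0.15, and 0.35 for small, medium, and large effects."""
    if f2 >= 0.35:
        return "large"
    if f2 >= 0.15:
        return "medium"
    if f2 >= 0.02:
        return "small"
    return "negligible"

# Hypothetical R2 of an endogenous construct with and without one predictor
f2 = cohens_f2(r2_included=0.70, r2_excluded=0.62)
print(round(f2, 3), interpret_f2(f2))  # 0.267 medium
```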
IPMA was conducted to complement the PLS-SEM results by mapping each determinant’s importance (total effects) against its performance (average latent variable scores) in a four-quadrant matrix (Figure 5). The X-axis represents importance and the Y-axis represents performance, enabling prioritisation of improvement efforts [159]. The lower-right quadrant (high importance, low performance) indicates the most urgent areas for intervention, followed by the upper-right, lower-left, and upper-left quadrants [159].
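A minimal sketch of this IPMA logic is given below, assuming the standard rescaling of average latent variable scores from the five-point Likert scale to 0–100 and a mean-split assignment of constructs to the four priority quadrants; the example construct and the map means are hypothetical.

```python
def rescale_performance(lv_mean: float, scale_min: float = 1.0, scale_max: float = 5.0) -> float:
    """Rescale an average latent variable score to 0-100 (standard IPMA rescaling,
    assuming the five-point Likert scale used in this survey)."""
    return (lv_mean - scale_min) / (scale_max - scale_min) * 100

def ipma_quadrant(importance: float, performance: float,
                  mean_importance: float, mean_performance: float) -> str:
    """Assign a construct to an IPMA priority quadrant relative to the map's mean lines."""
    if importance >= mean_importance and performance < mean_performance:
        return "1st priority (high importance, low performance)"
    if importance >= mean_importance:
        return "2nd priority (high importance, high performance)"
    if performance < mean_performance:
        return "3rd priority (low importance, low performance)"
    return "4th priority (low importance, high performance)"

# Hypothetical construct: mean item score 3.5, total effect 0.45 on the target construct
perf = rescale_performance(3.5)               # 62.5
print(ipma_quadrant(0.45, perf, 0.28, 60.0))  # 2nd priority (high importance, high performance)
```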
All analyses were conducted using SmartPLS version 4.1, chosen for its robust SEM capabilities, integrated IPMA functionality, and widespread use in project management and construction research [159,168]. SmartPLS also supports advanced techniques such as multi-group analysis, permutation testing, and prediction-oriented segmentation, which contribute to a rigorous and replicable analytical process.
This method and software combination enabled a comprehensive, flexible, and rigorous evaluation of the hypothesised relationships within the conceptual model. It supported the examination of both theoretical constructs and practical performance implications, aligning closely with the study’s aim of assessing PCS effectiveness in the context of construction project delivery in Saudi Arabia.

5. Results

5.1. Respondent and Project Profile

A total of 222 valid responses were collected, with each respondent asked to select one specific completed project and complete the questionnaire based on their direct involvement. The respondent group consisted primarily of project managers (65.8%), with additional representation of construction managers (11.7%), project control managers (1.8%), and executive-level professionals (3.6%). The majority had more than 11 years of professional experience, with 34.2% having over 20 years (see Table 5).
The projects covered a range of types, with residential (30.6%) and infrastructure (27.9%) projects being the most common. Public sector projects constituted 52.7% of the sample, with the remainder being private sector developments. Respondents' organisational roles comprised clients (33.8%), consultants (28.8%), contractors (32.4%), and sub-contractors (the remainder). The dataset indicates that the project delivery methods covered integrated project delivery (IPD) at 42.3% and design-bid-build (DBB) at 35.1%, with the remaining methods detailed in Table 5.
Project durations varied widely, with 30.6% of projects lasting between 25 and 36 months and 20.7% lasting between 1 and 12 months or between 19 and 24 months. In terms of budget, 26.6% of projects had values between SAR 100 M and 499 M, and 17.1% had values above SAR 500 M (see descriptive statistics in Appendix B). In terms of outcomes, 143 projects (64%) experienced schedule delays, whereas 79 (36%) were completed on or ahead of schedule. Cost performance showed a similar pattern: 117 projects (53%) exceeded their budgets, while 105 (47%) were delivered within or below planned/contractual cost. Importantly, no two entries shared identical data or outcomes, confirming that each case represents a distinct project based on variations in planned and actual durations, budgets, and change order costs. This dataset of 222 completed projects represents a broad cross-section of the Saudi construction sector, offering a representative basis for evaluating PCS determinants and project performance across diverse organisational and project contexts.

5.2. Measurement Model Evaluation

In this study, all items exceeded the 0.70 minimum loading threshold, with loadings ranging from 0.710 to 0.904 (Table 6). These results confirm that the indicators adequately represent the respective latent constructs and that each item contributes meaningfully to its intended construct.
As shown in Table 6, all CR and Cronbach’s Alpha values fall between 0.806 and 0.949 and 0.808 and 0.950, respectively. These exceed the commonly accepted threshold of 0.70 [150,160], confirming that the items for each construct demonstrate high internal consistency. This suggests that the measurement model is stable and that the observed variables reliably capture the latent variables of interest.
As indicated in Table 6, all AVE values meet or exceed the 0.50 benchmark, with most constructs exhibiting values well above 0.70. These results provide strong evidence of convergent validity, indicating that the items within each construct are effectively capturing the same underlying concept.
As shown in Table 7, all HTMT values fall within the acceptable range, with the highest being 0.891. These results confirm that each construct in the model is sufficiently distinct from the others, ensuring that they measure conceptually different aspects of PCS effectiveness and project performance.
In conclusion, the measurement model demonstrates satisfactory reliability and validity across all evaluation criteria. The indicator loadings, CR, Cronbach’s Alpha, AVE, and HTMT values collectively confirm the robustness of the model and its suitability for further structural analysis.

5.3. Structural Model Evaluation

5.3.1. Model Fit Assessment

The SRMR values of both the saturated model (0.054) and the estimated model (0.064) fell below the 0.08 threshold, indicating satisfactory model fit and supporting the reliability of the hypothesised structure [155].

5.3.2. Collinearity Diagnostics

All reported VIF values for the inner model were below the cut-off of 5 (Table 8), ranging from 1.000 to 4.394. This indicates that the PCSD constructs in the model operate independently and are not subject to inflation from multicollinearity, allowing for reliable interpretation of regression results.

5.3.3. Path Coefficients: Significance and Relevance

The structural model demonstrated support for 17 of the 20 hypothesised relationships (Figure 6, Table 9). Significant direct effects on project performance were found for H6: Post-OCD → PP (β = 0.344, p = 0.001), H4: Pre-OCD → PP (β = 0.258, p = 0.014), and H7: UOCD → PP (β = 0.207, p = 0.010). In contrast, the path H5: In-OCD → PP was not significant (β = −0.098, p = 0.226), suggesting that its effect was mediated through the downstream mechanism of Post-OCD rather than exerted as a direct influence.
Constructs including OPCSD, HPCSD, and TPCSD showed a significant effect on Pre-OCD, In-OCD, and UOCD, while Post-OCD was influenced only by HPCSD and In-OCD. The key findings also include a strong positive path from OPCSD to HPCSD (β = 0.805, p < 0.001), and a statistically significant influence of In-OCD on Post-OCD (β = 0.594, p < 0.001). These relationships illustrate the strategic cascading nature of PCS determinants across stages. Notably, two paths (H1c: OPCSD → Post-OCD and H3c: TPCSD → Post-OCD) were not supported, with p values exceeding 0.05, indicating an absence of direct statistical association and thus suggesting their mediated influence on Post-OCD through In-OCD.

5.3.4. Coefficient of Determination (R2)

The model showed substantial explanatory power for In-OCD (R2 = 0.766) and Post-OCD (R2 = 0.738). UOCD (R2 = 0.670) was at the threshold, reflecting strong explanatory performance. HPCSD (R2 = 0.649) and Pre-OCD (R2 = 0.630) demonstrated moderate explanatory power (Figure 6).
The R2 for project performance (PP) was 0.425, suggesting moderate explanatory power and demonstrating the model’s capacity to capture meaningful variance in construction project outcomes. TPCSD had the lowest R2 value (0.414), although still exceeding the minimum threshold for interpretability.

5.3.5. Predictive Relevance (Q2)

All endogenous constructs recorded Q2 values well above zero, with several exceeding 0.5, such as HPCSD (0.645), In-OCD (0.632), and UOCD (0.558) (Table 10). The Q2 value for project performance (PP) was 0.331, confirming predictive relevance and supporting the model’s practical utility.
The model’s RMSE and MAE values also support predictive accuracy, with values consistent across constructs and indicating minimal prediction error between observed and estimated values.

5.3.6. Effect Size (f2)

The results revealed several key findings (Table 11). The most substantial effects were from OPCSD to HPCSD (f2 = 1.845) and HPCSD to TPCSD (f2 = 0.706), highlighting the foundational role of organisational structure and leadership in shaping human and technological PCS capabilities.
Additionally, In-OCD to Post-OCD (f2 = 0.357) showed a large effect, suggesting that real-time monitoring is a critical enabler of effective post-operational control. Medium effects were observed in the relationships from OPCSD to In-OCD and UOCD and from Pre-OCD to In-OCD, reflecting interdependencies across control stages. The remaining effects were classified as small but remained statistically significant in the multivariate model.

5.4. Mediation Analysis

Following [150], a robustness check was conducted by adding direct paths from OPCSD, HPCSD, and TPCSD to PP (H1–H3). All three were insignificant, and their inclusion reduced the significance of some mediators (Pre-OCD, UOCD), leaving only Post-OCD significant (see Table A3 in Appendix C). This supports the IPO framework’s full mediation assumption, so the original model without these direct links is retained for the main mediation analysis and IPMA.
The retained model revealed that full mediation was observed in several one-stage indirect pathways, as detailed in Table A4 (Appendix C). Specifically, significant effects were found for OPCSD, HPCSD, and TPCSD on PP via Pre-OCD and UOCD, supporting the IPO framework's expectation that planning and uncertainty control mechanisms mediate input–output relationships. Similarly, Post-OCD fully mediated the effect of HPCSD on PP (HPCSD → Post-OCD → PP). While the effects of OPCSD and TPCSD on Post-OCD were statistically insignificant, their influence is mediated through In-OCD.
Furthermore, the three-step sequential mediation paths, OPCSD, HPCSD, and TPCSD → Pre-OCD → In-OCD → Post-OCD → PP, were statistically supported. These sequential mediations validate the IPO model’s assumption that indirect effects are cumulative and mediated progressively through structured control processes. The presence of full mediation in these extended pathways strengthens the conceptual basis of the model by demonstrating how early-stage planning translates into final project outcomes through intermediate control stages.
Among the significant mediation paths, the sequential relationship from in-operational control to post-operational control to project performance (In-OCD → Post-OCD → PP, β = 0.205, p = 0.001) demonstrated one of the strongest indirect effects, reinforcing the theoretical structure of PCSs and validating the sequential implementation logic.
Overall, these mediation outcomes suggest that the effectiveness of PCS determinants in improving project performance is primarily indirect, requiring the activation of appropriate process mechanisms.

5.5. IPMA Results and Strategic Priorities

All analyses were conducted using SmartPLS 4.1, which automatically rescales performance values on a 0–100 scale using the formula illustrated in Figure 7.

5.5.1. Construct-Level IPMA Results

The construct-level IPMA was applied with PP as the target construct. Figure 8 presents the total effects (importance), rescaled performance scores, and ranked priorities. The highest ranked determinant was OPCSD, with an importance value of 0.507 and a performance score of 62.81, placing it as the primary area for strategic intervention. This was followed by Post-OCD, HPCSD, and Pre-OCD, ranked second, third, and fourth, respectively. UOCD and TPCSD were placed in the third-priority quadrant, while In-OCD occupied the fourth. Notably, no construct appeared in the first-priority quadrant, and UOCD and TPCSD were the lowest performing. Constructs located in the second-priority quadrant, such as OPCSD, Post-OCD, and HPCSD, remain the most critical contributors to PP and warrant sustained attention to consolidate their strategic value.

5.5.2. Formulation of the Performance Model

To quantify the contribution of each determinant to PP, a conceptual performance estimation equation is formulated as
ΔPP = 0.507 × ΔOPCSD + 0.344 × ΔPost-OCD + 0.342 × ΔHPCSD + 0.289 × ΔPre-OCD + 0.207 × ΔUOCD + 0.156 × ΔTPCSD + 0.106 × ΔIn-OCD
The coefficients represent the total effects of each construct on project performance. A one-unit increase in the performance score of a given determinant (on a 0–100 scale) is estimated to change the PP in proportion to its total effect. For example, a one-unit increase in OPCSD performance would increase project performance by 0.507 units, keeping the other determinants constant.
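As a worked illustration of the equation above, the following sketch applies the reported total effects to hypothetical improvements in determinant performance scores; the improvement values themselves are assumptions chosen purely for demonstration.

```python
# Total effects (importance values) of each determinant on PP from the construct-level IPMA
TOTAL_EFFECTS = {
    "OPCSD": 0.507, "Post-OCD": 0.344, "HPCSD": 0.342, "Pre-OCD": 0.289,
    "UOCD": 0.207, "TPCSD": 0.156, "In-OCD": 0.106,
}

def estimated_pp_change(performance_deltas):
    """Estimate the change in PP from changes in determinant performance scores (0-100 scale)."""
    return sum(TOTAL_EFFECTS[name] * delta for name, delta in performance_deltas.items())

# Hypothetical scenario: a 5-point improvement in OPCSD and a 3-point improvement in Post-OCD
print(estimated_pp_change({"OPCSD": 5, "Post-OCD": 3}))  # 0.507*5 + 0.344*3 = 3.567
```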

5.5.3. Indicator-Level IPMA Results

The indicator-level IPMA results are presented in Table 12 and Figure 9. A total of 11 indicators showed relatively high importance with lower-than-average performance, suggesting that they may offer opportunities for targeted improvement. The most frequently identified indicators were from the OPCSD, HPCSD, Pre-OCD, and Post-OCD constructs. These include the following:
  • LTS1, 2, and 3: Team members’ knowledge and expertise levels, their skills, and the manager’s technical and managerial capability.
  • PPS4: Accuracy of project estimation.
  • SEC1, 2, and 4: Assessment of stakeholder engagement, stakeholders’ understanding of their roles, and consistency and integration of control systems across stakeholders.
  • OGP3, 4, and 5: PMO involvement, audits and compliance frequency, and application of knowledge management for continuous improvement.
  • CA4: Application of schedule compression techniques.
Indicators located in the high-importance/low-performance quadrant represent the most critical operational weaknesses in current PCS implementation and should be prioritised for improvement, as enhancements here are likely to yield the greatest impact on project performance.
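For readers replicating the indicator-level analysis, quadrant assignment follows from comparing each indicator’s importance and performance against the respective mean values; the sketch below reproduces that classification logic, consistent with the quadrant labels reported in the appendix tables (the treatment of boundary values is an assumption).

```python
def priority_quadrant(importance, performance, mean_importance, mean_performance):
    """Assign an indicator to an IPMA priority quadrant relative to the mean values."""
    high_importance = importance >= mean_importance      # boundary handling is assumed
    high_performance = performance >= mean_performance
    if high_importance and not high_performance:
        return "First priority"    # high importance, low performance: improve first
    if high_importance and high_performance:
        return "Second priority"   # high importance, high performance: sustain
    if not high_importance and not high_performance:
        return "Third priority"
    return "Fourth priority"

# LTS2 against the indicator-level means for PP (0.040, 60.678) falls in the first-priority area
print(priority_quadrant(0.056, 56.907, 0.040, 60.678))  # "First priority"
```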
The sub-indicators identified as priorities for improving overall PP also highlight key areas for strengthening Pre-OCD and In-OCD. In contrast, Post-OCD and UOCD, which are critical for project performance, require further targeted interventions as they currently underperform relative to their importance (see Figure 10 and Appendix D for supporting tables and figures).
Improving post-operational controls depends on strengthening in-operational control areas such as Earned Value Analysis (EVA) (PPA1), completion forecasting (PPA2), critical path review, financial analysis (PPA4), and the accuracy of performance measurement; these KPI functions, which are essential for accurate corrective-stage control, underperformed relative to their importance in the IPMA results.
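Because Earned Value Analysis and completion forecasting are flagged as priority weaknesses, a brief reference sketch of the core EVM calculations is included below; the input figures are hypothetical and the formulas follow standard EVM practice rather than any convention specific to the surveyed projects.

```python
def evm_metrics(pv, ev, ac, bac):
    """Standard EVM indices plus a basic estimate at completion.

    pv: planned value, ev: earned value, ac: actual cost, bac: budget at completion.
    """
    cpi = ev / ac     # cost performance index
    spi = ev / pv     # schedule performance index
    eac = bac / cpi   # estimate at completion, assuming current cost efficiency persists
    return {"CPI": round(cpi, 2), "SPI": round(spi, 2), "EAC": round(eac, 1)}

# Hypothetical mid-project snapshot (monetary values in millions)
print(evm_metrics(pv=40, ev=36, ac=45, bac=100))  # {'CPI': 0.8, 'SPI': 0.9, 'EAC': 125.0}
```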
For the improvement of uncertainty control mechanisms, as shown in the IPMA results, there is a pressing need to strengthen the adoption of real-time technologies (TC2), integrated data analytics (TC1), common data environment (CDE) platforms (TC5), and BIM and visualisation tools (TC4) that support predictive and adaptive uncertainty control.

6. Discussion

6.1. Discussion of Key Findings

This section reflects on the empirical results in light of the study’s research questions and hypotheses. The findings offer a comprehensive view of how PCS determinants operate through structured control mechanisms to influence project performance in construction environments, particularly within the Saudi Arabian context.
The model reinforced the significance of early-stage and post-stage control mechanisms, Pre-OCD and Post-OCD, as key determinants. Pre-OCD demonstrated a meaningful direct impact on project performance (β = 0.258, p = 0.014), while Post-OCD emerged as the strongest direct process contributor (β = 0.344, p = 0.001). These findings demonstrated that planning and corrective actions are essential for aligning projects with cost, schedule, and scope targets, as suggested in previous PCS research [5,40]. Notably, UOCD also showed a direct positive relationship with project performance (β = 0.207, p = 0.010), reinforcing its role as a cross-phase mechanism that underpins control resilience in complex projects, as highlighted in [48,169,170].
Conversely, H5 (In-OCD → PP) was not supported: in-operational control showed no significant direct relationship with project performance (β = −0.098, p = 0.226), but its role became clearer through mediation analysis. The sequential path from In-OCD to Post-OCD to project performance was statistically significant and the strongest indirect effect (β = 0.205, p = 0.001), indicating that monitoring alone does not improve performance unless followed by effective corrective action. This finding aligns with the theoretical position of [29,70], who argued that continuous control requires a feedback loop to translate real-time insights into structured interventions. It also reinforces prior observations that real-time tracking alone is insufficient to drive improvement. Tools such as earned value management and milestone analysis must be coupled with responsive decision-making to influence outcomes [40,41,171]. Without this translation from insight to action, in-operational mechanisms risk functioning as passive monitoring tools rather than active contributors to performance enhancement.
The broader mediation structure demonstrated the IPO assumption that PCS effectiveness is largely realised through cumulative process mechanisms rather than isolated direct effects. For example, significant indirect effects from OPCSD, HPCSD, and TPCSD to PP were observed via Pre-OCD and UOCD and through full sequential mediation pathways. This has strong implications for system design, indicating that management practices should focus not just on input enablers but on enhancing their translation through integrated control stages. In terms of operational controls, the model supports earlier work by Le and Sutrisna [32], confirming that pre-, in-, and post-operational controls operate interdependently and affect each other. The mediation results reinforce the findings that project control should not be assessed in isolated stages but as a continuous feedback cycle.
Although the paths from OPCSD (H1c) and TPCSD (H3c) to Post-OCD were not supported, this should not be seen as a limitation but rather as a contribution of this research. It is important to note that this study represents the first attempt to empirically test these relationships within a unified PCS framework. While earlier studies examined determinants in isolation, the integrated model used here revealed that some direct paths are rendered insignificant once interdependencies are accounted for. This suggests that their influence is primarily indirect, operating through intermediary mechanisms such as Pre-OCD, In-OCD, or HPCSD, or shaped by contextual constraints. This interpretation aligns with the results of Vanhoucke [172] and Olawale and Sun [41], who suggested that corrective actions are largely decentralised, with greater reliance on team-level autonomy and project managers’ real-time judgement. It also resonates with Ong and Bahar [173], who found no significant relationship between top management support and project management effectiveness. The findings therefore indicate that post-operational control mechanisms, particularly project reporting and corrective actions, are more dependent on team-level interpretation and real-time decision-making than on hierarchical directives. This diverges from earlier control phases, where governance and stakeholder oversight play a more prominent role. Given the potential implications for responsiveness to project performance deviations, this shift in control emphasis warrants further investigation. Future research could explore how team-level autonomy in post-operational controls affects overall project performance, and whether this balance between decentralised decision-making and formal oversight varies across different project types, scales, or cultural contexts.
Likewise, TPCSD (H3c) showed no significant direct effect on post-operational control. This finding reinforces broader concerns that digital tools, while effective in planning and tracking, may offer limited utility in reflective analysis and corrective decision-making without human interpretation. However, the significant effects observed on pre-, in-, and uncertainty-oriented controls align with prior studies. For example, BIM–CDE has been found to strengthen pre-operational controls by enhancing baseline reliability and estimation accuracy [44,61,62], while AI-driven tools have improved resource and cost estimation practices [174]. For in-operational control, real-time monitoring, 4D/5D visualisations, and automated deviation detection support early warning systems that enable timely adjustments that may inform post-control mechanisms [45,63]. Similarly, digital platforms have been found to improve risk identification and adaptive decision-making under uncertainty [44,63]. These findings suggest that although technology significantly improves responsiveness and data generation in pre-, in-, and uncertainty-related controls [115], its effectiveness in post-control remains constrained [40,175]. As Amer et al. [176] observed, digital tools can support corrective action mechanisms when integrated with human judgement to ensure effective implementation. Without such integration, digital outputs risk being underutilised, failing to inform meaningful corrective actions and project adjustments.
The model’s explanatory power further reflects this complexity. The R2 value of 0.425 indicates that project performance is moderately explained by the proposed framework. This suggests that additional factors, including project characteristics such as size, type, and delivery method, as well as broader external influences like market volatility, contractual structures, and institutional or cultural conditions, also play a role in shaping outcomes. Acknowledging these influences is important for interpreting the effectiveness of a PCS as part of a broader socio-technical system rather than a closed managerial mechanism. Future research could extend this work by explicitly incorporating project characteristics and contextual variables to examine how they interact with PCS determinants in shaping performance outcomes.
The IPMA results further reinforced the centrality of organisational determinants, with OPCSD ranked highest in terms of the total effect (0.507) on project outcomes. This supports prior findings [7,40], emphasising governance, top management, and stakeholder integration as the backbone of effective PCS. Post-OCD and HPCSD followed closely, illustrating the importance of project reporting, corrective action protocols, leadership, skilled teams, and collaboration in translating frameworks into performance [2,32]. Although TPCSD recorded the lowest performance among all constructs in the IPMA results, its significant effects on pre-, in-, and uncertainty controls indicate that it plays an essential role in shaping operational processes. Strengthening technological competencies should therefore be prioritised, particularly for practitioners and policymakers aiming to improve operational control stages and overall project performance.
At the indicator level, IPMA highlighted several underperforming items as priority areas for improving PP, notably leadership and team competency, estimation accuracy, stakeholder coordination and integration, corrective scheduling actions, and governance mechanisms. These findings are consistent with both local and global evidence that emphasise the critical role of these factors in strengthening project control and delivery outcomes. Notably, TPCSD contributes directly to several of these priority areas. For instance, BIM–CDE’s modelling and integration capabilities enhance estimation accuracy through reliable quantity take-off, resource planning, and 4D/5D simulations [177,178]. Its shared data environment also improves stakeholder coordination and strengthens information flow across fragmented project teams [44,60,179]. When supported by robust governance and skilled teams, BIM–CDE can serve as an integrated control framework that fosters transparency, consistency, and adaptability in managing project performance.
First, strengthening leadership and team competency (LTS1–3) is critical. This finding aligns with local studies [16,40,170] and is reinforced globally in contexts such as Taiwan, Indonesia, Vietnam, China, Iran, and New Zealand [32,169,180,181]. The IPMA results showed that team skills, knowledge and expertise levels, and managerial capacity are of high importance but underperforming, highlighting the need for improvement to enhance PP. In the Saudi context, Aldossari [182] emphasised that soft skills, particularly leadership, problem-solving, decision-making, and teamwork, are vital for managing public construction projects. This is consistent with Mahamid’s study [35], which analysed 55 infrastructure projects (2011–2015) and identified human behaviour and competency among the top causes of project delays, recommending a greater focus on managerial skills and human systems. Several studies further stress that shortages of skilled manpower have long been a root cause of these weaknesses [1,183].
This concern extends beyond Saudi Arabia: a lack of competency among planners, estimators, and managers has been widely reported [4], with insufficient training and development programmes noted in Morocco [184] and Saudi Arabia [185,186]. Addressing these gaps requires training that goes beyond leadership, teamwork, communication, and technical skills to include digital and methodological competencies in data analytics, BIM–CDE, and AI. Embedding digital literacy into capacity-building efforts ensures that project staff can effectively apply modelling, estimation, and collaborative control processes in PCS practice. As Othayman et al. [187] emphasise, training must be well-designed and effectively delivered; extending this to digital platforms would help translate technological capabilities into practical outcomes.
Second, it is also essential to improve estimation accuracy (PPS4). Evidence from Saudi Arabia highlights this issue [188,189,190,191,192], which is further emphasised in the global literature [5,41,174,181,193]. For example, in a study of 27 construction projects completed between 2012 and 2022 in Saudi Arabia, Aldossari [16] identified inaccurate estimation as one of the top causes of poor scheduling and financial performance. Mistakes in cost estimation can delay or cancel projects, reduce project scope, and create substantial financial risks for both owners and contractors [194]. Such errors undermine the achievement of Saudi Arabia’s construction initiatives, particularly those that must be delivered within strict time and cost limits [188].
Globally, poor estimation practices are equally widespread and persistent [4]. Studies emphasised that inaccurate estimation remains a global phenomenon that contributes to misleading cost–benefit analyses, flawed decision-making, and misallocation of resources in construction projects [3,36,193]. A study of Malaysian projects by Yap et al. [195] confirmed that inaccurate resource estimates were a major cause of time–cost overruns and highlighted the importance of effective knowledge management practices in addressing the problem. While knowledge, experience, and skilled estimators are fundamental to accuracy, recent advances in artificial intelligence and machine learning have provided effective solutions. Several studies have demonstrated the potential of these tools to improve the precision of estimation and forecasting [174,196]. Policymakers should therefore consider establishing a national platform that consolidates data from completed projects, including duration, cost, and resources. Such a database would support more reliable estimation practices. At the project level, managers are encouraged to adopt AI-based estimating and predictive tools to enhance the quality and reliability of their estimates.
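To illustrate how AI-based estimating tools of the kind recommended here might be applied, the following is a minimal sketch of training a regression model on historical project records; the file name, column names, and feature set are hypothetical placeholders for the kind of national project database proposed above.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_percentage_error
from sklearn.model_selection import train_test_split

# Hypothetical historical records: one row per completed project
df = pd.read_csv("completed_projects.csv")                                    # assumed file
features = ["gross_floor_area", "duration_months", "project_type", "region"]  # assumed columns
X = pd.get_dummies(df[features])   # one-hot encode categorical fields
y = df["final_cost"]               # assumed target column

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
print("MAPE:", mean_absolute_percentage_error(y_test, model.predict(X_test)))
```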
Third, governance mechanisms (OGP3, 4, and 5) require greater attention, which is highlighted in Saudi research [8,197] and corroborated internationally [5,32,92,96,97]. These mechanisms encompass PMO involvement, the frequency of audits and evaluations, and the application of knowledge management for continuous improvement. In Saudi Arabia, Jawad et al. [8] emphasised that effective governance, supported by clear responsibilities and standardised processes, is critical to achieving scope, schedule, and cost objectives. Similarly, Alghuried [198] identified compliance and governance as among the top success factors critical to project delivery in Saudi Arabia, particularly in the context of sustainable management.
The role of the PMO has been repeatedly underlined in both local and global studies as a cornerstone of effective governance. Research in Saudi Arabia recommends further strengthening PMO capacity and engagement to improve project outcomes [14,197,199]. Strong governance structures, supported by capable PMOs, enable effective project control by clarifying roles, ensuring accountability, aligning projects with strategic objectives, and safeguarding delivery on scope, schedule, and cost [50]. Beyond governance structures, institutionalising knowledge management through learning, experience sharing, and disseminating best practices provides long-term advantages. It drives continuous improvement, reduces risks across people, processes, and tools, and enhances the overall maturity of PCS practices [92,195]. Accordingly, this study suggests that greater attention should be directed toward improving PMO engagement, strengthening audit and evaluation mechanisms, and embedding knowledge management practices to ensure that PCS implementation achieves sustained improvement and contributes meaningfully to the effectiveness of project delivery.
Fourth, stakeholder coordination and integration (SEC1, 2 and 4) were identified as key deficiencies, consistent with the results of local studies [13,40,139,185] and widely recognised in international contexts such as Singapore, Qatar, China, Iran, and India [100,200,201,202,203]. In Saudi Arabia, Alenazi et al. [204] examined 37 projects completed between 2007 and 2017 and identified ineffective stakeholder coordination as a leading cause of delays. Similarly, Allahaim and Liu [205] reported that poor coordination ranked among the top five causes of cost overruns in large infrastructure projects, an issue also noted in other developing economies. Several other Saudi studies confirm that weak communication and coordination among project parties remain persistent challenges to timely and cost-effective delivery [15,206]. Globally, fragmented stakeholder coordination consistently ranks among the top causes of project delays, as highlighted by Sanni-Anibire et al. [193]. Large construction projects, in particular, often struggle with communication across diverse stakeholder groups, but evidence suggests that early establishment of dedicated communication teams can improve alignment and reduce errors [207]. Collectively, these findings underscore the importance of improving stakeholder integration, strengthening interaction across project parties, and ensuring consistency in control systems. Such improvements represent clear leverage points for enhancing project performance within the Saudi construction context.
These deficiencies reflect sector-specific challenges in Saudi Arabia, including shortages of qualified workers, inaccurate estimation, poor coordination across stakeholders, and a lack of governance maturity [40,183,204]. Comparable challenges have also been reported internationally, where workforce competency and estimation practices are repeatedly highlighted as determinants of successful project delivery [32,208,209]. This suggests that these challenges are not unique to Saudi Arabia but part of broader global project management concerns. This comparison reinforces that while the PCS framework is contextually grounded in Saudi Arabia, its insights and recommendations are transferable to other regions facing similar systemic challenges.
For Post-OCD, IPMA highlighted that deficiencies in project performance analysis (PPA1–5) under In-OCD are the first priorities for improvement. These include the application of EV metrics, forecasting models, critical path analysis, financial reviews, and accurate performance measurement [5,7,40,210]. Addressing these gaps is essential, as they constrain the analytical depth required to support timely and informed corrective actions [41,112,118,128]. Similarly, for UOCD, IPMA results pointed to technological integration (TC1, 2, 4, 5) under TPCSD as the highest-priority improvement area. The limited adoption of real-time monitoring, integrated analytics, BIM, and digital collaboration platforms [7,61,188,211] reflects a disconnect between available digital capabilities and their actual deployment. This finding aligns with broader evidence that technology can play a critical role in enhancing risk and change control [179,212,213,214], but its adoption in Saudi construction projects remains constrained by financial limitations, lack of technical expertise, and cultural barriers [188,211,215]. Strengthening these priority areas can therefore improve not only In-OCD and TPCSD but also, through them, the performance of Post-OCD, UOCD, and ultimately the overall PCS.
When compared with existing PCS studies, these findings highlight the added value of an integrated framework that captures sequential mediation effects and the interdependence of control processes. Rather than treating PCS determinants as independent levers, the model underscores their cumulative influence through structured feedback cycles. This expands on earlier PCS studies by showing how inputs must be channelled through process mechanisms to achieve measurable performance outcomes. Although this study is situated within the Saudi Arabian construction sector, the findings have broader relevance. Similar challenges of weak governance integration, poor estimation, limited digital maturity, and fragmented stakeholder coordination have been reported in other emerging economies, suggesting partial transferability of the results. In such contexts, adopting BIM–CDE platforms offers a structured approach to integrate project modelling, estimation practices, planning, real-time monitoring, and team collaboration within a single data environment. Strengthening PMOs and institutionalising knowledge-sharing practices can further help address persistent gaps in digital maturity, governance, and stakeholder alignment. At the same time, institutional and cultural differences may moderate these effects, underscoring the need for comparative studies to confirm the framework’s wider applicability.
In summary, the empirical results support that PCS effectiveness stems from interdependent sequential processes initiated by strategic enablers and carried through structured control cycles. Organisational, human and technological foundations are essential, while the contribution of process-based controls is realised through their integration and timely execution. The unsupported direct paths (H1c, H3c, and H5) are not anomalies but reveal that certain determinants exert their effects indirectly, mediated by earlier control mechanisms or shaped by contextual constraints. This study is the first to test these relationships in a unified PCS model, and the results provide a robust evidence base for refining both theoretical knowledge and practical applications of PCSs in complex construction environments.

6.2. Theoretical Contributions

This study contributes to the theoretical development of PCSs by empirically testing a unified model grounded in the IPO framework. To the best of the authors’ knowledge, this is the first attempt to empirically test an integrated PCS model that combines organisational, human, and technological determinants with operational control mechanisms and project performance within a single empirical structure. Previous research has often examined PCS effectiveness through fragmented perspectives, focusing on single stages or individual determinants, which limited integration. In contrast, this model advances the theoretical foundations of PCSs by reinforcing the sequential and interdependent relationships that characterise the control cycle in construction projects and by showing how upstream enablers translate into downstream control practices and outcomes.
The empirical findings showed that the effectiveness of PCSs is shaped by the structured interaction of their underlying determinants rather than by isolated improvements. The analysis showed that organisational, human, and technological factors influence operational control phases in a sequential and cumulative manner. Greater implementation of upstream determinants consistently contributed to improvements in subsequent downstream determinant components such as pre-, in- and post-control actions, uncertainty control, and, ultimately, project performance. This emphasis on indirect and sequential effects also explains why some hypothesised direct relationships (for example, OPCSD → Post-OCD, TPCSD → Post-OCD, and In-OCD → PP) were not supported. Their lack of significance highlights that PCS effectiveness cannot be assumed from top-down governance or technology deployment alone; instead, their impact depends on translation through process mechanisms and team-level action. This structural configuration makes it possible to identify the determinants that shape each construct and to trace their measurable effects across the model, providing a detailed account of how each determinant contributes to the system as a whole. It also highlights the need to prioritise areas with the greatest overall influence on PCS effectiveness, with OPCSD exerting the highest total effect, Post-OCD exerting the strongest direct effect on project performance, and In-OCD exerting the strongest indirect effect through Post-OCD. These insights strengthen the explanatory foundations of PCS research and clarify where interventions are most likely to deliver broad and system-wide benefits.

6.3. Practical Implications

These findings offer a structured foundation for project managers and decision-makers aiming to improve PCS effectiveness in construction project delivery. The validated model suggests that strengthening organisational, human, technological, and operational control determinants is central to achieving higher project performance.
In practical terms, control strategies should begin by reinforcing OPCSD, including top management commitment, stakeholder coordination, and governance mechanisms, as this construct exerts the strongest upstream influence on operational control functions. Within this construct, stakeholder interaction in information sharing and decision-making, consistency of control systems across groups, and top management’s commitment to control principles emerged as the top indicators, underscoring their role in shaping organisational control maturity. HPCSD was also shown to be a significant driver of all operational controls (Pre-, In-, Post-OCD, and UOCD). The highest-ranked indicators reflected human competencies such as technical proficiency and independence, domain-specific expertise, and collaboration and communication. These findings emphasise the operational role of skilled, communicative personnel in PCS execution, making capability development in these areas a key priority. TPCSD was identified as another important lever for Pre-, In-, and UOCD effectiveness. Its top indicators included the use of real-time monitoring technologies, adoption of a common data environment, and integration of data analytics into project control processes. These results highlight the strategic role of digital intelligence in supporting effective PCS implementation.
Post-OCD, which represents reporting and corrective mechanisms, demonstrated the strongest direct association with project performance. Its key indicators included the accessibility and clarity of reports, reporting frequency, and clarity of communication protocols, all of which capture the system’s corrective capacity. Similarly, Pre-OCD was a significant predictor of both project performance and In-OCD, with top indicators such as alignment of performance metrics with objectives, accuracy of milestones and schedules, and the precision of the Work Breakdown Structure reinforcing the importance of robust upfront planning. In-OCD, focusing on progress tracking and analysis, exerted its strongest influence on Post-OCD, with top indicators including monitoring deliverables against quality benchmarks, reviewing critical path activities, and frequent on-site inspections. Together, these emphasise the need for continuous performance verification during execution. UOCD also showed a significant direct effect on project performance, with top indicators focusing on risk monitoring protocols, mitigation strategies, and contingency fund allocation, reflecting readiness under uncertain project conditions.
These findings complement the overall construct-level reliability and validity by specifying which sub-determinants drive each dimension of the PCS framework. They offer a more detailed lens for both researchers and practitioners seeking to refine or implement targeted improvement strategies across control domains.
The IPMA results provide a practical roadmap for implementing improvements by identifying areas of high importance but relative underperformance. Project managers can use IPMA as a diagnostic instrument to prioritise resources by focusing on leadership and skill development, improving estimation accuracy, enhancing stakeholder engagement and system consistency, applying schedule compression techniques, and strengthening governance mechanisms. For Post-OCD, improvements in performance tracking systems, such as earned value analysis, forecasting, critical reviews, and KPIs, are particularly critical, while enhancing UOCD requires prioritising technological integration (e.g., real-time monitoring technologies, integrated data analytics, and BIM-CDE platforms) alongside proactive risk and change response protocols. These actions transform IPMA results from diagnostic insights into an actionable implementation plan.
At a broader level, policymakers, regulators, and industry leaders play a critical role in strengthening PCS effectiveness beyond individual projects. They can establish governance frameworks and support them with a comprehensive, regularly updated national or regional database of completed project outcomes, supplier prices, and standard productivity rates. Such a database would provide sector-wide benchmarks and reliable reference points for improving estimation accuracy and resource planning. Incentivising digital adoption and investing in capability building, particularly in digital literacy and leadership skills for project managers and controllers, are equally important. Finally, aligning regulatory frameworks, such as procurement standards and data governance, with technological advances would create an enabling environment for the adoption of artificial intelligence, machine learning, and other emerging technologies in PCS practices. Although developed in the Saudi context, these recommendations are equally relevant to other emerging economies facing similar challenges.

7. Limitations and Future Research Recommendations

Several limitations should be acknowledged. First, the cross-sectional design of this study limits insights into how PCSDs evolve across the project lifecycle. Second, while self-reported data provided valuable firsthand insights, it may be affected by recall or social desirability bias, although steps such as clear question framing and confidentiality assurances were applied to mitigate these risks. Third, the sample was drawn exclusively from the Saudi construction sector, potentially limiting generalisability to other contexts or industries. Fourth, the study did not account for broader contextual variables such as economic instability, political and regulatory changes, or project-specific characteristics. Nor did it differentiate between organisations at varying levels of project management maturity, which may influence PCS implementation outcomes. The moderate R2 value of 0.425 for project performance indicates a reasonable level of explanatory power. At the same time, it suggests that external and contextual elements may account for some of the unexplained variance. Their inclusion could strengthen the predictive capacity of future models.
Future research could address these limitations in several ways. Longitudinal studies could trace PCS dynamics across time and project phases to capture temporal variations in effectiveness. Comparative research across countries, industries, and organisational maturity levels would also help validate the framework’s wider applicability and clarify which findings are context-specific. The interaction between internal control structures and external uncertainties, whether political, economic, or regulatory, deserves closer scrutiny, particularly in light of increasingly complex global risk environments. Finally, while this study relied on PLS-SEM and IPMA to examine the determinants of PCS effectiveness, future research could complement these methods with artificial intelligence techniques. In particular, machine learning models could be trained on project datasets to predict specific performance outcomes, such as cost, time, or scope adherence. This would extend the explanatory insights of PLS-SEM by adding a predictive dimension, offering practitioners early warning tools to identify projects at risk of delay or overrun.
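One possible form of the predictive extension described above is sketched below: a classifier trained on completed-project records to flag projects at risk of delay. The dataset, field names, and the 10% delay threshold are all assumptions for illustration only.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical dataset of completed projects with PCS-related predictors
df = pd.read_csv("pcs_project_outcomes.csv")                  # assumed file
predictors = ["estimation_accuracy", "pmo_engagement",        # assumed survey-derived scores
              "stakeholder_integration", "team_competency"]
X = df[predictors]
y = (df["schedule_overrun_pct"] > 10).astype(int)             # assumed label: overrun beyond 10%

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("Cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```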

8. Conclusions

This study developed and tested an integrated model for assessing the effectiveness of PCSs in the construction sector using the IPO framework. Drawing on organisational, human, and technological determinants as inputs, it analysed how these components interact with operational control determinants to shape project performance outcomes. Its significance lies in combining multilevel analytical methods to construct and empirically examine a dynamic IPO-informed PCS framework, extending conventional control models by incorporating the temporal and functional intricacies of real-world control endeavours. In doing so, this study fills a significant gap in the literature and contributes theoretical insight into the dynamics of project control in high-uncertainty, complex contexts such as those in KSA.
The model was empirically tested through PLS-SEM, followed by IPMA using the results of a survey of participants in 222 completed projects in Saudi Arabia. It addresses persistent project control issues in the Saudi construction context, where 64% of projects encounter delays and 53% suffer from cost overruns. These figures underscore the need for stronger control mechanisms. Results showed that PCS effectiveness is not shaped by isolated factors but by interdependent pathways. Seventeen out of twenty hypothesised relationships were supported, including strong paths such as Post-operational control determinants → Project performance, In-operational control determinants → Post-operational control determinants, Organisational and Technological PCS determinants → Pre-, In-, and Uncertainty-related controls, and Human PCS determinants → all operational control determinants. Mediation analysis established that indirect effects through structured control stages are critical; the pathway of In-operational control determinants → Post-operational control determinants → Project performance exhibited the most significant indirect effects. IPMA highlighted organisational, human, and pre- and post-operational controls as high-impact priorities for improvement. At the same time, it revealed underperformance in technological and in-operational determinants that require targeted strengthening, particularly to enhance post-operational and uncertainty controls.
Together, these findings reinforce the strategic value of integrating input-level enablers with staged operational controls. They demonstrate that effective PCS implementation depends not only on the presence of organisational, human, and technological factors but also on how these factors interact and transition across control stages. This strengthens the empirical basis for PCS evaluation and offers a model that better reflects the realities of complex project delivery.
Beyond academic contributions, the study provides practical guidance for improving PCS implementation. Project managers and firms are encouraged to strengthen estimation accuracy, leadership and team competency, planning alignment, and decision-making capacity. These efforts should be supported by effective governance mechanisms, digital tools, uncertainty response protocols, and performance tracking systems. IPMA results highlighted weaknesses in estimation, coordination, and digital maturity, with technological PCS determinants significantly influencing pre-, in-, and uncertainty controls. This study affirms BIM–CDE’s potential as a unifying framework that addresses critical operational control gaps, particularly those related to estimation accuracy, coordination, and digital integration. It also supports more responsive and structured implementation of PCSs. Policymakers and industry leaders are likewise encouraged to promote governance standards, invest in capability building, encourage technology adoption, and foster knowledge exchange platforms. Although grounded in the Saudi Arabian context, these insights are relevant to other emerging economies that face comparable challenges of fragmented coordination, uneven digital maturity, and governance gaps. Overall, the study contributes a theory-informed and empirically validated framework that clarifies the structural dynamics of PCS effectiveness and provides actionable strategies for enhancing project delivery in complex construction environments.

Author Contributions

Writing—original draft, R.A.; Writing—review & editing, M.S. and R.S.; Supervision, M.S. and R.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data presented in this study are available upon request.

Acknowledgments

The authors gratefully acknowledge the late Francis Edum-Fotwe for his early involvement in this research. Thanks are also extended to all questionnaire participants, the contributing professionals, and the Editor and anonymous reviewers for their valuable comments.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

PCS: Project Control System
PCSDs: Project Control System Determinants
IPO: Input–Process–Output
PLS-SEM: Partial Least Squares Structural Equation Modelling
PP: Project Performance
OPCSD: Organisational Project Control System Determinant
HPCSD: Human Project Control System Determinant
TPCSD: Technological Project Control System Determinant
Pre-OCD: Pre-Operational Control Determinants
In-OCD: In-Operational Control Determinants
Post-OCD: Post-Operational Control Determinants
UOCD: Uncertainty Operational Control Determinants
TMS: Top management support
OGP: Oversight and Governance Programmes
SEC: Stakeholder Engagement Coordination
TWC: Teamwork and Collaboration
LTS: Leadership and Team Skills
TC: Technological Competency
PPS: Project Planning and Scheduling
CC: Change Control
RC: Risk Control
PPT: Project Progress Tracking
PPA: Project Performance Analysis
PCR: Project Communication and Reporting
CA: Corrective Actions

Appendix A. Pilot Study Respondents

Table A1. Respondent and Project Profile of Pilot Study.
Variable | Category | Number | Percentage (%) | Cumulative Percentage (%)
Years of experience | 11–15 years | 3 | 8.8 | 8.8
 | 16–20 years | 8 | 23.5 | 32.4
 | 20 and above | 23 | 67.6 | 100.0
Respondent’s position | Project manager level | 31 | 91.2 | 91.2
 | Contract manager | 3 | 8.8 | 100.0
Project type | Commercial | 1 | 2.9 | 2.9
 | Industrial | 1 | 2.9 | 5.9
 | Infrastructure | 9 | 26.5 | 32.4
 | Residential | 19 | 55.9 | 88.2
 | Services | 4 | 11.8 | 100.0
Project sector | Private | 11 | 32.4 | 32.4
 | Public | 23 | 67.6 | 100.0
Organisation role | Client | 10 | 29.4 | 29.4
 | Consultant | 18 | 52.9 | 82.4
 | Contractor | 6 | 17.6 | 100.0

Appendix B. Descriptive Statistics

Table A2. Descriptive statistics of PCSDs and PP indicators.
Indicators | N | Minimum | Maximum | Mean | Std. Deviation
Schedule performance | 222 | 1 | 5 | 3.29 | 1.188
Cost performance | 222 | 1 | 5 | 3.53 | 1.053
Scope performance | 222 | 1 | 5 | 3.73 | 1.006
TMS1 | 222 | 1 | 5 | 3.59 | 1.054
TMS2 | 222 | 1 | 5 | 3.59 | 1.084
TMS3 | 222 | 1 | 5 | 3.64 | 1.057
TMS4 | 222 | 1 | 5 | 3.67 | 1.096
OGP1 | 222 | 1 | 5 | 3.71 | 0.970
OGP2 | 222 | 1 | 5 | 3.72 | 0.935
OGP3 | 222 | 1 | 5 | 3.33 | 1.104
OGP4 | 222 | 1 | 5 | 3.34 | 1.088
OGP5 | 222 | 1 | 5 | 3.31 | 1.116
SEC1 | 222 | 1 | 5 | 3.32 | 1.047
SEC2 | 222 | 1 | 5 | 3.41 | 1.072
SEC3 | 222 | 1 | 5 | 3.49 | 1.071
SEC4 | 222 | 1 | 5 | 3.41 | 1.015
TWC1 | 222 | 1 | 5 | 3.46 | 1.202
TWC2 | 222 | 1 | 5 | 3.49 | 1.174
TWC3 | 222 | 1 | 5 | 3.67 | 0.978
TWC4 | 222 | 1 | 5 | 3.74 | 1.060
LTS1 | 222 | 1 | 5 | 3.36 | 1.118
LTS2 | 222 | 1 | 5 | 3.28 | 1.148
LTS3 | 222 | 1 | 5 | 3.32 | 1.152
LTS4 | 222 | 1 | 5 | 3.64 | 0.968
TC1 | 222 | 1 | 5 | 3.23 | 1.094
TC2 | 222 | 1 | 5 | 3.09 | 1.167
TC3 | 222 | 1 | 5 | 2.51 | 1.297
TC4 | 222 | 1 | 5 | 2.86 | 1.244
TC5 | 222 | 1 | 5 | 3.12 | 1.137
PPS1 | 222 | 1 | 5 | 3.80 | 0.965
PPS2 | 222 | 1 | 5 | 3.68 | 0.923
PPS3 | 222 | 1 | 5 | 3.53 | 0.935
PPS4 | 222 | 1 | 5 | 3.36 | 1.053
PPS5 | 222 | 1 | 5 | 3.54 | 0.935
PPS6 | 222 | 1 | 5 | 3.44 | 0.995
CCM1 | 222 | 1 | 5 | 3.43 | 1.021
CCM2 | 222 | 1 | 5 | 3.27 | 1.200
CCM3 | 222 | 1 | 5 | 3.50 | 1.088
CCM4 | 222 | 1 | 5 | 3.54 | 1.079
RM1 | 222 | 1 | 5 | 2.98 | 1.173
RM2 | 222 | 1 | 5 | 3.09 | 1.167
RM3 | 222 | 1 | 5 | 3.02 | 1.220
RM4 | 222 | 1 | 5 | 3.09 | 1.107
PPT1 | 222 | 1 | 5 | 3.64 | 1.037
PPT2 | 222 | 1 | 5 | 3.64 | 1.005
PPT3 | 222 | 1 | 5 | 3.68 | 0.952
PPT4 | 222 | 1 | 5 | 3.55 | 0.930
PPT5 | 222 | 1 | 5 | 3.62 | 0.971
PPT6 | 222 | 1 | 5 | 3.58 | 0.957
PPA1 | 222 | 1 | 5 | 3.30 | 1.077
PPA2 | 222 | 1 | 5 | 3.27 | 1.079
PPA3 | 222 | 1 | 5 | 3.34 | 1.096
PPA4 | 222 | 1 | 5 | 3.21 | 1.116
PPA5 | 222 | 1 | 5 | 3.28 | 1.053
PCR1 | 222 | 1 | 5 | 3.58 | 0.984
PCR2 | 222 | 1 | 5 | 3.74 | 0.980
PCR3 | 222 | 1 | 5 | 3.67 | 1.005
PCR4 | 222 | 1 | 5 | 3.63 | 1.016
CA1 | 222 | 1 | 5 | 3.68 | 0.988
CA2 | 222 | 1 | 5 | 3.45 | 1.078
CA3 | 222 | 1 | 5 | 3.44 | 1.043
CA4 | 222 | 1 | 5 | 3.32 | 1.086

Appendix C. Mediation Analysis

Table A3. Robustness Check Including OPCSD, HPCSD, and TPCSD → PP Paths.
H | Relationships | Path Coefficient (B) | T Statistics (|O/STDEV|) | p Values | Significance (p < 0.05)
H1 | OPCSD → PP | 0.185 | 1.526 | 0.064 | Insignificant
H1a | OPCSD → Pre-OCD | 0.337 | 4.288 | 0.000 | Significant
H1b | OPCSD → In-OCD | 0.304 | 4.675 | 0.000 | Significant
H1c | OPCSD → Post-OCD | 0.071 | 0.883 | 0.189 | Insignificant
H1d | OPCSD → UOCD | 0.371 | 4.556 | 0.000 | Significant
H1e | OPCSD → HPCSD | 0.805 | 31.152 | 0.000 | Significant
H2 | HPCSD → PP | 0.062 | 0.491 | 0.312 | Insignificant
H2a | HPCSD → Pre-OCD | 0.312 | 3.712 | 0.000 | Significant
H2b | HPCSD → In-OCD | 0.196 | 2.694 | 0.004 | Significant
H2c | HPCSD → Post-OCD | 0.246 | 3.049 | 0.001 | Significant
H2d | HPCSD → UOCD | 0.223 | 2.856 | 0.002 | Significant
H2e | HPCSD → TPCSD | 0.643 | 13.863 | 0.000 | Significant
H3 | TPCSD → PP | 0.036 | 0.266 | 0.395 | Insignificant
H3a | TPCSD → Pre-OCD | 0.237 | 3.816 | 0.000 | Significant
H3b | TPCSD → In-OCD | 0.201 | 3.215 | 0.001 | Significant
H3c | TPCSD → Post-OCD | −0.003 | 0.069 | 0.473 | Insignificant
H3d | TPCSD → UOCD | 0.326 | 4.441 | 0.000 | Significant
H4 | Pre-OCD → PP | 0.202 | 1.61 | 0.054 | Insignificant
H5 | In-OCD → PP | −0.191 | 1.328 | 0.092 | Insignificant
H6 | Post-OCD → PP | 0.299 | 2.732 | 0.003 | Significant
H7 | UOCD → PP | 0.15 | 1.304 | 0.096 | Significant
H8 | Pre-OCD → In-OCD | 0.291 | 4.112 | 0.000 | Significant
H9 | In-OCD → Post-OCD | 0.594 | 8.565 | 0.000 | Significant
Note: This robustness check examines the direct effects of OPCSD, HPCSD, and TPCSD on PP to verify mediation assumptions.
Table A4. Significance analysis of mediation effect.
H | Relationships | Path Coefficient (B) | p Values | Significance (p < 0.05) | Mediation Type
H10a | OPCSD → Pre-OCD → PP | 0.087 | 0.028 | Significant | Full mediation
H10b | HPCSD → Pre-OCD → PP | 0.080 | 0.043 | Significant | Full mediation
H10c | TPCSD → Pre-OCD → PP | 0.061 | 0.021 | Significant | Full mediation
H11a | OPCSD → In-OCD → PP | −0.030 | 0.231 | Insignificant | Not supported
H11b | HPCSD → In-OCD → PP | −0.019 | 0.244 | Insignificant | Not supported
H11c | TPCSD → In-OCD → PP | −0.020 | 0.239 | Insignificant | Not supported
H12a | OPCSD → Post-OCD → PP | 0.024 | 0.203 | Insignificant | Not supported
H12b | HPCSD → Post-OCD → PP | 0.085 | 0.028 | Significant | Full mediation
H13c | TPCSD → Post-OCD → PP | −0.001 | 0.473 | Insignificant | Not supported
H14a | OPCSD → UOCD → PP | 0.077 | 0.028 | Significant | Full mediation
H14b | HPCSD → UOCD → PP | 0.046 | 0.018 | Significant | Full mediation
H14c | TPCSD → UOCD → PP | 0.067 | 0.027 | Significant | Full mediation
H15a | In-OCD → Post-OCD → PP | 0.205 | 0.001 | Significant | Full mediation
H15b | Pre-OCD → In-OCD → PP | −0.029 | 0.243 | Insignificant | Not supported
H15c | Pre-OCD → In-OCD → Post-OCD → PP | 0.059 | 0.005 | Significant | Full mediation
H16a | OPCSD → Pre-OCD → In-OCD → PP | −0.01 | 0.257 | Insignificant | Not supported
H16b | HPCSD → Pre-OCD → In-OCD → PP | −0.009 | 0.256 | Insignificant | Not supported
H16c | TPCSD → Pre-OCD → In-OCD → PP | −0.007 | 0.248 | Insignificant | Not supported
H17a | OPCSD → In-OCD → Post-OCD → PP | 0.062 | 0.004 | Significant | Full mediation
H17b | HPCSD → In-OCD → Post-OCD → PP | 0.04 | 0.027 | Significant | Full mediation
H17c | TPCSD → In-OCD → Post-OCD → PP | 0.041 | 0.015 | Significant | Full mediation
H18a | OPCSD → Pre-OCD → In-OCD → Post-OCD → PP | 0.020 | 0.017 | Significant | Full mediation
H18b | HPCSD → Pre-OCD → In-OCD → Post-OCD → PP | 0.019 | 0.021 | Significant | Full mediation
H18c | TPCSD → Pre-OCD → In-OCD → Post-OCD → PP | 0.014 | 0.016 | Significant | Full mediation

Appendix D. Importance–Performance Values

Table A5. Importance–performance values (indicator level) of the target construct project performance (PP).
Indicators | Importance | Performance | Improvement Priority Quadrants
PPB6 | 0.060 | 60.923 | Second priority area
PPB5 | 0.058 | 63.401 | Second priority area
PPB3 | 0.057 | 63.288 | Second priority area
PPB2 | 0.056 | 67.005 | Second priority area
CA3 | 0.056 | 60.923 | Second priority area
LTS2 | 0.056 | 56.907 | First priority area
PPB4 | 0.055 | 59.009 | First priority area
PCSR1 | 0.055 | 64.527 | Second priority area
SEC4 | 0.054 | 60.135 | First priority area
PCSR3 | 0.054 | 66.779 | Second priority area
TMS4 | 0.054 | 66.667 | Second priority area
CA2 | 0.053 | 61.149 | Second priority area
TWC3 | 0.053 | 66.667 | Second priority area
SEC3 | 0.053 | 62.275 | Second priority area
PPB1 | 0.053 | 69.932 | Second priority area
TMS3 | 0.052 | 66.104 | Second priority area
TMS2 | 0.052 | 64.865 | Second priority area
PCSR4 | 0.052 | 65.653 | Second priority area
TWC1 | 0.052 | 61.411 | Second priority area
PCSR2 | 0.052 | 68.581 | Second priority area
LTS1 | 0.051 | 59.009 | First priority area
TWC2 | 0.051 | 62.312 | Second priority area
OGP2 | 0.051 | 67.905 | Second priority area
OGP1 | 0.050 | 67.680 | Second priority area
LTS3 | 0.050 | 57.958 | First priority area
SEC2 | 0.050 | 60.360 | First priority area
LTS4 | 0.048 | 66.104 | Second priority area
SEC1 | 0.048 | 57.995 | First priority area
CA4 | 0.047 | 58.108 | First priority area
TMS1 | 0.047 | 64.865 | Second priority area
TWC4 | 0.047 | 68.581 | Second priority area
OGP5 | 0.047 | 57.658 | First priority area
CA1 | 0.047 | 67.117 | Second priority area
OGP4 | 0.046 | 58.446 | First priority area
OGP3 | 0.041 | 58.333 | First priority area
TC2 | 0.039 | 52.365 | Third priority area
TC1 | 0.038 | 55.631 | Third priority area
TC5 | 0.037 | 53.041 | Third priority area
TC4 | 0.034 | 46.622 | Third priority area
CC4 | 0.032 | 63.572 | Fourth priority area
CC3 | 0.031 | 62.387 | Fourth priority area
RC2 | 0.031 | 52.365 | Third priority area
TC3 | 0.031 | 37.725 | Third priority area
RC4 | 0.031 | 52.365 | Third priority area
RC1 | 0.030 | 49.550 | Third priority area
CC1 | 0.030 | 60.811 | Fourth priority area
RC3 | 0.029 | 50.563 | Third priority area
CC2 | 0.029 | 56.869 | Third priority area
PPT4 | 0.013 | 63.739 | Fourth priority area
PPA2 | 0.012 | 56.644 | Third priority area
PPA5 | 0.012 | 57.095 | Third priority area
PPT1 | 0.012 | 65.878 | Fourth priority area
PPT2 | 0.012 | 65.991 | Fourth priority area
PPA1 | 0.012 | 57.432 | Third priority area
PPT6 | 0.012 | 64.414 | Fourth priority area
PPT3 | 0.012 | 67.005 | Fourth priority area
PPA4 | 0.012 | 55.293 | Third priority area
PPA3 | 0.011 | 58.446 | Third priority area
PPT5 | 0.011 | 65.541 | Fourth priority area
Mean | 0.040 | 60.678
Figure A1. Importance–performance map (indicator level) of the target construct Pre-OCDs. Note: The dashed lines indicate average importance and performance, dividing the map into four quadrants as explained in Figure 5.
Table A6. Importance–performance values (indicator level) of the target construct pre-operational control determinants (Pre-OCDs).
Indicators | Importance | Performance | Improvement Priority Quadrants
SEC4 | 0.087 | 60.135 | Second priority area
LTS2 | 0.075 | 56.907 | First priority area
TMS4 | 0.075 | 66.667 | Second priority area
SEC3 | 0.074 | 62.275 | Second priority area
TMS3 | 0.073 | 66.104 | Second priority area
TMS2 | 0.073 | 64.865 | Second priority area
TWC3 | 0.072 | 66.667 | Second priority area
OGP2 | 0.071 | 67.905 | Second priority area
OGP1 | 0.071 | 67.680 | Second priority area
TWC1 | 0.070 | 61.411 | Second priority area
SEC2 | 0.070 | 60.360 | Second priority area
LTS1 | 0.069 | 59.009 | First priority area
TWC2 | 0.069 | 62.312 | Second priority area
LTS3 | 0.067 | 57.958 | First priority area
SEC1 | 0.067 | 57.995 | First priority area
TMS1 | 0.066 | 64.865 | Second priority area
OGP5 | 0.066 | 57.658 | First priority area
LTS4 | 0.065 | 66.104 | Fourth priority area
OGP4 | 0.064 | 58.446 | Third priority area
TWC4 | 0.064 | 68.581 | Fourth priority area
TC2 | 0.058 | 52.365 | Third priority area
OGP3 | 0.058 | 58.333 | Third priority area
TC1 | 0.057 | 55.631 | Third priority area
TC5 | 0.056 | 53.041 | Third priority area
TC4 | 0.052 | 46.622 | Third priority area
TC3 | 0.047 | 37.725 | Third priority area
Mean | 0.066 | 59.908
Figure A2. Importance–performance values (indicator level) of the target construct In-OCDs. Note: The dashed lines indicate average importance and performance, dividing the map into four quadrants as explained in Figure 5.
Table A7. Importance–performance values (indicator level) of the target construct in-operational control determinants (In-OCD).
Indicators | Importance | Performance | Improvement Priority Quadrants
SEC4 | 0.082 | 60.135 | First priority area
TMS4 | 0.082 | 66.667 | Second priority area
SEC3 | 0.081 | 62.275 | Second priority area
TMS3 | 0.080 | 66.104 | Second priority area
TMS2 | 0.079 | 64.865 | Second priority area
OGP2 | 0.077 | 67.905 | Second priority area
OGP1 | 0.077 | 67.680 | Second priority area
SEC2 | 0.076 | 60.360 | First priority area
LTS2 | 0.075 | 56.907 | First priority area
SEC1 | 0.073 | 57.995 | First priority area
TMS1 | 0.072 | 64.865 | Second priority area
OGP5 | 0.071 | 57.658 | First priority area
TWC3 | 0.071 | 66.667 | Second priority area
TWC1 | 0.069 | 61.411 | Second priority area
OGP4 | 0.069 | 58.446 | First priority area
LTS1 | 0.069 | 59.009 | First priority area
TWC2 | 0.069 | 62.312 | Second priority area
LTS3 | 0.067 | 57.958 | Third priority area
TC2 | 0.067 | 52.365 | Third priority area
TC1 | 0.065 | 55.631 | Third priority area
LTS4 | 0.065 | 66.104 | Fourth priority area
TC5 | 0.063 | 53.041 | Third priority area
TWC4 | 0.063 | 68.581 | Fourth priority area
OGP3 | 0.063 | 58.333 | Third priority area
PPS6 | 0.061 | 60.923 | Fourth priority area
TC4 | 0.059 | 46.622 | Third priority area
PPS5 | 0.058 | 63.401 | Fourth priority area
PPS3 | 0.057 | 63.288 | Fourth priority area
PPS2 | 0.057 | 67.005 | Fourth priority area
PPS4 | 0.055 | 59.009 | Third priority area
TC3 | 0.054 | 37.725 | Third priority area
PPS1 | 0.053 | 69.932 | Fourth priority area
Mean | 0.068 | 60.662
Figure A3. Importance–performance values (indicator level) of the target construct Post-OCDs. Note: The dashed lines indicate average importance and performance, dividing the map into four quadrants as explained in Figure 5.
Table A8. Importance–performance values (indicator level) of the target construct post-operational control determinants (Post-OCDs).
Constructs | Importance | Performance | Improvement Priority Quadrants
LTS2 | 0.084 | 56.907 | First priority area
TWC3 | 0.080 | 66.667 | Second priority area
TWC1 | 0.078 | 61.411 | Second priority area
SEC4 | 0.077 | 60.135 | First priority area
LTS1 | 0.077 | 59.009 | First priority area
TWC2 | 0.077 | 62.312 | Second priority area
TMS4 | 0.077 | 66.667 | Second priority area
SEC3 | 0.076 | 62.275 | Second priority area
LTS3 | 0.075 | 57.958 | First priority area
TMS3 | 0.075 | 66.104 | Second priority area
TMS2 | 0.074 | 64.865 | Second priority area
LTS4 | 0.073 | 66.104 | Second priority area
OGP2 | 0.073 | 67.905 | Second priority area
OGP1 | 0.072 | 67.680 | Second priority area
SEC2 | 0.071 | 60.360 | First priority area
TWC4 | 0.071 | 68.581 | Second priority area
PPT4 | 0.071 | 63.739 | Second priority area
SEC1 | 0.068 | 57.995 | First priority area
PPA2 | 0.068 | 56.644 | First priority area
TMS1 | 0.068 | 64.865 | Second priority area
PPA5 | 0.067 | 57.095 | First priority area
OGP5 | 0.067 | 57.658 | First priority area
PPT1 | 0.067 | 65.878 | Second priority area
PPT2 | 0.066 | 65.991 | Second priority area
PPA1 | 0.065 | 57.432 | First priority area
PPT6 | 0.065 | 64.414 | Second priority area
OGP4 | 0.065 | 58.446 | First priority area
PPT3 | 0.065 | 67.005 | Second priority area
PPA4 | 0.065 | 55.293 | First priority area
PPA3 | 0.064 | 58.446 | First priority area
PPT5 | 0.063 | 65.541 | Second priority area
OGP3 | 0.059 | 58.333 | Third priority area
TC2 | 0.039 | 52.365 | Third priority area
TC1 | 0.038 | 55.631 | Third priority area
TC5 | 0.037 | 53.041 | Third priority area
PPB6 | 0.036 | 60.923 | Fourth priority area
PPB5 | 0.035 | 63.401 | Third priority area
TC4 | 0.035 | 46.622 | Third priority area
PPB3 | 0.034 | 63.288 | Fourth priority area
PPB2 | 0.034 | 67.005 | Fourth priority area
PPB4 | 0.033 | 59.009 | Third priority area
PPB1 | 0.031 | 69.932 | Fourth priority area
TC3 | 0.031 | 37.725 | Third priority area
Mean | 0.062 | 60.899
Figure A4. Importance–performance map (indicator level) of the target construct UOCDs. Note: The dashed lines indicate average importance and performance, dividing the map into four quadrants as explained in Figure 5.
Table A9. Importance–performance values (indicator level) of the target construct uncertainty control determinants (UOCDs).
Indicators | Importance | Performance | Improvement Priority Quadrants
TC2 | 0.080 | 52.365 | First priority area
TC1 | 0.079 | 55.631 | First priority area
SEC4 | 0.077 | 60.135 | Second priority area
TC5 | 0.076 | 53.041 | First priority area
TMS4 | 0.076 | 66.667 | Second priority area
SEC3 | 0.075 | 62.275 | Second priority area
TMS3 | 0.074 | 66.104 | Second priority area
TMS2 | 0.074 | 64.865 | Second priority area
OGP2 | 0.072 | 67.905 | First priority area
TC4 | 0.072 | 46.622 | Second priority area
OGP1 | 0.071 | 67.680 | Second priority area
SEC2 | 0.070 | 60.360 | Second priority area
LTS2 | 0.070 | 56.907 | First priority area
SEC1 | 0.068 | 57.995 | First priority area
TWC3 | 0.067 | 66.667 | Second priority area
TMS1 | 0.067 | 64.865 | Third priority area
OGP5 | 0.067 | 57.658 | Fourth priority area
TWC1 | 0.065 | 61.411 | Fourth priority area
TC3 | 0.065 | 37.725 | Third priority area
OGP4 | 0.065 | 58.446 | Third priority area
LTS1 | 0.064 | 59.009 | Third priority area
TWC2 | 0.064 | 62.312 | Third priority area
LTS3 | 0.063 | 57.958 | Third priority area
LTS4 | 0.061 | 66.104 | Fourth priority area
TWC4 | 0.059 | 68.581 | Fourth priority area
OGP3 | 0.058 | 58.333 | Third priority area
Mean | 0.069 | 59.908

References

  1. Assaf, S.A.; Al-Hejji, S. Causes of Delay in Large Construction Projects. Int. J. Proj. Manag. 2006, 24, 349–357. [Google Scholar] [CrossRef]
  2. Olawale, Y.A. Challenges to Prevent in Practice for Effective Cost and Time Control of Construction Projects. J. Constr. Eng. Proj. 2020, 10, 16–32. [Google Scholar]
  3. Flyvbjerg, B.; Holm, M.K.S.; Buhl, S.L. How Common and How Large Are Cost Overruns in Transport Infrastructure Projects? Transp. Rev. 2003, 23, 71–88. [Google Scholar] [CrossRef]
  4. KPMG International. KPMG International Cooperative 2015 Global Construction Project Owner’s Survey; KPMG International Cooperative: Zug, Switzerland, 2015. [Google Scholar]
  5. Orgut, R.E.; Batouli, M.; Zhu, J.; Mostafavi, A.; Jaselskis, E.J. Critical Factors for Improving Reliability of Project Control Metrics throughout Project Life Cycle. J. Manag. Eng. 2020, 36, 04019033. [Google Scholar] [CrossRef]
  6. CII (Construction Industry Institute). Performance Assessment; CII: Austin, TX, USA, 2012. [Google Scholar]
  7. Alotaibi, R.; Sohail, M.; Edum-Fotwe, F.T.; Soetanto, R. Determining Project Control System Effectiveness in Construction Project Delivery. Eng. Constr. Archit. Manag. 2025; ahead of print. [Google Scholar] [CrossRef]
  8. Jawad, S.; Ledwith, A.; Khan, R. Project Control System (PCS) Implementation in Engineering and Construction Projects: An Empirical Study in Saudi’s Petroleum and Chemical Industry. Eng. Constr. Archit. Manag. 2022, 31, 181–207. [Google Scholar] [CrossRef]
  9. Baars, W.; Harmsen, H.; Kramer, R.; Sesink, L.; Van Zundert, J. Project Management Handbook DANS—Data Archiving and Networked Services The Hague—2006; DANS: The Hague, The Netherlands, 2006. [Google Scholar]
  10. Datta, A. Integrated Project Controls Can Help Avoid Major Disappointments on Mega-Projects. Available online: https://www.turnerandtownsend.com/en/perspectives/integrated-project-controls-can-help-avoid-major-disappointments-on-mega-projects/ (accessed on 16 July 2023).
  11. Maqsoom, A.; Hamad, M.; Ashraf, H.; Thaheem, M.J.; Umer, M. Managerial Control Mechanisms and Their Influence on Project Performance: An Investigation of the Moderating Role of Complexity Risk. Eng. Constr. Archit. Manag. 2020, 27, 2451–2475. [Google Scholar] [CrossRef]
  12. Ling, F.Y.Y.; Ang, W.T. Using Control Systems to Improve Construction Project Outcomes. Eng. Constr. Archit. Manag. 2013, 20, 576–588. [Google Scholar] [CrossRef]
  13. Alsulami, B.T. Risk Management Planning for Project Control through Risk Analysis: A Petroleum Pipeline-Laying Project. Buildings 2025, 15, 87. [Google Scholar] [CrossRef]
  14. Alsuliman, J.A. Causes of Delay in Saudi Public Construction Projects. Alex. Eng. J. 2019, 58, 801–808. [Google Scholar] [CrossRef]
  15. Alshammari, A.; Ghazali, F.E.M. A Comprehensive Review of the Factors and Strategies to Mitigate Construction Projects Delays in Saudi Arabia. Open Constr. Build. Technol. J. 2024, 18, e18748368318470. [Google Scholar] [CrossRef]
  16. Aldossari, K.M. Cost and Schedule Performance in Higher Education Construction Projects in Saudi Arabia. Adv. Civ. Eng. 2025, 2025, 7448384. [Google Scholar] [CrossRef]
  17. Willems, L.L.; Vanhoucke, M. Classification of Articles and Journals on Project Control and Earned Value Management. Int. J. Proj. Manag. 2015, 33, 1610–1634. [Google Scholar] [CrossRef]
  18. Hazir, Ö. A Review of Analytical Models, Approaches and Decision Support Tools in Project Monitoring and Control. Int. J. Proj. Manag. 2015, 33, 808–815. [Google Scholar] [CrossRef]
  19. Müller, R.; Lecoeuvre, L. Operationalizing Governance Categories of Projects. Int. J. Proj. Manag. 2014, 32, 1346–1357. [Google Scholar] [CrossRef]
  20. Zwikael, O.; Smyrk, J. Project Governance: Balancing Control and Trust in Dealing with Risk. Int. J. Proj. Manag. 2015, 33, 852–862. [Google Scholar] [CrossRef]
  21. McGrath, J.E. Groups: Interaction and Performance; Prentice-Hall: Hoboken, NJ, USA, 1984; p. 287. [Google Scholar]
  22. Hackman, J.R.; Morris, C.G. Group Tasks, Group Interaction Process, and Group Performance Effectiveness: A Review and Proposed Integration. Adv. Exp. Soc. Psychol. 1975, 8, 45–99. [Google Scholar] [CrossRef]
  23. Ilgen, D.R.; Hollenbeck, J.R.; Johnson, M.; Jundt, D. Teams in Organizations: From Input-Process-Output Models to IMOI Models. Annu. Rev. Psychol. 2005, 56, 517–543. [Google Scholar] [CrossRef]
  24. Mathieu, J.; Maynard, T.M.; Rapp, T.; Gilson, L. Team Effectiveness 1997–2007: A Review of Recent Advancements and a Glimpse into the Future. J. Manag. 2008, 34, 410–476. [Google Scholar] [CrossRef]
  25. Wang, D.; Jia, J.; Jiang, S.; Liu, T.; Ma, G. How Team Voice Contributes to Construction Project Performance: The Mediating Role of Project Learning and Project Reflexivity. Buildings 2023, 13, 1599. [Google Scholar] [CrossRef]
  26. Kwofie, T.E.; Alhassan, A.; Botchway, E.; Afranie, I. Factors Contributing towards the Effectiveness of Construction Project Teams. Int. J. Constr. Manag. 2015, 15, 170–178. [Google Scholar] [CrossRef]
  27. Loganathan, S.; Forsythe, P. Unravelling the Influence of Teamwork on Trade Crew Productivity: A Review and a Proposed Framework. Constr. Manag. Econ. 2020, 38, 1040–1060. [Google Scholar] [CrossRef]
  28. Mubarak, S. Construction Project Scheduling and Control; Wiley: Hoboken, NJ, USA, 2015. [Google Scholar]
  29. Olawale, Y. Project Control Methods and Best Practices: Achieving Project Success; Business Expert Press: Hampton, NJ, USA, 2022. [Google Scholar]
  30. Kerzner, H. Project Management: A Systems Approach to Planning, Scheduling, and Controlling, 12th ed.; Wiley: Hoboken, NJ, USA, 2017. [Google Scholar]
  31. Urgiles, P.; Sebastian, M.A.; Claver, J. Proposal and Application of a Methodology to Improve the Control and Monitoring of Complex Hydroelectric Power Station Construction Projects. Appl. Sci. 2020, 10, 7913. [Google Scholar] [CrossRef]
  32. Le, A.T.H.; Sutrisna, M. Project Cost Control System and Enabling-Factors Model: PLS-SEM Approach and Importance-Performance Map Analysis. Eng. Constr. Archit. Manag. 2023, 31, 2513–2535. [Google Scholar] [CrossRef]
  33. Al-khalil, M.I.; Al-ghafly, M.A. Delay in Public Utility Projects in Saudi Arabia. Int. J. Proj. Manag. 1999, 17, 101–106. [Google Scholar] [CrossRef]
  34. Alofi, A.; Kashiwagi, J.; Kashiwagi, D. The Perception of the Government and Private Sectors on the Procurement System Delivery Method in Saudi Arabia. Procedia Eng. 2016, 145, 1394–1401. [Google Scholar] [CrossRef]
  35. Mahamid, I. Schedule Delay in Saudi Arabia Road Construction Projects: Size, Estimate, Determinants and Effects. Int. J. Archit. Eng. Constr. 2017, 6, 51–58. [Google Scholar] [CrossRef]
  36. Park, J.E. Schedule Delays of Major Projects: What Should We Do about It? Transp. Rev. 2021, 41, 814–832. [Google Scholar] [CrossRef]
  37. Huo, T.; Ren, H.; Cai, W.; Shen, G.Q.; Liu, B.; Zhu, M.; Wu, H. Measurement and Dependence Analysis of Cost Overruns in Megatransport Infrastructure Projects: Case Study in Hong Kong. J. Constr. Eng. Manag. 2018, 144, 05018001. [Google Scholar] [CrossRef]
  38. Shehu, Z.; Endut, I.R.; Akintoye, A.; Holt, G.D. Cost Overrun in the Malaysian Construction Industry Projects: A Deeper Insight. Int. J. Proj. Manag. 2014, 32, 1471–1480. [Google Scholar] [CrossRef]
  39. Amoatey, C.T.; Ankrah, A.N.O. Exploring Critical Road Project Delay Factors in Ghana. J. Facil. Manag. 2017, 15, 110–127. [Google Scholar] [CrossRef]
  40. Jawad, S.; Ledwith, A.; Panahifar, F. Enablers and Barriers to the Successful Implementation of Project Control Systems in the Petroleum and Chemical Industry. Int. J. Eng. Bus. Manag. 2018, 10, 1847979017751834. [Google Scholar] [CrossRef]
  41. Olawale, Y.; Sun, M. Construction Project Control in the UK: Current Practice, Existing Problems and Recommendations for Future Improvement. Int. J. Proj. Manag. 2015, 33, 623–637. [Google Scholar] [CrossRef]
  42. Sandbhor, S.; Choudhary, S.; Arora, A.; Katoch, P. Identification of Factors Leading to Construction Project Success Using Principal Component Analysis. Int. J. Appl. Eng. Res. 2014, 9, 4169–4180. [Google Scholar]
  43. Jawad, S.; Ledwith, A. A Measurement Model of Project Control Systems Success for Engineering and Construction Projects Case Study: Contractor Companies in Saudi’s Petroleum and Chemical Industry. Eng. Constr. Archit. Manag. 2021, 29, 1218–1240. [Google Scholar] [CrossRef]
  44. Waqar, A.; Othman, I.B.; Mansoor, M.S. Building Information Modeling (BIM) Implementation and Construction Project Success in Malaysian Construction Industry: Mediating Role of Project Control. Eng. Constr. Archit. Manag. 2024; ahead of print. [Google Scholar] [CrossRef]
  45. Dallasega, P.; Marengo, E.; Revolti, A. Strengths and Shortcomings of Methodologies for Production Planning and Control of Construction Projects: A Systematic Literature Review and Future Perspectives. Prod. Plan. Control 2021, 32, 257–282. [Google Scholar] [CrossRef]
  46. Jaskula, K.; Kifokeris, D.; Papadonikolaki, E.; Rovas, D. Common Data Environments in Construction: State-of-the-Art and Challenges for Practical Implementation. Constr. Innov. 2024. [Google Scholar] [CrossRef]
  47. Pellerin, R.; Perrier, N. A Review of Methods, Techniques and Tools for Project Planning and Control. Int. J. Prod. Res. 2019, 57, 2160–2178. [Google Scholar] [CrossRef]
  48. Olawale, Y.; Sun, M. PCIM: Project Control and Inhibiting-Factors Management Model. J. Manag. Eng. 2013, 29, 60–70. [Google Scholar] [CrossRef]
  49. Joslin, R.; Müller, R. The Relationship between Project Governance and Project Success. Int. J. Proj. Manag. 2016, 34, 613–626. [Google Scholar] [CrossRef]
  50. Too, E.G.; Weaver, P. The Management of Project Management: A Conceptual Framework for Project Governance. Int. J. Proj. Manag. 2014, 32, 1382–1394. [Google Scholar] [CrossRef]
  51. Samset, K.; Volden, G.H. Front-End Definition of Projects: Ten Paradoxes and Some Reflections Regarding Project Management and Project Governance. Int. J. Proj. Manag. 2016, 34, 297–313. [Google Scholar] [CrossRef]
  52. Baumol, W.J.; Stewart, M. On the Behavioral Theory of the Firm. Corp. Econ. 1971, 118–143. [Google Scholar]
  53. Cyert, R.M.; March, J.G. A Behavioral Theory of the Firm 1963. Available online: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1496208 (accessed on 1 January 2024).
  54. Todeva, E. Behavioural Theory of the Firm. Int. Encycl. Organ. Stud. Sage 2007, 1, 113–118. [Google Scholar]
  55. BSI. Organisation and Digitisation of Information About Buildings and Civil Engineering Works, Including Building Information Modelling (BIM) Part 1: Concepts and Principles; BSI: London, UK, 2018. [Google Scholar]
  56. Dwivedi, Y.K.; Wade, M.R.; Schneberger, S.L. Information Systems Theory: Volume 2; Springer: Berlin/Heidelberg, Germany, 2012; Volume 28, p. 461. [Google Scholar] [CrossRef]
  57. Ahmad, I.B.; Omar, S.S.; Ali, M.; Ali, S. A Conceptual Study of TOE and Organisational Performance. Proc. Eur. Conf. Innov. Entrep. ECIE 2023, 2, 962–968. [Google Scholar] [CrossRef]
  58. Tornatzky, L.G.; Fleischer, M. The Processes of Technological Innovation; D. C. Heath and Company: Lexington, MA, USA, 1990; Volume 298. [Google Scholar]
  59. Barney, J. Firm Resources and Sustained Competitive Advantage. J. Manag. 1991, 17, 99–120. [Google Scholar] [CrossRef]
  60. Wang, J.; Yuan, Z.; He, Z.; Zhou, F.; Wu, Z. Critical Factors Affecting Team Work Efficiency in BIM-Based Collaborative Design: An Empirical Study in China. Buildings 2021, 11, 486. [Google Scholar] [CrossRef]
  61. Alnaser, A.A.; Al-Gahtani, K.S.; Alsanabani, N.M. Building Information Modeling Impact on Cost Overrun Risk Factors and Interrelationships. Appl. Sci. 2024, 14, 10711. [Google Scholar] [CrossRef]
  62. Murguia, D.; Vasquez, C.; Demian, P.; Soetanto, R. BIM Adoption among Contractors: A Longitudinal Study in Peru. J. Constr. Eng. Manag. 2023, 149, 04022140. [Google Scholar] [CrossRef]
  63. Ayman, H.M.; Mahfouz, S.Y.; Alhady, A. Integrated EDM and 4D BIM-Based Decision Support System for Construction Projects Control. Buildings 2022, 12, 315. [Google Scholar] [CrossRef]
  64. Lin, J.J.; Golparvar-Fard, M. Visual and Virtual Production Management System for Proactive Project Controls. J. Constr. Eng. Manag. 2021, 147, 04021058. [Google Scholar] [CrossRef]
  65. Gledson, B.J.; Greenwood, D.J. Surveying the Extent and Use of 4D BIM in the UK. J. Inf. Technol. Constr. 2016, 21, 57–71. [Google Scholar]
  66. Hegazy, T.; Menesi, W. Enhancing the Critical Path Segments Scheduling Technique for Project Control. Can. J. Civ. Eng. 2012, 39, 968–977. [Google Scholar] [CrossRef]
  67. Roofigari-Esfahan, N.; Paez, A.; Razavi, S.N. Location-Aware Scheduling and Control of Linear Projects: Introducing Space-Time Float Prisms. J. Constr. Eng. Manag. 2015, 141, 06014008. [Google Scholar] [CrossRef]
  68. Duarte-Vidal, L.; Herrera, R.F.; Atencio, E.; Muñoz-La Rivera, F. Interoperability of Digital Tools for the Monitoring and Control of Construction Projects. Appl. Sci. 2021, 11, 10370. [Google Scholar] [CrossRef]
  69. Wang, X.; Yung, P.; Luo, H.; Truijens, M. An Innovative Method for Project Control in LNG Project through 5D CAD: A Case Study. Autom. Constr. 2014, 45, 126–135. [Google Scholar] [CrossRef]
  70. Morris, P.; Pinto, J. The Wiley Guide to Project Control; John Wiley Sons, Inc.: Hoboken, NJ, USA, 2010. [Google Scholar]
  71. Pollack, J.; Helm, J.; Adler, D. What Is the Iron Triangle, and How Has It Changed? Int. J. Manag. Proj. Bus. 2018, 11, 527–547. [Google Scholar] [CrossRef]
  72. Project Management Institute. The Standard for Project Management and a Guide to the Project Management Body of Knowledge (PMBOK Guide); Project Management Institute: Newtown Square, NJ, USA, 2021; ISBN 9781628256642. [Google Scholar]
  73. Meredith, J.R.; Mantel, S.J. Project Management: A Managerial Approach, 8th ed.; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2012. [Google Scholar]
  74. Turner, J.R.; Müller, R. On the Nature of the Project as a Temporary Organization. Int. J. Proj. Manag. 2003, 21, 1–8. [Google Scholar] [CrossRef]
  75. Winch, G.M. Managing Construction Projects an Information Processing Approach, 2nd ed.; John Wiley & Sons, Ltd.: Hoboken, NJ, USA, 2010. [Google Scholar]
  76. Babbie, E. The Practice of Social Research, 4th ed.; CENGAGE Learning: Boston, MA, USA, 2016. [Google Scholar]
  77. Bell, E.; Bryman, A.; Harley, B. Business Research Methods, 4th ed.; Oxford University Press: Oxford, UK, 2019. [Google Scholar]
  78. Hair, J.F.; Risher, J.J.; Sarstedt, M.; Ringle, C.M. When to Use and How to Report the Results of PLS-SEM. Eur. Bus. Rev. 2019, 31, 2–24. [Google Scholar] [CrossRef]
  79. Batra, S. Exploring the Application of PLS-SEM in Construction Management Research: A Bibliometric and Meta-Analysis Approach. Eng. Constr. Archit. Manag. 2023, 32, 2697–2727. [Google Scholar] [CrossRef]
  80. Baratta, A. The Triple Constraint, a Triple Illusion the Classical Triple Constraint Exhibit 1—Classical Triple Constraint; PMI: Newtown Square, PA, USA, 2006. [Google Scholar]
  81. Del Pico, W.J. Project Control Integrating Cost and Schedule in Construction; RSMeans: Greenville, SC, USA, 2013. [Google Scholar]
  82. Al-Tmeemy, S.M.H.M.; Abdul-Rahman, H.; Harun, Z. Future Criteria for Success of Building Projects in Malaysia. Int. J. Proj. Manag. 2011, 29, 337–348. [Google Scholar] [CrossRef]
  83. Amoah, A.; Berbegal-Mirabent, J.; Marimon, F. Making the Management of a Project Successful: Case of Construction Projects in Developing Countries. J. Constr. Eng. Manag. 2021, 147, 04021166. [Google Scholar] [CrossRef]
  84. Idogawa, J.; Bizarrias, F.S.; Câmara, R. Critical Success Factors for Change Management in Business Process Management. Bus. Process Manag. J. 2023, 29, 2009–2033. [Google Scholar] [CrossRef]
  85. Malik, M.O.; Khan, N. Analysis of ERP Implementation to Develop a Strategy for Its Success in Developing Countries. Prod. Plan. Control 2021, 32, 1020–1035. [Google Scholar] [CrossRef]
  86. Iyer, K.C.; Jha, K.N. Critical Factors Affecting Schedule Performance: Evidence from Indian Construction Projects. J. Constr. Eng. Manag. 2006, 132, 871–881. [Google Scholar] [CrossRef]
  87. Back, W.E.; Grau, D.; Mejia-Aguilar, G. Effectiveness Evaluation of Contract Incentives on Project Performance. Int. J. Constr. Educ. Res. 2013, 9, 288–306. [Google Scholar] [CrossRef]
  88. Jagtap, M.; Kamble, S. Evaluating the Modus Operandi of Construction Supply Chains Using Organization Control Theory. Int. J. Constr. Supply Chain Manag. 2015, 5, 16–33. [Google Scholar] [CrossRef]
  89. Durmic, N. Factors Influencing Project Success: A Qualitative Research. TEM J. 2020, 9, 1011–1020. [Google Scholar] [CrossRef]
  90. Sakka, O.; Barki, H.; Côté, L. Relationship between the Interactive Use of Control Systems and the Project Performance: The Moderating Effect of Uncertainty and Equivocality. Int. J. Proj. Manag. 2016, 34, 508–522. [Google Scholar] [CrossRef]
  91. Hsu, J.S.C.; Shih, S.P.; Li, Y. The Mediating Effects of In-Role and Extra-Role Behaviors on the Relationship between Control and Software-Project Performance. Int. J. Proj. Manag. 2017, 35, 1524–1536. [Google Scholar] [CrossRef]
  92. Al Khoori, A.A.A.G.; Abdul Hamid, M.S.R. Topology of Project Management Office in United Arab Emirates Project-Based Organizations. Int. J. Sustain. Constr. Eng. Technol. 2022, 13, 8–20. [Google Scholar] [CrossRef]
  93. Chen, C.C.; Liu, J.Y.C.; Chen, H.G. Discriminative Effect of User Influence and User Responsibility on Information System Development Processes and Project Management. Inf. Softw. Technol. 2011, 53, 149–158. [Google Scholar] [CrossRef]
  94. Wiener, M.; Mähring, M.; Remus, U.; Saunders, C. Control Configuration and Control Enactment in Information Systems Projects: Review and Expanded Theoretical Framework. MIS Q. 2016, 40, 741–774. [Google Scholar] [CrossRef]
  95. Martens, A.; Vanhoucke, M. The Impact of Applying Effort to Reduce Activity Variability on the Project Time and Cost Performance. Eur. J. Oper. Res. 2019, 277, 442–453. [Google Scholar] [CrossRef]
  96. Molenaar, K.R.; Javernick-Will, A.; Bastias, A.G.; Wardwell, M.A.; Saller, K. Construction Project Peer Reviews as an Early Indicator of Project Success. J. Manag. Eng. 2013, 29, 327–333. [Google Scholar] [CrossRef]
  97. Gustafson, G. Why Project Peer Review? J. Manag. Eng. 1990, 6, 350–354. [Google Scholar] [CrossRef]
  98. Yujing, W.; Yongkui, L.; Peidong, G. Modelling Construction Project Management Based on System Dynamics. Metall. Min. Ind. 2015, 9, 1056–1061. [Google Scholar]
  99. Trappey, A.J.C.; Chiang, T.A.; Ke, S. Developing an Intelligent Workflow Management System to Manage Project Processes with Dynamic Resource Control. J. Chinese Inst. Ind. Eng. 2006, 23, 484–493. [Google Scholar] [CrossRef]
  100. Li, Y.Y.; Chen, P.-H.; Chew, D.A.S.; Teo, C.C.; Ding, R.G. Critical Project Management Factors of AEC Firms for Delivering Green Building Projects in Singapore. J. Constr. Eng. Manag. 2011, 137, 1153–1163. [Google Scholar] [CrossRef]
  101. Kivilä, J.; Martinsuo, M.; Vuorinen, L. Sustainable Project Management through Project Control in Infrastructure Projects. Int. J. Proj. Manag. 2017, 35, 1167–1183. [Google Scholar] [CrossRef]
  102. Ghavamifar, K.; Touran, A. Owner’s Risks versus Control in Transit Projects. J. Manag. Eng. 2009, 25, 230–233. [Google Scholar] [CrossRef]
  103. Almunifi, A.A.; Almutairi, S. Lessons Learned Framework for Efficient Delivery of Construction Projects in Saudi Arabia. Constr. Econ. Build. 2021, 21, 115–141. [Google Scholar] [CrossRef]
  104. Mähring, M.; Wiener, M.; Remus, U. Getting the Control across: Control Transmission in Information Systems Offshoring Projects. Inf. Syst. J. 2018, 28, 708–728. [Google Scholar] [CrossRef]
  105. Abu-Hijleh, S.F.; Ibbs, C.W. Systematic Automated Management Exception Reporting. J. Constr. Eng. Manag. 1993, 119, 87–104. [Google Scholar] [CrossRef]
  106. Franz, B.; Leicht, R.; Molenaar, K.; Messner, J. Impact of Team Integration and Group Cohesion on Project Delivery Performance. J. Constr. Eng. Manag. 2017, 143, 04016088. [Google Scholar] [CrossRef]
  107. Alkhalifah, S.J.; Tuffaha, F.M.; Al Hadidi, L.A.; Ghaithan, A. Factors Influencing Change Orders in Oil and Gas Construction Projects in Saudi Arabia. Built Environ. Proj. Asset Manag. 2023, 13, 430–452. [Google Scholar] [CrossRef]
  108. Zhu, L.; Cheung, S.O.; Gao, X.; Li, Q.; Liu, G. Success DNA of a Record-Breaking Megaproject. J. Constr. Eng. Manag. 2020, 146, 05020009. [Google Scholar] [CrossRef]
  109. Andary, E.G.; Abi Shdid, C.; Chowdhury, A.; Ahmad, I. Integrated Project Delivery Implementation Framework for Water and Wastewater Treatment Plant Projects. Eng. Constr. Archit. Manag. 2019, 27, 609–633. [Google Scholar] [CrossRef]
  110. Cheng, M.-Y.; Peng, H.-S.; Wu, Y.-W.; Chen, T.-L. Estimate at Completion for Construction Projects Using Evolutionary Support Vector Machine Inference Model. Autom. Constr. 2010, 19, 619–629. [Google Scholar] [CrossRef]
  111. Rezakhani, P. Hybrid Fuzzy-Bayesian Decision Support Tool for Dynamic Project Scheduling and Control under Uncertainty. Int. J. Constr. Manag. 2022, 22, 2864–2876. [Google Scholar] [CrossRef]
  112. Ezzeddine, A.; Shehab, L.; Lucko, G.; Hamzeh, F. Forecasting Construction Project Performance with Momentum Using Singularity Functions in LPS. J. Constr. Eng. Manag. 2022, 148, 04022063. [Google Scholar] [CrossRef]
  113. Xiao, L.; Bie, L.; Bai, X. Controlling the Schedule Risk in Green Building Projects: Buffer Management Framework with Activity Dependence. J. Clean. Prod. 2021, 278, 123852. [Google Scholar] [CrossRef]
  114. Lee, N.; Rojas, E.M. Visual Representations for Monitoring Project Performance: Developing Novel Prototypes for Improved Communication. J. Constr. Eng. Manag. 2013, 139, 994–1005. [Google Scholar] [CrossRef]
  115. Aliverdi, R.; Moslemi Naeni, L.; Salehipour, A. Monitoring Project Duration and Cost in a Construction Project by Applying Statistical Quality Control Charts. Int. J. Proj. Manag. 2013, 31, 411–423. [Google Scholar] [CrossRef]
  116. Raymond, L.; Bergeron, F. Project Management Information Systems: An Empirical Study of Their Impact on Project Managers and Project Success. Int. J. Proj. Manag. 2008, 26, 213–220. [Google Scholar] [CrossRef]
  117. Sakka, O.; Barki, H.; Côté, L. Interactive and Diagnostic Uses of Management Control Systems in IS Projects: Antecedents and Their Impact on Performance. Inf. Manag. 2013, 50, 265–274. [Google Scholar] [CrossRef]
  118. Lei, Z.; Hu, Y.; Hua, J.; Marton, B.; Goldberg, P.; Marton, N. An earned-value-analysis (EVA)-based project control framework in large-scale scaffolding projects using linear regression modeling. J. Inf. Technol. Constr. 2022, 27, 630–641. [Google Scholar] [CrossRef]
  119. Bruni, M.E.; Beraldi, P.; Guerriero, F.; Pinto, E. A Scheduling Methodology for Dealing with Uncertainty in Construction Projects. Eng. Comput. 2011, 28, 1064–1078. [Google Scholar] [CrossRef]
  120. Mitchell, V.L. Knowledge Integration and Information Technology Project Performance. MIS Q. 2006, 30, 919. [Google Scholar] [CrossRef]
  121. Isaac, S.; Navon, R. Can Project Monitoring and Control Be Fully Automated? Constr. Manag. Econ. 2014, 32, 495–505. [Google Scholar] [CrossRef]
  122. Association for Project Management. APM Body of Knowledge, 7th ed.; Association for Project Management: Princes Risborough, UK, 2019; ISBN 9781903494820. [Google Scholar]
  123. Iroroakpo Idoro, G. Influence of the Monitoring and Control Strategies of Indigenous and Expatriate Nigerian Contractors on Project Outcome. J. Constr. Dev. Ctries. 2012, 17, 49–67. [Google Scholar]
  124. Annamalaisami, C.D.; Kuppuswamy, A. Managing Cost Risks: Toward a Taxonomy of Cost Overrun Factors in Building Construction Projects. ASCE-ASME J. Risk Uncertain. Eng. Syst. Part A Civ. Eng. 2021, 7, 04021021. [Google Scholar] [CrossRef]
  125. Carr, R.I. Cost, Schedule, and Time Variances and Integration. J. Constr. Eng. Manag. 1993, 119, 245–265. [Google Scholar] [CrossRef]
  126. Nguyen, T.Q.; Yeoh, J.K.-W.; Angelia, N. Predicting Percent Plan Complete through Time Series Analysis. J. Constr. Eng. Manag. 2023, 149, 04023038. [Google Scholar] [CrossRef]
  127. Song, J.; Martens, A.; Vanhoucke, M. The Impact of a Limited Budget on the Corrective Action Taking Process. Eur. J. Oper. Res. 2020, 286, 1070–1086. [Google Scholar] [CrossRef]
  128. Votto, R.; Lee Ho, L.; Berssaneti, F. Multivariate Control Charts Using Earned Value and Earned Duration Management Observations to Monitor Project Performance. Comput. Ind. Eng. 2020, 148, 106691. [Google Scholar] [CrossRef]
  129. Abdul-Rahman, H.; Wang, C.; Binti Muhammad, N. Project Performance Monitoring Methods Used in Malaysia and Perspectives of Introducing EVA as a Standard Approach. J. Civ. Eng. Manag. 2011, 17, 445–455. [Google Scholar] [CrossRef]
  130. Hanna, A.S. Using the Earned Value Management System to Improve Electrical Project Control. J. Constr. Eng. Manag. 2012, 138, 449–457. [Google Scholar] [CrossRef]
  131. Salehipour, A.; Naeni, L.M.; Khanbabaei, R.; Javaheri, A. Lessons Learned from Applying the Individuals Control Charts to Monitoring Autocorrelated Project Performance Data. J. Constr. Eng. Manag. 2016, 142, 04015105. [Google Scholar] [CrossRef]
  132. Colin, J.; Vanhoucke, M. A Comparison of the Performance of Various Project Control Methods Using Earned Value Management Systems. Expert Syst. Appl. 2015, 42, 3159–3175. [Google Scholar] [CrossRef]
  133. Martens, A.; Vanhoucke, M. A Buffer Control Method for Top-down Project Control. Eur. J. Oper. Res. 2017, 262, 274–286. [Google Scholar] [CrossRef]
  134. Assaad, R.; El-Adaway, I.H.; Abotaleb, I.S. Predicting Project Performance in the Construction Industry. J. Constr. Eng. Manag. 2020, 146, 04020030. [Google Scholar] [CrossRef]
  135. Lipke, W. Schedule Adherence and Rework. PM World Today 2011, 13, 1–14. [Google Scholar]
  136. Hu, X.; Cui, N.; Demeulemeester, E. Effective Expediting to Improve Project Due Date and Cost Performance through Buffer Management. Int. J. Prod. Res. 2015, 53, 1460–1471. [Google Scholar] [CrossRef]
  137. Marques, G.; Gourc, D.; Lauras, M. Multi-Criteria Performance Analysis for Decision Making in Project Management. Int. J. Proj. Manag. 2011, 29, 1057–1069. [Google Scholar] [CrossRef]
  138. Li, M.; Niu, D.; Ji, Z.; Cui, X.; Sun, L. Forecast Research on Multidimensional Influencing Factors of Global Offshore Wind Power Investment Based on Random Forest and Elastic Net. Sustainability 2021, 13, 2262. [Google Scholar] [CrossRef]
  139. Alkhattabi, L.; Alkhard, A.; Gouda, A. Effects of Change Orders on the Budget of the Public Sector Construction Projects in the Kingdom of Saudi Arabia. Results Eng. 2023, 20, 101628. [Google Scholar] [CrossRef]
  140. Yap, J.B.H.; Skitmore, M.; Gray, J.; Shavarebi, K. Systemic View to Understanding Design Change Causation and Exploitation of Communications and Knowledge. Proj. Manag. J. 2019, 50, 288–305. [Google Scholar] [CrossRef]
  141. Naji, K.K.; Gunduz, M.; Naser, A.F. The Effect of Change-Order Management Factors on Construction Project Success: A Structural Equation Modeling Approach. J. Constr. Eng. Manag. 2022, 148, 04022085. [Google Scholar] [CrossRef]
  142. Khalafallah, A.; Shalaby, Y. Change Orders: Automating Comparative Data Analysis and Controlling Impacts in Public Projects. J. Constr. Eng. Manag. 2019, 145, 04019064. [Google Scholar] [CrossRef]
  143. Motaleb, O.; Kishk, M. Controlling the Risk of Construction Delay in the Middle East: State-of-the-Art Review. In Proceedings of the COBRA 2011—Proceedings of RICS Construction and Property Conference, Manchester, UK, 12–13 September 2011; pp. 1351–1363. [Google Scholar]
  144. Obondi, K.C. The Utilization of Project Risk Monitoring and Control Practices and Their Relationship with Project Success in Construction Projects. J. Proj. Manag. 2022, 7, 35–52. [Google Scholar] [CrossRef]
  145. Dey, P.; Tabucanon, M.T. Planning for Project Control through Risk Analysis: A Petroleum Pipeline-Laying Project. Int. J. Proj. Manag. 1994, 12, 23–33. [Google Scholar] [CrossRef]
  146. Blumberg, B.; Cooper, D.; Schindler, P. Business Research Methods, 4th ed.; McGraw-Hill Education: Maidenhead, UK, 2014; ISBN 9780077157487. [Google Scholar]
  147. Greener, S.L. Business Research Methods; BookBoon: London, UK, 2008. [Google Scholar]
  148. Harlacher, J. An Educator’s Guide to Questionnaire Development; REL 2016-108; Regional Educational Laboratory Central: Tallahassee, FL, USA, 2016. [Google Scholar]
  149. Creswell, J.W.; Creswell, J.D. Research Design: Qualitative, Quantitative & Mixed Methods Approaches, 5th ed.; SAGE: Newcastle upon Tyne, UK, 2018. [Google Scholar]
  150. Hair, J.F.; Ringle, C.M.; Sarstedt, M.; Hult, G.T.M. Partial Least Squares Structural Equation Modeling; Springer Nature: London, UK, 2021; ISBN 9783319574134. [Google Scholar]
  151. MacKenzie, S.B.; Podsakoff, P.M.; Podsakoff, N.P. Construct Measurement and Validation Procedures in MIS and Behavioral Research: Integrating New and Existing Techniques. MIS Q. 2011, 35, 293–334. [Google Scholar] [CrossRef]
  152. Wilson, M. Constructing Measures: An Item Response Modeling Approach, 2nd ed.; Routledge: London, UK, 2023; pp. 1–364. [Google Scholar]
  153. Edmondson, D.R. Likert Scales: A History. In Proceedings of the CHARM 2005: Conference on Historical Analysis & Research in Marketing, Long Beach, CA, USA, 28 April–1 May 2005; pp. 127–133. [Google Scholar]
  154. Salant, P.; Dillman, D. How to Conduct Your Own Survey; John Wiley & Sons: Hoboken, NJ, USA, 1994. [Google Scholar]
  155. Hair, J.; Hollingsworth, C.L.; Randolph, A.B.; Chong, A.Y.L. An Updated and Expanded Assessment of PLS-SEM in Information Systems Research. Ind. Manag. Data Syst. 2017, 117, 442–458. [Google Scholar] [CrossRef]
  156. Kock, N.; Hadaya, P. Minimum sample size estimation in PLS-SEM: The inverse square root and gamma-exponential methods. Inf. Syst. J. 2018, 28, 227–261. [Google Scholar] [CrossRef]
  157. Armstrong, J.S.; Overton, T.S. Estimating Nonresponse Bias in Mail Surveys. J. Mark. Res. 1977, 14, 396. [Google Scholar] [CrossRef]
  158. Hair, J.F.; Babin, B.J.; Anderson, R.E.; Black, W.C. Multivariate Data Analysis, 8th ed.; Cengage Learning EMEA: Hampshire, UK, 2018. [Google Scholar]
  159. Ringle, C.M.; Sarstedt, M. Gain More Insight from Your PLS-SEM Results the Importance-Performance Map Analysis. Ind. Manag. Data Syst. 2016, 116, 1865–1886. [Google Scholar] [CrossRef]
  160. Gefen, D.; Straub, D. A practical guide to factorial validity using pls-graph: Tutorial and annotated example. Commun. Assoc. Inf. Syst. 2005, 21, 16. [Google Scholar] [CrossRef]
  161. Henseler, J.; Ringle, C.M.; Sarstedt, M. A New Criterion for Assessing Discriminant Validity in Variance-Based Structural Equation Modeling. J. Acad. Mark. Sci. 2015, 43, 115–135. [Google Scholar] [CrossRef]
  162. Henseler, J.; Dijkstra, T.K.; Sarstedt, M.; Ringle, C.M.; Diamantopoulos, A.; Straub, D.W.; Ketchen, D.J.; Hair, J.F.; Hult, G.T.M.; Calantone, R.J. Common Beliefs and Reality About PLS: Comments on Rönkkö and Evermann (2013). Organ. Res. Methods 2014, 17, 182–209. [Google Scholar] [CrossRef]
  163. Hair, J.F.; Sarstedt, M. Factors versus Composites: Guidelines for Choosing the Right Structural Equation Modeling Method. Proj. Manag. J. 2019, 50, 619–624. [Google Scholar] [CrossRef]
  164. Streukens, S.; Leroi-Werelds, S. Bootstrapping and PLS-SEM: A Step-by-Step Guide to Get More out of Your Bootstrap Results. Eur. Manag. J. 2016, 34, 618–632. [Google Scholar] [CrossRef]
  165. Akter, S.; D’Ambra, J.; Ray, P. Trustworthiness in MHealth Information Services: An Assessment of a Hierarchical Model with Mediating and Moderating Effects Using Partial Least Squares (PLS). J. Am. Soc. Inf. Sci. Technol. 2011, 62, 100–116. [Google Scholar] [CrossRef]
  166. Cohen, J. The Analysis of Variance and Covariance. In Statistical Power Analysis for the Behavioral Sciences, 2nd ed.; Lawrence Erlbaum Associates: Hillsdale, NJ, USA, 1988; pp. 273–403. [Google Scholar]
  167. Cohen, J. Statistical Power Analysis for the Behavioral Sciences. In Statistical Power Analysis for the Behavioral Sciences, 2nd ed.; Routledge: New York, NY, USA, 2013. [Google Scholar] [CrossRef]
  168. Sarstedt, M.; Cheah, J.H. Partial least squares structural equation modeling using SmartPLS: A software review. J. Mark. Anal. 2019, 7, 196–202. [Google Scholar] [CrossRef]
  169. Dang, C.N.; Le-Hoai, L. Critical Success Factors for Implementation Process of Design-Build Projects in Vietnam. J. Eng. Des. Technol. 2016, 14, 17–32. [Google Scholar] [CrossRef]
  170. Mitra, S.; Wee Kwan Tan, A. Lessons Learned from Large Construction Project in Saudi Arabia. Benchmarking Int. J. 2012, 19, 308–324. [Google Scholar] [CrossRef]
  171. Love, P.E.D.; Zhou, J.; Matthews, J. Project Controls for Electrical, Instrumentation and Control Systems: Enabling Role of Digital System Information Modelling. Autom. Constr. 2019, 103, 202–212. [Google Scholar] [CrossRef]
  172. Vanhoucke, M. Tolerance Limits for Project Control: An Overview of Different Approaches. Comput. Ind. Eng. 2019, 127, 467–479. [Google Scholar] [CrossRef]
  173. Ong, C.H.; Bahar, T. Factors Influencing Project Management Effectiveness in the Malaysian Local Councils. Int. J. Manag. Proj. Bus. 2019, 12, 1146–1164. [Google Scholar] [CrossRef]
  174. Zhang, J.; Yuan, J.; Mahmoudi, A.; Ji, W.; Fang, Q. A Data-Driven Framework for Conceptual Cost Estimation of Infrastructure Projects Using XGBoost and Bayesian Optimization. J. Asian Archit. Build. Eng. 2025, 24, 751–774. [Google Scholar] [CrossRef]
  175. Olawale, Y.A.; Sun, M. Cost and Time Control of Construction Projects: Inhibiting Factors and Mitigating Measures in Practice. Constr. Manag. Econ. 2010, 28, 509–526. [Google Scholar] [CrossRef]
  176. Amer, F.; Jung, Y.; Golparvar-Fard, M. Transformer Machine Learning Language Model for Auto-Alignment of Long-Term and Short-Term Plans in Construction. Autom. Constr. 2021, 132, 103929. [Google Scholar] [CrossRef]
  177. Pal, A.; Lin, J.J.; Hsieh, S.H.; Golparvar-Fard, M. Automated Vision-Based Construction Progress Monitoring in Built Environment through Digital Twin. Dev. Built Environ. 2023, 16, 100247. [Google Scholar] [CrossRef]
  178. Sánchez-Rodríguez, A.; Soilán, M.; Hussain, O.A.I.; Moehler, R.C.; Walsh, S.D.C.; Ahiaga-Dagbui, D.D. Minimizing Cost Overrun in Rail Projects through 5D-BIM: A Systematic Literature Review. Infrastructures 2023, 8, 93. [Google Scholar] [CrossRef]
  179. Chen, Q.; Hall, D.M.; Adey, B.T.; Haas, C.T. Identifying Enablers for Coordination across Construction Supply Chain Processes: A Systematic Literature Review. Eng. Constr. Archit. Manag. 2020, 28, 1083–1113. [Google Scholar] [CrossRef]
  180. Chou, J.-S.; Irawan, N.; Pham, A.-D. Project Management Knowledge of Construction Professionals: Cross-Country Study of Effects on Project Success. J. Constr. Eng. Manag. 2013, 139, 04013015. [Google Scholar] [CrossRef]
  181. Jahangoshai Rezaee, M.; Yousefi, S.; Chakrabortty, R.K. Analysing Causal Relationships between Delay Factors in Construction Projects: A Case Study of Iran. Int. J. Manag. Proj. Bus. 2021, 14, 412–444. [Google Scholar] [CrossRef]
  182. Aldossari, K.M. Client Project Managers’ Knowledge and Skill Competencies for Managing Public Construction Projects. Results Eng. 2024, 24, 103546. [Google Scholar] [CrossRef]
  183. Alshihri, S.; Al-Gahtani, K.; Almohsen, A. Risk Factors That Lead to Time and Cost Overruns of Building Projects in Saudi Arabia. Buildings 2022, 12, 902. [Google Scholar] [CrossRef]
  184. Bajjou, M.S.; Chafi, A. Empirical Study of Schedule Delay in Moroccan Construction Projects. Int. J. Constr. Manag. 2020, 20, 783–800. [Google Scholar] [CrossRef]
  185. Alotaibi, A.; Edum-Fotwe, F.; Price, A.D.F. Critical Barriers to Social Responsibility Implementation within Mega-Construction Projects: The Case of the Kingdom of Saudi Arabia. Sustainability 2019, 11, 1755. [Google Scholar] [CrossRef]
  186. Alotaibi, S.; Martinez-Vazquez, P.; Baniotopoulos, C. Mega-Projects in Construction: Barriers in the Implementation of Circular Economy Concepts in the Kingdom of Saudi Arabia. Buildings 2024, 14, 1298. [Google Scholar] [CrossRef]
  187. Othayman, M.B.; Meshari, A.; Mulyata, J.; Debrah, Y. The Challenges Confronting the Delivery of Training and Development Programs in Saudi Arabia: A Critical Review of Research. Am. J. Ind. Bus. Manag. 2020, 10, 1611–1639. [Google Scholar] [CrossRef]
  188. Mosly, I. Construction Cost-Influencing Factors: Insights from a Survey of Engineers in Saudi Arabia. Buildings 2024, 14, 3399. [Google Scholar] [CrossRef]
  189. Alragabah, Y.A.; Ahmed, M. Impact Assessment of Critical Success Factors (CSFs) in Public Construction Projects of Saudi Arabia. Front. Eng. Built Environ. 2024, 4, 184–195. [Google Scholar] [CrossRef]
  190. Mahamid, I. Micro and Macro Level of Dispute Causes in Residential Building Projects: Studies of Saudi Arabia. J. King Saud Univ.—Eng. Sci. 2016, 28, 12–20. [Google Scholar] [CrossRef]
  191. Alotaibi, N.O.; Sutrisna, M.; Chong, H.-Y. Guidelines of Using Project Management Tools and Techniques to Mitigate Factors Causing Delays in Public Construction Projects in Kingdom of Saudi Arabia. J. Eng. Proj. Prod. Manag. 2016, 6, 90. [Google Scholar] [CrossRef]
  192. Alahmadi, N.; Alghaseb, M. Challenging Tendering-Phase Factors in Public Construction Projects—A Delphi Study in Saudi Arabia. Buildings 2022, 12, 924. [Google Scholar] [CrossRef]
  193. Sanni-Anibire, M.O.; Mohamad Zin, R.; Olatunji, S.O. Causes of Delay in the Global Construction Industry: A Meta Analytical Review. Int. J. Constr. Manag. 2022, 22, 1395–1407. [Google Scholar] [CrossRef]
  194. Alsugair, A.M. Cost Deviation Model of Construction Projects in Saudi Arabia Using PLS-SEM. Sustainability 2022, 14, 16391. [Google Scholar] [CrossRef]
  195. Yap, J.B.H.; Lim, B.L.; Skitmore, M.; Gray, J. Criticality of Project Knowledge and Experience in the Delivery of Construction Projects. J. Eng. Des. Technol. 2022, 20, 800–822. [Google Scholar] [CrossRef]
  196. Mahmoodzadeh, A.; Nejati, H.R.; Mohammadi, M. Optimized Machine Learning Modelling for Predicting the Construction Cost and Duration of Tunnelling Projects. Autom. Constr. 2022, 139, 104305. [Google Scholar] [CrossRef]
  197. Alghaseb, M.; Alali, H. Exploring Project Management Office Models for Public Construction Projects in Hail, Saudi Arabia. Sustainability 2024, 16, 8177. [Google Scholar] [CrossRef]
  198. Alghuried, A. Assessing the Critical Success Factors for the Sustainable Construction Project Management of Saudi Arabia. J. Asian Archit. Build. Eng. 2025, 24, 1–19. [Google Scholar] [CrossRef]
  199. Ahmad, S.; Aftab, F.; Eltayeb, T.; Siddiqui, K. Identifying Critical Success Factors for Construction Projects in Saudi Arabia. E3S Web Conf. 2023, 371, 02047. [Google Scholar] [CrossRef]
  200. Chen, G.X.; Shan, M.; Chan, A.P.C.; Liu, X.; Zhao, Y.Q. Investigating the Causes of Delay in Grain Bin Construction Projects: The Case of China. Int. J. Constr. Manag. 2019, 19, 1–14. [Google Scholar] [CrossRef]
  201. Derakhshanalavijeh, R.; Teixeira, J.M.C. Cost Overrun in Construction Projects in Developing Countries, Gas-Oil Industry of Iran as a Case Study. J. Civ. Eng. Manag. 2017, 23, 125–136. [Google Scholar] [CrossRef]
  202. Mashali, A.; Elbeltagi, E.; Motawa, I.; Elshikh, M. Stakeholder Management Challenges in Mega Construction Projects: Critical Success Factors. J. Eng. Des. Technol. 2023, 21, 358–375. [Google Scholar] [CrossRef]
  203. Ingle, P.V.; Mahesh, G. Construction Project Performance Areas for Indian Construction Projects. Int. J. Constr. Manag. 2022, 22, 1443–1454. [Google Scholar] [CrossRef]
  204. Alenazi, E.; Adamu, Z.; Al-Otaibi, A. Exploring the Nature and Impact of Client-Related Delays on Contemporary Saudi Construction Projects. Buildings 2022, 12, 880. [Google Scholar] [CrossRef]
  205. Allahaim, F.S.; Liu, L. Causes of Cost Overruns on Infrastructure Projects in Saudi Arabia. Int. J. Collab. Enterp. 2015, 5, 32–57. [Google Scholar] [CrossRef]
  206. Mahamid, I. Relationship between Delay and Productivity in Construction Projects. Int. J. Adv. Appl. Sci. 2022, 9, 160–166. [Google Scholar] [CrossRef]
  207. Caldas, C.; Gupta, A. Critical Factors Impacting the Performance of Mega-Projects. Eng. Constr. Archit. Manag. 2017, 20, 920–934. [Google Scholar] [CrossRef]
  208. Scott-Young, C.M.; Georgy, M.; Grisinger, A. Shared Leadership in Project Teams: An Integrative Multi-Level Conceptual Model and Research Agenda. Int. J. Proj. Manag. 2019, 37, 565–581. [Google Scholar] [CrossRef]
  209. Yunpeng, G.; Zaman, U. Exploring Mega-Construction Project Success in China’s Vaunted Belt and Road Initiative: The Role of Paternalistic Leadership, Team Members’ Voice and Team Resilience. Eng. Constr. Archit. Manag. 2024, 31, 3801–3825. [Google Scholar] [CrossRef]
  210. Al-Gahtani, K.; Alsugair, A.; Alsanabani, N.; Alabduljabbar, A.; Almutairi, B. Forecasting Delay-Time Model for Saudi Construction Projects Using DEMATEL–SD Technique. Int. J. Constr. Manag. 2024, 24, 1225–1239. [Google Scholar] [CrossRef]
  211. Abougamil, R.A.; Thorpe, D.; Heravi, A. An Investigation of BIM Advantages in Analysing Claims Procedures Related to the Extension of Time and Money in the KSA Construction Industry. Buildings 2024, 14, 426. [Google Scholar] [CrossRef]
  212. Igwe, U.S.; Mohamed, S.F.; Dzahir, M.A.M.; Yusof, Z.M.; Khiyon, N.A. Towards a Framework of Automated Resource Model for Post Contract Cost Control of Construction Projects. Int. J. Constr. Manag. 2022, 22, 3088–3097. [Google Scholar] [CrossRef]
  213. Li, Y.; Liu, C. Applications of Multirotor Drone Technologies in Construction Management. Int. J. Constr. Manag. 2019, 19, 401–412. [Google Scholar] [CrossRef]
  214. Omar, T.; Nehdi, M.L. Data Acquisition Technologies for Construction Progress Tracking. Autom. Constr. 2016, 70, 143–155. [Google Scholar] [CrossRef]
  215. Felemban, H.; Sohail, M.; Ruikar, K. Exploring the Readiness of Organisations to Adopt Artificial Intelligence. Buildings 2024, 14, 2460. [Google Scholar] [CrossRef]
Figure 1. Key theoretical, methodological, and contextual contributions of the study. PCS: Project Control System; IPO: Input-Process-Output Model; PLS-SEM: Partial Least Squares Structural Equation Modelling; IPMA: Importance-Performance Map Analysis.
Figure 2. Project control system determinants (PCSDs) of construction project delivery. (Adapted from [7]).
Figure 3. Inputs–Process–Outputs-based conceptual model.
Figure 4. Research process, methods, and outcomes.
Figure 5. Improvement priority quadrants.
Figure 6. Structural model β-values and R2 values.
Figure 7. IPMA results according to Smart-PLS.
Figure 8. Importance–performance map (construct level) of the target construct project performance (PP). Note: The dashed lines indicate average importance and performance, dividing the map into four quadrants as explained in Figure 5.
Figure 9. Importance–performance map (indicator level) of the target construct project performance (PP). Note: The dashed lines indicate average importance and performance, dividing the map into four quadrants as explained in Figure 5.
Figure 10. Evaluation of PCS determinants and prioritisation of improvement areas (PLS-SEM and IPMA).
Table 1. Summary of hypotheses.
Hypothesis | Relationship | Abbreviations
H1a | Organisational PCS determinants → Pre-operational control determinants | OPCSD → Pre-OCD
H1b | Organisational PCS determinants → In-operational control determinants | OPCSD → In-OCD
H1c | Organisational PCS determinants → Post-operational control determinants | OPCSD → Post-OCD
H1d | Organisational PCS determinants → Uncertainty control determinants | OPCSD → UOCD
H1e | Organisational PCS determinants → Human PCS determinants | OPCSD → HPCSD
H2a | Human PCS determinants → Pre-operational control determinants | HPCSD → Pre-OCD
H2b | Human PCS determinants → In-operational control determinants | HPCSD → In-OCD
H2c | Human PCS determinants → Post-operational control determinants | HPCSD → Post-OCD
H2d | Human PCS determinants → Uncertainty control determinants | HPCSD → UOCD
H2e | Human PCS determinants → Technological PCS determinants | HPCSD → TPCSD
H3a | Technological PCS determinants → Pre-operational control determinants | TPCSD → Pre-OCD
H3b | Technological PCS determinants → In-operational control determinants | TPCSD → In-OCD
H3c | Technological PCS determinants → Post-operational control determinants | TPCSD → Post-OCD
H3d | Technological PCS determinants → Uncertainty control determinants | TPCSD → UOCD
H4 | Pre-operational control determinants → Project performance | Pre-OCD → PP
H5 | In-operational control determinants → Project performance | In-OCD → PP
H6 | Post-operational control determinants → Project performance | Post-OCD → PP
H7 | Uncertainty control determinants → Project performance | UOCD → PP
H8 | Pre-operational control determinants → In-operational control determinants | Pre-OCD → In-OCD
H9 | In-operational control determinants → Post-operational control determinants | In-OCD → Post-OCD
Note: Hypotheses were tested using Partial Least Squares Structural Equation Modelling (PLS-SEM) with operational control serving as the process (mediator) construct within the Input–Process–Output (IPO) framework.
Table 2. Project performance indicators of the output construct.
Construct | Code/Item | Items for Construct | Sources
Project performance (PP) | PP1 | The project adhered to its predetermined schedule | [8,30,70,71,80,81,82,83]
 | PP2 | The project adhered to its predetermined budget
 | PP3 | The project adhered to its predetermined scope
Table 3. Indicators of organisational, human, and technological determinants of the PCS input constructs.
Construct | PCSD | Code/Item | Items for Construct | Sources
Organisational PCS determinant (OPCSD) | Top management support (TMS) | TMS1 | The sufficiency of resource allocations provided by TM | [40,42,48,84,85,86,87,88,89]
 | | TMS2 | Promptness of top management in decision-making and changes
 | | TMS3 | Top management fostering trust and a successful project culture
 | | TMS4 | Top management's understanding and commitment to PC principles
 | Oversight and governance programme (OGP) | OGP1 | Clarity of the project's organisational structure | [8,30,40,42,90,91,92,93,94,95,96,97,98]
 | | OGP2 | Clarity of established project control procedures
 | | OGP3 | Engagement of PMO/independent peer reviews
 | | OGP4 | Audit and evaluation frequency for project control processes
 | | OGP5 | Application of knowledge and lessons learned for process improvement
 | Stakeholder engagement and coordination (SEC) | SEC1 | Application of stakeholder engagement assessment | [5,42,48,88,90,99,100,101,102,103,104,105,106,107,108]
 | | SEC2 | Stakeholders' understanding of their roles in PC engagement
 | | SEC3 | Interaction level among stakeholders
 | | SEC4 | Consistency and integration of PC systems across stakeholders
Human PCS determinant (HPCSD) | Teamwork and collaboration (TWC) | TWC1 | Team members' commitment to their roles, responsibilities, and alignment with project objectives | [5,8,30,40,41,42,83,90,91,92,93,99,100,106,109]
 | | TWC2 | Team collaboration in information sharing, task coordination, decision-making, and open communication
 | | TWC3 | Trust levels and conflict management between team members
 | | TWC4 | Transparency and integrity in information reporting and sharing
 | Leadership and team skills (LTS) | LTS1 | Team members' knowledge and expertise level in their respective areas | [5,8,30,40,41,42,83,89,91,94,107]
 | | LTS2 | Team members' skills, including technical proficiency, critical thinking, and task independence
 | | LTS3 | Manager's technical and managerial capability
 | | LTS4 | Ability of leaders in guiding, motivating, and managing team members
Technological PCS determinant (TPCSD) | Technological competency (TC) | TC1 | Extent to which advanced technologies such as data analytics are integrated into project control processes | [30,65,69,100,108,109,110,111,112,113,114,115,116,117,118,119,120,121]
 | | TC2 | Use of automated and sophisticated devices for real-time project monitoring
 | | TC3 | Utilisation of advanced prediction and forecasting tools, such as artificial intelligence (AI) and machine learning
 | | TC4 | The use of Building Information Modelling (BIM) technology and infographic visualisation tools
 | | TC5 | The adaptation of the common data environment for information management within the project
Table 4. Indicators of the operational control determinants (OCDs) of the PCS process constructs.
Construct | PCSD | Code/Item | Items for Construct | Sources
Pre-operational control determinant (Pre-OCD) | Project planning and scheduling (PPS) | PPS1 | Clarity of the project scope | [5,30,40,42,43,48,72,107,122]
 | | PPS2 | Accuracy of the Work Breakdown Structure (WBS) in capturing the project scope
 | | PPS3 | Comprehensiveness and detail of resource allocation for project activities
 | | PPS4 | Efficacy of estimated project duration, costs, and resources in meeting project goals
 | | PPS5 | Project milestones and scheduled activities in terms of clarity, relevance, accuracy, and realism
 | | PPS6 | The alignment between the established performance measurement metrics and the project scope and objectives
In-operational control determinant (In-OCD) | Project progress tracking (PPT) | PPT1 | Frequency of physical inspections and reviews to track project progress on site | [5,30,40,41,43,82,105,122,123,124]
 | | PPT2 | Extent of tracking budget expenditures and payments
 | | PPT3 | Frequency of tracking the project schedule activities and milestone completions
 | | PPT4 | Regular monitoring of project deliverables against requirements, specifications, and quality benchmarks
 | | PPT5 | Extent of tracking resource allocation, usage, and productivity throughout the project
 | | PPT6 | Efficacy of project tracking in updating project progress documentation and logs
 | Project performance analysis (PPA) | PPA1 | Comparing actual project trends to planned ones using techniques such as Earned Value Analysis (EVA) and assessing variance measurements (CV and SV) | [2,5,30,40,43,48,72,108,111,112,113,115,118,122,123,125,126,127,128,129,130,131,132,133,134,135,136,137]
 | | PPA2 | Forecasting project completion time and total cost
 | | PPA3 | Reviewing critical and near-critical path activities
 | | PPA4 | Conducting financial analysis and benefit–cost evaluation
 | | PPA5 | Accuracy of performance measurement and KPIs for detecting project deviations
Post-operational control determinant (Post-OCD) | Project communication and reporting (PCR) | PCSR1 | Clarity of communication plan in exchanging data and distribution of information | [5,29,30,40,41,42,65,72,92,99,100,105,107,108,109,114,122,128,131,137]
 | | PCSR2 | Frequency of updating project performance reports
 | | PCSR3 | The accessibility, clarity, and accuracy of project reports
 | | PCSR4 | Availability of real-time data for decision-making
 | Corrective actions (CAs) | CA1 | Frequency of conducting review meetings for project performance and corrective actions | [5,30,40,41,43,48,95,97,113,122,126,127,132,134,136,138]
 | | CA2 | Responsiveness in resolving critical issues, such as scope creep, budget overruns, and schedule deviations
 | | CA3 | Frequency of applying resource optimisation in correcting the project pathway
 | | CA4 | Frequency of employing fast tracking and schedule-crashing techniques
Uncertainty operational control determinant (UOCD) | Change control (CC) | CC1 | The adherence to established change control procedures and policies | [5,28,29,30,43,72,107,122,139,140,141,142]
 | | CC2 | Timeliness of change approvals or rejections, ensuring they align with project needs
 | | CC3 | The tracking and monitoring of change requests from submission to resolution
 | | CC4 | Efficacy of the change review and implementation process
 | Risk control (RC) | RC1 | Frequency of identifying potential risks and their root causes | [20,30,40,72,81,88,102,113,122,138,143,144,145]
 | | RC2 | Adherence to risk monitoring and documentation, including risk registers and action plans
 | | RC3 | Allocation level of contingency and reserve funds for risk response
 | | RC4 | Efficacy of risk mitigation strategies in reducing or eliminating identified risks
Table 5. Respondent and project profile.
Variable | Category | Number | Percentage (%) | Cumulative Percentage (%)
Years of experience | 11–15 years | 38 | 17.1 | 17.1
 | 16–20 years | 47 | 21.2 | 38.3
 | 20 and above | 76 | 34.2 | 72.5
 | 6–10 years | 32 | 14.4 | 86.9
 | Less than 6 years | 29 | 13.1 | 100.0
Respondent's position | Construction manager level | 26 | 11.7 | 11.7
 | Director and executive level | 8 | 3.6 | 15.3
 | Project manager level | 146 | 65.8 | 81.1
 | Contract manager | 2 | 0.9 | 82.0
 | Other | 36 | 16.2 | 98.2
 | Project controls manager | 4 | 1.8 | 100.0
Project type | Commercial | 29 | 13.1 | 13.1
 | Industrial | 17 | 7.7 | 20.7
 | Infrastructure | 62 | 27.9 | 48.6
 | Residential | 68 | 30.6 | 79.3
 | Services | 46 | 20.7 | 100.0
Project sector | Other | 2 | 0.9 | 0.9
 | Private | 103 | 46.4 | 47.3
 | Public | 117 | 52.7 | 100.0
Organisation role | Client | 75 | 33.8 | 33.8
 | Consultant | 64 | 28.8 | 62.6
 | Contractor | 72 | 32.4 | 95.0
 | Sub-Contractor | 11 | 5.0 | 100.0
Project delivery method | Design-Bid-Build (DBB) | 78 | 35.1 | 35.1
 | Design-Build (DB) | 39 | 17.6 | 52.7
 | Construction Manager at Risk (CMAR) | 8 | 3.6 | 56.3
 | Integrated Project Delivery (IPD) | 94 | 42.3 | 98.6
 | Other | 3 | 1.4 | 100.0
Duration of project | 1 to 12 months | 46 | 20.7 | 20.7
 | 13 to 18 months | 42 | 18.9 | 39.6
 | 19 to 24 months | 46 | 20.7 | 60.4
 | 25 to 36 months | 68 | 30.6 | 91.0
 | 37 months and above | 20 | 9.0 | 100.0
Project budget | SAR 100–499 M | 59 | 26.6 | 26.6
 | SAR 30–99.99 M | 36 | 16.2 | 42.8
 | SAR 500 M+ | 38 | 17.1 | 59.9
 | SAR 6–29.99 M | 43 | 19.4 | 79.3
 | <SAR 6 M | 46 | 20.7 | 100.0
Note: The approximate exchange rates at the time of reporting were 1 USD = 3.75 SAR and 1 GBP = 4.75 SAR. For the sake of comparison, 100,000,000 SAR was approximately equivalent to USD 26.7 million or GBP 21 million.
Table 6. Item loadings, reliability, and convergent validity results.
Construct | Code/Item | Factor Loading | Composite Reliability | Cronbach's Alpha | AVE
Project performance (PP) | PP1 | 0.847 | 0.806 | 0.808 | 0.721
 | PP2 | 0.870
 | PP3 | 0.829
Organisational PCS determinants (OPCSD) | TMS1 | 0.723 | 0.948 | 0.950 | 0.618
 | TMS2 | 0.806
 | TMS3 | 0.832
 | TMS4 | 0.852
 | OGP1 | 0.745
 | OGP2 | 0.757
 | OGP3 | 0.710
 | OGP4 | 0.722
 | OGP5 | 0.759
 | SEC1 | 0.764
 | SEC2 | 0.818
 | SEC3 | 0.854
 | SEC4 | 0.852
Human PCS determinants (HPCSD) | TWC1 | 0.847 | 0.940 | 0.941 | 0.705
 | TWC2 | 0.857
 | TWC3 | 0.856
 | TWC4 | 0.790
 | LTS1 | 0.868
 | LTS2 | 0.881
 | LTS3 | 0.830
 | LTS4 | 0.780
Technological PCS determinants (TPCSD) | TC1 | 0.883 | 0.923 | 0.927 | 0.764
 | TC2 | 0.904
 | TC3 | 0.851
 | TC4 | 0.859
 | TC5 | 0.874
Pre-operational control PCS determinants (Pre-OCD) | PPS1 | 0.824 | 0.925 | 0.926 | 0.727
 | PPS2 | 0.859
 | PPS3 | 0.851
 | PPS4 | 0.817
 | PPS5 | 0.879
 | PPS6 | 0.883
In-operational control determinants (In-OCD) | PPT1 | 0.825 | 0.949 | 0.949 | 0.670
 | PPT2 | 0.819
 | PPT3 | 0.821
 | PPT4 | 0.844
 | PPT5 | 0.806
 | PPT6 | 0.797
 | PPA1 | 0.806
 | PPA2 | 0.819
 | PPA3 | 0.833
 | PPA4 | 0.809
 | PPA5 | 0.825
Post-operational control determinants (Post-OCD) | PCSR1 | 0.845 | 0.935 | 0.936 | 0.687
 | PCSR2 | 0.853
 | PCSR3 | 0.862
 | PCSR4 | 0.842
 | CA1 | 0.806
 | CA2 | 0.821
 | CA3 | 0.836
 | CA4 | 0.762
Uncertainty operational control determinants (UOCD) | CC1 | 0.795 | 0.944 | 0.945 | 0.720
 | CC2 | 0.846
 | CC3 | 0.847
 | CC4 | 0.851
 | RC1 | 0.845
 | RC2 | 0.875
 | RC3 | 0.854
 | RC4 | 0.870
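As a rough consistency check on Table 6, the average variance extracted (AVE) is conventionally the mean of the squared standardised loadings of a construct's indicators. Using the reported PP loadings as an example:

\[
\mathrm{AVE}_{PP} = \frac{0.847^{2} + 0.870^{2} + 0.829^{2}}{3} \approx \frac{0.717 + 0.757 + 0.687}{3} \approx 0.720,
\]

which agrees with the reported value of 0.721 within rounding. The composite reliability and Cronbach's alpha columns follow their own standard formulas and are not re-derived here.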
Table 7. Discriminant validity statistics using the heterotrait–monotrait ratio (HTMT) matrix.
Construct | HPCSD | In-OCD | OPCSD | PP | Post-OCD | Pre-OCD | TPCSD | UOCD
HPCSD
In-OCD | 0.828
OPCSD | 0.851 | 0.840
PP | 0.655 | 0.643 | 0.672
Post-OCD | 0.815 | 0.891 | 0.785 | 0.695
Pre-OCD | 0.787 | 0.839 | 0.783 | 0.681 | 0.789
TPCSD | 0.684 | 0.746 | 0.659 | 0.545 | 0.658 | 0.695
UOCD | 0.775 | 0.875 | 0.793 | 0.660 | 0.793 | 0.805 | 0.749
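For reference, the HTMT ratio reported in Table 7 is commonly defined, following Henseler et al. [161], as the mean correlation between the indicators of two different constructs divided by the geometric mean of the average correlations among indicators of the same construct:

\[
\mathrm{HTMT}_{ij} = \frac{\dfrac{1}{K_i K_j}\sum_{g=1}^{K_i}\sum_{h=1}^{K_j} r_{ig,jh}}{\sqrt{\dfrac{2}{K_i(K_i-1)}\sum_{g<h} r_{ig,ih}\cdot\dfrac{2}{K_j(K_j-1)}\sum_{g<h} r_{jg,jh}}}.
\]

Values below the commonly cited 0.90 threshold are generally taken to indicate discriminant validity; the largest value in Table 7 (0.891 for In-OCD and Post-OCD) sits just under this cut-off.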
Table 8. Collinearity statistics (inner model).
Constructs | Variance Inflation Factor (VIF)
HPCSD → In-OCD | 3.419
HPCSD → Post-OCD | 3.466
HPCSD → Pre-OCD | 3.157
HPCSD → TPCSD | 1.000
HPCSD → UOCD | 3.157
In-OCD → PP | 4.394
In-OCD → Post-OCD | 3.777
OPCSD → HPCSD | 1.000
OPCSD → In-OCD | 3.293
OPCSD → Post-OCD | 3.597
OPCSD → Pre-OCD | 2.986
OPCSD → UOCD | 2.986
Post-OCD → PP | 3.630
Pre-OCD → In-OCD | 2.703
Pre-OCD → PP | 2.978
TPCSD → In-OCD | 1.942
TPCSD → Post-OCD | 2.066
TPCSD → Pre-OCD | 1.790
TPCSD → UOCD | 1.790
UOCD → PP | 3.556
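For context, the variance inflation factor reported in Table 8 is conventionally computed for each predictor construct j of a given endogenous construct as

\[
\mathrm{VIF}_j = \frac{1}{1 - R_j^{2}},
\]

where R_j^2 is obtained by regressing that predictor on the other predictors of the same endogenous construct. Values below 5 (ideally below 3.3) are commonly taken to indicate that collinearity is not critical; the largest value in Table 8 is 4.394 (In-OCD → PP).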
Table 9. Significance testing of the structural model.
H | Relationships | Path Coefficient (B) | T Statistics (|O/STDEV|) | p Values | Significance (p < 0.05) | Decision
H1a | OPCSD → Pre-OCD | 0.337 | 4.294 | 0.000 | Significant | Supported
H1b | OPCSD → In-OCD | 0.304 | 4.686 | 0.000 | Significant | Supported
H1c | OPCSD → Post-OCD | 0.071 | 0.876 | 0.190 | Insignificant | Not supported
H1d | OPCSD → UOCD | 0.372 | 4.562 | 0.000 | Significant | Supported
H1e | OPCSD → HPCSD | 0.805 | 31.122 | 0.000 | Significant | Supported
H2a | HPCSD → Pre-OCD | 0.311 | 3.709 | 0.000 | Significant | Supported
H2b | HPCSD → In-OCD | 0.196 | 2.692 | 0.004 | Significant | Supported
H2c | HPCSD → Post-OCD | 0.246 | 3.049 | 0.001 | Significant | Supported
H2d | HPCSD → UOCD | 0.222 | 2.868 | 0.002 | Significant | Supported
H2e | HPCSD → TPCSD | 0.643 | 13.863 | 0.000 | Significant | Supported
H3a | TPCSD → Pre-OCD | 0.237 | 3.822 | 0.000 | Significant | Supported
H3b | TPCSD → In-OCD | 0.201 | 3.226 | 0.001 | Significant | Supported
H3c | TPCSD → Post-OCD | −0.003 | 0.069 | 0.473 | Insignificant | Not supported
H3d | TPCSD → UOCD | 0.326 | 4.452 | 0.000 | Significant | Supported
H4 | Pre-OCD → PP | 0.258 | 2.199 | 0.014 | Significant | Supported
H5 | In-OCD → PP | −0.098 | 0.751 | 0.226 | Insignificant | Not supported
H6 | Post-OCD → PP | 0.344 | 3.150 | 0.001 | Significant | Supported
H7 | UOCD → PP | 0.207 | 2.321 | 0.010 | Significant | Supported
H8 | Pre-OCD → In-OCD | 0.291 | 4.115 | 0.000 | Significant | Supported
H9 | In-OCD → Post-OCD | 0.594 | 8.577 | 0.000 | Significant | Supported
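The t statistics in Table 9 follow directly from the bootstrapping output, as the column heading |O/STDEV| indicates: the original path estimate divided by the standard deviation of the bootstrap estimates,

\[
t = \frac{\lvert \hat{\beta} \rvert}{\mathrm{SD}_{\mathrm{boot}}(\hat{\beta})}.
\]

For example, for H6 (Post-OCD → PP) the implied bootstrap standard error is roughly 0.344 / 3.150 ≈ 0.109.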
Table 10. Predictive relevance Q2 of endogenous constructs.
Constructs | Q2 Predict | RMSE | MAE
HPCSD | 0.645 | 0.601 | 0.461
In-OCD | 0.632 | 0.613 | 0.476
PP | 0.331 | 0.826 | 0.653
Post-OCD | 0.542 | 0.685 | 0.513
Pre-OCD | 0.533 | 0.690 | 0.540
TPCSD | 0.364 | 0.805 | 0.642
UOCD | 0.558 | 0.671 | 0.503
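The prediction metrics in Table 10 follow the usual PLSpredict definitions; as a reminder (not a re-derivation of the study's figures), for N held-out cases with actual values y_i and predictions ŷ_i they are commonly written as

\[
\mathrm{RMSE} = \sqrt{\tfrac{1}{N}\sum_{i=1}^{N}(y_i - \hat{y}_i)^2}, \qquad
\mathrm{MAE} = \tfrac{1}{N}\sum_{i=1}^{N}\lvert y_i - \hat{y}_i\rvert, \qquad
Q^2_{\mathrm{predict}} = 1 - \frac{\sum_i (y_i - \hat{y}_i)^2}{\sum_i (y_i - \bar{y}_{\mathrm{train}})^2}.
\]

Q2 Predict values above zero, as obtained for all constructs in Table 10, indicate that the model predicts better than the naive training-sample mean.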
Table 11. The statistical outcomes of the effect size (f2).
Constructs | f-Square | Effect Size
HPCSD → In-OCD | 0.048 | Small
HPCSD → Post-OCD | 0.067 | Small
HPCSD → Pre-OCD | 0.083 | Small
HPCSD → TPCSD | 0.706 | Substantial
HPCSD → UOCD | 0.048 | Small
In-OCD → PP | 0.003 | Small
In-OCD → Post-OCD | 0.357 | Substantial
OPCSD → HPCSD | 1.845 | Substantial
OPCSD → In-OCD | 0.120 | Medium
OPCSD → Pre-OCD | 0.103 | Medium
OPCSD → Post-OCD | 0.005 | Small
OPCSD → UOCD | 0.140 | Medium
Post-OCD → PP | 0.057 | Small
Pre-OCD → In-OCD | 0.134 | Medium
Pre-OCD → PP | 0.039 | Small
TPCSD → In-OCD | 0.089 | Small
TPCSD → Pre-OCD | 0.085 | Small
TPCSD → UOCD | 0.180 | Medium
TPCSD → Post-OCD | 0.001 | Small
UOCD → PP | 0.021 | Small
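The effect sizes in Table 11 follow Cohen's f2 [166,167], which measures the change in explained variance of an endogenous construct when a predictor is omitted from the model:

\[
f^2 = \frac{R^2_{\mathrm{included}} - R^2_{\mathrm{excluded}}}{1 - R^2_{\mathrm{included}}}.
\]

Conventional benchmarks treat 0.02, 0.15, and 0.35 as small, medium, and large (here labelled substantial) effects, respectively, although the qualitative labels in Table 11 reflect the authors' own interpretation.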
Table 12. The indicators of the top priority area for improvement of the target construct project performance (PP).
Indicators | Importance | Performance | Priority
LTS2 | 0.056 | 56.907 | 1
PPS4 | 0.055 | 59.009 | 2
SEC4 | 0.054 | 60.135 | 3
LTS1 | 0.051 | 59.009 | 4
LTS3 | 0.050 | 57.958 | 5
SEC2 | 0.050 | 60.360 | 6
SEC1 | 0.048 | 57.995 | 7
CA4 | 0.047 | 58.108 | 8
OGP5 | 0.047 | 57.658 | 9
OGP4 | 0.046 | 58.446 | 10
OGP3 | 0.041 | 58.333 | 11
Mean | 0.040 | 60.678
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
