Article

A Cloud-Driven Framework for Automated BIM Quantity Takeoff and Quality Control: Case Study Insights

by Mojtaba Valinejadshoubi 1,*, Osama Moselhi 2, Ivanka Iordanova 3, Fernando Valdivieso 1, Ashutosh Bagchi 2, Charles Corneau-Gauvin 1 and Armel Kaptué 1

1 Innovation Team, Pomerleau Inc., 500 Rue Saint-Jacques, Suite 300, Montreal, QC H2Y 0A2, Canada
2 Department of Building, Civil and Environmental Engineering, Concordia University, Montreal, QC H3G 1M8, Canada
3 Department of Construction Engineering, École de Technologie Supérieure (ÉTS), Montreal, QC H3C 1K3, Canada
* Author to whom correspondence should be addressed.
Buildings 2025, 15(21), 3942; https://doi.org/10.3390/buildings15213942
Submission received: 18 September 2025 / Revised: 14 October 2025 / Accepted: 23 October 2025 / Published: 1 November 2025

Abstract

Accurate quantity takeoff (QTO) is essential for cost estimation and project planning in the construction industry. However, current practices are often fragmented and rely on manual or semi-automated processes, leading to inefficiencies and errors. This study introduces a cloud-based framework that integrates automated QTO with a rule-based Quantity Precision Check (QPC) to ensure that quantities are derived only from validated and consistent BIM data. The framework is designed to be scalable and compatible with open data standards, supporting collaboration across teams and disciplines. A case study demonstrates the implementation of the system using structural and architectural models, where automated validation detected parameter inconsistencies and significantly improved the accuracy and reliability of takeoff results. To evaluate the system’s effectiveness, the study proposes five quantitative validation metrics: Inconsistency Detection Rate (IDR), Parameter Consistency Rate (PCR), Quantity Accuracy Improvement (QAI), Change Impact Tracking (CIT), and Automated Reporting Efficiency (ARE). These indicators are newly introduced in this study to address the absence of standardized metrics for automated QTO with pre-takeoff, rule-based validation. However, the current validation was limited to a single project and discipline-specific rule set, suggesting that broader testing across mechanical, electrical, and infrastructure models is needed to fully confirm scalability and generalizability. The proposed approach provides both researchers and practitioners with a replicable, transparent methodology for advancing digital construction practices and improving the quality and efficiency of BIM-based estimation processes.

1. Introduction

Quantity takeoff (QTO) is a cornerstone of construction processes, serving as the foundation for essential activities such as cost estimation, workload planning, cost management, procurement, and construction scheduling [1,2]. The precision of QTO directly impacts a contractor’s ability to maintain financial stability, as inaccuracies can lead to cost overruns, resource misallocations, and project delays. Traditionally, QTO was conducted manually, relying on human interpretation of 2D drawings, a method fraught with inefficiencies and prone to errors [3]. As construction projects grow increasingly complex, traditional methods have become insufficient, prompting the need for advanced systems to streamline processes, reduce costs, and enhance accuracy [4].
Building Information Modeling (BIM) has transformed the architecture, engineering, and construction (AEC) industry by introducing intelligent, data-rich models that facilitate decision-making, enhance collaboration, and improve project management [3,5,6]. Among its diverse applications, BIM’s ability to parameterize and automatically manipulate model data has the potential to revolutionize QTO processes. BIM-based QTO enables the extraction of geometric and semantic data from models, automating the generation of detailed and accurate bills of quantities. Compared to manual methods, BIM-based QTO offers superior efficiency, accuracy, and cost-effectiveness [7,8,9,10], which is especially beneficial during early design phases where reliable cost estimates are crucial for informed decision-making [8].
Despite these advantages, BIM-based QTO faces significant challenges. The accuracy of quantities derived from BIM models depends heavily on the quality and completeness of the model, including its geometric representations and associated metadata [11]. Deficiencies such as incomplete details, inconsistent modeling practices, and misaligned parameters often lead to deviations in QTO outputs [12,13]. For instance, Khosakitchalert [13] reported discrepancies in quantities for walls and floors ranging from −10.78% to 43.18%, and 0.00% to 19.09%, respectively. These issues are compounded by the limitations of existing systems, which often fail to fully automate QTO workflows or accommodate the diverse requirements of complex projects [3].
The adoption of the Industry Foundation Classes (IFC) data schema has alleviated some interoperability issues, but it also has limitations. Critical data required for accurate QTO can be lost during IFC transformations, affecting output reliability [14,15]. Furthermore, the information stored in BIM models is often insufficient for comprehensive QTO, necessitating enhanced integration to minimize data loss and improve accuracy [16].
Several studies have attempted to address these challenges. Vassen [17] analyzed the capabilities of BIM-based QTO systems in delivering accurate and reliable cost estimates, emphasizing their potential to streamline workflows while addressing data integrity issues. Plebankiewicz [18] developed a BIM-based QTO and cost estimation system but identified significant limitations in detecting inconsistencies within models. Zima [19] explored how modeling methods and data impact QTO accuracy, while Choi et al. [16] proposed a prototype system to improve QTO reliability during the design phase. Vitásek and Matejka [20] demonstrated the potential of automating QTO and cost estimation for transport infrastructure projects, illustrating how BIM models can significantly reduce labor-intensive tasks.
Research by Eroglu [21] revealed that many BIM-based tools are inadequate for the advanced data manipulations required for reliable quantity calculations. Incomplete or inaccurate BIM models often lead to quantities that are inadequate, excessive, or incorrect [3,12]. Sherafat et al. [22] demonstrated that leveraging programming interfaces to automate QTO processes can significantly enhance accuracy while addressing interoperability issues. Similarly, Vitásek and Matejka [20] highlighted the scalability offered by linking BIM models with external databases to enable robust and adaptable QTO solutions.
Recent advancements have pushed the boundaries of automation. Sherafat et al. [22] showed how programming interfaces reduce inconsistencies in material and geometry data, while Valinejadshoubi et al. [23] introduced a framework capable of automating both quantity extraction and validation for structural and architectural elements. Although effective in real-world scenarios, their framework relied on local databases, limiting scalability and real-time collaboration. Building on this foundation, the current research introduces a cloud-based architecture to address these limitations, enabling real-time collaboration, dynamic visualization, and enhanced data accessibility [20,22,23].
Although BIM-based QTO adoption is growing, several critical research gaps remain. First, most commercial systems, such as Autodesk Takeoff, Navisworks, and CostX, are closed and proprietary, focusing primarily on quantity extraction without providing transparent, customizable mechanisms for automated quality control of model parameters prior to takeoff. Second, prior academic studies often treat QTO and model quality control as separate processes, resulting in fragmented workflows that increase the risk of errors. Third, there is a lack of standardized, quantitative metrics for evaluating the accuracy, consistency, and responsiveness of QTO systems, making it difficult to benchmark performance across projects. Finally, while cloud-based collaboration is increasingly vital for multi-user environments, current solutions offer limited scalability and poor interoperability with open data standards such as IFC.
To address these gaps, this study is guided by the following research questions:
  • How can QTO and QPC processes be integrated into a unified, cloud-based workflow to improve accuracy and efficiency?
  • What standardized metrics can be developed to evaluate the reliability and responsiveness of automated QTO systems?
  • How can open data standards and cloud-based collaboration enhance the scalability and generalizability of QTO frameworks across diverse projects?
This research contributes to the body of knowledge by introducing a model-agnostic QTO framework that integrates QPC directly into BIM workflows. Unlike prior studies reliant on local environments and predefined scripts, the proposed system leverages scalable cloud infrastructure for multi-stakeholder collaboration. It introduces novel performance metrics for QTO precision and responsiveness, providing objective measures for assessing model quality. By supporting open formats such as IFC and incorporating dynamic discrepancy detection across design phases, the framework enhances interoperability and reliability. In particular, we introduce five new, domain-specific validation metrics (IDR, PCR, QAI, CIT, and ARE) because no standardized set currently exists for automated QTO combined with pre-takeoff rule-based validation. Formal definitions are provided later to support replication. These contributions collectively advance the automation, transparency, and efficiency of BIM-based QTO processes, bridging the gap between academic research and industry practice.
The overall discussion highlights that the developed framework effectively unifies QTO and QPC in a transparent and measurable way, outperforming current commercial and research-based systems by integrating automated validation, real-time collaboration, and open-standard data exchange. It emphasizes that while the approach demonstrates strong potential in detecting inconsistencies and improving accuracy, its current validation on a single project and reliance on manually defined rules indicate areas for future development. Broader benchmarking, integration with international standards such as ISO 19650 and IFC 4.3, and the inclusion of AI-based anomaly detection are identified as next steps toward expanding the system’s scalability and generalizability across building and infrastructure projects.

2. Literature Review

BIM has been widely adopted across multiple domains, including energy efficiency [24,25], seismic risk assessment [26], Structural Health Monitoring (SHM) [27,28,29], operational monitoring [30,31], and QTO. Its integration enhances project planning and execution through intelligent, data-rich models [8]. Traditionally, QTO was manually conducted, with quantities extracted from 2D drawings and validated manually, a process that is time-intensive, error-prone, and inefficient [32]. Vassen [17] highlighted these inefficiencies and demonstrated how BIM-based QTO significantly improves speed and accuracy. By enabling automatic extraction of element quantities, BIM has made QTO faster, more reliable, and less labor-intensive [33,34].
Early research investigated BIM integration into QTO workflows. Plebankiewicz et al. [18] developed a BIM-based system for quantity and cost estimation, showing its potential to streamline workflows but also noting issues with inconsistent parameters. Zima [19] examined how modeling methods and data structures affect QTO accuracy, showing that poor modeling practices lead to discrepancies. Choi et al. [16] proposed a prototype leveraging Open BIM principles to improve estimation reliability during early design phases. Vitásek and Matejka [20] explored QTO automation for transport infrastructure, demonstrating scalability for large and complex projects. Vieira et al. [34] introduced a semi-automated workflow for architectural elements but lacked mechanisms for model validation. These efforts marked important progress but were limited in interoperability and data quality assurance.
The accuracy of BIM-based QTO relies on the completeness and quality of BIM models, including geometry, material definitions, and parameter consistency [35]. Khosakitchalert et al. [12] found that insufficient detail in compound elements like walls and floors leads to significant inaccuracies. Kim et al. [36] identified discrepancies arising from inconsistent material naming conventions and geometric data and proposed strategies to reduce them, particularly for interior components, demonstrating their impact throughout the project lifecycle [37]. Golaszewska and Salamak [38] highlighted challenges such as interoperability issues, inconsistent element descriptions, and poor team communication, emphasizing the need for standardized protocols. Smith [39] stressed the importance of integrating cost management and QTO through 5D BIM to bridge estimation and planning.
Recent research has advanced domain-specific automation and algorithmic approaches to improve QTO accuracy and efficiency. Cepni and Akcamete [40] developed an automated QTO system for formwork calculations, which showed reductions in manual errors but was limited to a single domain. Taghaddos et al. [41] leveraged application programming interfaces (APIs) to enhance automation, while Sherafat et al. [22] addressed interoperability issues by improving cross-platform data extraction accuracy. Fürstenberg et al. [42] developed an automated QTO system for a Norwegian road project, demonstrating BIM’s potential in infrastructure-scale projects while addressing challenges of complex geometries and alignment with construction requirements. Han et al. [43] proposed a 3D model-based QTO system to reduce manual workloads but faced interoperability limitations. Yang et al. [44] introduced the Automation Extraction Method (AEM), achieving an error of less than 1% in controlled environments, though scalability for complex projects remains uncertain. Akanbi and Zhang [45] presented IFC-based algorithms for volumetric and areal extractions independent of proprietary software, though handling complex geometries remains a challenge. Pham et al. [46] proposed a BIM-based framework for automatic daily extraction of concrete and formwork quantities, integrating QTO with schedules to optimize material planning and reduce waste.
Regional variations in construction standards create additional challenges for QTO accuracy. Chen et al. [47] introduced a BIM-based QTO Code Mapping (BQTCM) method to align BIM data with local classification systems, reducing information loss. Similarly, Liu et al. [48] developed a knowledge model-based framework embedding semantic rules and standardized measurement methods into BIM models to achieve automated, code-compliant QTO with minimal manual intervention. These advances emphasize the importance of aligning BIM with regional and national standards.
Commercial model-checking tools, such as Solibri Model Checker v25, provide validation functions but rely heavily on IFC for data exchange, leading to data loss and limiting rule customization [49]. Eilif [49] and Seib [50] highlighted the need for flexible, project-specific rule sets to overcome these limitations. Khosakitchalert et al. [12] improved QTO accuracy for compound elements but did not address broader issues of data alignment. Alathamneh et al. [51] studied the adoption of 5D BIM, reporting productivity benefits but identifying barriers like limited technical expertise and complex software interfaces that hinder implementation.
Some studies have moved toward collaborative, scalable solutions. Valinejadshoubi et al. [23] presented a framework capable of both quantity extraction and validation for structural and architectural models. Although a significant improvement, its reliance on local databases limits scalability and real-time multi-user collaboration. Alathamneh et al. [51] conducted a systematic review of BIM-based QTO research, identifying persistent gaps in interoperability, scalability, and workforce training. They proposed a conceptual model for sustainable QTO adoption, emphasizing open, cloud-based solutions.
Overall, substantial progress has been made in automating QTO and improving model reliability, yet significant challenges persist. Errors in geometric data, material definitions, and parameters continue to compromise accuracy, making integrated validation essential. Interoperability between proprietary systems remains limited, and reliance on IFC often causes information loss. Few studies introduce standardized performance metrics to evaluate QTO precision and responsiveness, limiting benchmarking and improvement. Most existing systems operate locally, restricting scalability and real-time collaboration. Finally, commercial tools remain rigid, offering limited customization for project-specific needs. These gaps demonstrate the need for a cloud-integrated QTO framework that unifies quantity extraction and automated validation, supports real-time collaboration, and introduces standardized metrics for transparent, reliable QTO processes.

3. Research Methodology

This study develops a comprehensive and fully automated framework for accurate QTO and QPC using advanced tools and cloud-based technologies. Building upon the authors’ earlier work [23], this framework introduces several major enhancements, including cloud-based data storage, real-time visualization, and improved automation through the Feature Manipulation Engine (FME). Unlike the earlier workflow [23], which was based on Excel and Dynamo, a visual programming and computational design tool used for automation [52], the updated system leverages a cloud-hosted MySQL database for scalability, real-time collaboration, and automated quality validation.
Figure 1 illustrates the overall research methodology and system architecture, showing the integration of BIM models, automated data extraction and analysis, cloud-based storage, and interactive visualization dashboards. The framework consists of four main modules:
  • BIM Models: The source for QTO and QPC.
  • Data Extraction and Analysis Module: To automate parameter extraction and consistency checks using FME.
  • Data Storage Module: To organize validated data in a cloud-based MySQL database.
  • Data Visualization Module: To provide real-time dashboards for QTO tracking and validation through Power BI.
Structural and architectural BIM models, containing essential QTO parameters, were provided by the engineering teams and stored on ProjectWise, a cloud-based collaboration platform, ensuring real-time accessibility and eliminating versioning issues. The framework processes these models through an automated workflow, significantly improving both efficiency and scalability.
The Data Extraction and Analysis Module, developed using FME, replaces Dynamo scripts from previous research [23]. FME provides a more scalable and flexible approach, featuring a graphical interface for visually configuring workflows and integrating directly with cloud repositories such as ProjectWise. This eliminates manual interventions and ensures dynamic data synchronization. Within this module, key QTO parameters are extracted, including identification parameters (e.g., Family Name, Type Name, Assembly Description, and Type Mark) and quantity parameters (e.g., Material Name, Material Model, Area, and Volume). These are automatically sorted, filtered, and validated to generate precise calculations of steel weights and concrete volumes, accounting for allowances such as a 15% connection factor for steel and a 5% waste factor for concrete.
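For illustration, the following minimal Python sketch shows how such aggregated quantities with allowance factors could be computed. The element fields and the weight-times-length calculation are simplified assumptions rather than the exact formulas of [23], which the FME workflow implements; the allowance factors are the defaults noted above and can be overridden per project.

```python
# Illustrative aggregation of extracted QTO parameters; field names and the
# weight-times-length calculation are simplified assumptions, not the FME workflow.

STEEL_CONNECTION_FACTOR = 0.15  # 15% allowance for connections and small plates
CONCRETE_WASTE_FACTOR = 0.05    # 5% waste allowance

def total_steel_weight_kg(elements, factor=STEEL_CONNECTION_FACTOR):
    """Sum nominal weight (kg/m) times cut length (m), then add the connection allowance."""
    base = sum(e["nominal_weight_kg_per_m"] * e["cut_length_m"] for e in elements)
    return base * (1 + factor)

def total_concrete_volume_m3(elements, factor=CONCRETE_WASTE_FACTOR):
    """Sum element volumes (m3), then add the waste allowance."""
    base = sum(e["volume_m3"] for e in elements)
    return base * (1 + factor)

if __name__ == "__main__":
    steel = [{"nominal_weight_kg_per_m": 60.0, "cut_length_m": 6.0},
             {"nominal_weight_kg_per_m": 45.5, "cut_length_m": 4.2}]
    concrete = [{"volume_m3": 12.4}, {"volume_m3": 8.7}]
    print(round(total_steel_weight_kg(steel), 1), "kg of steel")
    print(round(total_concrete_volume_m3(concrete), 2), "m3 of concrete")
```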
A core advancement in this framework is the automated QPC mechanism, which detects missing or inconsistent attributes before the QTO process. The QPC module was designed using a structured, rule-based approach informed by three sources:
  • Industry Standards: ISO 19650 for BIM information management, Level of Development (LOD) and Level of Information (LOI) matrices, and measurement conventions such as NRM and CSI Master Format.
  • Prior Literature: Studies on rule-based model checking and automated validation frameworks [22,49].
  • Expert Consultation: Iterative workshops with project engineers, estimators, and BIM managers to align rules with practical QTO workflows.
The rules are organized into three groups:
  • Completeness Rules: To identify missing critical parameters, e.g., Nominal Weight for steel or Volume for concrete components.
  • Consistency Rules: To verify logical alignment, such as ensuring Volume = Area × Thickness for walls or checking material types against element classifications.
  • Classification Rules: To enforce correct categorization, such as ensuring that Worksets containing “STE” include only structural steel elements.
These rules are stored in a SQL-based repository, making them easily extensible to other disciplines (e.g., mechanical, electrical, and plumbing). Validation is performed using FME tools like Tester, Sampler, and AttributeValidator, which dynamically evaluate model elements and flag missing, redundant, or inconsistent parameters. During transfer to MySQL, SQL triggers apply sequential rules to prevent critical errors from propagating downstream.
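To make the rule logic concrete, a minimal Python sketch of the three rule groups is given below; the field names, tolerance, and rule bodies are simplified assumptions, whereas the production workflow expresses equivalent checks through FME transformers and SQL triggers.

```python
# Minimal sketch of the three QPC rule groups described above; the field names,
# tolerance, and rule bodies are simplified assumptions, not the production rules.

def check_completeness(e):
    """Completeness: flag missing critical parameters."""
    issues = []
    if e.get("material") == "steel" and e.get("nominal_weight") is None:
        issues.append("missing Nominal Weight")
    if e.get("material") == "concrete" and e.get("volume") is None:
        issues.append("missing Volume")
    return issues

def check_consistency(e, tol=0.01):
    """Consistency: e.g., Volume should roughly equal Area x Thickness for walls."""
    issues = []
    if e.get("category") == "wall" and None not in (e.get("volume"), e.get("area"), e.get("thickness")):
        if abs(e["volume"] - e["area"] * e["thickness"]) > tol * e["volume"]:
            issues.append("Volume inconsistent with Area x Thickness")
    return issues

def check_classification(e):
    """Classification: 'STE' worksets should contain only structural steel."""
    issues = []
    if "STE" in e.get("workset", "") and e.get("material") != "steel":
        issues.append("non-steel element in a steel workset")
    return issues

def run_qpc(elements):
    """Return a dictionary mapping element IDs to their flagged issues."""
    report = {}
    for e in elements:
        flags = check_completeness(e) + check_consistency(e) + check_classification(e)
        if flags:
            report[e["id"]] = flags
    return report
```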
Validated quantities and quality check results are automatically organized in MySQL, categorized by material type, structural element, and project phase. This centralized database enables efficient multi-project data management and seamless integration with visualization tools. Table 1 lists the types of architectural and structural materials considered in this study.
The Data Visualization Module connects Power BI directly to the live MySQL database. This eliminates the need for manual Excel uploads and provides real-time dashboards for tracking design changes, material trends, and QPC results. Stakeholders can interactively filter data by material type, project phase, or structural category to view total quantities, breakdowns by segment, flagged inconsistencies, and historical changes for cost control and decision-making.
Figure 2 illustrates how these four modules are integrated into a seamless workflow.
  • The process begins with cloud-hosted BIM models (Module 1).
  • FME extracts and validates parameters (Module 2).
  • The processed data is stored in MySQL (Module 3).
  • Power BI dynamically visualizes the validated quantities (Module 4).
By automating data flow across these modules, the framework delivers a fully connected, scalable, and transparent QTO/QPC process. This approach overcomes the limitations of the earlier semi-automated method [23], significantly reducing manual effort, improving data integrity, and enabling real-time collaboration across stakeholders. Ultimately, this framework enhances the efficiency, accuracy, and reliability of construction cost estimation processes.

4. Framework Architecture

The proposed framework was applied to a real construction project in Canada to demonstrate its functions and validate its performance. It integrates four main modules: (1) BIM Models, (2) Data Extraction and Analysis, (3) Data Storage, and (4) Data Visualization. Figure 3 presents the structural and architectural BIM models used in this case study.

4.1. Development of Case Study

Architectural and structural BIM models were developed in Autodesk Revit 2025 at LOD 300, ensuring sufficient detail for accurate QTO calculations. These models were stored in ProjectWise, a cloud-based repository that facilitates real-time access and prevents versioning conflicts. This framework extracts data directly from cloud-hosted models, ensuring data consistency across all stakeholders.
The framework extracts essential geometric and identity parameters, including Nominal Weight, Length, Cut Length, Volume, Area, Material Area, and Thickness, as well as metadata such as Material Name, Material Model Name, Family Name, Type Name, Type Mark, and Assembly Description. Its flexible design supports adding project-specific parameters for customized validation or inconsistency checks.

4.2. Data Extraction and Analysis Module

The Data Extraction and Analysis Module is the core component of the framework, responsible for automating data retrieval, validation, and computation of material quantities. This module is implemented in FME, replacing the Dynamo-based workflows from previous research [23]. FME offers direct integration with ProjectWise, enabling seamless, real-time access to models while reducing manual intervention.
As shown in Figure 4, the process for structural models includes the following:
  • Filtering and Grouping:
    Structural elements are filtered by category (columns, framings, foundations, floors, and walls).
    Grouped by Workset to distinguish steel and concrete elements.
    Grouped by phase (e.g., new vs. existing construction).
  • Parameter Extraction:
    Steel: Nominal Weight, Length, Cut Length, and Structural Material.
    Concrete: Volume, Structural Material, and Thickness (for composite floors).
  • Automated QPC Validation:
    Rule check for completeness and consistency (e.g., Nominal Weight missing for steel, Volume mismatched for concrete).
    Implemented using FME transformers like Tester, Sampler, and AttributeValidator.
  • Quantity Calculation:
    Steel quantities are calculated in kilograms and concrete in cubic meters, using formulas from Valinejadshoubi et al. [23].
    Aggregated totals provide complete QTO results.
Figure 5 illustrates the workflow for architectural elements:
  • Filtering and Grouping:
    Categorized into walls, floors, ceilings, and roofs, and grouped by phase.
  • Parameter Extraction:
    Family Type Name and Assembly Description for drywalls, precast concrete, curtain walls, and roof membranes.
    Material Name and Material Model Name for ceilings and tiling.
    Type Mark exclusively for curtain walls.
    Area and Material Area for surface-related calculations.
  • Automated QPC Validation:
    Detects discrepancies, e.g., mismatch between Family Type Name and Assembly Description for concrete masonry units.
  • Aggregation:
    Quantities aggregated by material type and construction phase for comprehensive architectural QTO results.
FME transformers such as FeatureReader, StatisticsCalculator, and AttributeSplitter ensure efficient data handling. These workflows extract parameters from BIM models, validate them using rule-based logic, and transfer verified data seamlessly to the MySQL database.

4.3. Data Storage Module

The Data Storage Module uses a cloud-based MySQL Azure (RTM) v12 database, replacing the local Excel-based storage from earlier workflows [23]. This upgrade significantly improves scalability, ensures real-time updates, and provides structured tables categorized by material type, structural component, and project phase. Automated synchronization eliminates the need for manual updates, ensuring that the data remains current as the BIM models evolve. In addition to dynamic BIM-derived quantities, static reference data such as bidding quantities and 2D QTO estimates are also stored in the database to enable comparative analysis and support cost control tracking.
The system’s advanced filtering, sorting, and querying capabilities allow stakeholders to generate detailed reports and track design changes over time. For example, quantities can be segmented by material type (e.g., steel or concrete), construction phase (e.g., new or existing), or project discipline (e.g., structural or architectural). This level of detail provides valuable insights for decision-making and facilitates effective project management. Furthermore, the MySQL database is directly connected to Power BI, enabling seamless and live visualization of key metrics such as total quantities, discrepancies, and cost summaries without the need for periodic manual uploads.
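As an illustration of this querying capability, the following sketch retrieves a phase-and-material breakdown from the cloud database; the table and column names are assumptions for this example and may differ from the actual schema.

```python
# Hypothetical query against the cloud MySQL database; the table and column names
# are assumptions based on the parameters described in the text.
import mysql.connector  # pip install mysql-connector-python

BREAKDOWN_QUERY = """
    SELECT material, created_phase, SUM(volume) AS total_volume_m3
    FROM concrete_elements
    WHERE workset_name LIKE '%CON%'
    GROUP BY material, created_phase
    ORDER BY total_volume_m3 DESC;
"""

def fetch_concrete_breakdown(host, user, password, database):
    """Return (material, phase, total volume) rows segmented as described above."""
    conn = mysql.connector.connect(host=host, user=user, password=password, database=database)
    try:
        cursor = conn.cursor()
        cursor.execute(BREAKDOWN_QUERY)
        return cursor.fetchall()
    finally:
        conn.close()
```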

4.4. Data Visualization Module

The Data Visualization Module, developed in Microsoft Power BI, provides real-time and interactive dashboards for stakeholders. Unlike previous workflows that relied on periodic manual updates, this module continuously synchronizes with the MySQL database to ensure that the visualized data is always current. The dashboards present comprehensive visualizations of quantities, discrepancies, and trends, offering an intuitive and dynamic interface for monitoring project performance.
One of the core visualization features is the comparison of total quantities across different sources, including BIM-based estimates, 2D drawing-derived quantities, and bidding estimates. These comparisons are displayed using clustered bar charts, allowing stakeholders to easily identify variations and discrepancies between different estimation methods. Another feature, the “Percent of Budgeted Quantities,” highlights whether the quantities derived from BIM models align with budgetary constraints established during the bidding process. Significant deviations are automatically flagged for review, helping project managers take corrective actions in a timely manner.
The dashboard also includes design change tracking functionality, supported by time-based filters such as month and year. This feature allows stakeholders to observe the effects of design updates on material quantities over time, making it easier to correlate changes with specific project decisions. By visualizing trends and potential risks, the module enables proactive decision-making, minimizing the likelihood of budget overruns and resource inefficiencies. Overall, the Data Visualization Module significantly enhances the transparency, automation, and interactivity of the framework compared to earlier semi-automated methods [23].
As illustrated earlier in Figure 2, cloud-hosted BIM models serve as the initial data source, feeding directly into the FME automation pipeline where parameters are extracted, validated, and processed. The validated data then flows into the MySQL database for structured storage, which is dynamically connected to Power BI dashboards for real-time visualization and reporting.
By integrating QTO and QPC into a unified, cloud-based workflow, this framework automates data handling across all stages, from extraction to visualization. It enhances collaboration among multi-stakeholder teams by providing a centralized and transparent data environment, while also improving traceability by linking QTO outputs with automated quality validation processes. Moreover, the framework addresses key scalability and performance gaps identified in earlier workflows [23], offering a robust solution capable of handling complex projects and multi-user environments.
This integrated approach represents a significant advancement over traditional semi-automated methods by reducing manual effort, improving accuracy, and enabling real-time monitoring of project quantities. Ultimately, it provides a powerful decision-support tool for construction cost estimation, material tracking, and quality control, supporting efficient and data-driven project management practices.

5. Framework Implementation

The framework consists of four interconnected modules: BIM Models, Data Extraction and Analysis, Data Storage, and Data Visualization. This section demonstrates its application to a real construction project in Canada, validating its performance and practical utility. Cloud-hosted architectural and structural BIM models were provided by the engineering team, ensuring real-time access and collaboration. The integrated framework, driven by FME automation, delivers scalable and efficient QTO and QPC outputs to support estimation, cost control, and procurement workflows.
For structural elements, the workflow begins by selecting relevant categories such as structural columns, framings, foundations, walls, and floors from the BIM model stored in ProjectWise. The system filters elements by material type to isolate steel and concrete components. Workset naming conventions (e.g., “STE” for steel, “CON” for concrete) are used to ensure precise grouping and classification. For steel elements, key parameters such as Element ID, Nominal Weight, Element Name, Length, and Cut Length are extracted. For concrete elements, parameters such as Element ID, Volume, Element Name, and Thickness are retrieved.
A major improvement of this framework is the fully automated QPC module, which is integrated directly into the FME workflow. This enables real-time error detection prior to quantity calculations and eliminates the need for manual quality checks. Examples of QPC validations include the following:
  • Identifying steel columns or framings with missing Nominal Weight values.
  • Detecting structural concrete columns, walls, floors, framings, or foundations with missing Volume data.
When inconsistencies are detected, a structured report is automatically generated and emailed to the engineering team. The report lists flagged elements with their IDs, families, and Workset, enabling quick corrections in the BIM model. Once updated, the workflow is re-run to confirm that all elements are correctly defined before proceeding with final quantity calculations. After validation, the framework calculates total steel and concrete quantities. Adjustment factors are applied to account for small steel plates, connections, and waste, typically 15% for steel and 5% for concrete, though these values can be modified to reflect project-specific estimation requirements. For this case study, the final validated quantities were 255 tons of steel and 1453 m3 of concrete, ensuring that all components were included and consistent with the project’s LOD and LOI standards.
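A simplified sketch of the automated notification step is shown below; the report fields follow the description above, while the SMTP host and addresses are placeholders rather than the project’s actual configuration.

```python
# Illustrative notification step for flagged elements; the SMTP host, addresses,
# and element fields are placeholders, not the project's actual configuration.
import smtplib
from email.message import EmailMessage

def build_report(flagged):
    """Format flagged elements as the structured report described above."""
    lines = ["Object ID | Family Name | Workset | Issue"]
    lines += [f"{e['id']} | {e['family']} | {e['workset']} | {e['issue']}" for e in flagged]
    return "\n".join(lines)

def send_qpc_report(flagged, sender, recipients, smtp_host="smtp.example.com"):
    """Email the report to the engineering team so the model can be corrected and re-run."""
    msg = EmailMessage()
    msg["Subject"] = f"QPC report: {len(flagged)} flagged elements"
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)
    msg.set_content(build_report(flagged))
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)
```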
The full FME automation process is shown in Figure 6, which provides an overview of the integrated QTO/QPC workflow. The process begins with reading BIM data directly from ProjectWise, followed by automated parameter extraction, validation, and structured data transfer to the cloud-based MySQL database. To ensure accuracy, the QPC workflow developed within FME is illustrated in Figure 7. Critical parameters, such as Nominal Weight for steel and Volume for concrete, are continuously monitored. Inconsistencies, such as missing values or material mismatches (e.g., steel elements incorrectly tagged as concrete), are automatically flagged. Flagged elements are highlighted in red within the workflow, and their details, such as Object ID, Family Name, Family Type Name, Workset Name, and Category, are recorded in a report for the engineering team. For example, three steel elements missing Nominal Weight values were identified and flagged during testing. These automated checks ensure that only verified and complete data proceed to the quantity calculation stage, greatly improving reliability and efficiency.
The same principles are applied to architectural BIM elements. The framework extracts parameters such as Family Type Name, Assembly Description, Material Name, Material Model Name, Type Mark, Area, and Material Area, depending on the element category (e.g., walls, floors, ceilings, and roof membranes).
An example of QPC validation is illustrated by Concrete Masonry Unit (CMU) walls, where quantities are cross-checked using multiple parameters to confirm consistency. For this project, the CMU wall quantities were consistent across all parameters, totaling 1090 m2. However, in the precast architectural concrete category, discrepancies were detected. The Assembly Description parameter recorded only 339 m2, while calculations based on Family Type Name and Material Name produced 580 m2. Investigation revealed that nine precast elements lacked an Assembly Description, meaning they were excluded from calculations using that parameter. The framework flagged these issues and provided a list of missing elements, allowing for prompt updates to the BIM model. After corrections were applied, the quantities were re-calculated to ensure accuracy.
Table 2 summarizes the QTO outputs for each architectural category. Most categories, such as CMU and curtain walls, show perfect consistency across parameters. However, the highlighted discrepancies in precast architectural concrete quantities emphasize the framework’s ability to identify problematic data and ensure quality. As shown in Table 2, the quantity calculated using the Assembly Description parameter (339 m2) significantly deviates from the quantities derived using Family Type Name (580 m2) and Material Name (580 m2). This inconsistency, highlighted in red, represents a 41.5% deviation, and indicates missing or incorrect parameter values in the BIM model. Such discrepancies require manual review and correction to ensure data integrity.
Additionally, the “N/A” entries in the table denote parameters that were not utilized in the workflow for calculating the corresponding category’s quantity. For instance, the Material Model Name and Type Mark parameters were not applied to categories like Glazed Barriers, Glazed Railings, or Roof Membrane, as these parameters were not relevant for those specific workflows.
Validated QTO data are automatically stored in the cloud-based MySQL database, which serves as a centralized repository for project data. This eliminates versioning issues and provides scalable, real-time access for stakeholders. Figure 8 shows the MySQL table structure for concrete elements. It includes identity and quantity parameters such as Object ID, Category, Family Name, Workset Name, Created Phase, Material, Structural Length, Volume, and Thickness. The structured format allows efficient querying and advanced analytics, supporting seamless integration with Power BI for visualization.
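For reference, a hypothetical table definition mirroring the concrete-element fields listed above is sketched below; the column names and data types are assumptions, not the exact schema shown in Figure 8.

```python
# Hypothetical DDL mirroring the concrete-element fields listed above; the column
# names and data types are assumptions, not the exact schema of Figure 8.
CREATE_CONCRETE_TABLE = """
CREATE TABLE IF NOT EXISTS concrete_elements (
    object_id          BIGINT PRIMARY KEY,
    category           VARCHAR(64),
    family_name        VARCHAR(128),
    workset_name       VARCHAR(64),
    created_phase      VARCHAR(32),
    material           VARCHAR(128),
    structural_length  DECIMAL(10,3),  -- metres
    volume             DECIMAL(12,3),  -- cubic metres
    thickness          DECIMAL(8,3)    -- metres
);
"""
# The statement could be executed once through the same MySQL connection used for
# loading validated rows (e.g., cursor.execute(CREATE_CONCRETE_TABLE)).
```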
The MySQL database connects directly to Power BI, allowing for real-time visualization of QTO and QPC outputs without manual uploads. Stakeholders such as estimation, procurement, and cost control teams can access interactive dashboards that display key performance metrics and enable detailed exploration of quantities and quality checks.
Figure 9 shows the total steel quantity of 205.1 tons, divided into structural framing (78.66%, 612 elements) and structural columns (21.34%, 166 elements). The total concrete quantity of 1432 m3 is distributed among columns (32.5%), walls (25.94%), floors (18.44%), framings (15.31%), and foundations (7.81%). Interactive filters, such as Family Name and Workset Name, allow users to isolate specific elements, for example, columns in the Workset “43-STE-COLUMN” or “43-CON-WALL.”
Figure 10 presents historical trends in quantities. From April to December, concrete decreased from 1452 m3 to 1419 m3 (a 2.27% reduction), while steel dropped from 211.5 tons to 196.7 tons (a 7.0% reduction). This feature provides valuable insights into how design changes affect quantities, helping teams anticipate cost and schedule impacts.
Figure 11 visualizes QPC results. For this project, three steel elements were identified with missing Nominal Weight, representing 0.93% of total steel, while three concrete wall and floor elements were flagged with missing Structural Material, accounting for 0.13% of total concrete volume. The dashboard categorizes issues by Family Name, Workset Name, and Category, enabling rapid correction by the design team.
The framework supports industry standards such as New Rules of Measurement (NRM) and CSI Master Format. Unit conversions, classification codes, and element mappings can be configured to align with region-specific rules. Future enhancements will include localized rulesets to expand the framework’s international applicability.
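A possible configuration for such region-specific mappings is sketched below; the classification keys and conversion factors are illustrative assumptions rather than the framework’s built-in ruleset.

```python
# Illustrative configuration mapping element categories to CSI MasterFormat divisions
# and defining unit conversions; the mapping keys are assumptions for this sketch.
CLASSIFICATION_MAP = {
    "structural_concrete": "03 - Concrete",   # CSI MasterFormat Division 03
    "structural_steel": "05 - Metals",        # CSI MasterFormat Division 05
    "masonry_cmu": "04 - Masonry",            # CSI MasterFormat Division 04
}

UNIT_CONVERSIONS = {
    ("kg", "t"): 0.001,        # kilograms to metric tons
    ("m3", "yd3"): 1.30795,    # cubic metres to cubic yards
    ("m2", "ft2"): 10.7639,    # square metres to square feet
}

def convert(value, from_unit, to_unit):
    """Convert a quantity using the configured factors; raises KeyError if unmapped."""
    return value * UNIT_CONVERSIONS[(from_unit, to_unit)]
```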
The implementation of this framework demonstrated its ability to automate QPC, manage data in real time, and detect issues early. The system produced fully validated results: 255 tons of steel, 1453 m3 of concrete, and 1090 m2 of CMU walls. By linking QTO and QPC in a single, cloud-based workflow, the framework enhances transparency, collaboration, and decision-making while providing a scalable solution for complex construction projects.

6. Results and Validation Metrics

To evaluate the performance and reliability of the proposed automated QTO and QPC framework, a set of validation metrics was developed. These metrics assess the framework’s ability to detect inconsistencies, improve quantity accuracy, monitor design changes dynamically, and enhance reporting efficiency. The cloud-based integration of FME automation, real-time tracking, and interactive visualization provides significant improvements over traditional manual workflows.
In the absence of established indicators for automated QTO, this study introduces five quantitative metrics designed to be reproducible and model-agnostic. IDR quantifies completeness; PCR captures internal consistency; QAI expresses accuracy improvement as the relative change between initial and post-QPC quantities; CIT measures responsiveness to design change by normalizing quantity deltas over time; and ARE evaluates process efficiency as the percentage reduction in reporting/notification latency achieved by automation. Formal definitions and equations (Equations (1)–(5)) are provided in the following subsections to enable replication.

6.1. Inconsistency Detection Rate (IDR)

The Inconsistency Detection Rate (IDR) (Equation (1)) measures the percentage of BIM elements flagged with inconsistencies relative to the total elements processed. In this case study, the framework successfully identified three structural steel elements with a missing Nominal Weight parameter and multiple structural concrete elements (columns, framings, foundations, walls, and floors) with missing Volume values. These errors were automatically flagged before final QTO calculations, preventing incomplete or inaccurate quantity extractions. For the architectural model, three wall and floor elements (representing 0.13% of total concrete volume) were found with missing structural material data and corrected before final processing.
By automating this detection process, the framework significantly reduces manual effort and ensures that issues are identified early in the workflow. Unlike traditional manual inspections, FME provides a scalable and efficient quality control process that improves both accuracy and efficiency.
$$\mathrm{IDR} = \frac{\text{Number of flagged elements}}{\text{Total elements processed}} \times 100 \quad (1)$$

6.2. Parameter Consistency Rate (PCR)

The Parameter Consistency Rate (PCR) (Equation (2)) evaluates the consistency of QTO-related parameters across different extraction methods. A 41.5% deviation was detected in precast architectural concrete quantities due to missing or inconsistent parameters. For example, the Assembly Description parameter yielded 339 m2, while the Family Type Name and Material Name parameters each yielded 580 m2. This discrepancy demonstrates the need for robust validation to ensure reliability in QTO processes.
For other categories, such as Curtain Walls and Concrete Masonry Units (CMUs), the framework achieved a perfect 100% consistency rate, indicating that properly structured BIM parameters lead to accurate and repeatable automated quantity extraction.
$$\mathrm{PCR} = \left(1 - \frac{\text{Number of inconsistent elements}}{\text{Total elements checked}}\right) \times 100 \quad (2)$$

6.3. Quantity Accuracy Improvement (QAI)

The Quantity Accuracy Improvement (QAI) metric (Equation (3)) measures the enhancement in QTO precision achieved through automated validation.
Following QPC corrections, the final structural quantities were refined to 205.1 tons of steel and 1432 m3 of concrete, ensuring that missing or incorrect parameters did not distort calculations. By addressing inconsistencies before QTO computations, the framework prevents overestimations or underestimations that could lead to procurement delays or cost overruns.
The integration of FME automation eliminates the need for manual interventions, streamlining the entire workflow and improving confidence in the results. Accurate and reliable quantities support better decision-making in cost estimation and resource planning.
$$\mathrm{QAI} = \frac{\text{Corrected QTO values} - \text{Initial QTO values}}{\text{Initial QTO values}} \times 100 \quad (3)$$

6.4. Change Impact Tracking (CIT)

The Change Impact Tracking (CIT) metric (Equation (4)) quantifies how effectively the framework monitors the impact of design modifications over time. During the project, the framework tracked a reduction of 33 m3 (2.27%) in concrete quantities and 14.8 tons (7.0%) in steel quantities, demonstrating its ability to dynamically adjust to changes.
By automating the detection and visualization of these variations, stakeholders can make informed decisions based on the most up-to-date project data. This functionality improves material procurement planning, minimizes waste, and enhances transparency, as stakeholders can access live updates without risk of outdated information.
$$\mathrm{CIT} = \frac{\text{Tracked change in material quantities}}{\text{Initial quantity}} \times 100 \quad (4)$$

6.5. Automated Reporting Efficiency (ARE)

The Automated Reporting Efficiency (ARE) metric (Equation (5)) evaluates improvements in reporting speed achieved by replacing manual Excel-based communication with automated email notifications.
Whenever inconsistencies are detected, the framework automatically sends structured reports to engineers, listing flagged elements with their IDs, families, and missing parameters. This automation reduces reporting and response times by approximately 60%, ensuring that issues are addressed promptly and QTO workflows remain uninterrupted.
This improvement enhances collaboration between teams, supports more efficient model updates, and ensures that inconsistencies are resolved before they affect cost estimation and procurement.
$$\mathrm{ARE} = \frac{\text{Manual reporting time} - \text{Automated reporting time}}{\text{Manual reporting time}} \times 100 \quad (5)$$
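For replication, Equations (1)–(5) can be computed directly from element counts and timing data, as in the following sketch; the sample inputs loosely echo figures reported above but are intended only as placeholders.

```python
# Direct implementation of Equations (1)-(5); the sample values below loosely echo
# figures reported in this study but serve only as placeholders.

def idr(flagged: int, total: int) -> float:                  # Equation (1)
    return flagged / total * 100

def pcr(inconsistent: int, checked: int) -> float:           # Equation (2)
    return (1 - inconsistent / checked) * 100

def qai(corrected: float, initial: float) -> float:          # Equation (3)
    return (corrected - initial) / initial * 100

def cit(tracked_change: float, initial: float) -> float:     # Equation (4)
    return tracked_change / initial * 100

def are(manual_time: float, automated_time: float) -> float: # Equation (5)
    return (manual_time - automated_time) / manual_time * 100

if __name__ == "__main__":
    print(f"IDR = {idr(3, 778):.2f}%")        # flagged vs. processed elements
    print(f"PCR = {pcr(9, 120):.1f}%")        # inconsistent vs. checked elements
    print(f"QAI = {qai(580.0, 339.0):.1f}%")  # corrected vs. initial quantity
    print(f"CIT = {cit(33.0, 1452.0):.2f}%")  # tracked change vs. initial quantity
    print(f"ARE = {are(50.0, 20.0):.0f}%")    # manual vs. automated reporting time
```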
The proposed validation metrics were developed and tested on an active construction project, using live structural and architectural BIM models managed within the firm’s ongoing workflow. The results therefore represent real operational conditions rather than controlled laboratory tests. Although validation was limited to a single case, the same metric definitions are designed to be transferable across future projects and disciplines. Ongoing testing is now being extended to additional projects within the organization to further benchmark these indicators and refine their threshold parameters. This progressive validation approach ensures that the metrics remain both practical for industry application and scalable for academic replication.
The implementation of the proposed framework demonstrated its ability to detect parameter errors at early stages, preventing inaccurate QTO outputs and ensuring reliable results. A notable finding was the 41.5% deviation detected in precast architectural concrete quantities, emphasizing the necessity for robust parameter validation. Following automated corrections, the final validated quantities reached 205.1 tons of steel and 1432 m3 of concrete, significantly enhancing accuracy and confidence in the QTO process. In addition, the framework’s dynamic change-tracking capabilities improved transparency and supported proactive project management, while automated reporting reduced response times by approximately 60%, fostering better collaboration and more efficient model updates. Collectively, these results highlight the framework’s potential to transform BIM-based QTO and QPC workflows through automation, scalability, and real-time validation.

7. Discussion

The proposed framework integrates automated QTO and QPC into a single, cloud-based environment, addressing key challenges in ensuring accuracy and efficiency in BIM-based workflows. The case study results demonstrate the framework’s effectiveness in detecting parameter inconsistencies and improving QTO accuracy. To position these results in the broader context, Table 3 compares the capabilities of the proposed system with commonly used commercial tools. While Navisworks and CostX provide robust visualization and estimation features, they lack automated, customizable rule-based validation prior to quantity extraction. Solibri offers predefined quality checks but does not support user-defined rules or performance tracking through standardized metrics. The proposed framework uniquely combines QTO and QPC, leveraging open data standards and introducing novel metrics such as IDR and QAI to evaluate system performance. These features extend beyond the functionality of current commercial and research-based tools, providing both practical and academic value.
Unlike commercial systems such as Navisworks, Solibri, and CostX, which primarily focus on quantity extraction and visualization, this framework introduces a transparent, open, and customizable approach. Existing commercial tools lack mechanisms for rule-based validation of model parameters before quantity extraction, leading to potential propagation of errors in material quantities and classifications into cost estimates and procurement decisions. Moreover, these tools operate as proprietary systems, limiting opportunities for integration with open standards such as IFC or adaptation to project-specific needs.
Academic research has addressed aspects of these challenges, with some studies focusing on automated QTO processes [18,20] and others on model checking and validation workflows [22,49]. However, most of these efforts treat QTO and model quality control as separate workflows and rarely combine them within a unified, scalable, cloud-based framework. Furthermore, there has been little progress in developing systematic performance metrics to evaluate the accuracy, consistency, and responsiveness of automated QTO systems. This research advances the field by bridging these gaps: it integrates QTO and QPC into a single workflow while introducing novel quantitative metrics such as Inconsistency Detection Rate (IDR), Parameter Consistency Rate (PCR), and Quantity Accuracy Improvement (QAI). These contributions differentiate the framework from both commercial systems and prior academic approaches by providing a replicable and measurable methodology for improving BIM-based estimation.
The framework’s main contribution lies in its ability to ensure that quantities are extracted only from verified and consistent model data. A further advantage is its real-time collaboration capability, enabled through the cloud-hosted MySQL database and its live connection to Power BI. This architecture ensures that multiple stakeholders (estimators, designers, and project managers) can simultaneously access and visualize the same dataset without version conflicts or latency issues, so that all parties work with up-to-date information. As soon as an engineer updates the BIM model, the automated FME process extracts, validates, and pushes new data to the database, instantly refreshing the dashboard for all users. This continuous synchronization shortens feedback loops between design and estimation teams, allowing immediate correction of modeling errors, dynamic cost updates, and improved decision-making. In comparison with conventional local or file-based workflows, where data exchange often relies on manual exports, the proposed system enables continuous coordination, reduces the risk of outdated information, and supports transparent multi-user collaboration across different disciplines and locations.
In addition, the modular rule-based QPC engine is designed for flexibility, allowing users to extend and adapt the validation rules for different disciplines and project types without modifying the core system. To assess the generalizability and robustness of the QPC module, three evaluation dimensions are proposed:
  • Rule Coverage (RC): Measures the percentage of model elements checked by at least one rule, indicating completeness of validation. In this case study, 94.7% of elements were covered.
  • Rule Precision (RP): Evaluates the accuracy of flagged issues by comparing true errors with false positives. The current implementation achieved a precision of 91.5%.
  • Rule Extensibility (RE): Quantifies the ease of adding new rules without modifying the core framework. This was demonstrated by introducing six additional rules for MEP elements during testing, requiring no changes to the main QPC engine.
These metrics provide a systematic foundation for evaluating and expanding the rule-based QPC system beyond project-specific applications. By introducing standardized QTO performance metrics, the framework also provides researchers and practitioners with tools for objectively assessing the quality of QTO processes, which is a major advancement over existing methods that rely primarily on qualitative evaluations or proprietary software outputs.
Despite these strengths, several limitations must be acknowledged.
  • Validation was conducted on a single case study involving structural and architectural BIM models at LOD 300, which may not capture the complexities of other disciplines such as mechanical, electrical, and plumbing (MEP) or large-scale infrastructure projects.
  • The QPC rules were manually defined for this project, and while they are extensible, further development is needed to align them with international standards such as ISO 19650, IFC 4.3, and IDS templates.
  • The reliance on cloud infrastructure also introduces potential challenges related to data privacy, cybersecurity, and network reliability, which must be addressed for broader adoption.
  • User feedback was limited to an internal project team; larger-scale testing with diverse stakeholders is required to refine usability and assess organizational readiness for implementation.
  • Although these comparisons highlight conceptual differences, future research should include direct experimental benchmarking with commercial systems to quantify improvements in speed, accuracy, and scalability.
The design of the framework supports scalability and generalizability beyond the specific project studied. Its compatibility with open data formats such as IFC, Civil 3D, and Tekla allows integration with multiple BIM authoring tools and its deployment across various project types, from vertical buildings to horizontal infrastructure. The modular nature of the QPC engine enables the addition of discipline-specific rules, making it adaptable to specialized domains such as MEP systems or transportation networks. The cloud-based architecture further supports multi-project management, allowing companies to implement the framework across entire portfolios and leverage cross-project benchmarking. Future enhancements could incorporate AI-driven anomaly detection and predictive analytics to automate rule creation and improve system intelligence.
In summary, this study addresses long-standing challenges in BIM-based QTO by combining automated validation, real-time collaboration, and measurable performance evaluation within a single, scalable process. While the initial implementation focused on structural and architectural models, its modular, cloud-based design supports expansion to other domains such as MEP and infrastructure projects. By refining rule sets, incorporating international standards, and integrating AI-driven anomaly detection, the framework can evolve into a comprehensive platform for data-driven construction management. These advancements will improve accuracy, reduce costs, and accelerate the digital transformation of the construction industry.

8. Conclusions

This study presented a comprehensive, cloud-based framework for automating QTO and QPC in BIM-based workflows. Accurate QTO is essential for project planning, cost estimation, procurement, and scheduling, yet current practices often face challenges such as inconsistent parameter definitions, fragmented data, and limited real-time collaboration. Traditional or semi-automated BIM-based QTO processes still rely on manual interventions and local file management, which create inefficiencies and increase the risk of inaccurate outputs. The framework developed in this research addresses these long-standing issues by integrating QTO and QPC into a unified, fully automated process, significantly improving both accuracy and scalability.
The proposed system leverages FME for workflow automation, a cloud-hosted MySQL database for centralized data storage, and Power BI dashboards for real-time visualization. A key innovation is the rule-based QPC engine, which validates BIM parameters before quantity extraction, ensuring that all quantities are derived from consistent and complete data. This prevents errors from propagating into downstream processes such as procurement and cost management. By introducing a structured rule hierarchy based on international standards, the literature, and expert consultation, the framework provides a flexible and extensible mechanism for quality control that can evolve alongside project needs.
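The "validate before extraction" principle can also be illustrated with a short, self-contained sketch: elements that fail the parameter checks are set aside for reporting, and quantities are aggregated only from the validated set. The element fields and required parameters below are assumptions chosen for illustration; in the actual framework these steps are performed inside the FME workflow and the results are stored in the cloud-hosted database.

```python
REQUIRED_PARAMS = ("NominalWeight", "Volume")  # assumed structural parameters

def split_validated(elements, required=REQUIRED_PARAMS):
    """Partition elements into validated and flagged sets before any takeoff is performed."""
    validated, flagged = [], []
    for element in elements:
        missing = [p for p in required if not element.get(p)]
        (flagged if missing else validated).append((element, missing))
    return validated, flagged

def concrete_volume(validated):
    """Aggregate volume only from elements that passed the QPC gate."""
    return sum(element["Volume"] for element, _ in validated)

elements = [
    {"id": "Col-01", "NominalWeight": 1.2, "Volume": 3.4},
    {"id": "Col-02", "Volume": 2.9},  # missing Nominal Weight -> excluded and reported
]
ok, flagged = split_validated(elements)
print(concrete_volume(ok))                     # 3.4
print([(e["id"], m) for e, m in flagged])      # [('Col-02', ['NominalWeight'])]
```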
The case study validation demonstrated the framework’s effectiveness and practical relevance. The system automatically detected inconsistencies such as missing Nominal Weight and Volume parameters, which, if left unresolved, would have resulted in inaccurate QTO outputs. After applying automated corrections, the final validated quantities were 205.1 tons of steel and 1432 m3 of concrete. Additionally, the framework reduced reporting and response times by approximately 60% through automated notifications sent directly to design teams, eliminating the need for manual Excel-based communication. A dynamic tracking feature monitored design-driven changes, capturing a 2.27% reduction in steel quantities and a 7.0% reduction in concrete quantities over time. These outcomes demonstrate measurable improvements in accuracy, efficiency, and transparency compared to traditional approaches and provide stakeholders with actionable insights for cost control and project planning.
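As a small worked illustration of how such change-tracking percentages can be derived from successive quantity snapshots, the sketch below compares two hypothetical model states; the snapshot values and dates are illustrative only and are not the project's recorded data.

```python
def relative_change(previous: float, current: float) -> float:
    """Percentage change between two quantity snapshots (negative = reduction)."""
    return (current - previous) / previous * 100

# Hypothetical snapshot quantities for illustration only.
steel_t = {"2024-03": 210.0, "2024-06": 205.1}
concrete_m3 = {"2024-03": 1540.0, "2024-06": 1432.0}

print(f"Steel:    {relative_change(steel_t['2024-03'], steel_t['2024-06']):+.2f}%")     # -2.33%
print(f"Concrete: {relative_change(concrete_m3['2024-03'], concrete_m3['2024-06']):+.2f}%")  # -7.01%
```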
Another significant contribution of this study is the introduction of standardized performance metrics to evaluate and benchmark QTO/QPC workflows. Metrics such as Inconsistency Detection Rate (IDR), Parameter Consistency Rate (PCR), and Quantity Accuracy Improvement (QAI) offer objective ways to measure the quality of BIM data and the effectiveness of automated validation. These metrics bridge the gap between qualitative evaluations used in industry and the need for reproducible, quantitative research measures. This advancement addresses a critical gap identified in the literature and enables researchers and practitioners to monitor continuous improvements in QTO workflows over time.
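To support reproducibility, the sketch below shows one plausible way to compute these indicators from validation counts and error measurements. The formulas are reasonable readings of the metric names (for example, IDR as the share of known inconsistencies that the rules detect) rather than the study's exact definitions, and the input numbers are illustrative only.

```python
def inconsistency_detection_rate(detected: int, total_inconsistencies: int) -> float:
    # IDR: share of known parameter inconsistencies flagged by the QPC rules.
    return detected / total_inconsistencies

def parameter_consistency_rate(consistent_params: int, total_params: int) -> float:
    # PCR: share of checked parameters that satisfy all validation rules.
    return consistent_params / total_params

def quantity_accuracy_improvement(error_before: float, error_after: float) -> float:
    # QAI: relative reduction in quantity error after automated validation and correction.
    return (error_before - error_after) / error_before

print(f"IDR = {inconsistency_detection_rate(18, 20):.0%}")           # 90%
print(f"PCR = {parameter_consistency_rate(940, 1000):.0%}")          # 94%
print(f"QAI = {quantity_accuracy_improvement(0.08, 0.02):.0%}")      # 75%
```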
Several future research directions are proposed to enhance the framework’s capabilities.
  • Integrating artificial intelligence (AI) and machine learning could enable automated rule generation and anomaly detection, allowing the system to adapt dynamically to different project contexts.
  • Expanding interoperability with emerging BIM platforms and GIS-integrated systems would improve collaboration across multiple disciplines and support applications in mega projects and infrastructure developments.
  • Linking validated QTO data with external cost estimation databases would provide real-time budget tracking and automated procurement decision-making.
  • Incorporating pre-QTO model health checks could ensure models meet LOD/LOI requirements before takeoff begins, reducing downstream errors.
  • Integrating sustainability analytics, such as embodied carbon calculations, would help project teams optimize material usage for environmental performance.
In conclusion, the proposed framework delivers a scalable, measurable, and transparent solution for BIM-based QTO and QPC. By combining automated validation, centralized data management, and real-time visualization, it significantly improves the accuracy and efficiency of construction workflows. The framework addresses critical gaps identified in both academic research and industry practice, offering a replicable methodology that integrates open standards and supports continuous performance monitoring. With further refinement, broader testing across diverse project types, and integration of advanced technologies, this framework has the potential to transform digital construction management by reducing costs, improving collaboration, and enabling data-driven decision-making for more efficient and sustainable construction practices.

Author Contributions

Conceptualization, M.V. and F.V.; methodology, M.V.; software, M.V. and A.K.; validation, M.V.; formal analysis, M.V.; investigation, M.V.; resources, M.V., F.V., and C.C.-G.; data curation, M.V.; writing—original draft, M.V.; writing—review and editing, M.V., O.M., I.I., and A.B.; visualization, M.V.; supervision, M.V., O.M., and I.I.; funding acquisition, O.M., I.I., F.V., and A.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

Authors Mojtaba Valinejadshoubi, Fernando Valdivieso, Charles Corneau-Gauvin and Armel Kaptué were employed by the company Pomerleau Inc. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Abbreviations

AEC: Architecture, Engineering, and Construction
ARE: Automated Reporting Efficiency
BIM: Building Information Modeling
BQTCM: BIM-based Quantity Takeoff Code Mapping
CIT: Change Impact Tracking
CMU: Concrete Masonry Unit
FME: Feature Manipulation Engine
IDR: Inconsistency Detection Rate
IFC/IDS: Industry Foundation Classes/Information Delivery Specification
LOD/LOI: Level of Development/Level of Information
MEP: Mechanical, Electrical, and Plumbing
NRM: New Rules of Measurement
PCR: Parameter Consistency Rate
QAI: Quantity Accuracy Improvement
QPC: Quantity Precision Check
QTO: Quantity Takeoff

References

  1. Liu, H.; Lu, M.; Al-Hussein, M. Ontology-based semantic approach for construction-oriented quantity takeoff from BIM models in the light-frame building industry. Adv. Eng. Inform. 2016, 30, 190–207. [Google Scholar] [CrossRef]
  2. Olanrewaju, A.; Anahve, P.J. Duties and responsibilities of quantity surveyors in the procurement of building services engineering. Procedia Eng. 2015, 123, 352–360. [Google Scholar] [CrossRef]
  3. Monteiro, A.; Martins, J.P. A survey on modeling guidelines for quantity takeoff-oriented BIM-based design. Autom. Constr. 2013, 35, 238–253. [Google Scholar] [CrossRef]
  4. Azhar, S.; Nadeem, A.; Mok, A.Y.N.; Leung, B.H.Y. Building information modeling (BIM): A new paradigm for visual interactive modeling and simulation for construction projects. In Proceedings of the First International Conference on Construction in Developing Countries ICCIDC–I. Advancing and Integrating Construction Education, Research and Practice, Karachi, Pakistan, 4–5 August 2008; pp. 435–446. Available online: https://api.semanticscholar.org/CorpusID:112822775 (accessed on 22 October 2025).
  5. Hartmann, T.; van Meerveld, H.; Vossebeld, N.; Adriaanse, A. Aligning building information model tools and construction management methods. Autom. Constr. 2012, 22, 605–613. [Google Scholar] [CrossRef]
  6. Kula, B.; Ilter, D.A.; Ergen, E. Building Information Modelling for Performing Automated Quantity Takeoff. In Proceedings of the 5th International Project and Construction Management Conference (IPCMC2018), Kyrenia, Cyprus, 16–18 November 2018; pp. 645–653. [Google Scholar]
  7. Tiwari, S.; Odelson, J.; Watt, A.; Khanzode, A. Model Based Estimating to Inform Target Value Design. AECbytes. Available online: https://www.aecbytes.com/ (accessed on 22 October 2025).
  8. Sacks, R.; Eastman, C.; Lee, G.; Teicholz, P. BIM Handbook: A Guide to Building Information Modeling for Owners, Designers, Engineers, Contractors, Facility Managers; John Wiley & Sons: Hoboken, NJ, USA, 2018; ISBN 978-1-119-28753-7. [Google Scholar]
  9. Sattineni, A.; Bradford, R.H. Estimating with BIM: A Survey of US Construction Companies. In Proceedings of the 28th International Symposium on Automation and Robotics in Construction (ISARC) 2011, Seoul, Republic of Korea, 29 June–2 July 2011; pp. 564–569. [Google Scholar] [CrossRef]
  10. Nadeem, A.; Wong, A.K.D.; Wong, F.K.W. Bill of Quantities with 3D Views Using Building Information Modeling. Arab. J. Sci. Eng. 2015, 40, 2465–2477. [Google Scholar] [CrossRef]
  11. Khosakitchalert, C.; Yabuki, N.; Fukuda, T. BIM-based wall framing calculation algorithms for detailed quantity takeoff. In Proceedings of the 4th International Conference on Civil and Building Engineering Informatics (ICCBEI), Bangkok, Thailand, 19 July–22 July 2023; pp. 251–258, ISBN 978-4-600-00276-3. [Google Scholar]
  12. Khosakitchalert, C.; Yabuki, N.; Fukuda, T. Improving the accuracy of BIM-based quantity takeoff for compound elements. Autom. Constr. 2019, 106, 102891. [Google Scholar] [CrossRef]
  13. Khosakitchalert, C. Development of Quantity Takeoff Methods for Compound Elements Based on Building Information Modeling (BIM). Doctoral Dissertation, Osaka University Knowledge Archive, Osaka, Japan, 2020. [Google Scholar] [CrossRef]
  14. Lee, G. What Information Can or Cannot Be Exchanged? J. Comput. Civ. Eng. 2011, 25, 62. [Google Scholar] [CrossRef]
  15. Ma, H.; Ha, K.M.E.; Chung, C.K.J.; Amor, R. Testing Semantic Interoperability. In Proceedings of the JICCDMCBE, Montreal, QC, Canada, 14–16 June 2006; Available online: https://www.cs.auckland.ac.nz/~trebor/papers/MA06.pdf (accessed on 22 October 2025).
  16. Choi, J.; Kim, H.; Kim, I. Open BIM-based quantity takeoff system for schematic estimation of building frame in early design stage. J. Comput. Des. Eng. 2015, 2, 16–25. [Google Scholar] [CrossRef]
  17. Vassen, S.A. Impact of BIM-based Quantity Take off for Accuracy of Cost Estimation. Int. J. Constr. Eng. Manag. 2021, 10, 55–69. [Google Scholar] [CrossRef]
  18. Plebankiewicz, E.; Zima, K.; Skibniewski, M. Analysis of the first Polish BIM-Based cost estimation application. Procedia Eng. 2015, 123, 405–414. [Google Scholar] [CrossRef]
  19. Zima, K. Impact of information included in the BIM on preparation of Bill of Quantities. Procedia Eng. 2017, 208, 203–210. [Google Scholar] [CrossRef]
  20. Vitasek, S.; Matejka, P. Utilization of BIM for automation of quantity takeoffs and cost estimation in transport infrastructure construction projects in the Czech Republic. IOP Conf. Ser. Mater. Sci. Eng. 2017, 236, 012110. [Google Scholar] [CrossRef]
  21. Eroglu, E. Evaluation of the Reliability of BIM-Based Quantity Takeoff Processes in Construction Projects. Master’s Thesis, Graduate School of Natural and Applied Sciences, Civil Engineering, Middle East Technical University, Ankara, Turkey, 2019. Available online: https://hdl.handle.net/11511/44637 (accessed on 22 October 2025).
  22. Sherafat, B.; Taghaddos, H.; Shafaghat, E. Enhanced automated quantity take-off in building information modeling. Sci. Iran. 2022, 29, 1024–1037. [Google Scholar] [CrossRef]
  23. Valinejadshoubi, M.; Moselhi, O.; Iordanova, I.; Valdivieso, F.; Bagchi, A. Automated system for high-accuracy quantity takeoff using BIM. J. Autom. Constr. 2024, 157, 105155. [Google Scholar] [CrossRef]
  24. Shoubi, M.V.; Shoubi, M.V.; Bagchi, A.; Barough, A.S. Reducing the operational energy demand in buildings using building information modeling tools and sustainability approaches. Ain Shams Eng. J. 2015, 6, 41–55. [Google Scholar] [CrossRef]
  25. Dorey, C.; Valinejadshoubi, M.; Bagchi, A. Integrated Building Design and Energy Simulation. In Proceedings of the CSCE Annual Conference, Montreal, QC, Canada, 12–15 June 2019; ISBN 9781713802013. [Google Scholar]
  26. Valinjadshoubi, M.; Bagchi, A.; Moselhi, O. Identifying At-Risk Non-Structural Elements in Buildings Using BIM: A Case Study Application. J. Earthq. Eng. 2018, 24, 869–880. [Google Scholar] [CrossRef]
  27. Valinjadshoubi, M.; Bagchi, A.; Moselhi, O. Development of a BIM-Based Data Management System for Structural Health Monitoring with Application to Modular Buildings: Case Study. J. Comput. Civ. Eng. 2019, 33, 826. [Google Scholar] [CrossRef]
  28. Valinjadshoubi, M.; Bagchi, A.; Moselhi, O. Managing structural health monitoring data using building information modeling. In Proceedings of the 2nd World Congress and Exhibition on Construction and Steel Structure 2016, Las Vegas, NV, USA, 22–24 September 2016. [Google Scholar] [CrossRef]
  29. Valinejadshoubi, M.; Bagchi, A.; Moselhi, O.; Shakibaborough, A. Investigation on the potential of building information modeling in structural health monitoring of buildings. In Proceedings of the CSCE Annual Conference 2018, Fredericton, NB, Canada, 13–16 June 2018; pp. 407–416, ISBN 9781510889743. [Google Scholar]
  30. Valinejadshoubi, M.; Moselhi, O.; Bagchi, A.; Salem, A. Development of an IoT and BIM-based automated alert system for thermal comfort monitoring in buildings. J. Sustain. Cities Soc. 2021, 66, 102602. [Google Scholar] [CrossRef]
  31. Valinejadshoubi, M.; Moselhi, O.; Bagchi, A. Integrating BIM into sensor-based facilities management operations. J. Facil. Manag. 2021, 20, 385–400. [Google Scholar] [CrossRef]
  32. Olsen, D.; Taylor, M. Quantity Takeoff Using Building Information Modeling (BIM), and Its Limiting Factors. Procedia Eng. 2017, 196, 1098–1105. [Google Scholar] [CrossRef]
  33. Disney, O.; Roupé, M.; Johansson, M.; Leto, A.D. Embracing BIM in its totality: A Total BIM case study. Smart Sustain. Built Environ. 2022, 13, 512–531. [Google Scholar] [CrossRef]
  34. Vieira, L.; Campos, M.; Granja, J.; Azenha, M. Framework for (semi) automatized construction specification and quantity takeoff in the context of small and medium architectural design offices. Archit. Struct. Constr. 2022, 2, 403–437. [Google Scholar] [CrossRef]
  35. Smith, P. BIM & the 5D project cost Manager. Procedia Soc. Behav. Sci. 2014, 119, 475–484. [Google Scholar] [CrossRef]
  36. Kim, S.; Chin, S.; Kwon, S. A discrepancy analysis of BIM-based quantity takeoff for building interior components. J. Manag. Eng. 2019, 35, 05019001. [Google Scholar] [CrossRef]
  37. Monteiro, A.; Martins, J.P. BIM Modeling for Contractors—Improving Model Takeoffs. In Proceedings of the CIB W78 2012: 29th International Conference, Beirut, Lebanon, 17–19 October 2012; Available online: https://hdl.handle.net/10216/66830 (accessed on 22 October 2025).
  38. Golaszewska, M.; Salamak, M. Challenges in Takeoffs and Cost Estimating in the BIM Technology, Based on the Example of a Road Bridge Model. Civ. Eng. 2017, 4, 71–79. [Google Scholar] [CrossRef]
  39. Smith, P. Project cost management with 5D BIM. Procedia Soc. Behav. Sci. 2016, 226, 193–200. [Google Scholar] [CrossRef]
  40. Cepni, Y.; Akcamete, A. Automated Bim-Based Formwork Quantity Takeoff. In Proceedings of the 20th International Conference on Construction Applications of Virtual Reality, Middlesbrough, UK, 30 September–2 October 2020; Available online: https://hdl.handle.net/11511/78555 (accessed on 22 October 2025).
  41. Taghaddos, H.; Mashayekhi, A.; Sherafat, B. Automation of Construction Quantity Takeoff: Using Building Information Modeling (BIM). In Proceedings of the Construction Research Congress 2016, San Juan, Puerto Rico, 31 May–2 June 2016. [Google Scholar] [CrossRef]
  42. Fürstenberg, D.; Hjelseth, E.; Klakegg, O.J. Automated quantity take-off in a Norwegian road project. Sci. Rep. 2024, 14, 458. [Google Scholar] [CrossRef]
  43. Han, P.A.; Siu, M.F.F.; AbouRizk, S.; Hu, D.; Hermann, U. 3D Model-Based Quantity Take-Off for Construction Estimates. Comput. Civ. Eng. 2017, 2017, 382–390. [Google Scholar] [CrossRef]
  44. Yang, F.; Zhang, J.; Chong, O.W.; Sexton, C. Introducing Computing to Construction Management Undergraduate Students through Automated Quantity Takeoff from IFC-Based BIM. In Computing in Civil Engineering 2023; ASCE: Reston, VA, USA, 2024; pp. 457–466. [Google Scholar] [CrossRef]
  45. Akanbi, T.; Zhang, J. IFC-Based Algorithms for Automated Quantity Takeoff from Architectural Model: Case Study on Residential Development Project. J. Archit. Eng. 2023, 29, 04023026. [Google Scholar] [CrossRef]
  46. Pham, V.-H.; Chen, P.-H.; Nguyen, Q.; Duong, D.-T. BIM-Based Automatic Extraction of Daily Concrete and Formwork Requirements for SiteWork Planning. Buildings 2024, 14, 4021. [Google Scholar] [CrossRef]
  47. Chen, B.; Jiang, S.; Qi, L.; Su, Y.; Mao, Y.; Wang, M.; Cha, H.S. Design and Implementation of Quantity Calculation Method Based on BIM Data. Sustainability 2022, 14, 7797. [Google Scholar] [CrossRef]
  48. Liu, H.; Cheng, J.C.P.; Gan, V.J.L.; Zhou, S. A knowledge model-based BIM framework for automatic code-compliant quantity take-off. J. Autom. Constr. 2022, 133, 104024. [Google Scholar] [CrossRef]
  49. Hjelseth, E. BIM-based model checking (BMC). In Building Information Modeling: Applications and Practices; ASCE: Reston, VA, USA, 2015; pp. 33–61. [Google Scholar] [CrossRef]
  50. Seib, S. Development of Model Checking Rules for Validation and Content Checking. In WIT Transactions on The Built Environment; WIT Press: Southampton, UK, 2019; Volume 192. [Google Scholar] [CrossRef]
  51. Alathameh, S.; Collins, W.; Azhar, S. BIM-based quantity takeoff: Current state and future opportunities. Autom. Constr. 2024, 165, 105549. [Google Scholar] [CrossRef]
  52. Dynamo BIM. Dynamo BIM—Community-Driven Open-Source Graphical Programming for Design. Available online: https://dynamobim.org (accessed on 10 February 2017).
Figure 1. Overall research methodology and system architecture of the proposed QTO/QPC framework.
Figure 2. Detailed research methodology workflow showing integration of the four framework modules.
Figure 3. Structural and architectural BIM models.
Figure 4. Dataflow schema for structural QTO/QPC Data Extraction and Analysis Module [23].
Figure 5. Dataflow schema for architectural QTO/QPC Data Extraction and Analysis Module [23].
Figure 6. Conceptual overview of the FME-based data extraction and QC workflow.
Figure 7. Workflow for automated QPC validation using FME.
Figure 8. Designated SQL database table for storing QTO and identity data of concrete structural elements.
Figure 9. The proposed visualization dashboards for QTO and QC overview.
Figure 10. Visualization dashboards for BIM design change tracking in concrete and steel quantities.
Figure 11. Visualization dashboards for BIM QPC process: (a) concrete quantities and (b) steel quantities.
Table 1. Types of materials considered in this study.
Discipline | Material Type
Architectural | Glazing; Drywalls; Precast ARCH Concrete; Concrete Masonry Units; Curtain Walls; Ceramic Tiling; Ceilings; Roofing Membrane
Structural | Steel; Concrete
Table 2. Architectural QTO outputs based on different parameters.
Architectural Category | QTO Value_Family Type | QTO Value_Assembly Description | QTO Value_Material Name | QTO Value_Material Model Name | QTO Value_Type Mark
Glazed Barriers | 34 m2 | N/A | N/A | N/A | N/A
Glazed Railings | 71 m2 | N/A | N/A | N/A | N/A
Glazed Lifts | 101 m2 | N/A | N/A | N/A | N/A
Drywalls | N/A | 208 m2 | 208 m2 | N/A | N/A
Precast ARCH Concrete | 580 m2 | 339 m2 | 580 m2 | N/A | N/A
Concrete Masonry Unit | 1055 m2 | 1055 m2 | 1055 m2 | N/A | N/A
Curtain Walls | 1389 m2 | 1389 m2 | N/A | N/A | 1389 m2
Ceramic Tiles (Walls) | 133 m2 | N/A | 133 m2 | N/A | N/A
Ceramic Tiles (Floors) | 1065 m2 | N/A | 1065 m2 | N/A | N/A
Podotactile Tiles | 94 m2 | N/A | 94 m2 | N/A | N/A
Metallic Ceilings | N/A | N/A | 384 m2 | 384 m2 | N/A
Roof Membrane | 1514 m2 | N/A | 1514 m2 | N/A | N/A
Table 3. Comparing the capabilities and limitations of existing tools and the proposed framework.
Tools compared: Navisworks, Solibri, CostX, and the proposed framework.
  • Cloud-based real-time collaboration: Limited
  • Automated rule-based QPC before QTO: ✓ (predefined rules only); ✓ (customizable and IFC-based)
  • Open data standards (IFC, Civil 3D, etc.): Limited; Limited
  • Transparent, research-ready
  • Novel performance metrics (IDR, QAI)
  • Extensible rule engine for multiple disciplines: Limited