Remote Indoor Construction Progress Monitoring Using Extended Reality

Construction progress monitoring has expanded recently through the adoption of vision and laser technologies. However, inspectors still need to visit the job-site in person, or wait while data captured from the construction site are processed for inspection. Current inspection methods lack automation and real-time data exchange; they therefore require inspection manpower at each job-site and involve health risks from physical interaction between workers and inspectors, as well as energy loss, data loss, and time consumption. To address these issues, a near real-time construction work inspection system called iVR is proposed; this system integrates 3D scanning, extended reality, and visual programming to visualize interactive onsite inspection of indoor activities and provide numeric data. iVR comprises five modules: iVR-location finder (locating the laser scanner on the construction site), iVR-scan (capturing point cloud data of indoor job-site activity), iVR-prepare (processing and converting 3D scan data into a 3D model), iVR-inspect (conducting immersive virtual reality inspection in the construction office), and iVR-feedback (visualizing inspection feedback on the job-site using augmented reality). An experimental lab test was conducted to verify the applicability of the iVR process; the system successfully exchanged the required information between the construction job-site and office within a specific time. This system is expected to assist engineers and workers in quality assessment, progress assessment, and decision-making, realizing a productive and practical communication platform, unlike conventional monitoring and data capturing, processing, and storage methods, which involve storage, compatibility, and time-consumption issues.


Introduction
The construction industry has devoted significant efforts toward sustainable construction by utilizing automation and vision technologies [1,2]. In particular, construction progress monitoring has been leveraged to increase accuracy and time efficiency [3,4]. This includes measuring progress through site assessments and comparisons with the project plan; in these methods, the quality of progress data depends entirely on the expertise of the inspector and the measurement quality [5].
Generally, manual visual observations and conventional progress tracking can be used to provide feedback on progress measurement [3], productivity rate [5], tracking of devices and materials [6], and safety planning [7]. Work inspection engineers typically adhere to guidelines provided by construction quality control institutes such as the International Organization for Standardization (ISO). One of the primary aims of inspection is to evaluate dimensional errors (dimension and position) and surface defects (cracks and spalling) in activities involving concrete columns and HVAC pipes [8]. However, conventional methods for construction work inspection are time-consuming, require inspection manpower [9], consume energy, are economically inefficient, and are prone to human error.

Literature Review
To demonstrate the integration of 3D scanning and extended reality for onsite progress monitoring, this study considers various technical challenges of 3D laser scanning and extended reality, including interdependency; spatial site layout collision analyses and management; digital-to-physical links; project control; and design visualization during production.
In construction processes, the term 'sustainability' connotes 'efficiency', which is considered the key aspect of integrating building automation systems into construction activities, including progress monitoring, to reduce workforce requirements, increase efficiency, and create a healthier and safer environment by minimizing physical interaction between inspectors and workers on the job-site [1,4]. In more detail, point clouds are applied in different phases with various targets, including planning and design, manufacturing and construction, and operation and maintenance. In the planning and design stage, point clouds are used to reconstruct 3D models of the site and existing buildings [14,15]. In the manufacturing and construction phase, point clouds are applied for automating construction equipment [13], digital reproduction [4,16], identification of safety hazards [17], monitoring of construction progress [18], and inspection of construction quality [17]. In the operation and maintenance phase, point clouds are applied for reconstructing 3D architectural structures [15], building performance analyses [19], facility management through 3D model reconstruction [12], and inspection of building quality [18,19].
As-built point clouds can be acquired using laser scanning or image-based photogrammetry methods. The generated point clouds are co-registered with the model by using an adapted iterative-closest point (ICP) algorithm, and the as-planned model is transformed into a point cloud by simulating points from the recognized positions of the laser scanner. Recently, researchers have developed systems that detect different component categories using supervised classification based on features obtained from the as-built point cloud; an object is then detected if its classification category matches that in the model [3,20,21].
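To illustrate the co-registration step, the following is a minimal sketch of point-to-point ICP in Python with NumPy. The brute-force nearest-neighbour matching and the SVD (Kabsch) transform estimation are standard textbook components, not the adapted ICP variant used in the cited works.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """SVD (Kabsch) solution for the rotation R and translation t
    minimizing ||R @ src_i + t - dst_i|| over corresponding points."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(src, dst, iters=20):
    """Naive point-to-point ICP: match each source point to its
    nearest target point, then re-estimate the rigid transform."""
    cur = src.copy()
    for _ in range(iters):
        # brute-force nearest neighbours (fine for small clouds)
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        matched = dst[d.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
    return cur
```

For real as-built clouds, a spatial index (k-d tree) replaces the brute-force search, and outlier rejection handles points with no true counterpart in the as-planned model.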
Utilizing cameras as the acquisition tool results in lower geometric accuracy compared with laser-scanned point clouds. A few recent studies have detected spatial differences between time-frame images [16,22], updated the project schedule using images [7], and compared the as-built model with the as-planned BIM model. The Kinect 3D scanner employs similar technology to a mid-range 3D scanner, i.e., a camera and an infrared camera to calculate the depth field of an object and its surroundings. These two "cameras" enable the Kinect to scan nearly any chosen item with decent 3D accuracy. The depth camera has been employed for the surveillance and assessment of construction sites; tracking of materials, equipment, and labor [23]; safety monitoring [24]; and reconstruction [2,18]. Recently, advanced technologies such as robotics and drones have been proposed on construction sites to aid or replace workers; these require real-time remote monitoring technology that can be integrated with their own systems, such as facade pick-and-place using a robot arm [25], bricklaying using drones [26], humanoid robotics [27], and integrating site terrain with the BIM model [28].
The construction industry has explored reconstruction architectures using active scanners, 3D laser scanners, and passive cameras. However, such equipment has not been implemented in real-world applications thus far, and state-of-the-art methods necessitate high computational resources to analyze 3D points and varying dynamic scenes. The incorporation of low-cost Kinect scene scanners can enable the rapid replication of 3D models with improved size and resolution. Several simultaneous localization and mapping systems have used sparse maps to locate and focus on real-time tracking [24]. Other methodologies have been used for reconstructing simple point-based representations [15]. Kinect cameras surpass such point-based representations by reconstructing surfaces that approach real-world geometry with greater reliability [2,29].
Regarding extended reality in construction progress monitoring, immersive simulation improves the comprehension of dynamic construction processes and promotes the assessment of different project conditions with minimal cost and intervention [30]. Moreover, building professionals and researchers perceive the value of simulations and have been exploring the use of VR for interactive visualization. Experiments in this area include construction defect management [31], information delivery in the excavation stage [32,33], and dynamic visualization using GPS [34,35]. Cloud-based computing represents another field that has enabled real-time coordination and empowered project teams to extend BIM from the design to the planning stage. BIM-based platforms have recently facilitated real-time collaboration in cloud computing. However, cloud-based BIM data-exchange protocols, resource management, and connections to construction sites still lack real-time and user-friendly tools. This review advances the principle of construction scheduling automation.
Although laser scanning data have been widely utilized for dimensional and surface quality assessment in various civil applications and project quality checking, their integration with visual algorithms and extended reality technologies merits further exploration to enable remote monitoring without the need to visit a job-site in person. Therefore, this paper proposes a laser-scanning-based, accurate, and reliable technique for near real-time assessment of dimensional and surface quality that reduces the gap between construction offices and job-sites by integrating Kinect 3D scanning and extended reality.

Methodology
The proposed framework incorporates knowledge transfer from the job-site to the construction office and vice versa. Its core feature is the integration of a monitoring device with extended reality technology in one platform that syncs the construction office and job-site in near real-time. The following research questions are asked: One, how can an inspector monitor indoor activity remotely from the construction site using extended reality? Two, how can one platform capture 3D information of an activity, dispatch it to the construction office, generate an inspection report, and send it back to the construction site in near real-time? Three, can virtual and augmented reality be utilized for activity inspection in an integrated platform?
The research methodology is divided into three sections: (1) inspection checklist, which explains the method to build the inspection agenda in VR; (2) laser scanner and extended reality-based quality inspection procedure (iVR), which addresses the system architecture of the proposed methodology; and (3) data storage and data compatibility, which discusses innovative methods of storing and exchanging data between the job-site and construction office. The proposed system, iVR, comprises the following five modules: (1) iVR-location finder, which optimizes the laser scanner location in the physical model; (2) iVR-scan, which captures a point cloud of the target activity using 3D laser scanners; (3) iVR-prepare, which converts the point cloud into 3D mesh geometry; (4) iVR-inspect, which lets the site engineer generate a quality check and feedback report using VR; and (5) iVR-feedback, which visualizes the inspection feedback on the job-site using AR.
Generally, iVR-location finder helps the inspector find the best laser scanner position to monitor the targeted activity on the construction site, preview how the point cloud data will look before monitoring starts, and pass the position coordinates to iVR-scan so that the inspector can place the laser scanner at the correct physical position, following the digital position optimized by the iVR-location finder module. Next, iVR-scan captures point cloud data of the target activity and passes it to iVR-prepare, which converts the received data into 3D mesh geometry and sends it from the job-site to iVR-inspect in the construction office. There, the inspector performs progress checking and quality control in virtual reality mode and sends the inspection report back to the construction site. Finally, the worker views the inspection report in augmented reality hosted inside the iVR-feedback module.
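The module chain described above can be sketched as a simple data flow. Every function below is a simplified placeholder standing in for the actual Grasshopper/Rhinoceros implementation, and all values are illustrative:

```python
# a minimal sketch of the iVR data flow; all functions are
# simplified placeholders, not the actual implementation
def ivr_location_finder(bim_model):
    # choose a scanner position that sees the target activity
    return {"position": (0.0, 0.0, 1.5), "fov_deg": 292}

def ivr_scan(position):
    # capture a (here: dummy) point cloud at the chosen position
    return [(0.1, 0.2, 0.0), (0.1, 0.2, 2.8)]

def ivr_prepare(point_cloud):
    # crop, mesh, and compress the cloud for transfer to the office
    return {"mesh": point_cloud, "faces": []}

def ivr_inspect(mesh, bim_model):
    # engineer reviews the mesh against the BIM model in VR
    return {"comments": ["column within tolerance"], "approved": True}

def ivr_feedback(report):
    # feedback is pushed to the job-site AR viewer
    return "AR overlay: " + report["comments"][0]

bim = {"columns": 6}
loc = ivr_location_finder(bim)
mesh = ivr_prepare(ivr_scan(loc["position"]))["mesh"]
report = ivr_inspect(mesh, bim)
overlay = ivr_feedback(report)
```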

Inspection Checklist
Before performing progress monitoring, it is essential to identify the inspection checklist, which includes the attributes of activity elements that need to be inspected and their required degree of accuracy. Generally, the construction specifications of a project play a major role in determining the inspection checklist. These specifications detail the quality requirements for each component of the construction project, which can then be translated into the inspection checklist. In this study, the inspection checklist is determined based on the tolerance manual for object placement and casting tolerances in the guide for concrete columns provided by the American Concrete Institute. Table 1 displays the checklists, which comprise two sections: geometry and defects. The geometry category has two features: dimension and location of concrete components. Each feature has its own checklist items, referred to as "attributes". For instance, height, width, and thickness are attributes of the "dimension" feature. The tolerance of each attribute differs with the type of design, item size, and product duration. Table 1 also presents tolerances for an exemplary concrete column and HVAC pipe. Each dimensional attribute should be carefully checked to ensure that its value lies within the predefined tolerance; otherwise, the dimensional errors of individual elements will aggregate across the assembled segment. The defect category has three attributes: spalling, crack, and flatness.
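The two-section checklist could be encoded as a small data structure like the sketch below. The tolerance values shown are illustrative placeholders, not the ACI figures used in the paper's Table 1:

```python
# a hedged sketch of a Table-1-style checklist; tolerance values
# are illustrative placeholders, not the actual ACI tolerances
inspection_checklist = {
    "geometry": {
        "dimension": {"height_mm": 12, "width_mm": 10, "thickness_mm": 10},
        "location":  {"plan_offset_mm": 25, "verticality_mm_per_m": 6},
    },
    "defects": {
        "spalling": {"max_depth_mm": 5},
        "crack":    {"max_width_mm": 0.3},
        "flatness": {"max_deviation_mm_per_2m": 8},
    },
}

def within_tolerance(category, feature, attribute, measured):
    """Return True when a measured deviation is inside the tolerance."""
    return measured <= inspection_checklist[category][feature][attribute]
```

Structuring the checklist this way lets iVR-inspect flag out-of-tolerance attributes automatically during the VR review rather than relying on manual lookup.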

Extended Reality Based Progress Monitoring System (iVR)
After determining the checklist and required tolerances, an inspection protocol must be established. Besides the iVR-location finder described in the next subsection, the iVR system comprises four modules: module 1, which captures the site model using 3D laser scanners (iVR-scan); module 2, in which the point cloud is converted into a 3D mesh (iVR-prepare); module 3, in which the site engineer performs the quality check and generates the feedback report using VR (iVR-inspect); and module 4, in which the inspection feedback is visualized on the job-site using AR (iVR-feedback). In more detail, 3D laser scanners are used to capture point clouds of activities, which are then converted into 3D models and sent to the building information modeling cloud of the construction office by the iVR-prepare algorithm. Subsequently, iVR-inspect enables engineers to trace the workflow, compare current project progress with blueprints, measure objects, and add text/design notes to 3D models, leveraging the quality of project site management. Finally, iVR-feedback sends inspection reports to job-site workers, who can visualize them in augmented reality mode integrated with graphical algorithms, as illustrated in Figure 1. The iVR system has predefined settings that need to be configured before monitoring starts: matching the scanner position in reality and in the BIM model, the field of view, camera resolution quality, color coding of activity objects, the BIM model level of detail, and the work breakdown structure. The point cloud density also needs to be set before monitoring to prevent geometry data overload during data transfer between the construction job-site and office.

iVR-Location Finder Module
In addition to the predefined settings listed above, this research developed a tool called iVR-location finder to help inspectors determine the camera location before monitoring an activity. This tool offers a list of options for controlling the location of the scanning camera for the targeted activity during the planning stage of the proposed iVR system: choosing a laser scanner type and brand from a drop-down list; selecting the date of the target activity from the tool's calendar (see Figure 2, component 1); the laser scanner field of view (FOV); visible and blocked angles; the laser scanner camera view in the BIM environment; and the laser scanner field-of-view window sphere. Once the inspector chooses the laser camera type and date, iVR-location finder generates an optimized FOV location. The optimized location is found by (a) dividing the working area A_d into equal isocurves, where A_d is the daily working space of the construction activity (closed boundaries) and d is the day selected from the work-space breakdown structure schedule; (b) calculating the arithmetic mean point of each isocurve; (c) connecting the arithmetic mean points by an interpolated curve to generate the activity-area center-line (CL); and (d) dividing CL at intervals of CR/k to generate double the number (Ni + 1) of candidate cameras needed for A_d, where k is slightly greater than 2. In this research, k is 2.01, i.e., half of CR with a small margin (0.01) added, to ensure that the camera FOVs cover all of A_d,
where CR is the radius of the camera field of view and Ni is the number of camera radii needed to cover A_d.
After that, (e) the candidate points (CP) are culled (removed) using True/False logic to divide Ni by two and obtain the optimized CP; finally, (f) the FOV, CR, and CP are drawn in the BIM model, as shown in Figure 2 (component 3). All iVR-location finder process data, consisting of FOV, Ni, Li, CR, and CP, are overlaid on the BIM model. The physical location of the camera is synchronized with the digital model manually to ensure an accurate overlay of the as-built data with the as-planned data of the BIM model in the iVR system.
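Steps (d) and (e) can be sketched in a simplified 1D form, under the assumption that the center-line CL is a straight segment (the real tool works on the interpolated isocurve center-line in the BIM model); the lengths used are illustrative:

```python
import math

# a simplified sketch of the iVR-location finder placement logic:
# sample candidate points every CR/k along CL (step d), then cull
# every second point with True/False logic (step e)
def camera_positions(cl_length, cr, k=2.01):
    """Return camera positions (distances along CL).

    k slightly above 2 means the kept points are spaced just under
    CR apart, so neighbouring FOVs overlap and cover all of A_d.
    """
    step = cr / k                       # candidate spacing
    n = math.ceil(cl_length / step)     # roughly 2x the cameras needed
    candidates = [i * step for i in range(n + 1)]
    return candidates[::2]              # cull alternate points

# hypothetical 16 m centre-line and 4.5 m camera radius
positions = camera_positions(cl_length=16.0, cr=4.5)
```

With k = 2.01, consecutive kept positions are 2·CR/2.01 ≈ 0.995·CR apart, which is what guarantees the slight FOV overlap mentioned above.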

iVR-Scan Module
In this phase, a laser scanner is placed on the job-site to monitor a specific activity and send live scanning data to the construction office. The location, direction, and type of laser scanner, and the software and hardware required to accomplish successful scanning, are the primary focus of this phase, as illustrated in Figure 3. The coordinates and 3D model of the construction site need to be recorded and registered in the BIM environment in advance. Subsequently, the inspector defines a clear, visible location for the laser scanner using iVR-location finder.
The scanning duration depends on the following variables: (1) importance of the activity: the number of times the activity needs to be monitored by the project engineer; (2) daily working hours: start and end times of daily work; (3) activity location: whether the activity is conducted indoors or outdoors; and (4) scale of the target objects. Therefore, the inspector should determine the duration of scanning during the planning stage of the activity, especially in the design-build delivery method, where activity implementation occurs at a faster rate than in other delivery methods. Laser scanners can be divided into two types of hardware. The first type is stand-alone laser scanners, which are mobile and can store and transfer data to the computer after the completion of the scanning process. The second type is live scanning devices, which need to be connected to a computer during the scanning process to transfer data in real-time. This study considers the second type of scanner to reduce the gap between scanning and data processing. However, such scanners are difficult to implement on construction sites, as they require a power source and large amounts of space; they also have limited mobility compared with portable scanners. Before choosing a monitoring system, the utility, time efficiency, accuracy, level of automation, required preparation, training requirements, cost, and mobility of existing progress monitoring methods were evaluated in the literature review and analyzed. This research chose the Kinect V2 as the 3D laser scanner because it offers live scanning (LIDAR + photogrammetry = Kinect 3D scan). The device is built on a combination of cameras and sensors for target observation and distance recognition.
The process of scanning the target object is as follows: (1) set up a monitoring camera on the construction site and direct it toward the target object following Li of iVR-location finder; and (2) connect the Kinect V2 to the Grasshopper graphical algorithm editor on the PC, incorporated with the commercially developed Rhinoceros software. Three point cloud libraries are used to manage the point cloud data and convert them into point, color, and GPS coordinates, to process the produced model in real-time, as depicted in Figure 3. The point cloud registered in iVR-scan is then passed to iVR-prepare, which starts converting it into 3D mesh geometry. iVR-scan sets up a timer that releases live frames of the point cloud every 100 ms and passes them to the next module to control data processing and computation.
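The 100 ms frame-release timer can be sketched as a fixed-rate capture loop. Here `capture_frame` is a stand-in for the Kinect/Grasshopper capture and `sink` for the hand-off to iVR-prepare; neither reflects the actual library calls:

```python
import time

# a minimal sketch of the 100 ms frame-release timer in iVR-scan;
# capture_frame() is a placeholder for the Kinect capture and
# sink() for the hand-off to iVR-prepare
def capture_frame():
    return {"points": [(0.0, 0.0, 1.2)], "t": time.monotonic()}

def run_scanner(duration_s=0.5, interval_s=0.1, sink=None):
    """Release one point-cloud frame every `interval_s` seconds."""
    frames = []
    next_release = time.monotonic()
    end = next_release + duration_s
    while time.monotonic() < end:
        now = time.monotonic()
        if now >= next_release:
            frame = capture_frame()
            frames.append(frame)
            if sink:
                sink(frame)             # pass frame to the next module
            next_release += interval_s  # fixed-rate schedule, no drift
        time.sleep(0.005)               # avoid busy-waiting
    return frames

frames = run_scanner()
```

The fixed-rate schedule (advancing `next_release` by the interval rather than re-reading the clock) keeps the long-run frame rate at 10 Hz even when individual captures jitter.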

iVR-Prepare Module
The iVR-prepare module receives point cloud data from iVR-scan at 100-millisecond (ms) intervals. This module filters the point cloud, converts it to a 3D mesh model, and sends it to iVR-inspect using custom cloud-based data transfer. The iVR-prepare algorithm processes the received point cloud data as follows: (a) iVR-prepare receives point cloud data and RGB-D images from iVR-scan every 100 ms; (b) the inspector sets the target object boundary, and iVR-prepare crops the targeted object from the scanned scene using color coding to reduce the computational data and time required for subsequent steps; (c) the module erases all geometry located outside the bounding box set by the inspector and keeps only the point cloud data located inside it; (d) the module creates a uniform voxel grid structure to initiate a 3D mesh object from the point cloud; (e) iVR-prepare employs mesh-marching logic and the ball-pivoting algorithm (BPA) to convert the point cloud into a 3D mesh object; (f) iVR-prepare surrounds the produced 3D mesh with a bounding box to reduce computational data and time; and (g) iVR-prepare sends the final 3D mesh geometry from the construction job-site PC to the office PC using the Speckle cloud, as depicted in Figure 4. This study uses BPA to reconstruct a mesh from a point cloud. The algorithm rolls a sphere across the "surface" of the point cloud Δp. The sphere is slightly larger than the distance between two points, so that it covers them inside its boundary. The sphere begins from two points and rolls to include additional points, generating the outer shell of the mesh. The sphere rolls and pivots along the edge of the triangle formed between two lines. The ball then settles at a new location; a new triangle is created from the two previous vertices and a new point attached to the mesh, forming the expansion triangle.
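The cropping in steps (b) and (c) amounts to an axis-aligned bounding-box filter over the point cloud; a minimal sketch (with hypothetical coordinates in metres):

```python
# a sketch of the iVR-prepare cropping step (b)-(c): only points
# inside the inspector-defined bounding box are kept for meshing
def crop_to_box(points, box_min, box_max):
    """Keep points p with box_min <= p <= box_max on every axis."""
    return [
        p for p in points
        if all(lo <= c <= hi for c, lo, hi in zip(p, box_min, box_max))
    ]

# hypothetical scene: two points on the target column, one far away
scene = [(0.2, 0.1, 1.0), (3.0, 0.1, 1.0), (0.3, 0.4, 2.5)]
column_only = crop_to_box(scene, (0.0, 0.0, 0.0), (0.5, 0.5, 3.0))
```

Discarding out-of-box points before meshing is what keeps the computational cost of the later BPA step proportional to the target object rather than the whole scanned scene.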
As the sphere continues to roll and rotate, additional triangles are shaped and connected to the mesh. The sphere continues this motion until the mesh is completely shaped as shown in Figure 5.
The following presents a 2D BPA visualization, wherein the sphere settles among pairs of points. The two points are connected each time the sphere settles, filling in the mesh. In the 3D case, the sphere settles among three points, as shown in Figure 5.
The sphere radius, θ, is obtained empirically based on the size and scale of the input point cloud: the average distance between neighboring points, Δp, is divided by 2 to obtain the base radius, which is then multiplied by a factor of 1.05-1.25 to ensure that θ includes Δp within its boundaries. All points need to be obtained within a distance of 2θ from the starting point p. To control the variance of 2θ between neighboring points, a voxel cube ν data structure with side length 2θ is used. The voxel index of a given point p with coordinates (P_x, P_y, P_z) is the triplet (⌊(P_x − ν_x)/2θ⌋, ⌊(P_y − ν_y)/2θ⌋, ⌊(P_z − ν_z)/2θ⌋): to populate the voxels on Δp, the x coordinate P_x is reduced by the voxel corner x coordinate ν_x and divided by 2θ (and similarly for P_y and P_z). The generated voxel shell of the mesh is then relaxed to produce a smooth surface, as depicted in Figure 5. iVR-prepare conducts an empirical analysis of the mesh produced from the point cloud and compares it with the BIM model to calculate the progress rate and quantify the achievement of the activity, as shown in Figure 4 (components d and f). In addition, iVR-prepare stores the progress data in a report and attaches it to iVR-feedback for AR visualization. The developed visual algorithm also provides the QA inspector with an option to download the progress data to the local machine in Excel format.
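The radius and voxel-index arithmetic above can be written out directly. The numbers used below (a 2 cm mean point spacing and a safety factor of 1.15) are illustrative assumptions within the stated 1.05-1.25 range:

```python
import math

# a sketch of the radius and voxel-index arithmetic in iVR-prepare:
# theta is derived from the mean neighbour spacing delta_p, and each
# point is binned into a cube of side 2*theta
def sphere_radius(delta_p, safety=1.15):
    """theta = (delta_p / 2) * safety, with safety in 1.05-1.25."""
    return (delta_p / 2.0) * safety

def voxel_index(p, v_corner, theta):
    """Triplet floor((P - v) / (2*theta)) per axis."""
    return tuple(
        math.floor((pc - vc) / (2.0 * theta))
        for pc, vc in zip(p, v_corner)
    )

theta = sphere_radius(0.02)             # assumed 2 cm mean spacing
idx = voxel_index((0.10, 0.05, 0.00), (0.0, 0.0, 0.0), theta)
```

Because each voxel has side 2θ, any point within pivoting distance of a seed point lies in the seed's voxel or one of its 26 neighbours, which is what makes the neighbour search during ball pivoting cheap.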
The iVR-prepare module is limited to indoor construction activities containing one or more objects that can be monitored within the field of view of a Kinect camera. It converts the captured point cloud of the targeted object into a lightweight 3D mesh and sends it to the construction office. The Speckle cloud is used to transfer the data from the job-site to the construction office, and the Grasshopper and Rhinoceros algorithms synchronize and update the model in each period.

iVR-Inspect Module
In this module, the 3D mesh is received from the job-site iVR-prepare module using the Speckle cloud data receiver. The 3D mesh generated in the iVR-prepare phase is aligned with the BIM model; the Kinect camera and 3D model on the physical job-site and in the construction office were aligned in the predefined settings. The project engineer connects the VR headset to the Rhinoceros environment and begins inspecting the progress data received from the job-site in VR.
Generally, iVR-inspect proceeds as follows: (1) align the 3D mesh built in iVR-prepare with the BIM model; (2) connect a VR headset to the immersive environment and set up the quality, scale, and colors of the different models; (3) the project engineer monitors the progress, building accuracy, and quality; (4) the inspector uses the VR inspection tools provided by Mindesk in virtual reality mode, which include writing comments, highlighting objects, taking measurements, comparing the BIM model to the progress model received from iVR-prepare, and providing illustrations as feedback; and (5) iVR-inspect sends the monitoring report, which includes comments, drawings, and geometry, to iVR-feedback, as illustrated in Figure 5.
This study employed the VR inspection method and BIM, instead of conventional site visits and point cloud screen checking, to provide the project engineer with the advantages of immersive reality, thereby improving inspection rate, speed, and accuracy, as well as the virtual feeling of being present on a job-site even when sitting in a construction office, as illustrated in Figure 5. This study employs the Mindesk package for Grasshopper to control the iVR-inspect module. Mindesk ships as a Grasshopper plug-in that brings the parametric model into VR space; 3D CAD models immersed in VR can be built, edited, and navigated without exporting the Rhino project. The tools used in iVR-inspect include 3D navigation, multi-scale view, geometric editing, NURBS editing, surface modeling, curve modeling, text tagging, and note drawing. Mindesk functions with HTC Vive™, Oculus Rift™, and Windows Mixed Reality™ devices. It can also incorporate AR into the same platform; however, this study focuses only on its VR-related aspects.
After the project engineer performs the progress and quality checks, provides illustrations, and adds comments, iVR-inspect sends the multilayer data, including the BIM model, overlaid model, comments and illustrations, and accomplishment report, to iVR-feedback, as depicted in Figure 6.

iVR-Feedback Module
iVR-feedback focuses on visualizing the engineer's reports using mixed reality on the job-site through an augmented reality application that synchronizes the BIM model in the construction office with the worker's smartphone in real-time. The workflow of iVR-feedback includes the following steps: (1) the worker's phone is registered with the construction office's Fologram tool to receive live data; (2) the engineer sends the BIM model, comments and illustrations, accomplishment report, and overlaid model via Fologram to the job-site cell phone; and (3) the job-site worker visualizes all received data in mixed reality using the cell phone, as illustrated in Figure 7.
An immersive holographic instruction set is generated using parametric models representing the architecture. Fologram, which runs inside Rhino and Grasshopper, synchronizes the geometry with that shown on mixed reality devices linked via a local Wi-Fi network. If a user modifies a model in Rhino or Grasshopper, these modifications are detected and transmitted to all linked mixed reality apps. Using familiar and efficient desktop CAD tools, users can view digital models in a physical setting and at full scale while making improvements to those models. The placement of the 3D model is controlled by placing a QR code on the Rhinoceros canvas in the construction office and a corresponding code on the job-site. Once workers receive the data, they can visualize the QA inspector's comments and the overlaid model in near real-time. This system requires that the worker's phone is registered with the construction office's Fologram tool and that both the construction office and job-site mobile phones are connected to the same Wi-Fi network. As a result, the engineer and job-site workers operate synchronously and can exchange near real-time data between the job-site and construction office using extended reality and 3D scanners associated with visual programming algorithms, as shown in Figure 7.

Data Management
To address the application interoperability issue, primarily induced by discrepancies in application formats, this paper proposes using the Rhinoceros platform and the (.3dm) format as a unified data format across different software and hardware. All hardware is directly connected to the Rhinoceros modeling platform and then synchronized using Speckle cloud-based data sharing. On the job-site, the Kinect V2 is connected to Grasshopper and Rhinoceros, where its point cloud data and resolution parameters are managed and stored. Subsequently, all captured point cloud data are sent to the construction office's Grasshopper and Rhinoceros using the Speckle cloud tool, which maintains synchronized and updated models on both computers. The inspector then assesses progress and associates the report with the model. Finally, the feedback data are sent back to the job-site cell phone using AR synced with the office BIM model.

Case Study
To validate the feasibility of the proposed iVR system, a lab test was conducted to simulate a specific work inspection between the job-site and the construction office. Two case studies were built to test iVR: the first monitors a column-finishing material activity, and the second monitors an HVAC pipe installation activity.

Case Study 1: Column Finishing Material Activity Monitoring
The first case study consists of a grid of 4 m × 4 m concrete columns with 0.4 m length × 0.4 m width. The concrete columns are designed in a BIM environment and represented by Styrofoam boxes in the lab to resemble a job-site. Before starting the work inspection experiment, iVR-location finder was used to optimize the Kinect camera location; as part of the predefined settings, it generates the camera position in the digital model, as shown in Figure 9 (component a). After that, all phases of iVR were implemented, and the time consumed, accuracy, and practicality of these stages were recorded for further development of the system. In this phase, a laser scanner is placed within range of the target object (concrete columns) using iVR-location finder, and the coordinates of its focal point are registered for synchronization with its corresponding location in the BIM environment. In this case study, the Kinect V2 is chosen from the list of 3D scanners. The user can choose the optimal scanner location based on various essential parameters, including the type of laser scanner, duration of the activity, work schedule and breakdown structure, field of view of the scanner, visible angles, blocked angles, perspective viewpoint of the scanner in the BIM model, and working window sphere of the scanner, as illustrated in Figure 8. The parameters of the Kinect V2 embedded in the iVR-location finder tool are outlined in Table 2. The Kinect V2 used in this case study has a horizontal range of 0.3-4500 mm; therefore, one laser scanner location can cover two columns (Figure 8), and the laser scanner's field-of-view window sphere component in each direction is 292°. The case study was built in the lab by monitoring one column and placing the Kinect camera in the physical position corresponding to the digital position generated using iVR-location finder, as shown in Figure 9 (component b).
The results of the iVR-location finder show that day one of the work breakdown structure requires four 3D scanners on the job-site to monitor the six targeted columns; the field of view of the camera is 297°, the blocked field of view is 63°, and the laser camera view in the BIM environment is illustrated in Figure 9 (component c).
The lab experiment was divided into five stages corresponding to the iVR system modules: the first stage covers the iVR-location finder, the second iVR-scan, the third iVR-prepare, the fourth iVR-inspect, and the final stage iVR-feedback. First, the lab test commenced by targeting one column for 3D laser scanning, capturing 3D geometry vertices, and registering the point cloud using the Kinect V2 within seconds. The Qualla library is used to control the resolution of the point clouds and to manage the computational data processing and the time required. Subsequently, a cropping box is built to include only the targeted objects (the concrete columns in this case study) for the next step, to reduce computational time, and to exclude unnecessary scanned objects from the point cloud, as described in Figure 9. Second, iVR-prepare registers point clouds at each interval and updates the existing point clouds in a loop; in this case study, the interval for updating the existing point clouds is set to 15 min. As stated in the Methodology section, iVR-prepare converts the concrete column point clouds into a 3D mesh using PVA. Thereafter, the 3D mesh and point cloud are sent to the construction office using the Speckle cloud 3D model synchronizing tool and overlaid with the BIM model through the Grasshopper and Rhinoceros software. Third, in the construction office, the project engineer receives the point cloud and 3D mesh data using the Speckle data receiver tool of Grasshopper and Rhinoceros. This study uses the Oculus Rift S for iVR-inspect. The inspector begins visualizing the 3D model through immersive VR, taking measurements, writing notes, checking progress, assessing quality, acquiring snapshots of necessary details, and generating work inspection reports. Fourth, the algorithm developed for iVR-feedback launches the inspector's report back to the job-site using the Fologram AR tool of Grasshopper and Rhinoceros.
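The cropping-box step described above amounts to an axis-aligned bounding-box filter over the scanned points. The following is a minimal sketch of that filter; the function name and the box bounds (a 0.4 m × 0.4 m column footprint up to 3 m height) are illustrative assumptions, not the system's actual implementation.

```python
# Sketch of the iVR-scan cropping step: keep only points that fall inside
# an axis-aligned box around the target column, discarding the rest of the
# scanned room to reduce computational time.
def crop_to_box(points, box_min, box_max):
    """Filter (x, y, z) tuples to those inside the axis-aligned crop box."""
    return [p for p in points
            if all(lo <= v <= hi for v, lo, hi in zip(p, box_min, box_max))]

# A 0.4 m x 0.4 m column footprint plus some background clutter.
cloud = [(0.1, 0.1, 1.0), (0.3, 0.2, 2.5), (3.0, 3.0, 1.0), (0.2, 0.35, 0.1)]
column_only = crop_to_box(cloud, (0.0, 0.0, 0.0), (0.4, 0.4, 3.0))
```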
Workers on the job-site receive the work inspection report, which includes the BIM model, comments and drawings, accomplishment rate, and overlaid model. It should be noted that external data can also be attached to progress monitoring reports; these data include site location workspace, task independence, schedules, and construction methods. For the AR to function appropriately, the worker's phone should be registered in the Fologram tool in the office, and both devices should connect to the same Wi-Fi network. In this case study, iVR-feedback successfully visualizes the feedback data on the construction site, and the worker can view the inspector's comments and the overlaid model and follow the updated information recommended by the inspector without any physical interaction, as described in Figure 10.
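The inspection report exchanged between iVR-inspect and iVR-feedback bundles the items listed above into one payload. As a hedged sketch, it could be serialized as a simple JSON document; the field names and values below are hypothetical, not the system's actual schema.

```python
import json

# Hypothetical layout of an iVR inspection report, serialized for transfer
# from the office (iVR-inspect) to the job-site AR client (iVR-feedback).
report = {
    "activity": "column finishing",
    "accomplishment_rate": 0.65,          # fraction of courses completed
    "comments": ["Tile course 4 misaligned on column C-2"],
    "snapshots": ["col_c2_detail.png"],   # placeholder file names
    "overlay_model": "as_built_vs_plan.3dm",
}
payload = json.dumps(report)              # what would travel over Wi-Fi
restored = json.loads(payload)            # what the worker's device decodes
```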
Changes in mixed reality are introduced into the models by parametrically associating the observed movement behaviors with the framework generated on the device. In the event-based parametric modeling interface, buttons are created and placed in the physical space adjacent to the construction site; these buttons can increase or decrease the current course of finishing-material tiles covering the columns without using the BIM environment, as shown in Figure 10. The columns in the design model were clustered into courses so that the building stages can display each course separately as needed, and so that unbuilt areas of the plan do not cause visual disturbances during development. Figure 10. Column lab test case study for work inspection using all phases of the iVR system.
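The event-based course control described above can be sketched as a small state object whose increase/decrease handlers are bound to the virtual buttons. The class and method names are illustrative assumptions; in the actual system this logic lives in Grasshopper components driven by Fologram events.

```python
# Sketch of the virtual-button behaviour: the buttons increment or decrement
# the number of finishing-material tile courses displayed on a column,
# clamped to the column's total course count.
class CourseDisplay:
    def __init__(self, total_courses):
        self.total = total_courses
        self.shown = 0                    # courses currently visualised

    def increase(self):                   # bound to the "+" button
        self.shown = min(self.shown + 1, self.total)

    def decrease(self):                   # bound to the "-" button
        self.shown = max(self.shown - 1, 0)

column = CourseDisplay(total_courses=8)
for _ in range(3):
    column.increase()                     # worker taps "+" three times
column.decrease()                         # then "-" once
```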
To complete one loop of the data process in iVR, the time needed, precision, and file size were monitored and quantified to examine how the hidden layers of this system work. Before starting the case study, some parameters were fixed: the target activity at the job-site is selected, a 3D scanner is installed on-site and connected to the site computer, the office and site computers are connected to the same Wi-Fi and Speckle cloud ID, the VR headset is set up and ready to use, the worker's mobile phone ID is registered on the office computer, and QR codes in the BIM model and on the job-site are generated, placed, and synchronized to one benchmark. Table 3 outlines the time needed to finish one loop of the iVR system. To measure the pure data computational time, Table 3 excludes the location-finding, inspector-commenting, and worker-feedback times, because these vary with the person and the nature of the activity. The data-processing time needed to finish one loop of iVR for one concrete column is nearly 17 min, and the total time required to finish one loop, including the inspector commenting in VR, the worker visualizing in AR, and the iVR-location finder, was nearly 40 min for this case study. The times in this table depend on the computational power of the device, the internet speed, the data size, and the activity nature; therefore, it is an effective practice to estimate how long one iVR loop will take before activity progress monitoring starts.
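The loop-time estimate recommended above can be sketched as a simple per-stage budget. The stage labels follow the module names in the text, but the individual per-stage durations below are illustrative placeholders (Table 3 is not reproduced here); they are chosen only so that the subtotals land near the ~17 min processing and ~40 min total figures reported.

```python
# Rough time budget for one iVR loop. Per-stage minutes are assumed
# placeholders, NOT the measured values from Table 3.
stage_minutes = {
    "iVR-scan (capture + crop)": 2.0,
    "iVR-prepare (mesh + Speckle upload)": 12.0,
    "iVR-inspect (receive + overlay)": 3.0,
}
variable_minutes = {
    "inspector commenting in VR": 15.0,   # person-dependent, excluded in Table 3
    "worker viewing in AR": 8.0,
}
processing_total = sum(stage_minutes.values())          # pure data processing
loop_total = processing_total + sum(variable_minutes.values())
```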

Case Study 2: HVAC Pipe Installation Activity Monitoring
In this case study, an HVAC pipe installation activity was targeted to examine the proposed system. Using iVR, inspectors can easily inspect HVAC pipes hanging from the ceiling and perceive them from a variety of angles and at different scales, which is difficult on a real job-site. Therefore, case study 2 is important for verifying the advantages of the proposed system. The Kinect camera location was generated using the iVR-location finder and directed toward the target activity, as shown in Figure 11. Next, the camera started generating point clouds of the targeted HVAC pipes and surrounding objects, including the wall, ceiling, and partition window. Then, iVR-prepare converted the point cloud into a 3D model, as shown in Figure 11. After that, the inspector could view the HVAC pipes from angles that are hard to reach on the job-site. iVR-inspect of the HVAC pipe activity lets the inspector view the 3D model generated from the site and compare it to the BIM model while sitting in the construction office, using a virtual reality tool. The inspector checks defects, writes comments, takes screenshots, checks measurements and coordinates, and views the model at different scales and from different angles. Finally, iVR-feedback enables the worker on the job-site to view the inspector's report in AR mode. One loop of data processing in iVR in case study 2 took 11 min to complete, as illustrated in Figure 12.

Discussion
This research starts with an inspection checklist and predefined settings that include synchronizing the physical and digital models; setting the field of view, laser scanner resolution, and illumination; the BIM model level of detail; and the work breakdown structure. In addition to the predefined settings, the iVR-location finder helps the inspector optimize the laser scanner's position through a user interface that controls scanner selection and WBS data and visualizes the scanner's field of view in the predefined setting before monitoring begins. The inspector positions the laser scanner in the exact position generated by the iVR-location finder and synchronizes its field of view in both the physical and digital models. The iVR monitoring system starts when iVR-scan registers point cloud data every 100 ms and sends them to iVR-prepare on the construction site. Next, iVR-prepare converts the point cloud data into a 3D mesh using the marching cubes technique and sends the generated geometry from the construction site to the construction office via the Speckle cloud. After that, iVR-inspect receives the 3D mesh geometry from the construction site, and the inspector views it in a virtual environment using the Mindesk plugin. The inspector compares the as-built model received from the job-site with the as-planned model in the office; checks the quality; measures work progress; writes comments; draws sketches; takes snapshots of specific areas and details that the worker needs to address; packs these into a report while in virtual reality mode; and sends it to iVR-feedback. Finally, the worker receives the inspection report on his smartphone, views it using the augmented reality of the iVR-feedback module, and follows the instructions written in it. The relationship between all iVR modules and the data passing among them was tested in the two case studies and successfully delivered the expected results.
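The module chain summarized above can be condensed into a single pass of function composition. Each function below is a stand-in for the corresponding Grasshopper component, and the point-cloud and mesh values are placeholders, not real scanner output; the sketch only shows how data flows from job-site to office and back.

```python
# Condensed sketch of one pass through the iVR module chain. All bodies are
# stand-ins: ivr_scan fakes a point cloud, ivr_prepare fakes the meshing
# step, and the report contents are placeholders.
def ivr_scan():
    """Stand-in for Kinect capture: returns a fake point-cloud sample."""
    return [(0.0, 0.0, 0.0), (0.4, 0.4, 3.0)]

def ivr_prepare(points):
    """Stand-in for marching-cubes meshing and the Speckle upload."""
    return {"mesh_from": len(points)}

def ivr_inspect(mesh):
    """Stand-in for the office-side VR review producing a report."""
    return {"comments": ["ok"], "mesh": mesh}

def ivr_feedback(report):
    """Stand-in for the Fologram AR view on the worker's phone."""
    return f"AR view: {len(report['comments'])} comment(s)"

# One loop, job-site to office and back.
result = ivr_feedback(ivr_inspect(ivr_prepare(ivr_scan())))
```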
The results of this research include the recorded timing, the data transfer between modules, and the significant findings of each case study. The time needed to finish one loop of iVR data passing was about 17 min, and the total case study needed about 40 min, including the inspector writing comments and comparing the BIM and as-built models and the worker viewing the inspection report in augmented reality mode. The iVR-location finder targeted one column, optimized the laser scanner location, and passed the location coordinates to iVR-scan, which captures the point cloud and passes it to iVR-prepare. iVR-prepare can convert point cloud data to a voxel structure (a low-resolution, lightweight model with a rough surface and low thickness accuracy) or a mesh format (a high-precision model that takes a long time to generate). 3D mesh geometry transfer from the construction site to the office is limited to small objects and cannot handle large geometries such as a complete building.
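The voxel option mentioned above trades surface accuracy for a lighter model. A common way to realize it is voxel downsampling: snapping points to a coarse grid and keeping one representative per occupied cell. The sketch below assumes this approach and an illustrative 0.5 m voxel size; it is not the actual iVR-prepare implementation.

```python
# Sketch of voxel downsampling: one representative point per occupied cell.
# Larger voxel sizes give lighter (but rougher) models, matching the
# low-resolution voxel structure described in the text.
def voxel_downsample(points, voxel=0.5):
    """Keep the first point seen in each occupied voxel cell."""
    cells = {}
    for x, y, z in points:
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        cells.setdefault(key, (x, y, z))      # first point wins per cell
    return list(cells.values())

dense = [(0.1, 0.1, 0.1), (0.2, 0.1, 0.1), (0.9, 0.1, 0.1), (0.1, 0.9, 0.1)]
light = voxel_downsample(dense, voxel=0.5)
```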
The results of the case studies on monitoring the progress of the concrete columns and HVAC pipes revealed that the proposed system can support more advanced and comprehensive activities on construction job sites. The developed iVR system successfully registered the monitoring data, processed them, and enabled the inspector to generate a report and send it back to the construction job-site as feedback. In iVR-inspect, the inspector can employ VR to view the concrete columns and HVAC pipes from different angles and obtain measurements that would not be possible using conventional monitoring processes. This study argues that the use of iVR can improve decision-making in the work inspection stage and result in a better construction product in a shorter time, with limited human involvement. Moreover, this method reduces physical interactions, which helps maintain social distancing and prevent the spread of the current health pandemic.
By exploiting the advantages of VR, the inspector can orbit around the target activity, obtain measurements, magnify the structure (1/1 scale or closer), locate areas, and observe precise existing and future conditions. In addition, locations that are difficult to access in real life, such as the space between a false ceiling and the ceiling beams, can be monitored using this system. However, digital models may not present all the required information; thus, onsite verification needs to be performed remotely. Once onsite dimensions are verified, they can be used to inform and guide project engineers regarding future steps. Owing to the simple parametric setup of the Grasshopper environment, design engineers can adjust the capacity, size, and location of each system quickly and effortlessly. This process also enables the design engineer or the project manager to access the model, facilitating collaboration between them and thus creating a global model that all disciplines can observe and contribute to. This workflow can be extremely helpful for projects that require the deployment of prefabricated components as soon as they arrive, such as double-supported frame walls that occupy more space than normal.
In general, this system contributes to automation and sustainability in construction monitoring by reducing the manpower required, which lowers inspection cost and logistics, since the inspector sits in the office and remotely inspects indoor activity without needing to examine the site personally. The innovative methods, algorithms, and technologies developed in this study distinguish it from previous research: the iVR system integrates point cloud capturing devices with extended reality (VR and AR) technology for remote construction indoor activities into one platform. The most important contributions of this study are as follows. First, in this study, a 3D laser scanner is directly connected to the BIM environment and captures point cloud data inside the iVR system platform, which does not need an intermediate platform to handle or send point cloud data between other software; in contrast, as the literature review stated, most current best-practice work inspection requires an intermediate platform to convert the scanned data into a point cloud or massing geometry.
Second, the iVR system eliminates file format compatibility issues, as all hardware and software are connected directly to the graphical algorithm editor Grasshopper, on which this system is hosted inside the commercially developed Rhinoceros3D software. In contrast, other progress-monitoring methods mentioned in the literature review involve different scanning devices with various formats (XYZ, PLY, PTS, E57, LAS, and LAZ) and BIM formats (.3dm, .dwg, .obj, .ifc, .rvt, and .nwd), which result in incompatibility issues. This can even cause distortion, loss of data, and time consumption owing to the use of different versions of data processing. Third, this system contributes to a healthy monitoring environment by minimizing physical interaction between workers and inspectors on the job-site, enabling remote construction monitoring from the construction office without the need to be physically present on the job-site. The time gap between live data capture on the construction site and the construction office is about 17 min, which might be impractical for intense activities that require live monitoring, such as installing sensors or CCTV cameras and pouring fast-hardening materials onsite, but this time gap is adequate for other ordinary indoor activities.
Fourth, in iVR, data storage is managed and controlled within the BIM environment. The Kinect devices, VR headset, and Fologram are attached to the BIM environment to import/export data, resulting in a lightweight and rapid system. However, in other state-of-the-art progress-monitoring approaches, data are stored on the scanner manufacturer's cloud or on portable devices; this is impractical because the user needs to connect all stored point cloud data to another data cloud or BIM software to monitor activity progress. Moreover, commercially developed construction progress tools build the as-built model by personally visiting the job-site, capturing data manually, and then processing it, which takes more than 24 h before it is ready for inspection, whereas iVR updates the as-built model automatically in near real-time (every 17 min), as shown in Table 4.
The applied research has some limitations. For instance, all data need to be readily available to feed into the system before iVR can start performing. Next, in the iVR-inspect module, data overload occasionally occurs while using the virtual reality mode for inspection, which causes the system to crash; therefore, it is advised to reduce the BIM model details before overlaying it with the as-built model inside the VR environment to prevent data overload.
In addition, the point cloud capturing device of the iVR-scan module has limitations in scanning highly reflective items, such as mirrors and highly polished surfaces; transparent items, such as glass and liquids; and items that refract and reflect light, such as number plates and reflective panels on clothing. Moreover, objects that typically do not reflect light may do so under certain conditions (such as car paint coatings and stainless steel); hence, additional scans are required to obtain sufficient information about such objects. Furthermore, the iVR-scan module works well for indoor data acquisition but does not perform well outdoors owing to strong sunlight illumination. However, because iVR-scan is built using visual programming, the Kinect device can easily be replaced with other data capturing devices for outdoor monitoring purposes with only a minor change in the code.
Moreover, each iVR module has certain input and output requirements to perform correctly; therefore, if some data are unavailable or corrupted, the system will not work properly and the inspector needs to debug the problem. Finally, in this research, iVR-prepare targets indoor activities containing small objects that can be observed within the field of view of one Kinect camera and converts the captured point cloud to a 3D mesh. The point cloud density and geometry quality therefore need to be optimized and set before the monitoring stage to prevent the data overload that might cause failures in data transfer between the job-site and the office. iVR-prepare successfully transfers geometry data for objects in activities such as column finishing work and HVAC pipe installation from the job-site to the office, but it cannot handle large geometry data, such as a full building or multiple workspaces at the same time; such data need to be divided among separate monitoring stations to prevent data overload.
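The splitting strategy suggested above, dividing a large scene among separate monitoring stations, can be sketched as tiling the floor plan and routing each captured point to the station responsible for its tile. The 5 m tile size and the function name are illustrative assumptions, not part of the actual system.

```python
# Sketch of dividing a large floor into monitoring-station tiles so that
# each iVR-prepare instance handles a bounded volume. Tile size is assumed.
def assign_station(point, tile=5.0):
    """Map an (x, y, z) point to the (row, col) station tile covering it."""
    x, y, _ = point
    return (int(y // tile), int(x // tile))

points = [(1.0, 1.0, 0.0), (6.0, 1.0, 0.0), (1.0, 7.0, 2.0)]
stations = {}
for p in points:
    stations.setdefault(assign_station(p), []).append(p)
# Each station's bucket can now be meshed and transferred independently.
```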

Conclusions
Although there have been significant developments in 3D scanning and photogrammetry technologies for construction work inspection, existing progress-tracking methods still require manual intervention or data processing. This significantly increases the time required for monitoring and relies on conventional methods. To address this issue, we developed and tested a near real-time construction project monitoring tool (iVR). Based on the findings, the significant benefits of iVR are as follows:
• The results indicate that the developed tool bridges the gap between the construction office and job-site by capturing only the necessary data and exchanging it to decrease computational complexity and time consumption.
• Immersive VR and AR help the inspector and workers visualize the activity from different angles, which cannot be achieved via conventional methods, and provide the necessary tools for drawing, commenting, attaching data, and communicating faster without requiring physical interactions.
• Using iVR, inspectors can monitor various activities quickly without physically leaving the construction office. This can have valuable implications for the nature of work inspection, reduce human resources, increase automation in monitoring activity, and increase quality and production, which will reduce monitoring cost.
• Another contribution of the proposed approach is that it facilitates social distancing and reduces physical interaction between workers and the construction officer; this is expected to help curb the spread of the current pandemic and create a healthier and safer environment on the construction site.
In summary, the significant potential of iVR in work inspection at the construction stage was ascertained and confirmed using a lab case study. In the future, this tool can be developed as a tab plug-in to commercial software applications, thereby enhancing the entire inspection process. However, the architecture of the present methodology is limited to the geometry of the activity, such as built-in work inspection.
Hence, subsequent works should aim to expand the proposed model to include progress tracking (photogrammetry + real 3D scanner), material tracking (GPS and RFID), worker tracking (RFID), and equipment tracking (GPS and distributed sensors). iVR-scan only scans a job-site target activity; hence, it should be extended to focus on object quality assessments, human activities, and human recognition and skeletonizing, for identifying human figures and tracking the skeleton images of people moving within the Kinect field of view. In the future, this system will be implemented and tested on actual construction sites to elucidate its potential advantages and challenges. Moreover, it is anticipated that iVR can become a communication platform between the QA/QC engineer, owner, and general contractor during the construction stage of a project by utilizing extended reality integrated with 3D laser scanning to increase automation and efficiency as a key toward sustainability in construction. Furthermore, in iVR-prepare, the generated reports can be integrated with the targeted activity reports of the BIM schedule and provide feedback for the self-update of the 4D BIM, thereby enhancing automation in QA inspection.