1. Introduction
Cultural heritage represents the accumulated legacy of societies, civilizations, and artistic expression over time. Preserving historic buildings, artifacts, and landscapes requires accurate documentation methods capable of capturing both geometric complexity and material condition. In recent decades, advanced digital technologies have played an increasingly important role in heritage documentation and conservation planning [1,2,3].
Among these technologies, terrestrial laser scanning (TLS) has emerged as a powerful tool for recording historic structures with high geometric accuracy and efficiency. TLS enables rapid acquisition of dense three-dimensional (3D) point cloud data representing the shape, dimensions, and surface characteristics of architectural elements [1,2,3,4,5,6]. When combined with temporal information, these datasets can be extended into four-dimensional (4D) models, allowing the monitoring of changes in geometry and material condition over time. Such 4D documentation is particularly valuable in cultural heritage applications, where long-term deterioration processes and intervention impacts must be carefully assessed [7,8,9].
Photogrammetry represents an alternative and complementary approach to laser scanning, generating 3D models from overlapping photographic images captured from multiple viewpoints [5,10]. This technique is well-suited for large structures and outdoor heritage sites; however, it remains sensitive to lighting conditions, surface reflectivity, and image overlap requirements. Complex geometries, shadows, and reflective materials can introduce inaccuracies, and the processing of large image datasets may be computationally demanding [7,11,12]. As a result, hybrid workflows combining TLS and photogrammetry are increasingly adopted to maximize accuracy and completeness [5].
The application of 3D and 4D scanning technologies has significantly transformed cultural heritage documentation. High-resolution digital models now support defect detection, geometric analysis, and the monitoring of structural and material changes over time. These capabilities enable preventive conservation strategies, facilitate virtual access for research and education, and provide digital safeguards against risks such as natural disasters, theft, and material loss [7,13,14].
Accurate and up-to-date documentation is essential for assessing the condition of historic structures, identifying defects, and planning appropriate conservation interventions. Advanced imaging and scanning techniques allow the detection and quantification of cracks, erosion, deformation, and material deterioration, as well as the registration of multi-temporal datasets for change analysis (the fourth dimension of documentation) [15,16,17]. However, despite these advances, existing workflows often focus primarily on geometry and lack systematic integration with defect characterization, material diagnostics, and treatment evaluation.
Defect detection and registration remain critical challenges in heritage documentation. Reliable analysis requires precise alignment of as-built laser scanning data with reference models and consistent spatial frameworks for defect localization [18,19,20]. Addressing these challenges calls for integrated information models capable of linking geometric data, defect attributes, material behavior, and temporal evolution within a unified environment.
In response to these needs, this study presents a four-dimensional Historical Building Defect Information Model (HBDIM) designed to support defect monitoring, digital modeling, and conservation treatment assessment in historic buildings. The proposed framework integrates terrestrial laser scanning, photogrammetry, total station measurements, and laboratory material analyses to enable comprehensive documentation and long-term monitoring. The methodology is applied to the Baron Empain Palace in Egypt, as shown in Figure 1, demonstrating the potential of HBDIM to enhance conservation planning, defect analysis, and preventive heritage management.
1.1. Plaster Implementation Technology in the Baron Palace
Close inspection reveals the exquisite detail of the decorations. As in Hindu architecture, the building is rich in statues and inscriptions, though in a restrained form; these decorative elements were often executed in advance and then moved into place. The molds themselves were made of reinforced cement. Finally, to give the building distinction and uniqueness, the architect chose to color it Siena red, a result obtained with iron-oxide-based pigments sprayed onto the elements using pumps [20].
Examination of the plasterwork implementation technology showed that small formations were executed using prepared molds, whereas for large formations a concrete structure was built and precast parts were installed on it. Tests and analyses of this unique Indian model in the Baron Empain Palace, as shown in Figure 2, revealed that the applied colors were oxide pigments mixed with gum arabic [20].
The Greeks are said to have been the first to use gypsum plasters in European architecture, having learned how to create and use them from the Egyptians and Minoans [21]. The Romans then spread this knowledge to Central and Northern Europe. Over time, gypsum plaster was incorporated into traditional European building practice, either alone or in combination with other ingredients such as sand, lime, and various organic additives. Contrary to popular opinion, it was used for a number of specific purposes, such as floor screeds, brick mortars, wall relief embellishments, architectural features, and bases for exquisite frescoes [21,22].
For instance, the final decade of the twentieth century saw the rediscovery and application of the medieval technique of using gypsum floor screeds and mortars in reconstruction and restoration projects [21,23]. The popularity of gypsum plasters skyrocketed during the Baroque and Rococo periods because they made it possible to replicate noble materials such as marble, often with more pronounced veining and brilliance than the genuine materials [21,24].
The Baron Palace utilized stucco formations as decorative elements in both art and construction, representing one of the most important design features that provided a distinctive tone to the entire structure. These elements were usually crafted beforehand and subsequently transferred for installation. The deterioration of the stucco formations was driven by relative humidity, air pollution, temperature variation, and wind erosion, as well as by physicochemical and mechanical factors that caused their weathering [25,26,27,28,29].
Figure 2. Plaster implementation technology in Baron Palace [30].
1.2. Deterioration Processes of Stucco Formations
The stucco formations and their individual components disintegrated under deterioration factors that led to the detachment of the stucco. Indirect effects, such as the crystallization of certain salt species driven by water evaporation cycles, together with alternating cycles of relative humidity, air pollution, and temperature variation, caused mechanical deterioration. Chemical deterioration, driven by air pollution and evaporation cycles, produced colored crusts and flaking on the stucco surfaces. In addition, the durability index of the stucco formations is affected by weathering micro-agents, which can rapidly alter their petrophysical properties. As a result of these deterioration factors, the stucco formations and their structural components were completely disintegrated [30,31,32,33].
Salts can crystallize just beneath the surface if the underlying structure is less porous. These crystals exert pressure on the interior of the stucco as the salts absorb water and evaporate at high temperatures, producing cycles of expansion and contraction [34,35], as shown in Figure 2. Because of the irrigation of the gardens at the Baron Palace, groundwater levels can rise. Alternatively, humidity can accumulate in wall gaps, freeze in the winter months, and melt in spring, damaging the stucco and compromising the construction [36,37].
The primary novelty of the study is the development of a 4D Historical Building Defect Information Model (HBDIM). This model adds a fourth dimension to represent changes over time, building on x, y, and z coordinates derived from real-world 3D data. By reducing the need to reconstruct damaged or destroyed copies of historically significant objects, cultural heritage is preserved while saving money and time.
The second study objective is to investigate the critical components for detecting, recording, and analyzing historical architectural damages using 4D data from the Baron Palace with varied degrees of precision. To accomplish this purpose, a range of tools such as digital cameras, laser scanners, CT, SEM, TEM, and XRF will be used for 4D monitoring, registration, and damage analysis of the historical structure, as described by the study methodology in the next section.
1.3. Research Gap
Despite significant advances in the digital documentation of historic buildings, existing approaches for stucco conservation largely focus on geometric recording and visual inspection, often treating defect documentation, material diagnostics, and conservation treatment assessment as separate and disconnected processes. While three-dimensional and four-dimensional scanning technologies have proven effective for capturing geometry and monitoring changes over time, they are rarely integrated into a unified framework that systematically links detected defects with material behavior and consolidation performance. Furthermore, many heritage documentation workflows rely on static models that lack explicit temporal structuring, limiting their capacity to support long-term monitoring and predictive conservation strategies. In addition, laboratory-based evaluations of consolidants, particularly nanomaterial treatments, are frequently conducted independently from spatially referenced digital models, making it difficult to relate microstructural improvements to specific defect locations or to assess treatment effectiveness at the building scale. As a result, there remains a critical gap in heritage conservation research for an integrated methodology that combines four-dimensional digital documentation, defect-oriented information modeling, and material-level performance assessment within a single, scalable framework applicable to historic stucco formations.
1.4. Research Novelty and Contributions
This study addresses the identified research gap by proposing a four-dimensional Historical Building Defect Information Model (HBDIM) that integrates geometric documentation, defect characterization, and material diagnostics within a unified digital environment. Unlike conventional HBIM or scan-based documentation approaches, the proposed HBDIM framework explicitly focuses on defect-oriented modeling and temporal tracking, enabling the systematic localization, classification, and monitoring of deterioration phenomena over time.
The novelty of this research lies in the integration of non-destructive digital documentation techniques, such as terrestrial laser scanning, photogrammetry, and total station measurements, with laboratory-based analyses including computed tomography, SEM, TEM, XRF, and mechanical testing. This integration allows material-level consolidation performance to be directly linked to spatially referenced defects within a four-dimensional model.
Key contributions of this study include the following:
- I.
The development of a defect-oriented 4D digital modeling framework tailored for historic stucco conservation.
- II.
The validation of calcium hydroxide nanoparticle consolidation through combined microstructural and mechanical assessment.
- III.
The demonstration of how HBDIM can support preventive conservation planning, data-driven decision-making, and long-term monitoring of heritage assets.
1.5. Conceptual Framework and Terminological Hierarchy
To eliminate conceptual ambiguity, the relationship between HBIM, DIM, HBDIM, and the 4D temporal layer is structured hierarchically as follows:
HBIM (Historic Building Information Modeling) represents the general digital modeling environment for historic structures, integrating geometry and semantic information.
DIM (Defected Information Modeling) is a defect-oriented subset of HBIM, focusing specifically on identifying, extracting, and modeling deterioration features from point cloud datasets.
HBDIM (Historical Building Defect Information Modeling) extends DIM by embedding defect entities within a structured information model linked to material properties, diagnostic results, and conservation actions.
4D Layer represents the temporal dimension, enabling comparison between different scanning campaigns and recording defect evolution and consolidation performance over time.
This structured hierarchy clarifies that HBDIM is not a separate modeling paradigm, but a defect-focused and temporally enabled extension of HBIM.
To clarify the implementation of the temporal dimension within the proposed framework, the fourth dimension (time) in HBDIM is designed to incorporate datasets acquired through repeated scanning campaigns conducted at different time intervals. Each dataset obtained from terrestrial laser scanning or photogrammetry can be registered within the same spatial reference system and linked to a specific timestamp within the HBDIM database. This structure enables the comparison of time-stamped point clouds or mesh models to detect geometric deviations associated with deterioration processes such as crack propagation, surface erosion, deformation, or material loss. By integrating defect-oriented information modeling, these detected changes can be spatially associated with previously identified deterioration patterns within the digital model. In addition, conservation treatments can be documented as time-linked intervention layers within the model, allowing pre- and post-treatment datasets to be compared in order to evaluate the effectiveness of consolidation measures. This approach enables the HBDIM framework to support long-term condition monitoring and data-driven conservation management through multi-temporal analysis.
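The multi-temporal comparison described above can be sketched in a few lines. The following Python snippet is a minimal illustration only (hypothetical function names and toy coordinates, not the project's actual pipeline): it flags points of a later, co-registered scanning epoch whose nearest-neighbor deviation from an earlier epoch exceeds a threshold, which is the basic operation behind detecting crack propagation, erosion, or material loss between campaigns.

```python
import math

def nn_distance(p, cloud):
    """Distance from point p to its nearest neighbour in a reference cloud."""
    return min(math.dist(p, q) for q in cloud)

def detect_changes(epoch_a, epoch_b, threshold):
    """Return points of the later epoch (epoch_b) whose deviation from the
    earlier epoch (epoch_a) exceeds the threshold: candidate deterioration."""
    return [p for p in epoch_b if nn_distance(p, epoch_a) > threshold]

# Two toy "scanning campaigns" of the same registered surface patch;
# one point in the later epoch has receded (simulated material loss).
scan_2023 = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
scan_2025 = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, -0.05)]

changed = detect_changes(scan_2023, scan_2025, threshold=0.01)
print(changed)  # only the receded point is flagged
```

In a real HBDIM workflow each epoch would carry a timestamp in the model database, and the flagged points would be linked back to previously recorded defect entities rather than simply printed.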
2. Materials and Methods
This section describes the methodological framework adopted to develop and implement the proposed Historical Building Defect Information Modeling (HBDIM) approach. The methodology integrates field-based digital documentation, laboratory material characterization, and data processing workflows to support the detection, modeling, and assessment of deterioration in historic stucco elements.
The workflow begins with multi-source data acquisition, including terrestrial laser scanning, high-resolution photography, and total station measurements, followed by defect detection, registration, and digital modeling processes. In parallel, laboratory-based analyses, including computed tomography (CT), scanning electron microscopy (SEM), transmission electron microscopy (TEM), X-ray fluorescence (XRF), and mechanical testing, were conducted to evaluate the effectiveness of calcium hydroxide nanoparticle consolidation treatments. The methodological structure is organized to reflect the sequential stages of the HBDIM framework, ensuring reproducibility, accuracy, and integration between geometric documentation, material diagnostics, and four-dimensional data modeling. The integration process in HBDIM follows a sequential data flow where geometric data acquired by TLS and total station measurements establish the spatial framework, photogrammetric images provide surface texture and visual defect information, and laboratory analyses are linked to spatially localized defects to evaluate treatment performance.
Laboratory diagnostic results were linked to the digital HBDIM by associating each analyzed sample with its spatially identified defect zone within the registered point cloud model. This allows microstructural and compositional information to be interpreted in direct relation to geometrically localized deterioration.
The work system and its activities are as follows, as shown in Figure 3.
2.1. Data Acquisition and Detection
The data acquisition and detection process began with the three-dimensional laser scanning of the defective Baron statue, conducted on a specified date and time to ensure temporal consistency. The resulting point cloud dataset was imported into the modeling software environment, while high-resolution digital photographs of the defective statue were captured on the same date and prepared for analysis. A comparative assessment between the laser-derived 3D geometry and the photographic data was then carried out to identify geometric defects, define their spatial boundaries, and support initial defect interpretation.
In parallel, observable deterioration features of the Baron Empain Palace were identified through geospatial detection using high-resolution satellite imagery within the Google Earth platform. The palace model was located and examined using geographic coordinates (northing, easting, and elevation), providing contextual spatial reference for the documented defects. This geospatial identification facilitated the correlation between field observations, the four-dimensional (4D) point cloud model, and external spatial datasets, supporting accurate localization of damaged areas prior to detailed defect modeling.
2.2. Data and Materials Evaluations
At this stage of the methodology, the defective historical models of the Baron Empain Palace and the studied statue were imported into the modeling environment for integrated digital and material evaluation. Geomagic Studio (version 12.0) was used to identify and attribute defect characteristics, while high-resolution digital photographs were systematically compared with laser scanning and modeling outputs to support accurate damage interpretation. This integrated evaluation enabled the correlation of visible surface deterioration with spatially referenced geometric data, forming the basis for subsequent defect modeling and analysis, as further discussed in Section 3 and Section 4.
In parallel, the efficiency of calcium hydroxide nanoparticles dispersed in isopropanol at a concentration of 30 g/L was evaluated to assess their effectiveness in improving the mechanical and microstructural properties of stucco formations.
Although the analytical procedures described above were conducted under controlled laboratory conditions, the obtained results provide essential reference information for interpreting deterioration and consolidation processes directly on the structure. In practical applications, the diagnostic findings are integrated into the HBDIM environment and spatially associated with the corresponding architectural elements identified in the digital model. Through periodic in situ surveys using terrestrial laser scanning or photogrammetry, changes in the geometry and condition of these elements can be detected and compared with the material behavior identified through laboratory analyses. By linking time-stamped survey data with the diagnostic information stored in the model, the HBDIM framework enables the monitoring of deterioration patterns and the assessment of consolidation effectiveness over time. In this way, laboratory-based material characterization supports in situ conservation evaluation within a multi-temporal digital documentation framework.
Treated and untreated stucco samples were comparatively examined using a combination of destructive and non-destructive laboratory techniques. Computed tomography (CT) was employed as a non-destructive method to investigate internal structure, penetration depth, and material homogeneity before and after consolidation, using a tube voltage of 90 kV, a tube current of 160 μA, a field of view (FOV) of 20 mm, and a scanning time of 4.5 min. Microstructural surface morphology was analyzed using scanning electron microscopy (SEM) (JEOL JSM-5500 LV, JEOL Ltd., Tokyo, Japan) at an accelerating voltage of 15 kV under secondary electron imaging (SEI) mode, while nanoscale characteristics of the consolidant and its interaction with the stucco matrix were examined using transmission electron microscopy (TEM) (JEOL JEM-2100, JEOL Ltd., Tokyo, Japan) operating at 200 kV. In addition, X-ray fluorescence (XRF) analysis was conducted to identify the elemental composition of the stucco samples. Mechanical performance was evaluated by comparing the compressive strength of treated samples against untreated reference specimens to quantify consolidation effectiveness.
The consolidation process was carried out using an immersion method, in which eight stucco samples were treated with calcium hydroxide nanoparticles (Ca(OH)2) without altering their original physical characteristics or inducing adverse side effects. The samples were immersed in the consolidant for 21 days to allow sufficient time for penetration and completion of the consolidation process prior to laboratory evaluation.
2.3. Data Registration
In the fourth step of the process, a method was developed to improve the realism of the 4D information models of damaged historical objects. This was accomplished by registering the 4D geometric information of the historical models derived from the point clouds of the Palace and the statue. Matching points on paper targets formed the basis of the registration: the automated total station was used to measure the matching points established through paper target observation during the scans. Within the 4D historical object, the damaged model is registered into its actual location and orientation. The registration process consisted of two steps:
2.4. Data Modeling
Data modeling represents a core component of the proposed HBDIM framework, as it enables the transformation of raw point cloud data into structured, analyzable digital representations of defects in historic stucco elements. In this study, defect data were modeled to simulate their geometric and spatial characteristics within a real-world digital environment, allowing accurate localization, classification, and subsequent assessment. The data modeling process incorporates a sequence of point cloud processing techniques collectively referred to as Defected Information Modeling (DIM). These techniques were applied to laser-scanned datasets to enhance data quality, isolate defective regions, and generate reliable three-dimensional and four-dimensional representations suitable for defect analysis and long-term monitoring. The main DIM steps are summarized in Figure 4 and described in detail below.
2.4.1. Data Acquisition
Multiple terrestrial laser scans were acquired from different positions surrounding the Baron Empain Palace and the defected stucco statue. This multi-position scanning strategy was adopted to ensure full coverage of object surfaces, minimize occlusions, and capture fine geometric details associated with deterioration features [40]. The overlapping scans provided redundant geometric information, which is essential for reliable registration and subsequent defect detection. The resulting point cloud datasets served as the primary geometric input for the DIM workflow.
2.4.2. Data Format Conversion
The acquired point cloud data were initially stored in ASCII format, containing three-dimensional spatial coordinates (X, Y, Z) along with intensity and color information. To enable efficient processing and modeling, the datasets were converted into software-specific formats compatible with the applied modeling platforms [41]. This conversion step ensured interoperability between different software environments, including Geomagic Studio (version 12.0) and RiSCAN (version 2.23), while preserving spatial accuracy, intensity values, and color attributes required for defect detection and visualization.
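As a minimal illustration of this conversion step, the following Python sketch parses one ASCII scan record into a structured form. The record layout (X Y Z intensity R G B) is an assumption for illustration; actual scanner exports may order or scale these fields differently.

```python
def parse_ascii_point(line):
    """Parse one ASCII scan record assumed to hold: X Y Z intensity R G B."""
    fields = line.split()
    x, y, z = map(float, fields[:3])          # spatial coordinates
    intensity = float(fields[3])              # return-signal intensity
    r, g, b = map(int, fields[4:7])           # color attributes
    return {"xyz": (x, y, z), "intensity": intensity, "rgb": (r, g, b)}

# Example record from a hypothetical ASCII export.
record = parse_ascii_point("12.431 7.802 3.115 0.82 182 164 141")
print(record["xyz"], record["rgb"])
```

Preserving intensity and RGB alongside the coordinates, as done here, is what allows the later defect detection and texture mapping stages to use all three attribute types.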
2.4.3. Defected Outlier Point Detection
Defect detection was initiated through the application of an outlier detection algorithm designed to identify anomalous point distributions associated with surface deterioration. The algorithm identifies points that deviate significantly from the local geometric neighborhood, which often correspond to cracks, material loss, erosion zones, or surface irregularities. The primary control parameter of the algorithm is the sensitivity field, which determines the threshold for classifying points as outliers. By adjusting this parameter, defective regions could be isolated while minimizing the misclassification of intact surface geometry. The detected outlier points represent the initial quantitative indicators of defect presence within the DIM framework.
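The principle of neighborhood-based outlier detection with a sensitivity control can be sketched as follows. This is an illustrative simplification, not the proprietary algorithm implemented in Geomagic Studio: the statistic used (mean distance to the k nearest neighbors) and the sensitivity-to-cutoff mapping are assumptions chosen for clarity.

```python
import math
import statistics

def outlier_points(cloud, k=3, sensitivity=0.5):
    """Flag points whose mean distance to their k nearest neighbours is
    anomalously large relative to the rest of the cloud. The sensitivity
    value (0..1) lowers the cutoff, so higher sensitivity flags more points."""
    def knn_mean(i, p):
        dists = sorted(math.dist(p, q) for j, q in enumerate(cloud) if j != i)
        return sum(dists[:k]) / k

    means = [knn_mean(i, p) for i, p in enumerate(cloud)]
    mu, sigma = statistics.mean(means), statistics.pstdev(means)
    cutoff = mu + (1.0 - sensitivity) * 2.0 * sigma
    return [p for p, m in zip(cloud, means) if m > cutoff]

# A small planar patch with one stray point standing in for surface damage.
patch = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0), (5, 5, 5)]
print(outlier_points(patch))  # → [(5, 5, 5)]
```

Tuning the sensitivity parameter in this sketch mirrors the trade-off described in the text: too low and genuine defects are missed, too high and intact surface geometry is misclassified.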
Defect Detection Workflow
Defect detection in this study follows a semi-automatic workflow consisting of three stages:
- I. Geometric deviation analysis using the outlier detection algorithm (sensitivity 50%).
- II. Intensity-based surface inspection to enhance crack and material loss visualization.
- III. Manual expert validation through visual comparison with high-resolution photographs.
The approach is therefore classified as semi-automatic, where algorithmic detection is combined with expert interpretation to ensure reliability in complex historic stucco geometries.
2.4.4. Filtration and Noise Reduction
Following outlier detection, filtration and noise reduction were applied to improve the quality and usability of the point cloud datasets. This step was performed using Geomagic Studio and RiSCAN software to reduce excessive point density, eliminate residual noise, and remove irrelevant portions of the scans that did not belong to the studied objects. Statistical noise-reduction filters were applied by repositioning sample points toward statistically corrected locations based on standard deviation analysis. This process enhanced point cloud uniformity and stability, facilitating subsequent meshing and defect visualization while preserving essential geometric characteristics of the historic models.
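The repositioning principle behind such statistical filters can be illustrated in one dimension. The function below is a hypothetical simplification (the name `smooth_axis` and the weighting scheme are not from the software used): samples lying more than one standard deviation from the mean are pulled part-way back toward it, while samples within the band are left untouched.

```python
import statistics

def smooth_axis(values, weight=0.5):
    """Move each sample part-way toward the series mean when it lies more
    than one standard deviation away; a 1-D stand-in for the statistical
    repositioning applied to noisy point cloud coordinates."""
    mu = statistics.mean(values)
    sigma = statistics.pstdev(values)
    out = []
    for v in values:
        if abs(v - mu) > sigma:
            v = v + weight * (mu - v)   # pull the noisy sample inward
        out.append(v)
    return out

# Three consistent samples and one noisy spike along a single coordinate axis.
print(smooth_axis([1.0, 1.0, 1.0, 5.0]))  # → [1.0, 1.0, 1.0, 3.5]
```

Applied per coordinate axis (or along surface normals in 3D), this kind of correction improves point cloud uniformity without discarding data, which is why it precedes meshing rather than replacing outlier removal.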
2.4.5. Model Meshing (Triangulation) and Volume Wrapping
Meshing was performed to convert the filtered point cloud data into continuous surface representations suitable for visualization and quantitative analysis. A triangulation algorithm, such as the ball pivoting method, was applied to generate polygonal meshes by connecting neighboring points into triangular elements. The resulting polygonal models consist of interconnected triangular facets forming a closed surface that approximates the geometry of the original object. Volume wrapping was subsequently applied to refine surface continuity, eliminate minor gaps, and enhance the structural coherence of the digital models. This step is essential for producing stable and analyzable representations of defective stucco elements.
2.4.6. Texture Mapping
Texture mapping was applied to enhance the visual realism and interpretability of the meshed models. High-resolution digital photographs were mapped onto the polygonal surfaces, transferring color and texture information to the three-dimensional geometry. This process enabled realistic visualization of surface features such as discoloration, staining, cracking patterns, and erosion zones. Texture-mapped models support both qualitative inspection and spatial correlation between visual deterioration indicators and geometric defect data within the DIM environment.
2.4.7. Accuracy Evaluation
To evaluate the geometric precision of the modeling and registration processes, an accuracy assessment was conducted using the total standard deviation (S) of corresponding point distances between aligned datasets. The standard deviation provides a quantitative measure of modeling and registration quality and reflects the cumulative effects of scanning accuracy, registration performance, and data processing.
The total standard deviation is expressed as follows:
S = ±√(Σv² / (n − 1))
where n represents the number of corresponding point distances used in the evaluation, and v denotes the residual error, defined as the difference between the measured distance and the most probable value for that distance. This accuracy metric provides an objective basis for assessing the reliability of the DIM workflow and supports the use of the generated models for defect localization, comparative analysis, and future multi-temporal monitoring applications.
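Assuming the conventional least-squares form S = √(Σv² / (n − 1)), with v the residuals of the corresponding point distances, the metric can be computed directly. The snippet below is an illustrative sketch with made-up residual values, not data from the study.

```python
import math

def total_standard_deviation(residuals):
    """S = sqrt(sum(v_i^2) / (n - 1)), where each v_i is the difference
    between a measured point distance and its most probable value."""
    n = len(residuals)
    return math.sqrt(sum(v * v for v in residuals) / (n - 1))

# Hypothetical residuals (in metres) from four corresponding point distances.
s = total_standard_deviation([0.002, -0.001, 0.003, -0.002])
print(round(s, 6))  # → 0.002449
```

A small S relative to the scanner's stated accuracy indicates that registration and processing did not add significant error on top of the raw measurement noise.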
2.5. Defect Coding and Information Structuring Within HBDIM
To operationalize the HBDIM framework, each detected defect is assigned a structured information record containing the following:
Defect ID;
Defect type (crack, delamination, material loss, biological growth, erosion);
Severity level (low, moderate, or high);
Geometric parameters (length, area, depth, and deviation magnitude);
Date of detection;
Associated material properties;
Consolidation status (untreated/treated/monitored).
This coding structure enables systematic querying, comparison across temporal datasets, and integration with laboratory consolidation results within the 4D environment.
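The coding structure above maps naturally onto a typed record. The following Python sketch shows one way such defect entries could be represented and queried; the field names and example values are illustrative, not a published HBDIM schema.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DefectRecord:
    """One HBDIM defect entry mirroring the coding structure above."""
    defect_id: str
    defect_type: str                   # crack, delamination, material loss, ...
    severity: str                      # low / moderate / high
    geometry: dict                     # length, area, depth, deviation magnitude
    detected_on: date                  # date of detection (temporal dimension)
    material_props: dict = field(default_factory=dict)
    consolidation: str = "untreated"   # untreated / treated / monitored

# Two hypothetical defects recorded during one scanning campaign.
defects = [
    DefectRecord("D-001", "crack", "high",
                 {"length_mm": 320, "depth_mm": 4.2}, date(2023, 5, 10)),
    DefectRecord("D-002", "erosion", "low",
                 {"area_mm2": 1500}, date(2023, 5, 10),
                 consolidation="treated"),
]

# Systematic querying, e.g. all untreated high-severity defects needing action:
urgent = [d.defect_id for d in defects
          if d.severity == "high" and d.consolidation == "untreated"]
print(urgent)  # → ['D-001']
```

Because each record carries a detection date and a consolidation status, the same query pattern extends directly to multi-temporal comparisons, such as listing defects whose severity changed between campaigns.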
2.6. Clarification of the Four-Dimensional (4D) Concept in HBDIM
Although the present study demonstrates a single-epoch implementation, the proposed Historical Building Defect Information Modeling (HBDIM) framework is explicitly designed as a four-dimensional (4D) system, in which the fourth dimension represents time-dependent changes in geometric defects and material condition. The framework supports the integration of multi-temporal terrestrial laser scanning datasets, enabling quantitative comparison of building conditions before consolidation, after intervention, and during subsequent monitoring phases. Consequently, the current application should be understood as a proof-of-concept that establishes a scalable and repeatable platform for long-term condition assessment, deterioration rate analysis, and validation of conservation treatments over time.
2.7. Used Tools and Instruments
The following tools and instruments were employed to support the implementation of the proposed HBDIM workflow and ensure accurate data acquisition, processing, and analysis. The selected equipment was chosen based on its suitability for heritage-scale documentation, non-destructive investigation, and material-level assessment. Together, these tools enabled the integration of geometric data, visual documentation, laboratory analyses, and consolidation evaluation within a unified four-dimensional digital framework.
2.7.1. Photographic Data Acquisition
A full-frame digital camera, as shown in Figure 5, was employed to acquire high-resolution photographic documentation of the Baron Empain Palace façades and the defective stucco statues. The camera was operated using a fixed focal length of 20 mm to ensure consistent image geometry and minimize lens distortion. All images were captured using a free-hand acquisition approach, without physical attachment to the laser scanner, allowing flexible positioning and comprehensive coverage of decorative and deteriorated areas. The photographic dataset was primarily used for visual defect identification, texture mapping, and validation of laser scanning outputs. The use of homogeneous lighting conditions and consistent camera settings enhanced image quality and facilitated reliable integration with the 3D point cloud data.
2.7.2. Geometric Registration Using a Total Station
A Leica automated total station (Leica Geosystems AG, Heerbrugg, Switzerland), as shown in Figure 6, was utilized to support accurate geometric registration and spatial referencing of the laser scanning datasets. The total station was employed to measure the precise coordinates of artificial paper targets distributed across the Palace and on selected defective statues. These control measurements provided a reliable reference framework for aligning multi-scan point cloud datasets and improving overall registration accuracy. The integration of total station observations reduced cumulative alignment errors and ensured consistency between the as-built geometry and the digital defect information model.
2.7.3. Terrestrial Laser Scanning Data Acquisition
The Z+F Imager 5006i terrestrial laser scanner (Zoller + Fröhlich GmbH, Wangen im Allgäu, Germany), as shown in Figure 7, was used for acquiring high-density three-dimensional point cloud data of the Baron Empain Palace and selected stucco elements. The scanner enabled rapid, non-contact acquisition of geometric data with high spatial resolution, making it particularly suitable for documenting complex architectural details and surface irregularities associated with deterioration. Multiple scanning positions were employed to ensure full coverage of façades, statues, and decorative elements while minimizing occlusions. The resulting point cloud datasets formed the geometric foundation of the HBDIM framework and supported subsequent defect detection, registration, and modeling processes.
2.7.4. Data Processing Hardware and Software
Data processing and modeling were conducted using a high-performance laptop computer, as shown in Figure 8, equipped with professional 3D modeling and point cloud processing software. The primary software packages employed were RiSCAN, Geomagic Studio, and AutoCAD 3D (version 2025), each serving a specific role within the workflow. RiSCAN (version 2.23) was used for scan registration, visualization, and preliminary point cloud management, while Geomagic Studio (version 12.0) supported noise filtering, outlier detection, meshing, and surface reconstruction. AutoCAD 3D was used for geometric referencing, dimensional verification, and integration of modeled elements. This software combination enabled efficient handling of large datasets and accurate implementation of the Defected Information Modeling (DIM) process.
2.7.5. Target-Based Scan Registration
Artificial paper targets were employed to support accurate registration and spatial alignment of the multi-scan laser datasets. The targets consisted of white circular markers with black backgrounds, as well as black-and-white square targets, which were strategically placed across the Baron Empain Palace façades and mounted on selected defective stucco statues, as illustrated in
Figure 9. The targets served as clearly identifiable reference points within the laser scanning datasets, enabling reliable correspondence between overlapping scans acquired from different positions. Their high-contrast design ensured consistent detection under varying lighting and surface conditions, improving both rough and fine registration stages. Coordinates of the artificial targets were measured using the automated total station, providing an independent geometric reference framework for scan alignment. This integration reduced cumulative registration errors and enhanced the overall accuracy and stability of the point cloud datasets, particularly in areas characterized by complex geometry or limited surface features. The use of artificial paper targets proved especially effective for improving registration robustness at the object scale, supporting precise defect localization and enabling consistent comparison of datasets within the HBDIM framework and future multi-temporal monitoring applications.
2.7.6. Sample Preparation
Stucco formation samples representative of the original historic material were prepared for laboratory analysis by cutting them into cubic specimens measuring 5 × 5 × 5 cm, in accordance with standard mechanical testing requirements. Separate sets of samples were prepared for microstructural investigations using scanning electron microscopy (SEM) to examine morphological characteristics before and after consolidation treatment. Prior to treatment, all samples were dried in a convection oven at 110 °C for 24 h to achieve constant weight and eliminate residual moisture, thereby ensuring consistent initial conditions across all specimens. This preparation procedure minimized variability related to moisture content and enhanced the reliability and comparability of both mechanical and microstructural analyses, as shown in Figure 10 and Figure 11.
2.7.7. Calcium Hydroxide Nanoparticles
Calcium hydroxide nanoparticles (Ca(OH)2) were employed as the consolidation material for improving the mechanical performance of deteriorated stucco formations. The nanoparticles, with an average particle size ranging between 40 and 50 nm, were dispersed in isopropyl alcohol (isopropanol) at a concentration of 3% (w/v). To ensure homogeneous dispersion, 30 g of calcium hydroxide nanoparticle powder was added to 1000 mL of isopropyl alcohol and subjected to sonication for 20 min using an ultrasonic mixer combined with a magnetic stirrer. This process minimized particle agglomeration and enhanced the penetrability of the consolidant into the stucco pore network prior to application.
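The stated concentration can be verified directly from the quantities given: 30 g of powder per 1000 mL of solvent is 3 g per 100 mL, i.e. 3% (w/v). A trivial sketch of the check:

```python
def weight_volume_percent(mass_g, volume_ml):
    """Weight/volume concentration in percent (grams of solute per 100 mL)."""
    return mass_g / volume_ml * 100.0

# 30 g of Ca(OH)2 nanoparticle powder in 1000 mL of isopropyl alcohol
print(weight_volume_percent(30.0, 1000.0))  # 3.0
```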
3. Results
This section presents the outcomes of the proposed HBDIM-based workflow, focusing on the digital documentation, defect detection, material characterization, and registration accuracy of the Baron Empain Palace and its deteriorated stucco elements. The results are organized to reflect the sequential stages of the methodology, beginning with data acquisition and defect identification, followed by laboratory-based material analyses and mechanical testing, and concluding with registration performance and Defected Information Modeling outcomes. The presented results provide quantitative and visual evidence supporting the integration of terrestrial laser scanning, photogrammetry, and nanomaterial-based consolidation within a unified four-dimensional digital environment. These findings establish a reliable baseline for subsequent discussion, sustainability assessment, and future multi-temporal monitoring applications.
3.1. Data Acquisition Results
The 3D model of the defective Baron statue was scanned on a specified date and opened in the modeling software, as shown on the left of Figure 12. A Canon EOS 5D digital camera image of the defective statue, including several artificial paper targets, was captured on the same date, as shown on the right of Figure 12. The comparison between the laser-scanned geometry and high-resolution photographic images enabled preliminary verification of surface discontinuities and supported subsequent defect localization within the HBDIM environment. The simultaneous acquisition ensured temporal consistency between geometric and visual datasets.
3.2. Data Detection Results
The Palace was located at coordinates 30°05′11″ N, 31°19′49″ E, at an elevation of 59 m, and identified in Google Earth high-resolution imagery dated 30 November 2017, as shown in Figure 13. This geospatial identification provided a contextual reference for the study area and facilitated spatial consistency between field observations, point cloud data, and external geospatial datasets.
3.3. Results of Data and Materials Analysis
The results of data and materials analysis revealed several recurring deterioration patterns within the Baron Empain Palace. These include cracking in the external palace walls, erosion of column coverings, deterioration and localized collapse of floor elements, and moisture-related damage in ceiling tiles, leading to partial loss of painted layers. These observed defect types were systematically analyzed and used to establish a preliminary defect taxonomy within the HBDIM framework. This classification forms the basis for spatial defect mapping, material condition assessment, and the evaluation of appropriate consolidation strategies, supporting subsequent digital modeling and conservation decision-making, as shown in Figure 14.
3.4. Material Analysis Results
3.4.1. CT Scan Investigation
CT scan imaging allows the samples to be studied from different angles through a full 360-degree rotation, providing complete detail of the sample surfaces. The CT scans show that the calcium hydroxide nanoparticles used to treat the stucco surfaces dispersed effectively within the pores of the stucco formations; their homogeneous distribution on the surfaces and between the sample grains led to an increase in the mechanical properties of the stucco, as shown in Figure 15.
3.4.2. Scanning Electron Microscope Observation
SEM was used to examine the samples before and after consolidation and to assess the treatment of the stucco formations. The SEM image of the untreated sample is shown in Figure 16. In the sample treated with calcium hydroxide nanoparticles, SEM examination showed that the Ca(OH)2 granules filled the pores, spread across the surface of the stucco formations, and exhibited homogeneous diffusion, as shown in Figure 17.
3.4.3. Transmission Electron Microscope (TEM) Investigation
Transmission electron microscopy (TEM) is used to visualize specimens at magnifications of up to two million times. This high resolution enables analysis of the quality, shape, and size of the calcium hydroxide nanoparticles, as shown in Figure 18. Together, CT, SEM, and TEM results provide multi-scale confirmation of nanoparticle penetration, linking internal pore filling, surface morphology improvement, and nanoscale material behavior within a single diagnostic framework.
3.4.4. XRF Analysis
X-ray fluorescence (XRF) analysis was conducted to determine the elemental composition of the stucco samples and to support the interpretation of deterioration mechanisms and consolidation behavior. As illustrated in Figure 19, the XRF results indicate that calcium (Ca) is the dominant element, with a measured concentration of 60.02 wt.%, confirming the calcareous nature of the stucco matrix. Sulfur (S) was detected at a relatively high concentration of 34.28 wt.%, suggesting the presence of sulfate-based phases, most likely gypsum (CaSO4·2H2O), which is consistent with historical plaster production techniques and secondary alteration processes. The elevated sulfur content may also be associated with environmental exposure, including atmospheric pollution and moisture-driven sulfate formation, which are known contributors to salt crystallization and surface degradation in historic stucco. Such sulfate phases can induce volumetric expansion during hydration–dehydration cycles, leading to microcracking, flaking, and loss of cohesion. Trace amounts of iron oxide (0.656 wt.%) were also identified, which are likely related to the use of iron-based pigments or natural impurities in the original plaster materials. These iron compounds may contribute to localized color variations and, under certain environmental conditions, can influence surface staining or crust formation. Overall, the XRF results provide essential compositional context for understanding the deterioration phenomena observed in the stucco formations and support the selection of calcium hydroxide nanoparticles as a chemically compatible consolidant. The dominance of calcium-based phases ensures mineralogical compatibility between the original material and the applied treatment, thereby reducing the risk of adverse chemical interactions and enhancing the long-term effectiveness of the consolidation process.
The elemental composition of the stucco samples obtained from XRF analysis is summarized in Table 1.
3.4.5. Mechanical Testing
Compressive strength testing was conducted using a universal testing machine to evaluate the effectiveness of calcium hydroxide nanoparticle consolidation in improving the mechanical performance of deteriorated stucco formations. The tests were performed in accordance with ASTM C170 [45], which specifies the use of a minimum of five specimens with smooth, defect-free cube faces to ensure reliable and reproducible results [42]. The treated samples exhibited a significant increase in compressive strength, reaching an average value of 168 MPa, corresponding to an improvement of approximately 69.06% compared to untreated specimens, as summarized in Table 2. This enhancement indicates a substantial recovery of mechanical cohesion within the stucco matrix following consolidation. The observed strength improvement can be attributed to the penetration of calcium hydroxide nanoparticles into the pore network of the stucco, as confirmed by CT, SEM, and TEM analyses. The nanoparticles promote pore filling and microstructural densification, leading to improved load transfer between mineral grains and reduced internal voids. This microstructural reinforcement enhances resistance to compressive stresses commonly induced by environmental loading and material weathering processes. It should be emphasized that the reported compressive strength values are intended to serve as performance indicators for conservation assessment, rather than as structural design parameters. The primary objective of the mechanical testing was to evaluate the relative improvement achieved through consolidation and to support conservation decision-making by demonstrating enhanced material stability and durability.
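The reported figures are mutually consistent under the usual relative-change definition: back-calculating from 168 MPa and 69.06%, the untreated mean would be approximately 168/1.6906 ≈ 99.4 MPa (this untreated value is inferred here, not reported directly). A minimal sketch of the computation:

```python
def improvement_percent(treated_mpa, untreated_mpa):
    """Relative improvement of treated over untreated strength, in percent."""
    return (treated_mpa - untreated_mpa) / untreated_mpa * 100.0

treated = 168.0             # reported average of treated specimens (MPa)
untreated = 168.0 / 1.6906  # assumed: back-calculated, not measured here
print(f"improvement: {improvement_percent(treated, untreated):.2f}%")  # improvement: 69.06%
```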
Overall, the mechanical testing results corroborate the microstructural observations and validate the effectiveness of calcium hydroxide nanoparticles as a compatible and efficient consolidant for historic stucco formations. When integrated within the HBDIM framework, these results contribute to a comprehensive understanding of material behavior and provide a quantitative basis for assessing treatment performance and long-term conservation strategies. This level of mechanical improvement is considered sufficient for conservation purposes, as the objective is material stabilization rather than structural reinforcement.
The laboratory consolidation results (CT, SEM, TEM, XRF, and mechanical testing) were digitally linked to corresponding defect entities within the HBDIM environment. Each treated sample was associated with its spatial defect location, enabling consolidation performance data to be visualized and queried within the defect-oriented digital model. This integration transforms laboratory findings from isolated analyses into spatially referenced conservation intelligence embedded within the 4D framework.
3.5. Data Registration Results
The data registration process aimed to ensure accurate spatial alignment of multi-scan point cloud datasets acquired from different positions around the Baron Empain Palace and the defective stucco statue. A total of 23 laser scans comprising 12,702,112 points were successfully filtered and registered in RiSCAN, producing a high-resolution and visually coherent representation of the palace geometry (Figure 20). In parallel, the defective statue dataset, consisting of 1,412,194 points, was registered and refined in Geomagic Studio, resulting in a detailed and realistic digital model suitable for defect analysis (Figure 21). The registered point cloud datasets exhibited high geometric fidelity and visual clarity, largely due to the effective utilization of point intensity values during visualization and processing. The ability of both RiSCAN and Geomagic Studio to preserve intensity-based information contributed to enhanced surface detail recognition, which is particularly important for identifying fine-scale deterioration features such as cracks, erosion zones, and material loss. However, the results also revealed practical limitations associated with handling very large point cloud datasets. While higher point densities improved visual resolution, they significantly increased computational complexity, making navigation, manipulation, and real-time interaction more challenging. This observation highlights a critical trade-off between point density and processing efficiency, especially in large-scale heritage documentation projects.
From a methodological perspective, the registration results demonstrate that effective filtration and optimized point cloud management are essential for achieving reliable and efficient HBDIM implementation. Rather than maximizing point density indiscriminately, an optimized balance between data resolution and computational performance is required to support accurate defect localization and future multi-temporal comparisons. These findings provide an important reference for defining scanning strategies and data handling protocols in long-term 4D monitoring and heritage conservation applications, and they confirm that the registration accuracy achieved in this study is adequate for detecting time-dependent geometric changes, supporting future implementation of multi-temporal 4D monitoring within the HBDIM framework. The increase in standard deviation observed in high-density datasets can be attributed to noise amplification, point redundancy, and numerical instability during fine registration. While increased point density enhances geometric resolution, it may introduce correlation errors and computational sensitivity. Therefore, an optimized point density threshold is recommended to balance geometric fidelity and registration stability in long-term 4D monitoring applications.
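The density/efficiency trade-off discussed above is commonly managed by voxel-grid downsampling before registration, replacing each occupied voxel with the centroid of its points. A minimal NumPy sketch (the 5 cm voxel size is an arbitrary illustrative choice, not a value used in this study):

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Keep one centroid per occupied voxel to reduce point density."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, inverse, counts = np.unique(keys, axis=0,
                                   return_inverse=True, return_counts=True)
    inverse = inverse.reshape(-1)          # guard against NumPy shape differences
    centroids = np.zeros((counts.size, 3))
    np.add.at(centroids, inverse, points)  # sum points per voxel
    return centroids / counts[:, None]     # average -> voxel centroid

rng = np.random.default_rng(1)
cloud = rng.uniform(0.0, 1.0, size=(100_000, 3))    # synthetic 1 m cube of points
reduced = voxel_downsample(cloud, voxel_size=0.05)  # 5 cm voxels (20^3 cells max)
print(cloud.shape[0], "->", reduced.shape[0], "points")
```

At heritage scale, tuning the voxel size offers a principled way to cap dataset size before the registration and meshing stages.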
Clarification of the 4D Implementation
The present study is based on a single high-resolution scanning campaign. Therefore, temporal evolution could not be directly measured. However, the framework is structured to accommodate future multi-temporal datasets through the following:
Re-registration of subsequent scans;
Surface deviation mapping between time phases;
Quantitative comparison of defect geometry;
Monitoring consolidation effectiveness.
Thus, while the current implementation represents a 3D baseline model with embedded time-stamping, the full 4D potential will be realized through future repeated scanning campaigns.
3.6. Registration Analysis Results
The overall standard deviations of the distances between all pairs of polydata objects are displayed, where each distance is calculated as the average normal distance between the two planes. RiSCAN presents this information about the observations used in a graphical manner, as shown in Figure 21. In the graphical results, each point is colored according to the absolute distance between the two planes of the observation (red = small error, blue = large error). The histogram of residuals shows how many observations have a given distance between the two planes of the observations. The analysis consists of searching for corresponding point pairs and evaluating the errors without modifying the positions and orientations of the scan poses. The relationship between the number of observations used in the calculations (polydata numbers) and the resulting standard deviation values of the distances between all used pairs of polydata objects is shown in Figure 22 and Figure 23.
Selected results are presented in Table 3 and Figure 23 to study the relationship between the number of observations used as input (polydata numbers) and the resulting standard deviation values of the distances between the used pairs of polydata objects. Once the data were well aligned, there was no notable difference between least-squares and robust fitting.
The observed increase in standard deviation with higher polydata density highlights the importance of optimized data selection, demonstrating that excessive point density may reduce computational efficiency without proportional gains in accuracy.
From Table 3, it is concluded that the standard deviation values of the distances between the used pairs of polydata objects increase from 0.0139 m to 0.0476 m as the number of polydata observations used in the registration calculations increases.
3.7. Results and Analysis of the Defected Information Modeling (DIM)
The Defected Information Modeling (DIM) approach proposed for the historical data was processed to obtain the appropriate modeling results. These results demonstrate that the effectiveness of defect visualization is influenced not only by point density but also by software-specific rendering and intensity handling strategies. This observation supports the adoption of optimized point cloud configurations tailored to defect analysis rather than maximum data density.
3.7.1. Data Acquisition Results and Analysis
The process starts with acquiring point clouds from the Z+F Imager laser scanner at different positions around the Baron Palace and the defective statue in order to capture all defective objects, faces, and details. In total, 54,446,410 points from seven registered laser scans of the Baron Palace were imported into the modeling software (Geomagic Studio), as shown in Figure 24. In addition, 16,297,294 points of the Baron Palace façade were imported into the RiSCAN modeling software, as shown in Figure 25. It was found that the smaller point cloud (16,297,294 points) in RiSCAN produced a more realistic image than the larger point cloud (54,446,410 points) in Geomagic; on the other hand, navigating and handling points in Geomagic is easier than in RiSCAN.
3.7.2. Format Conversion Result
The studied historical models were converted from the native (Z+F) laser scanner ASCII format into Geomagic Studio 12 (.wrap) format, as illustrated in
Figure 26. The converted datasets included XYZRGB spatial coordinates, where geometric information and intensity values were preserved and represented as color attributes. For the Baron Empain Palace exterior, the imported dataset reached a total data size of approximately 607 MB. This format conversion step enabled efficient handling of large-scale point cloud datasets while maintaining geometric accuracy and radiometric information essential for defect visualization. The successful preservation of spatial coordinates and intensity values facilitated subsequent noise filtering, meshing, and defect-oriented modeling within the DIM workflow. As a result, the converted datasets provided a reliable foundation for accurate defect localization, surface analysis, and further processing stages within the HBDIM framework.
3.7.3. Defective Outliers Algorithm Results and Analysis
The defective outliers detection algorithm was applied to the registered point cloud of the studied statue using a sensitivity parameter of 50%, which was selected to balance defect detectability and noise suppression. Under this configuration, the algorithm identified 54,707 defective points, highlighted in red, from a total of 1,466,901 points within the current dataset, as shown in
Figure 27. The detected outlier points correspond to geometric deviations located at distances significantly different from neighboring surface points. These deviations are primarily associated with deterioration features such as surface loss, cracks, edge degradation, and material detachment. The spatial distribution of the defective points demonstrates the ability of the algorithm to isolate localized damage patterns while preserving the integrity of intact surface geometry. From a methodological perspective, the extracted defective points represent a quantitative defect layer within the DIM workflow. This layer provides a measurable basis for defect classification, spatial mapping, and subsequent correlation with material diagnostics and consolidation assessment within the HBDIM framework. The results confirm that the applied sensitivity setting is suitable for reliable defect detection in complex historic stucco geometries.
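The outlier criterion described above, points lying at distances significantly different from their neighbours, can be illustrated with a k-nearest-neighbour mean-distance test. The sketch below uses SciPy's cKDTree; the mapping of the 50% sensitivity setting to a threshold factor is a hypothetical stand-in, not Geomagic's actual internal formula:

```python
import numpy as np
from scipy.spatial import cKDTree

def flag_outliers(points, k=8, sensitivity=0.5):
    """Flag points whose mean k-NN distance exceeds mean + factor * std,
    with the factor loosely derived from a 0-1 sensitivity setting."""
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=k + 1)   # column 0 is the point itself
    mean_knn = dists[:, 1:].mean(axis=1)
    factor = 1.0 / sensitivity               # hypothetical mapping: 0.5 -> 2 sigma
    threshold = mean_knn.mean() + factor * mean_knn.std()
    return mean_knn > threshold

rng = np.random.default_rng(2)
surface = rng.normal(0.0, 0.01, size=(5000, 3))   # dense intact surface
defects = rng.uniform(0.5, 1.0, size=(50, 3))     # detached / deviating points
cloud = np.vstack([surface, defects])
flags = flag_outliers(cloud)
print("flagged", int(flags.sum()), "of", len(cloud), "points")
```

The flagged subset corresponds conceptually to the red-highlighted defective points: geometric deviations far from the local surface distribution.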
3.7.4. Noise Algorithms Results and Analysis
Noise reduction algorithms were applied to the registered point cloud of the studied statue to reduce point density irregularities, suppress random noise, and remove surrounding points not belonging to the target surface. This process resulted in a more uniform spatial distribution of points, facilitating smoother surface wrapping and improved geometric continuity, as illustrated in
Figure 28. Following noise reduction, the processed dataset consisted of 1,412,194 points with a reported standard deviation of 0.000927 m, indicating a high level of geometric consistency. This low standard deviation reflects the effectiveness of the applied noise filtering in preserving the true surface geometry while minimizing distortions introduced by scanning artifacts or environmental interference. From a modeling perspective, the improved uniformity of the point cloud enhanced the stability of subsequent meshing and triangulation steps, reducing surface roughness and preventing the propagation of noise-related artifacts into the polygonal model. These results confirm that the applied noise reduction strategy provides a reliable preprocessing stage for defect-oriented modeling and supports accurate defect visualization and quantitative analysis within the DIM and HBDIM frameworks.
3.7.5. Triangulation (Meshing) and Volume Wrapping Results and Analysis
Triangulation was performed to convert the processed point cloud data into a polygonal surface model by transforming point-based geometry into interconnected triangular elements. This step enabled the generation of a continuous surface representation suitable for visualization, quantitative analysis, and defect interpretation. Following triangulation and volume wrapping in Geomagic, the defective statue dataset was converted from 1,412,194 point cloud elements into 99,999 triangular faces, as illustrated in
Figure 29. The substantial reduction in data complexity while preserving geometric fidelity demonstrates the effectiveness of the meshing and wrapping process. Volume wrapping enhanced surface continuity, eliminated minor gaps, and stabilized edge conditions, resulting in a coherent and watertight model. This improved surface integrity is particularly important for accurately representing defect boundaries, such as cracks and material loss zones, without introducing artificial discontinuities. From a computational perspective, the optimized triangular mesh significantly reduced data size and processing overhead, enabling efficient handling, visualization, and further analysis within the DIM workflow. The resulting mesh provides a robust geometric foundation for texture mapping, defect localization, and subsequent integration into the HBDIM framework and future four-dimensional monitoring applications.
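The conversion from points to triangular faces can be illustrated with a 2.5D Delaunay triangulation, which triangulates the planar projection of the points and carries the height per vertex. This is a simplified stand-in for Geomagic's wrapping algorithm and assumes the surface projects cleanly onto a plane:

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(3)
# Synthetic relief surface: scattered (x, y) samples with height z = f(x, y)
xy = rng.uniform(0.0, 1.0, size=(2000, 2))
z = 0.1 * np.sin(4.0 * xy[:, 0]) * np.cos(4.0 * xy[:, 1])
points = np.column_stack([xy, z])

# 2.5D meshing: triangulate in the xy-plane, keep z per vertex
tri = Delaunay(points[:, :2])
faces = tri.simplices                  # (n_faces, 3) vertex index triples
print(len(points), "points ->", len(faces), "triangular faces")
```

Plain Delaunay yields roughly 2n faces for n points; real wrapping pipelines additionally decimate the mesh, which is how the 1,412,194 points were reduced to 99,999 faces here.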
3.7.6. Texture Mapping Results
Texture mapping was applied to the triangulated and volume-wrapped mesh to enhance the visual realism and interpretability of the defective statue model. High-resolution colored digital images were mapped onto the polygonal surfaces in Geomagic Studio, transferring actual surface color information to the geometric model, as shown in Figure 29. The textured model enabled clear visual correlation between geometric defects and surface conditions, such as discoloration, moisture stains, paint loss, and erosion patterns. This integration improved the identification and interpretation of deterioration features that are not always evident from geometry alone. From an analytical perspective, texture mapping supported more accurate defect localization by combining color-based visual cues with spatially referenced geometric data. The resulting textured model, therefore, enhances qualitative inspection and complements quantitative defect analysis within the DIM and HBDIM frameworks, providing a more comprehensive representation of surface deterioration suitable for documentation, assessment, and conservation planning.
4. Discussion
The results of this study demonstrate that integrating four-dimensional digital documentation with material-level diagnostics provides a robust and scalable framework for the sustainable conservation of historic stucco elements. The proposed Historical Building Defect Information Modeling (HBDIM) approach successfully bridges the gap between geometric documentation, defect characterization, and consolidation performance assessment, offering a comprehensive basis for informed conservation decision-making.
4.1. Digital Identification and Localization of Deterioration
The TLS and photogrammetric results enabled accurate identification and spatial localization of various deterioration patterns, including cracking, erosion, material detachment, and moisture-induced damage. Unlike traditional visual inspection methods, which are largely qualitative and subjective, the digital workflow adopted in this study provides measurable and repeatable defect information linked to precise spatial coordinates. By integrating defect attributes within the HBDIM environment, each deterioration feature can be documented not only geometrically but also in relation to its material condition and surrounding context. This capability enhances diagnostic accuracy and establishes a structured digital baseline for future monitoring campaigns.
4.2. Reliability of Registration and Implications for Long-Term Monitoring
The registration analysis demonstrated that acceptable accuracy levels can be achieved for heritage-scale documentation using a combination of artificial targets and automated total station measurements. The observed relationship between polydata density and registration error highlights an important practical consideration: increasing point density does not necessarily result in proportional improvements in accuracy and may introduce computational inefficiencies. These findings are particularly relevant for multi-temporal monitoring applications, where consistent registration accuracy is essential for detecting subtle geometric changes over time. The reported standard deviation values, therefore, provide a quantitative reference for optimizing scanning strategies in future HBDIM-based monitoring programs.
4.3. Integration of Material Diagnostics Within the HBDIM Framework
One of the most significant contributions of this study lies in the integration of laboratory-based material diagnostics into the digital defect modeling environment. CT imaging confirmed the effective penetration and homogeneous distribution of calcium hydroxide nanoparticles within the pore structure of the stucco samples, while SEM and TEM analyses revealed improved microstructural cohesion and pore filling. The mechanical testing results further supported these observations, with treated samples exhibiting an average compressive strength increase of approximately 69% compared to untreated specimens. Embedding such performance-related data within the HBDIM framework establishes a direct link between observed surface deterioration, internal material behavior, and treatment effectiveness—an integration that is rarely achieved in conventional HBIM-based workflows [40].
4.4. Added Value of the Fourth Dimension (4D)
Although the present application is based on a single scanning epoch, the proposed HBDIM framework is explicitly designed to incorporate time as a fourth dimension. This temporal component enables the systematic comparison of building conditions before consolidation, immediately after treatment, and during subsequent monitoring phases. By structuring defect information in a time-aware digital environment, HBDIM provides a methodological foundation for assessing deterioration rates, validating consolidation durability, and identifying early warning indicators of renewed material degradation. Even at the proof-of-concept stage, the framework demonstrates clear potential for supporting predictive and preventive conservation strategies.
4.5. Implications for Preventive and Sustainable Conservation
The integration of high-resolution digital documentation, non-destructive material analysis, and compatible nanomaterial consolidation represents a shift from reactive intervention toward preventive and data-driven conservation practices. By minimizing invasive investigations and enabling targeted treatments, the HBDIM approach reduces unnecessary material loss and limits the frequency of physical interventions. This integrated methodology supports more sustainable conservation outcomes by optimizing resource use, preserving original historic fabric, and enhancing long-term management efficiency. As such, HBDIM provides a transferable framework applicable to a wide range of heritage materials and architectural contexts beyond the case study presented.
4.6. Limitations and Future Perspectives
Despite the promising results, several limitations should be acknowledged. The current study focuses on a single temporal dataset, and future research should incorporate repeated TLS campaigns to fully exploit the four-dimensional capabilities of HBDIM. In addition, expanding the comparative evaluation of different compatible consolidants under varying environmental conditions would further strengthen the generalizability of the framework. Future developments should also explore the integration of environmental monitoring data, such as humidity, temperature fluctuations, and groundwater levels, to enhance the predictive capacity of HBDIM and support more resilient long-term conservation strategies.
5. Conclusions
This research presented and validated a four-dimensional Historical Building Defect Information Modeling (HBDIM) framework that integrates digital documentation, defect-oriented modeling, and material diagnostics to support sustainable conservation of historic stucco elements. The proposed framework extends conventional HBIM approaches by incorporating a defect-focused information structure and a temporal layer that enables long-term monitoring of deterioration and conservation performance.
The application of the framework to the Baron Empain Palace demonstrated the effectiveness of combining terrestrial laser scanning, photogrammetry, and precise registration techniques to generate accurate digital models capable of identifying and spatially localizing deterioration features such as cracking, erosion, and material loss. Integrating laboratory diagnostic methods—including CT, SEM, TEM, and XRF analyses—enabled detailed evaluation of the consolidation behavior of calcium hydroxide nanoparticles and provided microstructural evidence of pore filling and improved material cohesion.
Mechanical testing confirmed the effectiveness of the consolidation treatment, showing an average compressive strength improvement of approximately 69% in treated stucco samples. Linking these laboratory results to spatially referenced defects within the HBDIM environment demonstrates the potential of the framework to transform isolated material analyses into actionable conservation data within a digital heritage management system.
Overall, the proposed HBDIM framework contributes to the advancement of heritage conservation by providing a data-driven methodology that integrates geometric documentation, material diagnostics, and temporal monitoring within a unified platform. This approach supports preventive conservation strategies, reduces unnecessary physical interventions, and enhances the long-term preservation of historic architectural fabric. Future work should focus on implementing multi-temporal scanning campaigns, integrating environmental monitoring data, and evaluating additional compatible consolidants to further expand the predictive and sustainable capabilities of the HBDIM framework.