Three-Dimensional Point Cloud Task-Specific Uncertainty Assessment Based on ISO 15530-3 and ISO 15530-4 Technical Specifications and Model-Based Definition Strategy

Abstract: Data-driven manufacturing in Industry 4.0 demands digital metrology not only to drive the in-process quality assurance of manufactured products but also to supply reliable data to constantly adjust the manufacturing process parameters for zero-defect manufacturing processes. Better quality, improved productivity, and increased flexibility of manufacturing processes are obtained by combining intelligent production systems and advanced information technologies, where in-process metrology plays a significant role. While traditional coordinate measurement machines offer strengths in performance, accuracy, and precision, they are not the most appropriate in-process measurement solutions when fast, non-contact and fully automated metrology is needed. In this way, non-contact optical 3D metrology tackles these limitations and offers some additional key advantages to deploying fully integrated 3D metrology capability to collect reliable data for their use in intelligent decision-making. However, the full adoption of 3D optical metrology in the manufacturing process depends on the establishment of metrological traceability. Thus, this article presents a practical approach to the task-specific uncertainty assessment realisation of a dense point cloud data type of measurement. Finally, it introduces an experimental exercise in which data-driven 3D point cloud automatic data acquisition and evaluation are performed through a model-based definition measurement strategy.


Introduction
Metrology is considered a fundamental tool in the context of Industry 4.0, where reliable data are needed to realise data-driven manufacturing strategies [1][2][3]. As metrology moves from the lab to the shop floor where goods are manufactured, it is breaking the stigma of being a non-productive activity and gaining a position as an enabling technology that adds value to every step of the production process [4]. This perception is becoming more evident in Industry 4.0, where measurement data from several sensors, including dimensional data, are required for the monitoring of complete manufacturing processes and the real-time adjustment of process parameters, including the creation and use of metrological digital twins [2][5][6][7][8][9].
Massive integration of 3D optical sensors within manufacturing processes is occurring nowadays, replacing traditional Coordinate Measurement Machines (CMM) within the automotive, aerospace and power generation industries, which are among the leading industries in the adoption of MBD [10]. However, while 3D optical sensors are assumed to deliver millions of points in a matter of seconds, the process of automatically converting dense data into meaningful information and assuring the quality of these data remains a challenge [11].
This research article presents a practical approach to addressing both challenges. The process of converting dense data into meaningful information is solved through a Quality Information Framework (QIF) Model-Based Definition (MBD) measurement post-processing strategy. The assurance of data quality, which relies on the establishment of metrological traceability, is addressed by combining the ISO 15530-3 and ISO 15530-4 technical specifications. Through this combination, metrological traceability is realised, which requires (a) the evaluation of the measurement uncertainty and (b) the realisation of an unbroken chain of calibrations relating a measurement result to a reference value [12]. Thus, the article introduces a task-specific uncertainty assessment of a dense point cloud type of data acquisition in the absence of reliable numerical simulation models for optical systems.
Considering the evaluation of the measurement uncertainty, the Guide to the Expression of Uncertainty in Measurement (GUM) JCGM 100:2008 [13] establishes general rules for evaluating and expressing uncertainty in measurements that are intended to apply to a broad spectrum of measurements. The general measurement procedure proposed by the GUM seems clear and easy to adopt, but it can be extremely difficult to implement when a complex measurement system is evaluated. As stated by Dury et al. in the broad study on the characterisation of 3D optical systems performed at the National Physical Laboratory (NPL) "National FreeForm Centre" [11,[14][15][16], there are many potential uncertainty sources, such as the lighting conditions, measurand surface properties, system orientation and resolution, ambient temperature, measurement volume, chromatic effects, etc., that greatly complicate the reliable characterisation of those systems.
Compared with traditional CMMs, 3D optical systems are a relatively new technology, and their measurement error sources are still being researched. Even though the German guideline for optical 3D measuring systems, the VDI/VDE 2634 series (parts 2 and 3) [17,18], attempts to provide a procedure for comparing the performance of different systems for their acceptance and re-verification, it does not consider all the potential uncertainty sources arising while operating in unfavourable environments. Therefore, the lack of measurement procedures to fully understand how 3D optical systems behave under different measurement scenarios severely limits the development of mathematical models for those systems [11] and, therefore, the development of a digital metrology twin.
The challenge of converting dense data into meaningful information in a matter of seconds involves providing real-time automatic decision-making capability and therefore constantly adjusting process parameters for a zero-defect manufacturing scenario. However, when a 3D optical system is integrated into a manufacturing process and captures millions of points in seconds, "faster data processing" remains a challenge. Thus, the recent publication of the ISO Standard 23952:2020 "Automation systems and integration-Quality Information Framework (QIF)-An integrated model for manufacturing quality information" [19] opens the door to real-time automatic in-line quality control. This Standard suggests a new XML Schema Definition Language that defines, organises and associates the quality and metrology information needed in manufacturing systems and therefore, it allows the effective exchange of metrology data throughout the entire manufacturing quality measurement process-from product design to inspection planning to execution to analysis and reporting. For product definition, QIF includes the ISO QIF part 3: QIF Model-based Definition (MBD) [20][21][22], which defines a digital data format to convey part geometry (typically called the "CAD" model) and information to be consumed by downstream manufacturing quality processes, such as Product Manufacturing Information (PMI) [21][22][23]. This means that MBD allows the attachment of Geometric Dimensioning and Tolerancing (GD&T) information to a CAD model, typically with full "smart" associativity, to create a semantic model. This semantic CAD model allows metrology software to automatically create either an inspection plan or decision-making results (angles, distances, GD&T tolerances, etc.) from available 3D point data. Thus, the QIF MBD information model allows converting the captured dense data into meaningful information using automatic data processing methodologies [1,8,21,24,25]. 
Therefore, in general terms, MBD is a digital product model that defines the requirements and specifications of the product. It is the cornerstone of the Model-Based Enterprise (MBE), since MBE uses MBD, instead of paper-based documents, as the data source for all engineering activities throughout the product lifecycle, including the metrology activities during the manufacturing of the product [20][21][22][23][26][27][28].
The state of the art of uncertainty assessment for point cloud measurement shows that task-specific uncertainty assessment has not been frequently applied to dense point cloud measurements. Different approaches have been suggested for the uncertainty assessment of point clouds, such as the approach introduced by Ding et al. based on spatial feature registration analysis [29]. Senin et al. suggested a method based on fitting Gaussian random fields to high-density point clouds produced by measurement repeats, where the fitted field delivers a depiction of the spatial distribution of random measurement error over a part geometry [30]. Yang et al. investigated the point cloud registration step as a major uncertainty source in the laser scanning-aided aircraft assembly process [31]. Zhang et al. also identified the reconstruction within every point cloud acquisition process as a critical uncertainty source [32]. Forbes et al. presented an uncertainty assessment method associated with the position, size and shape of point cloud data [33]. Another important approach for the uncertainty assessment of point clouds is the mathematical modelling of the measurement instruments, mainly optical systems, employed in the data acquisition process. Mohammadikaji et al. suggested an approach to categorise and model the dominant sources of uncertainty and study the probabilistic propagation of the uncertainties in a 3D inspection using laser line scanners [34]. Zhao et al. suggested treating a structured light system, including the instrument itself, data acquisition, data processing, and other factors, as a black-box model for the uncertainty assessment of 3D point clouds [35]. Some researchers have also presented experimental methods to model the systematic errors pertinent to laser scanners [36,37]. Xi et al. suggested that varying scanner-to-surface distances and inclination angles raise systematic uncertainties for optical sensors [38,39].
Finally, the use of physical artifacts combined with a Design of Experiment (DOE) method was also suggested for the uncertainty assessment of optical systems [40][41][42][43][44][45].

Practical Approaches to the Uncertainty Assessment within Production Metrology
In cases where the potential uncertainty sources of a measurement process can be ascertained, it is relatively easy to follow the prescription of the GUM JCGM 100:2008 [13] uncertainty framework. However, this is not the case for CMMs or 3D optical systems, for which it is extremely difficult to understand how every potential uncertainty source affects the final result. In these cases, different approaches have been applied to estimate the uncertainty of coordinate measurements. For CMMs, the prevailing guidance for users is given in the ISO 15530 technical specifications. While part 1 is informative and tutorial in nature but not intended to provide operative evaluation tools, parts 3 and 4 are the procedures followed by the manufacturing industry for the uncertainty assessment of coordinate measurements [46,47]. Part 3 defines an experimental comparison method using a calibrated workpiece, whereas part 4 suggests a computer simulation approach to provide a task-specific uncertainty assessment. The project "Evaluating the Uncertainty in Coordinate Measurement" (EUCOM, under grant agreement nº 17NRM03) within the European Metrology Programme for Innovation and Research (EMPIR) has performed the research to develop the two missing parts of the ISO 15530 series: part 2 on a repetition and reversal method, and part 5 on a method based on prior information and expert judgement.
In the case of 3D optical systems, system manufacturers employ VDI 2634 parts 2 and 3 [17,18] to characterise and run the product acceptance test before product delivery, but this does not mean that complete system characterisation is performed for a robust measurement uncertainty assessment.

ISO 15530-3 Technical Specification
The ISO 15530-3 [46] technical specification describes a substitution method that simplifies the uncertainty evaluation exercise by exploiting the similarity between the dimensions and shapes of the workpiece and a calibrated reference part. It is based on a statistical evaluation of the measurement errors observed with respect to the calibrated value of the reference part. The user must perform a relevant number (>20) of measurements under the various conditions they might expect while measuring real workpieces. This approach appears straightforward from the viewpoint of the user and attempts to cover both intrinsic and extrinsic uncertainty contributors. However, in practice, it is fraught with difficulties: any divergence between the master and measured parts can lead to additional uncertainties. Because of the similarity requirement between the produced workpiece and the calibrated standard, this approach is very arduous and expensive for large-scale metrology, where the storage, maintenance and calibration of large components is a major expense. However, it is a reliable approach for serial production, usually of small- and medium-sized components, because it is affordable to manufacture and calibrate a reference part for uncertainty assessment purposes. It is usually employed for medium-sized component uncertainty assessments on CMMs or Machine Tools (MT). This approach determines four input quantities, as explained below [46]:

u_b: standard uncertainty associated with the systematic error of the measurement process;
u_p: standard uncertainty associated with the measurement procedure;
u_cal: standard uncertainty associated with the uncertainty of the workpiece calibration;
u_w: standard uncertainty associated with material and manufacturing variations.
Finally, the law of uncertainty propagation is applied to obtain the combined standard uncertainty according to GUM JCGM 100:2008 [13] and the result is multiplied by an appropriate coverage factor to yield an expanded uncertainty, according to Equation (1). Figure 1 shows the practical approach to this method.
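This budget can be illustrated with a minimal sketch, assuming the commonly stated form U = k·√(u_cal² + u_p² + u_b² + u_w²), with u_p the standard deviation of the repeated results and b the mean deviation from the calibrated value; the exact expressions should be taken from the specification itself, and the numerical values below are illustrative only:

```python
import math

def iso15530_3_budget(measurements, calibrated_value, u_cal, u_w, k=2.0):
    """Sketch of an ISO 15530-3 style uncertainty budget.

    measurements: repeated results (>20 recommended) on the calibrated part.
    calibrated_value: the calibrated reference value of the same characteristic.
    u_cal: standard uncertainty of the workpiece calibration.
    u_w: standard uncertainty from material/manufacturing variation.
    """
    n = len(measurements)
    mean = sum(measurements) / n
    # Repeatability of the measurement procedure (sample standard deviation).
    u_p = math.sqrt(sum((y - mean) ** 2 for y in measurements) / (n - 1))
    # Systematic error; if the result is not corrected, u_b = |b|.
    b = mean - calibrated_value
    u_b = abs(b)
    # Combined standard uncertainty times the coverage factor k.
    U = k * math.sqrt(u_cal**2 + u_p**2 + u_b**2 + u_w**2)
    return u_p, b, U

u_p, b, U = iso15530_3_budget(
    [10.012, 10.009, 10.011, 10.010, 10.013], 10.000, u_cal=0.002, u_w=0.001)
print(round(u_p, 5), round(b, 4), round(U, 4))
```

Here the uncorrected systematic error b dominates the budget, which is the usual argument for either correcting it or improving the similarity between master and measured part.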
According to the ISO 15530-3 technical specification, the standard uncertainty (u_p) is determined using Equation (2), and the standard uncertainty associated with the systematic error is given by Equation (3). Moreover, if the measurement result is not corrected for the systematic error, the error contributes fully to the uncertainty budget; thus, u_b = b.

ISO 15530-4 Technical Specification

The ISO 15530-4 [47] technical specification introduces a "task-specific uncertainty" assessment method based on computer simulation. Measuring instruments such as CMMs and 3D optical systems are multi-purpose instruments, which means that potential measurement uncertainties vary with the task being performed, the environment, the operator, and the chosen measurement methodology. The "task-specific uncertainty" in coordinate measurement is the measurement uncertainty that results when a specific feature is measured using a specific inspection plan. The approach is similar to that of the GUM but, instead of using an analytical approach based on a complete closed-form mathematical model, it uses a simulation method (for example, the Monte Carlo method) run on a computer to estimate the uncertainty statement for a particular measurement task. This is even more complex than the GUM approach because an initial model of the measurement instrument and process is required to run the simulation.
The simulation model or virtual instrument model generates a perturbed point that represents an estimate of what a particular measurement instrument would have reported when measuring that commanded point. This process is performed several times, running as many measurements as the simulation iterations (hundreds or thousands), which enables the creation of simulation results of the measurement uncertainty.
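The perturbed-point loop can be pictured with a minimal Monte Carlo sketch, assuming a toy Gaussian per-point error model and a simple radius evaluation; real virtual-instrument models include correlated systematic terms and full GD&T evaluations:

```python
import math
import random

def simulate_task_uncertainty(radius=25.0, n_points=360, sigma=0.01,
                              iterations=2000, seed=1):
    """Monte Carlo sketch of the ISO 15530-4 idea: a 'virtual instrument'
    perturbs each commanded point with a Gaussian error model, the evaluation
    (here a simple radius estimate) is repeated over many simulated
    measurements, and the spread of results estimates the task-specific
    standard uncertainty."""
    rng = random.Random(seed)
    angles = [2 * math.pi * i / n_points for i in range(n_points)]
    results = []
    for _ in range(iterations):
        # Perturb every commanded point on the nominal circle.
        pts = [(radius * math.cos(a) + rng.gauss(0, sigma),
                radius * math.sin(a) + rng.gauss(0, sigma)) for a in angles]
        # Evaluate the feature: mean distance from the centroid.
        cx = sum(p[0] for p in pts) / n_points
        cy = sum(p[1] for p in pts) / n_points
        r_est = sum(math.hypot(x - cx, y - cy) for x, y in pts) / n_points
        results.append(r_est)
    mean = sum(results) / iterations
    u = math.sqrt(sum((r - mean) ** 2 for r in results) / (iterations - 1))
    return mean, u

mean_r, u_task = simulate_task_uncertainty()
print(round(mean_r, 3), u_task)
```

Changing the sampling strategy (n_points) or the error model (sigma) changes u_task, which is precisely why the resulting uncertainty is "task-specific".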
The current state of the art shows that the ISO 15530-4 approach is already applied to measuring instruments such as CMMs or laser trackers, while the modelling of optical sensors is still under development by the research community; therefore, a VCMM for optical sensors is not yet commercially available. The method applied to CMMs is popularly known as the Virtual Coordinate Measuring Machine (VCMM) [48], which performs a point-by-point simulation of measurements, emulating the measurement strategy, measuring conditions, and physical behaviour of the CMM, together with the dominant uncertainty contributions disturbing the measurement [48][49][50]. For spherical measuring instruments such as laser trackers or laser scanners, a basic spherical error model is considered in combination with a Gaussian Probability Density Function (PDF) to apply the law of uncertainty propagation. Figure 2 shows the VCMM approach, where the thick black lines show the data flow for a normal CMM measurement, while the thick grey lines show the additional data flow employed to obtain a VCMM estimate. Wilhelm et al. [50] presented a description of the complete VCMM workflow, as shown in Figure 2. The so-called Digital Metrology/Measurement Twin (D-MT) is the virtual representation of either a measurement instrument or the complete measurement procedure [51][52][53], and it is common to use a mathematical model similar to that developed within the ISO 15530-4 approach to run the simulation.
Here, Artificial Intelligence (AI) algorithms such as machine learning, deep learning or neural networks are being researched for the development of those D-MT and uncertainty assessment tasks [54][55][56].
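The basic spherical error model mentioned above can be sketched as a Monte Carlo propagation of assumed Gaussian range and angle uncertainties to a Cartesian coordinate; the numerical values below are illustrative assumptions, not data from any particular instrument:

```python
import math
import random

def propagate_spherical(r, theta, phi, u_r, u_theta, u_phi, n=20000, seed=0):
    """Monte Carlo propagation of spherical-coordinate uncertainties (range r,
    horizontal angle theta, vertical angle phi, all assumed Gaussian and
    uncorrelated) to the Cartesian x coordinate x = r*sin(phi)*cos(theta).
    The angular terms dominate at long range because their effect scales
    with r."""
    rng = random.Random(seed)
    xs = []
    for _ in range(n):
        ri = rng.gauss(r, u_r)
        ti = rng.gauss(theta, u_theta)
        pi_ = rng.gauss(phi, u_phi)
        xs.append(ri * math.sin(pi_) * math.cos(ti))
    mean = sum(xs) / n
    return math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))

# Illustrative figures only: 5 m range, 10 um ranging noise, 5 urad angles.
u_x = propagate_spherical(5000.0, 0.3, 1.0, u_r=0.01, u_theta=5e-6, u_phi=5e-6)
print(u_x)
```

The same result can be cross-checked analytically with the law of propagation of uncertainty applied to the partial derivatives of x with respect to r, theta and phi.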

MBD-Based Metrology
The MBD approach will make it possible to meet the challenge of converting dense data into meaningful information in a matter of seconds within the production line, enabling quick decision-making within the production environment. However, the current state of MBD industrial implementation shows that manufacturers have applied MBD to product definition for some time, with aerospace and defence customers playing the role of leaders and adoption being slower in other industries [22].
From the CAD suppliers' point of view, MBD is seen as the cornerstone of creating a functioning digital thread. While the goal is to have a single source of truth for downstream operations in making a part, most CAD suppliers provide MBD in a proprietary format, which means that interoperability between systems remains a challenge. However, a universal CAD format already exists: the ISO 10303 STEP format [57], with its accompanying AP 242 extension, which includes 3D model data representation, geometric tolerances and PMI to enable global design and manufacturing collaboration [23]. Nevertheless, several questions remain regarding the full definition of MBD. Standards such as ASME Y14.41 [58] and ISO 16792 [59] document how a model should be defined with annotations. These standards also help in interpreting the data within the model, but they do not specify the amount of information that the model must contain [60].
From an MBD-based metrology point of view, MBD enables an automatic quality assurance workflow, allowing the automation of either the measurement program creation or the data evaluation process stages [24]. While the former is already available within the main commercial CMM software, the latter can be applied to any point cloud if the measurand MBD model and the MBD software are available. Model-based inspection has not received much attention within the metrology community since the 1990s [61][62][63][64].
The MBD-based metrology process starts by creating a 3D CAD model with semantic PMI information that should be both human- and machine-readable [65]. The 3D model with PMI shall contain all GD&T geometric information related to the component under measurement, as well as information related to the bill of materials (BOM), surface finish, weld symbols, manufacturing or measurement process plan data, metadata and notes, the history of engineering change orders, legal/proprietary/export control notices and other definitive digital data [65].
The associativity between the CAD model and MBD is required to have a fully semantic smart model that allows automatic part programming and post-processing. The ability of downstream programs to read MBD models and create measurement programs is as important as creating CAD models with an attached semantic MBD. Thus, the CMM is virtually configured, and once the MBD file is imported and a set of rules is applied and matched to the configured CMM, a part program is automatically generated aided by these a priori digital approaches [24,66,67]. Typically, a second optimisation is performed to reduce the number of probe changes and minimise the CMM path length.
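The rule-matching step can be pictured with a toy sketch in which each semantic MBD feature type maps to a probing strategy; the feature names, rule contents and data layout are hypothetical illustrations, not any vendor's API:

```python
# Illustrative only: maps semantic MBD feature types to probing strategies.
RULES = {
    "plane":    {"points": 9,  "pattern": "grid"},
    "cylinder": {"points": 12, "pattern": "two circles"},
    "circle":   {"points": 8,  "pattern": "equally spaced"},
}

def build_part_program(mbd_features):
    """For every annotated feature read from a (hypothetical) MBD model,
    match a rule and emit one measurement operation; feature types with no
    matching rule are flagged for manual planning instead of failing
    silently."""
    program, manual = [], []
    for feat in mbd_features:
        rule = RULES.get(feat["type"])
        if rule is None:
            manual.append(feat["name"])
            continue
        program.append({"feature": feat["name"], **rule})
    return program, manual

features = [
    {"name": "datum_A", "type": "plane"},
    {"name": "bore_1", "type": "cylinder"},
    {"name": "freeform_1", "type": "spline"},
]
program, manual = build_part_program(features)
print(len(program), manual)
```

A path-length or probe-change optimisation pass, as mentioned above, would then reorder the emitted operations.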
In the post-processing stage, the MBD concept allows fast point cloud analysis and evaluation of the measurement data. During the automatic evaluation process, the acquired point cloud is aligned to the CAD model, and an automatic segmentation process is performed using the available MBD data. At this stage, each measured point is associated with its corresponding geometric feature. Then, the geometric features are adjusted using linear regression methods, rejecting possible outliers. Finally, the real relationships among the adjusted features are estimated (dimension, form error, relative positioning, etc.) through the fully automatic interpretation and evaluation of the previously defined GD&Ts. Thus, the process of converting dense data into meaningful metrology-rich information is executed automatically in seconds.
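As a minimal sketch of the fitting and evaluation step, assuming the segmentation has already assigned a set of points to a planar feature, a total least-squares plane can be fitted and the flatness evaluated as the residual peak-to-valley (the point set below is synthetic):

```python
import numpy as np

def fit_plane_flatness(points):
    """Total least-squares plane through a segmented point set (via SVD of
    the centred coordinates); flatness is the peak-to-valley range of the
    signed orthogonal residuals."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The plane normal is the right singular vector of the smallest
    # singular value of the centred point matrix.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    residuals = (pts - centroid) @ normal
    return normal, residuals.max() - residuals.min()

# Synthetic nearly flat patch with a single 20 um bump at the centre.
pts = [(x, y, 0.0) for x in range(5) for y in range(5)]
pts[12] = (2, 2, 0.02)
normal, flatness = fit_plane_flatness(pts)
print(round(flatness, 4))
```

Outlier rejection, as described above, would remove points with large residuals and refit before reporting the flatness.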

The Methodology and Its Experimental Implementation
The following lines describe the experimental exercise performed on the dummy part to realise the practical implementation of the previously mentioned technological concepts (see Figure 3). In general terms, the suggested workflow is based on three main steps: (1) measurement of the dummy part, including the data acquisition process (×10 repetitions) and automatic data process strategy (MBD) (box 1); (2) reference measurement of the dummy part on a CMM according to the ISO 15530-4 technical specification (box 2), and (3) the task-specific (GD&T) uncertainty assessment process according to the ISO 15530-3 technical specification (box 3).
The three main steps of the methodology are explained next and linked to the uncertainty contributors that comprise the uncertainty budget presented in the third step.

1. Measured GD&T evaluation: Automatic 3D point cloud measurement, evaluation, and statistical analysis of multiple GD&T results based on the MBD approach are performed. From these data, the standard uncertainty associated with the measurement process variability (u_p) is obtained.

2. Reference GD&T values: The dummy part is calibrated on a CMM according to the ISO 15530-4 technical specification [47]. The ZEISS VCMM™ tool is used to assess the task-specific uncertainty value for every calibrated feature. From these data, the standard uncertainty associated with the uncertainty of the CMM calibration (u_cal) is obtained.

3. ISO 15530-3 method: The task-specific uncertainty assessment of every GD&T value obtained from the 3D point cloud measurement is performed according to the ISO 15530-3 technical specification [46]. From these data, the standard uncertainty associated with the systematic error of the measurement process (u_b) is obtained.
The experimental implementation of the suggested methodology is explained next point by point:

Measured GD&T Evaluation
A dense 3D point cloud data acquisition process is performed using a GOM ATOS III Triple Scan™ 3D optical system on a medium-sized geometric-type dummy part. The data acquisition process is fully automatic by combining an automatic rotary table with the manual triggering of the measuring instrument. Thus, any potential error source derived from external sources, such as the alignment process between partial scanning or operator influence, is avoided.
The experimental exercise is performed in a metrology laboratory at a temperature of 20 ± 1 °C. In this way, thermal stability during the data acquisition process is guaranteed, and therefore, the geometric variation of the dummy part caused by thermal drift is avoided. Figure 4 shows the dummy part employed during the experimental exercise.
The measurement instrument configuration is set at a working volume of 320 × 240 × 240 mm³ so that the measurement resolution is optimised for the specific dummy part under measurement. The automatic data acquisition process is realised through eight equidistant angular positions of the rotary table, where partial scans are performed and stitched together automatically to reconstruct the overall 3D point cloud. Thus, the entire measurement process is executed within 45 s, and a point cloud comprising 1 million points is obtained.
The fiducial targets are attached to the rotary table and unequivocally identified in each partial scan. In this manner, the automatic partial point cloud registration problem is solved, and an automatic data merging process between partial scans is performed. Once the reconstructed 3D point cloud is obtained, it is converted into a mesh using the Delaunay triangulation method. In addition to the XYZ information of every point within the point cloud, this mesh also contains the surface normal value for every point, which makes the final MBD-based automatic point cloud segmentation process smarter and more robust. Figure 5 depicts the measurement scenario, comprising the GOM ATOS III Triple Scan™ 3D optical system in combination with the automatic rotary table and the dummy part on it.
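The fiducial-based registration step can be sketched with the Kabsch/SVD method, assuming at least three common targets are identified in two partial scans; the target coordinates below are hypothetical:

```python
import numpy as np

def rigid_align(src, dst):
    """Best-fit rotation R and translation t mapping src fiducials onto dst
    (Kabsch method): minimises sum ||R @ s + t - d||^2 over corresponding
    target pairs."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

# Hypothetical fiducials: the second scan sees the same targets rotated
# 45 degrees about Z and shifted.
theta = np.pi / 4
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
targets = np.array([[0, 0, 0], [100, 0, 0], [0, 100, 0], [0, 0, 100.0]])
seen = targets @ Rz.T + np.array([10.0, -5.0, 2.0])
R, t = rigid_align(targets, seen)
print(np.allclose(R, Rz), np.allclose(t, [10, -5, 2]))
```

Once R and t are recovered, the whole partial scan is transformed into the common coordinate frame and merged.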
Metrology 2022, 2, FOR PEER REVIEW 9 for every point which makes the final MBD-based automatic point cloud segmentation process smarter and more robust. Figure 5 depicts the measurement scenario, comprising a GOM ATOS III Triple Scan™ 3D optical system in combination with the automatic rotary table and the dummy part on it. The batch of experiments comprises ten 3D point cloud measurement repetitions on the dummy part. Once they are completed, an automatic data-processing approach based on the MBD strategy is applied. Thus, the dense point cloud is automatically processed and converted into meaningful GD&T information. The batch of experiments comprises ten 3D point cloud measurement repetitions on the dummy part. Once they are completed, an automatic data-processing approach based on the MBD strategy is applied. Thus, the dense point cloud is automatically processed and converted into meaningful GD&T information.
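The mesh conversion described above attaches a surface normal to every point. As a minimal sketch of how per-point normals can be estimated from raw point cloud data (local plane fitting by PCA over nearest neighbours; the value of k and the brute-force neighbour search are illustrative choices, not the GOM implementation):

```python
import numpy as np

def estimate_normals(points, k=8):
    """Estimate a unit surface normal for every point by fitting a local
    plane (PCA) to its k nearest neighbours."""
    normals = np.empty_like(points)
    for i, p in enumerate(points):
        # brute-force k-nearest neighbours (fine for a sketch)
        d = np.linalg.norm(points - p, axis=1)
        nbrs = points[np.argsort(d)[:k + 1]]  # includes the point itself
        centred = nbrs - nbrs.mean(axis=0)
        # the right-singular vector of the smallest singular value
        # is the normal of the best-fit local plane
        _, _, vt = np.linalg.svd(centred)
        normals[i] = vt[-1]
    return normals
```

Note that PCA leaves the normal sign ambiguous; production software orients normals consistently using the mesh topology or the sensor viewing direction.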
The software employed at this point is GOM Inspect™ metrology software. It allows MBD-type post-processing of data, which means that it can digitally establish a relationship between nominal GD&T information and geometric elements within the captured data. This workflow is aligned with the PMI concept and interpreted according to the ISO 1101 standard [68]. A different option for running MBD-type post-processing is to define the MBD data within the CAD model by adding the GD&T information to the available CAD file (for example, a CATPart file in CATIA). Once this nominal MBD data-based file is prepared, the automatic evaluation of every GD&T can be performed. The workflow is as follows.

• Step 1: Point cloud-to-mesh data conversion: The measured point cloud is converted into a mesh format to make the subsequent data management and processing steps more robust and precise. The mesh format estimates and adds the surface normal values to the point cloud data, enabling higher-accuracy results in the subsequent segmentation operations (Step 3 below).

• Step 2: Mesh-to-CAD alignment: The 3D mesh is aligned with the available CAD model, which is crucial to ensure the accuracy and robustness of the MBD-based data evaluation method because it determines the correct parameterisation within the point cloud segmentation method. Thus, accurate alignment is required to achieve reliable results. In this study, the best-fit alignment method is used as an accurate method (acceptance criteria below a few microns).

• Step 3: Automatic geometric feature segmentation: The mesh is split into multiple point clouds corresponding to each geometric feature with the aid of the CAD nominal feature information. In this step, the point coordinates, surface normal data (real and nominal values), and surface curvature parameters are employed to support the point cloud segmentation algorithms and reinforce their robustness.

• Step 4: Real geometric feature adjustment: The previously obtained feature-specific point cloud segments are fitted to the corresponding geometric features by linear regression methods, rejecting possible outliers. Noisy points are eliminated using filters that estimate the 3D distance of each point with respect to the fitted geometric feature. If the point-to-element distance is greater than twice the standard deviation (2σ) of the input points during the geometric feature adjustment process, the point is flagged as unsuitable and consequently removed from the process.

• Step 5: GD&T evaluation: Once the previous step is successfully performed, an automatic evaluation of every GD&T of the fitted features (measured values) is performed with the help of the nominally defined annotations and relationships (ISO 1101 standard [68]). Because the software already knows the theoretical relationships among the geometric features and datum objects through the previously recognised annotations, it can estimate the real GD&T values.
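The best-fit alignment of Step 2 can be illustrated with a minimal rigid-registration sketch (not the GOM Inspect™ implementation): given corresponding point pairs, the least-squares rotation and translation follow in closed form from the Kabsch/SVD method.

```python
import numpy as np

def best_fit_rigid(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst,
    assuming known point correspondences (Kabsch/SVD method)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # guard against a reflection solution (det = -1)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t
```

In practice, ICP-style methods iterate between establishing correspondences and this closed-form step; the aligned mesh is then `src @ R.T + t`.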
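The 2σ point-to-element filter of Step 4 can be sketched for a planar feature as follows (illustrative only, assuming a simple SVD-based least-squares plane fit rather than the software's proprietary routines):

```python
import numpy as np

def fit_plane_with_rejection(points, n_iter=3):
    """Least-squares plane fit with the 2-sigma point-to-plane outlier
    filter described in Step 4: points whose distance to the fitted
    plane exceeds twice the standard deviation of all point-to-plane
    distances are removed and the plane is refitted."""
    pts = np.asarray(points, float)
    for _ in range(n_iter):
        centroid = pts.mean(axis=0)
        # plane normal = right-singular vector of the smallest singular value
        _, _, vt = np.linalg.svd(pts - centroid)
        normal = vt[-1]
        d = (pts - centroid) @ normal       # signed point-to-plane distances
        keep = np.abs(d) <= 2.0 * d.std()
        if keep.all():
            break
        pts = pts[keep]                     # drop outliers, refit next pass
    return centroid, normal, pts
```

The same scheme generalises to cylinders and free-form surfaces by swapping the fit model while keeping the 2σ rejection rule.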
Following this process, the standard uncertainty associated with the measurement process variability (u_p) is obtained for each feature. The ten available 3D point cloud measurement repetitions are statistically processed, and the uncertainty (u_p) is given by the standard deviation parameter according to the ISO 15530-3 technical specification. In addition, the average value is reported at this step for the evaluation of the systematic error (u_b) value within the next step. Figure 6 shows the experimental results for the dummy part in the GOM Inspect™ metrology software.
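The computation of u_p described above reduces to simple statistics over the ten repetitions; a minimal sketch (the sample standard deviation, ddof = 1, is assumed here):

```python
import numpy as np

def process_variability(repetitions):
    """Mean and standard uncertainty u_p of repeated GD&T measurements.
    Per ISO 15530-3, u_p is taken as the standard deviation of the
    repetitions; the sample standard deviation (ddof=1) is assumed."""
    reps = np.asarray(repetitions, float)
    return reps.mean(), reps.std(ddof=1)

# illustrative flatness repetitions in mm (not measured results)
vals = [0.021, 0.019, 0.020, 0.022, 0.018, 0.020, 0.021, 0.019, 0.020, 0.020]
mean, u_p = process_variability(vals)
```

The mean value is carried forward to the assessment of the systematic error contributor u_b in the next step.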


Reference GD&T Values
The aim is to calibrate the measured dummy part and obtain the reference values for each evaluated feature to realise the uncertainty budget according to the ISO 15530-3 technical specification [46]. Thus, the dummy part is measured in a ZEISS UPMC 850 CARAT CMM, in which the ZEISS VCMM™ tool is available for task-specific uncertainty assessment according to the ISO 15530-4 technical specification [47]. The ZEISS VCMM™ tool considers the mathematical model of the UPMC 850 CARAT CMM to perform the task-specific uncertainty assessment process through multiple iterations (×1000 repetitions).
To feed the mathematical model running within the ZEISS VCMM™ tool, the error and influence factors, as well as their variability affecting the measurement accuracy, were previously characterised and introduced into the model. Figure 7 shows the dummy part calibration process. In this manner, a selection of the GD&T features to be considered within the experimental implementation is performed. The types of geometric elements and tolerances that were considered are explained next, according to the type of tolerance and the number of features measured.
• Size: cylinder diameter (20×, divided into groups by diameter);
• Form: flatness of planes (3×) and complex surfaces (2×);
• Location and orientation: positioning and composed positioning of cylinders (divided into three groups).
The positioning tolerances are evaluated considering the coordinate system created by the three planes which define the ABC datums. In total, the task-specific uncertainty of 52 GD&T features is evaluated. Figure 8 shows the task-specific uncertainty assessment exercise based on the ZEISS UPMC 850 CARAT CMM measurements. From these data, the standard uncertainty associated with the uncertainty of the CMM calibration of the dummy part (u_cal) is obtained.

Implementation of ISO 15530-3 Technical Specification
As previously stated, the uncertainty assessment method suggested in this article is based on the application of the ISO 15530-3 technical specification, backed by the ISO 15530-4 technical specification, through which the task-specific uncertainty assessment for the calibrated values (u_cal) is realised.
According to the ISO 15530-3 technical specification, the uncertainty of the systematic error b (u_b) is assessed by the difference between the average value obtained during the assessment of the measurement process variability (u_p) parameter and the value indicated by the CMM. However, according to the GUM recommendation, the measurement results should be corrected by the amount of the systematic effect. Thus, if the measurement result is not corrected for the systematic error, the error fully contributes to the uncertainty budget. The uncertainty budget presented here comprises the uncertainty contributors u_b, u_p, and u_cal, whereas u_w is negligible because there is no variation between the calibrated and the measured dummy: the same physical dummy part is used during the calibration and measurement processes.
Other potential uncertainty sources, such as the measuring system resolution or any divergence between the master and measured dummy parts, were discarded because of their negligible effect on the uncertainty budget.
Equation (1) shows the combined standard uncertainty, u, which is given by the quadrature sum of each uncertainty contributor. Equation (2) shows the expanded measurement uncertainty, U, determined with a coverage factor k = 2 for an approximate coverage probability of 95%.
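The combined and expanded uncertainties referenced here can be written out from the contributor definitions that follow (a reconstruction per ISO 15530-3; the exact typeset form and equation numbering in the published version may differ):

```latex
% Combined standard uncertainty: quadrature sum of the contributors
u = \sqrt{u_{\mathrm{cal}}^{2} + u_{p}^{2} + u_{b}^{2} + u_{w}^{2}}

% Expanded measurement uncertainty, coverage factor k = 2 (~95 % coverage)
U = k \cdot u
```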
wherein:
u_w: standard uncertainty associated with material and manufacturing variations; negligible in this case.
u_cal: standard uncertainty associated with the uncertainty of the CMM calibration (task-specific uncertainty value of each GD&T estimated by the ZEISS VCMM™ tool).
u_b: standard uncertainty associated with systematic errors in the measurement process, assessed by the difference between the average value obtained during the assessment of the measurement process variability (u_p) parameter and the value indicated by the CMM.
u_p: standard uncertainty associated with the measurement process variability; the standard deviation of the ten 3D point cloud measurement repetitions is considered for each GD&T.
k: coverage factor; k = 2 defines an interval with a level of confidence of approximately 95%, assuming a normal distribution.
U: expanded uncertainty for each GD&T, comprising all the uncertainty sources and their propagation, with a confidence level of approximately 95% (k = 2).
Other uncertainty sources, such as thermal effects, measuring process drift, or the interaction between the light and the part surface, have not been considered separately within the uncertainty budget, as it is assumed that they contribute to the standard uncertainty associated with the measurement process variability (u_p).
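Assembling the budget from the contributors above is then a one-line quadrature sum; a sketch with purely illustrative values (not the paper's measured results):

```python
import math

def expanded_uncertainty(u_cal, u_p, u_b, u_w=0.0, k=2.0):
    """Combined standard uncertainty u (quadrature sum of contributors)
    and expanded uncertainty U = k * u (k = 2 for ~95 % coverage)."""
    u = math.sqrt(u_cal**2 + u_p**2 + u_b**2 + u_w**2)
    return u, k * u

# illustrative contributor values in mm (not measured results)
u, U = expanded_uncertainty(u_cal=0.001, u_p=0.010, u_b=0.050)
```

With these illustrative magnitudes the budget is dominated by u_b, mirroring the behaviour reported in the Results section.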

Results
This section describes the results obtained during the experimental implementation of the proposed uncertainty assessment method. For the sake of clarity, the experimental results are presented in such a way that every uncertainty contributor can be explained in detail. First, the standard uncertainty results associated with the measurement process variability (u_p) are presented. Then, the standard uncertainty results associated with the uncertainty of the CMM calibration (u_cal) are presented, along with the values indicated by the ZEISS CMM for each evaluated GD&T. Subsequently, the standard uncertainty results associated with the systematic error of the measurement process (u_b) are presented. Finally, the uncertainty budget for the 3D point cloud task-specific measurement uncertainty assessment is presented, where the expanded measurement uncertainty U is obtained for each GD&T element.
Table 1 lists the results obtained for the (u_p) uncertainty contribution. The result comprises every uncertainty component that falls within the measurement process variability, such as the instrument repeatability itself, the MBD data-processing strategy, or any potential thermal drift, among other minor contributors. Table 1 also reports the mean value obtained for every measured GD&T for the subsequent assessment of the (u_b) uncertainty contributor. For every GD&T element, the highest measurement process variability (u_p) results are below 20 µm, whereas the average standard deviation is less than 10 µm. These results demonstrate that the measurement process, from the data acquisition and 3D point cloud reconstruction to the MBD-based data-processing procedure, achieves micrometre-level accuracy.
Table 2 shows the results indicated by the CMM for every GD&T under study. Thus, it shows the reference result reported by the ZEISS UPMC 850 CARAT CMM in addition to the task-specific uncertainty value (u_cal) estimated using the ZEISS VCMM™ tool.
In this way, the CMM measurement uncertainty for every measured GD&T is assessed, and thus a task-specific uncertainty budget can finally be established.

Uncertainty of the CMM Calibration, u_cal
For every GD&T element, the highest CMM calibration uncertainty (u_cal) values are up to 7 µm, whereas the average value is within 1 µm. These results demonstrate that the (u_cal) uncertainty values estimated by the ZEISS VCMM™ tool are consistent with the ZEISS UPMC 850 CARAT CMM Maximum Permissible Error (MPE) specification ((0.6 + L/1000) µm), although some specific and complex GD&T results are considerably worse due to the complexity of the evaluation. At this point, the authors made a special effort to understand these results. Thus, a second batch of calibration measurements is performed on the CMM, but the results do not vary significantly, indicating that the obtained results are properly evaluated by the VCMM tool. The authors hypothesise that these results possibly arise from the acquired raw data and the employed measuring strategies.
Table 3 lists the standard uncertainty associated with the systematic error of the measurement process (u_b). It is assessed by the difference between the average value obtained during the assessment of the measurement process variability (u_p) parameter and the value indicated by the CMM during the dummy calibration process.
The systematic error (u_b) results show a wide range of values. While some results are within a few microns, others are between 0.1 and 0.2 mm (absolute values). The average value for all the GD&Ts is within 50 µm. It is not easy to understand and justify the wide range of values obtained for the (u_b) uncertainty contributor, but it is possibly explained by the higher point quantity and more homogeneous point distribution of the measurements obtained by the 3D optical system compared with the CMM measurements.
As highlighted in the introduction, 3D optical systems are a relatively new technology, and their measurement error sources are still being researched. They are affected by many potential uncertainty sources, such as the lighting conditions, measurement and material properties, system orientation and resolution, ambient temperature, measurement volume, and chromatic effects, which complicate the measurement uncertainty assessment process to a large extent. Nonetheless, as stated in the introduction, this study aims to present a point cloud measurement task-specific uncertainty assessment method and its experimental implementation; the expanded measurement uncertainty results obtained are less important than the method itself.
Table 4 summarises the uncertainty budget for the experimental implementation of the proposed method. It shows the three major uncertainty contributors and the expanded measurement uncertainty result U obtained from Equation (2). It should be noted that the measurement results are not corrected by the amount of the systematic effects; therefore, the (u_b) contributor is considered within the final uncertainty budget.
The uncertainty budget shows that the systematic error contributor (u_b) is the main contributor to the final result. While the CMM calibration uncertainty (u_cal) contributor average value falls within 1 µm and the measurement process variability (u_p) average value is less than 10 µm, the systematic error (u_b) average value falls within 50 µm. Thus, the CMM calibration uncertainty (u_cal) becomes negligible, which means that the main contributors to the task-specific uncertainty budget are the measurement process variability (u_p) and the systematic error contributor (u_b). As stated before, the measurement result is not corrected by the amount of the systematic effects, which, in this case, are the main uncertainty sources for measurements with 3D optical systems.

Discussion
This article presents a methodology for task-specific uncertainty assessment of 3D point clouds based on ISO 15530-3 and ISO 15530-4 technical specifications and the application of MBD-based post-processing for the automatic processing of point clouds.
It presents an uncertainty budget comprising three main uncertainty contributors according to the ISO 15530-3 technical specification. The three major uncertainty contributors are (a) the measurement process variability (u_p), (b) the uncertainty of the CMM calibration (u_cal), and (c) the uncertainty of the systematic error (u_b). The uncertainty associated with material and manufacturing variations, u_w, is considered negligible.
The methodology presented here suggests an automatic 3D point cloud measurement and evaluation process, where the statistical analysis of multiple GD&T results relies on an MBD-based approach. From these data, the standard uncertainty associated with the measurement process variability (u_p) is automatically obtained. The standard uncertainty associated with the uncertainty of the CMM calibration (u_cal) is obtained using the ZEISS VCMM™ tool, which assesses a task-specific uncertainty value for every calibrated feature according to the ISO 15530-4 technical specification. Finally, the standard uncertainty associated with the systematic error of the measurement process (u_b) is obtained from the difference between the average value obtained during the assessment of the measurement process variability and the value indicated by the CMM during the dummy part calibration process.
The experimental results show that the systematic error contributor (u_b) is the main contributor to the uncertainty budget. While the CMM calibration uncertainty (u_cal) contributor average value falls within 1 µm and the measurement process variability (u_p) average value is less than 10 µm, the systematic error (u_b) average value falls within 50 µm. Thus, the CMM calibration uncertainty (u_cal) becomes negligible, which means that the main contributors to the task-specific uncertainty budget are the measurement process variability (u_p) and the systematic error contributor (u_b).
In summary, a reliable task-specific uncertainty assessment method is developed and successfully implemented. In the absence of numerical simulation models for optical systems, this methodology allows an uncertainty budget to be established to understand the order of magnitude of the measurement uncertainty of 3D optical systems.
One of the limitations of the presented methodology is its scalability to large components, for which CMM reference values are hardly achievable by calibrating a dummy part. Hence, this methodology is best applied to the scanning of parts up to approximately 1.5 to 2 m manufactured in serial production, since this is the most common working range for many applications, such as automotive or the manufacturing of metallic components.
Finally, concerning the MBD-based metrology data processing strategy, the experimental approach presented in this article demonstrates that the nominal PMI-based method is appropriate for converting dense point cloud data into desired dimensional metrology results (GD&Ts). It enables an effective data processing approach in terms of accuracy, speed, and robustness which in turn allows a fully automatic geometric point cloud evaluation process to avoid errors during the result interpretation and procurement processes.
Further work will focus on analysing the deviations between the results of the 3D optical system and the reference values obtained with the CMM. Because these measuring technologies differ considerably in terms of accuracy, point quantity, and point distribution, deviations will remain, but analysing them would help to understand the complex intrinsic performance of 3D scanning systems. These preliminary results and accuracy assessment methods could support the development of AI-based numerical methods that describe the optical performance of 3D scanners.
Regarding the VCMM approach, this study demonstrates that simulation-based metrology should be applied for task-specific assessment of reference values. This shows the applicability of digital twins within the metrology field in terms of a priori uncertainty estimation and a posteriori uncertainty assessment. Thus, the measurement procedure can be optimised based on those digital twin simulation results. Another interesting future research line within the VCMM field is to employ the simulation-based metrology concept to create nominal dense reference point clouds with known uncertainty values. Therefore, fast uncertainty assessment procedures should be developed for dense point cloud data.