Journal Description
Metrology
Metrology is an international, peer-reviewed, open access journal on the science and technology of measurement and metrology, published quarterly online by MDPI.
- Open Access: free to download, share, and reuse content. Authors receive recognition for their contribution when the paper is reused.
- Rapid Publication: first decisions in 15 days; acceptance to publication in 3 days (median values for MDPI journals in the second half of 2021).
- Recognition of Reviewers: APC discount vouchers, optional signed peer review, and reviewer names published annually in the journal.
Latest Articles
Outlier Elimination in Rough Surface Profilometry with Focus Variation Microscopy
Metrology 2022, 2(2), 263-273; https://doi.org/10.3390/metrology2020016 - 17 May 2022
Abstract
Rough surfaces, such as those produced by metal additive manufacturing, are challenging to measure. Artifacts caused by irregular, difficult-to-measure geometries are inevitable, yet removing all of them would discard part of the surface information. Unlike previous works, the postprocessing in this paper includes an additional step that eliminates artifacts based on the autocorrelation functions of particular subimages instead of simply removing them. This improves the accuracy of the evaluated surface roughness and provides a more comprehensive view of the topography. In addition, a dome-shaped LED array ring light is proposed to provide all-round illumination of highly irregular workpiece surfaces. The experimental results obtained by focus variation microscopy (FVM) are validated against the stated roughness values of a Rubert Microsurf 329 comparator test panel and against confocal microscope measurements of a metal additive workpiece.
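As a rough illustration of the idea of screening subimages by their autocorrelation rather than deleting artifacts outright, the Python sketch below tiles a height map, estimates a correlation length per tile from its normalised autocorrelation, and flags tiles whose correlation length is anomalous. The tile size, the 0.5 decay criterion, and the median-based outlier rule are illustrative assumptions, not the authors' published algorithm.

```python
import numpy as np

def normalised_acf(sub):
    """Normalised circular 2D autocorrelation of a mean-removed subimage."""
    sub = sub - sub.mean()
    spec = np.fft.fft2(sub)
    acf = np.fft.ifft2(spec * np.conj(spec)).real
    return np.fft.fftshift(acf) / max(acf[0, 0], 1e-30)

def correlation_length(sub):
    """Lag (in pixels) at which the central ACF row first drops below 0.5."""
    acf = normalised_acf(sub)
    centre = sub.shape[0] // 2
    row = acf[centre, centre:]
    below = np.nonzero(row < 0.5)[0]
    return below[0] if below.size else row.size

def flag_artifact_subimages(height_map, size=32, tol=3.0):
    """Split the map into size x size tiles and flag tiles whose correlation
    length deviates strongly from the median over all tiles."""
    tiles, lengths = [], []
    rows, cols = height_map.shape
    for r in range(0, rows - size + 1, size):
        for c in range(0, cols - size + 1, size):
            tiles.append((r, c))
            lengths.append(correlation_length(height_map[r:r + size, c:c + size]))
    lengths = np.asarray(lengths, dtype=float)
    med = np.median(lengths)
    mad = np.median(np.abs(lengths - med)) + 1e-9
    return [t for t, l in zip(tiles, lengths) if abs(l - med) / mad > tol]
```

Flagged tiles would then be corrected or down-weighted rather than simply discarded, in the spirit of the abstract.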
Open Access Article
Multilateration with Self-Calibration: Uncertainty Assessment, Experimental Measurements and Monte-Carlo Simulations
Metrology 2022, 2(2), 241-262; https://doi.org/10.3390/metrology2020015 - 11 May 2022
Abstract
Large-volume metrology is essential to many high-value industries and contributes to the factories of the future. In this context, we have developed a three-dimensional coordinate measurement system based on a multilateration technique with self-calibration. In practice, an absolute distance meter, traceable to the SI metre, is shared between four measurement heads by fibre-optic links. From these stations, multiple distance measurements of several target positions are performed in order to determine the coordinates of the targets. The uncertainty of these distance measurements has been established with a consistent metrological approach and is better than 5 µm. However, propagating this uncertainty into the measured positions is not a trivial task. In this paper, an analytical solution for assessing the uncertainty of the positions of both targets and heads in a multilateration scenario with self-calibration is provided. The proposed solution is compared with Monte Carlo simulations and with experimental measurements: all three approaches agree well, which suggests that the proposed analytical model is accurate. The confidence ellipsoids provided by the analytical solution describe the geometry of the errors well.
(This article belongs to the Special Issue Advances in Portable 3D Measurement)
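A hedged sketch of the kind of Monte Carlo cross-check the analytical uncertainty model is compared against: the head coordinates, target position, and the 5 µm distance uncertainty are illustrative values, and the self-calibration of the head positions is omitted for brevity.

```python
import numpy as np
from scipy.optimize import least_squares

# Known head positions (metres) -- illustrative geometry, not the authors' setup.
heads = np.array([[0.0, 0.0, 0.0],
                  [4.0, 0.0, 0.2],
                  [0.0, 4.0, 0.1],
                  [4.0, 4.0, 2.5]])

def solve_target(distances, x0=np.array([2.0, 2.0, 1.0])):
    """Least-squares multilateration of one target from four measured distances."""
    residual = lambda p: np.linalg.norm(heads - p, axis=1) - distances
    return least_squares(residual, x0).x

rng = np.random.default_rng(1)
true_target = np.array([1.3, 2.1, 0.9])
true_dist = np.linalg.norm(heads - true_target, axis=1)

# Monte Carlo: propagate a 5 um standard uncertainty on every distance.
samples = np.array([solve_target(true_dist + rng.normal(0.0, 5e-6, 4))
                    for _ in range(2000)])
print("position std (um):", samples.std(axis=0) * 1e6)
print("covariance (m^2):\n", np.cov(samples.T))
```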
Open Access Communication
A Visual Method of Measuring Railway-Track Weed Infestation Level
Metrology 2022, 2(2), 230-240; https://doi.org/10.3390/metrology2020014 - 05 May 2022
Abstract
This paper concerns the assessment of railway track surface condition in relation to the degree of weed infestation. It conceptually describes the proposed method, which uses a visual system to analyse the weed infestation level, and proposes the use of image analysis software for weed detection. The new measurement method allows a mobile assessment of the track’s weed infestation status. Validation of the method under real conditions will allow the system to be extended with additional shades of green from the RAL palette and will enable a more extensive and detailed assessment of weed infestation on the track in accordance with applicable railway regulations.
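A minimal sketch of the kind of green-pixel criterion such a visual system might apply to each camera frame; the RGB margin rule below is an illustrative stand-in for matching against selected RAL greens as described in the abstract.

```python
import numpy as np

def weed_infestation_level(rgb_image, margin=20):
    """Fraction of pixels classified as vegetation ('green') in an RGB frame.

    A pixel counts as green when its G value exceeds both R and B by a margin;
    the margin and this single-threshold rule are illustrative assumptions."""
    img = rgb_image.astype(np.int16)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    green = (g > r + margin) & (g > b + margin)
    return green.mean()

# Example with a synthetic frame: 30 % of pixels strongly green.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[:30, :, 1] = 200          # green band (weeds)
frame[30:, :, :] = 120          # grey ballast
print(f"infestation level: {weed_infestation_level(frame):.2f}")
```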
Open Access Article
The Obtainable Uncertainty for the Frequency Evaluation of Tones with Different Spectral Analysis Techniques
Metrology 2022, 2(2), 216-229; https://doi.org/10.3390/metrology2020013 - 14 Apr 2022
Abstract
Spectral analysis is successfully adopted in several fields. However, the requirements and constraints of the different cases may be so varied that not only the tuning of the analysis parameters but also the choice of the most suitable technique can be a difficult task. For this reason, it is important that a designer of a measurement system for spectral analysis has knowledge about the behaviour of the different techniques with respect to the operating conditions. The case considered here is the realization of a numerical instrument for the real-time measurement of the spectral characteristics of a multi-tone signal (amplitude, frequency, and phase). For this purpose, different signal processing techniques can be used, which can be classified as parametric or non-parametric methods. The first class includes methods that exploit a priori knowledge about signal parameters, such as the spectral shape of the signal to be processed. Thus, a self-configuring procedure based on a parametric algorithm should include a preliminary evaluation of the number of components. The choice of the right method among the several proposals in the literature is fundamental for any designer and, in particular, for developers of spectral analysis software for real-time applications and embedded devices, where timing and reliability constraints are hard to fulfil. Different aspects should be considered: the desired level of accuracy, the available processing resources (memory depth and processing speed), and the signal parameters. The present paper details a comparison of some of the most effective methods available in the literature for the spectral analysis of signals: the interpolated-FFT methods IFFT-2p, IFFT-3p, and IFFTc, which are based on an FFT algorithm and improve the spectral resolution of the DFT through interpolation techniques, and three parametric algorithms, MUSIC, ESPRIT, and IWPA. The methods considered for the comparison are briefly described, and references to the literature are given for each of them. Their behaviour is then analysed in terms of the systematic contribution and the uncertainty in the evaluated frequencies of the spectral tones of signals created from superimposed sinusoids and white Gaussian noise.
(This article belongs to the Collection Measurement Uncertainty)
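For orientation, a textbook two-point interpolated-DFT frequency estimator for a rectangular window is sketched below. It illustrates the interpolation idea behind the IFFT-type methods but is not a re-implementation of the specific IFFT-2p, IFFT-3p, or IFFTc algorithms compared in the paper.

```python
import numpy as np

def ipdft_two_point(signal):
    """Textbook two-point interpolated-DFT frequency estimate (rectangular
    window), returned in cycles per sample."""
    n = signal.size
    spectrum = np.abs(np.fft.rfft(signal))
    k = int(np.argmax(spectrum[1:-1])) + 1          # coarse peak bin
    # pick the larger neighbour and interpolate between the two bins
    if spectrum[k + 1] >= spectrum[k - 1]:
        alpha = spectrum[k + 1] / spectrum[k]
        delta = alpha / (1.0 + alpha)
    else:
        alpha = spectrum[k - 1] / spectrum[k]
        delta = -alpha / (1.0 + alpha)
    return (k + delta) / n

rng = np.random.default_rng(0)
f_true = 0.1237                                     # cycles per sample
t = np.arange(1024)
x = np.sin(2 * np.pi * f_true * t) + 0.05 * rng.standard_normal(t.size)
print(f"estimated: {ipdft_two_point(x):.5f}  true: {f_true}")
```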
Open Access Article
Designing Possibilistic Information Fusion—The Importance of Associativity, Consistency, and Redundancy
Metrology 2022, 2(2), 180-215; https://doi.org/10.3390/metrology2020012 - 11 Apr 2022
Abstract
One of the main challenges in designing information fusion systems is to decide on the structure and order in which information is aggregated. The key criteria by which topologies are constructed include the associativity of fusion rules as well as the consistency and redundancy of information sources. Fusion topologies constructed with regard to these criteria are flexible in design, produce maximally specific information, and are robust against unreliable or defective sources. In this article, an automated, data-driven design approach for possibilistic information fusion topologies is detailed that explicitly considers associativity, consistency, and redundancy. The proposed design is intended to handle epistemic uncertainty, that is, to yield robust topologies even when training data are lacking. The fusion design approach is evaluated on selected publicly available real-world datasets obtained from technical systems. Epistemic uncertainty is simulated by withholding parts of the training data. It is shown that, in this context, consistency as the sole design criterion results in topologies that are not robust. Including a redundancy metric leads to improved robustness in the case of epistemic uncertainty.
(This article belongs to the Collection Measurement Uncertainty)
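To make two of the central notions concrete, the sketch below combines two discrete possibility distributions with a renormalised conjunctive (min) rule when their consistency is high and a disjunctive (max) rule otherwise. The switching threshold and the example distributions are illustrative and do not reproduce the topology-design method of the article.

```python
import numpy as np

def consistency(pi1, pi2):
    """Degree of consistency of two possibility distributions
    (height of their pointwise minimum)."""
    return float(np.max(np.minimum(pi1, pi2)))

def fuse(pi1, pi2, h_min=0.5):
    """Simple adaptive possibilistic fusion: conjunctive (renormalised min)
    when the sources are consistent enough, disjunctive (max) otherwise.
    The h_min switch point is an illustrative design choice."""
    h = consistency(pi1, pi2)
    if h >= h_min:
        return np.minimum(pi1, pi2) / h, h
    return np.maximum(pi1, pi2), h

# Two sources describing the same quantity on a coarse grid.
grid = np.linspace(0.0, 10.0, 11)
pi_a = np.exp(-0.5 * ((grid - 4.0) / 1.5) ** 2)
pi_b = np.exp(-0.5 * ((grid - 5.0) / 1.5) ** 2)
fused, h = fuse(pi_a, pi_b)
print(f"consistency h = {h:.2f}")
print(np.round(fused, 2))
```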
Open Access Article
Measurement of Conducted Supraharmonic Emissions: Quasi-Peak Detection and Filter Bandwidth
Metrology 2022, 2(2), 161-179; https://doi.org/10.3390/metrology2020011 - 31 Mar 2022
Abstract
In modern power systems, the integration of renewable energy sources relies on dedicated inverters whose power electronic circuitry switches at high frequencies and causes conducted emissions in the supraharmonic range, i.e., from 9 to 150 kHz. In this regard, the normative framework still lacks a reference measurement method as well as a set of emission limits and performance requirements. From a metrological point of view, it is important to evaluate whether some of the power quality indices adopted for radiated emissions could also be transposed to this context. In particular, the paper considers a recent algorithm for the identification of supraharmonic components and discusses how its estimates affect the estimation of quasi-peak values. To this end, the paper describes the implementation of a fully digital approach and validates the results by means of an experimental comparison against a traditional quasi-peak detector. The proposed analysis confirms the potential of the considered approach and provides some interesting insights into the reliability of quasi-peak estimation in the supraharmonic range.
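The defining feature of a quasi-peak detector is its asymmetric charge/discharge behaviour, and a simple digital version is sketched below. The time constants are only indicative (the normative values per frequency band are defined in CISPR 16-1-1), and this is not the algorithm evaluated in the paper.

```python
import numpy as np

def quasi_peak(envelope, fs, tau_charge=45e-3, tau_discharge=500e-3):
    """Sample-by-sample quasi-peak detector applied to an envelope signal.
    Charges quickly when the input exceeds the output and discharges slowly
    otherwise; time constants here are illustrative only."""
    a_c = 1.0 - np.exp(-1.0 / (fs * tau_charge))
    a_d = 1.0 - np.exp(-1.0 / (fs * tau_discharge))
    out = np.empty_like(envelope)
    y = 0.0
    for i, x in enumerate(envelope):
        y += (a_c if x > y else a_d) * (x - y)
        out[i] = y
    return out

# Bursty 50 kHz-style emission: the quasi-peak reading depends on repetition rate.
fs = 1_000_000
t = np.arange(int(0.2 * fs)) / fs
burst = np.sin(2 * np.pi * 50e3 * t) * (np.sin(2 * np.pi * 20 * t) > 0.95)
qp = quasi_peak(np.abs(burst), fs)
print(f"peak = {np.abs(burst).max():.2f}, quasi-peak = {qp.max():.2f}")
```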
Open Access Article
Analysis of Vector Network Analyzer Thermal Drift Error
Metrology 2022, 2(2), 150-160; https://doi.org/10.3390/metrology2020010 - 23 Mar 2022
Abstract
Ensuring high accuracy when measuring the parameters of devices under test is an important task when conducting research in the terahertz frequency range. The purpose of this paper is a practical study of the thermal drift errors of a vector network analyzer using low-terahertz-frequency extender modules. For this, the time dependence of the measurement error was analysed using a system based on a Keysight N5247B vector network analyzer with extender modules covering the frequency ranges of 220–330 GHz, 500–750 GHz, and 750–1100 GHz. The results of our experiment showed that the measurement error decreased rapidly during the first half hour of warm-up and stabilized about 3 h after the equipment was turned on. These results allow the necessary warm-up time to be estimated depending on the required measurement accuracy, making it possible to optimize the experiment and reduce its duration.
Open Access Article
The GUM Tree Calculator: A Python Package for Measurement Modelling and Data Processing with Automatic Evaluation of Uncertainty
Metrology 2022, 2(1), 128-149; https://doi.org/10.3390/metrology2010009 - 15 Mar 2022
Abstract
There is currently interest in the digitalisation of metrology because technologies that can measure, analyse, and make critical decisions autonomously are beginning to emerge. Such systems should support the notions of metrological traceability and measurement uncertainty, following the recommendations in the Guide to the Expression of Uncertainty in Measurement (GUM); however, the GUM offers no specific guidance on a suitable format. Here, we report on a Python package that implements algorithmic data processing using ‘uncertain numbers’, which satisfy the general criteria in the GUM for an ideal format to express uncertainty. An uncertain number can represent a physical quantity that has not been determined exactly. Using uncertain numbers, measurement models can be expressed clearly and succinctly in terms of the quantities involved. The algorithms and simple data structures we use provide an example of how metrological traceability can be supported in digital systems. In particular, uncertain numbers provide a format for capturing and propagating detailed information about the quantities that influence a measurement along the various stages of a traceability chain. This more detailed information about influence quantities can be exploited to extract more value from results for users at the end of a traceability chain.
(This article belongs to the Collection Measurement Uncertainty)
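To illustrate the uncertain-number idea, the toy class below carries a value together with uncertainty components labelled by the influence quantity they stem from and propagates them to first order through arithmetic. It is a minimal sketch of the concept only, not the API of the GUM Tree Calculator package described in the paper.

```python
import math

class UncertainNumber:
    """Toy first-order 'uncertain number': a value plus uncertainty components
    indexed by the influence quantity they stem from (illustrative only)."""

    def __init__(self, value, components):
        self.value = value
        self.components = dict(components)        # {influence label: contribution}

    @classmethod
    def elementary(cls, value, u, label):
        return cls(value, {label: u})

    @property
    def u(self):                                  # combined standard uncertainty
        return math.sqrt(sum(c * c for c in self.components.values()))

    def _combine(self, other, value, s_self, s_other):
        comps = {k: s_self * v for k, v in self.components.items()}
        for k, v in other.components.items():
            comps[k] = comps.get(k, 0.0) + s_other * v
        return UncertainNumber(value, comps)

    def __add__(self, other):                     # sensitivities 1, 1
        return self._combine(other, self.value + other.value, 1.0, 1.0)

    def __mul__(self, other):                     # sensitivities y, x
        return self._combine(other, self.value * other.value, other.value, self.value)

# V = I * R, with every uncertainty component traceable to its influence quantity.
I = UncertainNumber.elementary(0.100, 0.5e-3, "current")
R = UncertainNumber.elementary(1000.0, 0.2, "resistance")
V = I * R
print(V.value, round(V.u, 4), V.components)
```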
Open Access Article
GUM-Compliant Uncertainty Evaluation Using Virtual Experiments
Metrology 2022, 2(1), 114-127; https://doi.org/10.3390/metrology2010008 - 01 Mar 2022
Abstract
A virtual experiment simulates a real measurement process by means of a numerical model. The numerical model produces virtual data whose properties reflect those of the data observed in the real experiment. In this work, we explore how the results of a virtual experiment can be employed in the context of uncertainty evaluation for a corresponding real experiment. The uncertainty evaluation was based on the Guide to the Expression of Uncertainty in Measurement (GUM), which defines the de facto standard for uncertainty evaluation in metrology. We show that, under specific assumptions about model structure and variance of the data, virtual experiments in combination with a Monte Carlo method lead to an uncertainty evaluation for the real experiment that is in line with Supplement 1 to the GUM. In the general case, a GUM-compliant uncertainty evaluation in the context of a real experiment can no longer be based on a corresponding virtual experiment in a simple way. Nevertheless, virtual experiments are still useful in order to increase the reliability of an uncertainty analysis. Simple generic examples as well as the case study of a virtual coordinate measuring machine are presented to illustrate the treatment.
(This article belongs to the Special Issue Virtual Measuring Systems and Digital Twins)
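A minimal sketch of the interplay between a virtual experiment and a Monte Carlo uncertainty evaluation, assuming a deliberately simple measurement model (repeated indications plus an offset correction); the model, the numbers, and the comparison with a GUM-style propagation are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simple measurement model: indication = length + offset + noise.
true_offset_u = 0.010      # standard uncertainty of the offset correction (mm)
noise_sigma = 0.020        # repeatability of a single indication (mm)
n_repeats = 10

def analyse(indications, offset):
    """The data-processing step shared by the real and the virtual experiment."""
    return indications.mean() - offset

def virtual_experiment(length):
    """Generate virtual data with the same statistical properties as real data."""
    offset = rng.normal(0.0, true_offset_u)
    indications = length + offset + rng.normal(0.0, noise_sigma, n_repeats)
    return analyse(indications, 0.0)   # the evaluation does not know the drawn offset

# Monte Carlo over many virtual experiments around a nominal length of 10 mm.
results = np.array([virtual_experiment(10.0) for _ in range(20000)])
u_mc = results.std(ddof=1)
u_gum = np.hypot(noise_sigma / np.sqrt(n_repeats), true_offset_u)
print(f"u from virtual experiments: {u_mc:.4f} mm,  GUM propagation: {u_gum:.4f} mm")
```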
Open Access Article
Noise Limitations in Multi-Fringe Readout of Laser Interferometers and Resonators
Metrology 2022, 2(1), 98-113; https://doi.org/10.3390/metrology2010007 - 19 Feb 2022
Abstract
Laser interferometers that operate over a dynamic range exceeding one wavelength are used as compact displacement sensors for gravitational wave detectors and inertial sensors, and in a variety of other high-precision applications. A number of approaches are available to extract the phase from such interferometers by implementing so-called phasemeters, algorithms that provide a linearised phase estimate. While many noise sources have to be considered for any given scheme, they are fundamentally limited by additive noise in the readout, such as electronic, digitisation, and shot noise, which manifests as an effective white phase noise in the phasemeter output. We calculated and compared the Cramér–Rao lower bound for phasemeters of some state-of-the-art two-beam interferometer schemes and derived their noise limitations for sub-fringe operation and for multi-fringe readout schemes. From this, we derived achievable noise performance levels for one of these interferometer techniques, deep-frequency modulation interferometry. We then applied our analysis to optical resonators and showed that frequency scanning techniques can, in theory, benefit from resonant enhancement, indicating that sensitivities can be improved in future sensors.
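For orientation, the textbook Cramér–Rao lower bound for estimating the phase of a single tone of amplitude A from N samples in additive white Gaussian noise of variance σ² (amplitude and frequency known) is

```latex
\operatorname{var}(\hat{\varphi}) \;\ge\; \frac{2\sigma^{2}}{N A^{2}}
  \;=\; \frac{1}{N\,\mathrm{SNR}},
\qquad \mathrm{SNR} = \frac{A^{2}}{2\sigma^{2}},
```

i.e., additive readout noise maps onto a white phase-noise floor that improves with the number of samples and the signal-to-noise ratio; the bounds derived in the paper for specific phasemeter schemes refine this simple picture.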
Open Access Article
Experimental Design for Virtual Experiments in Tilted-Wave Interferometry
Metrology 2022, 2(1), 84-97; https://doi.org/10.3390/metrology2010006 - 17 Feb 2022
Abstract
The tilted-wave interferometer (TWI) is a recent and promising technique for the optical measurement of aspheres and freeform surfaces; it combines an elaborate experimental setup with sophisticated data analysis algorithms. There are, however, many parameters that influence its performance, and greater knowledge about the behavior of the TWI is needed before it can be established as a measurement standard. Virtual experiments are an appropriate tool for this purpose, and in this paper we present a digital twin of the TWI that was carefully designed for such experiments. The expensive numerical calculations involved, combined with the existence of multiple influencing parameters, limit the number of virtual experiments that are feasible, which poses a challenge to researchers. Experimental design is a statistical technique that allows virtual experiments to be planned so as to maximize information gain. We applied experimental design to virtual TWI experiments with the goal of identifying the main sources of uncertainty. The results from this work are presented here.
(This article belongs to the Special Issue Virtual Measuring Systems and Digital Twins)
Open Access Article
Characterization of Surface Topography Features for the Effect of Process Parameters and Their Correlation to Quality Monitoring in Metal Additive Manufacturing
Metrology 2022, 2(1), 73-83; https://doi.org/10.3390/metrology2010005 - 07 Feb 2022
Abstract
The layer-by-layer deposition methodology of metal additive manufacturing (AM) and the influence of processing parameters, such as energy source level and deposition speed, which change the melt pool condition, are known to be important factors determining the properties of components fabricated via AM. The effect of melt pool conditions and geometry on the properties and quality of fabricated AM components has been widely studied through experimental and simulation techniques. A better understanding is needed of how the topography of the solidified melt pool influences the characteristics of the next deposition layer, particularly for complex surfaces with sparse topographic features such as those that occur in AM deposition layers. The topography of the deposited layers strongly affects the bonding condition between layers and the defect generation mechanism, and characterizing its features offers a new perspective on defect generation mechanisms and on the quality evaluation of AM components. In this work, a feature-based topography study is proposed for assessing the influence of process parameters on the topography of AM deposition layers and on the defect generation mechanism. Titanium alloy (Ti6Al4V) samples deposited on a steel substrate by the direct energy deposition (DED) AM technique under different process conditions were used for the assessment. Topography datasets were acquired, the shape and size differences of the relevant topographic features were analysed, and the potential defect generation mechanism was discussed for the different process parameters. The topographic features were then correlated with previously published in-situ monitoring and quality evaluation results: signature topographic formations and their relation to in-situ acoustic process monitoring provided useful indicators of the manufacturing process behavior and performance.
Open Access Review
Surface-Sensing Principle of Microprobe System for Micro-Scale Coordinate Metrology: A Review
Metrology 2022, 2(1), 46-72; https://doi.org/10.3390/metrology2010004 - 20 Jan 2022
Cited by 1
Abstract
Micro-coordinate measuring machines (micro-CMMs) for measuring microcomponents require a probe system with a probe tip diameter of several tens to several hundreds of micrometers. Scale effects come into play for such a small probe tip: the tip tends to stick to the measurement surface via surface adhesion forces, which significantly degrade probing resolution and repeatability. Therefore, to realize micro-CMMs, many researchers have proposed microprobe systems that use surface-sensing principles different from those of conventional CMM probes. This review focuses on the surface-sensing principles of microprobe systems and their characteristics. First, the proposed microprobe systems are summarized and the trends in probe performance are identified. Then, individual microprobe systems with different sensing principles are described to clarify the performance of each sensing principle. By comprehensively summarizing multiple types of probe systems and discussing their characteristics, this study contributes to identifying the performance limitations of the proposed microprobe systems. Accordingly, the future development of micro-CMM probes is discussed.
Open Access Article
The Storage within Digital Calibration Certificates of Uncertainty Information Obtained Using a Monte Carlo Method
Metrology 2022, 2(1), 33-45; https://doi.org/10.3390/metrology2010003 - 18 Jan 2022
Abstract
Supplement 1 to the ‘Guide to the expression of uncertainty in measurement’ describes a Monte Carlo method as a general numerical approach to uncertainty evaluation. Application of the approach typically delivers a large number of values of the output quantity of interest, from which summary information such as an estimate of the quantity, its associated standard uncertainty, and a coverage interval for the quantity can be obtained and reported. This paper considers the use of a Monte Carlo method for uncertainty evaluation in calibration, using two examples to demonstrate how so-called ‘digital calibration certificates’ can allow the complete set of results of a Monte Carlo calculation to be reported.
(This article belongs to the Collection Measurement Uncertainty)
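A hedged sketch of what reporting the complete set of Monte Carlo results might look like in a machine-readable record; the JSON layout and field names below are purely illustrative and are not the DCC schema or the formats discussed in the paper.

```python
import json
import numpy as np

rng = np.random.default_rng(7)

# Monte Carlo evaluation (GUM Supplement 1 style) for a toy gauge-block length.
draws = 1.000012 + rng.normal(0.0, 2.5e-7, 10000)       # metres
estimate = float(np.mean(draws))
u = float(np.std(draws, ddof=1))
low, high = (float(q) for q in np.quantile(draws, [0.025, 0.975]))

# Illustrative record only -- not an actual digital-calibration-certificate schema.
record = {
    "quantity": "length",
    "unit": "m",
    "summary": {"estimate": estimate, "standardUncertainty": u,
                "coverageInterval95": [low, high]},
    "monteCarlo": {"numberOfDraws": draws.size,
                   "draws": np.round(draws, 9).tolist()},   # complete set of values
}
print(json.dumps(record)[:200], "...")
```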
Open Access Article
Effect of a Misidentified Centre of a Type ASG Material Measure on the Determined Topographic Spatial Resolution of an Optical Point Sensor
Metrology 2022, 2(1), 19-32; https://doi.org/10.3390/metrology2010002 - 05 Jan 2022
Abstract
The article presents the determination of the topographic spatial resolution of an optical point sensor. It is quantified by the lateral period limit measured on a type ASG material measure, also called a (topographic) Siemens star, with a confocal sensor, following both the radial measurement and evaluation proposed by ISO 25178-70 and the measurement and subsequent evaluation of two line scans proposed by the NPL Good Practice Guide. As will be shown, for the latter, an only slightly misidentified target centre of the Siemens star leads to quite significant errors in the determined lateral period limit. Remarkably, a misidentified target centre does not necessarily result in an overestimation of the lateral period limit; lower values might also be obtained. Therefore, a modified Good Practice Guide procedure is proposed to determine the lateral period limit more accurately, as it also includes a thorough determination of the centre of the Siemens star. While the measurement and evaluation effort is increased slightly compared to the NPL Good Practice Guide, it is still much faster than a complete radial measurement and evaluation.
Open Access Article
Systematic Distortion Factor and Unrecognized Source of Uncertainties in Nuclear Data Measurements and Evaluations
Metrology 2022, 2(1), 1-18; https://doi.org/10.3390/metrology2010001 - 24 Dec 2021
Abstract
Each experiment provides new information about the value of some physical quantity. However, not only the measured values but also the uncertainties assigned to them are an important part of the results. The metrological guides provide recommendations for the presentation of the uncertainties of measurement results: the statistical and systematic components of the uncertainties should be explained, estimated, and presented separately with the results of the measurements. The experimental set-ups and the models used to derive physical values from the primary measured quantities are products of human activity, which makes this a rather subjective field. A Systematic Distortion Factor (SDF) may exist in any experiment; it leads to a bias of the measured value from the unknown “true” value. The SDF appears as a real physical effect if it is not removed by additional measurements or analysis. For a set of measured data with the best evaluated true value, differences beyond the stated uncertainties can be explained by the presence of Unrecognized Sources of Uncertainty (USU) in the data. We can link the presence of USU in the data with the presence of an SDF in the results of measurements. The paper demonstrates the existence of SDFs in measurements of Prompt Fission Neutron Spectra (PFNS), of fission cross sections, and of Maxwellian-spectrum-averaged neutron capture cross sections for astrophysical applications. The paper also discusses introducing and accounting for USU in data evaluation in cases where the SDF cannot be eliminated. As an example, the model case of the 238U(n,f)/235U(n,f) cross-section ratio evaluation is demonstrated.
(This article belongs to the Collection Measurement Uncertainty)
Open Access Article
Digital Representation of Measurement Uncertainty: A Case Study Linking an RMO Key Comparison with a CIPM Key Comparison
Metrology 2021, 1(2), 166-181; https://doi.org/10.3390/metrology1020011 - 06 Dec 2021
Cited by 1
Abstract
This paper considers a future scenario in which digital reporting of measurement results is ubiquitous and digital calibration certificates (DCCs) contain information about the components of uncertainty in a measurement result. The task of linking international measurement comparisons is used as a case study to look at the benefits of digitalization. Comparison linking provides a context in which correlations are important, so the benefit of passing a digital record of contributions to uncertainty along a traceability chain can be examined. The International Committee for Weights and Measures (CIPM) uses a program of international “key comparisons” to establish the extent to which measurements of a particular quantity may be considered equivalent when made in different economies. To obtain good international coverage, the results of the comparisons may be linked together: a number of regional metrology organization (RMO) key comparisons can be linked back to an initial CIPM key comparison. Specific information about systematic effects in participants’ results must be available during linking to allow correct treatment of the correlations. However, the conventional calibration certificate formats used today do not provide this: participants must submit additional data, and the report of an initial comparison must anticipate the requirements of future linking. Special handling of additional data can be laborious and prone to error. This case study considered an uncertain-number digital reporting format, which captures all the information required and would simplify the comparison analysis, reporting, and linking; the format would also enable a more informative presentation of comparison results. The uncertain-number format would be useful more generally in measurement scenarios where correlations arise, so its incorporation into DCCs should be considered. A full dataset supported by open-source software is available.
(This article belongs to the Collection Measurement Uncertainty)
Open Access Article
Methodology to Create Reproducible Validation/Reference Materials for Comparison of Filter-Based Measurements of Carbonaceous Aerosols That Measure BC, BrC, EC, OC, and TC
Metrology 2021, 1(2), 142-165; https://doi.org/10.3390/metrology1020010 - 26 Nov 2021
Abstract
A simple method has been developed and evaluated that reproducibly creates validation/reference materials for comparing methods that measure the carbonaceous content of atmospheric particulate matter deposited on filter media at concentrations relevant to atmospheric levels. Commonly used methods to determine the major carbonaceous components of particles collected on filters include optical attenuation for “Black” (BC) and “Brown” (BrC) carbon, thermal-optical analysis (TOA) for “Elemental” (EC) and “Organic” (OC) carbon, and total combustion for “Total” carbon (TC). The new method uses a commercial inkjet printer to deposit ink containing both organic and inorganic components onto filter substrates at programmable print densities (print levels, as specified by the printer–software combination). A variety of filter media were evaluated. The optical attenuation (ATN) of the deposited sample was determined at 880 nm and 370 nm. Reproducibility or precision (as standard deviation, or in percent as coefficient of variation) in ATN for Teflon-coated glass-fiber, Teflon, and cellulose substrates was better than 5%; for the other substrates it was better than 15%. EC and OC measured on quartz-fiber filters (QFF), compared to ATN measured at 880 nm and 370 nm on either QFF or Teflon-coated glass-fiber, yielded R² > 0.92 and > 0.97, respectively. Four independent laboratories participated in a round-robin study together with the reference laboratory. The propagated standard deviation among the five groups across all print levels was <2.2 ATN at 880 nm and <2.7 ATN at 370 nm, with a coefficient of variation of <2% at ~100 ATN.
Open Access Article
Three-Dimensional Transfer Functions of Interference Microscopes
Metrology 2021, 1(2), 122-141; https://doi.org/10.3390/metrology1020009 - 09 Nov 2021
Cited by 1
Abstract
Three-dimensional transfer functions (3D TFs) are generally assumed to fully describe the transfer behavior of optical topography measuring instruments, such as coherence scanning interferometers, in the spatial frequency domain. 3D TFs are therefore supposed to be independent of the surface under investigation, resulting in a clear separation of surface properties and transfer characteristics. In this paper, we show that the 3D TF of an interference microscope differs depending on whether the object is specularly reflecting or consists of point scatterers. In addition to the 3D TF of a point scatterer, we derive an analytical expression for the 3D TF corresponding to specular surfaces and demonstrate that this is the most relevant case in practical applications of coherence scanning interferometry (CSI). We additionally study the effects of temporal coherence and show that in conventional CSI temporal coherence effects dominate. However, narrowband light sources are advantageous if high-spatial-frequency components of weak phase objects are to be resolved, whereas for low-frequency phase objects of higher amplitude the temporal coherence has less influence. Finally, we present an approach that explains the different transfer characteristics of coherence peak and phase detection in CSI signal analysis.
Open Access Article
Measuring Salinity and Density of Seawater Samples with Different Salt Compositions and Suspended Materials
Metrology 2021, 1(2), 107-121; https://doi.org/10.3390/metrology1020008 - 01 Nov 2021
Abstract
Determining the mass of dissolved solutes in seawater from in situ measurements in seas and oceans is currently an unresolved problem. Solving it requires the development of both new measurement methods and new instruments. The authors of this article analyzed methods for the indirect measurement of salinity and density using parameters that can be measured in situ, including relative electrical conductivity, speed of sound, temperature, and hydrostatic pressure. The authors propose an electrical conductivity sensor design that allows data on solid suspensions to be obtained by measuring the electrode impedance at various alternating-current frequencies. The authors also analyzed the joint measurement technique using Conductivity-Temperature-Depth (CTD) and Sound Velocity Profiler (SVP) devices in a marine testing area. Based on the results of these joint measurements, the authors present tests of water samples of various salt compositions for the presence of solid suspensions.
Topics
Topic in Automation, Fibers, Metrology, Photonics, Sensors
Advance and Applications of Fiber Optic Measurement
Topic Editors: Flavio Esposito, Stefania Campopiano, Agostino Iadicicco. Deadline: 31 March 2023
Topic in Applied Sciences, Metrology, Sensors, Photonics, Machines
Manufacturing Metrology
Topic Editors: Fang Cheng, Qian Wang, Tegoeh Tjahjowidodo, Ziran Chen. Deadline: 31 May 2023

Special Issues
Special Issue in Metrology
New Trends and Advances in Manufacturing Metrology
Guest Editor: Shivakumar Raman. Deadline: 31 July 2022
Special Issue in Metrology
Frequency Metrology
Guest Editor: Gianluca Galzerano. Deadline: 30 October 2022
Special Issue in Metrology
Women’s Special Issue Series: Metrology
Guest Editors: Annalisa Liccardo, Samanta Piano. Deadline: 31 March 2023