Measurement Uncertainty

Printed Edition Available!
A printed edition of this Topical Collection is available.

Editor

Prof. Dr. Simona Salicone, Collection Editor
Dipartimento di Elettronica, Informazione e Bioingegneria (DEIB), Politecnico di Milano, 20133 Milano, Italy
Interests: measurement uncertainty; definition of mathematical theories to handle measurement uncertainty; digital signal processing

Topical Collection Information

Dear Colleagues,

Metrology is the scientific study of measurement. In our everyday life, we are constantly surrounded by measurements: from reading the time to weighing apples, we continuously measure something. Measurements also lie behind the objects around us; the apple we buy, for example, has already been measured, before arriving at our greengrocer's, to determine its caliber. In all these measurements, uncertainty plays a very important role. Metrologists know that no measurement makes sense without an associated uncertainty value: without it, no decision can be taken, no comparison can be made, and no conformity can be assessed.

It is hence pivotal to know the meaning of measurement uncertainty, to understand the contributions to measurement uncertainty, to know how these contributions affect the final measurement uncertainty, and to have a mathematical tool to represent measurement uncertainty and propagate it through the measurement procedure.

Many important contributions have been published in the literature in recent years, providing different solutions to the problem of representing and processing measurement uncertainty. While some of them adopt probability theory as their mathematical framework, others consider different, more recent mathematical theories, such as Shafer's theory of evidence.

These contributions would fit well under the umbrella of Metrology's Topical Collection "Measurement Uncertainty". This Topical Collection aims to gather such contributions and to open a discussion among the different authors.

Prof. Dr. Simona Salicone
Collection Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the collection website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Metrology is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1000 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Uncertainty contributions
  • Systematic contributions
  • Random contributions
  • Probability theory
  • Possibility theory
  • Imprecise probabilities

Published Papers (18 papers)

2024


18 pages, 6447 KiB  
Article
Novel Method of Fitting a Nonlinear Function to Data of Measurement Based on Linearization by Change Variables, Examples and Uncertainty
by Zygmunt L. Warsza, Jacek Puchalski and Tomasz Więcek
Metrology 2024, 4(4), 718-735; https://doi.org/10.3390/metrology4040042 - 3 Dec 2024
Abstract
This paper presents a novel method for determining the parameters and uncertainty bands of nonlinear functions fitted to measurement data. In this procedure, one or two new variables are introduced to linearize the function so that the linear regression method can be used. The best parameters of the straight line in the new variables are adjusted to the transformed coordinates of the tested points according to the weighted total least-squares criterion WTLS, or WTLS-C if the data points are also correlated. Uncertainties of the measured points are found according to the rules of the GUM. The parameters and the uncertainty band of the nonlinear function follow from the parameters of this straight line and of its uncertainty band. A few examples determining the parameters and uncertainty bands of different types of nonlinear functions are presented, together with examples of measurements using the presented method and conclusions.
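The change-of-variables idea can be illustrated in a few lines. The sketch below is a minimal stand-in for the paper's procedure: it linearizes y = a*exp(b*x) with z = ln(y), fits the straight line by ordinary least squares, and back-propagates the parameter uncertainties to first order. The paper's WTLS/WTLS-C criterion additionally weights both coordinates and handles correlated data points, which plain least squares does not.

```python
# Illustrative only: linearize y = a*exp(b*x) via z = ln(y), fit the line
# z = c0 + c1*x by ordinary least squares, then back-transform the parameters
# and their uncertainties to first order (GUM-style propagation).
import numpy as np

rng = np.random.default_rng(1)
a_true, b_true = 2.0, 0.5
x = np.linspace(0.0, 4.0, 20)
y = a_true * np.exp(b_true * x) * (1.0 + 0.02 * rng.standard_normal(x.size))

z = np.log(y)                            # change of variable linearizes the model
A = np.vstack([np.ones_like(x), x]).T    # design matrix of the straight line
coef, *_ = np.linalg.lstsq(A, z, rcond=None)

resid = z - A @ coef                     # residual scatter -> parameter covariance
s2 = resid @ resid / (len(z) - 2)
cov = s2 * np.linalg.inv(A.T @ A)

a_hat, b_hat = np.exp(coef[0]), coef[1]  # back-transform: a = exp(c0)
u_a = a_hat * np.sqrt(cov[0, 0])         # first-order propagation: u(a) = a*u(c0)
u_b = np.sqrt(cov[1, 1])
print(f"a = {a_hat:.3f} +/- {u_a:.3f}, b = {b_hat:.3f} +/- {u_b:.3f}")
```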

13 pages, 358 KiB  
Article
Using a Multivariate Virtual Experiment for Uncertainty Evaluation with Unknown Variance
by Manuel Marschall, Finn Hughes, Gerd Wübbeler, Gertjan Kok, Marcel van Dijk and Clemens Elster
Metrology 2024, 4(4), 534-546; https://doi.org/10.3390/metrology4040033 - 1 Oct 2024
Abstract
Virtual experiments are a digital representation of a real measurement and play a crucial role in modern measurement science and metrology. Beyond their common usage as a modeling and validation tool, a virtual experiment may also be employed to perform a parameter sensitivity analysis or to carry out a measurement uncertainty evaluation. For the latter to be compliant with statistical principles and metrological guidelines, the procedure to obtain an estimate and a corresponding measurement uncertainty requires careful consideration. We employ a Monte Carlo sampling procedure using a virtual experiment that allows one to perform a measurement uncertainty evaluation according to the Monte Carlo approach of JCGM-101 and JCGM-102, two widely applied guidelines for uncertainty evaluation in metrology. We extend and formalize a previously published approach for simple additive models to account for a large class of non-linear virtual experiments and measurement models, for multidimensionality of the data and output quantities, and for the case of unknown variance of repeated measurements. With the algorithm developed here, a simple procedure for the evaluation of measurement uncertainty is provided that may be applied in various applications that admit a certain structure for their virtual experiment. Moreover, the measurement model commonly employed for uncertainty evaluation according to JCGM-101 and JCGM-102 is not required for this algorithm; only evaluations of the virtual experiment are performed to obtain an estimate and an associated uncertainty of the measurand. We demonstrate the efficacy of the developed approach and the effect of the underlying assumptions for a generic polynomial regression example and an example of a simplified coordinate measuring machine and its virtual representation. The results of this work highlight that considerable effort, diligence, and statistical considerations need to be invested to make use of a virtual experiment for uncertainty evaluation in a way that ensures equivalence with the accepted guidelines.
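For readers unfamiliar with the JCGM-101 propagation mentioned above, the core Monte Carlo recipe is short. The sketch below uses a generic toy model (Y = X1*X2 with assumed input distributions); it is not the authors' virtual-experiment algorithm.

```python
# Generic JCGM-101-style Monte Carlo propagation: draw from the input
# distributions, evaluate the model, and summarize the output sample.
import numpy as np

rng = np.random.default_rng(42)
M = 200_000                              # number of Monte Carlo trials
x1 = rng.normal(10.0, 0.02, M)           # Gaussian input, u(x1) = 0.02
x2 = rng.uniform(0.99, 1.01, M)          # rectangular (Type B) input

y = x1 * x2                              # measurement model Y = X1 * X2
lo, hi = np.quantile(y, [0.025, 0.975])  # 95% coverage interval
print(f"y = {y.mean():.4f}, u(y) = {y.std(ddof=1):.4f}, "
      f"95% interval = [{lo:.4f}, {hi:.4f}]")
```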

17 pages, 1824 KiB  
Article
Impact of Angular Speed Calculation Methods from Encoder Measurements on the Test Uncertainty of Electric Motor Efficiency
by João P. Z. Machado, Gabriel Thaler, Antonio L. S. Pacheco and Rodolfo C. C. Flesch
Metrology 2024, 4(2), 164-180; https://doi.org/10.3390/metrology4020011 - 2 Apr 2024
Abstract
The imperative need to develop more efficient electric motors requires the meticulous measurement of small efficiency increments while minimizing the associated uncertainty in dynamometer tests. One of the key variables in such tests is the angular speed, which is typically obtained from encoder measurements. This paper proposes a systematic measurement uncertainty assessment method, based on the Guide to the Expression of Uncertainty in Measurement, for the two most widely used methods for angular speed measurement, namely the frequency and period methods. In addition, the impact of the angular speed calculation method on the efficiency test uncertainty is assessed using an automatic test rig for electric motors. The experimental results consider both steady-state and dynamic analyses. They show that the period method provides measurements with lower uncertainty for the encoders typically used in such test rigs, about 30 times lower than the uncertainty determined for the frequency method. Based on these results, the choice of a proper method can drastically decrease the angular speed uncertainty, and consequently the motor efficiency uncertainty, without increasing instrumentation cost.
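A back-of-envelope comparison shows why the calculation method matters. The sketch below contrasts the quantization-limited relative uncertainty of the frequency method (counting pulses in a fixed gate) with the period, or reciprocal, method (timing a whole number of pulse periods with a fast clock); all numbers are assumed for illustration and are not taken from the paper's budget.

```python
# Back-of-envelope sketch with assumed figures: quantization-limited relative
# uncertainty of angular speed for the frequency vs. period methods.
import math

ppr = 1024            # encoder pulses per revolution (assumed)
n = 1800.0            # shaft speed, rev/min (assumed)
gate = 0.1            # s, measuring window
f_clk = 10e6          # Hz, timer clock for the period method

pulse_rate = n / 60.0 * ppr               # encoder pulses per second
T = 1.0 / pulse_rate                      # one pulse period, s

# Frequency method: +/-1 pulse over the gate (rectangular distribution).
u_rel_freq = 1.0 / (pulse_rate * gate) / math.sqrt(3)

# Period method: +/-1 clock tick over a whole number of periods ~ the gate.
t_meas = round(pulse_rate * gate) * T
u_rel_per = (1.0 / f_clk) / t_meas / math.sqrt(3)

print(f"frequency method: {u_rel_freq:.2e}")
print(f"period method:    {u_rel_per:.2e}  ({u_rel_freq / u_rel_per:.0f}x smaller)")
```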

23 pages, 8045 KiB  
Article
Statistical Analysis of Measurement Processes Using Multi-Physic Instruments: Insights from Stitched Maps
by Clement Moreau, Julie Lemesle, David Páez Margarit, François Blateyron and Maxence Bigerelle
Metrology 2024, 4(2), 141-163; https://doi.org/10.3390/metrology4020010 - 26 Mar 2024
Abstract
Stitching methods allow one to measure a wider surface without loss of resolution, making it possible to observe small details with a better topographical representation. However, stitching may also introduce errors or aberrations into the topography reconstruction. A device including confocal microscopy (CM), focus variation (FV), and coherence scanning interferometry (CSI) instrument modes was used to chronologically follow the drifts and the repositioning errors on stitched topographies. According to a complex measurement plan, a wide measurement campaign was performed on TA6V specimens that were ground with two neighboring SiC FEPA grit papers (P#80 and P#120). Thanks to four indicators (quality, drift, stability, and relevance indexes), no measurement drift in the system was found, indicating controlled stitching and repositioning processes for interferometry, confocal microscopy, and focus variation. Measurements show commendable stability, with interferometric microscopy being the most robust, followed by confocal microscopy and then focus variation. Despite variations, robustness remains constant for each grinding grit, minimizing interpretation biases. A bootstrap analysis reveals time-dependent robustness for confocal microscopy, potentially linked to human presence. Despite Sa value discrepancies, all three instrument modes consistently discriminate between grinding grits, highlighting the reliability of the proposed methodology.
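The bootstrap check mentioned above is conceptually simple; a minimal sketch on synthetic Sa values (invented data, not the measured topographies) looks like this:

```python
# Sketch of a bootstrap robustness check on an areal roughness parameter
# (synthetic Sa values stand in for the measured topography data).
import numpy as np

rng = np.random.default_rng(0)
sa = rng.normal(0.42, 0.03, 30)              # 30 synthetic Sa values, in µm

boot_means = np.array([rng.choice(sa, sa.size, replace=True).mean()
                       for _ in range(10_000)])
lo, hi = np.quantile(boot_means, [0.025, 0.975])
print(f"Sa = {sa.mean():.3f} µm, bootstrap 95% CI = [{lo:.3f}, {hi:.3f}]")
```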

2023


8 pages, 1275 KiB  
Communication
Improving Experimental Design through Uncertainty Analysis
by Ian M. Hobbs, Joey A. Charboneau and Todd L. Jacobsen
Metrology 2023, 3(3), 246-253; https://doi.org/10.3390/metrology3030014 - 28 Jun 2023
Abstract
In this paper, the development of a fission-gas-collecting and physical-analysis-enabling instrument was proposed for small-volume determination. Analysis specifications require a design capable of accurately and repeatably determining volumes in the range of 0.07–2.5 mL. The system relies on a series of gas expansions originating from a cylinder with a known internal volume, and the combined gas law is used to derive the unknown volumes from these expansions. Initial system designs included one of two known volumes, 11.85 ± 0.34 mL and 5.807 ± 0.078 mL, with a manifold volume of 32 mL. Results obtained from modeling this system's operation showed that a 0.07 mL volume would be determined with a relative expanded uncertainty greater than 300% (k = 2) for a single replicate, which was unacceptable for the proposed experimental design. Initial modeling also showed that the volume connecting the known volume and the rodlet, i.e., the manifold volume, and the sensitivity of the pressure sensor were the key contributors to the expanded uncertainty of the measured rodlet volume. The system's design limited the available options for pressure sensors, so emphasis was placed on the design of the manifold volume. The final system design reduced the manifold volume to 17 mL. These design changes, combined with replicate analysis, reduced the relative expanded uncertainty to ±12% (k = 2) for the 0.07 mL volume.
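The expansion principle can be checked with a compact Monte Carlo model. The sketch below assumes an isothermal ideal gas, an evacuated unknown side, and invented pressure and manifold uncertainties, so it only mimics the behaviour described above; note how the small 0.07 mL volume inherits a very large relative uncertainty.

```python
# Isothermal ideal-gas sketch with assumed numbers (not the paper's model):
# gas at p1 in the known volume Vk expands into manifold Vm plus rodlet Vu:
#   p1*Vk = p2*(Vk + Vm + Vu)  =>  Vu = Vk*(p1/p2 - 1) - Vm
import numpy as np

rng = np.random.default_rng(7)
M = 100_000
Vk = rng.normal(5.807, 0.078 / 2, M)    # mL, known volume (u from the k=2 value)
Vm = rng.normal(17.0, 0.05, M)          # mL, manifold volume (assumed u)
p2_nom = 100.0 * 5.807 / (5.807 + 17.0 + 0.07)   # kPa, consistent with Vu = 0.07 mL
p1 = rng.normal(100.0, 0.05, M)         # kPa, before expansion (assumed u)
p2 = rng.normal(p2_nom, 0.05, M)        # kPa, after expansion (assumed u)

Vu = Vk * (p1 / p2 - 1.0) - Vm
print(f"Vu = {Vu.mean():.3f} mL, u(Vu) = {Vu.std(ddof=1):.3f} mL")
```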

15 pages, 430 KiB  
Article
Characteristic Function of the Tsallis q-Gaussian and Its Applications in Measurement and Metrology
by Viktor Witkovský
Metrology 2023, 3(2), 222-236; https://doi.org/10.3390/metrology3020012 - 18 May 2023
Abstract
The Tsallis q-Gaussian distribution is a powerful generalization of the standard Gaussian distribution and is commonly used in various fields, including non-extensive statistical mechanics, financial markets and image processing. It belongs to the q-distribution family, which is characterized by a non-additive entropy. Due to their versatility and practicality, q-Gaussians are a natural choice for modeling input quantities in measurement models. This paper presents the characteristic function of a linear combination of independent q-Gaussian random variables and proposes a numerical method for its inversion. The proposed technique makes it possible to determine the exact probability distribution of the output quantity in linear measurement models, with the input quantities modeled as independent q-Gaussian random variables. It provides an alternative computational procedure to the Monte Carlo method for uncertainty analysis through the propagation of distributions.
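The numerical-inversion step can be illustrated with the classical Gil-Pelaez formula. In the sketch below, plain zero-mean Gaussians stand in for the q-Gaussians treated in the paper; only the inversion mechanics are shown.

```python
# Gil-Pelaez inversion sketch:
#   F(y) = 1/2 - (1/pi) * integral_0^inf Im(exp(-i*t*y) * cf(t)) / t dt
# Here cf(t) is the characteristic function of Y = X1 + X2 for zero-mean
# Gaussian inputs; the paper derives the analogous CF for q-Gaussians.
import numpy as np

u1, u2 = 0.3, 0.4                                     # input standard uncertainties
cf = lambda t: np.exp(-0.5 * (u1**2 + u2**2) * t**2)  # CF of the sum

def cdf(y, t_max=60.0, n=50_000):
    t = np.linspace(1e-8, t_max, n)
    integrand = np.imag(np.exp(-1j * t * y) * cf(t)) / t
    return 0.5 - np.sum(integrand) * (t[1] - t[0]) / np.pi

print(f"F(0)  = {cdf(0.0):.3f}")                      # ~0.500
print(f"F(2u) = {cdf(2 * np.hypot(u1, u2)):.3f}")     # ~0.977 for a Gaussian
```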

2022


4 pages, 167 KiB  
Editorial
New Frontiers in Measurement Uncertainty
by Simona Salicone
Metrology 2022, 2(4), 495-498; https://doi.org/10.3390/metrology2040029 - 12 Dec 2022
Abstract
Metrology is the science of measurements [...]

13 pages, 343 KiB  
Article
Bayesian Measurement of Diagnostic Accuracy of the RT-PCR Test for COVID-19
by Nikhil Padhye
Metrology 2022, 2(4), 414-426; https://doi.org/10.3390/metrology2040025 - 29 Sep 2022
Abstract
Reverse transcription polymerase chain reaction (RT-PCR) targeting select genes of the SARS-CoV-2 RNA has been the main diagnostic tool in the global response to the COVID-19 pandemic. It took several months after the development of these molecular tests to assess their diagnostic performance in the population. The objective of this study is to demonstrate that it was possible to measure the diagnostic accuracy of the RT-PCR test at an early stage of the pandemic despite the absence of a gold standard. The study design is a secondary analysis of published data on 1014 patients in Wuhan, China, of whom 59.3% tested positive for COVID-19 in RT-PCR tests and 87.6% tested positive in chest computerized tomography (CT) exams. Previously ignored expert opinions in the form of verbal probability classifications of patients with conflicting test results have been utilized here to derive the informative prior distribution of the infected proportion. A Bayesian implementation of the Dawid-Skene model, typically used in the context of crowd-sourced data, was used to reconstruct the sensitivity and specificity of the diagnostic tests without the need for specifying a gold standard. The sensitivity of the RT-PCR diagnostic test developed by China CDC was estimated to be 0.707 (95% Cr I: 0.664, 0.753), while the specificity was 0.861 (95% Cr I: 0.781, 0.956). In contrast, chest CT was found to have high sensitivity (95% Cr I: 0.969, 1.000) but low specificity (95% Cr I: 0.477, 0.742). These estimates are similar to those found later in studies designed specifically for measuring the diagnostic performance of the RT-PCR test. The developed methods could be applied to assess the diagnostic accuracy of tests for new variants of SARS-CoV-2 in the future.

20 pages, 3484 KiB  
Article
Three-Dimensional Point Cloud Task-Specific Uncertainty Assessment Based on ISO 15530-3 and ISO 15530-4 Technical Specifications and Model-Based Definition Strategy
by Gorka Kortaberria, Unai Mutilba, Sergio Gomez and Brahim Ahmed
Metrology 2022, 2(4), 394-413; https://doi.org/10.3390/metrology2040024 - 27 Sep 2022
Abstract
Data-driven manufacturing in Industry 4.0 demands digital metrology not only to drive the in-process quality assurance of manufactured products but also to supply reliable data for constantly adjusting manufacturing process parameters towards zero-defect manufacturing. Better quality, improved productivity, and increased flexibility of manufacturing processes are obtained by combining intelligent production systems and advanced information technologies, where in-process metrology plays a significant role. While traditional coordinate measuring machines offer strengths in performance, accuracy, and precision, they are not the most appropriate in-process measurement solutions when fast, non-contact, and fully automated metrology is needed. Non-contact optical 3D metrology tackles these limitations and offers some additional key advantages for deploying fully integrated 3D metrology capability to collect reliable data for use in intelligent decision-making. However, the full adoption of 3D optical metrology in the manufacturing process depends on the establishment of metrological traceability. Thus, this article presents a practical approach to the realisation of task-specific uncertainty assessment for dense-point-cloud measurements. Finally, it introduces an experimental exercise in which data-driven 3D point cloud automatic data acquisition and evaluation are performed through a model-based definition measurement strategy.

14 pages, 569 KiB  
Article
The Obtainable Uncertainty for the Frequency Evaluation of Tones with Different Spectral Analysis Techniques
by Salvatore Dello Iacono, Giuseppe Di Leo, Consolatina Liguori and Vincenzo Paciello
Metrology 2022, 2(2), 216-229; https://doi.org/10.3390/metrology2020013 - 14 Apr 2022
Abstract
Spectral analysis is successfully adopted in several fields. However, the requirements and constraints of different applications may be so varied that not only the tuning of the analysis parameters but also the choice of the most suitable technique can be a difficult task. For this reason, it is important that the designer of a measurement system for spectral analysis knows how the different techniques behave under the operating conditions of interest. The case considered here is the realization of a numerical instrument for the real-time measurement of the spectral characteristics of a multi-tone signal (amplitude, frequency, and phase). For this purpose, different signal processing techniques can be used, which can be classified as parametric or non-parametric methods. The first class includes those methods that exploit a priori knowledge about the signal parameters, such as the spectral shape of the signal to be processed. Thus, a self-configuring procedure based on a parametric algorithm should include a preliminary evaluation of the number of components. The choice of the right method among the several proposals in the literature is fundamental for any designer and, in particular, for developers of spectral analysis software for real-time applications and embedded devices, where time and reliability constraints are arduous to fulfil. Different aspects should be considered: the desired level of accuracy, the available elaboration resources (memory depth and processing speed), and the signal parameters. The present paper details a comparison of some of the most effective methods available in the literature for the spectral analysis of signals: IFFT-2p, IFFT-3p, and IFFTc, which are all based on an FFT algorithm and improve the spectral resolution of the DFT with interpolation techniques, and three parametric algorithms, MUSIC, ESPRIT, and IWPA. The methods considered for the comparison are briefly described, with references to the literature given for each of them. Their behaviour is then analysed in terms of the systematic contribution and uncertainty on the evaluated frequencies of the spectral tones of signals created from superimposed sinusoids and white Gaussian noise.
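To give a feel for the interpolated-FFT family, the sketch below refines a DFT peak by parabolic interpolation of the neighbouring magnitude bins; it is a generic stand-in, not the IFFT-2p, IFFT-3p, or IFFTc estimators actually compared in the paper.

```python
# Generic interpolated-FFT sketch: locate the DFT peak of a windowed tone,
# then refine the frequency estimate with parabolic interpolation.
import numpy as np

fs, N, f0 = 1000.0, 1024, 123.4          # sample rate, record length, true tone
rng = np.random.default_rng(3)
t = np.arange(N) / fs
x = np.sin(2 * np.pi * f0 * t) + 0.01 * rng.standard_normal(N)

X = np.abs(np.fft.rfft(x * np.hanning(N)))
k = int(X.argmax())                      # coarse estimate: the peak bin
d = 0.5 * (X[k - 1] - X[k + 1]) / (X[k - 1] - 2 * X[k] + X[k + 1])
print(f"raw bin: {k * fs / N:.2f} Hz, interpolated: {(k + d) * fs / N:.2f} Hz")
```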

36 pages, 1420 KiB  
Article
Designing Possibilistic Information Fusion—The Importance of Associativity, Consistency, and Redundancy
by Christoph-Alexander Holst and Volker Lohweg
Metrology 2022, 2(2), 180-215; https://doi.org/10.3390/metrology2020012 - 11 Apr 2022
Abstract
One of the main challenges in designing information fusion systems is to decide on the structure and order in which information is aggregated. The key criteria by which topologies are constructed include the associativity of fusion rules as well as the consistency and redundancy of information sources. Fusion topologies built on these criteria are flexible in design, produce maximally specific information, and are robust against unreliable or defective sources. In this article, an automated data-driven design approach for possibilistic information fusion topologies is detailed that explicitly considers associativity, consistency, and redundancy. The proposed design is intended to handle epistemic uncertainty, that is, to yield robust topologies even when training data are lacking. The fusion design approach is evaluated on selected publicly available real-world datasets obtained from technical systems. Epistemic uncertainty is simulated by withholding parts of the training data. It is shown that, in this context, consistency as the sole design criterion results in topologies that are not robust. Including a redundancy metric leads to improved robustness in the case of epistemic uncertainty.

22 pages, 439 KiB  
Article
The GUM Tree Calculator: A Python Package for Measurement Modelling and Data Processing with Automatic Evaluation of Uncertainty
by Blair D. Hall
Metrology 2022, 2(1), 128-149; https://doi.org/10.3390/metrology2010009 - 15 Mar 2022
Abstract
There is currently interest in the digitalisation of metrology because technologies that can measure, analyse, and make critical decisions autonomously are beginning to emerge. The notions of metrological traceability and measurement uncertainty should be supported in such systems, following the recommendations in the Guide to the Expression of Uncertainty in Measurement (GUM); however, the GUM offers no specific guidance on digital implementation. Here, we report on a Python package that implements algorithmic data processing using 'uncertain numbers', which satisfy the general criteria in the GUM for an ideal format to express uncertainty. An uncertain number can represent a physical quantity that has not been determined exactly. Using uncertain numbers, measurement models can be expressed clearly and succinctly in terms of the quantities involved. The algorithms and simple data structures we use provide an example of how metrological traceability can be supported in digital systems. In particular, uncertain numbers provide a format to capture and propagate detailed information about quantities that influence a measurement along the various stages of a traceability chain. More detailed information about influence quantities can be exploited to extract more value from results for users at the end of a traceability chain.
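A minimal taste of the uncertain-number style, following the ureal constructor documented for the GTC package; the values below are invented, and attribute names may differ between package versions.

```python
# Minimal uncertain-number sketch in the style of the GUM Tree Calculator
# (GTC) package; invented values, API per the package documentation.
from GTC import ureal

V = ureal(4.999, 0.0032)         # volts, with standard uncertainty
I = ureal(0.019661, 0.0000095)   # amperes, with standard uncertainty
R = V / I                        # uncertainty propagates automatically (first order)
print(R.x, R.u)                  # value and standard uncertainty of R
```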

13 pages, 277 KiB  
Article
The Storage within Digital Calibration Certificates of Uncertainty Information Obtained Using a Monte Carlo Method
by Ian Smith, Yuhui Luo and Daniel Hutzschenreuter
Metrology 2022, 2(1), 33-45; https://doi.org/10.3390/metrology2010003 - 18 Jan 2022
Abstract
Supplement 1 to the ‘Guide to the expression of uncertainty in measurement’ describes a Monte Carlo method as a general numerical approach to uncertainty evaluation. Application of the approach typically delivers a large number of values of the output quantity of interest, from which summary information such as an estimate of the quantity, its associated standard uncertainty, and a coverage interval for the quantity can be obtained and reported. This paper considers the use of a Monte Carlo method for uncertainty evaluation in calibration, using two examples to demonstrate how so-called ‘digital calibration certificates’ can allow the complete set of results of a Monte Carlo calculation to be reported.
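The idea of reporting the complete set of Monte Carlo values, rather than only a summary, can be sketched with a JSON-like record; the schema below is invented for illustration and is not the actual digital calibration certificate format.

```python
# Invented record schema (not the real DCC format): store summary statistics
# together with the full set of Monte Carlo values, so a certificate user can
# re-derive a coverage interval at any coverage probability.
import json
import numpy as np

y = np.random.default_rng(5).normal(99.99, 0.012, 10_000)  # MC output values
record = {
    "quantity": "resistance",
    "unit": "ohm",
    "estimate": float(y.mean()),
    "standard_uncertainty": float(y.std(ddof=1)),
    "coverage_interval_95": [float(v) for v in np.quantile(y, [0.025, 0.975])],
    "monte_carlo_values": np.round(y, 6).tolist(),          # complete result set
}
print(json.dumps(record)[:100] + " ...")
```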

2021


18 pages, 4056 KiB  
Article
Systematic Distortion Factor and Unrecognized Source of Uncertainties in Nuclear Data Measurements and Evaluations
by Nikolay V. Kornilov, Vladimir G. Pronyaev and Steven M. Grimes
Metrology 2022, 2(1), 1-18; https://doi.org/10.3390/metrology2010001 - 24 Dec 2021
Abstract
Each experiment provides new information about the value of some physical quantity. However, not only the measured values but also the uncertainties assigned to them are an important part of the results. The metrological guides provide recommendations for the presentation of the uncertainties of measurement results: statistical and systematic components of the uncertainties should be explained, estimated, and presented separately with the results of the measurements. Experimental set-ups and the models used to derive physical values from the primary measured quantities are products of human activity, which makes them to some extent subjective. A Systematic Distortion Factor (SDF) may exist in any experiment. It leads to a bias of the measured value from the unknown "true" value, and it appears as a real physical effect if it is not removed by additional measurements or analysis. For a set of measured data with the best evaluated true value, differences beyond the stated uncertainties can be explained by the presence of Unrecognized Sources of Uncertainty (USU) in these data, and the presence of USU in the data can be linked with the presence of SDF in the results of measurements. The paper demonstrates the existence of SDF in Prompt Fission Neutron Spectra (PFNS) measurements, measurements of fission cross sections, and measurements of Maxwellian-spectrum-averaged neutron capture cross sections for astrophysical applications. The paper also discusses introducing and accounting for USU in data evaluation in cases where SDF cannot be eliminated; as an example, the model case of the 238U(n,f)/235U(n,f) cross-section ratio evaluation is demonstrated.

16 pages, 320 KiB  
Article
Digital Representation of Measurement Uncertainty: A Case Study Linking an RMO Key Comparison with a CIPM Key Comparison
by Blair D. Hall and Annette Koo
Metrology 2021, 1(2), 166-181; https://doi.org/10.3390/metrology1020011 - 6 Dec 2021
Abstract
This paper considers a future scenario in which digital reporting of measurement results is ubiquitous and digital calibration certificates (DCCs) contain information about the components of uncertainty in a measurement result. The task of linking international measurement comparisons is used as a case study to look at the benefits of digitalization. Comparison linking provides a context in which correlations are important, so the benefit of passing a digital record of contributions to uncertainty along a traceability chain can be examined. The International Committee for Weights and Measures (CIPM) uses a program of international “key comparisons” to establish the extent to which measurements of a particular quantity may be considered equivalent when made in different economies. To obtain good international coverage, the results of the comparisons may be linked together: a number of regional metrology organization (RMO) key comparisons can be linked back to an initial CIPM key comparison. Specific information about systematic effects in participants’ results must be available during linking to allow correct treatment of the correlations. However, the conventional calibration certificate formats used today do not provide this: participants must submit additional data, and the report of an initial comparison must anticipate the requirements for future linking. Special handling of additional data can be laborious and prone to error. An uncertain-number digital reporting format was considered in this case study, which caters to all the information required and would simplify the comparison analysis, reporting, and linking; the format would also enable a more informative presentation of comparison results. The uncertain-number format would be useful more generally, in measurement scenarios where correlations arise, so its incorporation into DCCs should be considered. A full dataset supported by open-source software is available.

14 pages, 945 KiB  
Article
Calibration of a Digital Current Transformer Measuring Bridge: Metrological Challenges and Uncertainty Contributions
by Guglielmo Frigo and Marco Agustoni
Metrology 2021, 1(2), 93-106; https://doi.org/10.3390/metrology1020007 - 3 Oct 2021
Abstract
In this paper, we consider the calibration of measuring bridges for non-conventional instrument transformers with digital output. In this context, the main challenge is the necessity of synchronization between the analog and digital outputs. To this end, we propose a measurement setup that allows for monitoring and quantifying the main quantities of interest. A possible laboratory implementation is presented and the main sources of uncertainty are discussed. From a metrological point of view, technical specifications and statistical analysis are employed to draw up a rigorous uncertainty budget of the calibration setup. An experimental validation is also provided through the thorough characterization of the measurement accuracy of a commercial device in use at the METAS laboratories. The proposed analysis shows that the calibration of measuring bridges for non-conventional instrument transformers requires ad hoc measurement setups, and it identifies possible room for improvement, particularly in terms of output synchronization and the flexibility of the generation process.

17 pages, 4945 KiB  
Perspective
A General Mathematical Approach Based on the Possibility Theory for Handling Measurement Results and All Uncertainties
by Simona Salicone and Harsha Vardhana Jetti
Metrology 2021, 1(2), 76-92; https://doi.org/10.3390/metrology1020006 - 1 Oct 2021
Abstract
The concept of measurement uncertainty was introduced in the 1990s by the “Guide to the expression of uncertainty in measurement”, known as the GUM. The word uncertainty has a lexical meaning and reflects the lack of exact or complete knowledge about the value of the measurand. Thanks to the suggestions in the GUM and following the probabilistic mathematical approaches proposed therein, an uncertainty value can be found and associated with the measured value. In recent decades, however, other methods have been proposed in the literature, which try to encompass the definitions of the GUM, thus overcoming its limitations. Some of these methods are based on possibility theory, such as the one known as the RFV method. The aim of this paper is to briefly recall the RFV method, starting from the very beginning and the initial motivations, and to summarize in a single paper the most relevant results obtained.
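The possibilistic machinery underlying the RFV method operates on alpha-cuts: nested intervals combined by interval arithmetic. The sketch below propagates the sum of two triangular possibility distributions, a deliberately simplified slice of the full RFV construction (which combines separate random and systematic possibility distributions).

```python
# Minimal alpha-cut sketch: propagate Y = X1 + X2 by interval arithmetic on
# the cuts of two triangular possibility distributions (illustrative only).
import numpy as np

def tri_cut(lo, mode, hi, alpha):
    """Alpha-cut [a, b] of a triangular possibility distribution."""
    return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

for alpha in np.linspace(0.0, 1.0, 6):
    a1, b1 = tri_cut(9.9, 10.0, 10.1, alpha)
    a2, b2 = tri_cut(4.95, 5.0, 5.05, alpha)
    print(f"alpha = {alpha:.1f}: Y in [{a1 + a2:.3f}, {b1 + b2:.3f}]")
```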

13 pages, 1201 KiB  
Article
A Possibilistic Kalman Filter for the Reduction of the Final Measurement Uncertainty, in Presence of Unknown Systematic Errors
by Harsha Vardhana Jetti and Simona Salicone
Metrology 2021, 1(1), 39-51; https://doi.org/10.3390/metrology1010003 - 17 Aug 2021
Abstract
A Kalman filter is a concept that has existed for decades and is widely used in numerous areas. It provides a prediction of the system states as well as the uncertainty associated with them. The original Kalman filter cannot propagate uncertainty correctly when the variables are not normally distributed, when the measurements are correlated, or when there is a systematic error in the measurements. For these reasons, there have been numerous variations of the original Kalman filter, most of them based (like the original one) on the theory of probability. Some of the variations indeed introduce improvements, without being completely successful. To deal with these problems, Kalman filters have more recently also been defined using random-fuzzy variables (RFVs). These filters are capable of propagating distributions that are not normal as well as systematic contributions to uncertainty, thus providing the overall measurement uncertainty associated with the state predictions. In this paper, the authors take another step forward by defining a possibilistic Kalman filter using random-fuzzy variables which not only considers and propagates both random and systematic contributions to uncertainty, but also reduces the overall uncertainty associated with the state predictions by compensating for the unknown residual systematic contributions.
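For reference, the probabilistic filter that the possibilistic version generalizes fits in a few lines. The sketch below is the textbook scalar Kalman filter with assumed noise variances, not the RFV-based algorithm of the paper.

```python
# Textbook scalar Kalman filter (random-walk state) with assumed variances;
# the possibilistic filter of the paper replaces these probabilistic updates
# with operations on random-fuzzy variables.
import numpy as np

rng = np.random.default_rng(2)
q, r = 0.01, 0.25                  # process and measurement noise variances (assumed)
x_true, x_est, p = 0.0, 0.0, 1.0   # true state, estimate, estimate variance

for _ in range(50):
    x_true += rng.normal(0.0, np.sqrt(q))      # true state random walk
    z = x_true + rng.normal(0.0, np.sqrt(r))   # noisy measurement
    p += q                                     # predict: variance grows
    k = p / (p + r)                            # Kalman gain
    x_est += k * (z - x_est)                   # update estimate
    p *= 1.0 - k                               # update variance
print(f"estimate {x_est:.3f}, true {x_true:.3f}, variance {p:.4f}")
```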
