# Pragmatic Validation of Numerical Models Used for the Assessment of Radioactive Waste Repositories: A Perspective

## Abstract


## 1. Introduction

## 2. Criticism of Verifiability and Model Validation

## 3. The Need for Model Evaluation

#### 3.1. The Nature of Models

#### 3.2. Calibration

- The mathematical model and/or auxiliary hypotheses are incomplete or are poor representations of the system to be modeled;
- There is a discrepancy in the definition, state, location, or scale of the calculated model output variable and the corresponding observation used for model calibration;
- Measured data have an error component that is systematic;
- The model output has an error component that is systematic; systematic errors include errors in the conceptual model, (over)simplifications in the model structure (processes and features), model truncation errors, reduction in model dimensionality, symmetry assumptions, errors in initial and boundary conditions, etc.;
- Data sets are incomplete, and the inverse problem is either underdetermined or regularized using an artificial or erroneous regularization term;
- The data are not sufficiently informative about the parameters of interest, or the available data are not discriminative enough to sufficiently reduce correlations among the parameters;
- Alternative conceptual models exist that are equally capable of reproducing the calibration data.
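Several of the calibration pitfalls above can be probed with simple numerical diagnostics. The following sketch, using a hypothetical two-parameter model and synthetic observations (all names and values are illustrative, not taken from the article), checks two of them: whether the residuals contain a systematic component, and how strongly the estimated parameters are correlated.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical example: calibrate a two-parameter model
# h(t) = a + b*log(t) against noisy synthetic "observations".
t = np.linspace(1.0, 100.0, 50)
h_obs = 2.0 + 0.5 * np.log(t) + rng.normal(0.0, 0.05, t.size)

# Linear least squares: design matrix G, parameters p = [a, b].
G = np.column_stack([np.ones_like(t), np.log(t)])
p, *_ = np.linalg.lstsq(G, h_obs, rcond=None)
residuals = h_obs - G @ p

# Diagnostic 1: residuals should lack a significant systematic
# component; here this is screened via lag-1 autocorrelation.
r = residuals - residuals.mean()
lag1 = np.sum(r[:-1] * r[1:]) / np.sum(r * r)

# Diagnostic 2: parameter correlations should be fairly weak.
# Covariance = s^2 * inv(G^T G); correlation from its off-diagonal.
s2 = np.sum(residuals**2) / (t.size - p.size)
cov = s2 * np.linalg.inv(G.T @ G)
corr_ab = cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1])

print(f"lag-1 residual autocorrelation: {lag1:+.2f}")
print(f"parameter correlation (a, b):   {corr_ab:+.2f}")
```

A strongly nonzero autocorrelation or a correlation coefficient near ±1 would flag, respectively, a systematic error component or an ill-posed estimation problem of the kind listed above.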

#### 3.3. Extrapolation

#### 3.4. Model Space

## 4. Pragmatic Model Validation

#### 4.1. Pragmatism in Validation

#### 4.2. Sensitivity Auditing

#### 4.3. Validation Activities and Acceptance Criteria

- A validated model provides an improved, general understanding of the system, as the model results are examined from disparate lines of evidence. However, the model results are not to be interpreted as predictions about the real system behavior;
- A validated model provides a consistent representation and explanation of the available, complementary data from different scientific disciplines (geology, hydrogeology, geochemistry, etc.);
- A validated model is suitable to examine alternative cases and “what-if” scenarios. The model results are not accurate predictions but reveal relative changes in the expected system behavior as a function of the chosen scenarios;
- A validated model can make specific predictions that are adequate for the purpose of the model. The model does not necessarily represent the real system, but its outcomes are acceptable as they support the ultimate project purpose. For example, the model may be used for conservative or bounding calculations, which—despite being unlikely, unreasonable, or even unphysical—may be adequate within a regulatory framework and may support a performance assessment study;
- A validated model is an approximate representation of the real system. The degree of model fidelity is dictated by the accuracy with which the predictions need to be made, so they can support decision making.

- A validated model should comply with industry-standard QA/QC procedures and have passed a formal software qualification lifecycle test (“verification”);
- A validated model should have undergone a detailed review of the procedures used for the construction of the conceptual and numerical models, including the evaluation of (a) the available data, (b) the theoretical and empirical laws and principles, (c) the abstraction process and conceptual model development, (d) the construction of the calculational model, and (e) the iterative refinement based on predictive simulations, sensitivity analyses, and uncertainty quantification [40,51];
- A validated model should be calibrated against relevant data, with (a) residuals being devoid of a significant systematic component, (b) acceptably low estimation uncertainties, and (c) fairly weak parameter correlations. The criteria for acceptability are determined by the accuracy with which model outputs supporting the project objectives need to be calculated [43];
- A validated model should be peer-reviewed with a general consensus among experts and stakeholders that the model qualifies for its intended use, and that limitations, the range of application, and uncertainties are sufficiently understood and documented;
- A validated model should reproduce—with acceptable accuracy—relevant data not used for model calibration. The type of data, the processes involved, the spatial and temporal scales, and the conditions prevailing during data collection should reflect those of the target predictions as closely as possible. The criteria for acceptability are determined by the accuracy with which model outputs supporting the project objectives need to be calculated;
- A validated model should demonstrate that it can predict emerging phenomena [4].
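The criterion of reproducing data withheld from calibration can be illustrated with a minimal hold-out test. The model, observations, split, and the acceptance threshold below are all hypothetical placeholders; in practice the threshold would be derived from the accuracy required by the project objectives, as stated above.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical example: a model calibrated on early-time data is
# tested against later observations withheld from calibration.
t = np.linspace(1.0, 10.0, 40)
obs = 3.0 * np.exp(-0.3 * t) + rng.normal(0.0, 0.02, t.size)

cal, val = slice(0, 20), slice(20, 40)  # calibration / validation split

# Calibrate the log-linearized model: log(obs) = log(A) - k*t.
coef = np.polyfit(t[cal], np.log(obs[cal]), 1)
def predict(tt):
    return np.exp(np.polyval(coef, tt))

# Acceptance criterion: error on the withheld data must stay below
# the accuracy required by the project objectives (assumed 0.05 here).
rmse = np.sqrt(np.mean((predict(t[val]) - obs[val]) ** 2))
accepted = rmse < 0.05
print(f"validation RMSE = {rmse:.3f} -> {'accepted' if accepted else 'rejected'}")
```

Note that the withheld data here differ from the calibration data only in time, which is the easiest case; the list above asks that they also reflect the processes, scales, and conditions of the target predictions as closely as possible.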

## 5. The Pragmatic Model Validation Framework

- **Definition of the model purpose:** The aim of pragmatic model evaluation is to determine whether a model is adequate-for-purpose: does the model make a valuable contribution to the solution of the problem at hand? The model purpose must be clearly specified, as it determines the benchmarks, standards, and acceptance criteria for critical evaluation.
- **Determination of critical aspects:** For reasons of pragmatism, effectiveness, and efficiency, it is essential to identify which aspects of the models require particular attention and thus warrant targeted review and testing effort. These aspects are likely to be specific to the intended use and are those that have the greatest impact on critical model outcomes. Moreover, model evaluation should focus on the subset of aspects that are uncertain or for which the modelers lack confidence in their correct or accurate representation in the model.
- **Definition of performance measures and criteria:** Assessing whether a model is adequate-for-purpose requires the definition of suitable performance measures and acceptance criteria. These must either be directly calculable by the model or indirectly inferable from the modeling results. Most critically, they must be relevant to the end use. Information, observations, or testing data used for model assessment should be as close to the performance measures as possible in terms of influential factors, processes, and scale. The accuracy of both the model output and the data must be high enough that they are discriminative in the evaluation of the acceptance criteria.
- **Sensitivity and uncertainty analysis of influential factors:** Selecting influential factors is an important step during model development, and even more so during model evaluation. Influential factors are model-specific, although they may be common to several models. The difference between the influential factors identified during model development (specifically model calibration) and those identified as influential for the ultimate model use indicates the degree of extrapolation undertaken when using a model for a purpose that may not have been envisioned during model development, and for which no closely related calibration data were available.
- **Prediction-outcome exercises:** An important aspect of pragmatic evaluation is the testing of model predictions [1]. While direct testing of model predictions against the reality of interest is often not possible, the critical aspects and significant influential factors should form the basis for the design and evaluation of prediction-outcome tests. Uncertainties in the influential factors need to be propagated through the model to the performance measures so that meaningful statements about system behavior can be made that account for the relevant uncertainty.
- **Model evaluation, documentation, and model audit:** As all model predictions are extrapolations (spatially, temporally, parametrically, and regarding the features and processes that need to be considered), and the testing data never fully correspond to the ultimate performance metrics, confidence in the model cannot rely solely on the comparison between model output and measurements. Instead, each model development step must be clearly documented. In particular, the conceptual models and their assumptions need to be reviewed, as they often have the greatest potential to bias modeling results [11]. It is also important to document and review the criteria used to reject a model, or those employed when calling for an update of the model. Any consensus, and in particular any disagreement, among model reviewers should be acknowledged.
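The propagation of uncertainty in influential factors to a performance measure can be sketched with a plain Monte Carlo loop. The two factors, their distributions, and the surrogate transfer function below are invented for illustration only; they stand in for whatever factors and repository model a real assessment would use.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sketch: propagate uncertainty in two influential
# factors (a permeability k and a sorption coefficient kd) through
# a surrogate model to a performance measure ("peak release").
n = 10_000
k  = rng.lognormal(mean=np.log(1e-18), sigma=0.5, size=n)  # m^2
kd = rng.lognormal(mean=np.log(0.1),  sigma=0.3, size=n)   # m^3/kg

# Surrogate performance measure (illustrative only, not a repository
# model): release scales with k and inversely with retardation.
release = 1e12 * k / (1.0 + 2650.0 * kd)

# Report percentiles so that statements about system behavior
# account for the propagated uncertainty, as the framework requires.
p5, p50, p95 = np.percentile(release, [5, 50, 95])
print(f"peak release: median {p50:.3g}, 90% interval [{p5:.3g}, {p95:.3g}]")
```

Comparing which factors dominate the spread of the performance measure here, versus which dominated the calibration fit, makes the degree of extrapolation discussed above explicit.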

## 6. Conclusions

## Author Contributions

## Funding

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## References

- International Atomic Energy Agency (IAEA). IAEA Safety Glossary: Terminology Used in Nuclear Safety and Radiation Protection; IAEA: Vienna, Austria, 2019; p. 278.
- Popper, K. Logik der Forschung: Zur Erkenntnistheorie der Modernen Naturwissenschaft; Julius Springer: Vienna, Austria, 1934.
- Kuhn, T.S. The Structure of Scientific Revolutions; University of Chicago Press: Chicago, IL, USA, 1962.
- Kuhn, T.S. Objectivity, Value Judgment, and Theory Choice. In The Essential Tension: Selected Studies in Scientific Tradition and Change; University of Chicago Press: Chicago, IL, USA, 1977; pp. 320–339.
- Oreskes, N.; Shrader-Frechette, K.; Belitz, K. Verification, validation, and confirmation of numerical models in the earth sciences. Science **1994**, *263*, 641–646.
- Gens, A.; Alcoverro, J.; Blaheta, R.; Hasal, M.; Michalec, Z.; Takayama, Y.; Lee, C.; Lee, J.; Kim, G.Y.; Kuo, C.W.; et al. HM and THM interactions in bentonite engineered barriers for nuclear waste disposal. Int. J. Rock Mech. Min. Sci. **2021**, *137*, 104572.
- Konikow, L.F.; Bredehoeft, J.D. Groundwater models cannot be validated. Adv. Water Resour. **1992**, *15*, 75–83.
- Bredehoeft, J.D.; Konikow, L.F. Groundwater models cannot be validated: Reply. Adv. Water Resour. **1992**, *15*, 371–372.
- Bredehoeft, J.D.; Konikow, L.F. Groundwater models: Validate or invalidate. Ground Water **1993**, *31*, 178–179.
- Bredehoeft, J.D. From models to performance assessment: The conceptualization problem. Ground Water **2003**, *41*, 571–577.
- Bredehoeft, J. The conceptualization model problem: Surprise. Hydrogeol. J. **2005**, *13*, 37–46.
- De Marsily, G.; Combes, P.; Goblet, P. Comment on "Groundwater models cannot be validated" by L.F. Konikow & J.D. Bredehoeft. Adv. Water Resour. **1992**, *15*, 367–369.
- Bair, E.S. Model (in)validation: A view from the courtroom. Ground Water **1994**, *32*, 530–531.
- McCombie, C.; McKinley, I. Validation: Another perspective. Ground Water **1993**, *31*, 520–531.
- Selroos, J.O.; Walker, D.D.; Strom, A.; Gylling, B.; Follin, S. Comparison of alternative modelling approaches for groundwater flow in fractured rock. J. Hydrol. **2002**, *257*, 174–188.
- Finsterle, S.; Lanyon, B.; Åkesson, M.; Baxter, S.; Bergström, M.; Bockgård, N.; Dershowitz, W.; Dessirier, B.; Frampton, A.; Fransson, Å.; et al. Conceptual uncertainties in modelling the interaction between engineered and natural barriers of nuclear waste repositories in crystalline rock. In Multiple Roles of Clays in Radioactive Waste Confinement; Norris, S., Neeft, E.A.C., Van Geet, M., Eds.; Geological Society of London: London, UK, 2019; Volume 482.
- Hassanizadeh, S.M.; Carrera, J. Special issue: Validation of geo-hydrological models. Adv. Water Resour. **1992**, *15*, 1–3.
- Leijnse, A.; Hassanizadeh, S.M. Model definition and model validation. Adv. Water Resour. **1994**, *17*, 197–200.
- Oreskes, N. Evaluation (not validation) of quantitative models. Environ. Health Perspect. **1998**, *106*, 1453–1460.
- Saltelli, A.; Funtowicz, S. When all models are wrong. Issues Sci. Technol. **2014**, *30*, 79–85.
- Eker, S.; Rovenskaya, E.; Langan, S.; Obersteiner, M. Model validation: A bibliometric analysis of the literature. Environ. Model. Softw. **2019**, *117*, 43–54.
- Parker, W.S. Model evaluation: An adequacy-for-purpose view. Philos. Sci. **2020**, *87*, 457–477.
- Konikow, L.F. The modeling process and model validation: Discussion. Ground Water **1992**, *30*, 622–623.
- Hawkins, D.M. The problem of overfitting. J. Chem. Inf. Comput. Sci. **2004**, *44*, 1–12.
- Hunt, R.J.; Doherty, J.; Tonkin, M.J. Are models too simple? Arguments for increased parameterization. Ground Water **2007**, *45*, 254–262.
- Christensen, S.; Doherty, J.D. Predictive error dependencies when using pilot points and singular value decomposition in groundwater model calibration. Adv. Water Resour. **2008**, *31*, 674–700.
- Doherty, J.; Christensen, S. Use of paired simple and complex models to reduce predictive bias and quantify uncertainty. Water Resour. Res. **2011**, *47*, W12534.
- Carrera, J.; Neuman, S.P. Estimation of aquifer parameters under transient and steady-state conditions. 1. Maximum-likelihood method incorporating prior information. Water Resour. Res. **1986**, *22*, 199–210.
- Zimmerman, D.A.; de Marsily, G.; Gotway, C.A.; Marietta, M.G.; Axness, C.L.; Beauheim, R.L.; Bras, R.L.; Carrera, J.; Dagan, G.; Davies, P.B.; et al. A comparison of seven geostatistically based inverse approaches to estimate transmissivities for modeling advective transport by groundwater flow. Water Resour. Res. **1998**, *34*, 1373–1413.
- Yeh, W.W.G. Review of parameter identification procedures in groundwater hydrology: The inverse problem. Water Resour. Res. **1986**, *22*, 95–108.
- Neuman, S.P. Calibration of distributed parameter groundwater flow models viewed as a multiple-objective decision process under uncertainty. Water Resour. Res. **1973**, *9*, 1006–1021.
- Ewing, R.E.; Lin, T. A class of parameter estimation techniques for fluid flow in porous media. Adv. Water Resour. **1991**, *14*, 89–97.
- McLaughlin, D.; Townley, L.R. A reassessment of the groundwater inverse problem. Water Resour. Res. **1996**, *32*, 1131–1161.
- Sun, N.-Z. Inverse Problems in Groundwater Modeling; Kluwer Academic Publishers: Dordrecht, The Netherlands, 1994.
- ASME. Guide for Verification and Validation in Computational Solid Mechanics; Report VV10-2019; The American Society of Mechanical Engineers: New York, NY, USA, 2019.
- Beven, K. Environmental Modelling: An Uncertain Future? Routledge: London, UK, 2009; p. 328.
- Selroos, J.-O.; Ivars, D.M.; Munier, R.; Hartley, L.; Libby, S.; Davy, P.; Darcel, C.; Trichero, P. Methodology for Discrete Fracture Network Modelling of the Forsmark Site; Svensk Kärnbränslehantering AB (SKB): Stockholm, Sweden, 2022; p. 261.
- Beven, K. A manifesto for the equifinality thesis. J. Hydrol. **2006**, *320*, 18–36.
- Beven, K. Towards a coherent philosophy for modelling the environment. Proc. R. Soc. Lond. A **2002**, *458*, 2465–2484.
- Saltelli, A.; Pereira, Â.G.; Van der Sluijs, J.P.; Funtowicz, S. What do I make of your latinorum? Sensitivity auditing of mathematical modelling. Int. J. Foresight Innov. Policy **2013**, *9*, 213–234.
- Munafò, M.R.; Davey Smith, G. Repeating experiments is not enough. Nature **2018**, *553*, 399–401.
- Fienen, M.N.; Doherty, J.E.; Hunt, R.J.; Reeves, H.W. Using Prediction Uncertainty Analysis to Design Hydrologic Monitoring Networks: Example Applications from the Great Lakes Water Availability Pilot Project; U.S. Geological Survey Scientific Investigations Report 2010–5159; USGS: Reston, VA, USA, 2010; p. 44.
- Finsterle, S. Practical notes on local data-worth analysis. Water Resour. Res. **2015**, *51*, 9904–9924.
- Dausman, A.M.; Doherty, J.; Langevin, C.D.; Sukop, M.C. Quantifying data worth toward reducing predictive uncertainty. Ground Water **2010**, *48*, 729–740.
- Box, G.E.P. Science and statistics. J. Am. Stat. Assoc. **1976**, *71*, 791–799.
- Box, G.E.P.; Luceño, A.; del Carmen Paniagua-Quiñones, M. Statistical Control by Monitoring and Adjustment; John Wiley & Sons: Hoboken, NJ, USA, 2009.
- Saltelli, A.; Ratto, M.; Andres, T.; Campolongo, F.; Cariboni, J.; Gatelli, D.; Saisana, M.; Tarantola, S. Global Sensitivity Analysis: The Primer; John Wiley & Sons: Chichester, UK, 2008; p. 292.
- Van der Sluijs, J.P.; Craye, M.; Funtowicz, S.; Kloprogge, P.; Ravetz, J.; Risbey, J. Combining quantitative and qualitative measures of uncertainty in model-based environmental assessment: The NUSAP system. Risk Anal. **2005**, *25*, 481–492.
- Luis, S.J.; McLaughlin, D. A stochastic approach to model validation. Adv. Water Resour. **1992**, *15*, 15–32.
- Neuman, S.P. Validation of safety assessment models as a process of scientific and public confidence building. In Proceedings of the International High-Level Radioactive Waste Management Conference, Las Vegas, NV, USA, 12–16 April 1992; pp. 1404–1413.
- Tsang, C.F. The modeling process and model validation. Ground Water **1991**, *29*, 825–831.
- Enemark, T.; Peeters, L.J.M.; Mallants, D.; Batelaan, O. Hydrogeological conceptual model building and testing: A review. J. Hydrol. **2019**, *569*, 310–329.
- Hedin, A.; Andersson, E.; Andersson, J.; Greis, C.; Zetterström Evins, L.; Kautsky, U.; Lilja, C.; Lindborg, T.; Lindgren, M.; Löfgren, M.; et al. The SR-Site safety assessment for licensing a spent fuel repository in Sweden. In Proceedings of the International High-Level Radioactive Waste Management Conference, Albuquerque, NM, USA, 10–14 April 2011; pp. 193–208.
- Larsson, A. The international projects INTRACOIN, HYDROCOIN and INTRAVAL. Adv. Water Resour. **1992**, *15*, 85–87.
- Hill, M.C.; Middlemis, H.; Hulme, P.; Poeter, E.; Riegger, J.; Neuman, S.P.; Williams, H.; Anderson, M. Brief overview of selected groundwater modeling guidelines. In Proceedings of Finite-Element Models, MODFLOW and More, Carlsbad, Czech Republic, 13–16 September 2004; pp. 105–120.
- Crout, N.; Kokkonen, T.; Jakeman, A.J.; Norton, J.P.; Newham, L.T.H.; Anderson, R.; Assaf, H.; Croke, B.F.W.; Gaber, N.; Gibbons, J.; et al. Good modelling practice. Dev. Integr. Environ. Assess. **2008**, *3*, 15–31.
- Neuman, S.P. Maximum likelihood Bayesian averaging of uncertain model predictions. Stoch. Environ. Res. Risk Assess. **2003**, *17*, 291–305.
- Matott, L.S.; Babendreier, J.E.; Purucker, S.T. Evaluating uncertainty in integrated environmental models: A review of concepts and tools. Water Resour. Res. **2009**, *45*, W06421.

**Figure 1.** Approximate relation between validation goals and the validation activities needed to reach them. The yellow and red dotted lines indicate, respectively, the modeling goals and validation activities targeted by pragmatic model validation and the relation to the main goal and activity highlighted in the IAEA definition of model validation.


© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Finsterle, S.; Lanyon, B. Pragmatic Validation of Numerical Models Used for the Assessment of Radioactive Waste Repositories: A Perspective. *Energies* **2022**, *15*, 3585.
https://doi.org/10.3390/en15103585
