Validity and Validation of Computer Simulations—A Methodological Inquiry with Application to Integrated Assessment Models
Abstract
1. Introduction
1.1. Validity, Confidence, and Credence
1.2. Challenges to Credence
2. Chance and Uncertainty in IAM
2.1. The Distinction between Epistemic and Aleatory Uncertainty
2.2. Uncertainty Involves More Than Stochasticity
- Risk—in classical risk, the decision maker (DM) faces stochastic harm. The relevant probability density function (pdf) is known and stationary, but the outcome of the next draw is not. The uncertainty is entirely aleatory.
- Ambiguity—the relevant pdf is not known. Ambiguity piles epistemic uncertainty on top of ordinary aleatory uncertainty.
- Deep uncertainty, gross ignorance, unawareness, etc.—the DM may be unable to enumerate the possible outcomes, let alone assign probabilities to them. Inability to enumerate the outcome set signals a rather serious case of epistemic uncertainty, but aleatory uncertainty is likely also part of the picture.
- Surprises—in technical terms, the eventual outcome was not a member of the ex ante outcome set. The uncertainty that generates the possibility of a surprise is entirely epistemic: we failed to understand that the eventual outcome was possible. There are, however, likely to be aleatory elements in its actual occurrence in a particular instance. (The sketch following this list illustrates the taxonomy.)
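To make the taxonomy concrete, here is a minimal Python sketch. Every distribution, outcome label, and numerical value is a hypothetical illustration, not anything drawn from an actual IAM:

```python
import numpy as np

rng = np.random.default_rng(42)

# Risk: the pdf is known and stationary; only the next draw is unknown,
# so all of the uncertainty is aleatory.
damage = rng.normal(loc=2.0, scale=0.5)  # hypothetical draw from a known N(2, 0.5^2)

# Ambiguity: several candidate pdfs are entertained, but the DM does not
# know which one governs the process; epistemic uncertainty is piled on
# top of the ordinary aleatory uncertainty of the draw itself.
candidate_pdfs = {
    "A": lambda: rng.normal(2.0, 0.5),     # hypothetical candidate
    "B": lambda: rng.normal(2.5, 1.0),     # hypothetical candidate
    "C": lambda: rng.lognormal(0.7, 0.4),  # hypothetical fat-tailed candidate
}
draws = {name: f() for name, f in candidate_pdfs.items()}

# Surprise: the realized outcome was not in the ex ante outcome set, so
# no candidate pdf assigned it any probability at all; the failure to
# enumerate it is epistemic.
ex_ante_outcomes = {"mild", "moderate", "severe"}
realized = "abrupt regime shift"  # hypothetical surprise
print(draws, realized not in ex_ante_outcomes)
```

Deep uncertainty resists even this much formalization: by definition there is no list like `candidate_pdfs` or `ex_ante_outcomes` to write down, which is precisely what makes it hard to represent inside a simulation.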
3. Getting Serious about Uncertainty in IAM
3.1. Uncertainty as a Challenge to Credence
3.2. Scenario Analysis to Address Uncertainty
3.3. The Challenge of Better Capturing the Real-World Uncertainties within the Deterministic, Multiple Scenarios Framework
3.4. Introducing Stochasticity in a Few Variables Thought Ex Ante to Be Sensitive
3.5. How Might IAMs Be Restructured to Better Address the Range of Real-World Uncertainties?
3.6. What Can Be Gained in Validity by Improving Our Characterization of Uncertainty in IAM?
4. Validation and Credence in IAM Output
4.1. Arguments That Validation Claims Re IAM Are Inherently Misleading
4.2. Does Simulation Per Se, as Compared to Other Established Ways of Doing Science, Pose Special Problems for Validation?
4.3. Is It Built Right? The Emergence of Regional and Local CC-IAMs
4.4. Critiques of Validation as Practiced
5. Validation Criteria for IAMs
- Address aleatory (or random) uncertainties in model inputs using cumulative distribution functions.
- Treat epistemic uncertainties as intervals.
- Propagate both types of uncertainty through the model to the system response quantities of interest (a minimal sketch of this propagation follows this list).
- Estimate numerical approximation errors using verification techniques.
- Quantify model structure uncertainties using model validation procedures:
  ◦ Compare model output with experimental data and calibrate.
  ◦ Extrapolate the uncertainty structure beyond experimental data.
- Communicate the total predictive uncertainty to decision makers.
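One common way to operationalize the first three steps is a two-level (second-order) Monte Carlo: sweep the epistemic interval in an outer loop, sample the aleatory distribution in an inner loop, and report the envelope of output CDFs (a probability box). The sketch below is a minimal illustration under hypothetical assumptions; the response function, the sensitivity interval, and the gamma driver distribution are all placeholders, not elements of any particular IAM or published framework:

```python
import numpy as np

rng = np.random.default_rng(0)

def response(sensitivity, forcing):
    """Stand-in for the simulation model: a hypothetical response function."""
    return sensitivity * np.log1p(forcing)

# Epistemic uncertainty as an interval: sweep candidate values across it
# rather than averaging them away under an assumed prior.
sens_interval = (1.5, 4.5)                     # hypothetical interval
sens_values = np.linspace(*sens_interval, 20)  # outer (epistemic) loop

# Aleatory uncertainty via a known distribution, characterized by its CDF;
# reusing one set of draws (common random numbers) isolates the epistemic sweep.
n_draws = 10_000
forcing = rng.gamma(shape=2.0, scale=5.0, size=n_draws)  # hypothetical driver

# Each sorted inner sample is an empirical CDF of the output.
cdfs = np.stack([np.sort(response(s, forcing)) for s in sens_values])

# The envelope across epistemic values is a probability box: each quantile
# of the response is reported as an interval, not a single number.
low, high = cdfs.min(axis=0), cdfs.max(axis=0)
print(f"median response lies in [{low[n_draws // 2]:.2f}, {high[n_draws // 2]:.2f}]")
```

Reporting the interval at each quantile, rather than a single blended distribution, keeps the epistemic and aleatory components distinguishable when the total predictive uncertainty is communicated to decision makers.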
6. Conclusions
6.1. Conclusions Re Validation Criteria for IAMs
- Constructing models that capture the relevant features of the real world, including its uncertainties, in convincing fashion.
- Addressing uncertainty in structural equations and parameter values in the model and its estimation.
- Verifying that the modelers’ intentions are implemented accurately, precisely, and completely.
- Confirming the representations of variation in parameters by applying appropriate statistical measures and tests.
- Testing and calibrating model performance using history matching, tracking, and prediction tests, given near-median and extreme values of key variables. If real-world experience does not yield observable responses to extreme driver values, testing whether the model response to such values accords with expectations informed by theory.
- Sequentially updating model structure and parameterization to reflect what is learned in the calibration process.
- Exposing the resulting model to validation tests that are independent of prior calibration (a minimal sketch of this calibrate-then-validate discipline follows this list).
- Where the model has evolved through sequential learning and updating, communicating that process to end users.
- Communicating results in a manner that conveys the nature of the exercise—in many cases, “if …, then …” analysis of how alternative settings for exogenous and policy drivers may affect future outcomes—and fully reflects the remaining epistemic and aleatory uncertainties.
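The calibrate-then-validate discipline in the bullets above can be illustrated with a split-sample history-matching sketch: fit a free parameter on the early portion of the record, then score predictions on a holdout the calibration never saw. Everything here is a hypothetical placeholder, including the synthetic "observed" series, the one-parameter stand-in simulator, and the RMSE criterion:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical record: a synthetic "observed" series with a known trend
# plus noise, standing in for real historical data.
years = np.arange(1990, 2021)
observed = 0.02 * (years - 1990) + rng.normal(0.0, 0.05, size=years.size)

def simulate(trend, yrs):
    """Stand-in simulator: a hypothetical linear response to one driver."""
    return trend * (yrs - yrs[0])

# History matching: calibrate the free parameter on the early record only.
calib = slice(0, 20)
candidates = np.linspace(0.0, 0.05, 501)
rmse = [np.sqrt(np.mean((simulate(t, years[calib]) - observed[calib]) ** 2))
        for t in candidates]
best = candidates[int(np.argmin(rmse))]

# Independent validation: score predictions on a holdout the calibration
# never saw, rather than re-reporting goodness of fit on the calibration data.
hold = slice(20, None)
predicted = simulate(best, years)[hold]
holdout_rmse = np.sqrt(np.mean((predicted - observed[hold]) ** 2))
print(f"calibrated trend = {best:.4f}, holdout RMSE = {holdout_rmse:.4f}")
```

If the model is subsequently revised in light of the holdout errors, the holdout has been consumed by calibration; honest validation then requires fresh data or out-of-sample tests, which is why the sequential learning process itself should be communicated to end users.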
6.2. Conclusions Re Computer Simulation
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest