Review

Epistemological Framework for Computer Simulations in Building Science Research: Insights from Theory and Practice

1 School of Architecture and Design, Virginia Polytechnic Institute and State University, Blacksburg, VA 24061, USA
2 Department of The Built Environment, Mzuzu University, Private Bag 201, Luwinga, Mzuzu 2, Malawi
* Author to whom correspondence should be addressed.
Philosophies 2020, 5(4), 30; https://doi.org/10.3390/philosophies5040030
Submission received: 11 September 2020 / Revised: 19 October 2020 / Accepted: 19 October 2020 / Published: 22 October 2020

Abstract

Computer simulations are widely used within the area of building science research. Building science research deals with the physical phenomena that affect buildings, including heat and mass transfer, lighting and acoustic transmission. This wide usage of computer simulations, however, is characterized by a divergence in thought on the composition of an epistemological framework that may provide guidance for their deployment in research. This paper undertakes a fundamental review of the epistemology of computer simulations within the context of the philosophy of science. Thereafter, it reviews the epistemological framework within which computer simulations are used in practice within the area of building science research. A comparison between the insights obtained from the realms of theory and practice is made, which then interrogates the adequacy of the epistemological approaches that have been employed in previously published simulation-based research. These insights may help in informing a normative composition of an adequate epistemological framework within which computer simulation-based building science research may be conducted.

1. Introduction

Computer simulations are widely used within the area of building science research. Building science research deals with the physical phenomena that affect buildings, including heat and mass transfer, lighting and acoustic transmission. Among other purposes, computer simulations may be used to build theory by way of prediction, retrodiction and explanation. They may also be used to test existing theories. The use of simulations in lieu of conventional experiments may be necessitated by the unavailability of data, which the simulations may then be called upon to provide [1].
The wide usage of computer simulations, however, is characterized by a divergence in thought on the composition of an epistemological framework that may provide guidance for their deployment in research. This lack of consensus may lead to inconsistent use of computer simulations in building science research, ultimately leading to an erosion of trust and confidence in the knowledge that originates from the simulations.
This paper seeks to achieve three main objectives. First, it undertakes a fundamental review of the epistemology of computer simulations within the context of the philosophy of science. Second, it reviews the epistemological framework within which computer simulations are used in practice within the area of building science research. Third, it compares the insights obtained from the realms of theory and practice, interrogating the adequacy of the epistemological approaches that have been employed in previously published simulation-based research. These insights may help in informing a normative composition of an adequate epistemological framework within which computer simulation-based building science research may be conducted.

2. The Ontology of Computer Simulations

According to Winsberg [2], a computer simulation might be understood from three perspectives. At the simplest, it might be understood as a step-by-step process for exploring the approximate behavior of a mathematical model, where such a model contains equations that may not be solved analytically. A typical example of an analytically unsolvable set of mathematical equations, whose approximate solutions might be obtained by simulation, is the set of Reynolds-Averaged Navier–Stokes (RANS) equations, which are used in Computational Fluid Dynamics (CFD) analyses to study the flow of fluids in buildings. They are conservation equations governing the continuity, momentum and energy of the fluid flow.
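For illustration only, a standard incompressible form of these conservation equations may be sketched as follows, with overbars denoting Reynolds-averaged quantities; the precise form deployed in any particular CFD study will depend on the turbulence closure and source terms adopted:

```latex
% Illustrative incompressible RANS conservation equations (standard form)
\begin{align}
  \frac{\partial \overline{u}_i}{\partial x_i} &= 0
  && \text{(continuity)} \\
  \frac{\partial \overline{u}_i}{\partial t}
  + \overline{u}_j \frac{\partial \overline{u}_i}{\partial x_j}
  &= -\frac{1}{\rho}\frac{\partial \overline{p}}{\partial x_i}
  + \nu \frac{\partial^2 \overline{u}_i}{\partial x_j\,\partial x_j}
  - \frac{\partial \overline{u_i' u_j'}}{\partial x_j}
  && \text{(momentum)} \\
  \frac{\partial \overline{T}}{\partial t}
  + \overline{u}_j \frac{\partial \overline{T}}{\partial x_j}
  &= \alpha \frac{\partial^2 \overline{T}}{\partial x_j\,\partial x_j}
  - \frac{\partial \overline{u_j' T'}}{\partial x_j}
  && \text{(energy)}
\end{align}
```

The averaged Reynolds stress and turbulent flux terms leave the system unclosed, which is why approximate numerical solution, rather than analytical solution, is the practical route.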
A computer simulation may also be understood as a method for studying systems that are best modeled with analytically unsolvable equations. This method may include the process of choosing a model and finding a way of implementing it, drawing inferences from the outputs and validating the model relative to the target system. A computer simulation may also be understood as just one type of a simulation, where the latter may refer to any attempt to learn about the behavior of a phenomenon through inferences drawn from another, where the two phenomena share similar dynamical behavior. Where this attempt is undertaken digitally, a computer simulation is conducted.
Lenhard [3] agrees with Winsberg [2] when he takes a quasi-simplistic, typological approach and describes computer simulation as a type of mathematical modeling, one that comprises a number of interdependent components including experimentation, visualization and adaptability. While Lenhard [3] acknowledges that computer simulations constitute a type of mathematical modeling, he argues that the simulations are a fundamental transformation, rather than a mere extension, of mathematical modeling, and that this fundamental transformation requires its own characterization within the context of the philosophy of science. Humphreys [4] further affirms this philosophical novelty of simulations. He notes that computer simulations raise novel issues in the philosophy of science principally as a result of the uniqueness of their epistemological and methodological approaches, which constitute a departure from the traditional anthropocentric epistemologies where the focus has been on the understanding of human knowledge.
The methodological dimension to Winsberg’s [2] description of simulation is echoed by Ord-Smith and Stephenson [5], who describe simulation as a technique by which an understanding of the behavior of a physical system is obtained by making measurements or observations of the behavior of a model representing that system. From this definition, two key aspects come to the fore, namely understanding and representation. A simulation is capable of enabling understanding by way of representation. Parker [6] similarly describes simulation as a time-ordered sequence of states that serves as a representation of some other time-ordered sequence of states, such that at each point in the former sequence, the simulating system’s having certain properties represents the target system’s having certain properties.
Bennett [7] describes simulation as the process of formulating a suitable mathematical model of a system, the development of a computer program to solve the equations of the model and the operation of the computer to determine values for the system variables. This description presents a three-tiered hierarchy as being characteristic of computer simulations, namely the formulation of a mathematical model representation of a system, the development of a computer program to solve the model equations and, finally, the operation of the computer to obtain insights into the system’s dynamics.
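By way of illustration only, the following minimal sketch walks through Bennett's three tiers for a hypothetical single-zone lumped-capacitance building model; the model, the parameter values and the function names are assumptions made here for the example and are not drawn from the reviewed literature.

```python
# Tier 1: a mathematical model of the system (hypothetical single-zone RC model)
#   C * dT/dt = (T_out - T) / R + Q
R = 0.05    # thermal resistance between zone and outdoors [K/W]  (assumed value)
C = 2.0e6   # zone thermal capacitance [J/K]                      (assumed value)
Q = 500.0   # internal heat gain [W]                              (assumed value)

def dT_dt(T, T_out):
    """Rate of change of the zone air temperature [K/s]."""
    return ((T_out - T) / R + Q) / C

# Tier 2: a computer program that solves the model equation numerically
def simulate(T0, T_out, dt=60.0, hours=24):
    """Explicit Euler integration of the zone temperature over a given period."""
    T, history = T0, []
    for _ in range(int(hours * 3600 / dt)):
        T += dt * dT_dt(T, T_out)
        history.append(T)
    return history

# Tier 3: operation of the computer to determine values of the system variables
if __name__ == "__main__":
    trace = simulate(T0=18.0, T_out=2.0)
    print(f"Zone temperature after 24 h: {trace[-1]:.2f} degrees C")
```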

3. Conditions for Use of Computer Simulations

Simulations can be used for a number of scientific purposes, including prediction, explanation, retrodiction and the proving of theories [8,9]. The use of simulations for prediction consists in the act of making a claim that a particular event will occur with a certain probability in the near future [9]. Simulations may also be used for purposes of explaining phenomena, in which case they may seek to provide any of three types and levels of explanation, namely full explanation, partial explanation and potential explanation [9]. Finally, simulations can be used for proving, or disproving, existing theories.
According to Winsberg [1], simulations may be necessitated where there is an unavailability of data, which the simulations are thus called upon to provide, replacing conventional experiments and observations as sources of data about the world. This may particularly occur under two conditions, namely where the mathematical models of interest cannot be resolved analytically and where natural experimentation is inappropriate for practical reasons or unattainable for physical reasons [10].

4. Epistemological Framework for Building and Testing Theory Using Computer Simulations

The epistemological framework at the core of this study is a conceptual construct that must address the question of how acceptable theories can be built and tested using computer simulations. In addressing this question, the framework must provide insights into the nature of the knowledge that may be obtained through computer simulations. It is also important that the framework provide insights into the possible challenges that may be anticipated in the pursuit of the said theory and suggest means of dealing with those challenges. Adequate usage of computer simulations in the process of building and testing theory must then be evaluated on the basis of this framework.

4.1. Nature of Knowledge Obtained from Computer Simulations

The knowledge that is produced by computer simulations is the result of inferences that must exhibit three characteristics, namely that they must be downward, motley and autonomous [11]. The inferences must be downward in a way that demonstrates a top-down transition from high theory to particular phenomena. For instance, with regard to the earlier example of CFD simulations given in Section 2, the inferences must draw from a general understanding of the theory governing the flow of fluids in space and time. This general understanding must then be applied to a specific situation, such as a building whose indoor and outdoor wind environments need to be analyzed. The inferences must also be motley by way of drawing from a multiplicity of sources, such as the simulationist’s own knowledge of the simulation scenario, simulation best practice guidelines and previous simulation experiences, among others. Finally, the inferences must be autonomous such that, when they are made, there is enough ground to believe in them.
In spite of the understanding that mathematical modeling forms part of the core of computer simulations, the knowledge that is obtained from an evaluation of simulations is different from the kind of knowledge that follows from an evaluation of mathematical equations. In the latter, the transparency afforded by the formal presentation of the relationship between the mathematical terms enables understanding without the need for calculations. On the other hand, the understanding that follows from an evaluation of simulations is only feasible as a function of calculations. It is characterized by epistemic opacity, rendering it, to a large extent, pragmatic. This understanding holds only within a certain threshold about an actual solution, such that its use may best be suited to interventions and predictions rather than theoretical explanations [3]. How to set this threshold about the actual solutions becomes an important epistemic consideration that affects the significance of simulations within the realm of the philosophy of science. In a world where any given problem defined by a mathematical model can have a feasible solution, a satisfactory solution, an optimum solution or no solution at all, computer simulation mathematical models are primarily directed toward finding satisfactory solutions to practical problems [12].
There is wide disagreement on the nature of simulations, centering on whether they may be regarded as experiments or not, with far-reaching consequences for the kind of understanding that they provide. A fair amount of research work has sought to compare computer simulations and traditional experiments [13,14,15,16,17,18], with some researchers casting doubt on the epistemological power of the former relative to the latter on the basis of materiality. Morgan [16] suggests that inferences about target systems are more justified when the experimental and target systems are made of the same material than when they are made of different materials, as is the case in computer simulations. However, Parker [6] argues that, with respect to epistemic adequacy comparisons between the two, the focus on materiality is misplaced, as it is relevant similarity, and not materiality in general, that ultimately matters. Nonetheless, Parker [6] agrees with Morgan’s [16] suggestion that ontological equivalence provides epistemological power in so far as this suggestion remains consistent with the relevance of similarity.

4.2. Hierarchical Order of Computer Simulations

According to Winsberg [19], the core of the practice of computer simulations consists in the construction of a hierarchy of models. This hierarchy consists of mechanical models, dynamical models, computational models and phenomena models, in that order. The mechanical models place the phenomena of interest within an existing fundamental theoretical context. The dynamical models seek to localize the simulation within specific contexts of the phenomena under investigation and may include data on the appropriate boundary and initial conditions. The computational models are a discretized version of the mechanical models, through which continuous equations are solved numerically. The phenomena models constitute a higher-level representation of the real-world phenomena, complete with data that may need to be interpreted to enhance understanding of the phenomena. In essence, computer simulations take a tiered structure, comprising input conditions, mechanism conditions and output conditions [8].
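To make the distinction between the continuous and the computational levels of this hierarchy concrete, the following minimal sketch, built on assumed and purely illustrative values, discretizes a one-dimensional transient heat conduction equation for a wall; the boundary and initial conditions stand in for the dynamical-model content, while the finite-difference update is the computational model.

```python
# Continuous (mechanical) model: 1-D transient conduction, dT/dt = alpha * d2T/dx2
# Discretized (computational) model: explicit finite differences on a grid.
# All parameter values below are assumptions made for this illustration.
import numpy as np

alpha = 1.0e-6            # thermal diffusivity of the wall material [m^2/s]
L, nx = 0.2, 21           # wall thickness [m] and number of grid points
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha  # time step within the explicit stability limit

# Dynamical-model content: initial and boundary conditions for this scenario
T = np.full(nx, 20.0)     # wall initially at indoor temperature [degrees C]
T_in, T_out = 20.0, -5.0  # indoor and outdoor surface temperatures [degrees C]

# Computational model: march the discretized equation through one day
for _ in range(int(24 * 3600 / dt)):
    T[0], T[-1] = T_in, T_out                                   # boundary conditions
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])

print(f"Mid-wall temperature after 24 h: {T[nx // 2]:.2f} degrees C")
```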

4.3. Challenges of Computer Simulations

The epistemological trustworthiness of computer simulations has repeatedly been called into question in the literature [2]. This is perhaps due to the essence of the act of obtaining knowledge through simulations, which rests, among other things, on approximations, similarities and mathematical modeling. The approximations and similarities mean that simulations build theory by way of inductive claims to generalization. Since simulationists have direct access only to their own peculiar and limited set of experiences, it is difficult to understand how generalizations would be made beyond the simulationists’ empirical domain [20]. There have been questions about computer simulations’ degree of accuracy, about confidence in the inferences that are drawn from their explanations and about the reliability of simulation-based decisions. Lenhard [3] notes that these pervasive doubts about the epistemological adequacy of simulations might be the cause of their slower uptake in the sciences. Among others, these doubts have been advanced by those who hold the amplifier view of computer simulations as being merely an extension of mathematical modeling and thus inferior to theory, ultimately incapable of yielding any philosophical novelty of their own.

4.4. Validation, Verification and Robustness of Computer Simulations

In view of the epistemological challenges that computer simulations face, a number of strategies have been developed with an aim of evaluating the simulations’ adequacy and enhancing the confidence with which they are used in research. In general, such strategies take the form of validation, verification and a measure of robustness.
Validation and verification focus, respectively, on the output of the simulation and on the process by which solutions to the simulation model’s continuous equations are estimated [21]. Validation is the process of ensuring that the equations at the core of the simulation model accurately represent the target system [2]. Verification, on the other hand, entails an attempt at ensuring that the simulation’s numerical solution is close enough to an analytical solution. According to Sargent [22], validation and verification exercises are attempts at addressing concerns about the correctness of simulation models, with the former being a substantiation of the correctness of the model’s representation of the real world, and the latter a substantiation of the correctness of the fundamental components of the simulation model and their implementation.
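As an illustration of the verification idea, the following sketch reuses the hypothetical zone model introduced in Section 2 in a simplified case (no internal gains) for which a closed-form solution exists, and checks that the numerical estimate falls close enough to the analytical answer; the model, the parameter values and the error tolerance are all assumptions made for the example.

```python
# Verification sketch: compare a numerical solution against an analytical one.
#   Model (no internal gains): dT/dt = (T_out - T) / (R * C)
#   Analytical solution:       T(t) = T_out + (T0 - T_out) * exp(-t / (R * C))
import math

R, C = 0.05, 2.0e6            # assumed zone parameters, as in the earlier sketch
T0, T_out = 18.0, 2.0         # initial zone and outdoor temperatures [degrees C]
dt, t_end = 60.0, 24 * 3600   # time step [s] and simulated period [s]

# Numerical solution (explicit Euler)
T_num, t = T0, 0.0
while t < t_end:
    T_num += dt * (T_out - T_num) / (R * C)
    t += dt

# Analytical solution at the same time
T_exact = T_out + (T0 - T_out) * math.exp(-t_end / (R * C))

# Verification check: is the numerical estimate close enough to the analytic answer?
error = abs(T_num - T_exact)
print(f"numerical = {T_num:.4f}, analytical = {T_exact:.4f}, error = {error:.5f} K")
assert error < 0.05, "verification check failed: refine the time step"
```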
Another strategy that may be used in addressing the epistemological weakness of simulations is a measure of the robustness of the simulations. Robustness is a measure of how much value may be placed in a computer simulation as a function of the adequacy of the model’s representation of the target phenomenon under varying input conditions and parameters [9]. D’Arms et al. [23] suggest that a simulation result may be robust if it is achieved across a variety of different starting conditions and parameters. This may be seen to underscore the need for sensitivity analysis for simulation-based studies. Closely related to the measure of robustness is the practice of using falsifications whose success and reliability do not amount to the truth, but can enhance the credibility of simulations [1]. These falsifications can be implemented through the use of principles that do not purport to offer true accounts of the nature of the target phenomena [1]. The falsifications must be proven to have no impact on the simulation results by ensuring that the simulation remains robust despite deliberate variations over repeated runs [8].
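A minimal sketch of such a sensitivity check, again assuming the hypothetical zone model used above, might repeat the simulation under randomly perturbed input parameters and inspect the spread of the output; the perturbation range, the number of runs and the parameter values are all assumptions made for the illustration.

```python
# Robustness / sensitivity sketch: rerun the model under perturbed parameters
# and inspect how stable the output is. All values are assumed for illustration.
import random
import statistics

def final_temperature(R, C, Q, T_out=2.0, T0=18.0, dt=60.0, hours=24):
    """Run the hypothetical zone model and return the zone temperature at the end."""
    T = T0
    for _ in range(int(hours * 3600 / dt)):
        T += dt * ((T_out - T) / R + Q) / C
    return T

random.seed(1)
baseline = final_temperature(R=0.05, C=2.0e6, Q=500.0)

# Perturb each input parameter by up to +/- 10% over repeated runs
results = [
    final_temperature(
        R=0.05 * random.uniform(0.9, 1.1),
        C=2.0e6 * random.uniform(0.9, 1.1),
        Q=500.0 * random.uniform(0.9, 1.1),
    )
    for _ in range(100)
]

print(f"baseline = {baseline:.2f} C, mean = {statistics.mean(results):.2f} C, "
      f"standard deviation = {statistics.pstdev(results):.2f} K")
# A spread that is small relative to the effect under study suggests the result
# is robust to plausible variations in the inputs.
```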
Validation in simulations is closely linked with the problem of induction, where empiricism is necessary to justify a theory [24]. Typically, a match between the simulation results and those from empirical testing lends confidence to the predictions and explanations drawn from the simulations [9]. In the pursuit of validation, three positions have emerged in the literature, namely the justificationist or objectivist position, the anti-justificationist or relativist position, and a blend of the two [24]. The justificationist position seeks to eliminate aspects of human judgment from the validation process, limiting it to empiricism and rationalism. The anti-justificationist position, on the other hand, makes room for human judgment in the validation of simulations, noting the difficulty that exists for a simulationist to completely detach themselves and their judgments from the simulation process. Purely empirically oriented validation standards cannot be met in most instances [24].
In view of the major differences that exist as to what evidentiary requirement is necessary to validate computer simulations, Kleindorfer et al. [20] suggest that simulationists should be left free to decide which methodological validation regime they would like to use, so long as they do so in an ethical and professional manner. The wide latitude that this position accords the simulationists, however, might be problematic, especially in so far as scientific epistemology is concerned. On the other hand, the strict objectivist position on the validation of simulations proves insufficient, especially where empirical data characteristic of the phenomena under investigation either does not currently exist or may be hard to obtain. Strict empiricism also goes against the autonomous nature of simulations, as advanced by Winsberg [11]. In order to adequately address this problem of the unavailability of empirical data, Winsberg [19] suggests that the simulation process needs to be justified internally. The confidence that a simulationist places in the simulation results must derive from their knowledge of computers, the adequacy of their assumptions, their ability to calibrate models against other simulation tools and empirical results where possible, and their ability to make judgments about the degree of resemblance between different sets of data. On the same point, Winsberg [1] notes that one of the conditions that necessitates the usage of simulations is the unavailability of empirical data, and that the simulations are thus meant to replace experiments and observations as sources of data about the world. Where such data about the world is unavailable, it may not be possible to evaluate simulations simply by comparing them to the world; rather, their credibility can derive from the credentials of their underlying theory and the credentials of the techniques employed by the simulationists. Further, the simulation models must be supported by demonstrated evidence of adequate performance under given hardware and software environments [9].
With the verification of simulations, the problem is that simulations usually seek to solve problems whose analytical solutions are hard to obtain in the first place [2], which renders analytical verification highly intractable. Where there is an expectation that the simulation model’s code may be used in many subsequent studies, the verification can be streamlined to enable usage of the code in an off-the-shelf manner for different applications [21]. This may imply that where such is the case, as is typical when using commercially developed code, explicit verification may not be necessary for model evaluation. However, some process input assumptions may still need to be evaluated in the light of best practices that may have been developed to guide usage of the model’s code.

4.5. Conditions for the Failure of Computer Simulations

A number of conditions will lead to the epistemological failure of computer simulations. Simulations fail when they are unable to properly correspond with reality [8]. To simulate means to build a likeness, where the accuracy of the likeness constitutes one of the fundamental considerations in evaluating the simulation’s success [24]. Correspondence between simulations and their targets may range from nearly identical scale models to abstractions or idealizations [25]. However, on this point, it is noted that the notion of correspondence is highly plastic and relative, depending on the specific situation and on the particular aspects of the representation and the representandum [8]. The evaluation of correspondence may consider aspects of spatiotemporal relations, levels of abstraction and the relevance of representations [8].
Another condition for the failure of simulations is inadequate technique, including human error and faulty equipment [8]. Further, simulations may also fail by producing results that run counter to an existing favored theory. Such failure may, however, constitute a point of scientific progress, where the existing theory is successfully disproved. Inadequate representation of the conditions required by existing theory may also lead to the failure of simulations.
In guarding against failure while enhancing the epistemological adequacy of computer simulations, several authors [17,19,26] have suggested the use of the same, albeit subtly modified, strategies that Franklin [27,28,29] proposed for use in lending credence to experimental results. These strategies are presented in Table 1 below.
It is important to note that these strategies are neither meant to be exhaustive, nor is there any one or fixed combination of strategies that may always be necessary to enhance the credibility of computer simulations.

5. Epistemological Framework for Computer Simulations in the Practice of Building Science Research

In order to obtain insights into the epistemological framework within which computer simulations are used in practice within the realm of building science research, this study undertook a review of scholarly work that has been published in the Journal of Building Performance Simulation. This journal was selected on account of its primary interest in simulation-based research work within the built environment. For purposes of the review, a total of 39 research papers, published in the journal over a period of 13 years between 2008, when the journal’s first volume was published, and the present day in 2020, were analyzed, as shown in Table 2. These papers were selected following a systematic random sampling technique where three papers were randomly selected from each of the 13 years.
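A minimal sketch of this sampling procedure is given below; the paper identifiers and the per-year population sizes are placeholders invented for the illustration, not the journal's actual contents.

```python
# Sampling sketch: draw three papers at random from each of the 13 publication years.
# The paper identifiers below are placeholders invented for this illustration.
import random

random.seed(2020)
papers_by_year = {year: [f"{year}-paper-{i:02d}" for i in range(1, 31)]
                  for year in range(2008, 2021)}   # hypothetical population per year

sample = [paper
          for year in sorted(papers_by_year)
          for paper in random.sample(papers_by_year[year], k=3)]

print(len(sample))   # 13 years x 3 papers = 39 papers reviewed
```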
The review of the published scholarly work shows that within the realm of building science research, simulations are predominantly used for two purposes, namely prediction and the proving of theories. Where the simulations are used for the latter purpose, there is a desire to either prove or disprove an a priori theoretical proposition.
From the review, a consistent characterization of the epistemological framework for simulation-based research proves hard to obtain. There is wide observed variation in the strategies that different researchers routinely adopt in order to enhance the epistemological adequacy of their simulation-based research work. In spite of the variation, five distinct epistemic strategies appeared to occur prominently within the body of the published research: verification, experimental validation, sensitivity analysis, inter-model comparison and justification for the use of simulations over conventional experiments. These five strategies may be understood to form part of the epistemological framework within which computer simulations are presently used in practice within the area of building science research.
There were three different observed ways in which the verification approach was deployed. In rare instances where it was possible, researchers presented an analytical evaluation of the mathematical models at the core of the simulations. In other instances, they simply presented all the mathematical equations that form the core of the simulations. In yet other instances, researchers resorted to using commercial simulation codes that had undergone rigorous verification checks prior to their availability for public use. The experimental validation strategy was implemented through a comparison between the simulation results and experimental results, which were obtained either as part of the same study or from the literature. The sensitivity analysis, on the other hand, was undertaken by varying the simulation parameters, where a fairly stable output result, despite the parametric variations, indicated the solution’s independence of extraneous variables. The inter-model comparison involved two or more differently developed simulation codes being deployed to solve the same problem, with the expectation that the solutions would not vary widely. In some instances, a justification, more often than not a logistical one, was provided as part of the study for why simulations, and not conventional experiments, had been used.
The five epistemic strategies were used in 14 different combinations, as shown in Table 3. A given simulation-based study will typically deploy one of these 14 combinations.
Of these combinations, some appeared to enjoy wider usage than others, as shown in Figure 1 below.
The combination ‘AB’, featuring the use of verification and experimental validation, appeared to be the most widely used. This provides some indication that, in order for the epistemological framework for simulation-based research to be considered adequate, it ought to include verification checks and experimental validation at the minimum. Nonetheless, it was not uncommon to find the combination ‘E’, whose epistemological framework features only a justification for using simulations over conventional experiments, without any verification checks or experimental validation.
The five distinct epistemological strategies appeared with different prevalence rates across the 14 combinations presented in Figure 1. The verification strategy featured in the epistemological framework of more than 90% of the simulation-based research studies, while sensitivity analysis enjoyed the lowest prevalence rate, as shown in Figure 2. This suggests that many researchers regard verification as the single most important element of the epistemological framework for simulation-based research, followed by experimental validation.
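For illustration, a tally of this kind can be produced directly from records of the strategy combinations; the short list of combinations below is a hypothetical subset, not the full set of reviewed studies summarized in Table 2.

```python
# Prevalence-tally sketch: count how often each individual strategy appears
# across studies recorded by their combination string (cf. Table 2).
from collections import Counter

# Hypothetical subset of reviewed studies, one combination string per study
combinations = ["ABC", "AB", "ABCDE", "A", "ABC", "ABE", "AC", "AD", "AB", "E"]

counts = Counter(strategy for combo in combinations for strategy in combo)
n_studies = len(combinations)
for strategy in "ABCDE":
    share = 100 * counts[strategy] / n_studies
    print(f"Strategy {strategy}: used in {share:.0f}% of the sampled studies")
```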
The lack of a normative idealization of the adequate epistemological framework for computer simulations appears to be a long-running matter in the literature. Over the years, as computing power and simulation understanding and capabilities have grown, there have remained clear differences in terms of the number of strategies that researchers deployed in efforts to enhance the epistemic adequacy of their work, as can be seen in Figure 3 below.
Nonetheless, in general, a bigger proportion of researchers seems to have deployed, at the very least, two strategies in efforts to enhance the epistemological framework of their simulation-based work, as shown in Figure 4 below.
The review shows that the strategies that are put forward in the theoretical realm as being necessary for dealing with the trustworthiness challenges of computer simulations are indeed put to use in practice, albeit in an inconsistent manner, across research studies that have been undertaken within the area of building science. It is important to emphasize that each of the five epistemic strategies used in practice advances a distinct and important aspect of the simulation regime. For this reason, they ought to be used as a total package. Where only a few selected strategies are used, the epistemological adequacy of the simulations is potentially compromised. In the present scenario, where validation and verification appeared to be widely used to the exclusion of the other strategies, there might be problems with influences from extraneous variables and faulty models, which may affect the credibility of the simulation results. While the provision of a justification for the use of computer simulations in lieu of conventional experiments may be necessary to demonstrate that the latter may not be practicable, such a requirement may also be seen as an affront to the autonomy of the simulations as an equally robust source of data about the world.

6. Conclusions

The study found agreement between the theory of computer simulations and the practice of computer simulation-based research in terms of the purposes for which the computer simulations are used. However, while in both realms computer simulations were deployed to build and test theories, in practice the act of building theory was predominantly by way of prediction, leaving out other modes such as retrodiction and explanation.
There was a further observed agreement between theory and practice with regard to the epistemological framework for computer simulations. While the theory provides neither an exhaustive framework nor an evaluative standard for the adequacy of the framework, the practice of simulation-based building science research has shown some indication of the existence of an unwritten standard for evaluating the adequacy of the epistemological framework. From the many different epistemic strategy combinations that have been deployed in the research, it was observed that a significant body of the research work deployed, at the very least, two individual epistemic strategies, with verification being the single most widely deployed strategy, followed by experimental validation.
In spite of this unwritten standardization of the epistemological framework, there was wide variation in its usage across the body of research work. This suggests that computer simulation-based building science research proceeds under very varied epistemological frameworks, with correspondingly varied measures of the adequacy of the methodological strategies. Such inconsistency is problematic, as it may raise questions about the confidence and trust that may be placed in simulation-based research.
While an exhaustive list of epistemological framework strategies may be hard to obtain and equally hard to implement wholly in research, an evaluative minimum standard would significantly help to ensure consistency in the use of simulations, while lending credence to simulation-based research.
This study recommends that computer simulationists ensure that, at the very minimum, the verification, experimental validation, sensitivity analysis and inter-model comparison strategies are deployed as a total package, forming part of the epistemological framework for computer simulations. In the same manner, publishers would do well to require that, prior to the publication of computer simulation-based research, authors demonstrate adherence to at least this minimum evaluative standard.

Author Contributions

Conceptualization, A.K. and J.J.; methodology, A.K.; formal analysis, A.K.; investigation, A.K.; resources, J.J.; data curation, A.K.; writing—original draft preparation, A.K.; writing—review and editing, A.K.; supervision, J.J.; project administration, J.J.; funding acquisition, J.J. All authors have read and agreed to the published version of the manuscript.

Funding

This research study received no external funding. However, the article processing charges (APC) for this article were covered by the Open Access Subvention Fund (OASF) under the Virginia Tech University Libraries.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Winsberg, E. Models of success versus the success of models: Reliability without truth. Synthese 2006, 152, 1–19. [Google Scholar] [CrossRef]
  2. Winsberg, E. Computer simulation and the philosophy of science. Philos. Compass 2009, 4, 835–845. [Google Scholar] [CrossRef]
  3. Lenhard, J. Computer simulation. In The Oxford Handbook of Philosophy of Science; Oxford University Press: Oxford, UK, 2016. [Google Scholar]
  4. Humphreys, P. The philosophical novelty of computer simulation methods. Synthese 2008, 169, 615–626. [Google Scholar] [CrossRef] [Green Version]
  5. Ord-Smith, R.J.; Stephenson, J. Computer Simulation of Continuous Systems; Cambridge University Press: Cambridge, UK, 1975; Volume 3. [Google Scholar]
  6. Parker, W.S. Does matter really matter? Computer simulations, experiments, and materiality. Synthese 2008, 169, 483–496. [Google Scholar] [CrossRef] [Green Version]
  7. Bennett, A.W. Introduction to Computer Simulation; West Publishing Company: Eagan, MN, USA, 1974. [Google Scholar]
  8. Grim, P.; Rosenberger, R.; Rosenfeld, A.; Anderson, B.; Eason, R.E. How simulations fail. Synthese 2011, 190, 2367–2390. [Google Scholar] [CrossRef]
  9. Grüne-Yanoff, T.; Weirich, P. The philosophy and epistemology of simulation: A review. Simul. Gaming 2010, 41, 20–50. [Google Scholar] [CrossRef]
  10. Humphreys, P. Computer Simulations. In Extending Ourselves: Computational Science, Empiricism, and Scientific Method; Oxford University Press: Oxford, UK, 2004; pp. 105–135. [Google Scholar]
  11. Winsberg, E. Simulations, models, and theories: Complex physical systems and their representations. Philos. Sci. 2001, 68, S442–S454. [Google Scholar] [CrossRef] [Green Version]
  12. Paul, R.J.; Neelamkavil, F. Computer simulation and modeling. J. Oper. Res. Soc. 1987, 38, 1092. [Google Scholar] [CrossRef]
  13. Guala, F. The Methodology of Experimental Economics; Cambridge University Press: Cambridge, UK, 2005. [Google Scholar]
  14. Radder, H. The philosophy of scientific experimentation: A review. Autom. Exp. 2009, 1, 2. [Google Scholar] [CrossRef] [Green Version]
  15. Morgan, M.S. Model experiments and models in experiments. In Model-Based Reasoning; Springer: Boston, MA, USA, 2002; pp. 41–58. [Google Scholar]
  16. Morgan, M.S. Experiments versus models: New phenomena, inference and surprise. J. Econ. Methodol. 2005, 12, 317–329. [Google Scholar] [CrossRef]
  17. Winsberg, E. Simulated experiments: Methodology for a virtual world. Philos. Sci. 2003, 70, 105–125. [Google Scholar] [CrossRef] [Green Version]
  18. Angius, N. Qualitative models in computational simulative sciences: Representation, confirmation, experimentation. Minds Mach. 2019, 29, 397–416. [Google Scholar] [CrossRef]
  19. Winsberg, E. Sanctioning models: The epistemology of simulation. Sci. Context 1999, 12, 275–292. [Google Scholar] [CrossRef] [Green Version]
  20. Kleindorfer, G.B.; O’Neill, L.; Ganeshan, R. Validation in simulation: Various positions in the philosophy of science. Manag. Sci. 1998, 44, 1087–1099. [Google Scholar] [CrossRef] [Green Version]
  21. Parker, W.S. Franklin, Holmes, and the epistemology of computer simulation. Int. Stud. Philos. Sci. 2008, 22, 165–183. [Google Scholar] [CrossRef]
  22. Sargent, R.G. Verification and validation of simulation models. J. Simul. 2013, 7, 12–24. [Google Scholar] [CrossRef] [Green Version]
  23. D’Arms, J.; Batterman, R.W.; Gorny, K. Game theoretic explanations and the evolution of justice. Philos. Sci. 1998, 65, 76–102. [Google Scholar] [CrossRef]
  24. Kleindorfer, G.B.; Ganeshan, R. The philosophy of science and validation in simulation. In Proceedings of the 25th Winter Simulation Conference (WSC ’93); Association for Computing Machinery (ACM): New York, NY, USA, 1993; pp. 50–57. [Google Scholar]
  25. Frigg, R.; Hartmann, S. Models in Science. Available online: https://plato.stanford.edu/entries/models-science/ (accessed on 11 September 2020).
  26. Weissart, T. The Genesis of Simulation in Dynamics; Springer: New York, NY, USA, 1997. [Google Scholar]
  27. Franklin, A. The Neglect of Experiment; Cambridge University Press: Cambridge, UK, 1986. [Google Scholar]
  28. Gooding, D.; Pinch, T.; Schaffer, S. The Uses of Experiment: Studies in the Natural Sciences; Cambridge University Press: Cambridge, UK, 1989. [Google Scholar]
  29. Franklin, A.; Newman, R. Selectivity and discord: Two problems of experiment. Am. J. Phys. 2003, 71, 734–735. [Google Scholar] [CrossRef]
  30. Bourdoukan, P.; Wurtz, E.; Joubert, P.; Spérandio, M. Overall cooling efficiency of a solar desiccant plant powered by direct-flow vacuum-tube collectors: Simulation and experimental results. J. Build. Perform. Simul. 2008, 1, 149–162. [Google Scholar] [CrossRef]
  31. Jenkins, D. Using dynamic simulation to quantify the effect of carbon-saving measures for a UK supermarket. J. Build. Perform. Simul. 2008, 1, 275–288. [Google Scholar] [CrossRef]
  32. Ji, Y.; Cook, M.J.; Hanby, V.; Infield, D.G.; Loveday, D.L.; Mei, L. CFD modeling of naturally ventilated double-skin facades with Venetian blinds. J. Build. Perform. Simul. 2008, 1, 185–196. [Google Scholar] [CrossRef]
  33. Chvatal, K.; Corvacho, H. The impact of increasing the building envelope insulation upon the risk of overheating in summer and an increased energy consumption. J. Build. Perform. Simul. 2009, 2, 267–282. [Google Scholar] [CrossRef]
  34. Le, A.D.T.; Maalouf, C.; Mendonça, K.C.; Mai, T.H.; Wurtz, E. Study of moisture transfer in a double-layered wall with imperfect thermal and hydraulic contact resistances. J. Build. Perform. Simul. 2009, 2, 251–266. [Google Scholar] [CrossRef]
  35. Van Treeck, C.; Frisch, J.; Pfaffinger, M.; Rank, E.; Paulke, S.; Schweinfurth, I.; Schwab, R.; Hellwig, R.T.; Holm, A. Integrated thermal comfort analysis using a parametric manikin model for interactive real-time simulation. J. Build. Perform. Simul. 2009, 2, 233–250. [Google Scholar] [CrossRef]
  36. Hugo, A.; Zmeureanu, R.; Rivard, H. Solar combisystem with seasonal thermal storage. J. Build. Perform. Simul. 2010, 3, 255–268. [Google Scholar] [CrossRef]
  37. Sartipi, A.; Laouadi, A.; Naylor, D.; Dhib, R. Convective heat transfer in domed skylight cavities. J. Build. Perform. Simul. 2010, 3, 269–287. [Google Scholar] [CrossRef] [Green Version]
  38. Siva, K.; Lawrence, M.X.; Kumaresh, G.R.; Rajagopalan, P.; Santhanam, H. Experimental and numerical investigation of phase change materials with finned encapsulation for energy-efficient buildings. J. Build. Perform. Simul. 2010, 3, 245–254. [Google Scholar] [CrossRef]
  39. Johansson, D.; Bagge, H. Simulating space heating demand with respect to non-constant heat gains from household electricity. J. Build. Perform. Simul. 2011, 4, 227–238. [Google Scholar] [CrossRef]
  40. Laouadi, A. The central sunlighting system: Development and validation of an optical prediction model. J. Build. Perform. Simul. 2011, 4, 205–226. [Google Scholar] [CrossRef] [Green Version]
  41. Wetter, M. Co-simulation of building energy and control systems with the Building Controls Virtual Test Bed. J. Build. Perform. Simul. 2011, 4, 185–203. [Google Scholar] [CrossRef] [Green Version]
  42. Mahmoud, A.M.; Ben-Nakhi, A.; Ben-Nakhi, A.; Alajmi, R. Conjugate conduction convection and radiation heat transfer through hollow autoclaved aerated concrete blocks. J. Build. Perform. Simul. 2012, 5, 248–262. [Google Scholar] [CrossRef]
  43. Saber, H.H.; Maref, W.; Elmahdy, H.; Swinton, M.C.; Glazer, R. 3D heat and air transport model for predicting the thermal resistances of insulated wall assemblies. J. Build. Perform. Simul. 2012, 5, 75–91. [Google Scholar] [CrossRef]
  44. Wang, X.; Kendrick, C.; Ogden, R.; Baiche, B.; Walliman, N. Thermal modeling of an industrial building with solar reflective coatings on external surfaces: Case studies in China and Australia. J. Build. Perform. Simul. 2012, 5, 199–207. [Google Scholar] [CrossRef]
  45. Barreira, E.; Delgado, J.M.P.Q.; Ramos, N.; De Freitas, V.P. Exterior condensations on façades: Numerical simulation of the undercooling phenomenon. J. Build. Perform. Simul. 2013, 6, 337–345. [Google Scholar] [CrossRef]
  46. Maurer, C.; Baumann, T.; Hermann, M.; Di Lauro, P.; Pavan, S.; Michel, L.; Kuhn, T.E. Heating and cooling in high-rise buildings using facade-integrated transparent solar thermal collector systems. J. Build. Perform. Simul. 2013, 6, 449–457. [Google Scholar] [CrossRef]
  47. Villi, G.; Peretti, C.; Graci, S.; De Carli, M. Building leakage analysis and infiltration modeling for an Italian multi-family building. J. Build. Perform. Simul. 2013, 6, 98–118. [Google Scholar] [CrossRef]
  48. Berardi, U. Simulation of acoustical parameters in rectangular churches. J. Build. Perform. Simul. 2013, 7, 1–16. [Google Scholar] [CrossRef]
  49. Geva, A.; Saaroni, H.; Morris, J. Measurements and simulations of thermal comfort: A synagogue in Tel Aviv, Israel. J. Build. Perform. Simul. 2013, 7, 233–250. [Google Scholar] [CrossRef]
  50. Kalagasidis, A.S. A multi-level modeling and evaluation of thermal performance of phase-change materials in buildings. J. Build. Perform. Simul. 2013, 7, 289–308. [Google Scholar] [CrossRef] [Green Version]
  51. Mahyuddin, N.; Awbi, H.B.; Essah, E.A. Computational fluid dynamics modeling of the air movement in an environmental test chamber with a respiring manikin. J. Build. Perform. Simul. 2014, 8, 359–374. [Google Scholar] [CrossRef]
  52. Su, C.-H.; Tsai, K.-C.; Xu, M.-Y. Computational analysis on the performance of smoke exhaust systems in small vestibules of high-rise buildings. J. Build. Perform. Simul. 2014, 8, 239–252. [Google Scholar] [CrossRef]
  53. Wang, J.; Chow, T.-T. Influence of human movement on the transport of airborne infectious particles in hospital. J. Build. Perform. Simul. 2014, 8, 205–215. [Google Scholar] [CrossRef]
  54. Georges, L.; Skreiberg, Ø. Simple modeling procedure for the indoor thermal environment of highly insulated buildings heated by wood stoves. J. Build. Perform. Simul. 2016, 9, 663–679. [Google Scholar] [CrossRef]
  55. Muslmani, M.; Ghaddar, N.; Ghali, K. Performance of combined displacement ventilation and cooled ceiling liquid desiccant membrane system in Beirut climate. J. Build. Perform. Simul. 2016, 9, 648–662. [Google Scholar] [CrossRef]
  56. Le, A.D.T.; Maalouf, C.; Douzane, O.; Promis, G.; Mai, T.H.; Langlet, T. Impact of combined moisture buffering capacity of a hemp concrete building envelope and interior objects on the hygrothermal performance in a room. J. Build. Perform. Simul. 2016, 9, 589–605. [Google Scholar] [CrossRef]
  57. Corbin, C.D.; Henze, G. Predictive control of residential HVAC and its impact on the grid. Part II: Simulation studies of residential HVAC as a supply following resource. J. Build. Perform. Simul. 2016, 10, 1–13. [Google Scholar] [CrossRef]
  58. Jones, A.; Finn, D. Co-simulation of a HVAC system-integrated phase change material thermal storage unit. J. Build. Perform. Simul. 2016, 10, 313–325. [Google Scholar] [CrossRef]
  59. Kubilay, A.; Carmeliet, J.J.; Derome, D. Computational fluid dynamics simulations of wind-driven rain on a mid-rise residential building with various types of facade details. J. Build. Perform. Simul. 2016, 10, 125–143. [Google Scholar] [CrossRef]
  60. Brideau, S.A.; Beausoleil-Morrison, I.; Kummert, M. Above-floor tube-and-plate radiant floor model development and validation. J. Build. Perform. Simul. 2017, 11, 449–469. [Google Scholar] [CrossRef]
  61. Mortada, A.; Choudhary, R.; Soga, K. Multi-dimensional simulation of underground subway spaces coupled with geoenergy systems. J. Build. Perform. Simul. 2018, 11, 517–537. [Google Scholar] [CrossRef] [Green Version]
  62. Rakotomahefa, T.M.J.; Wang, F.; Zhang, T.; Wang, S. Zonal network solution of temperature profiles in a ventilated wall module. J. Build. Perform. Simul. 2017, 11, 538–552. [Google Scholar] [CrossRef]
  63. Barz, T.; Emhofer, J.; Marx, K.; Zsembinszki, G.; Cabeza, L.F. Phenomenological modeling of phase transitions with hysteresis in solid/liquid PCM. J. Build. Perform. Simul. 2019, 12, 770–788. [Google Scholar] [CrossRef] [Green Version]
  64. Ralph, B.; Carvel, R.; Floyd, J. Coupled hybrid modeling within the Fire Dynamics Simulator: Transient transport and mass storage. J. Build. Perform. Simul. 2019, 12, 685–699. [Google Scholar] [CrossRef]
  65. Van Kenhove, E.; De Backer, L.; Janssens, A.; Laverge, J. Simulation of Legionella concentration in domestic hot water: Comparison of pipe and boiler models. J. Build. Perform. Simul. 2019, 12, 595–619. [Google Scholar] [CrossRef] [Green Version]
  66. Filipsson, P.; Trüschel, A.; Gräslund, J.; Dalenbäck, J.-O. Modeling of rooms with active chilled beams. J. Build. Perform. Simul. 2020, 13, 409–418. [Google Scholar] [CrossRef]
  67. Sardoueinasab, Z.; Yin, P.; O’Neal, D. Energy modeling and analysis of variable airflow parallel fan-powered terminal units using Energy Management System (EMS) in EnergyPlus. J. Build. Perform. Simul. 2019, 13, 1–12. [Google Scholar] [CrossRef]
  68. Mohamed, S.; Buonanno, G.; Massarotti, N.; Mauro, A. Ultrafine particle transport inside an operating room equipped with turbulent diffusers. J. Build. Perform. Simul. 2020, 13, 443–455. [Google Scholar] [CrossRef]
Figure 1. Prevalence of Epistemic Approach Combinations used in Research Studies between 2008 and 2020, where A = Verification, B = Experimental Validation, C = Sensitivity Analysis, D = Inter-model Comparison, E = Justification for use of simulations over conventional experiments.
Figure 2. Prevalence of Individual Epistemic Approaches in Different Epistemic Approach Combinations, where A = Verification, B = Experimental Validation, C = Sensitivity Analysis, D = Inter-model Comparison, E = Justification for use of simulations over conventional experiments.
Figure 3. Average Number of Epistemic Strategies Used in Research Studies between 2008 and 2020.
Figure 4. Overall Number of Epistemic Strategies in Research Studies between 2008 and 2020.
Table 1. Strategies for Enhancing the Epistemological Power of Computer Simulations. Source: [21].
Experimental Evaluation Strategies Proposed by Franklin | Simulation Model Evaluation Strategies | Simulation Code Evaluation Strategies
Apparatus gives other results that match known results | Simulation output fits closely enough with various observational data | Estimated solutions fit closely enough with analytic and/or other numerical solutions
Apparatus responds as expected after intervention on the experimental system | Simulation results change as expected after intervention on substantive model parameters | Solutions change as expected after intervention on algorithm parameters
Capacities of apparatus are underwritten by well-confirmed theories | Simulation model is constructed using well-confirmed theoretical assumptions | Solution method is underwritten by sound mathematical theorizing and analysis
Experimental results are replicated in other experiments | Simulation results are reproduced in other simulations or in traditional experiments | Solutions are produced using other pieces of code
Plausible sources of significant experimental error can be ruled out | Plausible sources of significant modeling error can be ruled out | Plausible sources of significant mathematical/computational error can be ruled out
Table 2. Epistemological Framework in Simulation-Based Research Studies.
Year | Research Study | Objective | Epistemological Framework (A | B | C | D | E)
2008 | [30] | Prediction | Yes | Yes | Yes | None | None
2008 | [31] | Prediction | Yes | Yes | None | None | None
2008 | [32] | Prediction | Yes | Yes | Yes | Yes | Yes
2009 | [33] | Prove theory | Yes | None | None | None | None
2009 | [34] | Prediction | Yes | Yes | Yes | None | None
2009 | [35] | Prediction | Yes | Yes | None | None | Yes
2010 | [36] | Prediction | Yes | None | Yes | None | None
2010 | [37] | Prediction | Yes | None | None | Yes | None
2010 | [38] | Prediction | Yes | Yes | None | None | None
2011 | [39] | Prediction and Prove theory | None | None | None | None | Yes
2011 | [40] | Prove theory | Yes | Yes | None | Yes | None
2011 | [41] | Prove theory | Yes | None | None | None | None
2012 | [42] | Prediction | Yes | None | Yes | Yes | None
2012 | [43] | Prove theory | Yes | Yes | None | Yes | None
2012 | [44] | Prove theory | Yes | None | None | None | None
2013 | [45] | Prove theory | Yes | Yes | None | None | None
2013 | [46] | Prediction | Yes | None | None | Yes | None
2013 | [47] | Prediction and Prove theory | Yes | Yes | None | None | Yes
2014 | [48] | Prediction | Yes | Yes | None | None | None
2014 | [49] | Prove theory | Yes | Yes | None | None | None
2014 | [50] | Prediction | Yes | Yes | None | None | Yes
2015 | [51] | Prediction | Yes | Yes | Yes | None | Yes
2015 | [52] | Prediction | Yes | None | Yes | None | Yes
2015 | [53] | Prediction | Yes | Yes | None | None | None
2016 | [54] | Prove theory | Yes | Yes | None | None | None
2016 | [55] | Prediction | Yes | Yes | None | None | None
2016 | [56] | Prediction | Yes | Yes | Yes | None | None
2017 | [57] | Prediction | Yes | None | None | Yes | None
2017 | [58] | Prediction | Yes | Yes | None | None | None
2017 | [59] | Prove theory | Yes | Yes | Yes | None | Yes
2018 | [60] | Prove theory | Yes | Yes | Yes | Yes | None
2018 | [61] | Prove theory | Yes | Yes | None | None | None
2018 | [62] | Prove theory | Yes | Yes | None | None | None
2019 | [63] | Prediction | Yes | Yes | None | None | None
2019 | [64] | Prove theory | Yes | None | None | None | None
2019 | [65] | Prove theory | Yes | Yes | Yes | Yes | None
2020 | [66] | Prediction | Yes | Yes | None | None | None
2020 | [67] | Prove theory | Yes | None | None | None | None
2020 | [68] | Prediction | Yes | Yes | Yes | None | Yes
A = Verification, B = Experimental Validation, C = Sensitivity Analysis, D = Inter-model Comparison, E = Justification for use of simulations over conventional experiments.
Table 3. Epistemic Strategy Combinations.
Combination | Description
ABC | Verification, experimental validation, sensitivity analysis
AB | Verification, experimental validation
ABCDE | Verification, experimental validation, sensitivity analysis, inter-model comparison, justification for use of simulations over conventional experiments
A | Verification
ABE | Verification, experimental validation, justification for use of simulations over conventional experiments
AC | Verification, sensitivity analysis
AD | Verification, inter-model comparison
E | Justification for use of simulations over conventional experiments
ABD | Verification, experimental validation, inter-model comparison
ACD | Verification, sensitivity analysis, inter-model comparison
ABCE | Verification, experimental validation, sensitivity analysis, justification for use of simulations over conventional experiments
ACE | Verification, sensitivity analysis, justification for use of simulations over conventional experiments
ABCD | Verification, experimental validation, sensitivity analysis, inter-model comparison
ABE | Verification, experimental validation, justification for use of simulations over conventional experiments
A = Verification, B = Experimental Validation, C = Sensitivity Analysis, D = Inter-model Comparison, E = Justification for use of simulations over conventional experiments.
