On Decision Makers' Perceptions of What an Ecological Computer Model Is, What It Does, and Its Impact on Limiting Model Acceptance

Environmental decision makers are required to understand complex ecological processes, and ecological computer models are designed to facilitate this understanding. A set of interviews reveals three main perceptions affecting senior environmental decision makers' trust in ecological computer models as decision facilitation tools: an ecological computer model is perceived as (i) a 'black box', (ii) a processor of poorly documented, sparse and out-of-date input data, and (iii) a tool whose sensitivity to model parameters enables manipulation to produce desired outcomes justifying pre-conceived decisions. This leads to a lack of trust towards both ecological computer models and model users, including other scientists and decision makers. Model acceptance appears to depend on the amount, currency and geographical origin of input data. This is at odds with modellers' communication style, which typically places more emphasis on highlighting the ecological computer model's features and performance than on describing the input data. Developing 'big data' capabilities could deliver the large, real-time, local data that may enhance acceptance. However, the size and complexity of 'big data' require automated pre-processing, using models and algorithms that are even more inscrutable than current ecological computer models. Future trust in ecological computer models will likely depend on how this dilemma is resolved, which is likely to require improved communication between modellers and decision makers.


Introduction
At its core, effective decision making aimed at supporting environmental sustainability requires an understanding of the interplay of multiple processes interacting at different scales to promote desirable outcomes [1]. Some decision makers believe that ecological computer models are an effective tool to understand these complex interplays. Other decision makers are more sceptical. To what extent do these different perceptions depend on the understanding of what an ecological computer model is and how it functions? When a decision maker talks about an ecological computer model, what mental representation does he or she employ? A machine? A diagram? A picture? Does this representation affect the decision maker's willingness to account for model results in the decision making process? We explored these issues in a series of interviews with senior environmental decision makers involved in a range of large environmental projects addressing the sustainability of marine and coastal ecological systems.
The decision makers were scientists with little to no technical background in modelling. The interviews provided some novel insights into how to improve communication in relation to acceptance of ecological computer models as decision facilitation tools.
There are many well-reasoned arguments both for and against the use of computer models to assist complex decision making [2,3]. Almost any feature of ecological computer models can be perceived as a strength or a weakness depending on the underlying attitude towards modelling. For instance, simple ecological computer models can help focus on the core processes at play and abstract away confounding effects. However, focusing on the core components of a problem could be perceived as dismissing crucial components of the problem's actual complexity. Alternatively, large complex ecological computer models enable detailed study of the interplay between multiple processes at a level that far exceeds the cognitive ability of decision makers and other experts, but they may also be perceived as inscrutable and untestable. Ecological computer models can account for uncertainty and provide probabilistic assessment of alternative future events, but may also be perceived as prone to external manipulation [4].
Among other factors, reluctance to employ model outputs to guide environmental decision making may arise from poor understanding of modelling and a lack of trust towards ecological computer models and modellers [5][6][7]. Decision makers may expect modellers to provide convincing evidence that model results are reliable before accounting for them in the decision making process. Some past modelling projects have used various communication strategies in an attempt to both pre-empt and address lack of understanding and lack of trust in the results of ecological computer models. Such strategies have included: written and oral communication in non-technical language, conceptual model visualisation, interactive workshops, model demonstrations and hands-on training. These initiatives have produced mixed results in terms of acceptance of ecological computer models [8,9]. As a result, we decided to further explore the relation between perceptions, understanding and acceptance of ecological computer models in environmental decision making.

Material and Methods
To explore perceptions of ecological computer models, we conducted semi-structured interviews with 17 senior decision makers in two phases. Phase one included eight interviews regarding ecological computer models generally, and phase two included nine interviews focused specifically on marine ecological models. In phase one, we conducted separate in-depth interviews with eight senior decision makers (RP in the tables below) associated with a range of complex environmental projects that have involved the use of ecological computer models. Participants were scientists, each from a different area or region, employed by key state government agencies involved in environmental decision making in marine or terrestrial environments in Western Australia. The participants' experience in environmental management ranged from eight to 40 years. The interview questions were open ended in order to minimise imposing our own (i.e., the researchers') assumptions on the participants' responses [10,11]. The questions aimed to gather information on perceptions of what an ecological computer model is and how it works. Questions also sought to identify beliefs about the positive and negative aspects of using ecological computer models to assist decision making in environmental projects, drawing on the method applied in [12,13].
Phase two applied the same in-depth interview method and questions as phase one. The nine phase two participants were specifically associated with fishery, ecology and marine conservation projects in Western Australia that used marine ecological computer models. Phase two interview participants included four managers and five researchers (M and RS in the tables below, respectively) from universities and state government institutions with active research programs in marine ecology and conservation. All interviews were carried out individually. The distribution of years of experience among the M and RS interviewed was 33% with less than 10 years (n = 3); 55% with 10 to 20 years (n = 5); and 11% with more than 20 years working in the field of marine ecology (n = 1). All nine participants interviewed in phase two were considered "experts"; therefore, no weighting by experience was applied to the answers provided. The template used to guide the interviews in phase two is available in the Supplementary Material.
Interviews were audio recorded and detailed notes were taken. Recordings were transcribed. Responses were tabulated according to participant and key issues raised. The key issues were identified and tabulated independently by each of the authors. We compared and discussed the results of the tabulation, and common themes, indicating general perceptions, were identified based on common agreement [14].
Ecological computer models used in environmental management vary considerably in scope, size and complexity, ranging from fairly simple, data-driven, fitted statistical models to very large ecosystem-based models that include the dynamics of large food webs interacting with human activities [8]. Because one of the main purposes of this work was to explore what stakeholders understand a model to be, in phase one we did not specify what type of ecological computer models were the focus of the interviews. In phase two, the interview focused implicitly on large marine ecological computer models in the class of Ecopath with Ecosim [15] and Atlantis [16]. Both models are used to simulate trophic interactions in marine ecosystems under internal dynamics, external forcing in the form of geophysical processes such as climate change, and human processes such as fishing, tourism and management decisions.

Results
The issues and subsequent themes identified in the phase one and two interviews aligned closely in terms of perceptions of ecological computer models generally and the more specific context of marine ecological computer models.The interviews highlighted three themes regarding general perceptions about using ecological computer models in decision making:

• An ecological computer model is perceived as a 'black box' whose functioning is poorly understood.
• Poor quality input data makes ecological computer models unreliable. Data may be poorly documented, sparse, out-of-date, or collected in a location of poor ecological and geographical relevance.
• Model results may not be trusted because ecological computer models can be manipulated to produce any outcome.
Each of these three main perceptions is explained in more detail in the following sections.

A Computer Model is a 'Black Box'
Most participants found it difficult to describe specifically what an ecological computer model is or how it works (Table 1). An ecological computer model is thus perceived as a black box, which processes an input to generate an output, using procedures that are complex and difficult to comprehend. Lack of understanding of the internal workings of an ecological computer model may contribute to lack of trust in, and thus acceptance of, the model output.

RP7: "Computer modellers use as much information, or data, as they can get their hands on, to make predictions about different scenarios going into the future."
M3: "Models can get very complicated very quickly. People [are] unable to understand all the linkages in a complicated system."

Poor Quality Input Data Makes Computer Models Unreliable
The interview participants generally perceived that ecological computer model outputs are not solely a property of the model itself, but also of the quality and reliability of its input. A common response to the question 'What do you perceive to be the main disadvantages of modelling?' referred explicitly to the quality of input data (Table 2). The accuracy, comprehensiveness and currency of input were perceived as key aspects influencing the acceptance of ecological computer modelling for decision making on complex ecological problems. For large marine ecological computer models in the class of Ecopath with Ecosim and Atlantis, input data include information on the biological properties of a number of marine species (abundance, growth, mortality, etc.), their diets (which determine trophic relations in the model), geophysical properties of the modelled marine environment, and human pressures in the form of fishing, tourism footprint and management initiatives. Participants perceived that ecological computer models were flexible and sensitive to assumptions and to related adjustments of parameters that can influence model outputs. The ability to adjust how an ecological computer model processes input data was perceived as a means of manipulation to justify any pre-conceived decision (Table 3). This appears to be underpinned by the belief that a skilled modeller can tune a model's parameters to produce any result a decision maker may want to see, even if input data are current, comprehensive and accurate.

Lack of Trust: Models Can Be Manipulated to Produce Any Outcome
RP5: "I've seen people in senior positions look at a model, disagree with the outputs and say that it's providing the wrong scenario and then to ignore the whole modelling rather than to tweak the inputs and make them more accurate."

Discussion
Ecological computer model acceptance in decision making appears to be influenced by three general perceptions: that models are a 'black box' and too complicated to understand, that they may rely on poor input data, and that they could be manipulated to produce a desirable output. The three main themes highlight issues of a lack of trust in ecological computer models, in modellers, and in decision makers using models to justify decisions. Our findings align with and build on previous work as in [4,5,7]. Using ecological computer models to inform decisions can thus be perceived by decision makers as risky because of the perceived lack of transparency and, hence, of the ability to scrutinise the decision rationale.

An Issue of Communication about Models?
A lack of trust in and acceptance of ecological computer models indicates a potential weakness in communication between modellers and decision makers. Ecological computer modellers tend to focus a considerable part of the communication effort on the model itself [8,9]. This focus is based on the knowledge-gap assumption that in order to accept an ecological computer model, decision makers need to understand what the model does and how it does it. This is not surprising, since for many modellers a model is a simplified representation of the real world and its 'quality' relates to how effective this representation is. A focus on the model itself can work to address the 'black box' perception. However, our interviews suggest that this approach may not address what is important to decision makers (input data quality, and model assumptions and manipulation) as part of the general trust in, and acceptance of, ecological computer models.
The concern about trustworthiness of ecological computer models highlights a further item of potential misunderstanding between modellers and decision makers. Tuning input parameters to generate different model outputs can be interpreted as (a) a key function of a model, which allows for exploring alternative scenarios (as a function of natural variability and human intervention); or (b) an avenue for manipulation, allowing for the generation of any output a modeller or a decision maker desires. In the first case, variability in model output is seen as a feature of the real system which the model reproduces; in the second, as an idiosyncrasy of the model, which can be used to provide an appearance of objectivity to what in fact may be a pre-determined, value-laden choice [17]. In the first case, a modeller can help the engagement process by communicating, explaining and addressing model uncertainty. In the second case, transparency can be at least partially addressed in terms of input data and model validation.
Figure 1 summarises the issues raised in this paper by focusing on the perception of where in a modelling process the highest uncertainty resides, according to modellers (dark bars) and to stakeholders and decision makers (light bars). Modellers are deeply aware of uncertainty in input data and model parameterisation (tall dark bar on the left). Because of their professional expertise and possible professional bias, they believe that ecological computer models are reasonable representations of real ecological processes, as shown by the terminology they use to describe them ("micro-cosmos", "virtual ecosystem", "virtual laboratory") [18]. As a result, ecological computer models (short dark uncertainty bar in the middle) are believed to constrain the uncertainty arising from input data, by enforcing ecological and physical consistency on the model results. The uncertainty of model output (short dark bar on the right) is thus mostly constrained by the model. Decision makers are sceptical of ecological computer models, as discussed above (tall light uncertainty bar in the middle). As a result, they believe that only input data can constrain the uncertainty arising from model use, by enforcing consistency between model and observations. The uncertainty of model output (light bar on the right) is mostly constrained by input data. Consequently, in communication about modelling projects, modellers may focus on the ecological computer model while decision makers may expect the focus to be on input data.

Addressing Concerns about Computer Models
As mentioned in the Introduction, scepticism about computer models and concerns about their use in environmental decision making are not shared only by decision makers and stakeholders but also by some scientists and modellers [19]. Ecological modellers have long been aware of this challenge and have tried to address it by proposing standards for model description, validation, calibration, parameterisation and data use. One such attempt, TRACE [20,21], builds on an analogy between ecological computer models and general laboratory equipment: just as both scientists and decision makers expect sophisticated instrumentation to come with a manual, instructions and certification of quality and accuracy, so should they expect similar documentation and quality control for ecological computer models. This would both reassure decision makers and allow them to verify that criteria for scientific rigour are met. In a similar vein, the Open Science movement (see description and references in [22]) calls for transparency in model and data sharing, to facilitate cross testing among modelling groups. A different approach, also motivated by concerns about the provision of scientific advice for decision making, focusses on information about the source, provenance, currency and reliability of both the ecological computer model and input data, as well as the extent to which the science behind the results is considered by experts to be sound [23]. This recommendation has been partly adopted in the notion of 'pedigree' used to assess the reliability of different implementations of the Ecopath with Ecosim model [24,25], widely used in fishery and marine ecosystem studies.
It is interesting to consider the extent to which these approaches may help communication between ecological modellers and decision makers in years to come. On the one hand, the core message from our interviews suggests that it is unlikely that the approaches described above would have a short term impact on an existing project, since much of their focus is on filling the knowledge gap. Improved and more transparent documentation may address the 'black box' challenge only to the extent that environmental modellers are willing to make such documentation easily intelligible to a non-expert audience, and to the extent that the non-expert audience is willing to put the necessary effort into understanding some level of technical content. Lack of trust arising from the perception that models can be manipulated to produce any pre-conceived outcome belongs to human and institutional considerations that are beyond modelling standards and knowledge-gap concerns [26]. It is, however, possible that in the longer run, initiatives like TRACE and Open Science (see also [27] specifically for agent-based models) may lead to a cultural shift, through a path of first strengthening confidence in ecological models within the modelling community, then porting this confidence to the wider scientific community and finally to the decision making community. Ultimately, for many ecological applications, the goal of ecological modelling is not a specific level of scientific accuracy so much as better decision making [3,28], and the strong focus on the interaction between modellers and decision makers (as advocated by the Open Science platform, see https://www.fosteropenscience.eu/taxonomy/term/7) may indeed lead to an attitude shift.
We left the discussion of input data until last because, in most ecological studies, poor quality input data is a reality, not a modelling problem. According to some, the field of marine ecological studies is experiencing a 'big data' revolution [29,30]. It is reasonable to expect that a massive amount of environmental data of policy relevance will be available in real time in many areas of marine ecology and fishery science. This raises an interesting dilemma. On the one hand, the decision makers' desire to see a large amount of current and local data feed marine ecological computer models could be fulfilled. On the other hand, as is widely recognised, the sheer size and complexity of 'big data' will inevitably require its analysis to be automated. The task of pre-processing, filtering and transforming (possibly even interpreting) data for subsequent modelling will likely be delegated to algorithms, that is, to more models. The need to trust models and modellers, which generally refers to the parameterisation, implementation and running of the models, will thus extend to data collection and selection [31,32]. Depending on what algorithm is used to carry out such tasks, this step may in fact be even less transparent than traditional ecosystem simulation. The trust issue may be exacerbated if neural-network-based approaches, like deep learning techniques, are at the core of the collection and selection algorithms [33]. Due to their complexity, such approaches may be perceived as inscrutable and untestable not just by decision makers but also by the model developers [32]. To complicate things further, [34] suggests that improvements in computer power and artificial intelligence (AI) have allowed the simulation of physical phenomena to the point of making them virtually indistinguishable from reality. If decision makers see current ecological computer modelling as prone to manipulation, this may become even more problematic once such complex algorithms and hardware become commonplace in scientific modelling.

Conclusions
This exploratory study offers insights regarding decision makers' perceptions of ecological computer models. The interviews highlighted some common themes amongst decision maker responses in terms of: limited understanding about what an ecological computer model is and what it does; uncertainty about the reliability of input data used in the ecological computer model; and lack of trust regarding manipulation of model parameters and the subsequent reliability of model outputs. Consequently, we consider that these themes raise a series of questions about the wider perceptions of ecological computer models and how these might be addressed.
How widely held is the notion that ecological computer model reliability depends primarily on the quality and currency of the input data? If this notion is widely held, not only could modellers place more emphasis on communicating details of data collection, they could also carry out sensitivity analyses to clarify which input data, and under what conditions, model results are most sensitive to.
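The kind of sensitivity analysis suggested above can be sketched very simply. The example below is purely illustrative: the toy logistic biomass model, its parameter names and the baseline values are assumptions invented for demonstration, not drawn from this study or from any specific ecological computer model. It applies a one-at-a-time (OAT) perturbation, reporting the relative change in model output when each input is increased by 10%.

```python
# Illustrative one-at-a-time (OAT) sensitivity screen.
# The toy model, parameter names and baseline values are hypothetical.

def biomass_model(growth_rate, mortality, carrying_capacity, years=20):
    """Project biomass with a discrete logistic model minus a mortality term."""
    biomass = 0.1 * carrying_capacity  # arbitrary initial stock
    for _ in range(years):
        biomass += (growth_rate * biomass * (1 - biomass / carrying_capacity)
                    - mortality * biomass)
    return biomass

baseline = {"growth_rate": 0.4, "mortality": 0.1, "carrying_capacity": 1000.0}

def oat_sensitivity(model, params, perturbation=0.1):
    """Relative change in model output when each input is raised by 10%."""
    base_output = model(**params)
    effects = {}
    for name, value in params.items():
        perturbed = dict(params, **{name: value * (1 + perturbation)})
        effects[name] = (model(**perturbed) - base_output) / base_output
    return effects

effects = oat_sensitivity(biomass_model, baseline)
# Report inputs ranked by the magnitude of their effect on the output.
for name, change in sorted(effects.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name}: {change:+.1%}")
```

Ranking inputs by the magnitude of these relative changes would give modellers a concrete, communicable statement of which data most deserve collection and documentation effort; for models with many interacting parameters, a global method such as variance-based indices would be more appropriate than this local screen.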
To what extent is reluctance to account for model results in decision making discipline-dependent? For example, are there differences between the acceptance of models in marine ecology, fisheries, macro- and micro-economics, water management, epidemiology or urban dynamics? Could more research on this issue shed light on more effective stakeholder engagement strategies?
Is it possible to discriminate between trust in modelling (what an ecological computer model does) and trust in model use (how a model is used)? If so, could researchers and stakeholders devise some guidelines on ecological computer model use which could make it more transparent and potentially a better accepted tool in decision making involving complex, high-stakes phenomena?
The future appears inevitably to be one in which more data ('big data') of greater complexity is available, requiring greater acceptance of, and trust in, automated processing, models and modellers. Decision makers' acceptance of ecological computer models in environmental projects is clearly a complex issue yet to be resolved, and one likely to become increasingly pressing. One significant key appears to be a better understanding of the relation between perceptions and acceptance of modelling, leading to more effective communication between modellers and decision makers.

Figure 1. Graphic representation of the different attitudes towards ecological computer modelling between modellers and decision makers, resulting from interviews with 17 senior scientists discussed in this work.


Table 1. An indicative sample of responses to the question 'How would you describe your current understanding of ecological computer models?'
RP6: "... almost an equation that has a whole series of inputs, and functions or factors that get inputted and spits out an answer at the other end."
RP3: "a tool which is computer-based and it's mathematically driven I suspect. It's a mathematically contrived algorithm which has been designed to help answer some questions. ... A lot of people look at [a model] as a black box. They haven't got a clue what's in that black box."

Table 2. An indicative sample of responses to the question 'What do you perceive to be the main disadvantages of modelling?'
RP1: "A good model relies on good information, so if you don't have good information going in, you're not going to have a good answer coming out."
RP3: "... a [model] output's only as good as the input."
RP4: "It all depends on how good the information is going in."
RP5: "The level of data that you put in gives an indication of the accuracy of the outputs ... it's all about the data that goes into it."
RP6: "My biggest concern with models is ... how recent is the data that has gone in ..."
RS3: "Quality data is potentially very important [for model confidence], especially if outputs are spurious."

Table 3. An indicative sample of responses regarding barriers to using ecological computer models in a decision making context.
"If people have a vested interest in wanting a decision and it is not the kind of information that the model's supporting, they're going to try and take the model less seriously."
RP4: "... you can get such big differences in the results depending on which group does it and which factors you use and all those sorts of things I think [a model] can be misused, and managers might be tempted, and contractors might be tempted to find the right answer, or the answer they think people want ..."