Article

Strategies for Dealing with Uncertainties in Strategic Environmental Assessment: An Analytical Framework Illustrated with Case Studies from The Netherlands

1 Arcadis Netherlands, Beaulieustraat 22, 6814 DV Arnhem, The Netherlands
2 Environmental Governance Group, Copernicus Institute of Sustainable Development, Utrecht University, Princetonlaan 8a, 3584 CB Utrecht, The Netherlands
3 Forest and Nature Conservation Policy Group, Wageningen University and Research, Droevendaalsesteeg 3, 6708 PB Wageningen, The Netherlands
* Author to whom correspondence should be addressed.
Sustainability 2018, 10(7), 2463; https://doi.org/10.3390/su10072463
Submission received: 18 June 2018 / Revised: 2 July 2018 / Accepted: 3 July 2018 / Published: 13 July 2018
(This article belongs to the Special Issue Fostering Sustainability through Strategic Environmental Assessment)

Abstract

Strategic environmental assessment (SEA) is a widely applied policy tool that aims to aid decision-makers in making informed, higher-quality decisions that minimize negative environmental impacts. However, different types of uncertainties complicate the ex ante assessment of environmental impacts. The literature suggests that uncertainties are often not well addressed, resulting in inaccurate and even unreliable SEAs. At the same time, SEA literature offers limited guidance on how to systematically identify and deal with uncertainties. Therefore, in this paper, we present an analytical framework for characterizing and classifying different forms of uncertainty in SEA, and for identifying strategies for dealing with these uncertainties. The framework is based on literature on uncertainties in other subdomains of the environmental sciences. The framework is applied to five case studies of SEAs for spatial planning in The Netherlands in order to illustrate and critically reflect on our framework, and to bridge the gap between theory and practice. Based on these case studies, we concluded the following: (1) The framework is useful for identifying uncertainties in SEA in a systematic way; (2) There is a discrepancy between how uncertainties are dealt with in theory and in practice; (3) In practice, uncertainties seem to be dealt with in a rather implicit way. The framework may help deal with uncertainties more systematically and more proactively; (4) The most successful way of coping with uncertainties seems to be the application of multiple strategies.

1. Introduction

Environmental assessment (EA) is a widely applied policy tool for decision-making for a range of decisions with potential detrimental consequences for the environment. Since its introduction in the United States in 1969, it has evolved into a process that not only informs decision-making about the environmental impacts of projects, plans, policies, and programs, but is also used in a more proactive way for development and design [1,2]. Yet, it is increasingly acknowledged that uncertainties pose a serious limitation to the value that EA can have as a tool to identify and mitigate adverse environmental impacts.
Uncertainties specifically emerge in strategic environmental assessment (SEA). SEA is a systematic decision support process that aims to identify, predict, evaluate, and mitigate environmental effects of proposed policies and plans before decisions are made [3,4]. SEAs often apply to strategic decisions that provide the context of more concrete projects subject to environmental impact assessments (EIAs) [5]. Whereas EIAs are often used to make a decision about project approval and environmental licensing, SEAs are also used by decision-makers, stakeholders, and environmental experts to develop, review, and discuss plan and policy options in the light of their environmental impacts [1,2].
In theory, most environmental gains can be achieved at the strategic policy-making and planning phases (which are often subject to SEA), because of higher degrees of freedom compared to the more narrowly defined projects subject to EIAs (think of locational choices, modes of transport, fuel mixes in national energy policies, etc.). At the same time, the accuracy of ex ante assessments in SEAs is complicated by the typically large scopes, high levels of abstraction, and long time horizons of decisions subject to SEA [5]. As a consequence, environmental impacts that manifest themselves after the implementation of decisions subject to SEA may differ substantially from the impacts expected in an SEA, with major consequences for the environment and for public health. Mitigating these unforeseen environmental impacts can delay the implementation of policies and plans, and result in higher costs than planned for. In addition, SEAs, and EAs in general, may lose their credibility among decision-makers and the public [6]. Thus, “if impact assessment is to be a sound decision-aiding tool, it must give consideration to uncertainties, identify means to manage those uncertainties and provide decision makers a better understanding of the consequences of their decisions” (Leung et al. [7] p. 121).
Various scholars observed that uncertainties are not well addressed in SEAs. SEAs are found to be inaccurate or even unreliable, even though the assessments are often presented as very certain [7,8,9,10]. SEA reports often do not acknowledge or communicate any uncertainties that occurred during the assessment process. Furthermore, even if uncertainties are disclosed, often no guidance is provided about how to deal with the uncertainties at issue, reducing the usefulness of SEAs for decision-makers [11,12].
In the EA literature, the conceptualization of uncertainties in the context of SEA has not yet been elaborated in much detail, in contrast to the more general environmental sciences literature. What types of uncertainty are typical for SEA and the decisions it aims to support? What strategies can be employed in order to deal with uncertainties? Despite recognition of the importance of uncertainties in SEA in the literature [4,7,8,13], these questions have received little attention thus far in SEA literature [2]. Moreover, little empirical research has been conducted on how uncertainties are experienced and dealt with in SEA practice. This, however, is necessary if scientific research is to contribute to practical guidance for dealing with uncertainties [4].
This paper, therefore, provides an analytical framework that identifies different types of uncertainties in SEAs, and strategies for dealing with them, similar to what was developed for other environmental domains ([14], based on [15]). We applied the resulting framework to a number of SEA cases in The Netherlands in order to illustrate the framework, as well as to provide a first test, and to gain a first insight into how uncertainties in SEA are dealt with in practice. The Netherlands is interesting because we expect Dutch SEA practice to pay relatively strong attention to recognizing and dealing with uncertainties. One reason is the relatively long history of SEA in this country; a second is that knowledge accumulation regarding SEA is formally institutionalized in the form of The Netherlands Commission for Environmental Assessment, which is charged with the quality review of EAs [2,16]. The Netherlands also has a long tradition of explicit reflection on uncertainty in policy-oriented assessments and decision-making [11,17,18,19,20,21,22,23,24].
The remainder of this paper unfolds as follows. In Section 2, we define our key concepts and develop our analytical framework, based on a literature review. In Section 3, we outline the case-study approach we employed for illustrating our analytical framework and present the results of our case studies. Finally, we present our main conclusions and recommendations for SEA practice in Section 4.

2. Analytical Framework

2.1. Uncertainties in SEA Discussed in EA Literature

Uncertainty implies that the information provided by SEAs may not be sufficient for the decision at issue. In other words, no accurate prediction of environmental effects can be made, which means that effects may turn out differently than expected.
In SEAs, different types of uncertainties may emerge [21]. Uncertainty is not only about a deficit of available knowledge; it also involves equivocalness (multiple interpretations), and it is an intrinsic characteristic of the complex systems subject to SEAs. Additional sources of uncertainty in SEA are a lack of specificity of the policy, plan, or project at issue, incomplete models, and a lack of insight into the effectiveness of mitigation measures, as well as interpretation and biases [7,13,25,26].
According to Larsen et al. [27], SEA practice does not sufficiently acknowledge uncertainty. As a consequence, politicians and the public might interpret outcomes as more certain than they actually are. Various reasons were suggested for “uncertainty avoidance”, such as a belief that decision-makers and stakeholders may not understand the complex uncertainty information, that they do not want that level of detail, that it might increase distrust of the information, that it could be used strategically (either downplayed or emphasized for political reasons), that it might result in protracted discussions or simply lead to more research and delays in decision-making (“paralysis by analysis”), and that it could lead to unwanted conflict (which could harm decision-makers) [11,21,27,28,29,30].
Pavlyuk [26] and Lees et al. [13], therefore, stated that there is a need for disclosure and consideration of uncertainty in EAs, as well as standardized best practices to ensure that uncertainty is properly addressed. In this context, Cardenas & Halman [8] suggested that such guidance is best based on an adaptive management approach that aids dealing with uncertainties as they manifest themselves during the various stages of the decision-making process, whereas Lees et al. [13] and Pavlyuk [26] proposed legal obligations regarding the disclosure and consideration of uncertainty.
In conclusion, the EA literature reflects that uncertainty can be understood in different ways, and can occur in different elements of the environmental assessment process. However, it remains unclear what types of uncertainties occur that need management during the process, and what type of management is suitable in which situation.

2.2. Uncertainties Discussed in Other Bodies of Environmental Literature

In order to further understand uncertainties, a broader perspective is taken by including other strands of literature. We also drew on ecological studies, studies in environmental modelling and policy, environmental decision-making, crisis management, and risk management. The selection of literature reflects the complexity of environmental systems, resilience approaches, and the attitudes and behavior of practitioners (when decisions have to be taken under great pressure, there is no time to collect more data or to consult multiple actors) [31].
In that literature, a variety of uncertainties are discussed, and different definitions of “uncertainty” are used. An example of a broad definition is “uncertainty is any deviation from the unachievable ideal of completely deterministic knowledge of the relevant system” [32]. Other definitions are “not having the required knowledge to precisely describe an event and its characteristics” [8]; “a lack of confidence about knowledge relating to a specific question” [33], or similarly “a person is uncertain if s/he lacks confidence about the specific outcomes of an event. Reasons for this… might include a judgment of the information as incomplete, blurred, inaccurate, unreliable, inconclusive or potentially false” [34,35]; and “incomplete knowledge about a subject” [36]. Brugnach et al. [15] added a nuance to these definitions: any deviation from complete knowledge also means that there is not a unique understanding of the system; there could be “multiple understandings”. Uncertainty then includes a lack of agreement amongst stakeholders, which is important since stakeholder involvement often plays a large role in complex policies and plans.

Uncertainty definitions vary in their approach, particularly on whether uncertainty is defined from the viewpoint of the analyst or from that of the decision-maker [30,37]. The Walker et al. [32] framework is an example of an analyst-focused typology. It was developed as an attempt to synthesize and harmonize the earlier work on uncertainty typology in various literatures related to policy-oriented modeling and assessment. In turn, it spawned numerous adaptations to new domains and fixes of perceived weaknesses [12,38]. Kwakkel et al. [38] attempted an update. The division into nature, level, and location of uncertainty, however, remains relatively stable [12]. Another set of typologies places primary focus on uncertainties related to the societal context of environmental assessments; these are decision-maker-focused typologies [37]. They observe that uncertainty for a decision-maker is not only scientific (technical, physical, biological, etc.), but also societal (cultural, institutional, etc.) in nature [37]. For example, De Marchi [39] assessed uncertainties in environmental disaster management, and included a broad range of contextual and societal types of uncertainties, such as legal, moral, societal, and institutional. Scientific uncertainty is just one aspect. Van Asselt [40] and Meijer et al. [41] similarly expanded on scientific uncertainty with broader aspects related to values, technology, markets, politics, and society.
We argue that, for a tool such as SEA, which is essentially a decision tool, we should interpret uncertainty primarily from the decision-makers’ point of view (which of course does not exclude uncertainties that analysts face, and that have an impact on their assessments as they are presented to decision-makers). Taking this into account, in Table 1, we show some interpretations and categorizations of uncertainties that we derived from literature on environmental research. The table clearly shows that there is no commonly shared agreement on a generic typology.

2.3. Toward a Typology of Uncertainties Relevant for SEAs

To develop an appropriate uncertainty typology for SEA, its role and context should be taken into account. SEAs aim to explore different options for policies and plans, and to provide regulators and the public with information on the potential range and nature of environmental impacts and mitigation options. SEAs do not take place in a vacuum, but are directly connected with decision-makers with particular objectives, and with stakeholders whose interests may be affected by the decision itself, its expected environmental impacts, or the mitigation options. Some uncertainties in SEA can be reduced through more research (scientific uncertainty), whereas others are inherent in the system studied and cannot be reduced (inherent uncertainty). In addition, there are uncertainties related to future debate on the values and interests that underpin the analysis or that are impacted by the proposed plans (social uncertainty), and uncertainties related to the legal implications of assessments, that is, what knowledge needs to be supplied in case of a dispute between citizens, stakeholders, decision-makers, and the initiators of the plans (legal uncertainty). Consequently, four categories of uncertainty form the backbone of our framework:
  • Inherent uncertainties (“we cannot know (exactly)”);
  • Scientific uncertainty (“our information and understanding could be wrong or incomplete”);
  • Social uncertainty (“we do not agree on what information is or will be relevant”);
  • Legal uncertainty (“we do not know what information we should (legally) provide”).
Below, we elaborate on the four categories of uncertainties, followed by an identification of strategies for dealing with them.

2.3.1. Inherent Uncertainty—“We Cannot Know (Exactly)”

The first category of uncertainty is inextricably connected to the inherent unpredictability of the system under study. In many systems, the impact of a plan can vary because the physical size of the potential disturbance varies per location or over time, because the systems or populations that are affected have a diverse sensitivity to the disturbance, because other issues or trends that are at play modify the impact of the plan under study, or because the system exhibits some level of chaotic behavior. This is not simply solved by additional research or empirical efforts [15,32,33,36]. The uncertainty range in the impact is a real, physical thing. At best, it can be described as a classic probability distribution. Inherent uncertainty is the “aleatory” or “variability” type of uncertainty described in many of the typologies in Table 1. Note that this can relate to the environmental system, as well as the social, legal, or political systems. However, such aspects are already largely covered under the social and legal uncertainties in the later subsections.
Inherent uncertainties may manifest themselves in SEA in various ways, some of which are listed below.
  • Because of variability of the system, the appropriate system boundaries regarding time and spatial scales are unknown or unclear, or the vulnerability of the system(s), populations, or individuals impacted varies. It may be possible to give “likely” bounds, but the precise impacts in practice will vary, and outliers cannot be ruled out. Examples include variability in local weather conditions, in local activities, or in the way local plants and animals might respond to the effects of a plan in the environment. For SEAs, it means that the exact magnitude and full range of environmental impacts of an activity cannot be known. Our knowledge of the natural system determines how we represent these properties in assessments, and how we design tools to evaluate impact.
  • In understanding environmental processes, it is important to study the relationships between cause and effect. In the case of very complex systems and issues, such as climate change, cause-and-effect mechanisms are difficult to establish, and the system may exhibit “chaotic” behavior. As a consequence, assessing the impact of a future activity in SEAs can become very difficult, especially in complex systems and for long-term impacts.
  • Uncertainties also arise in the assessment of cumulative effects [25]. Noise pollution is a good example of cumulative effects. If an activity takes place on a larger scale, other existing sources of noise have to be taken into account to study the total impact of noise. Different sources of noise reinforce each other, which is called accumulation. The noise increase from the assessed activity might seem irrelevant, yet, in total, it could mean a significant increase in noise pollution in the area (see the sketch after this list). It can be difficult to understand how natural phenomena reinforce one another. Consequently, the full impact of an activity in an existing situation with multiple sources and burdens may not be clear, and it can be difficult to attribute reported problems to a specific activity.
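The arithmetic behind noise accumulation illustrates why a seemingly small contribution can still matter. The sketch below combines sound levels on the logarithmic decibel scale; the source levels are hypothetical and purely illustrative, not values taken from any assessment.

```python
import math

def combine_noise_levels(levels_db):
    """Energetic (logarithmic) addition of sound levels from independent sources, in dB."""
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels_db))

# Hypothetical situation: two existing roads of 60 dB each, plus a proposed activity of 57 dB
existing = [60.0, 60.0]
proposed = 57.0

print(round(combine_noise_levels(existing), 1))              # ~63.0 dB without the activity
print(round(combine_noise_levels(existing + [proposed]), 1)) # ~64.0 dB with the activity
```

Although the proposed source is quieter than the existing background, the cumulative level rises by roughly 1 dB, which can be decisive if a legal limit sits close to the existing level.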
Often, inherent uncertainties exist in combination with scientific (epistemic or “knowledge”) uncertainties that can be reduced through more research. For instance, the range of variability may not be known at first. The variability can then be better characterized and bounded (reducing knowledge uncertainty about the variability), but the physical source remains. For example, uncertainty around climate-change effects can be reduced by improving data analysis, models, and parameters. However, despite improvements to models, there will always be some uncertainty inherent to the natural and socio-economic systems involved which we cannot remove. The reducibility of uncertainty strongly influences how we deal with it [33].
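The difference between reducible knowledge uncertainty and irreducible variability can be made tangible with a minimal simulation sketch (all numbers are hypothetical): more observations narrow our uncertainty about the statistics of the variability, but the spread of the system itself stays the same.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical inherent variability: yearly peak river discharge (m3/s)
true_mean, true_spread = 2000.0, 300.0

for n in (5, 50, 500):
    observations = rng.normal(true_mean, true_spread, size=n)
    # Epistemic (knowledge) uncertainty about the mean shrinks as data accumulate ...
    uncertainty_about_mean = observations.std(ddof=1) / np.sqrt(n)
    # ... but the estimated year-to-year variability converges to ~300 and does not shrink.
    estimated_variability = observations.std(ddof=1)
    print(n, round(uncertainty_about_mean, 1), round(estimated_variability, 1))
```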

2.3.2. Scientific Uncertainty—“Our Information and Understanding Could Be Wrong or Incomplete”

Scientific uncertainty entails having limited or incorrect information about phenomena. This relates to “epistemic” or knowledge uncertainty as discussed in the typologies in Table 1. Reasons include technical issues, such as faults in models or data, and problems in the translation of the practical problem into a scientific one, for example, in the design and selection of indicators and criteria for assessment. Scientific uncertainty might, at least in theory, be reduced by performing additional research [36].
Technical problems are associated with data and models. More often than not, impact predictions are made using models or model extrapolations based on a limited set of measurements, rather than actual measurements. This is especially the case when predicting air quality, changes in water systems, or noise pollution in large-scale projects. Model outcomes are used to compare alternatives and determine a preferred alternative with the least significant effects. Uncertainties may emerge in the following ways:
  • Models are simplified abstractions of the real world, and are, therefore, never fully accurate [26]. Uncertainties can occur in the model structure, variables, and parameters [8]. Similarly, many assumptions are made in the modeling process, e.g., in designing a model or combining models in a model chain, where different researchers might make different choices [48,49]. That models make simplifications and assumptions is, in itself, not necessarily bad; it is a necessary aspect of generalizing and applying knowledge of environmental processes to evaluate new situations (i.e., situations not yet existing in exactly the set-up proposed). Rather, one should consider model and knowledge quality [28], and the fitness of the model for the purpose for which it is used in the assessment [50,51]. Generic models are often developed and used in SEAs to ensure consistency in the research methodology, and thus overcome uncertainty due to limitations in models; however, interactions and variables that are unique to the situation might then be overlooked.
  • Models require data as input. Uncertainty about data can occur due to limited access to information, measurement errors, the type of data, and the presentation of data [25,46,52]. Data might also become invalid in the long term due to greater variability, depending on the time horizon that is selected. Limitations in data seriously influence the impact prediction that results from the model (see the propagation sketch after this list).
  • Data on baseline conditions are a specific issue. Baseline conditions include the developments, impacts, and environmental dynamics that would occur without the proposed activity. Baseline conditions are a critical starting point in SEAs, as they provide the benchmark against which impacts are assessed. Measurement errors occur in baseline data [52].
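One way to make such model and data limitations visible in the assessment, instead of reporting a single point prediction, is to propagate the uncertainties through the model and present a range. The sketch below does this for a deliberately simplified, hypothetical emission calculation; the model form, input values, and spreads are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(seed=7)
n_runs = 10_000

# Hypothetical uncertain inputs
traffic = rng.normal(20_000, 2_000, n_runs)       # vehicles per day, uncertain forecast
emission_factor = rng.normal(0.25, 0.05, n_runs)  # g NOx per vehicle-km, uncertain parameter
road_length_km = 3.0                              # treated as known

# Deliberately simplified model: daily NOx emissions along the segment, in kg
emissions_kg = traffic * emission_factor * road_length_km / 1000.0

# Report a 5th-50th-95th percentile range rather than one number
print(np.percentile(emissions_kg, [5, 50, 95]).round(1))
```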
The translation of problems as defined by policy-makers and planners into scientific problems is a second source of scientific uncertainty:
  • Uncertainties can occur in the choices of data, methods, parameters, and statistics, in other words, in the assessment framework. Science seeks measures to represent phenomena. In SEAs, this means that indicators are selected to study environmental effects, and these indicators may not be the best representation of the real environment [26,52].
  • Furthermore, projects and activities may change, and impacts that are attributable to them change as well [8,25,26,52].
  • When determining change and impact, we need to identify past, present, and future activities for the development at issue [25]. To create an inventory of all activities, a large amount of effort and input is needed from different stakeholders. Future activities are especially difficult to include, since they occur over a longer time scale and are influenced by many other factors.
The distinction between inherent and scientific uncertainties is not always clear, and both types of uncertainty can be related (e.g., our inability to predict how complex systems develop or behave may result in both types of uncertainty).

2.3.3. Social Uncertainty—“We Do Not Agree on What Information Is or Will Be Relevant”

Social uncertainties refer to doubts or ambiguity about information among actors involved in an SEA, or in the policy or plan at issue. They are caused by differences in human values and interpretations. The role of social uncertainties in environmental research was only recently recognized, and the challenge is to account for human input in the decision-making process [36]. This relates in particular to the “ambiguity” type of uncertainty, as discussed in the typologies in Table 1, as well as, to some extent, to variability (e.g., variability of social values) and knowledge (limited information on the social perception of the activities proposed, or lack of accounting for social aspects in the scientific analyses) uncertainty. Some examples of social uncertainties in SEAs include the following:
  • Stakeholders, as well as decision-makers and researchers in SEAs, have different values, interests, and perceptions of environmental components [8,46]. Examples are conflicts of interest regarding the objects to be studied, and different world views regarding what is important. This influences the framing of the problem, and therefore, the scope of the assessment. It also entails a subjective selection of criteria and indicators. The assessment of system boundaries and impacts is a result of negotiations between stakeholders.
  • The political climate influences whether an environmental problem is addressed, and which alternatives are considered and selected [43]. Political groups or lobbyists can have a large influence on the outcome of the decision-making process. They can also demand to study specific environmental aspects, such as health or sustainability. It depends on the societal context and the period. It could also mean that politicians pursue political goals, and overrule environmental issues.
  • Knowledge frames and capacities of stakeholders are strongly related to inherent and scientific uncertainty. They entail our understanding of the environmental processes at hand, but also an understanding of what information is delivered in SEAs. This depends on the capacities and skills of responsible persons such as policy makers and project managers [39]. Similarly, the frames of the analysts and competent authorities play a role in shaping the scientific analysis in the (S)EAs; issue-framing plays a key role in setting the research questions and boundaries, strongly impacting what is analyzed and how, and consequently, the results of the analysis [22,28].
  • Social uncertainty can exist in the project design for the SEA process, e.g., organizational factors, procedures, resources, and coordination among stakeholders [26,39,46,52].
The implications of social uncertainty could include legal uncertainty (see below), which suggests that our distinction between forms of uncertainty is mainly an analytical one.

2.3.4. Legal Uncertainty—“We Do Not Know What Information We Should (Legally) Provide”

Legal uncertainty has to do with the decision-making context. It relates to ambiguity in the uncertainty typologies discussed in Table 1, as well as, to some extent, to variability (e.g., in legal rulings and perceptions) and knowledge (e.g., lack of clear criteria or legal precedents) uncertainty. Decisions that are made in an SEA need to be justified, and decision-making approaches depend on goals, performance measures, and assessment criteria [46]. For example, new legislation on specific environmental aspects could pose uncertainty about how to include this in the SEA process. Legal uncertainty in that respect relates to due diligence: what one “could or should have known” before implementing the project, as elaborated below.
  • The decision-making context poses uncertainty as to what information the SEA needs to deliver. The task of supplying information is imposed on the initiator of the policy or plan [52]. Often, legal guidelines exist to address the type and amount of information that needs to be delivered in SEAs to make a decision. However, uncertainty increases when the decision-making context changes due to new (environmental) legislation or revisions of existing legislation.
  • The institutional context influences rights and responsibilities, and shapes the degree of power and influence [52]. This also relates to how responsibilities and definitions, for instance, the definition of the “precautionary principle”, are embedded in national or European Union (EU) law or international agreements. Such differences can lead to different levels of proof being required before a plan is allowed, to demands for precautionary risk-mitigation actions, and to different views on who should bear the burden of proof [53,54].
  • Furthermore, De Marchi [39] describes legal uncertainty in terms of future contingencies of personal liability for actions or inactions. The people involved in an SEA process, including the initiator, consultants, and decision-makers, are primarily concerned with making their assessments and decisions appear defensible and politically palatable [7]. Providing information about significant impacts in a worst-case scenario, or about uncertainties in the assessment, can have consequences for the public image, social trust, legitimacy, and political acceptability. The public can use this kind of information to appeal against a proposal, or at least policy-makers feel that this is the case.
Figure 1 summarizes the four categories of uncertainties relevant for SEAs.

2.4. Strategies for Dealing with Uncertainties in SEAs

Theoretically, the type of uncertainty determines the choice of how to deal with it. If uncertainty is acknowledged as inherent, initiators and competent authorities might accept that it is irreducible, and might, therefore, aim at a strategy to manage the uncertainty as it manifests itself during the implementation of the policy or plan, rather than try to resolve it during the SEA phase [2]. If uncertainty is classified as scientific, additional research might be performed to find more confirmation for the results of the assessment.
In the environmental literature, several strategies for dealing with uncertainties are discussed [8,15,35,37,38,53,55]. On several occasions, authors explicitly linked strategies to specific types of uncertainty, and in other cases, such links are less explicit. We discuss strategies relevant for the SEA context below. Firstly, we discuss what strategies ideally aim at; in other words, when are the four categories of uncertainties discussed above successfully addressed?

2.4.1. Ideally or Typically Dealing with Uncertainties

Dealing with uncertainties in SEA has to do with both reducing uncertainty and managing uncertainty. A measure of change to determine the success of specific strategies for dealing with uncertainties was provided by Allen & Garmestani [56]: “What changed as a result of action in relation to the project goals and objectives?”. Connecting this to the goal of SEA, which is to inform decision-makers on the consequences of plans and programs, successful management of uncertainties would provide decision-makers with a better understanding of the consequences of their decisions, or with rich decision options [7].
Because each of the four categories of uncertainty imposes a different need for information, management, and progress (as hypothesized in this paper), different definitions of success were identified, and are summarized in Table 2.
Below, we discuss strategies for dealing with uncertainties in SEAs. Four strategies are identified that can be implemented in various ways: knowledge generation, stakeholder involvement, adaptive management, and employing the precautionary principle.

2.4.2. Knowledge Generation

Knowledge generation can generally be defined as any approach or strategy that yields more knowledge on a specific subject. It is most applicable to types of uncertainty that can actually be reduced by obtaining more knowledge. Uncertainties that are addressed by knowledge generation include the full range of uncertainty about future activities, change and impact predictions, measurement errors, limitations in models, and data gaps.
Two specific types of knowledge generation generally applied in SEAs are discussed here. Firstly, expert judgment (“expert elicitation”) is used to identify options and impacts when the full range is not known, or when direct measurements or modeling are not feasible [15,45,57]. Inviting multiple experts to the discussion leads to benchmarking or standardization, and can help with combining different disciplines or lines of evidence in the analysis. Expert elicitations can provide quantitative or qualitative information, depending on the situation, such as the amount of evidence available. Secondly, additional research can be performed to compensate for absence of data, such as desk studies, fieldwork, sensitivity analyses, or simulation models [8,15].
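As a small illustration of one such form of additional research, a one-at-a-time sensitivity analysis can show which assumptions dominate the assessment outcome and therefore deserve further data collection. The noise model, parameter values, and perturbations below are hypothetical and only serve to show the idea.

```python
import math

def noise_level(traffic=20_000, distance_m=100.0):
    """Hypothetical, highly simplified road-noise model (dB) at a receptor distance."""
    return 40.0 + 10 * math.log10(traffic) - 20 * math.log10(distance_m)

baseline = noise_level()
perturbations = {
    "traffic +10%": {"traffic": 22_000},
    "receptor distance +10%": {"distance_m": 110.0},
}
for name, kwargs in perturbations.items():
    print(name, round(noise_level(**kwargs) - baseline, 2), "dB change")
```

In this toy example, the assumed receptor distance changes the outcome about twice as much as the traffic forecast, which would suggest prioritizing research on the former.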
Not all research necessarily completes the knowledge base. It may uncover more uncertainties and complexities. It may also cost substantial time and money, potentially leading to “paralysis by analysis” [58]. The cost, time, and urgency of a problem should be taken into consideration when choosing to do more research [15]. This is why other types of strategies are often applied in SEAs.

2.4.3. Stakeholder Involvement

Stakeholder involvement can lead to reductions in ambiguity, increased support for the implementation of a proposed activity, and more robust decision-making. It is a rather generic term that can be understood in many ways. Its definition depends on the purpose and the uncertainty situation at hand, and differs by mode, degree, and timing [26]. Interested parties, affected communities, and also scientists and experts are part of the “stakeholder group”. Different levels of, and approaches to stakeholder participation can be taken in the assessment, depending on the goals, and different methods can be used [59].
Active involvement of stakeholders helps with dealing with uncertainties about the identification of future activities, change and impact, and assessment criteria, and when the values, interests, and perceptions of stakeholders are likely to differ [8]. This is because involving stakeholders can help reveal heterogeneity among stakeholders, provide a better understanding of their interests, and give insight into power relationships. This serves to assess the full range of possible options and impacts, and to reach an agreement on the course of action, such as exploring risk-mitigation options and communication activities that stakeholders would value. Alternatively, it could, at least, provide an inventory of the disagreements and positions of stakeholders, and clarity among them about these, which can then be used and decided upon in the political decisions on the project. When there is room for interpretation of environmental indicators and impacts, stakeholder involvement can help provide more consistency and transparency by communicating about uncertainties [44].
Stakeholder involvement in EAs can take many forms [60]. According to Pavlyuk [26], it entails activities such as informing, consulting, reaching consensus, and negotiating. Refsgaard [35] describes stakeholder involvement as enabling stakeholders to communicate issues of concern, as using the (non-scientific) knowledge of stakeholders, and as actively involving stakeholders in the quality control of the SEA process. The application of stakeholder involvement obviously entails a degree of inclusion. The levels are identified as providing information, consulting, collaborating, and co-decision-making [8]. According to Isendahl et al. [61], stakeholder involvement also means creating trust among actors and communicating mutual expectations. Stakeholder involvement is best applied at the start of a project, when stakeholders can have a large impact on finding assessment criteria and alternatives, and can jointly consider the options.

2.4.4. Adaptive Management

Adaptive management (AM) is a management approach that can be implemented when knowledge is incomplete, and managers and policymakers need to act despite inherent uncertainty [56]. The goal is to reduce uncertainty, build knowledge, and improve management over time, in a step-by-step way. AM was first introduced by Holling in 1978 [62]. The concept is based on resilience literature, in which variability in a system is accounted for with different methods and techniques, aimed at gradually reducing uncertainty [62].
AM is an iterative process where the current condition is used to determine subsequent actions (MacDonald, 2000). In empirical literature, AM approaches are mostly used to deal with inherent or scientific uncertainties. Firstly, AM helps with uncertainty about the identification of future activities and the identification of change and impact. Planning activities are not always defined in detail, making impact prediction complicated [63]. It does so by generating multiple management actions when the effectiveness of management actions is unknown, by identifying multiple possibilities, and by managing the preferred alternative [8,63]. Secondly, AM helps when there is uncertainty about key criteria, local thresholds, and carrying capacity of an area (the assessment framework) [8,10]. Thirdly, AM deals with estimates of natural variability, cumulative effects, and the identification of cause-effect mechanisms [8].
Adaptive management is, like SEA, not a strictly defined, homogenous management process. Different flexible approaches with feedback of information exist. Noble [10] describes AM as an “exploration of different alternatives to meet a management objective, a prediction of the outcome of these alternatives, an implementation of one alternative, and monitoring to learn about the impacts”. The identification and selection of alternatives and monitoring are recurrent approaches. Allen and Garmestani [56] also speak of the identification of alternatives, followed by evaluation. Monitoring is a key aspect of adaptive management. It serves as a mechanism to check for assumptions about options, impacts, and assessment criteria. Furthermore, monitoring can document experiences and lessons learned from a project regarding inaccurate predictions, measurement errors, data gaps (faults in data and/or models), and interpretation [13,26]. The emphasis of monitoring is on “measured change” and documenting cause-and-effect relationships [64]. This means that monitoring is less effective when dealing with unquantifiable uncertainties, when no information is available (acknowledged ignorance), or when the processes studied take a long time to play out or might experience discontinuities. It also requires an explicit monitoring protocol and a mandate for actors to intervene if negative effects occur [44]. Along with monitoring, Brugnach et al. [15] suggested that scenario planning and experimental approaches are adaptive-management applications. The aim is to develop flexible solutions that can adapt to changing conditions and unexpected developments.
In SEAs, adaptive management can take the form of monitoring in combination with taking additional measures, in order to comply with environmental standards and to enhance the environmental performance of policies and plans. In this sense, it bears a strong resemblance to the concept of “follow-up” in EA, which is about ex post measurement of actual environmental performance, and correcting for unexpected environmental effects [64]. However, the above discussion suggests AM is more than that: a continuous activity during the development, implementation, and adjustment of policies and plans.
The strength of AM, at least in theory, is its capacity to adapt to unforeseen and changing events, decisions, and circumstances [10]. Yet, it requires several prerequisites, such as strong baseline data and agreement amongst stakeholders. A major drawback of AM is that detectable changes need to occur before changes in management can be made. This means that, when effects are irreversible, when risks are high, or when there is a long time span before effects occur, AM might not be suitable [65].

2.4.5. Employing the Precautionary Principle

The key concept of the precautionary principle is that it protects humans and the environment against uncertain risks by means of pre-damage control [35,58]. The principle entails that, “where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation” [66], or that, “when human activities may lead to morally unacceptable harm that is scientifically plausible but uncertain, actions shall be taken to avoid or diminish that harm” [67]. It means that harm from an activity should be prevented when there is a plausible indication of harm, even when there is no strong evidence yet. The initiator will then need to take mitigation measures or prove that the activity is not harmful. However, different definitions of the precautionary principle are in use in different policy documents, international declarations and agreements, and legal frameworks, each with different perspectives on the level of scientific evidence required, the level of impact that would trigger the use of the principle, and the level of preventative action that is required [53,54]. This could include a moratorium, but most current frameworks focus on a “proportionate response”, using a repertoire of possible options to increase knowledge, limit the probability of harm, and limit the damage or increase resilience.
The precautionary principle applies to inherent uncertainty and scientific uncertainty. The approach is used in cases where there is uncertainty about the interpretation of data or natural variability of the system [44]. Furthermore, it addresses uncertainties in models and model outcomes, data gaps, and cause-effect relationships [26].
Employing the precautionary principle can be done in different ways. Firstly, a worst-case scenario is often adopted [25,44]. Choosing a worst-case scenario means representing the conditions and impacts of a spatial plan in the most conservative way. It strengthens the robustness of decision-making, as exposure will always be within acceptable limits. Working with worst-case scenarios is a subject that is gaining importance in the literature on scenario analysis in SEAs and EIAs [68,69]. Secondly, standardized principles or thresholds are set to limit harmful effects [14]. These thresholds are set based on the worst-case scenario, and activities that exceed the thresholds are prohibited. Prohibition of activities is a third application of the precautionary principle. Activities are then only permitted if scientific certainty is provided that significant negative effects are excluded. Fourthly, the precautionary principle entails the design and implementation of mitigation measures to reduce effects after decision-making [26,44].
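A minimal sketch of how the threshold-based applications described above might be operationalized; the decision rule, threshold, and impact values are hypothetical and merely illustrate the logic of worst-case testing, thresholds, prohibition, and mitigation.

```python
def precautionary_check(worst_case_impact, threshold, mitigated_impact=None):
    """Allow an activity only if its worst-case impact stays within the threshold;
    otherwise fall back on mitigation, and prohibit if even that is insufficient."""
    if worst_case_impact <= threshold:
        return "allow"
    if mitigated_impact is not None and mitigated_impact <= threshold:
        return "allow, with mitigation measures"
    return "prohibit until evidence shows effects stay within limits"

# Hypothetical example: nitrogen deposition on a protected area (arbitrary units)
print(precautionary_check(worst_case_impact=18.0, threshold=14.0, mitigated_impact=12.0))
```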
The precautionary principle is an important decision-making criterion when activities that are highly uncertain have to be assessed. A major disadvantage is that it can sometimes be difficult to balance competing goals (especially in a political setting where voters may demand swift and strong action), and it might, therefore, block developments. The application of the approach depends on the values and perceptions of stakeholders as to what environmental issues are important and what uncertainties are urgent to resolve, what impacts are “morally unacceptable” or “serious”, and what level of action is enough [53,54].

2.4.6. Linking Uncertainties to Strategies

Figure 2 shows how the uncertainties discussed above relate to the strategies for dealing with them. Not all uncertainties are covered by a strategy, because linkages are not always clear from the reviewed literature. The illustrative case studies, therefore, serve as a means to explore these missing linkages. Figure 2 also sets out indicators for the “successful” application of the strategies.

3. Case Studies to Illustrate and Refine the Framework and to Bridge the Gap between Theory and Practice

3.1. Case Selection and Data Collection

Our analytical framework was applied to five cases of SEAs for spatial plans for two reasons. Firstly, it was applied to illustrate and refine the framework (i.e., verify whether uncertainties or strategies were missing in our categorizations, and to identify missing links between uncertainties and strategies). Secondly, it was applied to get a first impression of how uncertainties are perceived and dealt with in practice, in order to bridge the gap between theory and practice.
Five spatial planning cases were selected based on the following criteria:
  • The planning initiatives have a relatively high level of abstraction, and therefore, contain uncertainties.
  • The Netherlands Commission for Environmental Assessment (NCEA) reviewed the SEA report (which was expected to explicitly reveal uncertainties and suggestions for dealing with them).
  • The SEAs ran in the past five years, ensuring respondents remembered the project.
Table 3 summarizes the selected case studies and their characteristics. A brief description is provided in Appendix A.
The first step in data collection involved desk research of open resources. The NCEA publishes all relevant documents for projects on their website. This includes the spatial-planning documents, the SEA report, and the NCEA advice. These documents contain a very rich database with specific knowledge on the projects, such as spatial-planning goals, research methods in the SEA process, and results of environmental studies. Reading these documents prior to the interviewing phase ensured the researcher had sufficient knowledge of the projects. It also served to get a first grip on the most relevant environmental themes and possible uncertainties. Yet, the SEA reports only publish the end result, and reveal little of the process that was necessary to come to those results. Therefore, the desk research served only to provide basic knowledge for the interviews, and to ensure relevant uncertainties were discussed.
The next step involved interviewing key actors in the SEA process. These key actors represented different roles to gain a comprehensive view of the process. It was expected that different parties have a different perspective on uncertainties based on their role, tasks, and interests in the SEA process. The group of respondents consisted of researchers, consultants, project managers, policy makers, and specialists from different organizations (the NCEA, Utrecht University, several government authorities, and consultancy firms; see Appendix B for more information about the respondents).
Data were collected in a flexible and iterative way. Intermediate analyses between interviews suggested adjustments to data collection that could be directly employed in subsequent interviews. Furthermore, flexible methods make it possible to take advantage of emergent themes and unique case features.
The interviews served to delve deeper into uncertainties that were identified during the desk research, and revealed further uncertainties. Interviews were semi-structured, focusing on what uncertainties respondents experienced, in what way they tried to solve or reduce the uncertainty, and whether they thought their approach was successful or not. At the start of the interview, the researcher explained the definition of uncertainty in this research, and the types of uncertainties that were used in the framework. This served mainly to give the respondents a grip on the subject, and to provide some guidance during the interview. It also served to enable comparison of cases. Respondents were then asked to openly list all uncertainties they experienced during the SEA process. The listing of uncertainties was not influenced by the researcher. Respondents were then invited to rank the uncertainties by importance, depending on their role in decision-making and their degree of knowledge about the uncertainty. This served to further structure the interview. The uncertainties perceived as important were further discussed regarding the strategies that were employed to solve them, the goal of the strategy, what resources were used, and to what extent the goal was achieved. Lastly, the interviews focused on a reflection on the strategies. Respondents were asked if they would have done anything differently or better in retrospect, and whether uncertainties were resolved or remained.

3.2. Uncertainties Perceived in Practice and Strategies Employed for Dealing with Them

The problem of uncertainty in SEAs was recognized by all respondents. They all experienced uncertainties during the SEA process, and acknowledged they were not sufficiently dealt with. Figure 3 shows the results of the comparative analysis for uncertainties as perceived by respondents. It shows that nearly all types of uncertainty were experienced, except for uncertainty in project design. It also shows that each type of uncertainty is experienced in every case, although legal uncertainties are in the minority. Figure 4 shows the strategies that respondents employed, or saw others employing, in order to deal with the uncertainties. It also shows how successful the strategies were in the perception of the respondents. They were asked whether they thought the strategies were sufficient in reducing or managing uncertainties, and if alternative strategies might have had the same effect. Though subjective and qualitative, this assessment provides a first impression of the usefulness of the strategies in practice.
Below, we briefly summarize the four categories of uncertainty as experienced in the five cases, and how they were dealt with.

3.3. Dealing with Inherent Uncertainty

Of the inherent uncertainties, variability was experienced by all respondents. Uncertainties that were experienced in this category were, for example, uncertainty about other policies, the future course of a river, economic developments, and earthquakes. They refer to variability in the natural system, as well as in the socio-technological system. When choosing a strategy to deal with these uncertainties, the influential factors identified were the scope of the assignment, European legislation, documentation prior to the SEA process, and dependency on national policies. In other words, respondents perceived variability to be outside the control of the spatial plan, and to belong to European and national levels of decision-making, rather than to regional spatial plans. This is partly why, in two cases, managing this uncertainty was perceived to be not very successful; the large degree of detail in the assessment led to false certainty, and created unrest amongst decision-makers after a scientist expressed criticism of the calculations.
Our analytical framework suggests that uncertainty due to variability can be best dealt with using adaptive management and application of the precautionary principle, for example, by assessing different scenarios and applying a worst-case scenario. The results of the case studies confirm the use of these strategies in practice and their success: seven unique uncertainties were dealt with using adaptive management or the precautionary principle, of which six were identified as successful by respondents (Figure 4). More specifically, respondents found it successful if a possible bandwidth of effects was identified during the SEA process (minimal and maximal), and these values were used to steer developments in a monitoring program, combined with the implementation of mitigation measures if necessary (i.e., performance- or compliance-oriented monitoring). In other words, the combination of the two strategies ensured that developments could be managed within the given bandwidths of effects, and a control mechanism was in place for after decision-making. The uncertainty was not “solved”, but reduced and controlled, which respondents identified as successful.
In some cases, stakeholder involvement was employed to reduce inherent uncertainty, for example, by communicating about uncertainty, and through collaboration in international regimes on river safety. However, uncertainty remained and caused unrest amongst decision-makers. Stakeholder involvement was, thus, not successful for dealing with this category of uncertainty.

3.4. Dealing with Scientific Uncertainty

Of the scientific uncertainties, the identification of future activities, and of change and impact, were mentioned most often by respondents, and came up in multiple cases (Figure 3). Causes of these types of uncertainties, as mentioned by the respondents, were flexibility in the end goal, the long time horizon, and the inclusion of sustainability developments. When choosing a strategy to deal with scientific types of uncertainty, influential factors mentioned by respondents were the level of decision-making, the legal status of the plan, the advice of the NCEA, and resources, amongst others. Thus, it seems that strategies are focused on delivering just the amount of information that is necessary for decision-making about the plan. This limits the potential of SEA as a tool to design regulatory frameworks beyond what is legally required.
Our analytical framework suggests that scientific uncertainty is best dealt with using adaptive management, stakeholder involvement, and knowledge generation, for example, by working with different scenarios and alternatives, through a monitoring program, through consultation with stakeholders to define the change, by performing additional research, and through expert judgment. However, knowledge generation was not often applied, and the precautionary principle was only applied in some cases. The use of the precautionary principle in the form of worst-case scenarios appeared to be disputable, as respondents stated it was considered feasible only for more traditional plans, and not so much for new types of innovative planning. In The Netherlands, a shift in planning culture will be institutionalized in a new environmental and spatial planning act (“Omgevingswet”) in 2021. The legal reforms affect the content of spatial plans, and therefore, also the practice of EAs. The law enables more flexibility in spatial plans, and poses new demands, such as planning for a longer term of 20 years. It is expected that spatial planners will focus more on creating the right conditions for spatial development, rather than on determining specific goals. This means that, in their perspective, a worst-case scenario is not a realistic scenario. Uncertainties are, therefore, not reduced or resolved by assessing unrealistic scenarios. However, a strategy of determining environmental limits and values (the precautionary principle) in combination with a monitoring program was considered most successful. In the SEA, scenarios or so-called impact tests can be used to assess whether possible impacts fit within the available environmental space. This was not expected from theory. The combination of the precautionary principle (determining limits) and adaptive management (monitoring in combination with mitigation measures if needed) can optionally be further improved with stakeholder involvement or knowledge generation, for example, by identifying the thresholds with the use of quick scans (knowledge generation) to make sure these thresholds are valid, or by sharing information about these thresholds throughout the organization (stakeholder involvement, thus ensuring the maintenance of this information). This further improves the quality of, and support for, the assessment.
In view of our analytical framework, knowledge generation in practice seems a less successful strategy than we expected, whereas a specific form of the precautionary principle does seem successful, also contrary to our expectations. New insights can be gained from specific applications of a combination of adaptive management, the precautionary principle, and stakeholder involvement, rather than investing in additional research.

3.5. Dealing with Social Uncertainty

For social uncertainty, the values, interests, and perspectives of stakeholders were mentioned most often by respondents (Figure 3). This can range from discussions about environmental values to the definition of acceptable norms. Choosing strategies to deal with these types of uncertainties depends mostly on project-management skills. Respondents agreed that experience of the project team, as well as of the project manager, is needed to identify strategies for dealing with social types of uncertainty.
Based on the literature, we expected that social uncertainty would only be dealt with through stakeholder involvement (Figure 2). However, it turned out that, for our respondents, all strategies were used and were perceived to be successful in dealing with social uncertainties, especially the new combination of adaptive management, stakeholder involvement, and the precautionary principle. Dealing with social uncertainties requires, in the first place, negotiations with stakeholders. Stakeholders can have very strong opinions on what needs to be included in the SEA report, or on what the spatial plan should look like. This is especially relevant in the Dutch planning context, where stakeholders, especially the public, can be highly influential during the process. However, negotiation and discussion with stakeholders alone were often perceived to be insufficient for reducing social uncertainties. Respondents stated that tangible results need to be implemented in the spatial plan to give stakeholders a sufficient sense of security. This can be done by making tailor-made agreements about acceptable environmental limits (precautionary principle), by including these limits in a monitoring program and implementing extra mitigation measures if needed (adaptive management), and by including stakeholders in the whole planning process.
To ensure the success of this process, the most important factors are the early inclusion of stakeholders in the process and their willingness to negotiate. Willingness to negotiate is induced by awareness of the level of interdependency and the urgency of the values at stake. When to invite stakeholders to the table is a difficult question, and it differs for each type of spatial plan. Inviting stakeholders too early in the process, when plans are still very vague, could mean that they do not recognize their interests because they do not yet feel a sense of urgency. Inviting stakeholders too late could induce resistance if they feel their concerns and interests are not sufficiently recognized [60].

3.6. Dealing with Legal Uncertainty

Legal uncertainties were mostly experienced by respondents in the two pilot projects, Binckhorst The Hague and Almere Oosterwold, which is understandable given their experimental character. Both case studies were pilot projects in the context of the new environmental and spatial planning act. The pilot projects were conducted to explore possibilities and restrictions in new planning processes. These pilots also offered room for innovation in the role and content of SEAs in cases of more flexible plans, in which no blueprint of the plan was available. An implication, as indicated by the respondents, was that, in the pilot projects, it was uncertain what was required from the SEA and how legal security could be offered to stakeholders. Interestingly, the respondents explicitly acknowledged that they did not choose the more “traditional” SEA methods for dealing with uncertainty, because using alternative scenarios or worst-case situations was considered to give a poor representation of the real situation: the more flexible plans had no exact end goal and, therefore, an unlimited number of possible future outcomes for the planning area. In both pilot cases, and in the case of Eemsmond Delfzijl, respondents considered that a combination of strategies was needed: (1) identifying environmental thresholds to limit developments (precautionary principle); (2) consulting and cooperating with legal partners (stakeholder involvement); (3) quick scans to determine the baseline situation (knowledge generation); and (4) embedding all the previous strategies in a monitoring program in combination with possible mitigation measures (adaptive management). These strategies were considered to meet the need for information, to confirm the results of the SEA, and to provide solid management of uncertainties during the implementation of the plans.

4. Conclusions and Discussion

A first conclusion of our analysis is that the most dominant types of uncertainty, as experienced by our respondents, are uncertainties about variability, identification of future activities, identification of change and impact, and values, interests, and perceptions. Of these four dominant types of uncertainty, variability and change and impact seem the most difficult to address, as relatively few successful strategies were reported for them. It should be noted that this conclusion is based on respondents’ perceptions and on a limited set of cases.
Strategies that were applied in practice did not always seem to be in line with what we expected from our literature review. Moreover, successfully dealing with uncertainties is hardly ever achieved by applying a single strategy. An important second conclusion of our analysis is, therefore, that a combination of strategies is needed to successfully reduce or deal with uncertainties. This implies that the linkages between specific uncertainties and strategies in Figure 2 are perhaps too detailed, and that it is better to link strategies with categories of uncertainties (i.e., inherent, scientific, etc.).
Thirdly, adaptive management, in the form of monitoring and the implementation of mitigation measures where needed, was often mentioned as an appropriate strategy for dealing with uncertainties that could not be reduced during the SEA process. It implies that, after the SEA and the formal decision-making about the policy or plan at issue, environmental effects are monitored, and measures are taken if environmental thresholds are exceeded. However, while this strategy is in line with what other scholars proposed earlier, research has also shown that this form of “SEA follow-up” can often be problematic in practice [2,64].
A final conclusion is that actors working with SEAs do not seem to be very aware of the different types of uncertainties that we derived from our literature review. Our framework may help in raising awareness of uncertainties and why they matter, and help practitioners think more systematically and proactively about dealing with uncertainty. This would also require more practical guidance, tailored to specific SEA procedures and specific legislation.
Uncertainty is unavoidable in SEAs due to the generally high level of abstraction of policies and plans. Reducing or dealing with uncertainties is important, because they potentially undermine the goal of SEAs, which is to provide information about the environmental effects that can be expected, and to facilitate the development of mitigation measures for reducing these effects to acceptable levels. Due to uncertainties in the early stages of policy-making and planning, the “real” environmental impacts, as they manifest themselves during the implementation of policies or plans, may differ substantially from the impacts expected in an SEA, with major consequences for the environment and for public health. Developing additional mitigation measures may also delay implementation or lead to higher costs. Finally, inaccurate or incomplete assessments in SEAs may threaten the credibility of the instrument among decision-makers and the public.
In this paper, we developed an analytical framework linking different types of uncertainties to strategies for dealing with them. Although the uncertainties that we identified show some overlap and can be related, they can nevertheless be distinguished analytically, and they allow for a systematic exploration of uncertainties and of strategies for dealing with them. Applying the framework to five cases of spatial planning in The Netherlands allowed us to identify categories of uncertainties and strategies as they emerge in SEA practice. The framework was applied to a limited number of cases; applying it to cases elsewhere can provide more detailed insight into types of uncertainties, as well as strategies for coping with them. This is not only interesting from a scientific perspective, but also facilitates the transfer of best practices for identifying, reducing, or dealing with uncertainties. It is critical, however, to complement stakeholders’ perceptions of the “success” of dealing with uncertainties with a systematic, objective scientific assessment of that success.

Author Contributions

M.B. conducted the research and wrote the first draft of the paper. H.R. rewrote the paper. P.D., A.W., and K.v.d.W. provided feedback and text suggestions and helped sharpen the conclusions. A.W. contributed to further embedding the paper in the literature.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A. Outline of Case Studies and Interviewees

Appendix A.1. Waalweelde West

Waalweelde West is an area in the south of The Netherlands, including a part of the river Waal and flood plains with high ecological value. After the near-floods of 1993 and 1995, authorities addressed flood protection. A program was designed to reinforce dikes and to lower water levels by 80 cm during a flood. A structure vision (SV) was established to serve as an overarching framework for all projects and future developments, including economic, urban, and ecological activities. The SEA assessed three different alternatives that could be applied to each of the subprojects/areas. Alternatives were weighed with regard to the resulting water-level reduction, stakeholder interests, and cost-effectiveness. The NCEA found that the SEA report contained the essential information for deciding on the SV while taking environmental interests into account.

Appendix A.2. Binckhorst

Binckhorst is a neighborhood in The Hague of about 125 hectares, characterized by businesses in the creative and information technology (IT) sectors. Demographic forecasts predict large housing needs, which is why the Binckhorst is being transformed into a sustainable mixed urban residential area, creating space for about 5000 houses. In 2014, the municipality of The Hague started the procedure for an “Omgevingsplan”. This differed from a traditional plan in the sense that it had a longer time horizon (20 years), it covered all aspects of the physical environment (e.g., mobility), and its rules were flexible rather than definitive. The central pillars of the plan combined with the SEA were a flexible set of rules, a balanced approach (compensation for negative effects), crash tests (an evaluation mechanism), and a monitoring program. The NCEA found that the monitoring program should have been more explicit.

Appendix A.3. Oosterwold

Situated in the province of Flevoland, Oosterwold is an agricultural area of 4300 hectares. To meet large housing needs and a shortage of space in and around Amsterdam, a regional contract was signed in 2013 to stimulate growth in the region around Almere. Oosterwold offers room for 15,000 households and is meant to develop organically into a green, mixed living area with a low building density. To facilitate the development of housing, a structure vision was established in 2013, followed by a zoning plan in 2016. The same SEA report was used to provide information about environmental effects for both spatial plans. The spatial plans were designed at a high abstraction level and mostly contained ambitions and development principles for the housing areas. The regulatory framework was based on so-called “decision trees”. The NCEA identified essential shortcomings in the SEA report, as the proposed activities and their environmental impact remained unclear.

Appendix A.4. Greenport Venlo

Greenport Venlo is an area between the municipalities of Venlo, Horst aan de Maas, and Peel en Maas. It is located in the province of Limburg, close to the border with Germany. Situated in the area are businesses in agrologistics, horticulture, and (intensive) livestock farming. The area is to be developed sustainably, growing from 14,000 to 27,000 jobs by 2022, including the expansion of business parks and infrastructure and the creation of a high-quality landscape. An SEA was performed to study the environmental effects and possible mitigation measures of the plan. The NCEA recommended including environmental information at different levels of detail, because the SEA served as an overarching framework for future assessments of zoning plans. The SEA needed to include the probability that effects could occur (best-case and worst-case), and the way in which this would be monitored, evaluated, and addressed.

Appendix A.5. Eemsmond Delfzijl

Eemsmond Delfzijl lies in the province of Groningen. The region is characterized by a harbor, chemical industries, and energy industries. It is situated next to ecologically valuable areas, such as the Waddenzee and the Eems-Dollard estuary. To stimulate economic growth in the area, a structure vision was established in 2017, aiming to provide room for sustainable energy, to create an attractive business climate, to prevent environmental pollution, and to increase biodiversity. The SEA was based on different scenarios and included information about the environmental limits of the region. According to the NCEA, it included all the necessary environmental information for decision-making.

Appendix B. Interviewees

Table A1. Interviewees for the case studies.
No. | Affiliation | Organization | Date of Interview
EXPERT INTERVIEWS
1. | Secretary | NCEA | 1 November 2016
2. | Senior researcher | Utrecht University | 11 November 2016
3. | Senior advisor SEA | Arcadis NL | 21 November 2016
CASE I: WAALWEELDE WEST
4. | SEA project manager | Arcadis NL | 23 January 2017
5. | Project manager & SEA coordinator | Province of Gelderland | 26 January 2017
6. | Secretary | NCEA | 27 January 2017
7. | SEA specialist | Arcadis NL | 30 January 2017
CASE II: BINCKHORST
8. | Policymaker | Municipality of The Hague | 29 November 2016
9. | Project manager Omgevingsplan | Municipality of The Hague | 29 November 2016
10. | Strategic consultant | Antea Group | 8 December 2016
11. | Secretary | NCEA | 22 December 2016
12. | Legal advisor | Municipality of The Hague | 23 December 2016
CASE III: OOSTERWOLD
13. | Secretary | NCEA | 8 December 2016
14. | SEA practitioner & SEA project manager | Sweco | 10 January 2017
15. | SEA coordinator | Municipality of Almere | 25 January 2017
CASE IV: GREENPORT VENLO
16. | Project manager SEA | Arcadis NL | 19 December 2016
17. | Project manager SEA | Development Company | 21 December 2016
18. | Secretary | NCEA | 7 March 2017
CASE V: EEMSMOND DELFZIJL
19. | Project manager SEA | Arcadis NL | 13 January 2017
20. | Project manager SV Eemsmond Delfzijl | Province of Groningen | 20 January 2017
21. | SEA coordinator | Province of Groningen | 20 January 2017
22. | Secretary | NCEA | 7 March 2017

References

  1. Tetlow, M.; Hanusch, M. Strategic environmental assessment: The state of the art. Impact Assess. Proj. Apprais. 2012, 30, 15–24. [Google Scholar] [CrossRef]
  2. Runhaar, H.A.C.; Arts, J. Getting Ea Research out of the Comfort Zone: Critical Reflections From The Netherlands. J. Environ. Assess. Policy Manag. 2015, 17, 1550011. [Google Scholar] [CrossRef]
  3. Fischer, T.B. Theory and Practice of Strategic Environmental Assessment. Towards a More Systemic Approach; Earthscan: London, UK, 2007. [Google Scholar]
  4. Morgan, R.K. Environmental impact assessment: The state of the art. Impact Assess. Proj. Apprais. 2012, 30, 5–14. [Google Scholar] [CrossRef]
  5. Liu, Y.; Chen, J.; He, W.; Tong, Q.; Li, W. Application of an Uncertainty Analysis Approach to Strategic Environmental Assessment for Urban Planning. Environ. Sci. Technol. 2010, 44, 3136–3141. [Google Scholar] [CrossRef] [PubMed]
  6. Tennøy, A.; Kværner, J.; Gjerstad, K.I. Uncertainty in environmental impact assessment predictions: The need for better communication and more transparency. Impact Assess. Proj. Apprais. 2006, 24, 45–56. [Google Scholar] [CrossRef]
  7. Leung, W.; Noble, B.; Gunn, J.; Jaeger, J.A. A review of uncertainty research in impact assessment. Environ. Impact Assess. Rev. 2015, 50, 116–123. [Google Scholar] [CrossRef]
  8. Cardenas, I.C.; Halman, J.I. Coping with uncertainty in environmental impact assessments: Open criteria and techniques. Environ. Impact Assess. Rev. 2016, 60, 24–39. [Google Scholar] [CrossRef]
  9. Morrison-Saunders, A.; Arts, J. Assessing Impact: Handbook of EIA and SEA Follow-Up; Earthscan: London, UK, 2004. [Google Scholar]
  10. Noble, B.F. Strengthening EIA through adaptive management: A systems perspective. Environ. Impact Assess. Rev. 2000, 20, 97–111. [Google Scholar] [CrossRef]
  11. Wardekker, J.A.; van der Sluijs, J.P.; Janssen, P.H.; Kloprogge, P.; Petersen, A.C. Uncertainty communication in environmental assessments: Views from the Dutch science-policy interface. Environ. Sci. Policy 2008, 11, 627–641. [Google Scholar] [CrossRef]
  12. Zandvoort, M.; van der Vlist, M.J.; Klijn, F.; van den Brink, A. Navigating amid uncertainty in spatial planning. Plan. Theory 2018, 17, 96–116. [Google Scholar] [CrossRef]
  13. Lees, J.; Jaeger, J.A.G.; Gunn, J.A.E.; Noble, B.F. Analysis of uncertainty consideration in environmental assessment: An empirical study of Canadian EA practice. J. Environ. Plan. Manag. 2016, 568, 1–21. [Google Scholar] [CrossRef]
  14. Raadgever, G.T; Dieperink, C.; Driessen, P.P.J.; Smit, A.A.H.; van Rijswick, H.F.M.W. Uncertainty management strategies: Lessons from the regional implementation of the Water Framework Directive in The Netherlands. Environ. Sci. Policy 2011, 14, 64–75. [Google Scholar] [CrossRef]
  15. Brugnach, M.; Dewulf, A.; Pahl-Wostl, C.; Taillieu, T. Toward a relational concept of uncertainty: About knowing too little, knowing too differently, and accepting not to know. Ecol. Soc. 2008, 13, 30. [Google Scholar] [CrossRef]
  16. Arts, J.; Runhaar, H.A.C.; Fischer, T.B.; Jha-Thakur, U.; van Laerhoven, F.; Driessen, P.P.J.; Onyango, V. The effectiveness of EIA as an instrument for environmental governance: Reflecting on 25 years of EIA practice in The Netherlands and the UK. J. Environ. Assess. Policy Manag. 2012, 14, 1250025. [Google Scholar] [CrossRef]
  17. Van Asselt, M.; Petersen, A. Niet Bang Voor Onzekerheid; Advisory Council for Research on Spatial Planning, Nature and the Environment (RMNO): The Hague, The Netherlands, 2003. [Google Scholar]
  18. Slob, M. Zeker Weten. In Gesprek Met Politici, Bestuurders en Wetenschappers over Omgaan Met Onzekerheid; Rathenau Institute: The Hague, The Netherlands, 2006. [Google Scholar]
  19. Mathijssen, J.; Petersen, A.; Besseling, P.; Rahman, A.; Don, H. Dealing with Uncertainty in Policymaking; CPB, PBL, RAND Europe: The Hague, The Netherlands, 2008. [Google Scholar]
  20. Brenninkmeijer, A.; de Graaf, B.; Roeser, S.; Passchier, W. Omgaan Met Omgevingsrisico’s en Onzekerheden: Hoe Doen We Dat Samen? Ministry of Infrastructure and the Environment: The Hague, The Netherlands, 2013. [Google Scholar]
  21. Petersen, A.C.; Cath, A.; Hage, M.; Kunseler, E.; van der Sluijs, J.P. Post-Normal Science in Practice at The Netherlands Environmental Assessment Agency. Sci. Technol. Hum. Values 2011, 36, 362–388. [Google Scholar] [CrossRef] [PubMed]
  22. Petersen, A.C.; Janssen, P.H.; van der Sluijs, J.P.; Risbey, J.S.; Ravetz, J.R.; Wardekker, J.A.; Martinson Hughes, H. Guidance for Uncertainty Assessment and Communication; PBL Netherlands Environmental Assessment Agency: The Hague, The Netherlands, 2013. [Google Scholar]
  23. Wardekker, J.A.; Kloprogge, P.; Petersen, A.C.; Janssen, P.H.M.; van der Sluijs, J.P. Guide for Uncertainty Communication; PBL Netherlands Environmental Assessment Agency: The Hague, The Netherlands, 2013. [Google Scholar]
  24. Kunseler, E.M.; Tuinstra, W.; Vasileiadou, E.; Petersen, A.C. The reflective futures practitioner: Balancing salience, credibility and legitimacy in generating foresight knowledge with stakeholders. Futures 2015, 66, 1–12. [Google Scholar] [CrossRef]
  25. Phillips, P.D. Evaluating Approaches to Dealing with Uncertainties in Environmental Assessment. Master’s Thesis, University of East Anglia, Norwich, UK, 2005. [Google Scholar]
  26. Pavlyuk, O. An Analysis of Legislation and Guidance for Uncertainty Disclosure and Consideration in Canadian Environmental Impact Assessment. Ph.D. Thesis, University of Saskatchewan, Saskatoon, SK, Canada, 2016. [Google Scholar]
  27. Larsen, S.V.; Kørnøv, L.; Driscoll, P. Avoiding climate change uncertainties in Strategic Environmental Assessment. Environ. Impact Assess. Rev. 2013, 43, 144–150. [Google Scholar] [CrossRef] [Green Version]
  28. Van der Sluijs, J.P.; Petersen, A.C.; Janssen, P.H.; Risbey, J.S.; Ravetz, J.R. Exploring the quality of evidence for complex and contested policy decisions. Environ. Res. Lett. 2008, 3, 024008. [Google Scholar] [CrossRef] [Green Version]
  29. Wibeck, V. Communicating uncertainty: Models of communication and the role of science in assessing progress towards environmental objectives. J. Environ. Policy Plan. 2009, 11, 87–102. [Google Scholar] [CrossRef]
  30. Thissen, W.; Kwakkel, J.; Mens, M.; van der Sluijs, J.; Stemberger, S.; Wardekker, A.; Wildschut, D. Dealing with uncertainties in fresh water supply: Experiences in The Netherlands. Water Resour. Manag. 2017, 31, 703–725. [Google Scholar] [CrossRef]
  31. Woodruff, S.C. Planning for an unknowable future: Uncertainty in climate change adaptation planning. Clim. Chang. 2016, 139, 445–459. [Google Scholar] [CrossRef]
  32. Walker, W.E.; Harremoës, P.; Rotmans, J.; van der Sluijs, J.P.; van Asselt, M.B.A.; Janssen, P.; Krayer von Krauss, M.P. Defining Uncertainty: A Conceptual Basis for Uncertainty Management in Model-Based Decision Support. Integr. Assess. 2003, 4, 5–17. [Google Scholar] [CrossRef]
  33. Sigel, K.; Klauer, B.; Pahl-Wostl, C. Conceptualising uncertainty in environmental decision-making: The example of the EU water framework directive. Ecol. Econ. 2010, 69, 502–510. [Google Scholar] [CrossRef] [Green Version]
  34. Klauer, B.; Brown, J.D. Conceptualising imperfect knowledge in public decision making: Ignorance, uncertainty, error and ‘risk situations’. Eng. Manag. 2004, 27, 124–128. [Google Scholar]
  35. Refsgaard, J.C.; van der Sluijs, J.P.; Højberg, A.L.; Vanrolleghem, P.A. Uncertainty in the environmental modelling process—A framework and guidance. Environ. Model. Softw. 2007, 22, 1543–1556. [Google Scholar] [CrossRef]
  36. Ascough, J.C.; Maier, H.R.; Ravalico, J.K.; Strudley, M.W. Future research challenges for incorporation of uncertainty in environmental and ecological decision-making. Ecol. Model. 2008, 219, 383–399. [Google Scholar] [CrossRef]
  37. Wardekker, J.A. Climate Change Impact Assessment and Adaptation under Uncertainty. Ph.D. Thesis, Utrecht University, Utrecht, The Netherlands, 2011. [Google Scholar]
  38. Kwakkel, J.H.; Walker, W.E.; Marchau, V.A. Classifying and communicating uncertainties in model-based policy analysis. Int. J. Technol. Policy Manag. 2010, 10, 299–315. [Google Scholar] [CrossRef]
  39. De Marchi, B. Uncertainty in environmental emergencies: A diagnostic tool. J. Conting. Crisis Manag. 1995, 3, 103–112. [Google Scholar] [CrossRef]
  40. Van Asselt, M.B.A. Perspectives on Uncertainty and Risk: The PRIMA Approach to Decision Support; Kluwer Academic Publishers: Dordrecht, The Netherlands, 2000. [Google Scholar]
  41. Meijer, I.S.; Hekkert, M.P.; Faber, J.; Smits, R.E. Perceived uncertainties regarding socio-technological transformations: Towards a framework. Int. J. Foresight Innov. Policy 2006, 2, 214–240. [Google Scholar] [CrossRef]
  42. Pahl-Wostl, C.; Sendzimir, J.; Jeffrey, P.; Aerts, J.; Berkamp, G.; Cross, K. Managing change toward adaptive water management through social learning. Ecol. Soc. 2007, 12, 30. [Google Scholar] [CrossRef]
  43. Maier, H.R.; Ascough II, J.C.; Wattenbach, M.; Renschler, C.S.; Labiosa, W.B.; Ravalico, J.K. Environmental modelling, software and decision support. Develop. Integr. Environ. Assess. 2008, 3, 69–85. [Google Scholar]
  44. Broekmeyer, M.E.; Opdam, P.F.; Kistenkas, F.H. Het Bepalen van Significante Effecten: Omgaan Met Onzekerheden; Alterra: Wageningen, The Netherlands, 2008. [Google Scholar]
  45. Knol, A.B.; Petersen, A.C.; Van der Sluijs, J.P.; Lebret, E. Dealing with uncertainties in environmental burden of disease assessment. Environ. Health 2009, 8, 21. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  46. Maxim, L.; van der Sluijs, J.P. Quality in environmental science for policy: Assessing uncertainty as a component of policy analysis. Environ. Sci. Policy 2011, 14, 482–492. [Google Scholar] [CrossRef]
  47. Skinner, D.J.; Rocks, S.A.; Pollard, S.J.; Drew, G.H. Identifying uncertainty in environmental risk assessments: The development of a novel typology and its implications for risk characterization. Hum. Ecol. Risk Assess. 2014, 20, 607–640. [Google Scholar] [CrossRef] [Green Version]
  48. Kloprogge, P.; Van der Sluijs, J.P.; Petersen, A.C. A method for the analysis of assumptions in model-based environmental assessments. Environ. Model. Softw. 2011, 26, 280–301. [Google Scholar] [CrossRef]
  49. De Jong, A.; Wardekker, J.A.; Van der Sluijs, J.P. Assumptions in quantitative analyses of health risks of overhead power lines. Environ. Sci. Policy 2012, 16, 114–121. [Google Scholar] [CrossRef] [Green Version]
  50. Jakeman, A.J.; Letcher, R.A.; Norton, J.P. Ten iterative steps in development and evaluation of environmental models. Environ. Model. Softw. 2006, 21, 602–614. [Google Scholar] [CrossRef]
  51. Guillera-Arroita, G.; Lahoz-Monfort, J.J.; Elith, J.; Gordon, A.; Kujala, H.; Lentini, P.E.; McCarthy, M.A.; Tingley, R.; Wintle, B.A. Is my species distribution model fit for purpose? Matching data and models to applications. Glob. Ecol. Biogeogr. 2015, 24, 276–292. [Google Scholar] [CrossRef] [Green Version]
  52. Wood, G. Thresholds and criteria for evaluating and communicating impact significance in environmental statements: “See no evil, hear no evil, speak no evil”? Environ. Impact Assess. Rev. 2008, 28, 22–38. [Google Scholar] [CrossRef]
  53. Weiss, C. Scientific uncertainty and science-based precaution. Int. Environ. Agreem. 2003, 3, 137–166. [Google Scholar] [CrossRef]
  54. Weiss, C. Can there be science-based precaution? Environ. Res. Lett. 2006, 1, 014003. [Google Scholar] [CrossRef] [Green Version]
  55. Peterson, G.D.; Cumming, G.S.; Carpenter, S.R. Scenario planning: A tool for conservation in an uncertain world. Conserv. Biol. 2003, 17, 358–366. [Google Scholar] [CrossRef]
  56. Allen, C.R.; Garmestani, A.S. Adaptive Management of Social-Ecological Systems; Springer: Dordrecht, The Netherlands, 2015. [Google Scholar]
  57. Cooke, R.M. Experts in Uncertainty: Opinion and Subjective Probability in Science; Oxford University Press: New York, NY, USA, 1991. [Google Scholar]
  58. Harremoës, P.; Gee, D.; MacGarvin, M.; Stirling, A.; Keys, J.; Wynne, B.; Guedes Vaz, S. Late Lessons from Early Warnings: The Precautionary Principle 1896–2000; European Environmental Agency: Copenhagen, Denmark, 2001. [Google Scholar]
  59. Hage, M.; Leroy, P.; Petersen, A.C. Stakeholder participation in environmental knowledge production. Futures 2010, 42, 254–264. [Google Scholar] [CrossRef]
  60. Glucker, A.N.; Driessen, P.P.; Kolhoff, A.; Runhaar, H.A. Public participation in environmental impact assessment: Why, who, and how? Environ. Impact Assess. Rev. 2013, 43, 104–111. [Google Scholar] [CrossRef]
  61. Isendahl, N.; Pahl-Wostl, C.; Dewulf, A. Using framing parameters to improve handling of uncertainties in water management practice. Environ. Policy Gov. 2010, 20, 107–122. [Google Scholar] [CrossRef]
  62. Holling, C.S. Adaptive Environmental Assessment and Management; The Pitman Press: Bath, UK, 1978. [Google Scholar]
  63. Canter, L.; Atkinson, S.F. Adaptive management with integrated decision making: an emerging tool for cumulative effects management. Impact Assess. Proj. Apprais. 2010, 28, 287–297. [Google Scholar] [CrossRef] [Green Version]
  64. Morrison-Saunders, A.; Arts, J. Learning from experience: Emerging trends in environmental impact assessment follow-up. Impact Assess. Proj. Apprais. 2005, 23, 170–174. [Google Scholar] [CrossRef]
  65. Williams, B.K.; Szaro, R.C.; Shapiro, C.D. Adaptive Management: The U.S. Department of the Interior Technical Guide, 2nd ed.; U.S. Department of the Interior: Washington, DC, USA, 2009. [Google Scholar]
  66. United Nations. The Rio Declaration on Environment and Development; United Nations: Rio de Janeiro, Brazil, 1992. [Google Scholar]
  67. United Nations Educational, Scientific and Cultural Organization. The precautionary principle. In World Commission on the Ethics of Scientific Knowledge and Technology; UNESCO: Paris, France, 2005. [Google Scholar]
  68. Bond, A.; Morrison-Saunders, A.; Gunn, J.A.; Pope, J.; Retief, F. Managing uncertainty, ambiguity and ignorance in Impact Assessment by embedding evolutionary resilience, participatory modelling and adaptive management. J. Environ. Manag. 2015, 151, 97–104. [Google Scholar] [CrossRef] [PubMed]
  69. Khosravi, F.; Jha-Thakur, U. Managing uncertainties through scenario analysis in strategic environmental assessment. J. Environ. Plan. Manag. 2018. [Google Scholar] [CrossRef]
Figure 1. A typology of uncertainties relevant for strategic environmental assessments (SEAs).
Figure 2. Analytical framework linking strategies to uncertainties, and their measure of success. Note: categories of uncertainties are represented in colors, green = inherent, blue = scientific, red = social, and yellow = legal.
Figure 3. Uncertainties encountered in the five cases. Note: the colors refer to types of uncertainty, inherent = green, scientific = blue, social = red, and legal = yellow.
Figure 4. Strategies employed in the five cases and their connection with perceived uncertainties. Note: this figure shows unique examples of uncertainties coupled to strategies, their success rate, and the theoretical expectation for strategies. For example, 2/2 means the type of uncertainty was dealt with using adaptive management twice and this was perceived as successful both times; 4/6 means the type of uncertainty was dealt with using stakeholder involvement six times, of which four times were perceived as successful. The colors represent the theoretical expectation.
Table 1. Categorizations and examples of decision-making-centered uncertainties in environmental research.
Reference | Typologies or Interpretations of Uncertainty
[39] | Scientific, legal, moral, societal, institutional, proprietary, situational
[40] | Uncertainty due to variability: natural randomness, value diversity, behavioral variability, societal randomness, technological surprise. Uncertainty due to lack of knowledge: unreliability, structural uncertainty
[32] | Location of uncertainty: context (natural, technical, economic, social, political, representation), model (structure, technical aspects), inputs (driving forces, system data), parameters, model outcomes. Level of uncertainty: statistical, scenario, recognized ignorance, total ignorance. Nature of uncertainty: epistemic or variability
[41] | Nature: knowledge uncertainty, variability uncertainty. Level: from low to high. Source: technology, resources, competitors, suppliers, consumers, politics
[42] | Lack of knowledge due to limited data; understanding of the system; unpredictability of factors in the system (randomness); diversity of rules and mental models that determine stakeholder perceptions
[15] | Unpredictability, incomplete knowledge, or multiple knowledge frames about the natural system, technical system, or social system
[43] | Data uncertainty, model uncertainty, human uncertainty
[36] | Knowledge uncertainty: process understanding, model/data uncertainty. Variability: natural, human, institutional, technological. Linguistic uncertainty: vagueness, ambiguity, underspecificity
[44] | Data or methods/knowledge gaps; inherent to complexity/ecological systems; societal interpretation of effects and values
[45] | Location: model structure, parameters, input data. Nature: epistemic, ontic (process variability, normative uncertainty). Range: statistical (range + chance), scenario (range + “what if”). Recognized ignorance. Methodological unreliability. Value diversity among analysts
[38] | Location: system boundary, conceptual model, computer model (structure, parameters inside model, input parameters to model), input data, model implementation, processed output data. Level: shallow, medium, deep, recognized ignorance. Nature: ambiguity, epistemology, ontology
[46] | Location in a model: content, process, context of knowledge. Sources: lack of knowledge, variability, expert subjectivity, communication
[47] | Epistemic uncertainty: data, language, system. Aleatory uncertainty: variability, extrapolation. Combined epistemic/aleatory uncertainty: model, decision
Table 2. Uncertainties and a measurement of successful outcomes of management approaches.
Type | What Is Uncertain? | What Is a Successful Outcome?
Inherent (unpredictability in the natural system) | The full range of options and impacts | A bandwidth of options or effects is identified; it is explicitly addressed in future programs and/or intervention measures are explicitly available
Scientific (limited or false information) | Outcomes of models are inaccurate or not representative | Additional information is provided, or the issue is explicitly addressed in a follow-up program
Social (doubts or ambiguity about information) | Differences in values, interests, and perceptions | Support for the plan is obtained
Legal (decision-making context) | The justification of decisions | Acknowledgement or justification through public appeal or authorities
Table 3. Illustrative cases and their characteristics.
Case, Publishing Year, and Consultancy Firm | Project Characteristics
#1 Structure Vision Waalweelde West (2015) | Initiator: Province of Gelderland. Goal: flood protection program including regional economic, urban, and ecological development
#2 Zoning plan Binckhorst Den Haag (2015–ongoing) | Initiator: Municipality of The Hague. Goal: transformation of the Binckhorst into a sustainable mixed urban living area (5000 houses)
#3 Structure Vision Almere Oosterwold (2013) & Zoning plan Oosterwold (2016) | Initiator: Municipality of Almere. Goal: sustainable development of Oosterwold into a low-density living/working area (15,000 houses)
#4 Structure Vision Greenport Venlo (2012) | Initiator: Development Company Greenport Venlo (partnership of public organizations and private shareholders). Goal: sustainable development of agribusiness (14,000 jobs)
#5 Structure Vision Eemsmond Delfzijl (2017) | Initiator: Province of Groningen. Goal: industrial development of the Eemsmond region (wind farms, heliport, industrial areas)
