1. Introduction
In a strange paradox, we claim to live in a knowledge society [1,2], yet we are largely ignorant. At least, there are many more things we are ignorant of than knowledgeable of. Despite ignorance being ever so prominent in our relationship to knowledge, it is viewed, more often than not, as a bane. While there has been some limited interest in ignorance in various branches of scholarship [3,4], the common perception of ignorance as a negative trait has left it rather unloved in debates around making knowledge public, including science communication and its various cousins. We suffer an "ignorance of ignorance", or, as Ravetz eloquently put it, "ignorance-squared" [5]. Yet ignorance is a complex and essential part of science, from being a driver of curiosity to setting limits on expertise [6]. Ignorance performs a number of legitimate roles, and is performed in a range of legitimate ways within science. On the other hand, ignorance can also be misused and abused, intentionally or naively.
In this paper, I will make a case for the importance of the public understanding of ignorance to the public understanding of science. I will argue that understanding when ignorance is a legitimate part of the scientific process—and when ignorance is misused or abused in science—is central to understanding science in terms of traditional public understanding of science, but especially if we think in terms of critical science literacy.
Critical science literacy argues for more than simply an understanding of scientific facts and processes; it argues that an understanding of the tacit knowledge of science is a key component of what scientific understanding and literacy should aim for [7].
To make my case, I begin by defining the limits of ignorance, making the case that ignorance, far from being a curse we ought to break free from in our lives, is our stock-standard underlying condition (so better we learn to live happily with it!). Drawing on the existing scholarship, I then present a typology of ignorance, and pay special attention to its uses (and misuses) in science. In the third part of this paper, I make explicit what I take to be critical science literacy, locating it within the greater context of science communication and the public understanding of science. Lastly, I bring the former parts together to argue that fostering a greater public understanding of ignorance is a rarely acknowledged, yet essential, aspect of making science public, and is a challenge that those engaged in and committed to a better public understanding of science should take very seriously.
2. What Is Ignorance
If I were to call someone ignorant, they would likely be offended. Ignorance is usually considered a flaw in an individual. It is presented as the antithesis of knowledge, and knowledge is good. Being knowledgeable is a compliment. Knowledge is the light (as the World Bank put it [8]) that will dispel the darkness of ignorance [9]. Indeed, there seems to be little love for (and little research on) ignorance. As Proctor and Schiebinger, who coined the term "agnotology", claim: "Ignorance hides in the shadows of philosophy and is frowned upon in sociology" (2008). Yet even they mostly present ignorance in a bad light, opening their volume about the topic with a goal: "to explore how ignorance is produced or maintained in diverse settings, through mechanisms such as deliberate or inadvertent neglect, secrecy and suppression, document destruction, unquestioned tradition, and myriad forms of inherent (or avoidable) culturopolitical selectivity". Proctor and Schiebinger do not use the term agnotology as the study of ignorance in all its (gore and) glory; it is reserved for the study of culturally induced ignorance. When we study ignorance, it is usually the bad aspects we focus on.
Sure, not all ignorance is bad. Scholars have noted that ignorance can be strategic and helpful [10]. More relevant for this paper, it has also been noted that ignorance is an essential part of science [11]; it is what drives science and allows for discovery. There would be no search for an answer or solution to a scientific question if we were not initially lacking one—if we were not firstly ignorant.
Ignorance, at its most basic, is "not knowing". In particular, the focus in this paper is on the "not knowing" of facts, claims, information, and so forth in two ways: either lacking any knowledge or holding false knowledge. This view assumes a broad definition of ignorance, which will be discussed as we proceed. However, there are some clarifications to get out of the way first. Someone might be referred to as being ignorant because they "ignore" social customs or trespass some moral or social code. This disregarding of norms is not (necessarily) a result of ignorance as "not knowing"—it is sometimes intentional non-compliance. Indeed, as Margaret Atwood noted: "Ignoring isn't the same as ignorance, you have to work at it" [12]. The distinction between ignorance and ignoring is one area where knowledge and ignorance are asymmetrical; you can be ignorant without actively ignoring, but "[f]or knowledge to be, there must have been, at some stage, an act of knowing" [13]. This paper will deal with both ignorance (the state) and ignoring (the act), as the latter can also sometimes lead to ignorance, as will be discussed further below.
Thinking of ignorance in this way narrows the field, yet it takes nothing away from the enormity of our ignorance, both as individuals and collectively. Put simply, while each individual may have some limited, finite knowledge about a few topics, the volume of claims and propositions any individual is ignorant of is infinite. Put mathematically, at an individual level, our knowledge, relative to our ignorance, approaches zero. Collectively, we are not much better off. While there is much more collective knowledge (defined broadly in terms of complementary knowledge and embedded knowledge as the sum of the knowledge held, not just in the individuals, but in the collective artefacts, books, etc. [14]), there is still significantly more ignorance than knowledge. Worse still, knowledge (and science is an exemplar case here) does not so much dispel ignorance as open new spaces of ignorance; ignorance "increases with every state of new knowledge" [4]. The more we know, the more we know we do not know. Ignorance, it turns out, is our most fundamental underlying condition. As humans, we are mostly ignorant, with few, rare glimpses of knowledge, and there are many ways we can carve up our ignorance.
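The "approaches zero" claim can be given a rough formal sketch (an illustrative formalization of my own, not drawn from the cited works): let $K_n$ be the number of claims an individual knows and $I_n$ the number of formulable claims they are ignorant of, as the space of formulable claims grows without bound.

```latex
% Illustrative sketch: knowledge relative to ignorance.
% K_n is bounded (finite individual knowledge), while I_n grows
% without bound as new claims become formulable, so
\lim_{n \to \infty} \frac{K_n}{I_n} = 0
% i.e., the ratio of what is known to what is not known
% tends to zero at the individual level.
```

The same sketch applies, with a larger but still bounded numerator, at the collective level.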
3. Typology of Ignorance
Unsurprisingly, it was the great sociologist of science, Merton, who first brought our attention to the functions of ignorance in our structure of knowledge, and specifically in science. He carved up ignorance into specified and unrecognized ignorance. Merton used specified ignorance to refer to "the express recognition of what is not yet known but needs to be known in order to lay the foundation for still more knowledge" [15] and noted that "specified ignorance is often a first step toward supplanting that ignorance with knowledge" [16]. Unrecognized ignorance, by contrast, is that which is left off our radars such that "as yesterday's uncommon knowledge becomes today's common knowledge, so yesterday's unrecognized ignorance becomes today's specified ignorance" [15]. For the sake of simplicity, I will refer to these as recognized and unrecognized ignorance, respectively.
We can further formalize the typology of ignorance by drawing our attention to what cannot be known. As Ungar explains, there is "ignorance that exists beyond the boundaries of knowledge, such as scientific ignorance about new or unknown phenomena, ranging from the aspects of the brain through to black holes" [17]. Wilholt distinguishes between what he terms conscious ignorance (Merton's recognized ignorance) and deep ignorance, where deep ignorance is the subclass of recognized ignorance that, while recognized, has no candidate answers [18]. An example might be knowing what it is like inside a black hole; we can ask the question and recognize our ignorance, but we also know we cannot answer it and are bound to, at least for now, remain ignorant. Wilholt further distinguishes between unrecognized ignorance (which he terms opaque ignorance) and thoroughly opaque ignorance, where thoroughly opaque ignorance is the subclass of unrecognized ignorance that is not only left off our radar, but could not even make it onto our radar, as we do not have the conceptual capacity to formulate the question [18]. An example here is Aristotle's ignorance of what it is like inside a black hole. Not only is there no answer to that question, but Aristotle also lacked the conceptual capacity to even make sense of the question, given the state of astronomy in classical Greece.
Ignorance has also received substantial treatment in formal systems that try to contain or manage our ignorance, especially in scientific and technical systems. In such settings, ignorance is treated along formal lines, in terms of probability assignments, risk analysis, and formal decision-theoretic frameworks. Risk, uncertainty, and probabilities are well-respected and established family relations of ignorance [19]. These require some form of ignorance and can be seen as ways to manage our ignorance. We can think of risk and uncertainty as fundamentally about ignorance as to which state of affairs is true, and probabilities as a way of quantifying this ignorance. Note that while risk and uncertainty are related to ignorance, they are not equivalent to it; there is much to ignorance that lies beyond them. Indeed, the concept of ignorance is an integral part of decision theory and risk analysis, such as in "decisions under ignorance", where it is interpreted as a unique and "extreme state of uncertainty" defined as "a singular state of knowledge characterized by knowing nothing or having no reliable information about the phenomenon of interest" [20]. Treating ignorance in this way, though, continues the long-standing scientific narrative and ideal of managing or "replacing ignorance by knowledge, with little attention to the formation of a useful kind of ignorance" [15]. This paper, to some extent, hopes to redress this by giving attention to this "useful kind of ignorance".
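To illustrate how decision theory operationalizes "decisions under ignorance": when no probabilities can be assigned to the possible states of the world, a cautious decision-maker may apply the maximin criterion, choosing the action whose worst-case payoff is best. The following sketch is purely illustrative (the payoff numbers and action names are my own invention, not drawn from the cited works):

```python
def maximin(payoffs):
    """Choose the action with the best worst-case payoff.

    `payoffs` maps each action to its payoffs across the possible
    states of the world; under ignorance, no probabilities are
    assigned to those states, so only the worst case is compared.
    """
    return max(payoffs, key=lambda action: min(payoffs[action]))

# Hypothetical payoff table: each action's payoff under three
# states of the world we are wholly ignorant about.
table = {
    "build_levee": [4, 4, 4],   # modest payoff in every state
    "do_nothing":  [9, 2, -5],  # great in one state, ruinous in another
}

print(maximin(table))  # the cautious choice: "build_levee"
```

The point of the sketch is that the formalism does not remove the ignorance; it works around it by comparing only guaranteed outcomes.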
Table 1 shows how the formal recognized/unrecognized typology of ignorance mentioned above relates to the more recent social, agnotological typology, which is discussed below.
Since the mid-2000s, ignorance has taken a social turn, most notably since the publication of Proctor and Schiebinger's Agnotology [21]. Rather than focusing on formal structures and divisions of ignorance, this recent turn focuses our attention on the socio-cultural aspects of ignorance. Much like the more formal work on ignorance can be thought of as the ignorance counterpoint to epistemology, the work stemming from agnotology can be thought of as the ignorance counterpoint to social epistemology [22]. Proctor and Schiebinger also offer a typology, but fitting the social move, one based on socio-cultural attributes. Agnotology distinguishes between the active construction of ignorance and the passive construction of ignorance. Active constructions of ignorance are intentional forms of ignorance, while passive constructions of ignorance are "the unintended by-product of choices made in the research process" [3]. Active productions of ignorance come in a number of flavors, from strategic ignorance, obscurantism, and other anti-epistemic strategies (creating doubt where knowledge exists) to virtuous ignorance.
Strategic ignorance (though this might be better termed strategic ignoring) is the intentional use of ignorance for strategic purposes. Much of the focus on strategic ignorance has been on less than admirable instances. Indeed, McGoey, in her landmark book on the topic, defines strategic ignorance as "actions which mobilize, manufacture or exploit unknowns in a wider environment to avoid liability for earlier actions" [23]. This includes examples of bank executives ignoring their underlings' actions so as to be able to truthfully claim ignorance of them. Strategic ignorance is related to what DeNicola terms "nescience", "what we or others have determined not to know" [24]. In fact, DeNicola considers strategic ignorance a form of "nescience". Other forms include rational ignorance (when knowing is not worth the effort) and willful ignorance (when it is better not to know some piece of information, especially where knowing it could be painful or paralyzing, for example, "when one is unable to summon the courage to jump a ravine and thereby get to safety, because one knows that there is a serious possibility that one might fail to reach the other side" [25]). As these are closely related, I will stick with strategic ignorance as an overarching term.
Virtuous ignorance is a special case of active construction where knowledge is intentionally not pursued for moral reasons. A classic case we may think of is the near universal decision to ban human cloning. Banning human cloning means we remain ignorant of many of the things we could and would learn through the development and application of this technology. Note that some scholars have argued this intentional creation of ignorance is not virtuous [26]. Still, broadly, the consensus view is that it is better for us to remain ignorant than to start playing around with human cloning. It is a virtuous ignorance.
Both strategic ignorance and virtuous ignorance require intentional ignorance—actively not-knowing something that could be known. As such, they are a form of recognized ignorance but not deep ignorance (or at least, they do not necessitate deep ignorance). In both cases, we recognize we are ignorant and we could, in principle, gain knowledge about the issue, but we choose not to. Obscurantism, on the other hand, while also a form of active ignorance, is one that plays on the intersection between plain old recognized ignorance and deep ignorance. Obscurantism and what Carrier terms "anti-epistemic strategies" are the moves made by various epistemic actors that "damage or hurt the production of knowledge" [27]. Classic examples include Wakefield's false MMR vaccine study, and the well-rehearsed doubt-creating moves of both the tobacco lobby and the climate-change denial lobby. These actors have, through various means, created doubt and uncertainty, and have actively worked to maintain a level of ignorance, most classically by using the existing knowledge-creating structures ("we need more research still before we can be certain humans cause climate change", they might say). The doubt-flavored variety of ignorance that emerges relies, in part, on balancing what we do not yet know and what we cannot yet know. In doing so, anti-epistemic strategies, while actively creating, fostering, and feeding ignorance, straddle the space between recognized ignorance and deep ignorance.
The passive construction of ignorance, on the other hand, is the residual ignorance of our epistemic efforts and sits squarely within the realm of unrecognized ignorance. The manner in which knowledge is pursued always leaves some areas non-investigated. An exemplar case is the standard use of crash-test dummies that are sized to represent male drivers as opposed to female drivers [28]. As a result, car safety engineers are significantly more ignorant of the effects of crashes on female drivers than on male drivers, and are therefore less able to optimize car safety for all genders equally. The very real-world result of this ignorance is that female drivers are significantly more likely to suffer injury as a result of accidents. Indeed, "the odds of a belt-restrained female driver sustaining an MAIS 3+ and MAIS 2+ injury were 47% (95% CI = 27%, 70%) and 71% (95% CI = 44%, 102%) higher, respectively, than those of a belt-restrained male driver" [28].
The passive construction of ignorance, then, is ignorance of some fact that could be known, where the lack of knowledge has not been noticed. As such, the passive construction of ignorance is a form of unrecognized ignorance, but not of thoroughly opaque ignorance (we could, in principle, gain knowledge about the issue). Of course, the catch with providing an example, such as the effect of gender assumptions in crash-test dummies above, is that providing an example requires recognizing the ignorance in the first place. Once this happens, the ignorance moves from unrecognized passive construction to recognized ignorance. One can only hope that, now this has been recognized, steps will be taken to redress the ignorance.
What stands out from these distinctions within ignorance, and the discussions motivating the typologies, is that aside from virtuous ignorance (which is noted, but rarely discussed in more depth), ignorance is a bad object. It is something to be either managed or overcome. This is particularly true in science, where it has been noted that "uncertainty is there to be banished, and ignorance is to be rolled back beyond the horizon" [5].
4. Ignorance in Science
The negative aspects of ignorance have been well-rehearsed—from the negative effects of false beliefs, be it intentional misinformation or otherwise, to the harms of unrecognized ignorance, from naïve p-hacking to biases such as the case of the crash-test dummies. I will therefore not discuss them further here but take them as acknowledged. What has received less, if any, attention are the many very important and positive roles and functions ignorance plays in science. Ignorance, I will suggest, is not only beneficial to science, but is in fact an essential component of science. I will focus on three aspects. Firstly, ignorance as a foundational basis for knowing. That is the easy one. Secondly, ignorance as a central, standard part of the culture and practice of science. Lastly (and more importantly), ignorance as a fundamental and powerful tool, essential for creating increasingly important, insightful, and complex knowledge. Science not only interacts with ignorance, it intentionally uses ignorance as a tool to improve its epistemic pursuits.
Ignorance is a prerequisite in science. Science, no matter how we interpret the term, is at least to some extent concerned with gaining new knowledge and improving our understanding [29]. Of course, there would be no need to improve our understanding or to gain new knowledge if we were not, to begin with, ignorant. It takes not knowing something for that knowledge to be novel. Scientific discoveries can only be "discovered" if they are not already known—if we are currently ignorant of them. Some science will lead to new knowledge accidentally—scientists might simply stumble on something new. However, much of science is about intentionally responding to an acknowledged lack of knowledge, an acknowledged ignorance. Highlighting what is not known but important or relevant to know in a field is often the first step in science. Indeed, the recent moves to increasingly pre-register trials and studies [30,31] rely, amongst other things, on first clearly acknowledging and articulating our ignorance (and how we plan to overcome it) before we even start our investigations. Ignorance, then, is the first step to science. Firestein, in fact, goes as far as to argue that ignorance, especially recognized ignorance, is "the most critical part of the whole operation" (of science) [6].
Ignorance also plays a critical role in the cultural practice of science. While science can be defined in terms of method or in terms of sets of facts [32], science is also a set of cultural norms and practices. The culture of science includes everything from the way citations are used as a marker of expertise [33] to the value put on professional development activities beyond lab-based skills [34]. A fundamental cultural practice in science is peer review. Indeed, for a field fundamentally concerned with producing high-quality knowledge, "peer review is the principal mechanism for quality control in most scientific disciplines" [35]. In fact, peer review not only acts to ensure the quality of the work, but is also perceived by most academics to improve the quality of their publications, and "double-blind peer review is considered the most effective form of peer review" [35]. Blinding, fundamentally, is about intentionally creating ignorance. Creating ignorance in the reviewer about who authored the piece, and ignorance in the author about who reviewed it, aims to reduce bias in the reviewer and to minimize retributions or repayments of favors. Blinded peer review creates the kind of (presumably) beneficial ignorance commonly seen in settings that deal with justice and fairness, such as the withholding of facts unrelated to a case from juries, or Rawls' veil of ignorance in political theory [36]. In doing so, blinded peer review uses a form of active construction of ignorance to enable science to better achieve its epistemic aims.
Lastly, ignorance is one of the most powerful tools we have in our knowledge-making activities, and especially in science. Much of science is about focusing on the specific issue and topic under investigation. Indeed, to glean ever more in-depth, insightful knowledge, distracting noise needs to be removed. Consider the case of models, one of science's most powerful tools. Models are always a simplified version of the real world (the only full and complete model of the real world is the real world itself), so modelers must decide which variables to include in a model, and which to leave out. When modelling the movements of a given bird species in a rural setting, the modeler has to decide how much detail to include (e.g., with regard to villages, are the church bell towers included? Is the ringing of church bells included?). Whichever choice is made, some aspects will not be included. Indeed, the power of models stems from the clarity that comes from focusing on only what matters to the question at hand (and extrapolating from there). Likewise, consider the case of control in experiments. Control in experiments—understood here as the efforts taken to minimize the effects of all variables except the variable under observation—is a fundamental aspect of science [37]. Such control is, fundamentally, about ensuring that aspects not under investigation do not interfere with the experiment. In effect, such control measures require we ignore the variability found in the real world. Both models and controls in experiments necessarily require ignoring—ignoring all the aspects that are considered tangential and unrelated to the object of study. This requires a form of strategic ignorance. This strategic ignorance also leads to a passive construction of ignorance, because ignoring aspects considered unrelated leads to ignorance of the effects of, and interactions with and between, these left-out variables. While this can sometimes lead to shortcomings in scientific understanding (perhaps the church bells in our model do affect the movement of some birds), by and large, the use of ignorance in models and controls is what allows science to provide increasingly precise, rigorous knowledge. (Strategic) ignorance, it turns out, is one of science's most powerful and important tools. Yet ignorance is rarely the subject of science communication.
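The role of control as deliberate ignoring can be sketched in a toy simulation (entirely illustrative; the variable names, effect sizes, and data-generating process are my own invention): by holding a confounding variable fixed, that is, by ignoring its real-world variability, the comparison between groups isolates the effect of the treatment alone.

```python
import random

def outcome(treatment, confound, rng):
    # Toy data-generating process: the treatment adds 2.0 to the
    # outcome, the confound adds its own value, plus small noise.
    return 2.0 * treatment + confound + rng.gauss(0, 0.1)

rng = random.Random(42)

# Controlled experiment: the confound is held fixed for both groups
# (we deliberately "ignore" its real-world variability), so any
# difference between the groups reflects the treatment alone.
fixed_confound = 5.0
treated = [outcome(1, fixed_confound, rng) for _ in range(1000)]
control = [outcome(0, fixed_confound, rng) for _ in range(1000)]

effect = sum(treated) / len(treated) - sum(control) / len(control)
print(round(effect, 1))  # recovers the true treatment effect of 2.0
```

Had the confound been left to vary freely between the groups, the estimated effect would mix the treatment with the confound; the strategic ignoring is precisely what makes the estimate clean.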