Article

Informed Ignorance as a Form of Epistemic Injustice

1 Zefat Center for Bioethics, Zefat Academic College, Zefat 1320611, Israel
2 Department of Education Science, Roma Tre University, 00185 Rome, Italy
* Author to whom correspondence should be addressed.
Philosophies 2024, 9(3), 59; https://doi.org/10.3390/philosophies9030059
Submission received: 19 February 2024 / Revised: 25 March 2024 / Accepted: 29 March 2024 / Published: 29 April 2024

Abstract
Ignorance, or the lack of knowledge, appears to be steadily spreading, despite the increasing availability of information. The notion of informed ignorance is herein proposed to describe the widespread position of being exposed to an abundance of information yet lacking relevant knowledge, a position tied to the exponential growth in misinformation driven by technological developments and social media. Linked to many of society’s most looming catastrophes, from political polarization to the climate crisis, practices related to knowledge and information are deemed some of the most imminent and daunting modern threats, as evidenced by the latest report of the World Economic Forum, which has named misinformation the most severe short-term global risk. This paper’s epistemic perspective links the properties of today’s information culture with the ways in which it interacts with individual capacities and limitations in current technological and socio-political contexts. Such a position is analyzed through the lens of epistemic principles as a contemporary epistemic phenotype that emerges from an environment of ill-adapted and excessive information inputs and leads to a distinctive type of social injustice that is primarily epistemic in nature. While equity and accessibility are widely discussed as important contributing factors to epistemic discrepancies, other overlooked but fundamental issues underlying epistemic injustices are considered, such as information manipulation, cognitive limitations, and epistemic degradation. To effectively face this elusive threat, we propose an inclusive viewpoint that harnesses knowledge from cognitive science, science and technology studies, and social epistemology to inform a unifying theory of its main impacts and driving forces.
By adjusting a modern epistemic framework to the described phenomena, we intend to contextually outline its trajectory and possible means of containment based on a shared responsibility to maintain ethical epistemic standards. In a time of international unrest and mounting civil acts of violence, it is pertinent to emphasize the ethical principles of knowledge systems and authorities and suggest policy adaptations to maintain a social contract based on the shared values of truth and freedom.

1. Introduction

Knowledge—of ourselves and the world around us, of laws, customs, duties, and preferences—governs the behavior of individuals and of society. When assessing the current properties directing society’s macro behavioral trends, it is therefore essential to evaluate its common sources of information and knowledge. This also demands consideration of the intricate exchange between them, their methods of production and diffusion, and their forms of reception and maintenance.
The past few decades are commonly referred to as the information age, driven by developments in data transfer and storage capabilities and characterized by a data-based global market in which information has become the greatest available resource [1]. Luciano Floridi [2] has coined the terms infosphere and inforgs to denote, respectively, a metaphysical realm of information, data, knowledge, and communication, and the informational beings populating it. Leaving aside the specifically digital dimension of knowledge, the manipulation of information entails dramatic and unprecedented effects on societies globally, the relevance and imminence of which are evidenced by the recent report from the World Economic Forum [3], which named misinformation the world’s leading short-term threat.
The public relies on both external mediators and internal capacities to navigate the constant influx of information simultaneously arriving from multiple sources and is limited in its ability to assess its quality and reliability. These conditions provide fertile ground for the proliferation of misinformation, which is often much easier to create, spread, and understand than evidence-based and independently verified data. Sources of misinformation vary from professional literature to social media and commercial sources—each with their respective regulation capacities, incentives, and roles within society. Individual knowledge is therefore the result of a muddle of information pieces with different levels of grounding and credibility.
Problematically, as information becomes more available and accessible, the acquisition of knowledge appears to be diminishing. This information–knowledge gap has significantly widened over time. Misinformation, disinformation, fake news, and other instances of partial, unverified, or simply false information are conceptualized here as a product of this growing gap between knowledge and information, information referring to facts or data that are organized and presented in any meaningful way, and knowledge to a deeper processing and understanding of information. Whereas information usually refers to raw data, does not undergo substantial scrutiny or reflection, and can be false or misleading, knowledge involves more elaborate assessment of information, of which truth is a core property.
Although the precise relationship between knowledge and truth is complex and not strictly defined, there is an unequivocal and fundamental bond between these concepts [4] that is missing from information, which is not committed to the notion of truth [5]. Importantly, in the absence of truth or verified knowledge, decision-making relies upon the most readily available sources of information, which are often not the most precise or relevant [6]. Similarly, decisions made under conditions of uncertainty or stress can often be suboptimal [7], which can be detrimental in cases involving matters of health and safety. The effects of behavior based on low-quality information manifested in the most direct way during the COVID-19 pandemic [8]. The impact of misinformation was so dramatic that it was described by some as a separate, information-related pandemic with comparable levels of risk, termed an “infodemic” by the World Health Organization1.
Who is responsible for keeping individual knowledge in check? As evidence accumulates for the dire consequences of misinformation, national and international bodies attempt to keep up with the rapidly evolving technological landscape and articulate appropriate policies and regulation strategies [9]. Meanwhile, foreign and private bodies may take advantage of the regulatory vacuum to promote commercial and political interests. Amid the battling powers, individuals are living through these significant changes with little guidance, making it difficult to recognize opportunities and threats or understand the most effective ways to adapt and protect oneself. As the great information tide rises, threatening to overtake intellectual freedom and agency, some will inevitably grab onto familiar methods of protection and lean on previously held theories and beliefs.
This paper proposes an epistemic perspective on the harms of information malpractices and borrows the idea of epistemic injustice to demonstrate its effects and suggest possible strategies of engagement. Expanding a term coined by Miranda Fricker [10], we depict a type of epistemic injustice that transcends the social boundaries she described. To demonstrate this argument, the current knowledge–information gap is illustrated as giving rise to a new hybrid type of epistemic class—informed ignorance—a position that evolves in an information-rich environment under conditions that do not allow proper acquisition of knowledge.
Clearly, the individual levels and impact of informed ignorance are also shaped by socio-economic and political variables as sources of discrimination and marginalization affecting knowledge accessibility and literacy [11]. However, although these factors are relevant, we suggest they are not sufficient, and we position the state of informed ignorance within its cognitive, social, and technological origins and as a core determinant in facing global challenges. In describing its multilayered sources, the epistemic harms are more clearly delineated and understood within their respective fields of origin.
As scientific and technological research and development are supported by public funds, public data, and volunteer participation, which benefit various stakeholders, it is imperative to consistently advocate for the safeguarding of rights to services and data that are optimal, secure, and accessible. This necessitates a fundamental commitment from stakeholders to provide adequately substantiated and easily accessible information concerning the effectiveness, prevalence, and safety of all publicly available and promoted services. While the misinformation crisis is benefiting few but threatening everyone, it is crucial to recognize and engage with all forms of epistemic injustice and oppression. Some international institutions have begun to examine and delineate the main global concerns related to knowledge and misinformation [9,12]. In line with these efforts, we attempt to contribute a philosophical and scientific perspective on epistemic injustices, in as much as they impact the global threat of misinformation.

2. The Rise of Informed Ignorance

What are the shared elements that drive informed ignorance and its troubling effects on social order? As the great financial, technological, and political powers steadily push forward, their incentives exacerbate already burdensome individual cognitive limitations such as anchoring and limited attention and memory resources [13,14], promoting a climate of information overload that hinders the acquisition of knowledge. Other underlying causes of current information gaps and knowledge fallacies are deeply rooted in the modern practice of scientific research [15]. Exploring the impact of each of these components on shaping individual choices and behaviors is considered here as essential for addressing the risks associated with misguided decision-making based on unfounded information.

2.1. Power, Media, and Access to Knowledge

Undoubtedly, technological development has been one of the main forces shaping and driving the spread of informed ignorance, both by boosting the public availability of information and by catalyzing processes of knowledge democratization. The greater distribution of fiber optic networks, 5G technology, cloud computing, and Content Delivery Networks (CDNs) all contribute to significantly increased available bandwidth in many regions around the world. While supporting better and faster access to information, such changes also provide continuously evolving ways in which data are collected, distributed, and processed. From this perspective, technology serves as a powerful tool for democratizing information, increasing its availability for everyone.
Simultaneously, these technologies allow widespread access to data production and distribution platforms, which may also be employed for the proliferation of misinformation, disinformation, and unverified information and for selective exposure to information. These effects can be traced to the concurrent extensive use of recommendation algorithms that utilize personal usage data to influence the type of information each user interacts with [16]. Such algorithms significantly bias individual knowledge formation by reinforcing existing beliefs (deliberately or not), limiting doubt and discussion, misrepresenting opposing opinions, and simplifying complex issues. In line with these considerations, advancements in Artificial Intelligence (AI) and deep learning technologies introduce additional problems. Deepfake technologies, for instance, can create highly realistic (yet fabricated) audio and video content, blurring the lines between truth and falsehood.
One of the widely recognized culprits of the current information culture is the global adoption of social media as a prime source of communication and its reciprocal ties with technological development, globalization, and populism [17]. This foundation gives rise to some of the perpetuators of informed ignorance, such as echo chambers, filter bubbles, algorithmic biases and influences, and data manipulation [18,19]. Social media algorithms, designed for user engagement, reinforce individuals’ existing beliefs and render them more susceptible to political radicalization [20]. The rapid dissemination of emotionally charged and sensational content further amplifies these effects, leading individuals toward more extreme political ideologies. Moreover, false or misleading information, often disguised as legitimate news, can distort the comprehension of politically relevant material and allow voters to ‘pick and choose’ views and evidence that align with their ideological inclinations, thus fostering the formation of digital communities that share and reinforce similar ideas, provide a sense of belonging, and solidify radicalized political views [21].
Adding to the complexity and unreliability of information is the intricate and often problematic role played by popular media in the dissemination of information. The incentives driving media sources are often financial and commercial, notoriously lacking in transparency regarding conflicts of interest but also regarding information sources and levels of expertise (e.g., in scientific matters). Without such disclosures, media sources further contribute to the spread of unverified information [22] and disinformation, fostering a climate of distrust. Importantly, while media reports do not dictate public opinion, numerous studies have demonstrated their substantial influence on the framing of public discourse and the shaping of its attitudes [23]. Popular media, like social media, by promoting interests besides public awareness and engagement, can introduce bias or oversimplify complex issues. This is especially problematic when mediating complex information to the general public, as in issues concerning healthcare, political debates, and scientific findings.
Furthermore, the sheer speed at which information is disseminated is an important factor in facilitating the dissemination of large volumes of misinformation. As unverified information is much easier to create, and can easily be more sensational, it is more likely to “go viral” and spread before its accuracy is even questioned [24]. Social media facilitates this phenomenon by rewarding viral content, thus rewarding the creation and spread of more stimulating yet less verified content.
The shift from industrial to technological economic dominance has given rise to new players and powers. With the growing influence of data technologies on decision-making in business, academia, and healthcare, the ability to analyze, navigate, and interpret information has become key in determining individual and company influence and success [25]. Accordingly, technological companies are becoming increasingly influential in terms of market behavior, while also significantly affecting public discourse and perceptions, through media, communication, and political activity [26]. On the other hand, the democratization of information, which promotes and supports this technocracy, is maintained by social media, which can also be seen as empowering local and previously marginalized groups and voices [27].
However, some deem the discrepancies in access to the internet and digital services, both locally and globally, a new form of impairing discrimination, labeled the digital divide [28]. This term pertains to the uneven distribution of access to internet resources, which in turn has a profound impact on individuals’ opportunities and abilities to engage with information. Importantly, such digital disparity is not isolated but intersects with various social determinants of technology access, such as age, scientific literacy, and political inclination, which play a pivotal role [29]. More relevantly here, we need to understand how information can be—no matter how counterintuitively—disempowering and, indeed, oppressing.

2.2. Experts, Trust, and Research Integrity

Violations of scientific integrity can promote public skepticism and mistrust by disregarding scientific standards and supporting misinformation. These can be more or less intentional and include methodological and statistical choices, interpretation and publishing biases, hypothesis fishing, and lack of reproducibility [30]. Some of these practices are deeply rooted within accepted scientific research culture, thus necessitating abundant and mostly unrewarded efforts from researchers in order to completely avoid them. Unfortunately, the cumulative effect of even minor violations over time, combined with less prevalent major violations, results in the publication of unreliable information and an erosion of public trust [31], the effects of which are far-reaching and untraceable.
Misinformation may also result from common ways of scientific reporting, which involve selective publishing and may favor certain authors or topics based on not-strictly-professional reasoning and publication biases, which may drive researchers to slightly manipulate data to fit a publication or omit less favorable or supportive data. An example from the field of neuroethics relates to the neuroenhancement debate which, despite having produced worthwhile philosophical speculations over its scope and validity [32], has been shown to conflate or omit certain methods and their contexts of use [33] in order to promote a prepackaged view. Furthermore, for more than a decade, reports have been pointing to discrepancies in published information regarding the prevalence, efficacy, and risks of neuroenhancement use [34,35].
Rational ignorance is another way experts can inadvertently manipulate individual choice [36] by producing information that is not sufficiently accessible because it is either too complicated, too long, or too unengaging. As human beings have limited attention resources that need to be allocated efficiently, it is a harder task to keep one’s attention than to lose it. Hence, stakeholders are required to allocate resources to actively engage users and supply them with the relevant information to make informed decisions. However, this is not in the interest of most stakeholders, who in some cases may even actively or passively benefit from end-user ignorance.

2.3. Cognitive Factors: Limited Resources Meet Excessive Information

An important and seldom addressed contributor to informed ignorance is the function of the human brain or, more specifically, the way in which the human brain processes information, learns, and remembers: the properties of the cognitive system. Human cognition is highly complex, notably involving around 100 billion neurons and an estimated storage capacity of 1.25 × 10^12 bytes, leading to very high processing abilities. Nonetheless, the brain is made of biological matter and is therefore limited in its capacities [37], including storage size and the speed and accuracy of processing. This idea is captured in the term bounded rationality [38], which relates such cognitive limits to the ensuing need to rely on less demanding cognitive processes such as biases and heuristics.
Cognitive biases are adaptive functions of data processing but are suboptimal for accurate knowledge production and decision-making. One example is confirmation bias, the tendency to prefer information that confirms previously existing beliefs and reject or avoid contradictory views, traditionally thought to help in avoiding cognitive dissonance [39]. Another is anchoring, the tendency of individuals to develop over-confidence in an initial viewpoint (e.g., due to a lack of critical skills), which prevents them from updating their beliefs, even in the face of information contradicting their currently held view [40]. Attention is another cognitive resource that is crucial for information processing. Like others, it is known to be limited in capacity and particularly vulnerable to disruptions and manipulation. In the presence of excessive information, cognitive resources are exhausted, and prioritizing relevant information becomes especially challenging, also due to limited attention capacities [41], leading again to biased information and misunderstanding.
Such biases can be useful for quickly choosing appropriate behaviors in unfavorable conditions but are not suited for guiding complex learning and analysis of information. Nevertheless, the current information culture, in which decision-makers often meet conditions of information overload in stressful social and political environments, requires such suboptimal cognitive strategies to be used, exposing individuals to harm and permitting more misinformation to circulate. Reliable information is difficult enough to locate and recognize; this becomes even more cognitively demanding in the presence of uncertainty, which is a common feature of many knowledge types. This requires users to make an effort to gain relevant knowledge, which is sometimes deliberately made inaccessible, and their ability to do so depends on their level of literacy, technological skills, available time, and so on.
In healthcare, this is easier to observe, as seen in the case of the COVID-19 pandemic, an invaluable example of the power and place of knowledge, information sources, and misinformation. The pandemic created an unprecedented sense of urgency in obtaining and diffusing knowledge under extreme uncertainty, and the effects on public trust and cooperation were immediate [31]. Group differences in accessibility to knowledge, but also education and cognitive skills, and their effects on perceptions and choices were evident, with dire consequences to social and political stability.

3. An Epistemic Perspective

Laypeople often lack tools to assess the quality of information they acquire; however, the abundance of it may nonetheless create a false sense of being highly informed, thus not in need of exploring further sources, as well as strengthen deviations of the ego such as the Dunning–Kruger effect [42]. This is enhanced by the limited time and cognitive resources available to the average person and forms individuals who are highly informed but are nonetheless fundamentally ignorant of the truth and even of their ignorant state. Informed ignorance is therefore differentiated from other related states such as rational ignorance [43], an adaptive cognitive–epistemic status in which the cost of acquiring specific knowledge is not worth the potential benefit to be gained from that knowledge, or information avoidance [44], which involves voluntary avoidance of emotionally or cognitively challenging information. Informed ignorance is proposed to be less acknowledged but potentially more widespread and harmful.

3.1. Epistemic Degradation

Current manifestations of the relationship between information and knowledge in the digital sphere contribute to a degradation of epistemic virtues [45] such as humility and integrity, which implies a failure to acknowledge both one’s responsibility for others’ epistemic position and the limitations of one’s own epistemic resources. The roots of these failings can be traced to several factors, including the non-linear relationship between the intellectual and cognitive efforts exerted and the knowledge acquired, due to the mediating effects of irrelevant and false information. Another source of this degradation is the loss of reverence towards knowledge, the recognition and theoretical implications of knowledge, and the understanding of its practical significance, as exemplified by the term “post-truth” [46]. These factors support an environment of uncertainty and confusion regarding the means and benefits of verifying information and perpetuate the reciprocal effects between epistemic degradation and informed ignorance.
By contributing to an erosion of accepted knowledge structures and promoting confusion and fear, the current information culture also exerts a profound influence on political choices, a situation also referred to as an “epistemic crisis” [47]. These types of practices further alienate people from seeking true knowledge, deepening their distrust in authorities and experts, and degrading the production of knowledge by depriving it of one of its basic necessary conditions: freedom. Moreover, interpreting and assessing the quality and reliability of information requires skills that are not widely available and, in complex cases, are available only to a few experts in the relevant field. Given that most of the population lacks the necessary proficiency, feelings of frustration may manifest, leading to potential rebellious acts against the knowledge production system [48]. These can also manifest as skepticism towards expert authority, pursuit of alternative information sources, and a propensity to question established norms of rationality and justification.

3.2. Epistemic Injustice

Individuals within the current information culture are suggested to suffer from injustices that are fundamentally epistemic. Beyond the epistemic injustices previously described, which concern the marginalization of certain social groups from participation in knowledge creation [49], another injustice can be recognized that affects society more widely, one that is based on access to knowledge. While an overlap clearly exists between these groups, as marginalized groups also have less access to knowledge, we suggest there are other lines of division more relevant to this debate. In this case, the injustice extends to most users of social media and other digital information sources. As with a contaminated source of water, users turn to digital sources to acquire knowledge to inform decisions regarding health, politics, finances, etc. and inevitably consume low-quality products, which may cause them more harm than benefit. This is epistemic pollution.
Borrowing the term coined by Fricker [10] to denote inequalities in the legitimacy of groups and individuals to act as knowers, epistemic injustice can be seen on a wider scale as discrimination of groups not only in terms of their participation in knowledge production but also in terms of opportunities to achieve epistemic virtues through access to appropriate tools and resources [50], which can manifest as scientific literacy and technological aptness. Notably, it is interesting to consider Dan Kahan’s explanation of a bad belief (a belief based on faulty data) as the result of cultural cognition [51]. Given that, for Kahan, we perceive the world and make inferences about it in line with the society and culture we live in, we should pay enormous attention to the cultural world we inhabit. If we are to tackle epistemic injustice globally, we need to make sure that everyone, through different cultural paths perhaps, will perceive this to be an existential problem.
In this context, the phenomenon of informed ignorance intricately intertwines with the manifestation of epistemic injustice in its broader sense. This concept, taken here to denote the unequal distribution of knowledge and access to information, is emphasized by instances of selective ignorance [50]. Individuals or institutions, through conscious or unconscious mechanisms, may perpetuate epistemic injustice by marginalizing certain perspectives, thereby excluding some forms of information from common discourse. The outcome is a distorted and decontextualized understanding of information, often marked by misrepresentation, that undermines the epistemic authority of individuals and can reinforce unjust hierarchies of knowledge. Moreover, unequal access to information deepens existing epistemic injustices, as certain groups are systematically deprived of educational resources, tools for knowledge assessment, and platforms to express and practice their knowledge [10]. Epistemic injustice similarly manifests in favoring certain uses of language, certain methodological tools, and certain categorization and assessment tools, such that alternative forms of knowledge may be dismissed or devalued [52].

3.3. Impacts of Informed Ignorance and Epistemic Injustice: The Case of Healthcare

Individual decision-making based on partial or biased information can lead to discriminatory practices in healthcare, criminal justice systems, education, and even marketing and can include suboptimal treatment, denial of rights, and exploitation. Unattended informed ignorance can cause deliberate and inadvertent violations of human rights, arising from epistemic injustices and affected by technological skills and discrepancies between and within socio-economic groups. Here, the case of healthcare-related information will be utilized to exemplify such current information practices.
A discrepancy between available evidence-based information and ensuing public choices and behaviors has long been recognized in healthcare and studied under the term infodemiology [53], an umbrella term that is concerned with how information is disseminated and produced as well as its impact on public behavior, predominantly in health-related issues. The relevance of the term surged during the COVID-19 pandemic, reaching the level of an “infodemic”, defined by Gunther Eysenbach [54] as “an excessive amount of unfiltered information concerning a problem such that the solution is made more difficult” and by the World Health Organization as “too much information including false or misleading information in digital and physical environments during a disease outbreak,”2 which can cause confusion, harmful behavior, and distrust in experts.
Ambiguous or unclear information about safety in healthcare not only creates confusion but also elevates the risks of potential harm. This ambiguity may manifest in a variety of ways, including the over-emphasis of risks associated with certain medical treatments. Such a skewed representation can contribute to a climate of fear or hesitancy, leading to the unintended consequence of under-treatment. Individuals, influenced by an exaggerated perception of risks, may opt to forgo necessary medical interventions, impacting their overall well-being. Unverified media reports may also foster unrealistic expectations of treatment leading to disappointment and distrust while deterring people from seeking less advertised but more effective solutions [55].
Additionally, in an era of pervasive self-diagnosis, marked by increasing reliance on online resources, and the abundance of conflicting and partial information, patients are more likely to choose an inadequate treatment [56], especially when information is supported by alleged professional opinions. This has been exemplified in cases of public health concerns such as the opioid epidemic in the United States, in which financial incentives, privatized healthcare, and unregulated advertising enhanced a substance use epidemic, downplaying risks, and leading to a dramatic health crisis [57].
Another problematic consequence of misinformation in healthcare is its detrimental effect on the ability to make informed decisions. The notion of informed consent to medical procedures arose from the need to avoid different forms of exploitation and infringement on personal human rights at the intersection of professional hierarchy, medical care, and personal freedom and agency. Decisions regarding compliance in medical procedures are usually determined by the individual based on expert opinion and the available information presented in the informed consent form. Both sources of information have problematic aspects [58], which are further amplified by current infodemiological features of health information, such as patients’ lack of skills for assessing the available information. These result in the previously described harmful behaviors and perspectives that are associated with confusion, information fatigue, and mistrust in experts. Importantly, while some individual properties are recognized as compromising a patient’s ability to provide consent, and demand special attention, compromised access to knowledge is not recognized as such [59], thus potentially exposing vulnerable participants to exploitation3.

4. Fighting Back

Facing informed ignorance is an urgent task that relies on collaboration between various fields of knowledge. In recent years, and especially since the COVID-19 pandemic, an important debate has been taking place regarding the short- and long-term steps required to contain, thwart, and manage threats associated with information practices. From technological solutions, through awareness campaigns and policy changes, to education initiatives, many approaches have been suggested but few have been tested, such that little evidence regarding their respective advantages is available [61,62,63]. We wish to add to this pool of resources a call to consider associated epistemic injustices as a core guiding principle.

4.1. Articulating Epistemic Ethics

The unattainability of a solid ground that always favors true knowledge over other kinds of beliefs is fundamental to epistemic thought. But can epistemic theory be utilized to support the war on misinformation? Intuitively, millennia of epistemic tradition must have laid the intellectual groundwork for systematically categorizing true and false beliefs. In practice, however, this is far less clear. Some philosophical perspectives giving value to knowledge as true belief have been discussed. For instance, Siegmann and Grayot [64] point to the similarities shared by knowledge leading to true and false beliefs and suggest that they are differentiated by the understanding involved in acquiring true beliefs.
Importantly, any attempt at articulating an ethics of epistemology must integrate these and similar notions from epistemic theory that address its interaction with normative thought, namely, virtue epistemology. This can be positioned within the conditions set forth by the “value turn” in epistemology [65], traces of which can be found in several of its associated sub-fields. In social epistemology, the role of social processes in the acquisition of knowledge is emphasized and underlies a call for collaborative efforts and diversity of perspectives, while epistemic virtue theory underscores the importance of cultivating intellectual virtues like open-mindedness and humility, which can enhance critical thinking and reduce ignorance. Epistemic notions such as fallibilism and reliabilism also encourage epistemic humility and skepticism by recognizing the limits of human knowledge. Bioethics and medical ethics emphasize the ethical imperative of clear and appropriate communication of relevant information for assuring safe, equitable, and informed decision-making in healthcare. Finally, pragmatic epistemology considers the practical consequences of knowledge, motivating individuals to seek accurate information.
Notably, promoting and voicing the understanding that information is not sufficient for knowledge is fundamental and should be considered within a greater commitment to promote skills for engaging in responsible epistemology. The awareness of the epistemic challenges afforded by current information culture has been referred to as an “epistemic crisis” of post-truth [66,67], with suggestions to provide the public with appropriate epistemic tools.

4.2. Practical Steps

Though the practical aspects of facing informed ignorance will most likely lead to gradual and imperceptible changes through long-term effects, they must be articulated regularly and persistently until integrated into appropriate frameworks. The unprecedented challenges that the COVID-19 pandemic posed to public health systems, global cooperation, clinical research standards, and information dissemination are vital for informing any future strategies. The global infodemic that developed alongside the health crisis afforded a unique opportunity to witness an accelerated version of the looming implications of the bubbling misinformation culture. It is an ethical imperative to consider the indispensable lessons learned from these tragic circumstances in order to promote tangible solutions and avoid future harm; overlooking this responsibility would amount to the type of epistemic injustice herein discussed.
While the practicalities of the suggested approaches for countering misinformation are not within the scope of this paper and have been extensively elaborated elsewhere, a few major areas of concern will be recounted in line with this paper’s appeal for their consideration within an epistemic framework, namely, media literacy, scientific research standards, well-informed regulation, technological solutions, and stakeholder responsibility.
To provide end-users with the skills needed to adapt to the changing information climate, it is necessary to directly engage with public media literacy, taking into account individual differences and bridging the gap between scientific expertise and public understanding. Promoting literacy can help individuals recognize credible sources and remain skeptical of unverified information. The necessary conditions for acquiring knowledge, namely, an appropriate psycho-social environment, must, when extended to the general adult population in the information era, also account for both external (regulatory, technological, and political) and internal (cognitive training, awareness, and critical thinking) influences. While this is a well-known principle in theories of education [68], it is mostly omitted from discussions of public discourse.
One important consideration is the incorporation of practices for incentivizing integrity in science and media communication. This involves creating a framework that rewards ethical behavior such as transparency and inclusion. For the scientific community, different strategies have been suggested to promote integrity, from refreshing and reestablishing transparent peer-review processes to improving statistical education and engaging with reproducibility through the scientific reward system. Widely discussed forms of improving scientific standards must be implemented, including open science practices, data sharing, pre-registration of studies, and others, as part of the scientific institutions’ ethical commitment to society.
Addressing informed ignorance through regulation should consider how ethical and intellectual virtues can guide the creation, dissemination, and consumption of information. Regulative actions regarding scientific practices should directly prioritize open and transparent scientific processes in order to encourage a commitment to providing well-founded and evidence-based results while avoiding redundant procedures, whereas regulation regarding data and information management might focus on promoting values of clarity, meaningful organization, and accessible resources. An underlying epistemic virtue shared by regulative practices affecting knowledge production should be a clear commitment and adherence to precision and evidence.
Importantly, social media platforms, as the main arena for public discourse, should prioritize independently verified information and develop creative ways to promote and reward evidence-based information. Some such processes have already been implemented, such as “community notes” on the social media platform “X” (formerly “Twitter”), which is based on community engagement in information confirmation4 through existing methods of reward, including visibility and popularity. Other systematic procedures such as fact-checking and flagging are increasingly utilized but are far from ideal, as they are still not immune to exploitation by perpetrators of misinformation.
Finally, stakeholders should be held responsible for promoting and maintaining ethical standards related to information practices, particularly in the context of emerging technologies, media, policymaking, and the actions of scientists, technological companies, and experts. Avoiding the potential dangers leading to informed ignorance is an urgent public safety concern requiring a collective effort from stakeholders across various domains. Influential parties should be continuously scrutinized by external regulation bodies and professional media to detect ways in which they may profit from informed ignorance and may accordingly be short-sighted when determining related governance strategies. As highlighted by Zuboff [48], identifying the influential players and mechanisms in the new information industry is crucial when outlining the strategies and goals of current societal efforts to protect its values and well-being. Stakeholders, including tech developers, must uphold the commitment to responsible deployment of technology by actively addressing biases, ensuring accountability, promoting fairness in algorithmic decision-making processes, and ultimately by prioritizing ethical considerations over maximizing profits.

5. Conclusions: It Is in Our Hands

Epistemic injustice seems to align with other forms of social and political discrimination; however, beyond this marginalization of certain groups in their ability to impact discourse and knowledge, the idea of epistemic injustice is suggested to extend these boundaries in a meaningful way. The line drawn by power and money, the “haves and have-nots”, is relevant to understanding some of the forces driving epistemic injustice, but it is not sufficient, and as such may not lead to effective solutions. Perhaps an alternative notion similarly framing the few who hold the means of information production may be more useful in this context. Nonetheless, the processes of knowledge democratization are impacting and shaping the global information market with significant implications.
As the public pushes back against traditional sources of information and expertise, it appears as an act of defiance aimed at taking back control of knowledge production. Accordingly, the new means of information production are free, accessible, and egalitarian, and they reject accepted structures and rules. However, as in similar historical upheavals, it is unclear who benefits from these dramatic power shifts. Thus, while still not enjoying the real fruits of labor and common goods associated with the new data-based market dynamics, the public gains only a transient sense of power and freedom. Informed ignorance may be a side effect of the modern counterculture against the intellectual elite, which succeeds in lowering the perceived worth of their work by injecting doubt, mistrust, and skepticism but in the process also loses crucial meaning, nuance, and knowledge.
Informed ignorance, as a human-driven property, calls for actions that are equally within our control. While practical steps are undeniably imperative, they must be underpinned by robust theoretical convictions rooted in clear epistemic virtues. These should unequivocally recognize the social components that drive modern information culture, embrace joint values of truth and justice, which respect social boundaries, and appreciate that our collective understanding is what shapes the trajectory of societal knowledge. By aligning our intentions with these qualities, we can lay the foundation for guidelines and practices that not only undermine the driving forces of informed ignorance but also contribute to the flourishing of a well-informed and enlightened society.

Author Contributions

Conceptualization, N.C. and M.D.G.; methodology, N.C.; investigation, N.C.; resources, N.C. and M.D.G.; writing—original draft preparation, N.C.; writing—review and editing, N.C. and M.D.G.; supervision, N.C. and M.D.G.; funding acquisition, M.D.G. All authors have read and agreed to the published version of the manuscript.

Funding

This article has been partially funded by the PRIN project “Distrust in Science Reframed: Understanding and Countering Anti-scientific Behavior” (CUP F53D23010710001).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study.

Conflicts of Interest

The authors declare no conflict of interest.

Notes

1
2
3
As part of a wider resizing of individual autonomy in western societies, informed consent has been challenged in many ways in recent years. Within the European Union alone, there have been projects (see, for instance, https://i-consentproject.eu/, accessed on 25 March 2024) and publications [60] attempting to tackle some of its criticisms in relation to vulnerable populations. Yet, we suggest that the vulnerability discussed here is of a new type deserving attention.
4

References

  1. Sadowski, J. When data is capital: Datafication, accumulation, and extraction. Big Data Soc. 2019, 6, 2053951718820549. [Google Scholar] [CrossRef]
  2. Floridi, L. A look into the future impact of ICT on our lives. Inf. Soc. 2007, 23, 59–64. [Google Scholar] [CrossRef]
  3. World Economic Forum. Global Risks Report 2024; World Economic Forum: Geneva, Switzerland, 2024. [Google Scholar]
  4. Davidson, D.; LePore, E. A coherence theory of truth and knowledge. In Epistemology: An Anthology; Blackwell: Cambridge, UK, 1986; pp. 124–133. [Google Scholar]
  5. Fetzer, J.H. Information: Does it Have To Be True? Minds Mach. 2004, 14, 223–229. [Google Scholar] [CrossRef]
  6. Wyart, V.; Koechlin, E. Choice variability and suboptimality in uncertain environments. Curr. Opin. Behav. Sci. 2016, 11, 109–115. [Google Scholar] [CrossRef]
  7. Phillips-Wren, G.; Adya, M. Decision making under stress: The role of information overload, time pressure, complexity, and uncertainty. J. Decis. Syst. 2020, 29 (Suppl. S1), 213–225. [Google Scholar] [CrossRef]
  8. Zielinski, C. Infodemics and infodemiology: A short history, a long future. Rev. Panam. Salud Publica/Pan Am. J. Public Health 2021, 45, e40. [Google Scholar] [CrossRef] [PubMed]
  9. Secretary-General of the United Nations. Countering Disinformation for the Promotion and Protection of Human Rights and Fundamental Freedoms: Programme Budget Implications of Draft Resolution A/C. 3/76/L. 7/Rev. 1: Statement/Submitted by the Secretary-General in Accordance with Rule 153 of the Rules of Procedure of the General Assembly. 2021. Available online: https://digitallibrary.un.org/record/3948741?v=pdf (accessed on 23 March 2024).
  10. Fricker, M. Epistemic Injustice: Power and the Ethics of Knowing. In Epistemic Injustice: Power and the Ethics of Knowing; Oxford Academic: Oxford, UK, 2007. [Google Scholar] [CrossRef]
  11. Rowlands, G.; Shaw, A.; Jaswal, S.; Smith, S.; Harpham, T. Health literacy and the social determinants of health: A qualitative model from adult learners. Health Promot. Int. 2017, 32, dav093. [Google Scholar] [CrossRef]
  12. OECD. Misinformation and Disinformation: An International Effort Using Behavioural Science to Tackle the Spread of Misinformation; OECD Publishing: Washington, DC, USA, 2022; Volume 21. [Google Scholar]
  13. Hills, T.T. The Dark Side of Information Proliferation. Perspect. Psychol. Sci. 2018, 14, 323–330. [Google Scholar] [CrossRef] [PubMed]
  14. Qiu, X.; Oliveira, D.F.M.; Sahami Shirazi, A.; Flammini, A.; Menczer, F. Limited individual attention and online virality of low-quality information. Nat. Hum. Behav. 2017, 1, 132. [Google Scholar] [CrossRef]
  15. Cohen, N. (Re-) Redefining Neuroethics to Meet the Challenges of the Future. AJOB Neurosci. 2023, 14, 421–424. [Google Scholar] [CrossRef]
  16. Fernández, M.; Bellogín, A.; Cantador, I. Analysing the effect of recommendation algorithms on the amplification of misinformation. arXiv 2021, arXiv:2103.14748. [Google Scholar]
  17. Schroeder, R. Social Theory after the Internet: Media, Technology and Globalization; UCL Press: London, UK, 2018. [Google Scholar]
  18. Rhodes, S.C. Filter Bubbles, Echo Chambers, and Fake News: How Social Media Conditions Individuals to Be Less Critical of Political Misinformation. Political Commun. 2022, 39, 1910887. [Google Scholar] [CrossRef]
  19. Zimmer, F.; Scheibe, K.; Stock, M.; Stock, W.G. Fake news in social media: Bad algorithms or biased users? J. Inf. Sci. Theory Pract. 2019, 7, 40–53. [Google Scholar] [CrossRef]
  20. Kubin, E.; von Sikorski, C. The role of (social) media in political polarization: A systematic review. Ann. Int. Commun. Assoc. 2021, 45, 1976070. [Google Scholar] [CrossRef]
  21. Sunstein, C.R. #Republic: Divided Democracy in the Age of Social Media. In Divided Democracy in the Age of Social Media; Princeton University Press: Princeton, NJ, USA, 2018. [Google Scholar] [CrossRef]
  22. Southwell, B.G.; Thorson, E.A. The prevalence, consequence, and remedy of misinformation in mass media systems. J. Commun. 2015, 65, 589–595. [Google Scholar] [CrossRef]
  23. Protess, D.; McCombs, M.E. Agenda Setting: Readings on Media, Public Opinion, and Policymaking; Routledge: New York, NY, USA, 2016. [Google Scholar]
  24. Vosoughi, S.; Roy, D.; Aral, S. The spread of true and false news online. Science 2018, 359, 1146–1151. [Google Scholar] [CrossRef] [PubMed]
  25. West, S.M. Data Capitalism: Redefining the Logics of Surveillance and Privacy. Bus. Soc. 2019, 58, 20–41. [Google Scholar] [CrossRef]
  26. Sadowski, J. Too Smart. In Too Smart; The MIT Press: Cambridge, MA, USA, 2020. [Google Scholar] [CrossRef]
  27. Ortiz, J.; Young, A.; Myers, M.D.; Carbaugh, D.; Bedeley, R.T.; Chughtai, H.; Davidson, E.; George, J.; Gogan, J.; Gordon, S.; et al. Giving voice to the voiceless: The use of digital technologies by marginalized groups. Commun. Assoc. Inf. Syst. 2019, 45, 20–38. [Google Scholar] [CrossRef]
  28. Guillén, M.F.; Suárez, S.L. Explaining the global digital divide: Economic, political and sociological drivers of cross-national internet use. Soc. Forces 2005, 84, 681–708. [Google Scholar] [CrossRef]
  29. Howell, E.L.; Brossard, D. (Mis)informed about what? What it means to be a science-literate citizen in a digital world. Proc. Natl. Acad. Sci. USA 2021, 118, e1912436117. [Google Scholar] [CrossRef] [PubMed]
  30. West, J.D.; Bergstrom, C.T. Misinformation in and about science. Proc. Natl. Acad. Sci. USA 2021, 118, e1912444117. [Google Scholar] [CrossRef] [PubMed]
  31. Kreps, S.E.; Kriner, D.L. Model uncertainty, political contestation, and public trust in science: Evidence from the COVID-19 pandemic. Sci. Adv. 2020, 6, abd4563. [Google Scholar] [CrossRef] [PubMed]
  32. Chatterjee, A. The ethics of neuroenhancement. In Handbook of Clinical Neurology; Elsevier: Amsterdam, The Netherlands, 2013; Volume 118. [Google Scholar] [CrossRef]
  33. Zohny, H. The Myth of Cognitive Enhancement Drugs. Neuroethics 2015, 8, 257–269. [Google Scholar] [CrossRef]
  34. Lucke, J.C.; Bell, S.; Partridge, B.; Hall, W.D. Deflating the neuroenhancement bubble. AJOB Neurosci. 2011, 2, 611122. [Google Scholar] [CrossRef]
  35. Schleim, S.; Quednow, B.B. How realistic are the scientific assumptions of the neuroenhancement debate? Assessing the pharmacological optimism and neuroenhancement prevalence hypotheses. Front. Pharmacol. 2018, 9, 3. [Google Scholar] [CrossRef]
  36. Brocas, I.; Carrillo, J.D. Influence through ignorance. Rand. J. Econ. 2007, 38, 931–947. [Google Scholar] [CrossRef]
  37. Marois, R.; Ivanoff, J. Capacity limits of information processing in the brain. Trends Cogn. Sci. 2005, 9, 296–305. [Google Scholar] [CrossRef] [PubMed]
  38. Jones, B.D. Bounded rationality. Annu. Rev. Political Sci. 1999, 2, 297–321. [Google Scholar] [CrossRef]
  39. Festinger, L. A theory of cognitive dissonance. In A theory of Cognitive Dissonance; Stanford University Press: Redwood City, CA, USA, 1957. [Google Scholar]
  40. Furnham, A.; Boo, H.C. A literature review of the anchoring effect. J. Socio-Econ. 2011, 40, 35–42. [Google Scholar] [CrossRef]
  41. Speier, C.; Valacich, J.S.; Vessey, I. The influence of task interruption on individual decision making: An information overload perspective. Decis. Sci. 1999, 30, 337–360. [Google Scholar] [CrossRef]
  42. Dunning, D. The dunning-kruger effect. On being ignorant of one’s own ignorance. Adv. Exp. Soc. Psychol. 2011, 44, 247–296. [Google Scholar] [CrossRef]
  43. Downs, A. An Economic Theory of Political Action in a Democracy. J. Political Econ. 1957, 65, 257897. [Google Scholar] [CrossRef]
  44. Golman, R.; Hagmann, D.; Loewenstein, G. Information avoidance. J. Econ. Lit. 2017, 55, 96–135. [Google Scholar] [CrossRef]
  45. Snow, N.E.; Vaccarezza, M.S. Virtues, Democracy, and Online Media: Ethical and Epistemic Issues; Routledge: London, UK, 2021. [Google Scholar] [CrossRef]
  46. d’Ancona, M. Post-Truth: The New War on Truth and How to Fight Back; Random House: New York, NY, USA, 2017. [Google Scholar]
  47. Kloubert, T.; Hoggan, C. Reconsidering Rationality: A Response to Today’s Epistemic Crisis. In Proceedings of the Adult Education in Global Times: An International Research Conference; University of British Columbia: Metro Vancouver, BC, Canada, 2021; pp. 318–319. [Google Scholar]
  48. Zuboff, S. Big other: Surveillance capitalism and the prospects of an information civilization. J. Inf. Technol. 2015, 30, 75–89. [Google Scholar] [CrossRef]
  49. Bhakuni, H.; Abimbola, S. Epistemic injustice in academic global health. Lancet Glob. Health 2021, 9, e1465–e1470. [Google Scholar] [CrossRef] [PubMed]
  50. Nikolaidis, A.C. A Third Conception of Epistemic Injustice. Stud. Philos. Educ. 2021, 40, 381–398. [Google Scholar] [CrossRef]
  51. Kahan, D.M. Cultural cognition as a conception of the cultural theory of risk. In Handbook of Risk Theory: Epistemology, Decision Theory, Ethics, and Social Implications of Risk; Springer: Berlin/Heidelberg, Germany, 2012. [Google Scholar] [CrossRef]
  52. Carel, H.; Kidd, I.J. Epistemic injustice in healthcare: A philosophical analysis. Med. Health Care Philos. 2014, 17, 529–540. [Google Scholar] [CrossRef] [PubMed]
  53. Eysenbach, G. Infodemiology: The epidemiology of (mis)information. Am. J. Med. 2002, 113, 763–765. [Google Scholar] [CrossRef] [PubMed]
  54. Eysenbach, G. Infodemiology and infoveillance: Framework for an emerging set of public health informatics methods to analyze search, communication and publication behavior on the internet. J. Med. Internet Res. 2009, 11, 1157. [Google Scholar] [CrossRef] [PubMed]
  55. Dubljević, V.; McCall, I.C.; Illes, J. Neuroenhancement at Work: Addressing the Ethical, Legal, and Social Implications; Springer: Berlin/Heidelberg, Germany, 2020. [Google Scholar] [CrossRef]
  56. Antal, A.; Luber, B.; Brem, A.K.; Bikson, M.; Brunoni, A.R.; Cohen Kadosh, R.; Dubljević, V.; Fecteau, S.; Ferreri, F.; Flöel, A.; et al. Non-invasive brain stimulation and neuroenhancement. Clin. Neurophysiol. Pract. 2022, 7, 146–165. [Google Scholar] [CrossRef] [PubMed]
  57. Volkow, N.D.; McLellan, A.T. Opioid Abuse in Chronic Pain—Misconceptions and Mitigation Strategies. N. Engl. J. Med. 2016, 374. [Google Scholar] [CrossRef] [PubMed]
  58. Bazzano, L.A.; Durant, J.; Brantley, P.R. A modern history of informed consent and the role of key information. Ochsner J. 2021, 21, 81–85. [Google Scholar] [CrossRef] [PubMed]
  59. Barstow, C.; Shahan, B.; Roberts, M. Evaluating medical decision-making capacity in practice. Am. Fam. Physician 2018, 98, 40–46. [Google Scholar] [PubMed]
  60. Tham, J.; Gómez, A.G.; Garasic, M.D. Cross-Cultural and Religious Critiques of Informed Consent. In Cross-Cultural and Religious Critiques of Informed Consent; Routledge: London, UK, 2021. [Google Scholar] [CrossRef]
  61. De Paor, S.; Heravi, B. Information literacy and fake news: How the field of librarianship can help combat the epidemic of fake news. J. Acad. Librariansh. 2020, 46, 102218. [Google Scholar] [CrossRef]
  62. Khan, M.L.; Idris, I.K. Recognise misinformation and verify before sharing: A reasoned action and information literacy perspective. Behav. Inf. Technol. 2019, 38, 1194–1212. [Google Scholar] [CrossRef]
  63. Lewandowsky, S.; Van Der Linden, S. Countering misinformation and fake news through inoculation and prebunking. Eur. Rev. Soc. Psychol. 2021, 32, 348–384. [Google Scholar] [CrossRef]
  64. Siegmann, J.; Grayot, J. The Vices and Virtues of Instrumentalized Knowledge. Philosophies 2023, 8, 84. [Google Scholar] [CrossRef]
  65. Pritchard, D. Recent work on epistemic value. Am. Philos. Q. 2007, 44, 85–110. [Google Scholar]
  66. Friedman, J. Post-Truth and the Epistemological Crisis. Crit. Rev. 2023, 35, 1–21. [Google Scholar] [CrossRef]
  67. Hoggan-Kloubert, T.; Hoggan, C. Post-Truth as an Epistemic Crisis: The Need for Rationality, Autonomy, and Pluralism. Adult Educ. Q. 2023, 73, 3–20. [Google Scholar] [CrossRef]
  68. Mason, S.A. Turning Data Into Knowledge: Lessons from Six Milwaukee Public Schools. American Education Research Association, April. 2002. Available online: https://wcer.wisc.edu/docs/working-papers/Working_Paper_No_2002_3.pdf (accessed on 23 March 2024).
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Cohen, N.; Garasic, M.D. Informed Ignorance as a Form of Epistemic Injustice. Philosophies 2024, 9, 59. https://doi.org/10.3390/philosophies9030059
