Entry

Scientific Misinformation

by
Alessandro Siani
School of the Environment and Life Sciences, University of Portsmouth, Portsmouth PO1 2DY, UK
Encyclopedia 2025, 5(3), 119; https://doi.org/10.3390/encyclopedia5030119
Submission received: 2 July 2025 / Revised: 31 July 2025 / Accepted: 7 August 2025 / Published: 11 August 2025
(This article belongs to the Collection Encyclopedia of Social Sciences)

Definition

Scientific misinformation refers to false, misleading, or inaccurate information that contradicts, ignores, or misrepresents established scientific evidence or consensus. It can stem from accidental misunderstanding or ignorance of facts, as well as from their deliberate distortion (generally referred to as “disinformation”). Unlike rigorous scientific evidence, misinformation typically lacks a credible evidential basis and does not rely on the scientific method. Often spread through mass media, social networking platforms, or informal communication, scientific misinformation can undermine public trust in science, influence health and policy decisions, and contribute to confusion or harmful behaviours at both personal and societal levels.

1. Introduction and Key Glossary

Scientific misinformation has been defined as “publicly available information that is misleading or deceptive relative to the best available scientific evidence and that runs contrary to statements by actors or institutions who adhere to scientific principles” [1]. It is important to note that the word “misinformation” refers to factually incorrect information shared without knowledge of its falseness or any intent to deceive. In this sense, it is distinct from both “disinformation” (deliberately false information shared with intent to deceive) and “malinformation” (factually correct information shared with intent to deceive or harm, e.g., information taken out of context or distributed with hateful/malicious intent) [2]. While misinformation, disinformation, and malinformation have distinct definitions in the taxonomy of information disorder, the categorisation of harmful information is complicated by a degree of overlap between the three concepts [3]. For example, an individual might be unaware of the falseness or maliciousness of some information and spread it while believing it to be true and without meaning any harm—or even in a misguided attempt to be helpful to others. Depending on the context, misinformation and disinformation can take different forms and denominations, including “fake news” (deliberately misleading information maliciously presented as legitimate news items), “pseudoscience” (beliefs or practices that claim to be scientific but lack the empirical support and methodological rigour that define legitimate science), and “propaganda” (communication designed to promote a particular viewpoint or agenda by controlling or manipulating the flow of information) [4].

2. History and Notable Cases

While scientific misinformation is usually regarded as a modern-day phenomenon, its history predates the emergence of the internet and mass media. Although misconceptions about natural phenomena have existed since early human history, it would be disingenuous and anachronistic to frame pre-scientific erroneous beliefs as scientific misinformation. Therefore, the examples presented in this section refer to periods following the development of the modern scientific method, which is widely agreed, in retrospect, to have arisen around the 17th century from empiricism (e.g., Bacon, Locke, Galileo) and mathematical rationalism (e.g., Newton, Descartes, Leibniz).

2.1. Beringer’s “Lying Stones”

An early example of what we would now classify as scientific misinformation is the case of Professor Johann Beringer, a German physician who obtained his doctorate in medicine from the University of Würzburg in 1694 [5]. In 1726, Beringer published a study reporting the discovery of bizarre animal fossils carved with divine inscriptions and symbols, which later became known as the “lying stones”. The stones were subsequently revealed to have been crafted by three of his assistants (aged 14, 17, and 18 at the time) as a hoax, allegedly orchestrated by fellow faculty members who resented Beringer’s success and perceived arrogance.

2.2. Phrenology

Another notable historical example of scientific misinformation is phrenology, the pseudoscientific belief that the shape of an individual’s cranium could reveal or predict their personality traits and propensity for violence and crime [6]. Developed in the late 18th century by German physician Franz Joseph Gall, phrenology gained influence in the 19th century, when it was widely accepted as a valid scientific theory even though its lack of methodological rigour had been called into question by the second half of that century. The early 20th century saw a revived interest in phrenology, which was embraced by segments of the medical, academic, and political establishment and even contributed, in combination with concepts borrowed from social Darwinism, to the theoretical justification of colonial, imperialist, racist, and classist ideologies [7].

2.3. The Piltdown Man

Nationalist undertones and cultural prejudice also characterised the discovery of the Piltdown Man, another notable case of scientific misinformation that emerged in the early 20th century [8]. The Piltdown Man was a fossil (consisting of a combination of a human skull and an orangutan jaw) discovered in East Sussex (England) in 1912, purported to be the “earliest Englishman” and the missing link between apes and humans. As argued by Stephen Jay Gould, it is conceivable that nationalism and cultural bias contributed to the acceptance of the Piltdown Man, as it constituted “proof” that the earliest humans arose in Europe [9]. Although the discovery had been met with scepticism by part of the scientific community from the outset, it took over 40 years for the forgery to be fully debunked, making it one of the most notorious and long-lasting scientific hoaxes in history.

2.4. The Sokal Hoax

The Sokal hoax is a cogent example of how misinformation can deliberately be used to expose flaws and vulnerabilities within the scientific community [10]. In 1996, American physicist Alan Sokal submitted a deliberately nonsensical article to the social and cultural studies journal Social Text that mimicked postmodern discourse while intentionally misusing scientific terminology. The article, titled “Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity”, was accepted for publication without peer review [11]. Sokal later revealed the hoax in an article published in Lingua Franca magazine, arguing that ideological bias and a lack of academic rigour had allowed the piece to pass editorial scrutiny. The hoax prompted a wider debate about academic standards, editorial accountability, and ideological bias in scholarly publishing.

2.5. The MMR Scandal

The measles, mumps and rubella (MMR) vaccine scandal is perhaps the most egregious case of scientific misinformation, with far-reaching consequences that continue to exert a detrimental impact to this day [12]. In 1998, English physician Andrew Wakefield published an article in The Lancet suggesting a link between the MMR vaccine and the development of gastrointestinal abnormalities and autism spectrum disorder in children [13]. Subsequent investigations revealed serious ethical violations, undisclosed conflicts of interest, and manipulated data on Wakefield’s part. The paper was fully retracted in 2010, and Wakefield was struck off the UK medical register over multiple charges of serious professional misconduct. Although Wakefield’s claims have been widely debunked by numerous clinical and epidemiological studies, the scandal contributed to an increase in vaccine hesitancy over the following decades, with disastrous and enduring public health consequences [14].

2.6. The COVID-19 Infodemic

Vaccine scepticism was also central to the rampant diffusion of scientific misinformation during the COVID-19 pandemic. An “infodemic” of conspiracy theories and unsubstantiated claims about the virus’s origins, transmission routes, prevention strategies, and treatments circulated widely through social media, blogs, and even mainstream news channels [15]. Misleading or factually incorrect information about mask efficacy, vaccine safety, and unproven treatments such as hydroxychloroquine and ivermectin influenced public attitudes and behaviours, leading to lower confidence in (and compliance with) public health guidelines and preventive measures [16].

2.7. Climate Change Scepticism and Denial

Misinformation surrounding anthropogenic climate change is a present and pressing concern, hindering the global efforts required to reduce fossil fuel consumption and carbon emissions [17]. Despite near-universal consensus within the scientific community regarding the anthropogenic origins of climate change and global warming, misinformation on the topic is still rife. Climate misinformation includes stances such as denialism (the climate is not changing due to human activities, or not changing at all), doomerism (climate change is inevitable and reducing carbon emissions will not stop it), and conspiracy narratives (climate change is a hoax designed to suppress personal freedoms or manipulate the economy). Abundant evidence indicates that such claims are fuelled by decades-long disinformation campaigns orchestrated by actors with a vested interest in maintaining the status quo with regard to energy policies, e.g., fossil fuel lobbyists and conservative think tanks [18].
The historical examples discussed in this section illustrate that misinformation is not a modern construct but rather a recurring feature of the human relationship with science and nature. Throughout history, misinformation has often gained traction when it aligns with existing beliefs, cultural narratives, or power structures; similar patterns and determinants continue to drive the onset and diffusion of scientific misinformation today, as discussed in the next section.

3. Psychological, Social and Cultural Drivers

Individuals’ belief in scientific misinformation and its diffusion among a population are underpinned by a combination of psychological, social, and cultural drivers, the understanding of which is crucial for developing mitigation strategies and effective scientific communication approaches. This section provides a concise overview of the most salient factors that have been shown to influence susceptibility to scientific misinformation in individuals and across populations.

3.1. Cognitive Biases

At the individual level, cognitive biases play a key role in determining susceptibility to scientific misinformation [19]. In particular, confirmation bias (the tendency to seek and prefer information that aligns with one’s pre-existing beliefs) can lead individuals to cherry-pick sources that reinforce their viewpoints while ignoring or dismissing any evidence that challenges their beliefs, as exemplified in the previous section by the cases of the Piltdown Man and the Sokal hoax. In the digital era, confirmation bias is exacerbated by online “echo chambers”, whereby people tend to seek and engage with like-minded individuals in ideologically homogeneous social networks. Exposure to echo chambers and repeated confirmation of erroneous views contribute to the consolidation of misinformation within individuals and its subsequent spread among the population through a phenomenon known as the illusory truth effect [20]. Moreover, cognitive biases related to overconfidence in one’s own knowledge and understanding of scientific matters (e.g., the illusion of explanatory depth and the Dunning–Kruger effect) have been shown to lead individuals to discount expert opinions and embrace fallacious views and conspiracy theories [21].

3.2. Emotions

The spread of scientific misinformation is particularly pronounced when dealing with ideologically or emotionally charged topics such as climate change, nuclear energy, genetic modification, or vaccine safety. A recent study revealed that heightened emotional involvement negatively impacts individuals’ capacity to evaluate the veracity of information, and that fake news eliciting an emotional response is more likely to be believed than information relying on analytical and logical reasoning [22]. This effect is especially strong in cases involving high-arousal negative emotions (e.g., anger, fear, and disgust), which have been shown to be frequently embedded in online rumours, resulting in increased virality and longer-lasting diffusion [23]. For example, the MMR scandal and the COVID-19 infodemic gained momentum because of strong emotional responses driven by fear (e.g., concerns about children’s safety), uncertainty (e.g., about the rapid development of mRNA vaccines), and outrage (e.g., perceived violation of bodily autonomy by vaccine mandates).

3.3. Personality Traits

Individuals’ susceptibility to misinformation and their tendency to share it have also been shown to be influenced by intrinsic personality traits. A recent review revealed that “more extroverted and less conscientious and agreeable people tend to be more susceptible to believing in and sharing misinformation” [24]. Moreover, the sharing of misinformation is positively associated with all three of the Dark Triad personality traits, namely, narcissism, psychopathy, and Machiavellianism. A large-scale multinational study revealed a correlation between paranoid ideation and conspiracy mentality, both of which were associated, although to different extents, with interpersonal mistrust [25].

3.4. Social Cues and Engagement Metrics

Recent research has highlighted that the spread of scientific misinformation in digital forums is facilitated by social cues and engagement metrics, with one study revealing that “misinformation with negative sentiment had significantly higher interactivity than neutral and positive content” [26]. On social media platforms, indicators such as the number of likes, shares, retweets, and comments are often perceived as proxies for credibility and popularity, regardless of the veracity of the information being shared. A recent meta-analysis of 41 studies revealed that users tend to interpret high-engagement posts as more trustworthy, regardless of the reliability of their content, a phenomenon known as the “bandwagon effect” [27].

3.5. Cultural and Religious Values

Cultural and religious values influence how people interpret and engage with scientific information, modulating their susceptibility to misinformation. Individuals tend to align their perception of risk and their opinions on controversial issues with pre-existing cultural values and beliefs, a concept known as “cultural cognition”. For example, a survey of US adults revealed that “Individuals possessing strongly held cultural worldviews not only choose news outlets where they expect to find culturally congruent arguments about climate change, but they also selectively process the arguments they encounter” [28]. Religious identity further shapes engagement with scientific information, both as a source of moral evaluation and a social anchor. A scoping review including 14 studies indicated that religious beliefs are a significant factor contributing to vaccine hesitancy [29]. Likewise, a meta-analysis of 87 reports revealed that “people with strong religious views are more likely to believe in conspiracy theories than less religious or non-religious individuals” [30].

3.6. Educational Levels and Political Orientation

Educational levels (particularly scientific and digital literacy) and political orientation play a critical role in determining how people perceive and interpret scientific evidence, influencing their susceptibility to scientific misinformation [31]. A study conducted during the COVID-19 pandemic revealed that “Respondents with higher levels of science education and motivation relied less on misinformation, even if they did not necessarily intend to follow the health recommendations”, indicating that the decision-making process with respect to scientific and healthcare matters is modulated by more factors than just the simple understanding (or lack thereof) of facts [32]. A systematic meta-analysis of 31 studies involving 11,561 US-based participants highlighted that “political identity had a strong, credible, and negative effect on discrimination ability, with Republican participants achieving lower overall accuracy compared to Democrat participants” when it came to discerning truth from false news [33]. Likewise, another study revealed that “right-wing adherents were less trustful of scientists and believed in COVID-19-related misinformation more than left-wing adherents, and these two factors accounted for their higher vaccine hesitancy and reduced willingness to receive an anti-COVID-19 vaccination” [34].

3.7. Mistrust of Political and Scientific Institutions

Mistrust of political institutions and the scientific establishment contributes to the onset and diffusion of scientific misinformation. A survey of 71,922 respondents across 68 countries highlighted that, although scientists and the scientific methods are widely trusted in most countries, “distrusting minorities may affect considerations of scientific evidence in policymaking, as well as decisions by individuals that can affect society at large” [35]. Instances of governmental miscommunication, mismanagement, and corruption in the handling of the COVID-19 pandemic contributed to a reduction in public trust and adherence to preventive measures in the UK [36]. Mistrust in the political, academic, and scientific establishments is often stoked by populist parties for political gain and traction. A large-scale survey conducted in four countries led by populist leaders during the pandemic (Brazil, Poland, Serbia, USA) revealed that “populist attitudes are the most significant predictor of distrust in political institutions in all four countries”, and that in Brazil and the USA, “populist voters were more likely to distrust expert institutions” [37]. In recent years, populist leaders and think tanks have deliberately weaponised misinformation as a strategy to manipulate public opinion and consolidate power, as exemplified by the misinformation campaigns surrounding the 2016 Brexit referendum in the UK and the rise of right-wing populism in several countries, including Brazil, the USA, and, more recently, Argentina [35]. By presenting themselves as the sole voice of the “people” against a purportedly corrupt elite (that typically includes scientists, mainstream media, and intellectuals), populist figures frequently undermine trust in experts and established institutions. This erosion of academic and institutional trust, combined with the pervasive emotive and polarising rhetoric, provides fertile ground for the rejection of rational political and scientific discourse and the affirmation of anti-intellectual and conspiratorial mindsets.
Elucidating the psychological, social, and cultural drivers of scientific misinformation is far from being merely an academic exercise. As discussed in the next section, the development and deployment of strategies aimed at tackling misinformation require a deep understanding of the factors that facilitate its spread at both the individual and societal levels.

4. Prevention and Mitigation Strategies

A recent World Economic Forum report drawing on the opinions of over 900 global risks experts, policymakers, and industry leaders identified “misinformation and disinformation” as one of the five most pressing global threats in 2025 [38]. Likewise, an editorial in The Lancet highlighted that “Disinformation has become a deliberate instrument to attack and discredit scientists and health professionals for political gains. The effects are destructive and damaging to public health” and recommended that “governments and science communicators must strive to ensure that public health messaging is relevant to the individual; to not only provide accurate information but also foster an environment of trust and understanding, and to acknowledge areas of uncertainty and unknowns” [39]. This section aims to briefly synthesise current and potential strategies to limit the production and dissemination of scientific misinformation, with emphasis on both preventive and mitigative approaches.

4.1. Prevention: Education, Prebunking, and Inoculation

Public education is a cornerstone of any effort to improve scientific and digital literacy among the population and increase their resistance to misinformation and disinformation. Scholars and educators have raised concerns that current science curricula often overlook the development of independent critical evaluation skills, which means that “science education across the globe, despite its avowed commitment to the importance of scientific literacy, is failing to prepare students to engage successfully with too much of the science in their everyday lives” [40].
However, other studies have highlighted that primary and secondary school students show a good ability to distinguish between scientific facts and misinformation and an awareness of the reliability of news sources, particularly when supported by appropriate educational interventions [41,42]. Remarkably, there is evidence indicating that secondary students can be more competent at identifying the falsity of scientific misconceptions (e.g., “vaccines cause autism”, “the Earth is flat”, “the Moon landing was staged”) compared to adults living in the same country [43]. In the context of combating misinformation, this observation reinforces the importance of providing adults with lifelong learning opportunities both within and outside the boundaries of traditional higher education [44]. “Prebunking”, a strategy based on inoculation theory, has emerged as a particularly promising intervention, involving the exposure to weakened forms of misinformation alongside counterarguments to enhance resistance to more persuasive deceptive narratives [45]. A review of studies employing prebunking and inoculation strategies indicated that this approach can successfully “immunise” individuals against misinformation and fake news in “real world” situations [46]. While prebunking and inoculation strategies have shown promising potential in controlled studies, their wider implementation and scalability remain a matter of debate, particularly with respect to maintaining immunity across diverse populations and digital ecosystems. Specifically, the observation that repeated exposure to prebunking cues might be necessary to maintain the protective effects over time raises concerns over the practical applicability of such an approach. This is particularly the case for informal interventions or short-term scenarios (e.g., school or university courses) in which the target cohort might be difficult to reach for follow-up reinforcement after leaving the institution.

4.2. Mitigation: Fact-Checking, Debunking, and Algorithmic Interventions

While education and prevention are essential for medium- and long-term efforts to stem the onset and diffusion of misinformation, it is crucial to put mitigation strategies in place to tackle the issue whenever misinformation is already rooted within individuals and populations. Fact-checking strategies based on the REACT framework (Repetition, Empathy, Alternative explanations, Credible sources, Timeliness) are considered the gold standard approach to rectify misconceptions and debunk myths [47]. Based on a meta-analysis of 20 studies, Chan and colleagues issued three key recommendations to optimise debunking efforts, namely, “reduce arguments that support misinformation, engage audiences in scrutiny and counterarguing of misinformation, and introduce new information as part of the debunking message” [48]. A large study involving 1000 participants per country across 16 European countries showed that in all countries, “people were less likely to believe the misinformation claim after being exposed to a fact-check, with this effect being quite strong” [49]. Corrective efforts are significantly more effective when delivered by messengers who hold trust and credibility among the target audience, whether due to social group identity, faith affiliation, or professional standing. A study involving 2805 participants in India and Pakistan revealed that debunking messages delivered via WhatsApp are re-shared at higher rates when the sender has close personal ties or similar political views to the receiver [50]. When developing and deploying mitigation strategies based on fact-checking and debunking, it is essential to be aware that they can be less effective when misinformation aligns with a person’s core identity or deeply entrenched beliefs. While concerns about a potential “backfire effect” (whereby fact-checking might end up reinforcing false beliefs) have not been supported by substantial empirical evidence, it remains a theoretically plausible risk under specific conditions, such as when corrections directly challenge group-defining narratives or are perceived as threatening to one’s ingrained worldview [47,49]. Therefore, corrective efforts should remain sensitive to individual differences in information processing, identity, and personal beliefs.
As digital platforms bear a significant share of responsibility for allowing (and in some cases facilitating) the spread of misinformation, their cooperation and accountability should be incentivised and, when necessary, enforced through policy. Research indicates that design-level interventions such as “traffic light” reliability indicators [51] or additional prompts before sharing [52] can be successfully implemented to mitigate the uptake and spreading of misinformation. Likewise, recommendation algorithms (which are typically designed to prioritise user engagement over content veracity) could potentially be re-tuned to ensure that recommended content is based on reliable sources [53].
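As a purely illustrative sketch of the kind of re-tuning mentioned above (not drawn from [53]; the field names and weights below are hypothetical), the following Python snippet shows how a feed-ranking score based solely on engagement could be blended with a source-reliability rating, so that highly engaging items from unreliable sources are demoted:

```python
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    engagement: float   # normalised 0-1 engagement signal (likes, shares, watch time)
    reliability: float  # normalised 0-1 source-reliability rating (e.g., from an external fact-checking index)

def rerank(items, reliability_weight=0.5):
    """Blend engagement with source reliability: weight 0 reproduces a purely
    engagement-driven ranking, weight 1 ranks by reliability alone."""
    def score(item):
        return (1 - reliability_weight) * item.engagement + reliability_weight * item.reliability
    return sorted(items, key=score, reverse=True)

feed = [
    Item("Viral miracle-cure claim", engagement=0.95, reliability=0.10),
    Item("Public health agency update", engagement=0.40, reliability=0.95),
    Item("Peer-reviewed study summary", engagement=0.55, reliability=0.90),
]

for item in rerank(feed, reliability_weight=0.6):
    print(item.title)
```

In this toy example, raising the reliability weight pushes the low-reliability but highly engaging item down the feed, which is the behavioural change that such design-level interventions aim to achieve.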

4.3. Artificial Intelligence: A Double-Edged Sword

The rapid development of generative artificial intelligence (AI) and large language models (LLMs) is causing a seismic shift in the current information landscape [54]. Such tools allow unprecedented scale and sophistication in the creation of new (and often malicious or misleading) content, enabling virtually anyone with internet access to rapidly generate text, images, and footage that are often indistinguishable from human-generated material. Worryingly, research indicates that “large language models currently available can already produce text that is indistinguishable from organic text”, and “false synthetic tweets are recognized as false worse than false organic tweets” [55]. However, the same study also revealed that content generated by AI is easier to understand than that generated by humans, and that AI-generated correct information is recognised as true more quickly and frequently than its organic counterpart. This observation suggests that AI could become an invaluable tool if appropriately leveraged to combat misinformation and communicate reliable information. A systematic review of 76 studies indicated that “AI systems can effectively detect and mitigate false information on a large scale” [56]. The authors also recommended that efforts to leverage AI to combat misinformation should always include human oversight to ensure “transparency, privacy protection, bias mitigation, and accountability throughout the development and deployment phases”. These recommendations are critically important, as AI systems are reportedly susceptible to biases stemming from their training data and opaque operating parameters. Concerningly, the operation of many AI systems has been described as a “black box”, whereby the choice of certain outputs based on the training data often eludes even the developers of the model itself. While a wider discussion of the ethical implications of AI use in combating misinformation falls outside the scope of the present article, it is important to acknowledge that such applications raise complex ethical concerns, including risks to freedom of expression, algorithmic bias, epistemic manipulation, and the potential for overreach in content moderation [57,58]. Without human oversight and deliberate safeguards, AI interventions risk reinforcing existing biases, suppressing legitimate viewpoints, and ultimately amplifying misinformation.

5. Conclusions

This entry sought to synthesise historical, psychological, social, and technological perspectives to provide an integrated and concise overview of how scientific misinformation emerges and spreads, with the aim of informing interventions in education and policymaking. While scientific misinformation is not a new phenomenon, its diffusion and impact are a defining trait of the current “post-truth” era. The recurring patterns observed across historical cases of scientific misinformation, whether in the form of hoaxes, pseudoscience, or fabricated findings, highlight how misinformation proliferates particularly when it aligns with existing social vulnerabilities, cognitive biases, power structures, and economic interests. By distorting public understanding of scientific facts and eroding trust in science and scientists, misinformation undermines informed decision-making at both the individual and societal levels. Unless urgent action is taken to limit its diffusion, scientific misinformation will continue to drastically hinder our collective ability to address pressing global challenges, such as climate change, pandemics, antibiotic resistance, and the uncertainties surrounding the development of artificial intelligence.
Tackling scientific misinformation requires systematic and coherent global efforts based around three priority areas. The first entails embedding critical thinking, digital literacy, and scientific reasoning across all levels of formal and informal education, involving not just students but also educators, educational leaders, and school leavers. Secondly, policy interventions are required to enforce accountability, transparency, and ethical integrity from digital media platforms and AI developers. Thirdly, it is essential that all efforts in the previous two areas continue to be underpinned and informed by interdisciplinary research elucidating the factors that influence vulnerability to misinformation among individuals and populations. While enormous progress has been made over the last decades toward understanding the determinants of misinformation and developing prevention and mitigation strategies, substantial challenges remain with regard to their integration into policy and governance. Addressing these challenges will require that the combined efforts of educators, scientists, policymakers, and digital media platforms transcend national boundaries and acknowledge that scientific misinformation poses a global and existential threat to our well-being and survival as a species.

Funding

This research received no external funding.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The author declares no conflicts of interest.

References

  1. Southwell, B.G.; Brennen, J.S.B.; Paquin, R.; Boudewyns, V.; Zeng, J. Defining and Measuring Scientific Misinformation. Ann. Am. Acad. Political Soc. Sci. 2022, 700, 98–111. [Google Scholar] [CrossRef]
  2. Lim, W.M. Fact or fake? The search for truth in an infodemic of disinformation, misinformation, and malinformation with deepfake and fake news. J. Strateg. Mark. 2023, 1–37. [Google Scholar] [CrossRef]
  3. Gradoń, K.T.; Hołyst, J.A.; Moy, W.R.; Sienkiewicz, J.; Suchecki, K. Countering misinformation: A multidisciplinary approach. Big Data Soc. 2021, 8. [Google Scholar] [CrossRef]
  4. Altay, S.; Berriche, M.; Heuer, H.; Farkas, J.; Rathje, S. A survey of expert views on misinformation: Definitions, determinants, solutions, and future of the field. Harv. Kennedy Sch. Misinformation Rev. 2023, 4, 1–34. [Google Scholar] [CrossRef]
  5. Jahn, M.E. Dr. Beringer and the Würzburg “Lügensteine”. J. Soc. Bibliogr. Nat. Hist. 1963, 4, 138–146. [Google Scholar] [CrossRef]
  6. Greenblatt, S.H. Phrenology in the science and culture of the 19th century. Neurosurgery 1995, 37, 790–805. [Google Scholar] [CrossRef]
  7. Goulden, M. Bringing Bones to Life: How Science Made Piltdown Man Human. Sci. Cult. 2007, 16, 333–357. [Google Scholar] [CrossRef]
  8. Mitchell, P.W.; Michael, J.S. Bias, brains, and skulls: Tracing the legacy of scientific racism in the nineteenth-century works of Samuel George Morton and Friedrich Tiedemann. In Embodied Difference: Divergent Bodies in Public Discourse; Lexington Books: Lanham, MD, USA, 2019; pp. 77–98. [Google Scholar]
  9. Gould, S.J. The Panda’s Thumb: More Reflections in Natural History; WW Norton Company: New York, NY, USA, 1980. [Google Scholar]
  10. Hodge, B. The Sokal ‘Hoax’: Some implications for science and postmodernism. Contin. J. Media Cult. Stud. 1999, 13, 255–269. [Google Scholar] [CrossRef]
  11. Sokal, A.D. Transgressing the boundaries: Toward a transformative hermeneutics of quantum gravity. Soc. Text 1996, 46/47, 217–252. [Google Scholar] [CrossRef]
  12. Godlee, F.; Smith, J.; Marcovitch, H. Wakefield’s article linking MMR vaccine and autism was fraudulent. BMJ 2011, 342. [Google Scholar] [CrossRef]
  13. Wakefield, A.; Murch, S.; Anthony, A.; Linnell, J.; Casson, D.; Malik, M.; Berelowitz, M.; Dhillon, A.; Thomson, M.; Harvey, P.; et al. RETRACTED: Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children. Lancet 1998, 351, 637–641. [Google Scholar] [CrossRef]
  14. Siani, A. Vaccine hesitancy and refusal: History, causes, mitigation strategies. In Integrated Science of Global Epidemics; Springer International Publishing: Cham, Switzerland, 2023; pp. 503–517. [Google Scholar] [CrossRef]
  15. The COVID-19 infodemic. Lancet Infect. Dis. 2020, 20, 875. [CrossRef]
  16. Siani, A.; Green, I. Scientific misinformation and mistrust of COVID-19 preventive measures among the UK population: A pilot study. Vaccines 2023, 11, 301. [Google Scholar] [CrossRef]
  17. Treen, K.M.D.I.; Williams, H.T.; O’Neill, S.J. Online misinformation about climate change. Wiley Interdiscip. Rev. Clim. Change 2020, 11, e665. [Google Scholar] [CrossRef]
  18. Lewandowsky, S. Climate change disinformation and how to combat it. Annu. Rev. Public Health 2021, 42, 1–21. [Google Scholar] [CrossRef] [PubMed]
  19. Marie, A.; Altay, S.; Strickland, B. The cognitive foundations of misinformation on science: What we know and what scientists can do about it. EMBO Rep. 2020, 21, e50205. [Google Scholar] [CrossRef] [PubMed]
  20. Vellani, V.; Zheng, S.; Ercelik, D.; Sharot, T. The illusory truth effect leads to the spread of misinformation. Cognition 2023, 236, 105421. [Google Scholar] [CrossRef]
  21. Vranic, A.; Hromatko, I.; Tonković, M. “I did my own research”: Overconfidence, (dis)trust in science, and endorsement of conspiracy theories. Front. Psychol. 2022, 13, 931865. [Google Scholar] [CrossRef] [PubMed]
  22. Martel, C.; Pennycook, G.; Rand, D.G. Reliance on emotion promotes belief in fake news. Cogn. Res. 2020, 5, 47. [Google Scholar] [CrossRef]
  23. Pröllochs, N.; Bär, D.; Feuerriegel, S. Emotions in online rumor diffusion. EPJ Data Sci. 2021, 10, 51. [Google Scholar] [CrossRef]
  24. Calvillo, D.P.; León, A.; Rutchick, A.M. Personality and misinformation. Curr. Opin. Psychol. 2024, 55, 101752. [Google Scholar] [CrossRef] [PubMed]
  25. Martinez, A.P.; Shevlin, M.; Valiente, C.; Hyland, P.; Bentall, R.P. Paranoid beliefs and conspiracy mentality are associated with different forms of mistrust: A three-nation study. Front. Psychol. 2022, 13, 1023366. [Google Scholar] [CrossRef] [PubMed]
  26. Chen, R.; Chen, G.; Zhang, L.; Xie, R.; Chen, R. An analysis of the factors influencing engagement metrics within the dissemination of health science misinformation. Front. Public Health 2025, 13, 1571210. [Google Scholar] [CrossRef] [PubMed]
  27. Wang, S.; Chu, T.H.; Huang, G. Do Bandwagon Cues Affect Credibility Perceptions? A Meta-Analysis of the Experimental Evidence. Commun. Res. 2023, 50, 720–744. [Google Scholar] [CrossRef]
  28. Newman, T.P.; Nisbet, E.C.; Nisbet, M.C. Climate change, cultural cognition, and media effects: Worldviews drive news selectivity, biased processing, and polarized attitudes. Public Underst. Sci. 2018, 27, 985–1002. [Google Scholar] [CrossRef]
  29. Tiwana, M.H.; Smith, J. Faith and vaccination: A scoping review of the relationships between religious beliefs and vaccine hesitancy. BMC Public Health 2024, 24, 1806. [Google Scholar] [CrossRef]
  30. Stasielowicz, L. Who believes in conspiracy theories? A meta-analysis on personality correlates. J. Res. Personal. 2022, 98, 104229. [Google Scholar] [CrossRef]
  31. Siani, A.; Carter, I.; Moulton, F. Political views and science literacy as indicators of vaccine confidence and COVID-19 concern. J. Prev. Med. Hyg. 2022, 63, E257–E269. [Google Scholar] [CrossRef]
  32. Rozenblum, Y.; Dalyot, K.; Baram-Tsabari, A. People who have more science education rely less on misinformation—Even if they do not necessarily follow the health recommendations. J. Res. Sci. Teach. 2025, 62, 825–868. [Google Scholar] [CrossRef]
  33. Sultan, M.; Tump, A.N.; Ehmann, N.; Lorenz-Spreen, P.; Hertwig, R.; Gollwitzer, A.; Kurvers, R.H. Susceptibility to online misinformation: A systematic meta-analysis of demographic and psychological factors. Proc. Natl. Acad. Sci. USA 2024, 121, e2409329121. [Google Scholar] [CrossRef]
  34. Santirocchi, A.; Spataro, P.; Alessi, F.; Rossi-Arnaud, C.; Cestari, V. Trust in science and belief in misinformation mediate the effects of political orientation on vaccine hesitancy and intention to be vaccinated. Acta Psychol. 2023, 237, 103945. [Google Scholar] [CrossRef] [PubMed]
  35. Cologna, V.; Mede, N.G.; Berger, S.; Besley, J.; Brick, C.; Joubert, M.; Maibach, E.W.; Mihelj, S.; Oreskes, N.; Schäfer, M.S.; et al. Trust in scientists and their role in society across 68 countries. Nat. Hum. Behav. 2025, 9, 713–730. [Google Scholar] [CrossRef]
  36. Siani, A. Mixed messages, broken trust, avoidable deaths: A critical appraisal of the UK government’s response to the COVID-19 pandemic. Open Health 2024, 5, 20230016. [Google Scholar] [CrossRef]
  37. Štětka, V.; Brandao, F.; Mihelj, S.; Tóth, F.; Hallin, D.; Rothberg, D.; Ferracioli, P.; Klimkiewicz, B. Have people ‘had enough of experts’? The impact of populism and pandemic misinformation on institutional trust in comparative perspective. Inf. Commun. Soc. 2024, 28, 1039–1060. [Google Scholar] [CrossRef]
  38. World Economic Forum. Global Risks Report 2025: Conflict, Environment and Disinformation Top Threats. Available online: https://www.weforum.org/press/2025/01/global-risks-report-2025-conflict-environment-and-disinformation-top-threats/ (accessed on 29 June 2025).
  39. The Lancet. Health in the age of disinformation. Lancet 2025, 405, 173. [CrossRef]
  40. Osborne, J.; Pimentel, D. Science education in an age of misinformation. Sci. Educ. 2023, 107, 553–571. [Google Scholar] [CrossRef]
  41. Siani, A.; Gennari, A. Scientific misinformation, disinformation and perception of news sources among Italian secondary school students. Int. J. Sci. Educ. 2025, 1–15. [Google Scholar] [CrossRef]
  42. Allaire-Duquette, G.; Hasni, A.; Drouin, J.N.; Groleau, A.; Mahhou, A.; Legault, A.; Khayat, A.; Carignan, M.-E.; Ayotte-Beaudet, J.-P. Primary school pupils’ ability to detect fake science news following a news media literacy intervention: Exploration of their success rate, evaluation strategies, self-efficacy beliefs, and views of science news. J. Digit. Educ. Technol. 2025, 5, ep2509. [Google Scholar] [CrossRef]
  43. Siani, A.; Joseph, M.; Dacin, C. Susceptibility to scientific misinformation and perception of news source reliability in secondary school students. Discov. Educ. 2024, 3, 93. [Google Scholar] [CrossRef]
  44. Wright, R.R. The way forward: Adult educators combating mis/disinformation via formal and informal education. New Dir. Adult Contin. Educ. 2023, 2023, 119–130. [Google Scholar] [CrossRef]
  45. Roozenbeek, J.; van der Linden, S.; Nygren, T. Prebunking interventions based on “inoculation” theory can reduce susceptibility to misinformation across cultures. Harv. Kennedy Sch. Misinformation Rev. 2020, 1, 1–24. [Google Scholar] [CrossRef]
  46. Lewandowsky, S.; van der Linden, S. Countering Misinformation and Fake News Through Inoculation and Prebunking. Eur. Rev. Soc. Psychol. 2021, 32, 348–384. [Google Scholar] [CrossRef]
  47. Vraga, E.K.; Ecker, U.K.; Žeželj, I.; Lazić, A.; Azlan, A.A. To debunk or not to debunk? Correcting (mis)information. In Managing Infodemics in the 21st Century: Addressing New Public Health Challenges in the Information Ecosystem; Springer: Cham, Switzerland, 2023; pp. 85–98. Available online: http://dx.crossref.org/10.1007/978-3-031-27789-4_7 (accessed on 29 June 2025).
  48. Chan, M.S.; Jones, C.R.; Hall Jamieson, K.; Albarracín, D. Debunking: A Meta-Analysis of the Psychological Efficacy of Messages Countering Misinformation. Psychol. Sci. 2017, 28, 1531–1546. [Google Scholar] [CrossRef]
  49. Van Erkel, P.F.; Van Aelst, P.; de Vreese, C.H.; Hopmann, D.N.; Matthes, J.; Stanyer, J.; Corbu, N. When are fact-checks effective? An experimental study on the inclusion of the misinformation source and the source of fact-checks in 16 European Countries. Mass Commun. Soc. 2024, 27, 851–876. [Google Scholar] [CrossRef]
  50. Pasquetto, I.V.; Jahani, E.; Atreja, S.; Baum, M. Social debunking of misinformation on WhatsApp: The case for strong and in-group ties. Proc. ACM Hum.-Comput. Interact. 2022, 6, 1–35. [Google Scholar] [CrossRef]
  51. Dobber, T.; Kruikemeier, S.; Votta, F.; Helberger, N.; Goodman, E.P. The effect of traffic light veracity labels on perceptions of political advertising source and message credibility on social media. J. Inf. Technol. Politics 2023, 22, 82–97. [Google Scholar] [CrossRef]
  52. Pennycook, G.; Rand, D.G. Accuracy prompts are a replicable and generalizable approach for reducing the spread of misinformation. Nat. Commun. 2022, 13, 2333. [Google Scholar] [CrossRef] [PubMed]
  53. Pathak, R.; Spezzano, F. An Empirical Analysis of Intervention Strategies’ Effectiveness for Countering Misinformation Amplification by Recommendation Algorithms. In Advances in Information Retrieval; Goharian, N., Tonellotto, N., He, Y., Lipani, A., McDonald, G., Macdonald, C., Ounis, I., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2024; Volume 14611. [Google Scholar] [CrossRef]
  54. Tan, Y.H.; Chua, H.N.; Low, Y.C.; Jasser, M.B. Current Landscape of Generative AI: Models, Applications, Regulations and Challenges. In Proceedings of the 2024 IEEE 14th International Conference on Control System, Computing and Engineering (ICCSCE), Penang, Malaysia, 23–24 August 2024; pp. 168–173. [Google Scholar] [CrossRef]
  55. Spitale, G.; Biller-Andorno, N.; Germani, F. AI model GPT-3 (dis)informs us better than humans. Sci. Adv. 2023, 9, eadh1850. [Google Scholar] [CrossRef]
  56. Saeidnia, H.R.; Hosseini, E.; Lund, B.; Tehrani, M.A.; Zaker, S.; Molaei, S. Artificial intelligence in the battle against disinformation and misinformation: A systematic review of challenges and approaches. Knowl. Inf. Syst. 2025, 67, 3139–3158. [Google Scholar] [CrossRef]
  57. Bontridder, N.; Poullet, Y. The role of artificial intelligence in disinformation. Data Policy 2021, 3, e32. [Google Scholar] [CrossRef]
  58. Coeckelbergh, M. AI and Epistemic Agency: How AI Influences Belief Revision and Its Normative Implications. Soc. Epistemol. 2025, 1–13. [Google Scholar] [CrossRef]
