Definition
Misinformation has emerged as a significant threat to both society and public health, with social media acting as a major conduit for its dissemination. This contributes to harmful health outcomes and undermines trust in authoritative institutions. In addition, the dismantling of scientific authority appears symptomatic of the post-truth era, in which “alternative facts” are presented in public debate as indisputable evidence of the inherent fallibility of scientific knowledge. The prevalence of misinformation on social media platforms stems from multiple, interconnected factors, including individual-level influences such as cognitive biases, as well as systemic aspects of social media’s information architecture. Unlike scientific institutions that adhere to the principles of evidence-based knowledge, social media platforms operate under an attention-driven model that favors virality over factuality. Addressing these challenges effectively requires coordinated, multi-level, and multidisciplinary interventions targeting users, content creators, technology companies, health authorities, and governments to restore public trust and safeguard the credibility of medical expertise.
1. Introduction
Over the years, the use of social media for information-seeking purposes has significantly increased. Many individuals now turn to social media platforms to obtain health-related information. Social media facilitates diverse interactions, enabling communication between patients and their peers [] as well as fostering interaction between patients and healthcare professionals []. In their literature review, Househ, Borycki, and Kushniruk [] documented a rise in patients’ social media use across various health-related topics, noting that different activities on these platforms may produce varying levels of patient engagement and empowerment. As a result, patients increasingly expect their healthcare providers to demonstrate not only outstanding clinical skills but also proficiency in digital communication [] (p. 313).
A significant advantage of social media in health communication lies in its capacity to improve accessibility and disseminate health information to diverse demographic groups, regardless of age, education, race, ethnicity, or geographic location, unlike traditional methods []. Indeed, medical practitioners often use social media to enhance communication with patients, promote behavioral changes, and ultimately achieve better health outcomes, though these benefits must be weighed against risks and ethical concerns []. During the COVID-19 pandemic, for instance, users shared health articles distributed by medical professionals on social media to raise awareness of preventive measures against the virus [] (p. 2).
Research on the effects of social media use on patient-provider relationships indicates that these platforms can shift traditional power dynamics, encouraging patients to take a more active role in discussions and decision-making, a process known as patient empowerment []. Van Dijck and Alinejad describe a fundamental shift from the traditional “institutional model” of science communication, characterized by linear information flows between scientists, policymakers, journalists, and the public, toward a networked model []. In this new framework, social media acts as a “centrifugal force”, enabling multidirectional exchanges among all actors and reshaping the landscape of public debate. The concept of apomediation, introduced by Dr. Gunther Eysenbach, a leading scholar in eHealth and health policy, captures a core aspect of the Medicine 2.0 paradigm: a shift toward openness and active user participation. In this model, individuals who seek medical information online increasingly bypass traditional gatekeepers, such as primary care physicians, and instead access information with the help of “apomediaries” (people or tools) that guide them to relevant and credible health information. Importantly, the presence of apomediaries is not mandatory; users can still access the information independently, but these intermediaries help ensure that the information is accurate and trustworthy [] (p. 6). However, the advantages of this participatory model of information exchange can be offset by the risks and challenges inherent to social media platforms. The algorithmic amplification of misinformation and the formation of echo chambers that propagate views opposing scientific authority can undermine trust in medical expertise, potentially leading to serious public health consequences. Moreover, the absence of traditional journalistic gatekeeping allows populist actors to connect directly with the public, making their messages even more difficult to challenge or dismantle.
Responding to these challenges requires a multidisciplinary approach that promotes critical thinking in the consumption of medical information while also preserving a healthy and trustworthy relationship between medical experts and the public.
The following section addresses these issues through a thematic literature review, organizing research around key concepts and debates related to the topic. Relevant studies were identified through database searches using keywords such as trust in science, medical populism, science-related populism, health mis/disinformation, and health communication. The review is structured around three major thematic areas: trust and values in science; populist and anti-expert discourse; and misinformation, including its triggers and strategies to combat it in digital health contexts. Together, these areas synthesize insights on the interplay between public trust, expertise, and contemporary health communication in digital spaces.
2. Social Media and Misinformation on Health Issues
Previous research has demonstrated that misinformation is highly prevalent across health-related public deliberation, covering topics such as vaccination, infectious diseases, nutrition, cancer, and drug and tobacco use [,]. A systematic review exploring the quality of health information on social media platforms (SMHI) found cancer and diabetes to be the most concerning topics regarding the state of SMHI [].
Drawing from the recent past, the COVID-19 pandemic underscored the dual risks posed by both the virus and the spread of misinformation, with social media serving as a major conduit for pandemic-related false information []. Against this backdrop, in February 2020, WHO Director-General Tedros Adhanom Ghebreyesus cautioned that the COVID-19 outbreak was accompanied by an “infodemic”. This term denotes an excessive flow of information, both factual and misleading, that proliferates during health crises, spreading through digital and offline networks and hindering individuals’ ability to discern trustworthy sources and accurate guidance [] (p. 2). The infodemic can have detrimental effects on health and well-being, as well as contribute to the polarization of public discourse. This, in turn, places additional strain on healthcare systems by compromising the reach and effectiveness of health intervention programs [] (p. 1). The harmful effects of misinformation are especially evident when it is presented as a conspiracy theory [] (p. 355).
In a study examining the dynamics of misinformation dissemination on social media, particularly during health crises such as the COVID-19 pandemic, it was found that posts containing conspiracy narratives received significantly higher user engagement []. Furthermore, health influencers sometimes promote conspiracy content, including anti-vaccine claims, to build communities around alternative products []. A U.S. study revealed that even some physicians engaged in spreading COVID-19 misinformation on social media, emphasizing the public health risks associated with such misinformation amplified by credentialed professionals [].
However, misinformation about vaccination had been circulating on social media long before COVID-19, contributing to reduced vaccination rates and outbreaks of previously controlled diseases, such as measles, even in parts of the Western world []. A study of X (formerly Twitter) during the 2014 Ebola outbreak found that, among Ebola-related tweets that were retweeted, misinformation was shared more widely than accurate content, indicating higher levels of user engagement [].
Scholars distinguish between misinformation and disinformation as two central yet distinct concepts within public discourse. Misinformation refers to false or misleading information that is shared without any deliberate intent to deceive, often originating from misunderstandings, rumors, or inadequate fact-checking []. Disinformation, on the other hand, involves intentionally false or manipulated content created to mislead or inflict harm [,] (p. 2). Disinformation frequently appears as “adversarial narratives”, stories that may include some truthful elements but are deliberately manipulated to create division and provoke conflict among various groups within society [] (p. 2). A particularly salient example of adversarial disinformation can be found in the amplification of COVID-19 denialist narratives, which strategically leveraged digital media to challenge scientific authority and public health guidance []. More precisely, the author of this study identifies four major “forces” that enabled denialist views to gain traction: (a) motivated amplifiers, i.e., actors with ideological or economic incentives to support denialist scientists; (b) conventional media outlets giving outsized coverage to denialist perspectives in the name of “balanced” reporting; (c) promoters of controversy, so-called “attention-economy” actors who benefit from sensationalist or oppositional stances toward the scientific consensus; and (d) social media platforms and digital information environments, which foster information silos, allowing denialist voices to dominate and spread unchallenged.
Populist actors have found in the digital communication ecosystem, particularly social networking platforms, fertile ground for spreading their messages. These platforms provide a powerful means for populist messages to spread rapidly and engage broad audiences []. By bypassing the traditional gatekeeping functions of legacy media and exploiting widespread distrust in institutions and experts, populists leverage the opportunities for direct communication that social media platforms provide to build their “electorate” of followers and friends, who act as active transmitters of their messages. In this environment, where information circulates freely and verification is often absent, the truthfulness of content becomes secondary to its emotional and ideological resonance []. Consequently, post-truth politics displays an elective affinity with digital populism, as both thrive on affective appeal, identity confirmation, and the erosion of epistemic authority.
Medical populism has attracted considerable scholarly debate, especially during the COVID-19 pandemic. Lasco and Curato defined medical populism as a political style of communication through which leaders seek to legitimize their handling of health crises by appealing to “the people” and challenging expert or institutional authority [].
According to Lasco, one of the key characteristics of medical populism is the invocation of knowledge claims to dramatize or oversimplify public health crises. As the author contends, “while some of these claims go against established scientific facts and verge on fake news, they may just as likely involve invocations of ‘science’ and ‘public health’” [] (p. 1419). Building on this framework, Tubera and Dacela argue that medical populism should also be understood as a violation of individuals’ epistemic rights, that is, their right to access credible and reliable medical information []. They suggest that when populist actors misuse their epistemic authority and spread misinformation, they distort scientific discourse and reduce complex public health issues to matters of political contestation. Such politicization, they conclude, undermines public trust in science and produces harmful consequences for collective health outcomes.
Building on the theoretical foundations of the participatory turn and alternative epistemologies, Mede and Schäfer develop their concept of science-related populism []. This framework underscores the public’s growing demand for greater participation in domains traditionally reserved for experts while simultaneously challenging the authoritative knowledge claims made by the scientific and medical communities. Within this perspective, populist actors contest the legitimacy of scientists in two main areas: decision-making sovereignty, the authority to determine scientific priorities and make decisions on research matters; and truth-speaking sovereignty, the authority to define what constitutes valid and reliable knowledge.
Overall, misinformation represents just a surface-level issue. As Lewandowsky, Ecker, and Cook argue, in today’s so-called post-truth era, deeper societal trends, such as declining social cohesion, rising economic inequality, increasing political polarization, diminishing trust in science, and a fractured media environment, create conditions in which factual, evidence-based discourse struggles to prevail []. Failures and broken promises in the political sphere erode public trust, which metastasizes into an “epistemological contamination”, whereby people come to doubt not only political actors but also other sources of guidance and expertise that, under different circumstances, would help them navigate complex societal problems []. Brubaker similarly contends that fake news is only a surface-level manifestation of a deeper issue, which he traces to the systemic erosion of public knowledge. According to the author:
“The institutions that generate, refine, assess, popularize, and disseminate knowledge—science, universities, and the mainstream and elite media—have suffered a massive loss in public trust and legitimacy. The digital ecosystem that incubates and circulates what purports to be knowledge is increasingly disconnected from these institutions. A mood of “epistemological populism” breeds a pervasive suspicion of expertise” []. Under these conditions, Giordani et al. refer to a ‘post-press’ and ‘post-fact’ era, emphasizing the de-legitimization of traditional media as the primary institution for fact-checking, increasingly challenged by competing narratives on social media constructed under a logic of informational distortion [] (pp. 2865–2866). New populist actors, claiming to speak for scientific truth, have become adept at utilizing scientific rhetoric to construct their own parallel framework of authority [] (pp. 141–142). For their part, Palmer and Gorman argue that for pseudo-experts to gain influence, it is essential to undermine the credibility of established scientists, which explains why anti-science campaigns often target prominent institutions, researchers, and organizations [] (p. 4).
As Lewandowsky, Ecker, and Cook assert, “an obvious hallmark of a post-truth world is that it empowers people to choose their own reality, where facts and objective evidence are trumped by existing beliefs and prejudices” [] (p. 361). According to Kleeberg, the current era is marked by a significant “subjectivation of truth”, where the perception of truth depends more on seeming authentic than on factual correctness []. Consequently, disproven claims continue to circulate as long as their advocates effectively demonstrate qualities of genuineness within particular “truth-scenes”.
Triggers of Misinformation Cascades on Social Media
So, what drives susceptibility to health misinformation on social media? Is there a “magic recipe” that determines which stories catch users’ attention regardless of their credibility? According to cognitive psychology, people tend to believe inaccurate or incorrect information when it aligns with their previously held beliefs and opinions, a phenomenon termed “confirmation bias” []. Closely related are the concepts of selective exposure and selective retention. Selective exposure is the deliberate tendency of people to consume messages from sources that confirm their preexisting viewpoints, even when an ample set of viewpoints is readily accessible []. In parallel, selective retention holds that people are less likely to recall information that contradicts their beliefs [].
As Schacter suggests [], bias is a fundamental memory distortion. Our existing beliefs and knowledge do not just shape how we interpret the world; they can also strongly influence how we form and recall memories.
After all, as Friedman argues, to cope with the abundance of information surrounding us, we must compare incoming “data” to the interpretations already formed by our web of beliefs at a given time, ignoring, for the most part, any data that does not map onto the contexts of our extant web [] (p. 8).
A related study shows that even users with high health literacy can be influenced by confirmation bias, particularly when their prior beliefs are strong [], an alarming finding that underscores the force of this mechanism and the challenges in promoting critical evaluation of health information online.
Moreover, social media platforms have been widely recognized for their contribution to the creation of echo chambers, where users engage primarily with like-minded individuals who share similar beliefs and viewpoints [,]. These echo chambers can reinforce false narratives, amplify misinformation [], and in certain contexts intensify skepticism toward medical authorities []. Samet makes a crucial distinction between echo chambers and filter bubbles, two phenomena that both involve online clustering and insulation from dissenting perspectives []. The distinction rests on the locus of agency underpinning each phenomenon: echo chambers are predominantly user-driven, representing the purposeful selection of ideologically compatible individuals and groups, whereas filter bubbles are algorithmically driven, created through the personalization of content to users’ digital activity. This algorithmic gatekeeping acts as an invisible editorial process that orders the inflow of information on online platforms yet remains largely imperceptible to the end user [] (pp. 207–209). The distinction matters for the formulation of effective countermeasures: algorithmically induced filter bubbles can, at least in principle, be mitigated through modifications to algorithmic design, whereas echo chambers are harder to address; they call for broader strategies that take into account the social and psychological factors behind people’s tendency to ignore or discredit viewpoints that challenge their own.
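Because the mechanism behind filter bubbles is often easier to grasp in code than in prose, the following minimal simulation may help: it is a sketch in Python in which the stance scale, click model, and every parameter are hypothetical and do not describe any real platform’s ranking system. It shows how engagement-driven personalization alone can narrow a feed around a user’s prior leaning, without the user ever asking for ideologically filtered content.

```python
import random

random.seed(7)

# Toy model of a personalized feed. Each item carries a "stance" in [-1, 1].
# The user (latent stance +0.8) clicks stance-congruent items more often; the
# ranker serves items near its running estimate of the user's taste (plus a
# little exploration noise) and updates that estimate from clicks alone.
USER_STANCE = 0.8
ROUNDS, FEED_SIZE, POOL = 80, 5, 50

def click_prob(stance: float) -> float:
    # Engagement falls off linearly with ideological distance (homophily).
    return max(0.0, 1.0 - abs(USER_STANCE - stance))

estimate = 0.0  # the ranker's belief about the user, learned only from clicks
for r in range(1, ROUNDS + 1):
    candidates = [random.uniform(-1, 1) for _ in range(POOL)]
    # Personalization step: rank candidates by proximity to the estimate.
    feed = sorted(candidates,
                  key=lambda s: abs(s - estimate) + random.gauss(0, 0.3))
    feed = feed[:FEED_SIZE]
    for s in feed:
        if random.random() < click_prob(s):
            estimate += 0.3 * (s - estimate)  # drift toward clicked content
    if r in (1, 20, 40, 80):
        print(f"round {r:2d}: mean feed stance = {sum(feed) / FEED_SIZE:+.2f}")
```

Over successive rounds, the mean stance of the served feed tends to drift from near zero toward the user’s own leaning: marginally congruent items get clicked slightly more often, the estimate follows the clicks, and the feed follows the estimate. The sketch also hints at why design changes can mitigate filter bubbles, for instance by reserving part of the feed for randomly sampled items, whereas user-driven echo chambers offer no such lever.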
Online users not only select information that confirms their viewpoints but also join polarized groups marked by “ideological homophily”, collectively reinforcing shared narratives []. Challenging content is often disregarded, and when users do engage with it, the interaction tends to reinforce existing beliefs and deepen opinion polarization []. Overall, the literature on COVID-19 demonstrates that social media-induced polarization (SMIP) is not confined to political or environmental issues such as climate change but also extends to the domain of health [,]. Other approaches draw attention not only to the polarization of public health issues but also to the weaponization of health communication []. Broniatowski et al. investigated this weaponization, showing how vaccine-related discourse on social media was deliberately manipulated by automated bots and troll accounts, particularly those tied to Russian influence operations [].
Another study highlights the role of novelty in the spread of misinformation: false news was found to appear more novel than true news, implying that people are inclined to share unfamiliar information []. A particularly interesting finding of the same study [] is that, contrary to widespread assumptions about bots’ role in the dissemination of misinformation, especially on polarizing health topics [], it is human users who play the central role in propagating false news. It is worth noting that the fact that misinformation generates higher engagement [] might pave the way for its strategic incorporation into fact-checking practices []. More precisely, fact-checking posts that reiterated the misinformation being debunked alongside the corrective information were significantly more likely to elicit user comments and stimulate discussion than either fact-checking posts that omitted the false claim or posts originating from misinformation spreaders themselves [].
Other studies have shed light on the major components underpinning the production and diffusion of health-related misinformation: the agent, the message, and the interpreter []. The findings show that most people spreading misinformation are not tied to official institutions, thereby underscoring the prominence of the so-called “expert patient” and the broader presence of anti-science sentiments within online health discourse. Misinformation narratives tend to be personal, negative, and emotionally charged, creating fear and mistrust toward institutions. Such conditions heighten susceptibility to misinformation and render corrective efforts especially challenging, as their success depends on distinct cognitive and psychological traits of individuals. Studies have identified health-related anxiety, preexisting misconceptions, repeated exposure, and demographic characteristics (younger age, female gender, lower education, and income) as key contributors to individuals’ susceptibility to false health claims [].
Overall, misinformation continues to spread on social media due to a combination of cognitive biases, individual characteristics increasing vulnerability in specific groups to false claims, and platform designs favoring sensational content to boost engagement, which aligns with their attention-based economic model [].
3. Eroding Trust: The Impact of Misinformation on Scientific Authority
Previous research has underscored the importance of trust between the public and health authorities for public adherence to health measures during crises such as the COVID-19 pandemic [,]. According to Armstrong et al., trust can be conceptualized as “the belief that an entity will act in one’s interest in the future” [] (p. 828). Trust in the public health system fosters expectations of social protection and mutual respect between citizens and the state [] (p. 5). In addition, a relevant study found that higher trust in scientists and stronger numeracy skills, indicative of critical thinking abilities, correlate with lower susceptibility to health misinformation []. However, misinformation and the politicization of health issues, which polarize public debate, contribute to a decline in public trust in health and science [], which in turn has detrimental consequences for public health [,]. This decline in trust deepens when healthcare professionals are perceived as driven by personal or financial interests [].
Although distrust and mistrust are often used interchangeably to refer to a lack of trust in the health system, there is a subtle but important distinction between the two []. More precisely, distrust denotes an individual act of doubt based on past or present disappointing experiences, while mistrust refers to a generalized suspicion rooted in shared historical experiences of social inequality and marginalization, conditions that provide fertile ground for conspiracy theories to flourish [] (pp. 1–2). Indeed, in their theoretical overview of the factors that increase vulnerability to conspiracy theories, the authors identify membership in marginalized social groups among other determinants (perceived threats from social change, feelings of uncertainty and powerlessness, limited socio-political control, lower social status, reduced analytical thinking, and lower levels of education and income) []. Overall, in their concept analysis of medical mistrust, Shukla et al. identify four core determinants: interpersonal mistrust, reflected in suspicion or skepticism toward healthcare providers’ intentions; institutional mistrust, representing a lack of confidence in healthcare systems; fear of exploitation and harm; and mistrust regarding the quality and safety of medical care [].
Furthermore, considerable debate surrounds the ways in which, during times of crisis, scientific expertise can be deployed by political actors to legitimize the policymaking process, particularly in policy areas involving significant technical complexity [,]. The deployment of scientific expertise is intended to build fiduciary trust with the public, signaling that government actions are guided by citizens’ best interests [] (p. 322). Studies also provide evidence that public health communication during crises is perceived as trustworthy when the presenter is a healthcare official, the information is presented without political bias, and it is shared promptly []. In addition, as Metzen highlights, trust in expertise has a political advantage: “it enables voluntary compliance” [] (p. 58). At the same time, this tactic can obscure political accountability during crises, since politicians can hold scientists responsible for unpopular and frustrating measures while maintaining their own appeal to the electorate.
From this perspective, it is argued that the political opposition “instrumentalizes” counter-expertise or “alternative facts” in order to “undermine the knowledge claims of an expertocracy forced by ruling politics” [] (p. 136). In effect, the mechanism by which “alternative facts” operate lies in a process of enduring negation, in which established facts are presented as mere opinions [] (p. 138).
In the Greek context, levels of trust in health institutions fluctuated throughout the pandemic, reflecting differences in public perception regarding the transparency of the authorities’ responses to the health crisis, as well as ongoing debates about policy decisions and the efficacy of vaccines []. Initially, the government gained public confidence through clear communication and expert-led strategies. However, this trust waned over time, especially during later lockdowns, which sparked public skepticism and disagreement [] (p. 20). Similarly, a study of the Netherlands’ institutional approach during the first four months of the pandemic identified two key phases: an initial phase of institutional “emergency response” driven by expert guidance, followed by a “smart exit strategy” phase that became more contentious as non-expert opinions and opposing narratives gained visibility online [].
For their part, Russell and Patterson suggest that the “following the science” doctrine can itself at times contribute to the erosion of trust []. Invoking Naomi Oreskes’ work on scientific consensus and its misapplication during the COVID-19 pandemic, they show how efforts to reinforce scientific authority may backfire by oversimplifying the nuances of scientific discourse and ignoring the inherent uncertainties and complexities of scientific research. Similarly, Oxman et al. argue that while persuading the public to follow recommended measures during health crises can be justified, it is crucial to disclose significant uncertainties to prevent undesirable outcomes [] (p. 7).
People’s unwillingness to comply with health policy measures proposed by an academic elite does not primarily stem from scientific counterarguments about the efficacy of those measures, as would occur in a scientific debate among equals. Rather, it reflects low levels of trust, suggesting that when people evaluate scientific decisions, they also weigh broader moral and social values. From this perspective, Popa argues that the value-laden aspects of science can be legitimate when they are trust-conducive, that is, when they align with public values []. Drawing on the principles of distributive and procedural justice, she contends that incorporating these values into scientific decision-making not only enhances the legitimacy of public health policies but also fosters greater public trust in medical expertise, thereby improving the effectiveness of public health interventions. Individuals who question the true intentions behind the use of scientific expertise in crisis management often seek out alternative sources, which frequently disseminate misinformation and challenge the credibility of established experts. For instance, parents who are hesitant about vaccines are more likely to seek information online than to rely on guidance from public health authorities []. Research shows that much of the COVID-19 misinformation focused on misleading or false claims about public health and medical authorities, thereby complicating the process of effectively debunking such falsehoods []. Social media often amplifies alternative viewpoints that challenge established scientific messages, which can weaken trust in official policies and institutions [] (p. 323).
Friedman suggests that to address the underlying epistemological issues causing polarization in this post-truth era, we should avoid attributing ill intentions or irrationality to those who disagree with us politically or epistemically []. By acknowledging the boundaries of what we know and the ways interpretive frameworks shape understanding, Friedman emphasizes that agreement on epistemic authority remains constrained by cognitive limits and can never be entirely transparent or unproblematic. From this viewpoint, disagreements ought to be seen as problems of knowledge rather than of motivation. Metzen’s idea of vigilant trust may offer a way forward for rebuilding confidence in scientific expertise. This strategy combines trust in expertise with democratic oversight and public scrutiny, seeking to ensure that scientific practices remain accountable and aligned with societal values and needs [].
4. Strategies for Combating Health Misinformation
Social media’s pervasive role in disseminating health information offers both opportunities and significant challenges for public health communication. As discussed in the previous sections, while these platforms can effectively facilitate widespread health education, they are also primary conduits for false or misleading content. In today’s post-truth era, health researchers and practitioners, who collectively ground their work in evidence, must contend with post-truth claims that question their evidence-based approaches [] (p. 599). Consequently, a substantial body of research focuses on strategies and holistic solutions to counter the threat of health misinformation.
1. Communication strategies: To effectively address this issue, some studies advocate moving beyond traditional top-down communication methods. Public health authorities may benefit from new approaches, such as partnering with trusted intermediaries and relatable non-experts, like social media influencers, who can effectively resonate with diverse audiences [] (pp. 5–6). Tailored health communication materials are also effective for specific groups, such as caregivers of children or the elderly. Palmer and Gorman propose creating “trust bridges” between health institutions and communities that harbor mistrust toward public authorities []. The success of this approach hinges on two key factors: the credibility of the messenger and the development of customized health communication materials that are specifically tailored to the unique informational needs and concerns of different social groups. This shift from one-size-fits-all messaging to targeted, participatory communication is critical for building public trust and mitigating the spread of misinformation.
At an individual level, doctors, public health professionals, and scientific organizations need to be more active in digital spaces to counter misinformation []. Scripted conversation guides that address common misinformation encountered by a clinician’s patient population could be a useful tool for countering prevalent medical myths within specific medical specialties [] (p. 8). Strategies that emphasize the role of medical experts often highlight the importance of clear, empathetic, and accessible communication [], especially in a media environment where sensational stories often overshadow factual information. Understanding patients’ knowledge, beliefs, and values enables professionals to address resistance with evidence-based information [] (p. 4). Additionally, Loeb et al. propose an “information prescription” approach, in which medical practitioners direct patients to trustworthy websites for condition-specific information, much like prescribing medicine; this helps patients access reliable information in a structured and practical way [] (p. 461). Targeted interventions for vulnerable groups, such as older adults and adolescents, enhance media literacy and limit the spread of misinformation. Fiordelli et al., for example, carried out a feasibility study aimed at strengthening adolescents’ critical health and scientific literacy, combining training in argumentation skills with an understanding of scientific processes []. The study highlights the potential of participatory, learner-centered approaches in schools to improve students’ critical evaluation of health information and resistance to mis- and disinformation.
Moreover, continuous monitoring is essential for the effectiveness of tailored and context-specific health communication strategies and should be conducted both ex ante—to understand why people may be hesitant to follow expert guidance—and ex post—to evaluate the outcomes of medical experts’ interventions [] (p. 19).
2. Data science strategies: A prominent strand of research focuses on data science approaches, such as machine learning techniques, natural language processing (NLP), and context-aware algorithms, to detect and flag health misinformation [,] (a simplified sketch of such a classifier appears after this list). Identification and flagging of questionable content is a key part of content moderation, with tools like warning labels employed to limit the spread of false health information []. However, many authors warn that while automated moderation can effectively reduce the visibility of misleading or harmful content, the lack of transparency and accountability in these systems can unintentionally suppress diverse viewpoints, raising important concerns about freedom of expression and democratic participation online [].
3. Education strategies: In an era where misinformation spreads rapidly across digital platforms, media and science literacy have become increasingly important as forms of resistance to fake news, empowering individuals to critically evaluate information, identify credible sources, and make informed, evidence-based judgments. More precisely, media literacy has been described as a multifaceted concept encompassing a broad set of skills, including recognizing bias, persuasion, and manipulation across various media forms [] (p. 3). It enables individuals to critically assess the credibility and intent behind messages, make informed decisions about the content they consume or share, and understand the potential impact of misleading information [] (p. 3). It is worth noting that recent studies have underscored the protective role of media literacy in mitigating the effects of disinformation in digital settings, serving as a critical defense mechanism against the escalating sophistication of digital disinformation, such as deepfakes [].
Scientific literacy involves understanding scientific knowledge, the methods through which it is generated, and its societal applications, providing people with the skills to make informed, science-based decisions and to critically evaluate scientific information encountered in the media [] (p. 3). Accordingly, health and digital literacy campaigns have emerged as essential tools for equipping the public to critically evaluate online health information, as lower health literacy has been associated with increased trust in and use of information found on social media [].
Refutation interventions, which involve debunking misinformation and providing the correct information, have likewise been shown to be effective in reducing the spread of scientific misconceptions []. However, since research provides evidence that post hoc correction of false claims may have a backfire effect [], increasing attention has shifted toward inoculation theory, or “prebunking” [,]. This strategy works by exposing people in advance to a weakened version of misleading information, along with tools to challenge it, so that they are better prepared to recognize and resist it when they encounter it later. Gamification has also proven to be an effective method for educating learners about misinformation and disinformation, using interactive game elements to strengthen media literacy skills [] (p. 6). Examples of gamified activities include online or in-person scavenger hunts, narrative-based scenarios, puzzles, self-assessment quizzes, and simulations.
4. Multidisciplinary approaches: Other research underscores the complexity of current information ecosystems and proposes a holistic, multi-level public health response to counterbalance the spread of health misinformation, combining proactive and reactive refutation strategies such as content monitoring, digital literacy initiatives, community-based initiatives for better health communication, and systemic policy changes, including international legal frameworks, to address the root causes of misinformation [,].
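To ground the data science strategies above (item 2), the following deliberately simplified sketch trains a bag-of-words classifier to flag potentially misleading posts. It is a toy baseline, not any system described in the cited studies: the Python library used (scikit-learn) is real, but the four training posts, their labels, and the 0.5 flagging threshold are invented for illustration, and production systems rely on far larger labeled corpora, transformer-based language models, and context-aware signals.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny labeled corpus (invented examples): 1 = misleading, 0 = reliable.
posts = [
    "Miracle cure reverses diabetes in days, doctors hate it",
    "Vaccines contain microchips for government tracking",
    "Health agency updates flu vaccination guidance for adults over 65",
    "Peer-reviewed trial reports modest benefit of new hypertension drug",
]
labels = [1, 1, 0, 0]

# Baseline detector: TF-IDF features (unigrams and bigrams) feeding a
# logistic regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(posts, labels)

# Score a new post; route it to review (a warning label or a human
# moderator) if the predicted probability of being misleading is high.
new_post = "Doctors hate it: miracle cure melts tumors overnight"
p_misleading = model.predict_proba([new_post])[0][1]
print(f"P(misleading) = {p_misleading:.2f}",
      "-> flag for review" if p_misleading > 0.5 else "-> no action")
```

Even at this toy scale, the pipeline illustrates the moderation trade-off noted in item 2: every flag depends on a threshold and on training labels that are themselves editorial judgments, which is precisely where the transparency and accountability concerns arise.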
In sum, the fight against health misinformation on social media requires a multi-layered strategy that combines technological solutions, educational interventions, proactive communication by health professionals, and supportive policies. No single method is likely to succeed in isolation []. As the information landscape continues to evolve, collaborative efforts across disciplines and sectors, with the engagement of all key stakeholders—medical practitioners, health researchers, social media giants, health organizations, patients, and the public—will be essential to protect public health and restore trust in scientific and medical institutions.
5. Conclusions
In recent years, science communication has undergone significant transformations, largely driven or accelerated by digitalization. Socially, digital media have lowered barriers for both communicators and audiences, increasing opportunities for participation while opening the public debate to a plurality of voices.
Social media platforms are nowadays powerful tools for health-related communication, offering opportunities for peer support, community building, and access to information from medical professionals. However, as this review has demonstrated, they also serve as primary conduits for misinformation, including conspiracy theories, pseudoscientific claims, and unverified narratives that erode public trust in health authorities. The persistence of misinformation is driven by a complex interplay of cognitive biases, psychological factors, and platform algorithms that amplify its reach and impact. This dynamic highlights the challenges of health communication in the post-truth era, where deeply held beliefs often resist scientific evidence and can lead to harmful behavioral responses. Effective health communication must therefore overcome these barriers and create a context that fosters behavioral change, leveraging dialogic and participatory approaches that are human- and patient-centered and promote equality among all discussants. This realization calls for a paradigm shift in health communication to promote public health, emphasizing the principles of participatory communication and the creation of health messages that are responsive to audience needs [] (p. 435).
Furthermore, we must recognize the complexities inherent in the relationship between politics and health. While politics is shaped by negotiations driven by specific interests and ideologies, medical science derives its legitimacy from verifiable evidence. We should therefore be cautious when political decisions invoke medical authority to justify their legitimacy. After all, in the post-truth era, science must defend its principles and guard against the depoliticization of politics carried out in its name [] (pp. 142–143).
Since social media platforms are the primary loci of public health misinformation, internal regulation of platform providers must be not only transparent but also informed by the expertise of public health agencies, thereby enhancing regulatory responsibility []. Ultimately, addressing health misinformation demands collective action and coordinated efforts from the scientific community, policymakers, and platform providers. By implementing multi-layered strategies that not only restore public trust in health institutions [] (pp. 11–12), but also strengthen societal resilience, we can effectively “immunize” the public against future infodemics.
Overall, this literature review provides a comprehensive overview of the evolving landscape of science communication in the digital era, using health communication as a case study. While it lacks the depth and rigor of a systematic meta-analysis, it offers readers a clear understanding of the key theoretical concepts shaping current discussions. In particular, it highlights not only the structural dynamics that facilitate the proliferation of disinformation in digital environments but also the deeper social factors underpinning widespread distrust of scientific authorities and institutions.
Author Contributions
Conceptualization, S.P. and I.G.; methodology, S.P. and I.G.; writing—original draft preparation, I.G.; writing—review and editing, S.P.; supervision, S.P. All authors have read and agreed to the published version of the manuscript.
Funding
This research received no external funding.
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
No new data were created or analyzed in this study. Data sharing is not applicable to this article.
Conflicts of Interest
The authors declare no conflicts of interest.
References
- Frost, J.H.; Massagli, M.P. Social Uses of Personal Health Information Within PatientsLikeMe, an Online Patient Community: What Can Happen When Patients Have Access to One Another’s Data. J. Med. Internet Res. 2008, 10, e15.
- Wu, T.; He, Z.; Zhang, D. Impact of Communicating with Doctors via Social Media on Consumers’ E-Health Literacy and Healthy Behaviors in China. Inquiry 2020, 57.
- Househ, M.; Borycki, E.; Kushniruk, A. Empowering Patients through Social Media: The Benefits and Challenges. Health Inform. J. 2014, 20, 50–58.
- Romano, R.; Baum, N. How Pediatric Surgeons Use Social Media to Attract New Patients. Eur. J. Pediatr. Surg. 2014, 24, 313–316.
- Moorhead, S.A.; Hazlett, D.E.; Harrison, L.; Carroll, J.K.; Irwin, A.; Hoving, C. A New Dimension of Health Care: Systematic Review of the Uses, Benefits, and Limitations of Social Media for Health Communication. J. Med. Internet Res. 2013, 15, e85.
- George, D.R.; Rovniak, L.S.; Kraschnewski, J.L. Dangers and Opportunities for Social Media in Medicine. Clin. Obstet. Gynecol. 2013, 56, 453–462.
- Zhao, H.; Fu, S.; Chen, X. Promoting Users’ Intention to Share Online Health Articles on Social Media: The Role of Confirmation Bias. Inf. Process. Manag. 2020, 57, 102354.
- Smailhodzic, E.; Hooijsma, W.; Boonstra, A.; Langley, D.J. Social Media Use in Healthcare: A Systematic Review of Effects on Patients and on Their Relationship with Healthcare Professionals. BMC Health Serv. Res. 2016, 16, 442.
- van Dijck, J.; Alinejad, D. Social Media and Trust in Scientific Expertise: Debating the Covid-19 Pandemic in The Netherlands. Soc. Media Soc. 2020, 6.
- Eysenbach, G. Medicine 2.0: Social Networking, Collaboration, Participation, Apomediation, and Openness. J. Med. Internet Res. 2008, 10, e22.
- Wang, Y.; McKee, M.; Torbica, A.; Stuckler, D. Systematic Literature Review on the Spread of Health-Related Misinformation on Social Media. Soc. Sci. Med. 2019, 240, 112552.
- Suarez-Lledo, V.; Alvarez-Galvez, J. Prevalence of Health Misinformation on Social Media: Systematic Review. J. Med. Internet Res. 2021, 23, e17187.
- Afful-Dadzie, E.; Afful-Dadzie, A.; Egala, S.B. Social Media in Health Communication: A Literature Review of Information Quality. Health Inf. Manag. 2023, 52, 3–17.
- Bridgman, A.; Merkley, E.; Zhilin, O.; Loewen, P.J.; Owen, T.; Ruths, D. Infodemic Pathways: Evaluating the Role That Traditional and Social Media Play in Cross-National Information Transfer. Front. Polit. Sci. 2021, 3, 648646.
- Tangcharoensathien, V.; Calleja, N.; Nguyen, T.; Purnat, T.; D’Agostino, M.; Garcia-Saiso, S.; Landry, M.; Rashidian, A.; Hamilton, C.; AbdAllah, A.; et al. Framework for Managing the COVID-19 Infodemic: Methods and Results of an Online, Crowdsourced WHO Technical Consultation. J. Med. Internet Res. 2020, 22, e19659.
- García-Saisó, S.; Marti, M.; Brooks, I.; Curioso, W.; González, D.; Malek, V.; Medina, F.M.; Radix, C.; Otzoy, D.; Zacarías, S.; et al. The COVID-19 Infodemic. Rev. Panam. Salud Publica 2021, 45, e56.
- Lewandowsky, S.; Ecker, U.K.H.; Cook, J. Beyond Misinformation: Understanding and Coping with the “Post-Truth” Era. J. Appl. Res. Mem. Cogn. 2017, 6, 353–369.
- Chuai, Y.; Zhao, J.; Lenzini, G. From News Sharers to Post Viewers: How Topic Diversity and Conspiracy Theories Shape Engagement With Misinformation During a Health Crisis. arXiv 2024, arXiv:2401.08832.
- Moran, R.; Swan, A.; Agajanian, T. Vaccine Misinformation for Profit: Conspiratorial Wellness Influencers and the Monetization of Alternative Health. Int. J. Commun. 2024, 18, 23. Available online: https://ijoc.org/index.php/ijoc/article/view/21128/4494 (accessed on 15 August 2025).
- Sule, S.; DaCosta, M.C.; DeCou, E.; Gilson, C.; Wallace, K.; Goff, S.L. Communication of COVID-19 Misinformation on Social Media by Physicians in the US. JAMA Netw. Open 2023, 6, e2328928.
- Hussain, A.; Ali, S.; Ahmed, M.; Hussain, S. The Anti-Vaccination Movement: A Regression in Modern Medicine. Cureus 2018, 10, e2919.
- Oyeyemi, S.O.; Gabarron, E.; Wynn, R. Ebola, Twitter, and Misinformation: A Dangerous Combination? BMJ 2014, 349, g6178.
- Wardle, C.; Derakhshan, H. Information Disorder: Toward an Interdisciplinary Framework for Research and Policy-Making; Council of Europe: Strasbourg, France, 2017. Available online: http://tverezo.info/wp-content/uploads/2017/11/PREMS-162317-GBR-2018-Report-desinformation-A4-BAT.pdf (accessed on 15 August 2025).
- Jones, K. Protecting Political Discourse from Online Manipulation: The International Human Rights Law Framework. Available online: https://www.ohchr.org/sites/default/files/documents/issues/religion/cfi-ga76/submissions/2022-12-19/submission-freedom-thought-ga76-others-katejones-2_0.pdf (accessed on 20 August 2025).
- Morris, R.D. How Denialist Amplification Spread COVID Misinformation and Undermined the Credibility of Public Health Science. J. Public Health Policy 2024, 45, 114–125.
- Engesser, S.; Fawzi, N.; Larsson, A. Populist Online Communication: Introduction to the Special Issue. Inf. Commun. Soc. 2017, 20, 1279–1292.
- Prior, H. Digital Populism and Disinformation in “Post-Truth” Times. Commun. Soc. 2021, 34, 49–64.
- Lasco, G.; Curato, N. Medical Populism. Soc. Sci. Med. 2019, 221, 1–8.
- Lasco, G. Medical Populism and the COVID-19 Pandemic. Glob. Public Health 2020, 15, 1417–1429.
- Tubera, K.C.B.; Dacela, M.A. Medical Populism and Epistemic Rights. Asia-Pac. Soc. Sci. Rev. 2025, 25, 11.
- Mede, N.G.; Schäfer, M.S. Science-Related Populism: Conceptualizing Populist Demands toward Science. Public Underst. Sci. 2020, 29, 473–491.
- Enroth, H. Crisis of Authority: The Truth of Post-Truth. Int. J. Polit. Cult. Soc. 2023, 36, 179–195.
- Brubaker, R. Forget Fake News. Social Media Is Making Democracy Less Democratic. Zócalo Public Square, 29 November 2017. Available online: https://www.zocalopublicsquare.org/2017/11/29/forget-fake-news-social-media-making-democracy-less-democratic/ideas/essay/ (accessed on 6 September 2025).
- Giordani, R.C.F.; Donasolo, J.P.G.; Ames, V.D.B.; Giordani, R.L. The Science between the Infodemic and Other Post-Truth Narratives: Challenges during the Pandemic. Cien. Saude Colet. 2021, 26, 2863–2872.
- Russell, J.H.; Patterson, D. Post-Truth and the Rhetoric of “Following the Science”. Rhetor. Soc. Sci. 2023, 13, 133–150.
- Palmer, S.E.; Gorman, S.E. Misinformation, Trust, and Health: The Case for Information Environment as a Major Independent Social Determinant of Health. Soc. Sci. Med. 2025, 381, 115302.
- Kleeberg, B. Post Post-Truth: Epistemologies of Disintegration and the Praxeology of Truth. Stan Rzeczy 2019, 17, 25–52.
- Nickerson, R.S. Confirmation Bias: A Ubiquitous Phenomenon in Many Guises. Rev. Gen. Psychol. 1998, 2, 175–220.
- Sears, D.O.; Freedman, J.L. Selective Exposure to Information: A Critical Review. Public Opin. Q. 1967, 31, 194–213.
- Surlin, S.H.; Gordon, T.F. Selective Exposure and Retention of Political Advertising: A Regional Comparison. J. Advert. 1976, 5, 32–44.
- Schacter, D.L. The Seven Sins of Memory: An Update. Memory 2022, 30, 37–42.
- Friedman, J. Post-Truth and the Epistemological Crisis. Crit. Rev. 2023, 35, 1–21.
- Suzuki, M.; Yamamoto, Y. Characterizing the Influence of Confirmation Bias on Web Search Behavior. Front. Psychol. 2021, 12, 771948.
- Jiang, J.; Ren, X.; Ferrara, E. Social Media Polarization and Echo Chambers in the Context of COVID-19: Case Study. JMIRx Med. 2021, 2, e29570.
- Schmidt, A.L.; Zollo, F.; Scala, A.; Betsch, C.; Quattrociocchi, W. Polarization of the Vaccination Debate on Facebook. Vaccine 2018, 36, 3606–3612.
- Acemoglu, D.; Ozdaglar, A. Misinformation: Strategic Sharing, Homophily, and Endogenous Echo Chambers. VoxEU CEPR Columns, 30 June 2021. Available online: https://cepr.org/voxeu/columns/misinformation-strategic-sharing-homophily-and-endogenous-echo-chambers (accessed on 19 August 2025).
- Milhazes-Cunha, J.; Oliveira, L. Doctors for the Truth: Echo Chambers of Disinformation, Hate Speech, and Authority Bias on Social Media. Societies 2023, 13, 226.
- Samet, U. Digital Echo Chambers in Emergencies: How Social Media Amplifies Polarization During Health and Disaster Events. Glob. J. Eng. Technol. Adv. 2024, 20, 198–211.
- Tufekci, Z. Algorithmic Harms Beyond Facebook and Google: Emergent Challenges of Computational Agency. Colo. Technol. Law J. 2015, 13, 203–218. Available online: https://scholar.law.colorado.edu/ctlj/vol13/iss2/4 (accessed on 15 August 2025).
- Boutyline, A.; Willer, R. The Social Structure of Political Echo Chambers: Variation in Ideological Homophily in Online Networks. Political Psychol. 2017, 38, 551–569.
- Modgil, S.; Singh, R.K.; Gupta, S.; Dennehy, D. A Confirmation Bias View on Social Media Induced Polarisation During Covid-19. Inf. Syst. Front. 2024, 26, 417–441.
- The Lancet. Health in the Age of Disinformation. Lancet 2025, 405, 173.
- Broniatowski, D.A.; Jamison, A.M.; Qi, S.; AlKulaib, L.; Chen, T.; Benton, A.; Quinn, S.C.; Dredze, M. Weaponized Health Communication: Twitter Bots and Russian Trolls Amplify the Vaccine Debate. Am. J. Public Health 2018, 108, 1378–1384.
- Vosoughi, S.; Roy, D.; Aral, S. The Spread of True and False News Online. Science 2018, 359, 1146–1151.
- Suarez-Lledo, V.; Ortega-Martin, E.; Carretero-Bravo, J.; Ramos-Fiol, B.; Alvarez-Galvez, J. Unraveling the Use of Disinformation Hashtags by Social Bots During the COVID-19 Pandemic: Social Networks Analysis. JMIR Infodemiol. 2025, 5, e50021.
- Loeb, S.; Sengupta, S.; Butaney, M.; Macaluso, J.N., Jr.; Czarniecki, S.W.; Robbins, R.; Braithwaite, R.S.; Gao, L.; Byrne, N.; Walter, D.; et al. Dissemination of Misinformative and Biased Information about Prostate Cancer on YouTube. Eur. Urol. 2019, 75, 564–567.
- Yang, A.; Shin, J.; Zhou, A.; Huang-Isherwood, K.M.; Lee, E.; Dong, C.; Kim, H.M.; Zhang, Y.; Sun, J.; Li, Y.; et al. The Battleground of COVID-19 Vaccine Misinformation on Facebook: Fact Checkers vs. Misinformation Spreaders. Harv. Kennedy Sch. Misinform. Rev. 2021.
- Pan, W.; Liu, D.; Fang, J. An Examination of Factors Contributing to the Acceptance of Online Health Misinformation. Front. Psychol. 2021, 12, 630268.
- Rodrigues, F.; Newell, R.; Rathnaiah Babu, G.; Chatterjee, T.; Sandhu, N.K.; Gupta, L. The Social Media Infodemic of Health-Related Misinformation and Technical Solutions. Health Policy Technol. 2024, 13, 100846.
- Skirbekk, H.; Magelssen, M.; Conradsen, S. Trust in Healthcare before and during the COVID-19 Pandemic. BMC Public Health 2023, 23, 863.
- Souvatzi, E.; Katsikidou, M.; Arvaniti, A.; Plakias, S.; Tsiakiri, A.; Samakouri, M. Trust in Healthcare, Medical Mistrust, and Health Outcomes in Times of Health Crisis: A Narrative Review. Societies 2024, 14, 269.
- Armstrong, K.; McMurphy, S.; Dean, L.T.; Micco, E.; Putt, M.; Halbert, C.H.; Schwartz, J.S.; Sankar, P.; Pyeritz, R.E.; Bernhardt, B.; et al. Differences in the Patterns of Health Care System Distrust between Blacks and Whites. J. Gen. Intern. Med. 2008, 23, 827–833.
- Roozenbeek, J.; Schneider, C.R.; Dryhurst, S.; Kerr, J.; Freeman, A.L.J.; Recchia, G.; van der Bles, A.M.; van der Linden, S. Susceptibility to Misinformation about COVID-19 around the World. R. Soc. Open Sci. 2020, 7, 201199.
- Del Ponte, A.; Gerber, A.S.; Patashnik, E.M. Polarization, the Pandemic, and Public Trust in Health System Actors. J. Health Polit. Policy Law 2024, 49, 375–401.
- Jolley, D.; Douglas, K.M. The Effects of Anti-Vaccine Conspiracy Theories on Vaccination Intentions. PLoS ONE 2014, 9, e89177.
- Charura, D.; Hill, A.P.; Etherson, M.E. COVID-19 Vaccine Hesitancy, Medical Mistrust, and Mattering in Ethnically Diverse Communities. J. Racial Ethn. Health Disparities 2023, 10, 1518–1525.
- Shukla, M.; Schilt-Solberg, M.; Gibson-Scipio, W. Medical Mistrust: A Concept Analysis. Nurs. Rep. 2025, 15, 103.
- Freeman, D.; Waite, F.; Rosebrock, L.; Petit, A.; Causier, C.; East, A.; Jenner, L.; Teale, A.L.; Carr, L.; Mulhall, S.; et al. Coronavirus Conspiracy Beliefs, Mistrust, and Compliance with Government Guidelines in England. Psychol. Med. 2022, 52, 251–263.
- Angelou, A.; Ladi, S.; Panagiotatou, D.; Tsagkroni, V. Paths to Trust: Explaining Citizens’ Trust in Experts and Evidence-Informed Policymaking during the COVID-19 Pandemic. Public Adm. 2024, 102, 1008–1025.
- Dowling, M.-E.; Legrand, T. “I Do Not Consent”: Political Legitimacy, Misinformation, and the Compliance Challenge in Australia’s COVID-19 Policy Response. Policy Soc. 2023, 42, 319–333.
- MacKay, M.; Colangeli, T.; Thaivalappil, A.; Del Bianco, A.; McWhirter, J.; Papadopoulos, A. A Review and Analysis of the Literature on Public Health Emergency Communication Practices. J. Community Health 2022, 47, 150–162.
- Metzen, H. Vigilant Trust in Scientific Expertise. Eur. J. Philos. Sci. 2024, 14, 58.
- Benetka, G.; Schor-Tschudnowskaja, A. Post-Truth and Scientific Authority. Cult. Psych. 2023, 4, 133–144.
- Papathanasopoulos, S.; Armenakis, A.; Karadimitriou, A. Greeks and the Coronavirus: Trust in Institutions and Major Habits in the Way of Getting Informed. In The Communicative Construction of a Pandemic: SARS-CoV-2, the Media & Society; Pleios, G., Skamnakis, G., Theocharidis, A., Eds.; Papazisi Publications: Athens, Greece, 2021. (In Greek)
- Giannouli, I.; Archontaki, I.; Karadimitriou, A.; Papathanassopoulos, S. Medical Science in Peril? Analyzing the Anti-Vaccine Rhetoric on Greek Facebook in the COVID-19 Era. Health New Media Res. 2024, 8, 19–26.
- Oxman, A.D.; Fretheim, A.; Lewin, S.; Flottorp, S.; Glenton, C.; Helleve, A.; Vestrheim, D.F.; Iversen, B.G.; Rosenbaum, S.E. Health Communication in and out of Public Health Emergencies: To Persuade or to Inform? Health Res. Policy Syst. 2022, 20, 28.
- Popa, E. Values in Public Health: An Argument from Trust. Synthese 2024, 203, 200.
- Jones, A.M.; Omer, S.B.; Bednarczyk, R.A.; Halsey, N.A.; Moulton, L.H.; Salmon, D.A. Parents’ Source of Vaccine Information and Impact on Vaccine Attitudes, Beliefs, and Nonmedical Exemptions. Adv. Prev. Med. 2012, 2012, 932741.
- Brennen, J.S.; Simon, F.M.; Howard, P.N.; Nielsen, R.K. Types, Sources, and Claims of COVID-19 Misinformation. Reuters Institute for the Study of Journalism, 7 April 2020. Available online: https://reutersinstitute.politics.ox.ac.uk/types-sources-and-claims-covid-19-misinformation (accessed on 5 September 2025).
- Sparks, M. Promoting Health in a Post-Truth World. Health Promot. Int. 2017, 32, 599–602.
- Leung, C.C.Y.; Cheng, H.Y.; Lam, T.H. Concerns over the Spread of Misinformation and Fake News on Social Media—Challenges amid the Coronavirus Pandemic. Med. Sci. Forum 2021, 4, 39.
- Bautista, J.R.; Zhang, Y.; Gwizdka, J. Healthcare Professionals’ Acts of Correcting Health Misinformation on Social Media. Int. J. Med. Inform. 2021, 148, 104375.
- Southwell, B.G.; Wood, J.L.; Navar, A.M. Roles for Health Care Professionals in Addressing Patient-Held Misinformation Beyond Fact Correction. Am. J. Public Health 2020, 110, S288–S289.
- Polyzou, M.; Kiefer, D.; Baraliakos, X.; Sewerin, P. Addressing the Spread of Health-Related Misinformation on Social Networks: An Opinion Article. Front. Med. 2023, 10, 1167033.
- Loeb, S.; Langford, A.T.; Bragg, M.A.; Sherman, R.; Chan, J.M. Cancer Misinformation on Social Media. CA Cancer J. Clin. 2024, 74, 453–464.
- Fiordelli, M.; Diviani, N.; Farina, R.; Pellicini, P.; Ghirimoldi, A.; Rubinelli, S. Strengthening Adolescents’ Critical Health Literacy and Scientific Literacy to Tackle Mis- and Dis-Information: A Feasibility Study in Switzerland. Front. Public Health 2023, 11, 1183838.
- European Observatory on Health Systems and Policies; Ali, K.; Pastore Celentano, L. Addressing Vaccine Hesitancy in the “Post-Truth” Era. Eurohealth 2017, 23, 16–20. Available online: https://iris.who.int/handle/10665/332615 (accessed on 15 August 2025).
- Di Sotto, S.; Viviani, M. Health Misinformation Detection in the Social Web: An Overview and a Data Science Approach. Int. J. Environ. Res. Public Health 2022, 19, 2173. [Google Scholar] [CrossRef] [PubMed]
- Padalko, H.; Chomko, V.; Yakovlev, S.; Chumachenko, D. A Novel Comprehensive Framework for Detecting and Understanding Health-Related Misinformation. Information 2025, 16, 175. [Google Scholar] [CrossRef]
- Martel, C.; Rand, D.G. Misinformation Warning Labels Are Widely Effective: A Review of Warning Effects and Their Moderating Features. Curr. Opin. Psychol. 2023, 54, 101710. [Google Scholar] [CrossRef]
- Gomez, J.F.; Machado, C.V.; Paes, L.M.; Calmon, F.P. Algorithmic arbitrariness in content moderation. arXiv 2024, arXiv:2402.16979. [Google Scholar] [CrossRef]
- Droog, E.; Vermeulen, I.; van Huijstee, D.; Harutyunyan, D.; Tejedor, S.; Pulido, C. Combatting the Misinformation Crisis: A Systematic Review of the Literature on Characteristics and Effectiveness of Media Literacy Interventions. Commun. Res. 2025. [Google Scholar] [CrossRef]
- Hwang, Y.; Ryu, J.Y.; Jeong, S.H. Effects of Disinformation Using Deepfake: The Protective Effect of Media Literacy Education. Cyberpsychol. Behav. Soc. Netw. 2021, 24, 188–193. [Google Scholar] [CrossRef]
- Rosenthal, S. Media Literacy, Scientific Literacy, and Science Videos on the Internet. Front. Commun. 2020, 5, 581585. [Google Scholar] [CrossRef]
- Chen, X.; Hay, J.L.; Waters, E.A.; Kiviniemi, M.T.; Biddle, C.; Schofield, E.; Li, Y.; Kaphingst, K.; Orom, H. Health Literacy and Use and Trust in Health Information. J. Health Commun. 2018, 23, 724–734. [Google Scholar] [CrossRef] [PubMed]
- Danielson, R.W.; Jacobson, N.G.; Patall, E.A.; Sinatra, G.M.; Adesope, O.O.; Kennedy, A.A.U.; Sunday, O.J. The Effectiveness of Refutation Text in Confronting Scientific Misconceptions: A Meta-Analysis. Educ. Psychol. 2024, 60, 23–47. [Google Scholar] [CrossRef]
- Nyhan, B.; Reifler, J.; Richey, S.; Freed, G.L. Effective Messages in Vaccine Promotion: A Randomized Trial. Pediatrics 2014, 133, e835–e842. [Google Scholar] [CrossRef]
- Jolley, D.; Douglas, K.M. Prevention Is Better than Cure: Addressing Anti-Vaccine Conspiracy Theories. J. Appl. Soc. Psychol. 2017, 47, 459–469. [Google Scholar] [CrossRef]
- van der Linden, S.; Roozenbeek, J.; Compton, J. Inoculating Against Fake News About COVID-19. Front. Psychol. 2020, 11, 566790. [Google Scholar] [CrossRef] [PubMed]
- Early, J.O.; Robillard, A.G.; Rooks, R.N.; Smith Romocki, L. Pedagogy and Propaganda in the Post-Truth Era: Examining Effective Approaches to Teaching about Mis/Disinformation. Pedagog. Health Promot. 2024, 10, 152–165. [Google Scholar] [CrossRef]
- Ishizumi, A.; Kolis, J.; Abad, N.; Prybylski, D.; Brookmeyer, K.A.; Voegeli, C.; Wardle, C.; Chiou, H. Beyond Misinformation: Developing a Public Health Prevention Framework for Managing Information Ecosystems. Lancet Public Health 2024, 9, e397–e406. [Google Scholar] [CrossRef]
- Denniss, R.; Lindberg, R. Social Media and the Spread of Misinformation: Infectious and a Threat to Public Health. Health Promot. Int. 2025, 40, daaf023. [Google Scholar] [CrossRef]
- Bin Naeem, S.; Kamel Boulos, M.N. COVID-19 Misinformation Online and Health Literacy: A Brief Overview. Int. J. Environ. Res. Public Health 2021, 18, 8091. [Google Scholar] [CrossRef] [PubMed]
- Fachrurrazi; Sofian, G.Y.; Marajo; Syaifudin, M. Book Review: Transformational Health Communication: A New Perspective on Healthcare and Prevention by Olaf Werder (Ed.). Eur. J. Commun. 2025, 40, 433–437. [Google Scholar] [CrossRef]
- Polyák, G.; Nagy, K. Regulating Health Communication in the Post-Truth Era. Intersections 2021, 7, 120–138. [Google Scholar] [CrossRef]