Commentary

We Need to Do Better: Five Notable Failings in Science Communication

Craig Cormick
ThinkOutsideThe, Canberra 2615, Australia
Sustainability 2022, 14(14), 8393; https://doi.org/10.3390/su14148393
Submission received: 17 June 2022 / Revised: 1 July 2022 / Accepted: 6 July 2022 / Published: 8 July 2022

Abstract

Despite significant growth in interest and investment in science communication, the field has seen some high-profile failures in recent years—exemplified by the persistence of anti-vaccine and anti-climate-change beliefs, which are promoted by interest groups that are often highly effective at spreading anti-science messages. This paper looks at five key areas where science communication research and practice need to do better, and offers some solutions, in order to achieve the impact that science communicators strive for.

“It’s a jungle out there,
So you best beware,
There are creatures lurking, that just don’t care,
They’ll gobble you up in one big bite,
And that’s the last you’ll see of day or night,
So if you value your life and your friends as well,
Don’t let the creatures get you, because it will be pure hell.”
—Everett R. Lake
The 2021 Glasgow Climate Change Conference was clearly a great success, with countries around the world heeding messages about the perils of continued temperature rises and committing to serious targets to prevent global catastrophe.
Sorry—I will start that again.
The 2021 Glasgow Climate Change Conference was clearly a failure, with countries around the world failing to fully heed messages about the perils of continued temperature rises and committing only to non-binding agreements that are unlikely to be followed up with actual action, leading to global catastrophe.
Which one is the truth?
Perhaps, like Schrödinger’s Climate Change Conference, they are both true? Certainly, both statements have their competing supporters who will repeat them in the years ahead as we inch towards meeting the first significant measures in 2030. It is now fairly well-established that what I consider the truth to be might not be what somebody else considers the truth to be.
Nevertheless, I am going to give you two of my own “truths” that I think anybody working in science communication needs to consider. The first is that, no matter what we believe the vital and truthful messages around pressing issues are, there is a strong likelihood that somebody more skilled at spin, framing, and emotive language is promoting an alternative message.
The second truth is that, over the past decade or so, interest groups have become better and better at this, and we—science communicators—have been largely left behind by them. We are good at celebrating our successes—and fair enough—but we are not particularly good at examining our failures. And there are many failures in need of examining.
After all, scientists have been talking about the dangers of climate change since at least 1957 [1]. That is 65 years over which key messages on climate change have been fought against, ignored, outmaneuvered, and countered, bringing us to this maybe-victory–maybe-defeat at Glasgow. It is really not a lot to be proud of, and if it takes that long to effectively capture the attention of the public on the next key environmental message, we are going to be in huge trouble.
Of course, you can argue immediacy and growing impacts change things—which is why climate change has now become mainstream, and why COVID-19 never left mainstream. However, I do not think we can really call either a huge success for science communication.
For instance, the initial COVID-19 response in the USA was described as “an unfortunate example of how muddled science communication can confuse non-scientific audiences, contribute to distrust of scientific evidence, and foster doubt about the rationale for health belief and behavior change (e.g., mask-wearing) as new evidence emerges” [2].
At least 6.5 million deaths should galvanize us to ask why we did not perform better.
The COVID-19 pandemic has illustrated very powerfully why building a better evidence base for practitioners to communicate about science and its public impacts is both more urgent and more complicated than ever [3].
Major failures in science communication during the pandemic included:
  • The inability to effectively refute fringe medicine beliefs;
  • The inability to overcome anti-vaccine beliefs;
  • The inability to temper the media’s penchant for alarmist stories or “miracle” cures.
Yes, I admit these are often wicked problems, and we are working in a rapidly moving and complex environment, but in response to that, I would simply add to the list of failures:
  • The inability to master the rapidly moving and complex environment in which we work.
Therefore, let us look at that operating environment and see if we can target some of the areas where we really need to be performing better, understanding that the changing information-and-misinformation landscape requires an increasingly sophisticated grasp of how data—both accurate and not—are produced, disseminated, and consumed [4].
Some of the things that we still do not have a sufficient handle on, and that have often ramped up to new levels of urgency, include:
  • When and why people do not think like us;
  • A better understanding of isolated and hostile publics;
  • How different belief systems affect how a message is received or not;
  • How to address misinformation and disinformation and effectively counter it;
  • How new media flows across influencers and networks and the impacts it has (and does not have).
To add to our woes, most of these are interlinked and cannot effectively be addressed without addressing the others.

1. When and Why People Do Not Think like Us

One of the major areas of success for science communication has been the way we have effectively reached people who think something like us: the well-educated, professional, informed elite. Those with a strong, or even moderate, interest in science have never had it so good. We have become so much better at making TV shows that science fans love, and we have created online publications, YouTube channels, and Facebook sites that receive thousands of views and likes.
We have followed the fragmenting of mainstream media and have learned how to create niche media channels that reach niche audiences really well.
However, to a large extent, we are preaching to the converted.
The other side of fragmenting mainstream media—and fragmenting audiences—is that there are many people who are never reached by science communicators. We have not been particularly good at keeping in touch with those who are not much interested in science or are hostile to it.
Let us be clear that, not only do these groups exist, but what was once considered fringe-ranting amongst them now constitutes environmental and public health risks, particularly when they sit in seats of power and influence. We are not only talking about anti-vaccine and anti-climate change believers here, but a large spectrum of scientific issues around which people are undecided and uncertain and are looking for information to help them make decisions.
We have not met these people’s information needs well.
According to Andy Hira, a professor of political science at Simon Fraser University in Canada, the people we most need to reach are not scientifically literate and often process information in highly irrational, often emotive, and tribal ways [1].
He said, “Such audiences are less likely to consider whether the sources of their information are scientifically authoritative; they want clear answers. Attempts to blindly apply the scientific method to human behaviour consistently fail. The straightforward messaging of climate change and vaccination scientists ignores such realities” [1].
For example, Dr. Pamela Rutledge, Director of the Media Psychology Research Center at Fielding Graduate University in California, argued that the anti-mask story is emotionally stronger than the pro-mask story (“I have a right as an American to not wear a mask, and it is all a conspiracy to control me” vs. “I am doing this to protect others and to do my part to promote health in my community”) [5].
Much of the science communication around COVID-19 came down to facts vs. emotions. We too often fail to grasp that people who think the most like us (who read essays and reports in scientific journals) accept facts, and people who do not think as much like us (and I have many of them in my wider family) are more likely to be swayed by emotions. When it comes to a contest between facts and figures versus stories and emotions, facts and figures rarely win [6].
If we really understood how people who do not think like us think, we could be creating and tailoring messages that work primarily at an emotive level and are both more appealing and more convincing than the current, fact-based messages about sustainability and health.

2. Better Understanding of Isolated and Hostile Publics

Looking at COVID-19 communications again, it becomes apparent that there are many publics with deep suspicions about science and others who are actively hostile to it. There are many reasons for this, but researchers have flagged the effects that deeply rooted social inequities can have on isolating audiences [7]. Dietram Scheufele from the Department of Life Sciences Communication at the University of Wisconsin–Madison said, “This includes having access to information, being able to evaluate its quality, and having the motivation or capacity to extract relevant meaning for one’s life. Many existing efforts in science communication favor more affluent or already information-privileged audiences, while failing hard-to-reach or, more aptly, hardly reached audiences” [8].
As an example, in Australia, many First Nations communities and individuals remain suspicious of government efforts to increase COVID-19 vaccination rates. First Nations writers, such as Nayuka Gorrie, have openly stated that these suspicions are deep-seated.
“Colonialism and white supremacy have rendered Black and Indigenous bodies disposable and lacking in agency… Our bodies, in death, were and still are not seen as ours—we were free or cheap labour, sexual property, a specimen for experimentation and/or control by the medical industry” [9].
Citing examples such as Henrietta Lacks and experiments conducted on the bodies of Australian Indigenous people by universities in the 1920s and 1930s, she said that, throughout the so-called “western world”, Black and Indigenous bodies have borne the brunt of medical advancement [9].
In addition, researchers from Johns Hopkins University looked specifically at people who have a distrust of science and found that it strongly correlated with a failure to adopt preventative health measures, such as physical distancing and listening to the advice of health experts [10]. Added to this, a lack of trust in expert institutions was also found to correlate with an increased belief in COVID-19 myths and conspiracy theories [11].
We need to not just better understand the perspectives, histories, and values of isolated and untrusting audiences, but to find better ways to communicate with them in terms they acknowledge, accept, and trust. As we improve this knowledge, we need to also address the issue that there is much more research conducted on understanding audiences than there is on science communicators themselves and the diversity of skills, backgrounds, workplace objectives, and restrictions they operate within.

3. How Different Belief Systems Affect How a Message Is Received or Not

This should be a clear win for science communicators. There is more than enough evidence around to show how belief systems influence the types of messages people are willing to accept or reject, but we still do not succeed often enough.
We should know that, if we frame messages to align with people’s belief systems, they are more likely to be effective. However, when celebrities endorse questionable COVID-19 treatments such as hydroxychloroquine, we see support for those treatments take off and find ourselves trying to combat it mainly with fact-based messages [12].
Emotive stories told by people with high trust (for some people) but little scientific expertise, such as celebrities, can be very influential. However, science communicators are reluctant to respond with similar types of messaging for fear it diminishes the science in some way.
Perversely, there is a massive body of research and scholarship around the psychological, political, societal, cultural, economic, moral, and institutional factors that influence effective science communication [13], but unfortunately, much of it is tied up in complex journal articles that fail the basic principles of effective communication themselves and are often unknown to practitioners. Often, those studies that are known to practitioners are single studies that have not been replicated well, such as the “backfire effect” [14,15].
To address this, practitioners and theorists need to communicate much better with each other if we are to reach the public more effectively.

4. How to Address Misinformation and Disinformation and Effectively Counter It

We have studied misinformation and put on our best game-faces to stand up to it, but we have yet to find effective ways to counter it. Michael Cacciatore of the Grady College of Journalism and Mass Communication at the University of Georgia said that virtually no work has been successful at completely eliminating the effects of misinformation (though some studies have shown promise for reducing it) [4].
This point is a serious one, as it has become the jungle of the introductory verse. Many scientists who stand up to counter misinformation have been attacked, trolled, and threatened.
A recent survey of scientists published in the journal Nature found that 15 percent of 321 respondents received death threats after speaking publicly about COVID-19. In addition, 22 percent had received threats of physical or sexual violence, and six of the respondents had actually been physically attacked [16].
As one example, Professor Julie Leask of the University of Sydney’s School of Nursing and Midwifery recently received an email that read:
“Shove that shit vaccine propaganda up your arse, Julie … Your hypochondria, OCD, and insecurities are pathetic … What a fawning, obsequious useless prat you are” [17].
Misinformation is also being used to distort messages being given by scientific experts. For instance, Kristine Macartney, the Director of the National Centre for Immunisation Research and Surveillance (NCIRS) in Australia, was falsely reported to have made a statement in court that “fully vaccinated people were 13 times more likely to catch and spread COVID-19 and that the vaccines were dangerous to pregnant women and those planning pregnancy” [18].
The report was widely spread on social media despite efforts to correct it.
Let us be clear here: those with a vested interest in creating and spreading misinformation do it very, very well and use vast social research to its best effect in spreading fear and doubt—as so well-described by Naomi Oreskes and Erik Conway in their 2010 book, Merchants of Doubt [19].
Sander van der Linden, a professor of social psychology in society at the University of Cambridge, said, “Money and politics are the two main incentives that people have for spreading misinformation” [20], and well-funded political ideologues have a lot of resources to throw into creating misinformation in effective ways.
When the public cannot distinguish the expertise of information sources, countering misinformation can also seem to them like just some esoteric in-fighting, which has been played upon by anti-climate change interest groups for a long time. When the public sees different scientific perspectives being put forward like this (regardless of their validity), they gravitate toward the individuals or messages that they feel the most affinity with [2].
The key word there is “feel”. Again, emotions vs. facts. The nature of the jungle.
Better science education is often held up as a critical part of the fight against emotionally appealing misinformation, and Professors Steve Sloman and Philip Fernbach, authors of the book The Knowledge Illusion: Why We Never Think Alone, said, “As a rule, strong feelings about issues do not emerge from deep understanding” [21].
However, it is actually very hard to educate an uninterested adult audience on complex topics—especially while they are looking for simple answers and turning to new media channels that make them feel comforted when they find such answers there.
Elizabeth Kolbert, Pulitzer-Prize-winning science journalist with The New Yorker, said one of the problems is that, as people invent new tools for new ways of living, they simultaneously create new realms of ignorance [22].
“If everyone had insisted on, say, mastering the principles of metalworking before picking up a knife, the Bronze Age wouldn’t have amounted to much. When it comes to new technologies, incomplete understanding is empowering” [22].
Consequently, what can be done?
Well, there is some hope. However, it does mean adopting some of the strategies of those who sow misinformation and disinformation. For example, Michael Dahlstrom of the Greenlee School of Journalism and Communication at Iowa State University said that narrative stories may be viewed warily from within the scientific community as a distortion of science, but the technique can actually counter misinformation by linking scientific truths to human experience [23].
And after all, better narrative stories are surely our bread and butter.

5. How New Media Flows across Influencers and Networks and the Impacts It Has (and Does Not Have)

Let us practice what we preach and start by looking at new media through the frame of a narrative story.
In Iran, in the early days of the COVID-19 pandemic, many people came to hospitals exhibiting symptoms of chest pain, nausea, and hyperventilation. Some showed signs of delayed organ and brain damage. Others slipped into comas. Many died.
However, they did not die of COVID-19—they died from its supposed cures.
As the COVID-19 infection and death toll rose in Iran—one of the first countries to experience widespread infection—people were understandably looking for cures. However, many found them on social media being shared by family and friends—people they trusted.
The posts claimed that the virus could be eliminated by drinking methanol- or alcohol-based cleaning products, but they did not mention that doing so often proved fatal.
A study published in The American Journal of Tropical Medicine and Hygiene estimated that about 5800 people were admitted to hospitals across several countries as a result of acting on false information about COVID-19 cures spread on social media. At least 800 of these people died in just the first three months of 2020 [24].
There are so many other stories we could tell: Donald Trump telling supporters to head to the Capitol Building; Russian and Chinese tactics to influence elections; Facebook being weaponized by the Myanmar military to demonize the Rohingya, leading to hundreds of deaths [25]; or just the impact of social media influencers on climate change hesitation, anti-GM crops, and widespread alternative health practices.
Globally, we have embraced new media, so happy with its benefits that we have been too willing to overlook its negative impacts. This is probably changing, though, with increasing revelations that new media conglomerates know they are causing negative impacts and promoting misinformation, but are doing very little to counter it.
Even the very nature of Facebook, Instagram, Twitter, etc., favors short and emotive comments over deep analysis and prolonged discourse—emotive posts that target feelings, not knowledge. The very algorithms that run new media are designed to work against us.
This is the real jungle—a barely regulated new frontier where beasts lurk and misinformation wars happen.
It is not the medium that is the problem—no more than the printed book was when the power brokers of the Middle Ages looked at the printed word with suspicion. It is the way it is used and misused.
Many of those who misuse it know exactly what they are doing. Research has shown that, by hitting the Subscribe button or by liking or sharing posts and tweets, we can actually confirm our own biases. Nicole Yeatman, the director of strategic communications and partnerships at the Institute for Humane Studies, said, “by committing ourselves publicly to our present opinions, we may be hardening ourselves to future information that would otherwise change our minds” [26].
She also said that, with its currency and algorithms of shares, retweets, likes, and ratios, the social media economy picks losers and winners. “It rewards atta-boys and pile-ons; memes and humiliations; short, brutal interactions where a winner ‘crushes’ a loser” [26].
However, if you really want to change somebody’s mind, you need to have a lot of patience, induce them to reflect on their life and their attitudes, and listen to other points of view. Nicole Yeatman also said, “In other words, the kind of dialogue that actually has a chance of changing people’s minds—of breaching their epistemic arrogance and forcing them to confront the limits of their own knowledge—is precisely the kind of nonjudgmental, long-form, authentic discourse that is not rewarded in the current social media economy” [26].
According to Michael Patrick Lynch, professor of philosophy at the University of Connecticut and author of The Internet of Us, new media echo chambers have made us overconfident in our knowledge and abilities, believing we know more than we actually do [27].
A harsh truth here is that this applies to us—science communicators and science fans—as much as it applies to anti-climate change or anti-vaccine proponents.
We can see everybody’s biases except our own.
In a scathing attack on the negative impacts of Facebook, and an analysis of how it works on us, political commentator John Birmingham said: “As creatures that live in social clusters, natural selection has sculpted our neurology over thousands of generations to reward certain types of social stimuli and status cueing over others. We can’t help it. Facebook’s engagement metrics are the digital measure of a species which has learned that an individual’s chance of survival is improved when the clan or the tribe is minded to assure that survival” [28].
Consequently, what do we do? Despite collecting metrics and data on the creature that does not care, we still do not know how to best manage it and reap the benefits while containing the risks. We do not know nearly enough about how new media flows across networks of people, the role influencers play, and what its impacts are and are not.
Are we over-panicking? Are we not panicking enough? Unfortunately, those with the best data keep it to themselves as commercial in confidence and use it to sell their own ideas, ideologies, and products to us all.

6. What Does It All Mean?

Of course, we are doing what we can—but the point is there are commercial and political interest groups who are doing so much more. Corporations and governments spend millions to better understand how to influence political, social, and economic outcomes—sometimes by just tinkering around the edges and sowing social discord, and sometimes by overtly promoting misinformation and disinformation. We need to obtain a better grasp of this than just watching it happen around us.
Having said all that, I should balance it a bit by acknowledging that there are some really good things being achieved—albeit often small-scale and in isolation, such as community-level engagements on citizen science, sustainability, and community health. However, for science communication to be effective, we have to find these examples and expand our toolboxes with them. We need to not only include social science research in practical ways, but also to better understand the tactics and tools we are too often struggling against—and know how to counter them and use them to achieve our own outcomes.
We need to not only challenge misrepresentation and misinformation, but we need to communicate our own work in ways that are useful and practical for others to learn from and adopt. In addition, we need to admit our shortcomings and failures and gather our efforts to improve on them.
It is not an overstatement to say that lives hang in the balance.

Funding

This research received no external funding.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Hira, A. The Tragic Failure of Science Communication—From Climate Change to COVID. 1 September 2021. Available online: https://sciencepolicy.ca/posts/the-tragic-failure-of-science-communication-from-climate-change-to-covid (accessed on 7 February 2022).
  2. Goldstein, C.M.; Murray, E.; Beard, J.; Wang, M.L. Science Communication in the Age of Misinformation. Ann. Behav. Med. 2020, 54, 985–990. [Google Scholar] [CrossRef] [PubMed]
  3. Scheufele, D.A.; Hoffman, A.J.; Neeley, L.; Reid, C.M. Misinformation about science in the public sphere. Proc. Natl. Acad. Sci. USA 2021, 118, e2104068118. [Google Scholar] [CrossRef] [PubMed]
  4. Cacciatore, M.A. Misinformation and public opinion of science and health: Approaches, findings, and future directions. Proc. Natl. Acad. Sci. USA 2021, 118, e1912437117. [Google Scholar] [CrossRef] [PubMed]
  5. Rutledge, P.B. How Stories Spread Conflict: The Face Mask Culture Wars. Psychology Today. 19 June 2020. Available online: https://www.psychologytoday.com/us/blog/positively-media/202006/how-stories-spread-conflict-the-face-mask-culture-wars (accessed on 7 February 2022).
  6. Cormick, C. The Science of Communicating Science; CSIRO Publishing: Clayton, VIC, Australia, 2019. [Google Scholar]
  7. Howell, E.L.; Brossard, D. (Mis)informed about what? What it means to be a science-literate citizen in a digital world. Proc. Natl. Acad. Sci. USA 2021, 118, e1912436117. [Google Scholar] [CrossRef] [PubMed]
  8. Scheufele, D.A. Beyond the choir? The need to understand multiple publics for science. Environ. Commun. 2018, 12, 1123–1126. [Google Scholar] [CrossRef]
  9. Gorrie, N. Why vaccination presents an ethical dilemma for us, but remains the best way to keep our families safe. IndigenousX. 12 October 2021. Available online: https://indigenousx.com.au/why-vaccination-presents-an-ethical-dilemma-for-us-but-remains-the-best-way-to-keep-our-families-safe/ (accessed on 7 February 2022).
  10. Barry, C.; Han, H.; McGinty, B. Trust in Science and COVID-19. Johns Hopkins Bloomberg School of Public Health Expert Insights. 17 June 2020. Available online: jhsph.edu/covid-19/articles/trust-in-science-and-covid-19.html (accessed on 7 February 2022).
  11. Pickles, K.; Cvejic, E.; Nickel, B.; Copp, T.; Bonner, C.; Leask, J.; Ayre, J.; Batcup, C.; Cornell, S.; Dakin, T. COVID-19: Beliefs in misinformation in the Australian community. medRxiv 2020. [Google Scholar] [CrossRef]
  12. Caulfield, T.; Bubela, T.; Timmelman, J.; Ravitsky, V. Let’s do better: Public representations of COVID-19 science. FACETS 2021, 6, 403–423. [Google Scholar] [CrossRef]
  13. National Academies of Sciences, Engineering, and Medicine; Division of Behavioral and Social Sciences and Education; Committee on the Science of Science Communication: A Research Agenda. Communicating Science Effectively: A Research Agenda; Using Science to Improve Science Communication; National Academies Press (US): Washington, DC, USA, 2017. [Google Scholar]
  14. Wood, T.; Porter, E. The elusive backfire effect: Mass attitudes’ steadfast factual adherence. Polit. Behav. 2019, 41, 135–163. [Google Scholar] [CrossRef]
  15. Nyhan, B. Why the backfire effect does not explain the durability of political misperceptions. Proc. Natl. Acad. Sci. USA 2021, 118, e1912440117. [Google Scholar] [CrossRef] [PubMed]
  16. Nogrady, B. ‘I Hope You Die’: How the COVID Pandemic Unleashed Attacks on Scientists. Nature. 13 October 2021. Available online: https://www.nature.com/articles/d41586-021-02741-x (accessed on 7 February 2022).
  17. Salleh, A. Scientists Talking about COVID-19 Are Copping Widespread Abuse and Death Threats, Survey Finds. ABC News. 14 October 2021. Available online: https://www.abc.net.au/news/science/2021-10-14/covid-scientists-receiving-death-threats-abuse/100533564 (accessed on 7 February 2022).
  18. Anti-Vaxxers Fabricated Quotes Attributed to Infectious Diseases Expert Kristine Macartney and Spread Them Online. RMIT ABC Fact Check. 15 October 2021. Available online: https://www.abc.net.au/news/2021-10-15/coronacheck-nsw-supreme-court-transcript-fake/100539064 (accessed on 7 February 2022).
  19. Oreskes, N.; Conway, E. Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming; Bloomsbury Press: London, UK, 2010. [Google Scholar]
  20. Roozenbeek, J.; Schneider, C.R.; Dryhurst, S.; Kerr, J.; Freeman, A.; Recchia, G.; van der Bles, A.M.; van der Linden, S. Susceptibility to misinformation about COVID-19 around the world. R. Soc. Open Sci. 2020, 7, 201199. [Google Scholar] [CrossRef] [PubMed]
  21. Sloman, S.; Fernbach, P. The Knowledge Illusion: Why We Never Think Alone; Riverhead Books: New York, NY, USA, 2017. [Google Scholar]
  22. Kolbert, E. Why Facts don’t change our minds. The New Yorker, 27 February 2017. [Google Scholar]
  23. Dahlstrom, M. The narrative truth about scientific misinformation. Proc. Natl. Acad. Sci. USA 2021, 118, e1914085117. [Google Scholar] [CrossRef] [PubMed]
  24. Islam, M.S.; Sarkar, T.; Khan, S.H.; Mostofa Kamal, A.; Hasan, S.M.M.; Kabir, A.; Yeasmin, D.; Islam, M.A.; Amin Chowdhury, K.I.; Anwar, K.S.; et al. COVID-19–Related Infodemic and Its Impact on Public Health: A Global Social Media Analysis. Am. J. Trop. Med. Hyg. 2020, 103, 1621–1629. [Google Scholar] [CrossRef] [PubMed]
  25. Stevenson, A. Facebook Admits It Was Used to Incite Violence in Myanmar. New York Times, 6 November 2018. [Google Scholar]
  26. Yeatman, N. Is Social Media Killing Intellectual Humility? BigThink. 9 March 2020. Available online: https://bigthink.com/neuropsych/intellectual-humility (accessed on 7 February 2022).
  27. Lynch, M.P. The Internet of Us: Knowing More and Understanding Less in the Age of Big Data; Liveright: New York, NY, USA, 2016. [Google Scholar]
  28. Birmingham, J. Kill All You See. Alien Sideboob. 15 October 2021. Available online: https://aliensideboob.substack.com/p/kill-all-you-see (accessed on 7 February 2022).
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

