Review

Systematic Review of Misinformation in Social and Online Media for the Development of an Analytical Framework for Agri-Food Sector

by Ataharul Chowdhury 1,*, Khondokar H. Kabir 1,2, Abdul-Rahim Abdulai 3 and Md Firoze Alam 1
1 School of Environmental Design and Rural Development, University of Guelph, Guelph, ON N1G 2W1, Canada
2 Department of Agricultural Extension Education, Bangladesh Agricultural University, Mymensingh 2202, Bangladesh
3 Department of Geography, Environment and Geomatics, University of Guelph, Guelph, ON N1G 2W1, Canada
* Author to whom correspondence should be addressed.
Sustainability 2023, 15(6), 4753; https://doi.org/10.3390/su15064753
Submission received: 25 January 2023 / Revised: 28 February 2023 / Accepted: 3 March 2023 / Published: 7 March 2023
(This article belongs to the Special Issue Agricultural Knowledge and Innovation Systems)

Abstract:
The ubiquity of social and online media networks and the credulousness of online communities, coupled with limited accountability, pose a risk of mis-, dis-, and mal-information (mis-dis-mal-information)—the intentional or unintentional spread of false, misleading, or true information related to agri-food topics. However, agri-food mis-dis-mal-information in social media and online digital agricultural communities of practice (CoPs) remains underexplored. There is also a limited theoretical and conceptual foundation for understanding mis-dis-mal-information topics in the agri-food sector. This study aims to review the mis-dis-mal-information literature and offer a framework to help understand agri-food mis-dis-mal-information in social media and online CoPs. The paper performs a systematic review following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). The review shows that many disciplines, particularly communication, social media studies, computer science, health studies, political science, and journalism, are increasingly engaging with mis-dis-mal-information research. This systematic research generates a framework based on six thematic categories for holistically understanding and assessing agri-food mis-dis-mal-information in social and online media communities: mis-dis-mal-information characterization, source identification, diffusion mechanisms, stakeholder impacts, detection tactics, and mis-dis-mal-information curtailment and countermeasures. The paper contributes to advancing the emerging literature on ‘controversial topics’, ‘misinformation’, and ‘information integrity’ in virtual agri-food advisory services. This is the first attempt to systematically analyze and incorporate experience from diverse fields of mis-dis-mal-information research, which will inform future scholarly work on facilitating conversations and advisory efforts in the agri-food sector.

1. Introduction

Agricultural extension and advisory services are credited with ensuring food security and improving the livelihoods of millions by mobilizing various forms of agri-food information and knowledge [1]. Traditionally, extension and advisory service providers use face-to-face and group-based approaches to provide farm advisory services. Due to limited human resources and operational funding [2,3], extension and advisory organizations struggle to make necessary agricultural information usable and accessible to farmers through traditional information and knowledge sharing processes [4,5]. National governments and agricultural stakeholders in both high- and low-income countries are revitalizing their extension systems by integrating information and communications technology (ICT)-based social and digital media [6,7,8,9,10,11].
Increasingly, digital and online media are leveraged by the general public [12,13] and by specific Communities of Practice (CoPs)—agricultural advisory actors connected via a medium to enhance their shared practice(s), including extension and advisory practices [14]. These novel social information and communication mechanisms engender new dynamics and processes of engagement, social learning, and relationship building [15,16]. Facebook, Twitter, YouTube, blogs, and Wikipedia have transformed how people, communities, and organizations share and consume information in diverse settings and for different purposes [7,17,18,19]. Since the onset of the COVID-19 pandemic, there has been a rise in the use of online and digital media in the delivery of extension services [19]. Many studies have touted the advantages that social and digital media bring to agri-food advisory CoPs through enhanced mediated communities, expanded social networks, and an increased variety and volume of information and knowledge sharing among members, such as farmers and extension agents [9,20,21,22]. For example, previous studies indicate that social and digital media can support agricultural knowledge and innovation systems by reducing the physical distance between actors and fostering virtual networks of knowledge and advice [9,23,24]. Other studies have explored the role of social and digital media in marketing and extension activities [25], showcasing new innovations and technologies to farmers [23], providing better and more inclusively accessible information [26], and promoting campaigns by advisory companies focusing on novel agricultural practices such as urban farming [7].
Although several authors highlight the ‘bright side’ of social and digital media, a growing number of incidents reveal social media’s ‘dark side’ [17]. The ubiquity of social and online media networks, the credulousness of online communities, and the potential weaponization of information, coupled with limited accountability, may also pose a risk of health [27,28] and agri-food [29,30,31] mis-, dis-, and mal-information (mis-dis-mal-information). For example, there has been increasing reporting and research attention on concerns such as fake news, misinformation, disinformation, and malinformation amid COVID-19 [17,32,33]. Mis-dis-mal-information has shaped trust and beliefs in ways that led to vaccine avoidance, mask refusal, and the use of medications with insignificant scientific support, ultimately contributing to increased morbidity [34]. Wardle and Derakhshan [32] observed that these terms are becoming politicized and therefore reframed mis-dis-mal-information as an “information disorder”. Information disorder comprises three elements (agent, message, and interpreter), which help identify the actors and their motivations, and three phases (creation, production, and distribution), which help explain its lifecycle. Following Wardle and Derakhshan [32], “misinformation” is defined as false or untrue information that is spread without the intention to harm anyone; “disinformation” is false information spread with the intent to cause harm; and “malinformation” is true information spread through social and online media with the intention to harm people. These concepts allow for tracking, appreciating, and researching the dynamics of ill-information in virtual and online CoPs.
Historically, there is ample evidence of the existence of agri-food mis-dis-mal-information. The Villejuif Leaflet is a historical example from the European agri-food industry in the 1970s and 1980s: it spread fear among Europeans through leaflets/flyers about toxic and cancer-causing substances in various foods and drinks, especially for children [35]. Another important historical example is Trofim Lysenko’s belief that it is the environment, not genetics, that shapes plants and animals. Policy based on this belief ultimately contributed to the starvation of millions of Soviet citizens in the early 1930s [36].
The widespread availability of the Internet and social media platforms has made mis-dis-mal-information a global problem. Using the Internet, anyone can produce, access, and disseminate different content and narratives [37]. As a result, we have recently observed a number of cases in the agri-food context at the macro level. One recent high-profile example is the ‘Buttergate’ controversy in the Canadian dairy industry, which began when a food blogger tweeted that butter took longer to melt in the microwave when added to recipes [38]. The blogger theorized that Canadian dairy farmers used palm fat as an animal feed additive that changed butter’s consistency and melting point. Social media became fertile ground for people who polarized around conflicting explanations or solutions based on their alternative values and advised farmers to stop adding palm oil to animal feed. For months, Canadian butter sales declined, and the controversy became an international news story. Likewise, policy decisions on ‘organic farming’ in Sri Lanka and the ‘Reduction of Nitrogen emission by 2030’ in the Netherlands sparked farmer protests. One distinguishing feature of these policy-relevant events is the proliferation of both accurate and incorrect information, further amplified by social and digital media. Different actors were involved in mis-dis-mal-information campaigns to discredit science-based evidence on organic farming and nitrogen emissions from livestock farming. There are numerous examples of how the media frame scientific findings and spread misinformation. The media misrepresented a study on milk consumption and mortality by Michaëlsson et al. [39] by framing drinking three glasses of milk per day as increasing one’s risk of dying prematurely [40]. In Bangladesh, the media recently spread misinformation about a study on tracing metal contamination in topsoils and brinjal fruits conducted by Bushra et al. [41], claiming that eating brinjal can cause cancer [42]. Due to media framing, social media users became more divided over the issue, resulting in misinformation spreading further and directly impacting farmers.
These incidents are typically reported at the macro level, whereas extension agents, farmers, and other agri-food actors regularly face mis-dis-mal-information at the personal level. Only recently have agricultural extension and advisory scholars started recognizing this challenge [7,30,31]. Extension and advisory agents increasingly face situations that warrant facilitating conversations with farmers and other stakeholders on controversial topics, including genetically modified organisms (GMOs), the use of various farm inputs such as pesticides, chemical fertilizers, herbicides, and growth hormones, soil management, climate change, food processing and safety, the use of antibiotics, the benefits of new technologies, artificial intelligence, farm subsidies, market power, local foods, livestock feed, the use of natural resources, and animal welfare [43,44,45,46,47]. Each controversy can be approached in a variety of ways, creating fertile ground on social media for ill-intentioned actors to produce and disseminate mis-dis-mal-information. Whether sharing correct or incorrect information in a conversation constitutes mis-, dis-, or mal-information depends on the intention behind sharing it [32]. Mis-dis-mal-information can prevent the timely and effective adoption of appropriate behaviors and science-based recommendations [45], affect advisor and farmer knowledge sharing and learning, reduce trust, and increase polarization [47].
Recently, two editorials [7,9] have highlighted the need for new research on mis-dis-mal-information in the agri-food nexus to better understand its dynamics within interpersonal relationships and institutional, organizational, and financial contexts. In the agri-food context, mis-dis-mal-information may be linked to scientific denialism, skepticism, conspiracy, the complexity of information, and multi-actor involvement in agricultural knowledge and information systems (AKIS). Understanding the production, dissemination, and adoption of agricultural mis-dis-mal-information is a key research gap in this context, as is the question of why studying it matters.
This paper explores how to leverage conceptual understandings developed in other areas to understand and engage with mis-dis-mal-information in agricultural CoPs on social and online media. It aims to review the mis-dis-mal-information literature and offer a framework to help understand agri-food mis-dis-mal-information in social media and online CoPs. The following research questions are examined:
  • What is the current state of the literature on mis-dis-mal-information, including study types, frameworks, actors, focuses, tools, and main conclusions?
  • How is mis-dis-mal-information characterized and defined in the literature?
  • How can existing studies on mis-dis-mal-information in social and online media be used to draw conclusions and provide recommendations for agri-food advisory communities of practice?
We review different frameworks proposed in media studies, communication, political science, social and online media studies, and health. The paper presents a conceptual approach that researchers and practitioners can leverage to understand how agriculture CoPs deal with mis-dis-mal-information on social and online media. The proposed approach emphasizes the different forms of misinformation, how they spread, the actors involved, their impacts in knowledge communities, and processes of dealing with them at the individual and collective scales. This research and the outcome framework are critical to scholars and practitioners as the field grapples with unearthing novel approaches to understanding and maintaining ethical practices in online media.

2. Review: Social Media and Online Mis-Dis-Mal-information

Social and online media, referring to Web 2.0-based applications that allow user-generated content creation and exchange, have become a primary information source for people accessing the Internet [48,49,50]. They have transformed how people produce, search for, and consume information and news, with many now relying on them as a news source [51]. In 2016, about 62% of adults in the USA obtained their news from social and online media, including 66% of Facebook users and 59% of Twitter users [52]. This percentage may vary across countries.
Gupta and Kumaraguru [53] simplified the mechanism of social and online media networks as composed of users and their digital artifacts. Users can post, follow, forward, like, link, rate, review, reply, comment, and send content, including text, documents, photographs, videos, and other shared data structures. Through user engagement and interactions, simple to complex networks are created, varying in size, purpose, operations, and engagement level. Social network analysis can analyze these connections and reveal how users are interconnected and engaged with each other and the broader platform [54,55]. Information flows within social and online media networks, mimicking everyday interactions, and generating content that is passed along to friends and people close to them for further distribution. This feature engenders novel complexities, including the potential for misinformation flow.
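The network view described above, in which users and their interactions form analyzable structures, can be sketched with a toy example. The users, interactions, and the simple degree-centrality measure below are purely hypothetical illustrations, not data or methods from the reviewed studies.

```python
from collections import defaultdict

# Hypothetical "who forwarded content from whom" interactions on a platform
interactions = [
    ("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
    ("dave", "carol"), ("dave", "erin"),
]

# Build an undirected adjacency list from the interactions
adjacency = defaultdict(set)
for u, v in interactions:
    adjacency[u].add(v)
    adjacency[v].add(u)

# Degree centrality: the share of other users each user is directly connected to
n = len(adjacency)
centrality = {user: len(neigh) / (n - 1) for user, neigh in adjacency.items()}

# carol is linked to alice, bob, and dave, so she is the most central node
most_central = max(centrality, key=centrality.get)
print(most_central)  # carol
```

Dedicated social network analysis tools compute many richer measures (betweenness, clustering, community structure) over far larger interaction graphs, but the principle is the same: interactions become edges, and structural measures reveal who is positioned to accelerate, or interrupt, information flow.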
Scholars are increasingly researching the relationship between social media and mis-dis-mal-information. Diagnostic frameworks are needed to detect and prevent the spread of fake news [32]. From a psychological perspective, Kumar and Geethakumari [50] highlight that online social networks are a haven for mis-dis-mal-information and suggest devising mechanisms to prevent its diffusion. Mintz et al. [56] describe online media as a “web of deceit”, referring to the potency of mis-dis-mal-information flows on digital platforms. Multiple studies from both academics and non-academics have highlighted the challenge posed by social and online media regarding mis-dis-mal-information, e.g., [53,57,58,59].
While traditional media provide gatekeeping functions through media outlets, social and online media democratize information, allowing people to be both producers and consumers of content with limited control [50,52]. Koohang and Weiss [59] argue that the Internet’s freedom leads to poor information quality control, which endangers information flow within social and online media networks and creates opportunities for risky activities. The Internet’s global audience and two-way engagement add to the challenge of managing information flow, compressing society and bringing different communications together with little to no control [60,61].
Online mis-dis-mal-information is spread by individual users and by collective groups such as organizations and institutions. According to Karlova and Fisher [57], people, governments, and businesses share mis-dis-mal-information for various, often unknown reasons. Additionally, Chen et al. [48] suggest that the personality characteristics of those sharing information are embedded in people’s social behaviors, as individuals are inclined to share mis-dis-mal-information to stay connected to their networks. Online sharing is also often associated with bots (computer-mediated robots) that spread information into networks [62,63].
Motives for spreading information can range from curiosity to validation, status seeking, and a desire to help. It is generally agreed that deception is crucial to misinformation. Studies have shown that deception plays a central role in efforts to spread misinformation in online communities [56,58,64]. The goal of spreading misinformation is usually to deceive people into accepting an idea or promoting a particular interest [65]. The motive to deceive is intricately linked to the diffusion of misinformation. At the individual level, the definition of misinformation covers cases where people spread information without knowing its validity rather than with malicious intent [66]. People often share misinformation for noninformational reasons, such as soliciting opinions on content, expressing personal views, and creating interactions within a social network. While gender showed no significant differences, individuals’ personalities influenced misinformation-sharing behavior [48].
In a study on the diffusion of political misinformation on Twitter, Shin et al. [27] found that false information tends to recur more frequently than factual information and is diffused through external sources as a rumor’s links evolve and change. The study identified 17 rumors and traced their movement across social and online media platforms. Ireton and Posetti [67] showed that information disorder follows the same creation, production, and distribution model as any other form of information. Created content moves into a production stage, mostly mimicking genuine news formats or channels to gain authenticity and acceptance. The content then finds its way through online media, including social media, and is distributed by actors with different forms and motives, eventually reaching networks of people who rely on social and online media for news and information. Redistribution is critical to the success of misinformation, allowing it to spread further.
In their model of mis-dis-mal-information diffusion, Karlova and Fisher [57] explain how information flows in online networks, diffused by people, governments, and businesses with mostly unknown intentions. The process starts with the creation of deceptive information, but identifying human intention is difficult. Information brokers help fill gaps in networks, allowing information to flow between groups. Uncertainty in verifying information and identifying human intention, coupled with the power of brokers, creates an environment conducive to mis-dis-mal-information. To detect mis-dis-mal-information, cues to credibility are critical [48,56,68]. Recipients in online networks use these cues to assess information before sharing, causing credible information to spread and non-credible information to be shared for other purposes.

3. Methods

We used a multi-phase, systematic review approach involving several steps. First, we selected relevant articles based on our inclusion criteria. Second, we identified the study design and type of research in each article. Third, we conducted a quality appraisal of each article to assess the reliability and validity of the research findings. Finally, we synthesized the results from the selected articles to draw conclusions and provide recommendations for future research. Our main goal was to propose a framework for researching and understanding agricultural mis-dis-mal-information on social and online media. A search of the mis-dis-mal-information literature revealed minimal research on agricultural misinformation, which limited the works we could use for our research. We therefore adopted a three-stage review to draw on literature from other fields and pull together conceptual advancements in mis-dis-mal-information research (see Figure 1).
Phase one of our searches used combinations of keywords, such as (“social media” OR “online media”) AND (“misinformation” OR “disinformation” OR “malinformation”), (“social media” OR “online media”) AND (“misinformation”), (“social media” OR “online media”) AND (“disinformation”), and (“social media” OR “online media”) AND (“malinformation”), on the Web of Science, which was chosen because the authors’ university has a subscription to it. We conducted two searches of the database: the first in early 2020 and the second in late August 2020. The diverse search terms returned 598 items during the initial search. Only 247 of the search results were considered relevant to the topic and were downloaded for further review. The second search, conducted to incorporate the spike in research on mis-dis-mal-information that accompanied the COVID-19 pandemic, yielded 173 more papers, of which 64 were downloaded based on their titles. The downloaded papers were screened on two criteria for inclusion in the further review: “Does the paper address mis-dis-mal-information?” and “Does the paper specifically address online (social) mis-dis-mal-information?” Exclusion criteria included articles that did not address online mis-dis-mal-information as well as those not written in English. Only 93 articles met the inclusion criteria and were included in the later stages of the review.
The second stage took a broader approach to capture sources missed in the targeted search, using Google Search and Google Scholar. These were chosen for their ability to capture literature across databases, allowing us to incorporate grey literature and conference proceedings, which our prior reading indicated contained substantial mis-dis-mal-information studies. Key terms such as (“online” AND “misinformation”), (“social media” AND “misinformation”), and (“online fake news” AND “dealing with misinformation”) returned 170 papers (including peer-reviewed articles, reports, and conference proceedings). A second search in August 2020 yielded 133 additional papers, bringing the total to 303. Of these, only 209 were deemed relevant after a quick sorting to remove duplicates. Next, articles were screened based on the two criteria noted earlier after reading abstracts and introductions, leading to the inclusion of 110 articles and the exclusion of 99.
The third phase of the review involved harmonization and further screening. The two sets of reviews were combined for detailed screening, with a total of 203 papers included at this stage. The screening involved reading the abstract, introduction, and conclusion to find papers that specifically conceptualized elements of online mis-dis-mal-information. The criteria were: “Does the paper have a conceptual framework?” and “Does the paper conceptualize any aspect of online mis-dis-mal-information?” Sixty-one papers met the inclusion criteria for detailed reading and analysis. While publication in a reputable journal does not guarantee quality, we evaluated each article for relevance and reliability against our inclusion and exclusion criteria.
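The two-criterion screening logic described above can be illustrated as a simple filter. The records and the keyword-based proxies for the criteria below are hypothetical; in the actual review, inclusion decisions were made by human readers, not by automated matching.

```python
# Illustrative sketch of a two-criterion screening step; records and the
# keyword proxies for the criteria are hypothetical examples.
records = [
    {"title": "Misinformation diffusion on Twitter", "online": True},
    {"title": "Print-era propaganda techniques", "online": False},
    {"title": "Detecting disinformation in Facebook groups", "online": True},
]

def addresses_mis_dis_mal(record):
    # Criterion 1: does the paper address mis-dis-mal-information?
    keywords = ("misinformation", "disinformation", "malinformation", "propaganda")
    return any(k in record["title"].lower() for k in keywords)

def addresses_online(record):
    # Criterion 2: does it specifically address online (social) media?
    return record["online"]

# A record is included only if it satisfies both criteria
included = [r for r in records if addresses_mis_dis_mal(r) and addresses_online(r)]
excluded = [r for r in records if r not in included]
print(len(included), len(excluded))  # 2 1
```

Encoding screening decisions this way, even as a post hoc sketch, makes the inclusion logic auditable: each excluded record can be traced to the specific criterion it failed.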
Our preliminary analysis showed limited literature on agricultural mis-dis-mal-information online. Since our goal was to conceptualize agricultural mis-dis-mal-information, a fourth search focused on that subject. The Web of Science search in July 2020 with new search terms produced 23 more papers. Only eight articles directly addressed the topic, although they did not involve conceptual advancements. We included these papers in the last phase of our review. We analyzed them separately from the general literature described in phases one to three of the review.
The data analysis took traditional content- and thematic-based qualitative approaches [69,70,71]. All of the papers were uploaded to NVivo 12 for further analysis. First, we read through the papers to code the elements of mis-dis-mal-information discussed or conceptualized. Next, themes and sub-themes were identified through reading. The coding process was iterative, with members of the research team discussing and examining the key themes; some themes were combined and others eliminated. The derived themes were used to organize this paper’s tables and subsequent discussions.
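The theme-coding step can be sketched in simplified form. The theme dictionary and passage below are hypothetical, and the actual coding was an iterative, human-led process in NVivo 12, not automated keyword matching; the sketch only shows the basic idea of mapping passages to thematic codes.

```python
# Hypothetical theme dictionary: each theme is associated with indicative keywords
theme_keywords = {
    "characterization": ["fake news", "information disorder", "define"],
    "sources": ["origin", "source", "actor"],
    "diffusion": ["spread", "diffusion", "viral"],
}

def code_passage(text):
    """Return the set of themes whose keywords appear in a passage."""
    lowered = text.lower()
    return {
        theme
        for theme, words in theme_keywords.items()
        if any(w in lowered for w in words)
    }

passage = "The study traces how fake news spreads from its original source."
print(sorted(code_passage(passage)))  # ['characterization', 'diffusion', 'sources']
```

In qualitative coding software the same idea is applied interpretively: a human coder attaches theme nodes to passages, and the iterative merging and elimination of themes corresponds to revising the theme dictionary between coding rounds.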

4. Results

In this section, we discuss the temporal growth of social and online media mis-dis-mal-information research; the diverse disciplinary engagements in the subject area; the major themes and key concepts in the literature; and selected conceptual frameworks. We then discuss agri-food mis-dis-mal-information on social and online media and finish by proposing a conceptual framework for understanding and researching (agri-food) mis-dis-mal-information on social and online media.

4.1. Temporal Growth of Mis-Dis-Mal-information Research

We conducted a simple time frame analysis of the papers included in this review to understand how mis-dis-mal-information research has grown over the years. The papers’ temporal patterns, from 2000 until August 2020, show some peculiar characteristics of how a subject area evolves in the research community. Even though the papers included were selected based on the criteria of conceptualization of misinformation, our research revealed some patterns relevant to the general literature (see Figure 2).
The literature remained consistently limited from 2000 until 2016, with generally fewer than three papers appearing each year. The number of studies and the general literature on mis-dis-mal-information increased in 2016 and remained high until this review in August 2020. This growth corresponds with the general expansion of the mis-dis-mal-information literature in late 2016 and afterwards, with the number of papers more than doubling from 2016 to 2017. The growth is attributable to the attention given to mis-dis-mal-information during the 2016 American Presidential Election, which propelled the term into public and academic discourses. The half-year of literature reviewed in 2020 reflects another uptick, primarily influenced by the plethora of information accompanying the COVID-19 pandemic [72,73,74,75,76]. With the COVID-19 pandemic increasing social interest in and attention to mis-dis-mal-information, it is not surprising that academic attention has also spiked. The trend is expected to continue.
The trend indicates that mis-dis-mal-information research is responsive to social change. The peaks in the number of papers in 2016 and 2020 show how the research community responds to social demands to understand these issues, driven by election mis-dis-mal-information as well as health mis-dis-mal-information. This finding emphasizes a strong relationship between the research community and society’s participation in online social life.

4.2. Main Discipline of Mis-Dis-Mal-information Research

We carried out a discipline-based keyword attribution for the papers included in the review to show the subject areas contributing to mis-dis-mal-information conceptualization, using each paper’s main topic area and publication journal. Our goal was to provide a rough sketch of the research communities intersecting with mis-dis-mal-information studies.
Mis-dis-mal-information research and conceptualization permeate diverse disciplines [67,77]. Many disciplines are finding interest in the subject. Among the fields contributing to this literature are communication studies, social media studies, political science, journalism, health, media studies, and psychology (see Figure 3).
Figure 3 shows how mis-dis-mal-information has attracted interest in academic research. Communication studies are central to the conceptualization of mis-dis-mal-information in many ways [61,78,79,80]. This centrality may be attributed to the connection between mis-dis-mal-information as a phenomenon and communication, as both focus on communicating with an audience. Also relevant in the literature is the emerging field of social media studies, which has focused a great deal of attention on the growing challenge of misinformation. Since social and online media are increasingly associated with making mis-dis-mal-information prevalent, extensive research is dedicated to understanding the dynamics of information flows in this new media and how they aid the spread of false information. Other areas of interest studied in relation to new media are the psychology of social media users [50,81,82] and political science engagements [61,83]. Much of the psychology-related work has sought to understand mis-dis-mal-information in relation to human behavior, including what motivates people to share such information [48,84] and the personality traits that influence the tendency to share [66]. The political science papers revealed that engagements with social and online media are largely driven by political aims and behaviors that fuel the diffusion of mis-dis-mal-information, e.g., the implications of the 2016 American Election for fake news on social and online media [62,63,85]. To clarify the disciplinary focus of our study, we primarily draw from the fields of health, information and communication, political science, and social media studies. This interdisciplinary approach allows us to consider a wide range of perspectives and insights on mis-dis-mal-information in social media and agri-food advisory communities.

4.3. Major Themes in Conceptual Literature

The literature on social and online media mis-dis-mal-information reveals some broad themes and specific elements conceptualized under each theme (see Table 1). These themes are characterized by specific issues of interest and questions which they make evident and allow researchers to probe.

4.3.1. Theme 1: Characterization and Definition of Mis-Dis-Mal-Information

An important area of scholarly interest in the mis-dis-mal-information literature focuses on the definition and characterization of the concept [59,67,80,92,96]. This theme focuses on identifying and clearly defining the terms related to misinformation or all forms of information disorder, e.g., [96]. Some of the concepts of interest include fake news, misinformation, disinformation, malinformation, rumors, information disorder, and, more recently, the infodemic. Defining and understanding these concepts contributes to the conceptualization of online mis-dis-mal-information as well as to finding and characterizing it. It helps researchers delineate questions to study sources, diffusion mechanisms, detection, countering strategies, and general dynamics in everyday social and online media networks.

4.3.2. Theme 2: Sources of Mis-Dis-Mal-Information

Researchers have employed different disciplinary lenses to identify the primary sources of mis-dis-mal-information on social and online media, as well as the root causes or triggers that ignite it [27,63,97,118]. The literature traces the sources of mis-dis-mal-information along a spectrum, from individuals in their everyday social and online media activities to organized entities such as political groups or national governments [58,117,119]. The sources of online mis-dis-mal-information are critical to tracking its diffusion mechanisms and dynamics, and are crucial to how practitioners and researchers respond to the phenomenon.

4.3.3. Theme 3: Diffusion and Dynamics of Mis-Dis-Mal-Information

Another vital theme of conception is how mis-dis-mal-information diffuses. Researchers in this area have mainly focused on temporal patterns of spread, the speed of spread, and human behaviors relevant to the diffusion of mis-dis-mal-information [27,57,60]. Studies ask how mis-dis-mal-information spreads on social and online media and how its spread may differ from that of other forms of information. The dissemination of online mis-dis-mal-information is central to the literature, with early works focusing on the dynamics of such content and how it moves within groups [27,57].

4.3.4. Theme 4: Mis-Dis-Mal-information Behaviors

This theme focuses on how mis-dis-mal-information behaves, mainly compared to other information types, and on the human behaviors that drive its diffusion on social and online media [48]. Current research examines the human behaviors that influence mis-dis-mal-information behaviors. Perspectives from information and communication studies focus on mis-dis-mal-information's peculiar characteristics as a type of information. The existing literature also emphasizes the role of inherent characteristics (e.g., personality) and derived features (e.g., education) in the spread of mis-dis-mal-information [48,66,84]. Conceptions around behaviors could help shed light on the phenomenon's human and informational dynamics.

4.3.5. Theme 5: Detection of Mis-Dis-Mal-Information

Another important theme found in the literature is the detection of mis-dis-mal-information. Detection refers to the strategies actors adopt to identify what constitutes mis-dis-mal-information on social and online media. Detection can happen at various scales, including individual strategies, organizational strategies, and social and online media network strategies [28,50,58].

4.3.6. Theme 6: Strategies for Countering, Correcting, and Dealing with Mis-Dis-Mal-Information

An essential part of the conceptualization of social media and online mis-dis-mal-information is how to counter and correct it [81,88,106,114,115,116,120,121]. This theme is linked to detection, since countering often uses the same strategies and builds on detection. The focus is on strategies from individuals, governments, organizations, and social and online media platforms to deal with the growing phenomenon. The massive interest in this critical element of mis-dis-mal-information in research and practice circles emanates from its potential negative impacts on different actors [28,49], including the potential to erode trust in information.
The themes identified in this section have served as anchors that many studies have examined, either theoretically or practically [59]. To provide conceptual foundations for agri-food engagement with social and online media mis-dis-mal-information, these elements anchor parts of our later discussions and the proposed framework.

4.4. Zooming in on Agri-Food (Mis-Dis-Mal) Information and Social Media CoPs

Developments in Internet-based tools have informed studies that examine social media use within the agriculture and food sectors [19,54]. Increasing access to the Internet and digital infrastructure in rural areas has permitted novel knowledge and information flows in agriculture [122]. The corresponding emerging field includes limited work on social media and online agri-food advisory CoPs [123,124] and networks of practice [18,124]. Generally, Web 2.0 and social and online media platforms provide farmers and agri-food stakeholders with an avenue for forming novel communities that enhance their practices, with the potential to aid innovation in agricultural communities [23].
According to [54], agriculture continues to receive limited attention despite growing interest in online communities. Their work explored how agricultural information flows through networks on Twitter. They found that communities on Twitter are built around subject-matter expertise: communities are highly centralized around specialized information but decentralized when more generic discussions of agriculture are concerned. Chowdhury and Hambly [23] examined the dynamics of social and online media for innovation in Ontario. Even though their findings suggest that uptake of social media is low, the engagement of diverse stakeholders in online communities provides a step toward understanding the increasing use of Web 2.0 for agricultural learning, knowledge creation, and innovation. These studies point to the growing need to understand the dynamics of social media use in agricultural CoPs.
While social and online media provide agri-food stakeholders with diverse and flexible mediums for knowledge transfer and innovation, they are also avenues for disseminating mis-dis-mal-information [27,48]. Agri-food systems in the last couple of decades have been fraught with increasing controversies around climate change, animal welfare, environmental stewardship, technology in food production, and challenges to the digitization of food chains [55,125,126,127,128]. These issues provide avenues for polarizing opinions, which can spill over into online communities where information sharing is less regulated. The limited attention to mis-dis-mal-information related to agriculture, despite extensive research in health [16,129], politics [27,88], and media and communication [48,66,130], leaves critical research gaps unfilled.
According to Stroud [131], gaps in agricultural information created by the privatization of agricultural knowledge systems emphasize the need for the sector to actively embrace and engage digital media. For Howard et al. [132], the disconnection between rural and urban areas, and between the farm community and external actors, increasingly creates division between agri-food and other communities, making agricultural information inaccessible or distorted at best. Discussions of mis-dis-mal-information in agriculture have highlighted potential areas susceptible to this problem, including food safety, sustainable agriculture, genetically modified foods, climate change, and animal welfare.
Ji et al. [55] used Chinese social media (Sina Weibo) to examine mis-dis-mal-information about GMOs by assessing the detection of rumor-mongering on the platform. They found that people's scientific rumoring behaviors are not influenced by their friendship networks on social media, as people are more concerned with learning from experts. The authors concluded that a lack of scientific knowledge on GMOs drives negative social media attitudes towards the subject, as people become concerned about food safety issues. This willingness to learn from scientific experts suggests that knowledge bearers can help counter mis-dis-mal-information on social media.
Howard et al. [132] sought to understand how media usage impacts students' perceptions of dairy at the University of Tennessee. Results showed that students perceived social and online media platforms to be relatively trustworthy. Students agreed that the beef industry supplied safe products to consumers, but they were concerned with food safety and relatively concerned about access to information about the beef supply. Such studies suggest that social and online media could affect people's views of agriculture and that industry communicators must increase their presence to influence perceptions and dispel misperceptions. Other authors echo the idea of leveraging experts, in this case farmers, to provide adequate information about dairy. Mogha et al. [133], in the case of India, note that a simple tweet, Facebook post, or WhatsApp chat could change a person's perception of a dairy farm. Social and online media help create a healthy, positive online presence for the dairy industry to counter negative, uninformed attacks and maintain public confidence.
Mis-dis-mal-information also exists about the impact of modern farming, such as the claim that "modern farming practices have killed off four out of five worms" or the pesticide industry's mis-dis-mal-information campaign on pesticide use and soil health. Mis-dis-mal-information about earthworms in soil management, for example, spread on social media claiming that "farmers worldwide have been turning their fields into subterranean deserts." Stroud [131] discusses using experts to correct mis-dis-mal-information by emphasizing social media discussions around sustainable agriculture. The research aimed to design a co-learning and practice framework that effectively allows farmers and other stakeholders to advance social media knowledge. Accurate information was channelled through agricultural CoPs via Twitter accounts and hashtags. Given the flow of mis-dis-mal-information, Stroud [131] noted that "Simply providing the information is insufficient, mis-dis-mal-information is tackled by creating a network that fosters accurate information exchange". Chowdhury and Firoze [134] argued that online agricultural mis-dis-mal-information is shaped by the economic and political interests of diverse groups, and that it is therefore important to consider the political economy of mis-dis-mal-information when researching and understanding mis-dis-mal-information disseminated through online platforms.
The limited research suggests a discernible gap in understanding the flow of mis-dis-mal-information related to the food and agricultural sectors, particularly in social and online media agricultural communities, a subject barely mentioned by any study. These gaps call for increased research. They may also indicate that researchers need to develop conceptual and theoretical approaches that allow for significant engagement between the two areas and to draw on frameworks used in other disciplines.

4.5. Frameworks for Researching and Understanding Online Mis-Dis-Mal-information

This section reviews early attempts to guide the research and understanding of social and online media mis-dis-mal-information or elements of the phenomenon (see Table 2). While this list may not be exhaustive, it provides a starting point and a foundation for building and advancing further conceptual and theoretical rubrics for mis-dis-mal-information research.
The theoretical frameworks mentioned in Table 2 offer valuable insights and perspectives on how to approach and address mis-dis-mal-information in various contexts. Each framework emanates from a different scholarly field and emphasizes specific components: they highlight elements to inform engagement with and refinement of mis-dis-mal-information studies and their application within sectors. Egelhofer and Lecheler's [86] framework can help researchers identify and characterize misinformation in online media. Cook et al.'s [64] framework focuses on the concepts of prebunking and inoculation, which can be employed individually or collectively to deal with misinformation. Similarly, Ireton et al.'s [67] framework highlights the elements of information disorder, which are crucial in any information flow, and identifies seven forms of misinformation. Ji et al.'s [55] framework emphasizes the determinants of individual scientific rumor-mongering in online communities. Rubin's [58] framework provides practitioners with strategies to combat misinformation, while Fard and Lingeswaran's [113] scalar approach and Piccolo et al.'s [135] human-values technical solutions approach provide a granular view of the phenomenon. Tangcharoensathien et al.'s [91] framework emphasizes the need to manage the Infodemic, while Treen et al.'s [107] framework highlights the importance of understanding social media users' engagement and their susceptibility to spreading misinformation. Finally, Wardle and Derakhshan's [32] framework focuses on understanding the types, phases, and elements of information disorder in order to combat it effectively.
Each theoretical framework addresses a specific aspect of information disorder: some consider issues related to its creation, some its production and dissemination, and others how to combat it. For example, the framework proposed by Cook et al. [64] on prebunking and inoculation can be used to design interventions that prevent the spread of misinformation in the agri-food community. The framework proposed by Wardle and Derakhshan [32] on mis-dis-mal-information and the phases of creation, production, and distribution can be used to identify and address the different types of misinformation that may arise in the agri-food context.
Though largely disaggregated, the component-specific descriptions of mis-dis-mal-information themes offer first steps toward a holistic comprehension of social and online media mis-dis-mal-information. Comparability remains low because each framework focuses on a different aspect of mis-dis-mal-information. This finding reflects the nascent character of mis-dis-mal-information studies and an apparent lack of consensus on established research approaches. It underscores the need for further conceptual development in mis-dis-mal-information research, especially in under-represented fields such as agriculture.
Overall, key themes within the existing frameworks (Table 2) and supporting literature focus on the characterization of mis-dis-mal-information, its diffusion, its identification and detection (and the challenges in doing so), dealing with it at individual and collective levels, and its impact on human society. Integrating all these themes, we propose a framework (see Figure 4) that provides a granular view of the phenomenon, considering the unique characteristics and challenges of the agri-food domain, such as the influence of cultural, social, and economic factors on the spread of misinformation.

5. A Framework for Researching and Understanding Social Media and Online Agricultural Mis-Dis-Mal-Information

The agri-food industry is a major consumer of scientific technologies and information. It constantly deals with numerous uncertainties such as insect-pest infestation, climate change, and market failure. Agri-food value chain actors, including farmers, need new technology and timely information to deal with this complexity, for which they have historically relied on in-person advice from extension and advisory systems (offline services). New practices or technologies frequently introduce unexpected risks and spark controversy among agri-food system actors. The digital revolution in the food industry is causing an exponential rise in the amount of information in the digital ecosystem [136], allowing people to find answers to almost any question. It has fundamentally altered how information is created, communicated, and delivered. As more farmers and agri-food players turn to the Internet and social media for information, how information on diverse debatable issues (climate change, GMOs, organic agriculture, animal health, etc.) is created, spread, and consumed through the digital ecosystem remains unclear. The proposed framework (see Figure 4) responds to this question and will help address issues related to credibility, transparency and ownership of data and information, and building trust among users, which are some of the major challenges in advancing the digitalization of agriculture and advisory services [9,11,137,138].
A holistic framework must include the conceptualization of mis-dis-mal-information's various dimensions and elements. Because it is grounded in the scholarly literature, our comprehensive framework ensures the validity of its content and its applicability to guide future interdisciplinary research in an agri-food context (see Figure 4). We propose that the conceptualization of online mis-dis-mal-information must cover the dimensions/characterization of mis-dis-mal-information; its sources and strategies; its diffusion; its impacts; its detection; and its correction. These elements provide the foundations for probing the dynamics of social and online media mis-dis-mal-information in topic areas, including agriculture, where unique questions can be interrogated (see Figure 4). Future agricultural extension and education researchers can use the framework and its guiding questions to identify and investigate areas for case-specific research on contentious agri-food topics (see Table 3), considering the dynamics between platforms, actors, tools, and mis-dis-mal-information.
This holistic framework includes platforms, actors, and messages surrounding contentious issues susceptible to mis-dis-mal-information. Through this framework, we hope to learn more about the people behind this content creation, what drives them, what kinds of content they produce, how audiences react, who re-shares this content and why, the extent to which fabricated posts have an impact, and the tools and strategies available to agri-food actors to counteract mis-dis-mal-information. Farmers and extension agents are often the targets of mis-dis-mal-information; thus, they need to be well versed in emerging thematic areas that can help mitigate its consequences. Furthermore, extension agents' ability to detect and use various tools to counter mis-dis-mal-information may lead them to think and act differently than they would if their capacity were limited [31].
We recognize the limitations we faced during the development of the framework. We were not able to identify mis-dis-mal-information research in agri-food settings during its development, except for a few studies that recognized the phenomenon as a challenge [7,30,31]. Therefore, the framework is informed by studies from other disciplines. However, we are confident that the framework will guide future interdisciplinary agri-food mis-dis-mal-information research and can be enriched by applying and validating it in various agri-food settings.
Social media and online mis-dis-mal-information is a complex phenomenon of interlinking elements that interact and evolve. To understand the phenomenon in the context of agri-food, issues that lend themselves to polarization and contestation must be identified (Figure 4). GMOs, organic versus conventional production, animal welfare, and climate change may provide such variants. These issues breed agricultural mis-dis-mal-information when they interact with the novel dynamics of information flow and interaction negotiated through social media.
When complex agricultural issues flow through social media, mostly through CoPs or more broadly on general media, different actors (with or without vested interests) diffuse (mis-dis-mal) information about them. The process of diffusing such information is a mixture of platform and user characteristics. Understanding platform characteristics requires inquiry into the mechanisms of information flow, including, e.g., tweeting. The prevailing mechanism allows users to create and disseminate (agri-food) information, which could be mis-dis-mal-information, in peculiar ways. For user characteristics, the behaviors and motives of actors such as farmers, government entities, the general public, extension agents, and organized groups on social media (or networks within them) are critical [135]. Users may employ different tools to support their activities, partly influenced by platform characteristics and information diffusion mechanisms.
Any meaningful study of the phenomenon must consider the impacts on various stakeholders and society. How information lends itself to mis-dis-mal-information shapes who is affected and to what extent. It is crucial to interrogate how different social and online media mis-dis-mal-information affects diverse stakeholders, including farmers and the general public (or consumers). To effectively interrogate the impacts of mis-dis-mal-information in the industry or on a subject area, careful examination of people's experiences and perceptions is needed.
Given the impacts of mis-dis-mal-information in agri-food advisory CoPs, strategies for detecting and countering it are essential. Research on these subjects will allow for a careful understanding of what is being done and what can broadly reduce the impacts of false information in agricultural networks and social media. Strategies for identifying and generally addressing the menace of social and online media mis-dis-mal-information should be adapted to the issues and context of topics, as well as to the specific mechanisms of information diffusion on specific platforms.
The proposed framework is not only critical to how researchers and practitioners (e.g., extension professionals) evaluate the mis-dis-mal-information phenomenon on a broad range of topics; it could also provide pivots for mis-dis-mal-information research on specific topics. Table 3 expands the framework for practitioners and researchers by engaging it with questions relevant to the agri-food mis-dis-mal-information context.
The value of this approach lies in the flexibility of the framework, which allows for adaptability and contextualization without undermining holistic understanding and research of mis-dis-mal-information. Table 3 provides context by expanding on our proposed framework. The framework would allow a researcher to holistically interrogate the phenomenon by examining a topic's context, the sources of such information on specific social media and online platforms, the nature of its flow, its impacts on the agricultural community, and potential strategies to curtail problems. A researcher could engage and expand on a specific element of mis-dis-mal-information by asking questions relevant to that element, interrogate cross-cutting relationships, or adapt the framework to a holistic examination of mis-dis-mal-information.

6. Conclusions and Areas for Further Engagement for Research and Development

This paper pursued multiple goals in proposing a framework for researching and understanding (agricultural) mis-dis-mal-information on social media and online platforms. We conducted a systematic review of the literature conceptualizing mis-dis-mal-information and its elements. The review revealed that mis-dis-mal-information and its research have grown over the last decade, with increased attention driven by significant events in the last five to seven years [60,67,85] and the 2019–2022 global COVID-19 pandemic [75,139]. These events shaped both social attention and conceptual developments. Emerging and growing engagements from disciplines beyond novel social media studies have contributed quantitatively and qualitatively to the field.
The increasing polarization of issues in the field lends itself to novel social and online media mis-dis-mal-information flow [55]. Mis-dis-mal-information research would benefit by integrating dynamics of agri-food mis-dis-mal-information on social and online media, especially within the growing Infodemic. We discussed elements in selected conceptual frameworks for understanding mis-dis-mal-information.
The framework encapsulates different elements, including characterizing mis-dis-mal-information, identifying sources, diffusion mechanisms, impacts on stakeholders, detection strategies, and ways to curtail and counter mis-dis-mal-information. Interrogating questions around these elements in the framework would be critical for holistic understanding of social and online media mis-dis-mal-information phenomenon and contributing to conceptual developments in the field. Our framework provides a holistic outlook on social and online media mis-dis-mal-information in the literature and would be crucial for future research and practice.
The review shows that many disciplines are increasingly engaging with mis-dis-mal-information research. Each discipline engages the topic on its own terms, without a unified field of mis-dis-mal-information research. While disciplinary engagement is critical to the field's development, mis-dis-mal-information research could benefit from a comprehensive research area with conceptualizations and methods specific to the concept. Analytic tools from the various disciplines are rarely combined, though each field applies its own. There is little theoretical coherence in understanding mis-dis-mal-information as a phenomenon or in its various aspects. Mis-dis-mal-information research has not yet evolved into a multi- and inter-disciplinary agenda.
The review also shows the lack of a holistic framework to guide research and practice in mis-dis-mal-information. The field could benefit from holistic conceptualizations that cater to the concept's multiple elements and allow for directed understanding and research on the various dynamics of mis-dis-mal-information.
The literature still has limited engagement with particular subject areas, which is crucial to understanding the full scale of social and online media mis-dis-mal-information. The review of various mis-dis-mal-information frameworks indicates that controversial topics enhance user engagement online. Pfeiffer et al. [47] argue that, because agri-food CoPs are highly polarized on scientific issues and emerging problems of information integrity and mis-dis-mal-information await them, we need to develop the intellectual capacity of the next generation of agri-food professionals to engage with controversial topics. The proposed framework has pedagogical, scholarly, and practice merits. While the work of [47] provides a solid example of how to enhance the competence of the new generation of agri-food professionals in dealing with controversial topics and false information, we think that our proposed framework will add value in developing future curricula on this topic. It will aid future agri-food research and mis-dis-mal-information studies. Moreover, advisory agents and practitioners could draw on the proposed framework to facilitate discussion around contested topics [31] and to deal with mis-dis-mal-information when communicating with their clients. Finally, given the nuances of mis-dis-mal-information, policymakers should encourage researchers, farmers, and other agri-food actors to improve their critical digital literacy.

7. Limitations

Firstly, this review only considered articles published before 2021; it did not incorporate the literature on misinformation research from 2021 and 2022. Additionally, we did not include some relevant keywords, such as "fake news", "food", and "agriculture". We did not consider the SCOPUS database due to limited access. While we tried to include as many relevant articles as possible, we acknowledge that some relevant publications may have been excluded due to these limitations.
We understand that there may have been an increase in the number of publications on misinformation since we started developing this article; thus, we may have missed the most up-to-date findings. The findings of this review should therefore be interpreted with caution and in the context of the specific research question and boundary of interest. Future research should include the most recent literature, expand the scope of the review to include additional relevant keywords, seek access to alternative databases, and prioritize regular updates to ensure reviews remain current and comprehensive.

Author Contributions

Conceptualization, A.C. and A.-R.A.; methodology, A.-R.A. and K.H.K.; writing—original draft preparation, A.-R.A., K.H.K. and A.C.; writing—review and editing, A.C., K.H.K. and M.F.A.; visualization, K.H.K.; supervision, A.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Insight Grant of Social Science and Humanities Research Council of Canada, grant number 435-2019-0377.

Institutional Review Board Statement

The research received research ethics board approval (22-06-001) of the University of Guelph.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data supporting reported results can be provided upon request to the interested individuals/researchers.

Acknowledgments

Social Science and Humanities Research Council of Canada is acknowledged for the Insight Grant which helped to accomplish this research.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. Editorials Nature. To End Hunger, Science must Change its Focus. Nature 2020, 586, 336. [Google Scholar]
  2. Kabir, K.H.; Hassan, F.; Mukta, M.Z.N.; Roy, D.; Darr, D.; Leggette, H.; Ullah, S.M.A. Application of the Technology Acceptance Model to Assess the Use and Preferences of ICTs among Field-level Extension Officers in Bangladesh. Digit. Geogr. Soc. 2022, 3, 100027. [Google Scholar] [CrossRef]
  3. Blum, M.L.; Cofini, F.; Sulaiman, R.V. Agricultural Extension in Transition Worldwide: Policies and Strategies for Reform; FAO: Rome, Italy, 2020. [Google Scholar] [CrossRef]
  4. Tata, J.S.; McNamara, P.E. Impact of ICT on Agricultural Extension Services Delivery: Evidence from the Catholic Relief Services SMART Skills and Farmbook Project in Kenya. J. Agric. Educ. Ext. 2018, 24, 89–110. [Google Scholar] [CrossRef]
  5. Mwombe, S.O.L.; Mugivane, F.I.; Adolwa, I.S.; Nderitu, J.H. Evaluation of Information and Communication Technology Utilization by Small Holder Banana Farmers in Gatanga District, Kenya. J. Agric. Educ. Ext. 2014, 20, 247–261. [Google Scholar] [CrossRef]
  6. Raj, S. e-Agriculture Prototype for Knowledge Facilitation among Tribal Farmers of North-East India: Innovations, Impact and Lessons. J. Agric. Educ. Ext. 2013, 19, 113–131. [Google Scholar] [CrossRef]
  7. Klerkx, L. Digital and Virtual Spaces as Sites of Extension and Advisory Services Research: Social Media, Gaming, and Digitally Integrated and Augmented Advice. J. Agric. Educ. Ext. 2021, 27, 277–286. [Google Scholar] [CrossRef]
  8. FAO; ITU. Status of Digital Agriculture in 47 Sub-Saharan African Countries; FAO: Rome, Italy; ITU: Geneva, Switzerland, 2022. [Google Scholar]
  9. Fielke, S.; Taylor, B.; Jakku, E. Digitalisation of Agricultural Knowledge and Ad-vice Networks: A State-of-the-art Review. Agric. Syst. 2020, 180, 102763. [Google Scholar] [CrossRef]
  10. Eastwood, C.; Ayre, M.; Nettle, R.; Dela, R.B. Making Sense in the Cloud: Farm Advisory Services in a Smart Farming Future. NJAS Wagening. J. Life Sci. 2019, 90–91, 100298. [Google Scholar] [CrossRef]
  11. Rijswijk, K.; Klerkx, L.; Turner, J.A. Digitalisation in the New Zealand Agricultural Knowledge and Innovation System: Initial Understandings and Emerging Organisational Responses to Digital Agriculture. NJAS Wagening. J. Life Sci. 2019, 90–91, 100313. [Google Scholar] [CrossRef]
  12. Gruzd, A.; Jacobson, J.; Wellman, B.; Mai, P. Understanding Communities in an Age of Social Media: The Good, the Bad, and the Complicated. Inf. Commun. Soc. 2016, 19, 1187–1193. [Google Scholar] [CrossRef]
  13. Wenger, E.; McDermott, R.; Snyder, W. Cultivating Communities of Practice: A Guide to Managing Knowledge; Harvard Business School Press: Cambridge, MA, USA, 2002. [Google Scholar]
  14. Gow, G.; Chowdhury, A.; Ramjattan, J.; Ganpat, W. Fostering Effective Use of ICT in Agricultural Extension: Participant Responses to an Inaugural Technology Stewardship Training Program in Trinidad. J. Agric. Educ. Ext. 2020, 26, 335–350. [Google Scholar] [CrossRef]
  15. Cummings, S.; Heeks, R.; Huysman, M. Knowledge and Learning in Online Networks in Development: A Social-capital Perspective. Dev. Pract. 2006, 16, 570–586. [Google Scholar] [CrossRef]
  16. Stevens, T.; Aarts, N.; Termeer, C.; Dewulf, A. Social Media as a New Playing Field for the Governance of Agro-food Sustainability. Curr. Opin. Environ. Sustain. 2016, 18, 99–106. [Google Scholar] [CrossRef]
  17. Baccarella, C.V.; Wagner, T.F.; Kietzmann, J.H.; McCarthy, I.P. Social Media? It’s Serious! Understanding the Dark Side of Social Media. Eur. Manag. J. 2018, 36, 431–438. [Google Scholar] [CrossRef]
  18. Aguilar-Gallegos, N.; Klerkx, L.; Romero-García, L.E.; Martínez-González, E.G.; Aguilar-Ávila, J. Social Network Analysis of Spreading and Exchanging Information on Twitter: The Case of an Agricultural Research and Education Centre in Mexico. J. Agric. Educ. Ext. 2021, 28, 115–136. [Google Scholar] [CrossRef]
  19. Chivers, C.-A.; Bliss, K.; de Boon, A.; Lishman, L.; Schillings, J.; Smith, R.; Rose, D.C. Videos and Podcasts for Delivering Agricultural Extension: Achieving Credibility, Relevance, Legitimacy and Accessibility. J. Agric. Educ. Ext. 2021, 1–25. [Google Scholar] [CrossRef]
  20. Birke, F.M.; Lemma, M.; Knierim, A. Perceptions towards Information Communication Technologies and Their Use in Agricultural Extension: Case Study from South Wollo, Ethiopia. J. Agric. Educ. Ext. 2019, 25, 47–62. [Google Scholar] [CrossRef]
  21. Kamruzzaman, M.; Chowdhury, A.; van Paassen, A.; Ganpat, W. Extension Agents’ Use and Acceptance of Social Media: The Case of the Department of Agricultural Extension in Bangladesh. J. Int. Agric. Ext. Educ. 2018, 25, 132–149. [Google Scholar] [CrossRef]
  22. Munthali, N.; van Paassen, A.; Leeuwis, C.; Lie, R.; van Lammeren, R.; Aguilar-Gallegos, N.; Oppong-Mensah, B. Social Media Platforms, Open Communication and Problem Solving in the Back-office of Ghanaian Extension: A Substantive, Structural and Relational Analysis. Agric. Syst. 2021, 190, 103123. [Google Scholar] [CrossRef]
  23. Chowdhury, A.; Hambly, O.H. Social Media for Enhancing Innovation in Agri-food and Rural Development: Current Dynamics in Ontario, Canada. J. Rural. Community Dev. 2013, 8, 97–119. Available online: https://journals.brandonu.ca/ (accessed on 22 May 2021).
  24. Materia, V.C.; Giarè, F.; Klerkx, L. Increasing Knowledge Flows Between the Agricultural Research and Advisory System in Italy: Combining Virtual and Non-Virtual Interaction in Communities of Practice. J. Agric. Educ. Ext. 2015, 21, 203–218. [Google Scholar] [CrossRef]
  25. Kaushik, P.; Chowdhury, A.; Odame, H.H.; Paassen, A.V. Social Media for Enhancing Stakeholders’ Innovation Networks in Ontario, Canada. J. Agric. Food Inf. 2018, 19, 331–353. [Google Scholar] [CrossRef]
  26. Agyekumhene, C.; de Vries, J.R.; van Paassen, A.; Schut, M.; MacNaghten, P. Making Smallholder Value Chain Partnerships Inclusive: Exploring Digital Farm Monitoring through Farmer Friendly Smartphone Platforms. Sustainability 2020, 12, 4580. [Google Scholar] [CrossRef]
  27. Shin, J.; Jian, L.; Driscoll, K.; Bar, F. The Diffusion of Misinformation on Social Media: Temporal Pattern, Message, and Source. Comput. Hum. Behav. 2018, 83, 278–287. [Google Scholar] [CrossRef]
  28. Zhang, C.; Gupta, A.; Kauten, C.; Deokar, A.V.; Qin, X. Detecting Fake News for Reducing Misinformation Risks Using Analytics Approaches. Eur. J. Oper. Res. 2019, 279, 1036–1052. [Google Scholar] [CrossRef]
  29. Cato, S.; McWhirt, A.; Herrera, L. Combating Horticultural Misinformation through Integrated Online Campaigns Using Social Media, Graphics Interchange Format, and Blogs. HortTechnology 2022, 32, 342–347. [Google Scholar] [CrossRef]
30. Gibson, J.; Greig, J.; Rampold, S.; Nelson, H.; Stripling, C. Can You Cite that? Describing Tennessee Consumers’ Use of GMO Information Channels and Sources. Adv. Agric. Dev. 2022, 3, 1–16. [Google Scholar] [CrossRef]
  31. Leal, A.; Rumble, J.N.; Lamm, A.J.; Gay, K.D. Discussing Extension Agents’ Role in Moderating Contentious Issue Conversations. J. Hum. Sci. Ext. 2020, 8, 1. Available online: https://www.jhseonline.com/issue/view/111 (accessed on 15 November 2022). [CrossRef]
  32. Wardle, C.; Derakhshan, H. Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making. In Council of Europe Report; Council of Europe: Strasbourg, France, 2017; Volume 27. [Google Scholar]
  33. European Commission. Final Report of the High-Level Expert Group on Fake News and Online Disinformation. Available online: https://digital-strategy.ec.europa.eu/en/library/final-report-high-level-expert-group-fake-news-and-online-disinformation (accessed on 25 November 2022).
  34. Ferreira, C.M.M.; Sosa, J.P.; Lawrence, J.A.; Sestacovschi, C.; Tidd-Johnson, A.; Rasool, M.H.U.; Gadamidi, V.K.; Ozair, S.; Pandav, K.; Cuevas-Lou, C.; et al. The Impact of Misinformation on the COVID-19 Pandemic. AIMS Public Health 2022, 9, 262–277. [Google Scholar] [CrossRef]
  35. Kapferer, J.N. A Mass Poisoning Rumor in Europe. Public Opin. Q. 1989, 53, 467–481. [Google Scholar] [CrossRef]
  36. Kolchinsky, E.I.; Kutschera, U.; Hossfeld, U.; Levit, G.S. Russia’s New Lysenkoism. Curr. Biol. 2017, 27, R1037–R1059. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  37. Lazer, D.M.J.; Baum, M.A.; Benkler, Y.; Berinsky, A.J.; Greenhill, K.M.; Menczer, F.; Metzger, M.J.; Nyhan, B.; Pennycook, G.; Rothschild, D.; et al. The Science of Fake News. Science 2018, 359, 1094–1096. [Google Scholar] [CrossRef] [PubMed]
  38. Music, J.; Charlebois, S.; Marangoni, A.G.; Ghazani, S.M.; Burgess, J.; Proulx, A.; Somogyi, S.; Patelli, Y. Data Deficits and Transparency: What Led to Canada’s ‘Buttergate’. Trends Food Sci. Technol. 2022, 123, 334–342. [Google Scholar] [CrossRef]
  39. Michaëlsson, K.; Wolk, A.; Langenskiöld, S.; Basu, S.; Lemming, E.W.; Melhus, H.; Byberg, L. Milk Intake and Risk of Mortality and Fractures in Women and Men: Cohort Studies. BMJ 2014, 349, g6015. [Google Scholar] [CrossRef] [Green Version]
  40. Knapton, S. Three Glasses of Milk a Day Can Lead to Early Death, Warn Scientists. The Telegraph. 28 October 2014. Available online: https://www.telegraph.co.uk/news/health/news/11193329/Three-glasses-of-milk-a-day-can-lead-to-early-death-warn-scientists.html (accessed on 29 November 2022).
  41. Bushra, A.; Zakir, H.M.; Sharmin, S.; Quadir, Q.F.; Rashid, M.H.; Rahman, M.S.; Mallick, S. Human Health Implications of Trace Metal Contamination in Top-soils and Brinjal Fruits Harvested from a Famous Brinjal-producing Area in Bangladesh. Sci. Rep. 2022, 12, 14278. [Google Scholar] [CrossRef]
  42. Islam, M.J. Harmful Substances Found in Brinjal May Increase Cancer Risk: Study. The Business Insider. 27 September 2022. Available online: https://www.tbsnews.net/bangladesh/harmful-substances-found-brinjal-may-increase-cancer-risk-study-504038 (accessed on 29 November 2022).
  43. Lynas, M.; Adams, J.; Conrow, J. Misinformation in the Media: Global Coverage of GMOs 2019-2021. GM Crops Food 2022, 1–10. [Google Scholar] [CrossRef] [PubMed]
  44. Norwood, F.B.; Oltenacu, P.A.; Calvo-Lorenzo, M.S.; Lancaster, S. Agricultural and Food Controversies: What Everyone Needs to Know; Oxford University Press: Oxford, UK, 2015. [Google Scholar]
  45. Somerville, P. Misinformation in Agriculture Contributing to Tech Block. The Weekly Times. 21 February 2019. Available online: https://www.weeklytimesnow.com.au/agribusiness/misinformation-in-agriculturecontributing-to-tech-block/news-story/d9d3066537c06d6c2a31eafc6a2936c4 (accessed on 29 November 2022).
  46. Goerlich, D.; Walker, M.A. Determining Extension’s Role in Controversial Issues: Content, Process, Neither, or Both? J. Ext. 2015, 53, n3. [Google Scholar]
  47. Pfeiffer, L.J.; Knobloch, N.A.; Tucker, M.A.; Hovey, M. Issues-360TM: An Analysis of Transformational Learning in a Controversial Issues Engagement Initiative. J. Agric. Educ. Ext. 2021, 28, 439–458. [Google Scholar] [CrossRef]
  48. Chen, K.-H.; Hsieh, K.-J.; Chang, F.-H.; Chen, N.-C. The Customer Citizenship Behaviors of Food Blog Users. Sustainability 2015, 7, 12502–12520. [Google Scholar] [CrossRef] [Green Version]
  49. Al-Rawi, A. Gatekeeping Fake News Discourses on Mainstream Media Versus Social Media. Soc. Sci. Comput. Rev. 2019, 37, 687–704. [Google Scholar] [CrossRef]
  50. Kumar, K.P.K.; Geethakumari, G. Detecting Misinformation in Online Social Networks using Cognitive Psychology. Hum. Cent. Comput. Inf. Sci. 2014, 4, 14. [Google Scholar] [CrossRef] [Green Version]
  51. Weiss, R. Nip Misinformation in the Bud. Science 2017, 358, 427. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  52. Azzimonti, M.; Fernandes, M. Social Media Networks, Fake News, and Polarization; Working Paper 24462; National Bureau of Economic Research (NBER): Cambridge, MA, USA, 2018; Available online: http://www.nber.org/papers/w24462 (accessed on 29 November 2022).
  53. Gupta, A.; Kumaraguru, P. Misinformation in Social Networks: Analyzing Twitter during Crisis Events. In Encyclopedia of Social Network Analysis and Mining; Alhajj, R., Rokne, J., Eds.; Springer: Berlin/Heidelberg, Germany, 2018. [Google Scholar] [CrossRef] [Green Version]
  54. Bastos, M.; Piccardi, C.; Levy, M.; McRoberts, N.; Lubell, M. Core-periphery or Decentralized? Topological Shifts of Specialized Information on Twitter. Soc. Netw. 2018, 52, 282–293. [Google Scholar] [CrossRef] [Green Version]
  55. Ji, J.; Chao, N.; Ding, J. Rumormongering of Genetically Modified (GM) Food on Chinese Social Network. Telemat. Inform. 2019, 37, 1–12. [Google Scholar] [CrossRef]
  56. Mintz, A.P.; Benham, A.; Edwards, E.; Fractenberg, B.; Gordon-Murnane, L.; Hetherington, C.; Liptak, D.A.; Smith, M.; Thompson, C. Web of Deceit: Misinformation and Manipulation in the Age of Social Media; CyberAge Books: Medford, NJ, USA, 2012. [Google Scholar]
  57. Karlova, N.; Fisher, K.E. A Social Diffusion Model of Misinformation and Disinformation for Understanding Human Information Behaviour. Inf. Res. 2013, 18. Available online: http://InformationR.net/ir/18-1/paper573.html (accessed on 12 June 2022).
  58. Rubin, V.L. Disinformation and Misinformation Triangle: A Conceptual Model for “Fake News” Epidemic, Causal Factors and Interventions. J. Doc. 2019, 75, 1013–1034. [Google Scholar] [CrossRef]
  59. Koohang, A.; Weiss, E. Misinformation: Toward Creating a Prevention Frame-work. Inf. Sci. 2003, 109–115. Available online: https://proceedings.informingscience.org/IS2003Proceedings/docs/025Kooha.pdf (accessed on 12 June 2022).
  60. Allcott, H.; Gentzkow, M.; Yu, C. Trends in the Diffusion of Misinformation on Social Media. Res. Politics 2019, 6, 2053168019848554. [Google Scholar] [CrossRef] [Green Version]
  61. Bode, L.; Vraga, E.K. In Related News, that Was Wrong: The Correction of Mis-information through Related Stories Functionality in Social Media. J. Commun. 2015, 65, 619–638. [Google Scholar] [CrossRef]
62. Ferrara, E. Bots, Elections, and Social Media: A Brief Overview. In Disinformation, Misinformation, and Fake News in Social Media: Emerging Research Challenges and Opportunities; Shu, K., Wang, S., Lee, D., Liu, H., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 95–114. [Google Scholar] [CrossRef]
  63. Ferrara, E.; Varol, O.; Davis, C.; Menczer, F.; Flammini, A. The Rise of Social Bots. Commun. ACM 2016, 59, 96–104. [Google Scholar] [CrossRef] [Green Version]
  64. Cook, J.; Lewandowsky, S.; Ecker, U.K.H. Neutralizing Misinformation through Inoculation: Exposing Misleading Argumentation Techniques Reduces their Influence. PLoS ONE 2017, 12, e0175799. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  65. Smith, G.D.; Ng, F.; Ho, C.L.W. COVID-19: Emerging Compassion, Courage and Resilience in the Face of Misinformation and Adversity. J. Clin. Nurs. 2020, 29, 1425–1428. [Google Scholar] [CrossRef] [Green Version]
  66. Chen, X.; Sin, S.C.J. ‘Misinformation? What of it?’ Motivations and Individual Differences in Misinformation Sharing on Social Media. Proc. Am. Soc. Info. Sci. Technol. 2013, 50, 1–4. [Google Scholar] [CrossRef]
  67. Ireton, C.; Posetti, J.; UNESCO. Journalism, ‘Fake News’ and Disinformation: Handbook for Journalism Education and Training. 2018. Available online: http://unesdoc.unesco.org/images/0026/002655/265552E.pdf (accessed on 27 August 2022).
  68. Lu, X.; Vijaykumar, V.; Jin, Y.; Rogerson, D. Think Before You Share: Beliefs and Emotions that Shaped COVID-19 (Mis)information Vetting and Sharing Intentions among WhatsApp Users in the United Kingdom. Telemat. Inform. 2022, 67, 101750. [Google Scholar] [CrossRef]
  69. Creswell, J.W.; Creswell, J.D. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches, 4th ed.; Sage: Newbury Park, CA, USA, 2017. [Google Scholar]
  70. Flick, U. (Ed.) The SAGE Handbook of Qualitative Data Collection; Sage: London, UK, 2018. [Google Scholar]
  71. Rafi, M.S. Dialogic Content Analysis of Misinformation about COVID-19 on Social Media in Pakistan. Linguist. Lit. Rev. LLR 2020, 6, 1–11. [Google Scholar] [CrossRef]
  72. Ahinkorah, B.O.; Ameyaw, E.K.; Hagan, J.E.J.; Seidu, A.A.; Schack, T. Rising Above Misinformation or Fake News in Africa: Another Strategy to Control COVID-19 Spread. Front. Commun. 2020, 5, 45. [Google Scholar] [CrossRef]
  73. Ries, M. The COVID-19 Infodemic: Mechanism, Impact, and Counter-Measures—A Review of Reviews. Sustainability 2022, 14, 2605. [Google Scholar] [CrossRef]
  74. Anzar, D.W.; Baig, D.Q.A.; Afaq, D.A.; Taheer, D.T.B.; Amar, D.S. Impact of Infodemics on Generalized Anxiety Disorder, Sleep Quality and Depressive Symptoms among Pakistani Social Media Users during Epidemics of COVID-19. Merit Res. J. Med. Med. Sci. 2020, 8, 1–5. [Google Scholar] [CrossRef]
  75. Baines, D.; Elliott, R.J.R. Defining Misinformation, Disinformation and Malinformation: An Urgent Need for Clarity during the COVID-19 Infodemic. Discuss. Pap. 2020, 20, 1–23. [Google Scholar]
76. Kouzy, R.; Jaoude, J.A.; Kraitem, A.; Alam, M.B.E.; Karam, B.; Adib, E.; Zarka, J.; Traboulsi, C.; Akl, E.; Baddour, K. Coronavirus Goes Viral: Quantifying the COVID-19 Misinformation Epidemic on Twitter. Cureus 2020, 12, e7255. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  77. Anderson, J.; Rainie, L. The Future of Truth and Misinformation Online; PEW Research Center: Washington, DC, USA, 2017; p. 224. [Google Scholar]
  78. Acerbi, A. Cognitive Attraction and Online Misinformation. Palgrave Commun. 2019, 5, 15. [Google Scholar] [CrossRef] [Green Version]
  79. Bonnet, J.L.; Rosenbaum, J.E. “Fake News,” Misinformation, and Political Bias: Teaching News Literacy in the 21st Century. Commun. Teach. 2020, 34, 103–108. [Google Scholar] [CrossRef]
  80. Bode, L.; Vraga, E.K. See Something, Say Something: Correction of Global Health Misinformation on Social Media. Health Commun. 2018, 33, 1131–1140. [Google Scholar] [CrossRef]
  81. Chan, M.S.; Jones, C.R.; Hall, J.K.; Albarracín, D. Debunking: A Meta-analysis of the Psychological Efficacy of Messages Countering Misinformation. Psychol. Sci. 2017, 28, 1531–1546. [Google Scholar] [CrossRef] [PubMed]
  82. Khan, M.L.; Idris, I.K. Recognise Misinformation and Verify before Sharing: A Reasoned Action and Information Literacy Perspective. Behav. Inf. Technol. 2019, 38, 1194–1212. [Google Scholar] [CrossRef]
  83. Margolin, D.B.; Hannak, A.; Weber, I. Political Fact-checking on Twitter: When do Corrections have an Effect? Political Commun. 2018, 35, 196–219. [Google Scholar] [CrossRef]
  84. Laato, S.; Islam, A.K.M.N.; Islam, M.N.; Whelan, E. Why Do People Share Misinformation during the COVID-19 Pandemic? Eur. J. Inf. Syst. 2020, 29, 288–305. [Google Scholar] [CrossRef]
  85. Jones-Jang, S.M.; Kim, D.H.; Kenski, K. Perceptions of Mis- or Disinformation Exposure Predict Political Cynicism: Evidence from a Two-wave Survey during the 2018 US Midterm Elections. New Media Soc. 2020, 23, 3105–3125. [Google Scholar] [CrossRef]
  86. Egelhofer, J.L.; Lecheler, S. Fake News as a Two-dimensional Phenomenon: A Framework and Research Agenda. Ann. Int. Commun. Assoc. 2019, 43, 97–116. [Google Scholar] [CrossRef] [Green Version]
  87. Shu, K.; Wang, S.; Lee, D.; Liu, H. (Eds.) Disinformation, Misinformation, and Fake News in Social Media: Emerging Research Challenges and Opportunities; Springer: Cham, Switzerland, 2020. [Google Scholar] [CrossRef]
  88. Stray, J. Institutional Counter-disinformation Strategies in a Networked Democracy. In Proceedings of the WWW ’19: Companion Proceedings of the 2019 World Wide Web Conference, San Francisco, CA, USA, 13–17 May 2019. [Google Scholar]
  89. Cinelli, M.; Quattrociocchi, W.; Galeazzi, A.; Valensise, C.M.; Brugnoli, E.; Schmidt, A.L.; Zola, P.; Zollo, F.; Scala, A. The COVID-19 Social Media Infodemic. Sci. Rep. 2020, 10, 16598. [Google Scholar] [CrossRef]
  90. Radu, R. Fighting the ‘Infodemic’: Legal Responses to COVID-19 Disinformation. Soc. Media + Soc. 2020, 6, 2056305120948190. [Google Scholar] [CrossRef] [PubMed]
  91. Tangcharoensathien, V.; Calleja, N.; Nguyen, T.; Purnat, T.; D’Agostino, M.; Garcia-Saiso, S.; Landry, M.; Rashidian, A.; Hamilton, C.; AbdAllah, A.; et al. Framework for Managing the COVID-19 Infodemic: Methods and Results of an Online, Crowdsourced WHO Technical Consultation. J. Med. Internet Res. 2020, 22, e19659. [Google Scholar] [CrossRef]
  92. Shao, C.; Ciampaglia, G.L.; Varol, O.; Yang, K.; Flammini, A.; Menczer, F. The Spread of Low-credibility Content by Social Bots. Nat. Commun. 2018, 9, 4787. [Google Scholar] [CrossRef] [Green Version]
  93. Golovchenko, Y.; Hartmann, M.; Adler-Nissen, R. State, Media and Civil Society in the Information Warfare over Ukraine: Citizen Curators of Digital Disinformation. Int. Aff. 2018, 94, 975–994. [Google Scholar] [CrossRef] [Green Version]
  94. Yang, J.; Li, S.; Wang, Z.; Dong, H.; Wang, J.; Tang, S. Using Deep Learning to Detect Defects in Manufacturing: A Comprehensive Survey and Current Challenges. Materials 2020, 13, 5755. [Google Scholar] [CrossRef] [PubMed]
  95. Vosoughi, S.; Roy, D.; Aral, S. The Spread of True and False News Online. Science 2018, 359, 1146–1151. [Google Scholar] [CrossRef] [PubMed]
  96. Jack, C. Lexicon of Lies: Terms for Problematic Information. Data & Society. 2018, Volume 22. Available online: https://datasociety.net/pubs/oh/DataAndSociety_LexiconofLies.pdf (accessed on 22 July 2022).
  97. Bradshaw, S.; Howard, P.N. Troops, Trolls and Troublemakers: A Global Inventory of Organized Social Media Manipulation; Oxford Internet Institute: Oxford, UK, 2017; pp. 1–37. [Google Scholar]
  98. Guidi, B. An Overview of Blockchain Online Social Media from the Technical Point of View. Appl. Sci. 2021, 11, 9880. [Google Scholar] [CrossRef]
  99. Zerback, T.; Toepfl, F.; Knoepfle, M. The Disconcerting Potential of Online Disinformation: Persuasive Effects of Astroturfing Comments and Three Strategies for Inoculation Against Them. New Media Soc. 2020, 23, 1080–1098. [Google Scholar] [CrossRef]
  100. Marwick, A.; Lewis, R. Media Manipulation and Disinformation Online. Data and Society. 2017. Available online: https://datasociety.net/output/media-manipulation-and-disinfo-online/ (accessed on 14 May 2022).
  101. Ong, J.C.; Corbanes, J.V.A. Architects of Networked Disinformation: Behind the Scenes of Troll Accounts and Fake News Production in the Philippines. Communication 2018, 74, 1–83. [Google Scholar] [CrossRef]
  102. Bessi, A.; Zollo, F.; Del Vicario, M.; Scala, A.; Caldarelli, G.; Quattrociocchi, W. Trend of Narratives in the Age of Misinformation. PLoS ONE 2015, 10, e0134641. [Google Scholar] [CrossRef]
  103. Acemoglu, D.; Ozdaglar, A.; ParandehGheibi, A. Spread of (Mis)information in Social Networks. Games Econ. Behav. 2010, 70, 194–227. [Google Scholar] [CrossRef]
  104. Valenzuela, S.; Halpern, D.; Katz, J.E.; Miranda, J.P. The Paradox of Participation Versus Misinformation: Social Media, Political Engagement, and the Spread of Misinformation. Digit. J. 2019, 7, 802–823. [Google Scholar] [CrossRef]
  105. Pennycook, G.; McPhetres, J.; Zhang, Y.; Lu, J.G.; Rand, D.G. Fighting COVID-19 Misinformation on Social Media: Experimental Evidence for a Scalable Accuracy-Nudge Intervention. Psychol. Sci. 2020, 31, 770–780. [Google Scholar] [CrossRef]
  106. Lewandowsky, S.; Ecker, U.K.H.; Seifert, C.M.; Schwarz, N.; Cook, J. Misinformation and Its Correction: Continued Influence and Successful Debiasing. Psychol. Sci. Public Interest 2012, 13, 106–131. [Google Scholar] [CrossRef]
  107. Treen, K.M.; Williams, H.T.P.; O’Neill, S.J. Online Misinformation about Climate Change. WIREs Clim. Change 2020, 11, e665. [Google Scholar] [CrossRef]
  108. Kumar, S.; Shah, N. False Information on Web and Social Media: A Survey. arXiv 2018, arXiv:1804.08559. [Google Scholar]
  109. Garrett, R.K.; Poulsen, S. Flagging Facebook Falsehoods: Self-identified Humor Warnings Outperform Fact Checker and Peer Warnings. J. Comput. Mediat. Commun. 2019, 24, 240–258. [Google Scholar] [CrossRef] [Green Version]
  110. Trethewey, S.P. Strategies to Combat Medical Misinformation on Social Media. Post Grad. Med. J. 2020, 96, 4–6. [Google Scholar] [CrossRef] [Green Version]
  111. Bühler, J.; Murawski, M.; Darvish, M.; Bick, M. Developing a Model to Measure Fake News Detection Literacy of Social Media Users. In Disinformation, Misinformation, and Fake News in Social Media: Emerging Research Challenges and Opportunities; Shu, K., Wang, S., Lee, D., Liu, H., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 213–227. [Google Scholar] [CrossRef]
  112. Wagner, M.C.; Boczkowski, P.J. The Reception of Fake News: The Interpretations and Practices that Shape the Consumption of Perceived Misinformation. Digit. J. 2019, 7, 870–885. [Google Scholar] [CrossRef] [Green Version]
  113. Fard, A.E.; Lingeswaran, S. Misinformation Battle Revisited: Counter Strategies from Clinics to Artificial Intelligence. Companion Proc. Web Conf. 2020, 510–519. [Google Scholar]
  114. Paynter, J.; Luskin-Saxby, S.; Keen, D.; Fordyce, K.; Frost, G.; Imms, C.; Miller, S.; Trembath, D.; Tucker, M.; Ecker, U. Evaluation of a Template for Countering Misinformation-Real-world Autism Treatment Myth Debunking. PLoS ONE 2019, 14, e0210746. [Google Scholar] [CrossRef] [Green Version]
  115. Vraga, E.K.; Kim, S.C.; Cook, J. Testing Logic-based and Humor-based Corrections for Science, Health, and Political Misinformation on Social Media. J. Broadcast. Electron. Media 2019, 63, 393–414. [Google Scholar] [CrossRef]
  116. Pourghomi, P.; Safieddine, F.; Masri, W.; Dordevic, M. How to Stop Spread of Misinformation on Social Media: Facebook Plans vs. Right-click Authenticate Approach. In Proceedings of the International Conference on Engineering & MIS (ICEMIS), Monastir, Tunisia, 8–10 May 2017; pp. 1–8. [Google Scholar] [CrossRef]
  117. Sommariva, S.; Vamos, C.; Mantzarlis, A.; Đào, L.U.L.; Tyson, D.M. Spreading the (Fake) News: Exploring Health Messages on Social Media and the Implications for Health Professionals using a Case Study. Am. J. Health Educ. 2018, 49, 246–255. [Google Scholar] [CrossRef]
  118. Brennen, J.S.; Simon, F.M.; Howard, P.N.; Nielsen, R.K. Types, Sources, and Claims of COVID-19 Misinformation 2020. p. 13. Available online: https://reutersinstitute.politics.ox.ac.uk/types-sources-and-claims-covid-19-misinformation (accessed on 11 March 2022).
  119. Wang, X.; Song, Y. Viral Misinformation and Echo Chambers: The Diffusion of Rumors about Genetically Modified Organisms on Social Media. Internet Res. 2020, 30, 1547–1564. [Google Scholar] [CrossRef]
  120. Danielson, L.; Marcus, B.; Boyle, L. Special Feature: Countering Vaccine Misinformation. Am. J. Nurs. 2019, 119, 50–55. [Google Scholar] [CrossRef] [PubMed]
  121. Bran, R.; Tiru, L.; Grosseck, G.; Holotescu, C.; Malita, L. Learning from Each Other—A Bibliometric Review of Research on Information Disorders. Sustainability 2021, 13, 10094. [Google Scholar] [CrossRef]
  122. Mills, J.; Reed, M.; Skaalsveen, K.; Ingram, J.; Bruyn, L.L. The Use of Twitter for Knowledge Exchange on Sustainable Soil Management. Soil Use Manag. 2019, 35, 195–203. [Google Scholar] [CrossRef] [Green Version]
  123. Leveau, L.; Soulignac, V. Knowledge Management for Sustainable Agro-systems: Can Analysis Tools Help Us to Understand and Support Agricultural Communities of Practice? Case of the French Lentil Production. Int. J. Food Syst. Dyn. 2018, 9, 197–206. [Google Scholar] [CrossRef]
  124. Riley, M.; Robertson, B. #Farming365-Exploring Farmers’ Social Media Use and the (re)Presentation of Farming Lives. J. Rural. Stud. 2021, 87, 99–111. [Google Scholar] [CrossRef]
  125. Oreszczyn, S.; Lane, A.; Carr, S. The Role of Networks of Practice and Webs of Influencers on Farmers’ Engagement with and Learning about Agricultural Innovations. J. Rural. Stud. 2010, 26, 404–417. [Google Scholar] [CrossRef]
  126. Eenennaam, A.V. The History and Impact of Misinformation in the Agricultural Sciences. CAS Initiative on Conspiracy, Misinformation, and the Infodemic. 2022. Available online: https://mediaspace.illinois.edu/media/t/1_k0b1s1mh (accessed on 11 March 2022).
  127. Zerbe, N. Feeding the Famine? American Food Aid and the GMO Debate in Southern Africa. Food Policy 2004, 29, 593–608. [Google Scholar] [CrossRef]
  128. Blancke, S.; Frank, V.B.; Geert, D.J.; Johan, B.; Marc, V.M. Fatal Attraction: The Intuitive Appeal of GMO Opposition. Trends Plant Sci. 2015, 20, 414–418. [Google Scholar] [CrossRef] [Green Version]
  129. Chou, W.S.; Oh, A.; Klein, W.M.P. Addressing Health-related Misinformation on Social Media. JAMA 2018, 320, 2417–2418. [Google Scholar] [CrossRef] [PubMed]
  130. Southwell, B.G.; Thorson, E.A. The Prevalence, Consequence, and Remedy of Misinformation in Mass Media Systems. J. Commun. 2015, 65, 589–595. [Google Scholar] [CrossRef]
  131. Stroud, J.L. Tackling Misinformation in Agriculture. bioRxiv 2019. [Google Scholar] [CrossRef]
  132. Howard, M.; Stephens, C.; Stripling, C.; Brawner, S.; Loveday, D. The Effect of Social Media on University Students’ Perceptions of the Beef Industry. J. Agric. Educ. 2017, 58, 316–330. [Google Scholar] [CrossRef]
  133. Mogha, K.V.; Shah, N.P.; Prajapati, J.B.; Chaudhari, A.R. Biofilm—A Threat to Dairy Industry. Indian J. Dairy Sci. 2014, 67, 459–466. [Google Scholar]
  134. Chowdhury, A.; Firoze, A. Combatting Online Agriculture Misinformation (OAM): A Perspective from Political Economy of Misinformation. In Proceedings of the 2022 Conference of the Association for International Agricultural and Extension Education, Thessaloniki, Greece, 4–7 April 2022. [Google Scholar]
  135. Piccolo, L.S.G.; Puska, A.; Pereira, R.; Farrell, T. Pathway to a Human-values Based Approach to Tackle Misinformation Online. In Human-Computer Interaction. Human Values and Quality of Life; Kurosu, M., Ed.; Springer: Cham, Switzerland, 2020; pp. 510–522. [Google Scholar] [CrossRef]
136. Rust, N.A.; Stankovics, P.; Jarvis, R.M.; Morris-Trainor, Z.; de Vries, J.R.; Ingram, J.; Mills, J.; Glikman, J.A.; Parkinson, J.; Toth, Z.; et al. Have Farmers Had Enough of Experts? Environ. Manag. 2022, 69, 31–44. [Google Scholar] [CrossRef]
  137. Ingram, J.; Maye, D.; Bailye, C.; Barnes, A.; Bear, C.; Bell, M.; Cutress, D.; Davies, L.; de Boon, A.; Dinnie, L.; et al. What are the Priority Research Questions for Digital Agriculture? Land Use Policy 2022, 114, 105962. [Google Scholar] [CrossRef]
138. Ding, J.; Jia, X.; Zhang, W.; Klerkx, L. The Effects of Combined Digital and Human Advisory Services on Reducing Nitrogen Fertilizer Use: Lessons from China’s National Research Programs on Low Carbon Agriculture. Int. J. Agric. Sustain. 2022, 20, 1136–1149. [Google Scholar] [CrossRef]
  139. Cuan-Baltazar, J.Y.; Muñoz-Perez, M.J.; Robledo-Vega, C.; Pérez-Zepeda, M.F.; Soto-Vega, E. Misinformation of COVID-19 on the Internet: Infodemiology study. JMIR Public Health Surveill. 2020, 6, e18444. [Google Scholar] [CrossRef] [PubMed] [Green Version]
Figure 1. The literature review process.
Figure 2. The growth of mis-dis-malinformation research papers since 2000.
Figure 3. Disciplinary contours of mis-dis-mal-information research.
Figure 4. A holistic framework for researching and understanding agricultural mis-dis-mal-information.
Table 1. Major themes around mis-dis-mal-information research.
Major Theme | Some Focus Questions | Distinctive Concepts for Each Theme
Theme 1: Characterization and definitions of misinfo
How do we define misinfo?
What are the different concepts that help to explain misinfo?
How does a piece of information on social media qualify as misinfo?
Key terms in the field:
Fake news: Information (info) with low facticity, presented in a journalistic or news format, with the intention to deceive [86]. It refers to news items that are intentionally and verifiably false [86,87], created in journalistic form with the intention to misinform and/or employed by actors to delegitimize news sources or content [86].
Rumor: The spread of unverified info that may later turn out to be true or false [55,86].
Misinfo: Incorrect or misleading info disseminated unintentionally [50,59,64,86] without the intention to cause harm [32]. Misinformation can be high-quality info that spreads because of its efficiency [78].
Disinfo: Incorrect or misleading info that is disseminated deliberately [57,86,88] with intention to cause harm [32].
Malinfo: Info based on reality but used to inflict harm on a person, organization, or country [67].
Info disorder: A broader concept covering the various ways the info environment becomes polluted, encompassing misinfo, disinfo, and malinfo [32,67].
Infodemics: An overabundance of both true and false info circulating during an epidemic or pandemic [89,90,91].
Theme 2: Sources of misinfo
What are the sources of misinfo on social media?
Who drives the spread of misinfo?
What elements provide the engine for the creation and diffusion of misinfo online?
Misinfo emanates from the behaviors of diverse actors: it originates from a core controlled by individuals, organizations, bots, governments, etc. [92] and spreads outward within a network. Critical sources of misinfo identified include:
The individual as a source of misinfo online: Individuals, citizens, and the general public create misinfo [93] and reshare it [94,95]. Individuals may act as volunteers or as paid citizens [96,97]. False news spreads farther than the truth because humans, not bots, are more likely to spread it [95].
Bots and the spread of misinfo: Bots, often operating through unverified social media accounts [76], disseminate misinfo and amplify false info [95], mostly through automated processes [94,98]. Through ‘astroturfing’, social bots create the false impression that a particular opinion enjoys widespread public support [99].
Organizations as a source of misinfo: Political entities and organizations such as partisan websites amplify misinfo [27], sometimes using cyber troops and private contractors [96,97]. Internet trolls, gamergaters, hate groups and ideologues, the manosphere, conspiracy theorists, hyper-partisan news outlets, and politicians circulate misinfo through blogs [100,101,102].
Government entities spread misinfo: Government-sponsored accounts, web pages or applications, and fake accounts spread misinfo for state interests [97].
Theme 3: Diffusion and dynamics of misinfo
How does misinfo spread on social media?
How does the spread of misinfo differ from true info on social media?
Identifiable dynamics of misinfo flow:
Misinfo is temporary: False rumors (misinfo) tend to return multiple times after initial publication, while true rumors (facts) do not. Rumor resurgence continues, often accompanied by textual changes, until the tension around the target dissolves [27,60].
Speed of diffusion varies: Falsehood diffuses significantly farther, faster, deeper, and more broadly than truth [102]. Misinfo diffuses through agents: bots can influence agents directly, or indirectly via friends who are themselves influenced by bots, which can generate misinfo and polarization in the long run [52,97]. “Forceful” agents influence the beliefs of (some of) the other individuals they meet but do not change their own opinions [103]. Actors use accounts, whether real, fake, or automated, to interact with users on social media or to create substantive content. Valence, a term for the attractiveness (goodness) or averseness (badness) of a message, event, or thing, also shapes diffusion [97].
Theme 4: Misinfo behaviors
How do individual behaviors affect the spread of misinfo on social media?
What elements of misinfo allow it to thrive on social media?
Notable misinfo behaviors:
Individual behaviors: People’s sharing of misinfo on social media is influenced by individual characteristics [48,65,66,78], including personalities, political interests, social motives, and capacities. Individual personalities can facilitate misinfo: extroverts are more prone to share misinfo for socializing purposes [48,66]; self-expression and socializing motivations are crucial to misinfo sharing [48]; and gender could influence misinfo-sharing behaviors [48]. People’s political behaviors and interests affect misinfo behaviors: politically enthusiastic people tend to share misinfo [104]; people sometimes deliberately share false content because it furthers their political agenda [65,105]; and pre-existing socio-political and cultural beliefs and biases entrench misinfo [103,106,107]. Motivations also drive misinfo sharing: the top motivations are obtaining others’ opinions on the info, expressing one’s own opinions, and interacting with others [66]; people share info based on its ability to spark conversations [48]; people are more likely to share info by virtue of its novelty [95]; and misinfo sharing is not always intentional [105]. Individual capacity and capabilities affect misinfo: sharing info on social media without verification is predicted by Internet experience, Internet skills of info seeking, sharing, and verification, attitude towards info verification, and belief in the reliability of info [82]. The individual cognitive process [78] involved in the decision to spread info weighs the consistency of the message, its coherency, the credibility of the source, and the general acceptability of the message [50]. Other socio-economic conditions affect misinfo sharing: “self-efficacy to detect misinfo on social media is predicted by income and level of education, Internet skills of info seeking and verification, and attitude towards info verification” [82].
Info behaviors: Misinfo exhibits characteristics that are either similar to or very different from those of expected info. Misinfo starts from a core and spreads in a network; the core is controlled by individuals, organizations, bots, or partnerships [92]. Misinfo mutates faster over time [27].
Theme 5: Detection of misinfo
How can varied stakeholders detect social media misinfo?
What mechanism could be leveraged for the detection of misinfo on social media?
Varied strategies for the detection of misinfo:
Info literacy and detecting misinfo: Misinfo and disinfo are closely linked to info literacy, especially how they are diffused and shared, and how people use both cues for credibility and deception to make judgments [57,65].
Automated systems can detect misinfo: Automated detection systems offer computerized forms of detecting misinfo [108] using algorithms [80,108]. Examples include Hoaxy, an open platform that enables large-scale, systematic studies of how misinfo and fact-checking spread [92], and the Fake News Detection (FEND) system [28].
Organizational strategies allow for identifying misinfo: Organizations such as academics or news outlets can provide gate-keeping [49] to detect misinfo. Fact-checking entities contribute to detection [83,109].
Theme 6: Strategies for countering, correcting, and dealing with misinfo
What strategies can be used to correct and deal with social media misinfo?
How do varied organizations contribute to countering the misinfo menace on social media?
Countering and correcting misinfo involves individuals, organizations, governments, and social media outlets:
Individuals can play roles in correcting misinfo: A culture of fact-checking among people helps [110]. People detect misinfo using cues for deception, and info literacy is helpful [82,111]. Individuals who follow and are followed by others can minimize misinfo by gate-keeping info [83]. Individuals are relevant to social corrections; for example, such corrections effectively limit misperceptions, and correction occurs for individuals with both high and low conspiracy beliefs [80].
Reception of misinfo is crucial to prevention: Individual reception can be influenced by traditional fact-based media accompanied by rejection of opinionated outlets, personal experience and knowledge, repetition of info across outlets, consumption of cross-ideological sources, fact-checking, and trust in specific personal contacts [112].
Organizations (academia, media, independent fact-checkers, etc.) [113] could minimize misinfo: Stray categorized the tactics used by organizations: refutation, exposure of inauthenticity, alternative narratives [61], algorithmic filter manipulation [80], speech laws, and censorship [88]. Strategies include careful dissemination, expert fact-checking, social media campaigns, and greater public engagement by organizations [110], as well as prebunking people against misinfo, debunking [81,114] messages, and warning of threats, which reduce misinfo [64,115] and its persistence. Organizations can designate an info architect solely responsible for info and its dissemination [59].
States and governments can counter misinfo: Governments can tackle misinfo through regulations and censorship [113], algorithmic filter manipulation, and speech laws [88]. For example, the EU East StratCom Task Force is a contemporary government counter-propaganda agency, and China’s info regime exemplifies networked info control [88].
Social media networks: Facebook, Twitter, and other networks have made numerous changes to their operations to combat disinfo [88,116], such as automated correction through algorithmic filter manipulation and censorship, facilitated by bots that the networks make possible [80,117].
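Theme 5 notes that automated systems detect misinfo using algorithms (e.g., Hoaxy [92] and FEND [28]). As a toy illustration only, and not the method of any system reviewed here, the sketch below trains a minimal Naive Bayes text classifier; the agri-food example claims and their labels are invented purely for demonstration.

```python
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

def train(examples):
    """examples: list of (text, label) pairs; returns word counts and doc totals per label."""
    counts = {"misinfo": Counter(), "credible": Counter()}
    totals = Counter()
    for text, label in examples:
        counts[label].update(tokenize(text))
        totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    """Naive Bayes with Laplace smoothing: pick the label with the highest log-probability."""
    vocab = set().union(*counts.values())
    best_label, best_logp = None, float("-inf")
    for label, words in counts.items():
        logp = math.log(totals[label] / sum(totals.values()))  # class prior
        n = sum(words.values())
        for w in tokenize(text):
            logp += math.log((words[w] + 1) / (n + len(vocab)))  # smoothed likelihood
        if logp > best_logp:
            best_label, best_logp = label, logp
    return best_label

# Hypothetical labeled claims, invented for demonstration.
examples = [
    ("miracle pesticide cures all crop disease overnight", "misinfo"),
    ("secret hormone in milk shocking truth revealed", "misinfo"),
    ("field trial reports modest yield gains under drought", "credible"),
    ("extension service publishes soil testing guidelines", "credible"),
]
counts, totals = train(examples)
print(classify("shocking miracle cure for crop disease", counts, totals))  # prints: misinfo
```

Systems discussed in the reviewed literature combine far richer signals (network structure, account metadata, fact-checking feeds) than this bag-of-words sketch.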
Table 2. Descriptions of selected conceptual and theoretical frameworks.
Citation | Key Components | Comment on Framework
Egelhofer and Lecheler [86] | Focus: Types/forms of fake news; identification of fake news; characterization of info. | Egelhofer and Lecheler [86] offer frameworks for research into fake news, allowing researchers to characterize fake news, including in online mediums. The framework can help in the characterization of misinfo in online media. The focus on contents and the attribution of info disorder to false content is a notable limitation, because accurate content can also be employed to misinform if used incorrectly.
Cook et al. [64] | Focus: Dealing with misinfo. | Cook et al. [64] apply two established concepts to the misinfo literature, focusing on dealing with misinfo. The strength of their approach is its ability to apply and situate these concepts within misinfo research. It provides a limited view of dealing with misinfo, since recent experience shows that more than social strategies are required to deal with the phenomenon.
Ireton et al. [67] | Focus: Elements in misinfo diffusion; characterization of misinfo. | Ireton et al. [67] provide an overview of misinfo in their handbook for journalists. The frameworks in the handbook are essential to how we situate misinfo as a phenomenon and how it relates to journalists’ work. However, the focus on a single discipline limits the work’s applicability to broader interests in misinfo.
Ji et al. [55] | Focus: Rumor mongering. | The framework allows us to appreciate misinfo elements that can be considered through different scales of action.
Rubin [58] | Focus: Misinfo and disinfo interventions. | Rubin [58] offers a method for dealing with misinfo. The levers of the ‘triangle’ show practitioners what to focus on as strategies to combat the phenomenon. The framework is useful for researchers beginning to identify components and interrogate misinfo.
Fard and Lingeswaran [113] | Focus: Strategies for countering misinfo. | The scalar approach to understanding countering strategies for social media misinfo provides a granular-scale view of the phenomenon. It is crucial for understanding misinfo elements, but the focus on one component limits its ability to describe misinfo effectively.
Piccolo et al. [135] | Focus: Tackling online misinfo. | The human-value approach to countering misinfo allows for a social view of the phenomenon and would be critical to influencing the human behaviors that drive misinfo. The approach is limited, as misinfo is increasingly shown to be diffused by non-human actors.
Tangcharoensathien et al. [91] | Focus: Managing the infodemic. | The framework is an excellent first step toward plausible strategies for managing the contemporary infodemic. Its generic focus on all forms (online and offline) may undermine its ability to incorporate the unique challenges of social media misinfo.
Treen et al. [107] | Focus: Understanding the diffusion of climate change misinformation on social media. | The framework is one of the first attempts to comprehend the interconnected features of online social networks and the underlying human and platform factors that may increase social media users’ susceptibility to consuming, accepting, and propagating misinfo. The framework needs further examination.
Wardle and Derakhshan [32] | Focus: Discussing and researching information disorder. | The framework reveals that we must comprehend the ritualistic function of communication.
Table 3. Thematic areas and questions for researching and understanding (agri-food) mis-dis-mal-information on social and online media.
Major Theme | Potential Questions
Characterization and definitions of mis-dis-mal-information
  • How does agricultural mis-dis-mal-information breed online?
  • How does agricultural mis-dis-mal-information manifest in online communities?
Sources of mis-dis-mal-information
  • What are the root causes of agricultural mis-dis-mal-information, and what ignites it?
  • What issues in agriculture offer avenues for social media mis-dis-mal-information?
  • Which actors are likely to create agricultural mis-dis-mal-information on social media?
  • What channels could be used to diffuse agricultural mis-dis-mal-information?
Diffusion and dynamics of mis-dis-mal-information
  • How does mis-dis-mal-information circulate in agricultural communities and networks online?
  • What motivates the creation and circulation of agricultural mis-dis-mal-information on social media?
  • How does agricultural mis-dis-mal-information differ from other forms of mis-dis-mal-information?
  • What social media environments breed effective transmission of agricultural mis-dis-mal-information?
Detection of mis-dis-mal-information
  • How can different actors identify agricultural mis-dis-mal-information online?
  • What tools could different stakeholders leverage to ensure the early detection of agricultural mis-dis-mal-information?
Impacts of mis-dis-mal-information
  • How could agricultural mis-dis-mal-information affect the industry?
  • How could agricultural mis-dis-mal-information affect different actors within the industry?
  • What are the trust implications of agricultural mis-dis-mal-information?
Strategies for countering, correcting, and dealing with mis-dis-mal-information
  • How can agricultural mis-dis-mal-information on social media be curtailed?
  • What role could different stakeholders play in countering and correcting social media agricultural mis-dis-mal-information?
  • What tools and strategies could different stakeholders leverage to deal with social media agricultural mis-dis-mal-information?
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
