Article

The Crisis of Public Health and Infodemic: Analyzing Belief Structure of Fake News about COVID-19 Pandemic

1 Department of Public Administration, Ajou University, Suwon 16499, Korea
2 Department of Local Government Administration, Gangneung-Wonju National University, Gangneung-si 25457, Korea
* Author to whom correspondence should be addressed.
Sustainability 2020, 12(23), 9904; https://doi.org/10.3390/su12239904
Submission received: 27 October 2020 / Revised: 18 November 2020 / Accepted: 25 November 2020 / Published: 26 November 2020
(This article belongs to the Collection Public Health and Social Science on COVID-19)

Abstract:
False information about COVID-19 is being produced and disseminated on a large scale, impeding efforts to impose quarantines rapidly. Thus, in addition to the COVID-19 pandemic itself, the infodemic surrounding it is producing social crises. This study therefore investigates who believes the misinformation being produced in the context of COVID-19. We choose two sets of independent variables that can affect belief in misinformation related to COVID-19: risk perception factors (the so-called psychometric paradigm) and communication factors. The results show that, among the psychometric variables, perceived risk and stigma positively affect belief in fake news, whereas perceived benefit and trust have negative effects. Among the communication factors, source credibility and the quantity of information reduce belief in fake news, whereas heuristic information processing increases it. Stigma has the greatest explanatory power among the variables, followed by health status, heuristic information processing, trust, and subjective social class.

1. Introduction

Because of the COVID-19 pandemic, people worldwide have faced extreme fear at the individual level and instability at the social level. Social crises are not driven only by the outbreak of a social event itself, e.g., a pandemic disease, but can be further accelerated by the fear accompanying it. Fear paralyzes people's reason and hinders rational judgment; as a result, it can lead people to believe exaggerated information, misinformation, and false information rather than objective facts. Such incorrect information, in turn, further spreads fear within a society.
In the event of a disaster, people try to gather as much information as possible [1]. This vast amount of data often includes both true and false information. The COVID-19 outbreak and the subsequent response have been accompanied by a massive infodemic, that is, an overabundance of both accurate and inaccurate information that makes it hard for people to find trustworthy sources and reliable guidance when needed [2]. The infodemic can be thought of as another infectious disease. Under infodemic conditions, the information related to a specific topic increases exponentially in a very short time. According to the PAHO (Pan American Health Organization) [2], 361,000,000 videos were uploaded to YouTube under the COVID-19 classification in the last 30 days, and 19,200 articles have been published on Google Scholar since the start of the pandemic.
In Korea, fake news has spread since the outbreak of COVID-19. One example claimed that the mortality rate of COVID-19 is ten times that of SARS (Severe Acute Respiratory Syndrome). In response to the spread of this fake news, the Korean government announced on March 3rd that the World Health Organization (WHO) had stated that the global mortality rate of COVID-19 was 3.4%, and on March 10th that the mortality rate in Korea was 0.7% [3]. Another piece of fake news claimed that people could prevent COVID-19 infection by applying antiphlamine to their noses, mouths, and hands because bacteria hate antiphlamine and cannot enter the respiratory tract when it is applied. Regarding this rumor, the government responded that antiphlamine is an anti-inflammatory pain reliever and is therefore unrelated to the prevention of coronavirus disease, a respiratory infectious disease [3].
Rumors have both negative and positive functions. On the positive side, they sometimes stimulate discussions of social issues. Dissolving the suspicion embedded in rumors can lead to better social discourse, which can help to resolve social conflict. Conversely, if rumors distort social facts, they produce suspicion and distrust, thereby enhancing social conflict [4] (p. 12). In the case of COVID-19, the infodemic is making the pandemic worse. PAHO [2] explains that the infodemic makes it hard for people, decision makers, and healthcare workers to find trustworthy sources and reliable guidance. Moreover, people may feel anxious, depressed, overwhelmed, emotionally drained, and unable to meet important demands [2]. Fake news and rumors related to COVID-19 therefore spread rapidly across borders through the Internet, in particular social network services (SNSs) [5,6,7,8].
Previous studies have successfully defined the concept of fake news and have classified different types of rumors. However, few studies have empirically analyzed belief in rumors and its determinants. No single factor determines belief in fake news. Instead, Halpern et al. [9] show that such belief is influenced by various factors, including (1) personal and psychological factors; (2) frequency of social media use and specific uses of social media; and (3) political views and online activism.
This study investigates who believes the misinformation related to COVID-19 that is generated in the current infodemic. Based on survey data, we analyze how two sets of factors, risk perception factors (the so-called psychometric paradigm) and communication factors, affect belief in false information related to COVID-19.

2. Theoretical Background and Hypotheses

2.1. Literature on Fake News and Rumors

Although interest in fake news is relatively recent, rumors have been studied for a long time. Studies on fake news employ different terms, definitions, and determinants than studies on rumors do, but the two bodies of work have considerable similarities. Lazer et al. [10] (p. 1094) define fake news as "… fabricated information that mimics news media content in form but not in organizational process or intent." According to Pennycook and Rand [11], fake news presently consists primarily of highly salient (if implausible) fabricated claims created to spread on social media. On the other hand, in "A Psychology of Rumor," a seminal work published in 1944, Robert Knapp [12] defines a "rumor as a proposition for belief in a topical reference disseminated without official verification" (p. 22). Rosnow [13] defines rumors as issues that are recognized as important to people but are not confirmed to be true, and Difonzo and Bordia [14] define a rumor as floating information that has not yet been verified but is highly likely to be useful for recognizing a situation and managing a crisis. Moreover, Allport et al. [15] define a rumor as a statement that circulates orally, predominantly among people, without clear evidence of its truth, yet leads people to believe that it is true. The term "rumor" is often discussed alongside "disinformation" rather than "fake news." A typical definition of disinformation is "the deliberate creation and sharing of false and/or manipulated information that is intended to deceive and mislead audiences, either for the purposes of causing harm, or for political, personal or financial gain" [16].
There are subtle differences among the definitions of fake news, rumors, and disinformation, but they share a common core element in theory building: fabricated content that departs from the facts.
The study of rumors has a long history, starting with analyses of the propaganda phrases of opponents during World War II. Early research focuses on analyzing national security problems or public anxiety caused by rumors that countries intentionally circulated to their enemies at the national level. However, recent rumors affect people’s attitudes as a kind of constant crisis through Internet media at the social level rather than at the national level [17].
DiFonzo and Bordia [18] highlight three macro-level dimensions of social discourse: context (i.e., the situation or psychological need out of which the discourse arises), function (people’s reasons for engaging in the discourse), and content (the types of statements uttered) (p. 213). Context plays a role as a cause of rumors, function is related to personal motivations for rumors, and content describes types of rumor. These three dimensions constitute large strands of research on rumors. Recent research trends about misinformation are as follows.
The first trend comprises studies that identify patterns in and types of fake news. After examining 34 articles on fake news, Tandoc et al. [19] suggest categorizing fake news into six types: news satires, news parodies, fabrications, manipulations, advertising, and propaganda. Rowan [20] categorizes rumors as spontaneous or intentional depending on the motives for generating them. Similarly, Kimmel [21] classifies rumors as voluntary or socially planned. Knapp [12] analyzes 1000 rumors circulating in American society and divides them into three types: pipedreams, bogies, and wedge drivers. Fearn-Banks [22] distinguishes "intentional rumors" meant to achieve specific purposes related to the main cause of a corporate crisis, "premature rumors" that anticipate information that will be revealed later, "malicious rumors" intended to damage competing organizations, "nearly true rumors" that are close to the facts, and "birthday rumors" that recur continuously.
The second strand of literature includes studies of the impact of fake news on people and society. Fake news affects individual behavior, for example, by changing readers' opinions, threatening people's safety, and influencing political elections. According to the Pew Research Center [23], 88% of Americans said that fake news has caused confusion about the basic facts of current events, and 29% reported that they have shared fake political news online. Moreover, Lefevere et al. [24] show through Internet-based experimental studies that intentionally produced fake news on television decisively affects viewers' attitudes. Vargo et al. [25] suggest that the content produced by certain fake news sites is increasing in volume, indicating that they can boost publicity about a topic in the online space. Vargo et al. [25] also investigate why fake news has this agenda-setting power: in 2016, partisan media sources were particularly vulnerable to fake news agendas owing to the elections, and emerging news outlets responded to fake news agendas to a lesser extent. Fake news coverage itself is spreading and becoming more autonomous.
A third strand of research includes studies that analyze the causes of the generation, spread, and acceptance of fake news. These studies demonstrate the impacts of not only micro-personal perceptual factors but also macro-structural contexts on belief in fake news. At the micro level, individuals' perceptions and communication structures determine their attitudes and behaviors regarding rumors. Not everyone is equally exposed to fake news or rumors. According to Grinberg et al. [26], only 1% of individuals account for 80% of exposures to fake news sources, and 0.1% of individuals account for nearly 80% of fake news sources that are shared. Similarly, Guess et al. [27] show that fewer than 10% of Facebook users spread information from "fake news" domains. On the other hand, at the macro level, the national economic situation or uncertainty about a particular situation can influence rumors. For example, the IBS [28] analyzes survey results to demonstrate the impact of GDP on fake news: Internet users in countries with lower economic indicators, such as GDP, tend to be more exposed to fake news about COVID-19 online. Only 16.7% of Internet users in economically developed countries believed that the fake news presented was true, whereas 33.3% of those in some developing countries said that they trusted fake news. This finding stems from the fact that countries with weak infrastructure suffer even greater damage from infodemics [28]. Figure 1 shows countries on the X-axis, arranged from left to right by lowest to highest GDP (Gross Domestic Product); the Y-axis indicates the degree of trust in fake news. The graph shows that more economically developed countries have less trust in fake news.
Situational uncertainty rooted in ambiguity has long been considered a factor that produces rumors and fake news. Allport and Postman [15,29] propose a basic law of rumor: R = i × a, where the strength of a rumor (R) varies with the importance of the subject to the individual concerned (i) multiplied by the ambiguity of the evidence related to the topic at hand (a). Thus, the intensity of a rumor depends not only on its relevance to a particular individual but also on situational uncertainty. Similarly, Rosnow [30] suggests that rumors are byproducts of an optimal combination of general uncertainty, personal anxiety, credulity, and outcome-relevant involvement. According to DiFonzo and Bordia [18], rumors arise in situational contexts that are ambiguous, threatening, or potentially threatening, in which people feel a psychological need for understanding or security (p. 203).
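Because Allport and Postman's law is multiplicative, it implies that a rumor loses all force when either term drops to zero; a minimal numerical sketch (the values are hypothetical, chosen only to illustrate the interaction of i and a):

```python
def rumor_strength(importance: float, ambiguity: float) -> float:
    """Allport and Postman's basic law of rumor: R = i * a.

    importance: how much the topic matters to the individual (i)
    ambiguity:  how ambiguous the available evidence is (a)
    """
    return importance * ambiguity

# A rumor has no strength when either the topic is unimportant
# or the evidence is unambiguous.
print(rumor_strength(0.9, 0.8))  # high importance, high ambiguity
print(rumor_strength(0.9, 0.0))  # clear evidence -> rumor dies out
print(rumor_strength(0.0, 0.8))  # irrelevant topic -> rumor dies out
```

This captures why official clarification (reducing a) or reduced personal relevance (reducing i) can each extinguish a rumor on its own.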

2.2. Risk Communication Versus Risk Perception

We analyze the impacts of risk communication and risk perception factors on belief in rumors. Risk communication and the risk perception factor, the so-called psychometric paradigm, have long been research topics in risk studies. Risk communication can be defined as the process of exchanging risk-related information among the various actors involved [31]. The risk communication model focuses on factors external to individuals in evaluating risks; in other words, it focuses on the communication process, ranging from the source of an external message to its delivery and reception. A prototype for the risk communication model is the information theory model proposed by Shannon and Weaver [32]. This model consists of (1) information production and delivery, (2) the sender of information, (3) the channel, (4) the receiver of information, and (5) the final destination of the message. Risk communication research dynamically analyzes the roles of information sources, media, and information receivers in the process of socially amplifying risk information and explores the structural principles of risk communication at the macro level [33]. Kasperson et al. [33] describe this communication process as follows: risk event → event characteristics → information flow → interpretation and response → spread of impact → type of impact.
Conversely, studies focusing on risk perception factors rely on the psychometric paradigm, a dominant paradigm in risk studies. This paradigm constructs a cognitive map of risks and classifies them according to their inherent attributes [34]. Paul Slovic and his colleagues, who introduced the psychometric paradigm, suggest that risk is a subjective rather than an objective construct. They observe that people tend to depend on subjective judgment to evaluate degrees of danger [34,35,36]. An important task in the risk perception paradigm is identifying attributes that increase perceived risk. Typically, "dread" or "unknown" attributes of risky objects increase perceived risk: the former evokes fear, catastrophe, inequality, and uncontrolled emotions, whereas the latter is related to newness. According to Fischhoff et al.'s [36] analysis, 18 attributes of risk can be consolidated into three factors: dread, unknown aspects, and exposure. Within this paradigm, fear of risk stems from the subjective meanings people attach to a specific object rather than from its objective aspects. The psychometric paradigm highlights key independent variables such as perceived benefit, perceived risk, trust, stigma, and knowledge.
Table 1 shows the differences between the risk communication model and the risk perception paradigm. The former focuses on the process of communicating risk information, whereas the latter focuses on the subject's perception of risk. Thus, the former model has dynamic properties, and the latter paradigm is somewhat static. The former assumes interdependent reciprocity because it defines both a sender and a recipient, whereas the latter focuses on an independent individual because it stresses people's individual judgment. Although the former model has the advantage of describing the flow of information and its results according to the flow of macro-events or accidents in a given society, it does not clarify the operating mechanism at each communication stage. Moreover, because communication structures differ from case to case, the findings of a specific study are less generalizable. Conversely, although the latter paradigm can identify individuals' risk perception structures in detail, it tends to overlook the macro-structural or contextual factors that affect individuals' perceptions.

2.3. Risk Communication Factor

In our communication model, we include source credibility, the quantity and quality of information, heuristic (versus systematic) information processing, and the receiver's ability.
First, the source of information refers to the point where information originates; individuals or organizations can serve as information sources. Source credibility is a function of the perceived reliability of the information source; in other words, trust in the source of information determines the information's influence. According to Porter [37], there is a strong association between trust in a person producing rumors and belief in negative rumors. In Liang's research [38], students' belief in Internet rumors is related to the information source's credibility. Based on an online study of 357 Facebook users, Buchanan and Benson [39] report that messages from trusted sources increase the likelihood that a potentially false message is propagated. Similarly, Lampinen and Smith [40] show that, for adults, the credibility of the source plays a moderating role; misinformation presented by a credible source degrades performance to a greater degree than misinformation presented by a noncredible source. Visentin et al. [41] show that the presence of fake news influences behavioral intentions toward a brand advertised on the same webpage, although this effect disappears when intrinsic source credibility is high. However, Kim and Dennis [42] demonstrate that changing the presentation format used to highlight information influences its believability, regardless of the source's credibility.
Second, the quality and the quantity of information both determine its impact. In general, as the amount of rumor-related information increases, its influence tends to increase, and if the quality of a rumor is high, the receiver is more likely to accept it. Allport [29] demonstrates that the amount of negative information usually relates to belief in rumors. Various aspects of information quality can be measured, including usefulness, exactness, and vividness, and these qualities influence belief in fake news. For example, rumors thrive when knowledge or useful information is lacking; thus, providing more useful information to receivers reduces belief in rumors [4]. Similarly, Jin et al. [43] show that perceived message quality mediates the effect of corrective communication.
Third, information processing and judgment style influence whether a receiver believes targeted information. Information processing and judgment style refer to how people think about specific issues. Greater analytic and systematic thinking reduces confidence in false rumors. For example, Pennycook and Rand [11] demonstrate that analytic thinking is related to the ability to discern between fake and real information. After analyzing headlines, they report that Cognitive Reflection Test scores are negatively correlated with the perceived accuracy of fake news, that is, relatively implausible headlines, and positively correlated with the perceived accuracy of fact-based real news. Moravec et al. [44] show that social media users spend more time considering headlines and, because confirmation bias is pervasive, are more likely to believe news headlines that align with their political opinions. Bago [45] confirms the classical account of reasoning (more deliberate thinking reduces belief in rumors): people discern poorly between true and false news headlines when they fail to deliberate and instead rely on intuition.
Fourth, individuals' abilities affect not only their interpretations of information but also their choices of information-processing methods [46]. Pennycook and Rand [11] demonstrate that cognitive ability is positively correlated with the ability to distinguish fake from real news. Self-efficacy, a strong belief in one's own abilities, also matters: when strong fear prompts behavioral change, people may draw on self-efficacy, which relates to the positive aspects of ability.
Hypothesis 1.
Source credibility, quality and quantity of information, and the receiver's ability decrease belief in fake news, whereas heuristic information processing increases it.

2.4. Risk Perception Factor

The risk perception paradigm, the so-called psychometric paradigm, focuses on the impacts of perceived risks and benefits, trust, knowledge, and stigma on rumors.
First, perceived risks and benefits are inversely related in that they have opposite effects. Because avoiding risk is a natural tendency, perceived risk has been noted as the primary factor affecting the acceptance of risk objects. The most common emotional response to the COVID-19 pandemic is fear. DiFonzo and Bordia [18] argue that rumors arise in threatening or potentially threatening situational contexts and when people feel a desperate need for security (p. 213). Based on survey data (N = 973) from Korea, Lee and You [47] report that the average perceived severity score of COVID-19 is higher than the perceived susceptibility; 48.6% of people said that the severity is "high," and 19.9% said that it is "very high." Dread and fear increase rumors, whereas perceived benefits decrease their negative effects [4]. However, Buchanan and Benson [39] report that personal risk propensity at the individual level does not influence the propagation of "fake" information.
Second, knowledge, as an element of enlightenment, can dispel suspicion, superstition, and falsehood. According to Chen and Chen [48], knowledge of fake news significantly affects consumers' ability to identify fake news and their subsequent brand trust. Moreover, persuasion knowledge of fake news mediates the relationship between self-efficacy and both the perceived diagnosis of fake news and brand trust. Krishna [49] demonstrates that knowledge-deficient, vaccine-negative people have higher levels of perceptions, motivations, and active communication behaviors about vaccines. Knowledge about the Internet or social media is expressed as information literacy, which is a necessary condition for reducing belief in fake news [50].
Third, when social capital is scarce in a society, negative rumors and fake news abound. According to Schulz et al. [51], the wide spread of false consensus and hostile media perceptions in a given society can clearly be linked to populist attitudes. Lewandowsky et al. [52] point out that phenomena such as fake news are the result of great political, economic, and social change; in particular, worsening economic inequality, increased polarization, fragmentation of the media landscape, and declining trust in science are important factors influencing belief in fake news. Finally, declining social capital contributes to the mass production of fake news. Social capital is a matter of trust, and viewing fake news is possibly associated with distrust, i.e., cynical attitudes [53]. Trust in information sources not only positively affects recipients' likelihood of sharing information elsewhere [54] but also affects their acceptance of information by fostering positive attitudes toward it [55]. Chen and Chen [48] report that media trust significantly predicts consumers' persuasion knowledge of fake news. According to Talwar et al. [56], online trust is negatively related to authenticating news before sharing it. Therefore, Kumar et al. [57] propose that forming trust communities would help users assess the credibility of information shared within such communities.
Fourth, stigma relates to the negative affective image of a specific object. The affective, emotional aspects of fake news play a critical role in its spread. Benková [58] (p. 16) argues that people tend to believe disinformation in the context of the post-truth era, in which belief in information rests more on emotions, feelings, and convictions than on objective facts. Generally, mass media is closely related to the general public's emotional responses. Kremer et al. [59] show that emotional states can be transferred to others via emotional contagion: when fewer positive emotional posts are produced, people share fewer positive posts and more negative posts, and when negative posts decrease, the opposite pattern occurs. Slovic et al. [60] show that negative images related to the siting of nuclear facilities have negative economic impacts on Las Vegas and Nevada. Fake news starts with the use of personally and emotionally targeted news produced by algo-journalism and so-called "empathic media" [61]. According to Bavel et al. [62], as negative emotions increase, people may rely on negative information about COVID-19 more than on other information to make decisions (p. 461). According to Paschen [63], the text body of fake news displays more negative emotions, such as disgust and anger, and fewer positive emotions, such as joy. Also, Sangalang et al. [64] show that narratives with emotional corrective endings are better at correcting attitudes than a simple corrective. Berduygina et al. [65] show that misinformation is more effective when society receives customized information that stimulates the expected emotional response of the target audience. Therefore, Gerbaudo [66] argues that more attention should be paid to the all-too-real emotions brought out through social media platforms than to the fake news itself.
Hypothesis 2.
Perceived benefit, trust, and knowledge decrease belief in fake news, whereas perceived risk and stigma increase it.

3. Method

This study uses survey data (n = 1525). The researchers designed the survey items, and Korea Research, a specialized public opinion polling firm, conducted the fieldwork, collecting responses through an online survey. The survey was conducted over seven days (6 August 2020–11 August 2020). Korea Research's online panel consists of 460,000 people; emails were sent to 9839 panelists, 2083 of them opened the emails, and 1525 people participated in the survey. To match the proportions of respondents to the distribution of the population in Korea, the survey used a quota sampling method based on region, gender, and age group within the population of Internet users aged 18 or older; the characteristic values in the sample thus correspond approximately to those in the whole population. We used an Internet-based survey because rumors about COVID-19 are mainly distributed through online media. Assuming random sampling, a sampling error of ±2.5% at a 95% confidence level was adopted.
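The reported sampling error follows directly from the sample size; a quick check of the ±2.5% figure, assuming simple random sampling and the worst-case proportion p = 0.5:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Margin of error for a proportion; z = 1.96 corresponds to ~95% confidence."""
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(1525)
print(f"+/- {moe:.1%}")  # -> +/- 2.5%
```

With n = 1525 the half-width of the 95% confidence interval is about 0.0251, matching the ±2.5% stated above.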
The analysis consists of three parts. First, we perform a simple frequency analysis of the measurement items for misinformation. Second, after classifying respondents into subgroups according to the main sociodemographic variables, we analyze whether the mean misinformation responses differ between them. Third, with misinformation as the dependent variable, we analyze how the variables in the risk communication and risk perception factors affect it.

4. Measurement

To measure belief in the fake news circulating as part of the infodemic related to COVID-19, we selected ten representative fake news items distributed in Korean society. These statements include "drinking alcoholic beverages can kill the coronavirus," "heat from a hair dryer kills the coronavirus," "eating garlic or applying it to your nose can help prevent coronavirus disease," "gargling with saltwater (if you rinse your mouth) can prevent coronavirus disease," "the heat from cigarettes can kill the coronavirus," "eating or touching wild animals spreads the coronavirus," "if a person infected with the coronavirus does not have a fever, that person cannot pass it on to others," "drinking warm water causes the virus to enter the stomach and then to be dissolved in stomach acid," "being in the sun prevents coronavirus disease," and "eating a lot of vitamins can protect you from coronavirus disease." The responses to these statements are measured on a five-point scale, where one point means that the statement is definitely not true and five points means that it is definitely true. To exclude respondents who could not judge a statement because they had not heard of it, those who responded "I do not know" were excluded from the analysis; the percentage of such responses ranged from a high of 5.2% (n = 79) to a low of 2.6% (n = 39). The reliability coefficient for the ten fake news items is 0.912, suggesting that the respondents have a consistent response structure to fake news.
The measures and reliability scores of the main independent variables are shown in Table 2. Except for source credibility, the variables are measured on a five-point scale, where one means strongly disagree and five means strongly agree. When multiple measurement items are combined into one variable, their mean is used. We check the reliability of the measurement items using Cronbach's α before calculating the average values. All of the coefficients are higher than 0.60.
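Cronbach's α, the reliability check used above, can be computed directly from item responses; a minimal sketch (the ratings below are made-up illustrative data, not the survey's):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-response columns.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    """
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Each respondent's total score across the k items.
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Three hypothetical five-point items answered by five respondents;
# the items co-vary strongly, so alpha is high.
items = [
    [1, 2, 2, 4, 5],
    [1, 2, 3, 4, 5],
    [2, 2, 3, 5, 5],
]
print(round(cronbach_alpha(items), 3))  # -> 0.982
```

Values above roughly 0.6–0.7 are conventionally taken to indicate acceptable internal consistency, which is the threshold applied in the paragraph above.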
Among the demographic variables, subjective social class, ideology, and partisanship are measured on a ten-point scale, where higher values indicate a higher social class, more progressive views, and greater partisanship (i.e., stronger support for the Moon Jae-in government in this study). In addition, the respondents' health statuses (i.e., "I am healthy" and "I am in good health compared to other people") and their health statuses after COVID-19 (i.e., "My physical health has worsened after coronavirus" and "My mental health has deteriorated after coronavirus") are measured on a five-point scale. In the former, a higher value means good health, while in the latter it means that health is deteriorating.

5. Analysis

Of the respondents (n = 1525), 47.9% are male (n = 731) and 52.1% are female (n = 794). By age, 16.7% are 18–29 years old (n = 254), 16.3% are 30–39 years old (n = 248), 19.6% are 40–49 years old (n = 299), 20.3% are 50–59 years old (n = 310), and 27.1% are over 60 years old (n = 414). By educational background, 47.2% (n = 720) of respondents are high school graduates or under, whereas 52.8% (n = 805) are enrolled in university or graduated. Finally, by monthly average household income, 15.5% (n = 236) of respondents earn less than 2 million won, 17.2% (n = 263) earn 2–3 million won, 21.4% (n = 326) earn 3–4 million won, 16.5% (n = 251) earn 4–5 million won, 10.5% (n = 160) earn 5–6 million won, 7.5% (n = 115) earn 6–7 million won, and 11.4% (n = 174) earn 7 million won or more.
Figure 2 shows the respondents’ beliefs about the ten fake news items as percentages. The overall trend in the responses suggests a high tendency to see fake news as not true. Fake news 7 (i.e., “if a person infected with the coronavirus does not have a fever, that person cannot pass it on to others”) has the lowest percentage of “not true” responses (47.3%), whereas fake news 5 (i.e., “the heat from cigarettes can kill the coronavirus”) has the highest percentage (89.0%). Fake news 10 (i.e., “eating a lot of vitamins can protect you from coronavirus disease”) shows the largest percentage of middle responses (25.0%), whereas fake news 5 has the smallest percentage (3.6%).
For statement 5, the rate of responding as not true is the highest, while the rate of responding as true is the lowest. These results suggest that when beliefs about the underlying facts are firm, belief in fake news weakens. The statement with the highest rate of “true” responses is fake news 6 (i.e., “eating or touching wild animals spreads the coronavirus”), and that with the lowest rate is fake news 5. In all cases, the percentage of respondents who answered that they did not know whether the fake news was true or not was below 6%. This result means that most people have been exposed to these fake news items and know about them. The fake news item with the highest rate of respondents not knowing about it is fake news 3 (i.e., “eating garlic or applying it to your nose can help prevent coronavirus disease”), whereas fake news 1 (i.e., “drinking alcoholic beverages can kill the coronavirus”) has the lowest rate.
Next, we calculate the average values of the responses to fake news for each demographic group to analyze the degree of belief in fake news by group. Higher scores indicate stronger beliefs in fake news. The mean of overall belief in fake news across all respondents is 1.82 out of 5 points. Since this value lies well below the scale midpoint, the respondents’ overall belief in fake news seems to be low.
From Figure 3, men tend to believe in fake news more than women do (F-value = 7.826, p-value = 0.005). In general, women react more sensitively to risk than men do, and this sensitivity to risk may act as a filter against fake news.
Higher age groups tend to believe more strongly in fake news, although the omnibus test falls short of significance at the 5% level (F-value = 1.918, p-value = 0.105). For older people, a coronavirus infection is more likely to be fatal. Paradoxically, this health risk leads to believing in fake news. These results confirm the results of prior studies. According to Grinberg et al. [26], the people who are most likely to engage with fake news sources are conservative leaning, older, and highly engaged with political news. However, based on a Flash Eurobarometer survey of 26,576 respondents across 28 European countries, Borges-Tiago [67] reports that younger people are more likely to recognize fake news and are consequently able to evaluate digital information sources without relying on fake news. When we conducted post-hoc comparisons for age using the Tukey, Duncan, Scheffé, and Bonferroni tests, we found a significant difference between respondents in their teens and those in their 60s.
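The group comparisons in this section pair an omnibus F-test with post-hoc pairwise tests. The following is a minimal sketch of that workflow using Bonferroni correction (one of the procedures named above); the group labels, means, and sample sizes are synthetic, not the survey’s actual values:

```python
from itertools import combinations
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# hypothetical belief-in-fake-news scores for three age groups
# (means and spreads are illustrative only)
groups = {
    "18-29": rng.normal(1.7, 0.5, 200),
    "40-49": rng.normal(1.8, 0.5, 200),
    "60+":   rng.normal(2.0, 0.5, 200),
}

# omnibus one-way ANOVA, as in the group comparisons above
f_stat, p_omnibus = stats.f_oneway(*groups.values())

# Bonferroni-corrected pairwise t-tests as a simple post-hoc procedure
m = len(groups) * (len(groups) - 1) // 2   # number of pairwise tests
posthoc = {}
for a, b in combinations(groups, 2):
    t, p_raw = stats.ttest_ind(groups[a], groups[b])
    posthoc[(a, b)] = min(1.0, p_raw * m)  # adjusted p-value
```

With these simulated data, the youngest and oldest groups differ significantly after correction, mirroring the pattern reported for the survey.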
Respondents with a high school education or less place more trust in fake news than those with a college education or higher (F-value = 6.446, p-value = 0.011). Corbu et al. [68] show that education is one of the important predictors of the third-person effect related to fake news detection.
Higher income groups also show lower belief in fake news, although this difference is not statistically significant (F-value = 0.950, p-value = 0.387). Because education and income both serve as resources for defending against risk, having more of these resources can lead to more thorough interpretations of fake news. On the other hand, across subjectively perceived social class groups, respondents who place themselves in a higher social class exhibit stronger beliefs in fake news than those who place themselves in a lower class (F-value = 21.763, p-value = 0.000). This result contradicts the previous result for income and suggests that objective income and subjective social class are fundamentally different attributes.
The ideologically progressive and conservative groups show little difference in belief in fake news (F-value = 0.186, p-value = 0.667). Axt et al. [69] show that political orientation moderates the relationship between the need for structure and belief in intentional deception. However, according to Pennycook and Rand [19], the ability to discern real news from fake news is unrelated to how closely a headline matches the participant’s ideology. In addition, partisanship (support for the Moon Jae-in administration) does not affect belief in fake news (F-value = 0.023, p-value = 0.880). These results suggest that belief in fake news may be distinct from political beliefs.
Respondents who know someone who has tested positive for COVID-19 believe more strongly in fake news than those who do not (F-value = 5.229, p-value = 0.022). This result implies that the fear associated with knowing an infected person strengthens belief in fake news. Alternatively, the same result may arise because people who believe in fake news are more likely to infect those around them.
By health status, belief in fake news is higher among healthy respondents than among unhealthy ones (F-value = 4.245, p-value = 0.040). Misinformation is disseminated more quickly through social media when public information about issues such as health crises is released [70]. These results show that health status has a significant positive effect on belief in fake news. Interestingly, those whose health has worsened since COVID-19 believe in fake news more strongly than those who have maintained good health (F-value = 33.099, p-value = 0.000). This result suggests that the impact of COVID-19 depends more on changes in health than on baseline health status.
In the above results, the statistically significant factors are gender, education, social class, knowing someone with a confirmed case of COVID-19, health status, and change in health status after COVID-19. Men, people with less education, people in higher social classes, people who know a confirmed case, people in better health, and people with worse health after COVID-19 all have stronger beliefs in fake news.
We next analyze Pearson simple correlations to understand the relationships between the variables, and we show the results in Table 3.
In terms of demographic variables, age, social class, knowing someone with a confirmed case, health status, and worsening health after COVID-19 have a positive relationship with belief in fake news, whereas gender and education level have a negative relationship with it.
Among the risk perception paradigm variables, belief in fake news is positively related to perceived risk, knowledge, and stigma, and negatively related to perceived benefit and trust. It is generally accepted that knowledge plays a role in grasping truth and facts. However, this study finds that knowledge about fake news is positively related to belief in fake news; that is, knowledge serves to further confirm fake news. In the post-coronavirus era, knowledge may not serve its traditional role. One interpretation of this counterproductive role of knowledge is that, because participants count believed fake news as knowledge, they are unreliable reporters of their own knowledge.
Of the five risk perception paradigm variables correlated with belief in fake news, stigma has the highest correlation, followed by perceived risk, perceived benefit, trust, and knowledge. These results suggest that people’s emotional images play an important role in relation to COVID-19. Thus, when approaching the problem of fake news, there may be certain limitations to using a rational framework. Among the communication factors, the amount of information and heuristic thinking are positively related to belief in fake news, whereas the credibility of the information source, the quality of information, and the receiver’s ability are negatively related to such beliefs. The positive relationship between the amount of information and belief in fake news is notable, as it means that much of the information that individuals encounter may contain elements that reinforce fake news. Among the risk communication variables, the variable with the highest correlation with belief in fake news is heuristic thinking, followed by the amount of information, trust in information sources, and the quality of information.
Among the ten variables, the one that is most correlated with belief in fake news is stigma, followed by heuristic thinking. These results suggest that belief in fake news is closely related not only to perception factors but also to communication factors.
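A correlation screen like the one in Table 3 can be reproduced with a single call once the variables are arranged column-wise. This sketch uses simulated variables whose signs merely mirror the reported pattern (stigma and heuristic thinking positive, trust negative); all values are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1525  # matches the survey sample size, data themselves are simulated

# hypothetical standardized predictors
stigma = rng.normal(size=n)
heuristic = rng.normal(size=n)
trust = rng.normal(size=n)

# simulated belief score loading positively on stigma and heuristic
# thinking and negatively on trust, as in Table 3
belief = 0.4 * stigma + 0.3 * heuristic - 0.2 * trust + rng.normal(size=n)

data = np.column_stack([belief, stigma, heuristic, trust])
corr = np.corrcoef(data, rowvar=False)   # Pearson correlation matrix
```

Row 0 of `corr` then gives the pairwise correlations of belief with each predictor, analogous to the first column of Table 3.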
We next perform a regression analysis to examine the influence of variables that affect belief in fake news, and the results are shown in Table 4. The dependent variable is the degree of belief in fake news, and the regression includes 20 independent variables. We perform a multicollinearity diagnosis for the regression analysis and find that, for all variables, the tolerance is 0.1 or more and the variance inflation factor is less than 10, indicating that there is no multicollinearity. The Durbin-Watson value is 2.082, within the reference range of 1 to 3, indicating that there is no problem with residual independence. To determine the effects of the predictors, we control for gender, age, education, household income, perceived social class, ideology, political partisanship, and whether the respondent knows anyone who was infected with coronavirus disease.
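The multicollinearity and residual-independence checks reported above (tolerance, VIF, Durbin-Watson) require nothing more than least squares. The following is a sketch on synthetic predictors (the data are simulated, not the survey’s):

```python
import numpy as np

def vif(X):
    """Variance inflation factor of each column: regress it on the others."""
    n, k = X.shape
    out = np.empty(k)
    for j in range(k):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        r2 = 1 - ((y - Z @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()
        out[j] = 1.0 / (1.0 - r2)
    return out

def durbin_watson(resid):
    """Durbin-Watson statistic; values near 2 suggest independent residuals."""
    resid = np.asarray(resid, dtype=float)
    return float(np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2))

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 3))              # three nearly independent predictors
vifs = vif(X)                              # all close to 1
tolerance = 1.0 / vifs                     # tolerance is the reciprocal of VIF

# adding a near-linear combination of two columns inflates its VIF
X_coll = np.column_stack([X, X[:, 0] + X[:, 1] + 0.1 * rng.normal(size=500)])
vifs_coll = vif(X_coll)

dw = durbin_watson(rng.normal(size=500))   # white-noise residuals give dw near 2
```

The paper’s criteria (tolerance ≥ 0.1, VIF < 10, Durbin-Watson between 1 and 3) correspond directly to checks on `tolerance`, `vifs`, and `dw`.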
Among the demographic variables, age, social class, partisanship, acquaintance with someone with a confirmed case of COVID-19, health status, and post-COVID-19 health status have positive effects on belief in fake news, whereas gender, education level, income, and ideology have negative effects. Remarkably, although health status and change in health status after COVID-19 are both related to health, they operate differently: better baseline health strengthens belief in fake news, yet so does worsening health after COVID-19. These seemingly conflicting roles of the two health variables suggest that which aspect of health is measured matters.
Second, political ideology does not affect fake news, whereas partisanship has a significant effect. Currently, the higher a respondent’s level of support for the Moon Jae-in government, the stronger the respondent’s belief in fake news. Krouwel et al. [71] study the relationship between ideologies and trust in conspiracy theories related to rumors. They demonstrate that more extreme ideologies are associated with stronger tendencies to trust conspiracy theories. However, our study does not find that ideology plays a significant role.
Based on standardized beta values, the demographic variable that has the greatest impact on belief in fake news is health status. In addition, social class, age, and partisanship have relatively large effects. Conversely, education level and health status after COVID-19 have relatively small impacts. Interestingly, health status and change in health status after COVID-19 both affect belief in fake news. The former has a greater impact on belief in fake news than the latter. These results suggest that one’s normal health status, rather than the change in one’s health status, has a more decisive effect on the acceptance of disinformation.
Next, among the risk perception paradigm variables, perceived risk and stigma positively affect belief in fake news, and perceived benefit, trust, and knowledge negatively affect such beliefs. However, the influence of knowledge is not statistically significant. Based on the sizes of the standardized coefficients, stigma has the greatest influence, followed by perceived benefit, perceived risk, and trust. As mentioned earlier, belief in fake news is related to emotional factors. Moreover, perceived risk plays a significant role, as in Hajli and Lin’s [72] study, which shows that users’ perceptions of privacy risk in social networks impact their attitudes toward sharing information online. Those who view information sharing as riskier have less positive attitudes toward it.
Lastly, among the risk communication factors, the amount of information and heuristic thinking have positive effects on belief in fake news, and the credibility of the information source has a negative effect. Our findings confirm Buchanan and Benson’s [39] finding that “fake news” items coming from trusted sources are more likely to be propagated by the message recipients. The quality of information and the receiver’s ability do not have statistically significant effects. It is noteworthy that the quantity of information is more important than its quality. Although the government tries to provide objective and scientific information to improve its quality, it does not significantly affect belief in fake news.
Based on the standardized beta values, heuristic thinking has the greatest influence on fake news, followed, in order, by the amount of information and the credibility of the information source. The fact that recipients accept fake news based on heuristic rather than systematic processing suggests that when delivering information to the public, intuitive appeal is more important than elaborate or rational systematic approaches.
Of the twenty independent variables, those with the greatest influence, in order, are stigma, heuristic processing, health state, perceived benefit, and social class. Because risk perception factors, risk communication factors, and demographic factors are all represented among these variables, it follows that belief in fake news is a result of a combination of various factors rather than simply being influenced by a specific factor.
Examining the F-value shows that the full model has a good statistical fit. However, the coefficient of determination (i.e., the R-squared value), which represents the explanatory power of the overall model, is 23.0%, suggesting that it is necessary to identify more influential variables.
To determine each factor’s degree of explanatory power, we performed regression analysis for the variables related to each factor and checked the statistical significance and coefficient of determination for each model. The explanatory power is 4.1% for demographic factors, 16.6% for risk perception factors, and 7.7% for risk communication factors. This result suggests that risk perception factors play a very important role in belief in fake news. In comparison with the work of Halpern et al. [9], who found that personal and political–psychological factors are more relevant in explaining this behavior than specific uses of social media are, we highlight the role of risk perception factors.
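The block-wise comparison of explanatory power amounts to fitting one regression per factor block and comparing R-squared values. Below is a sketch with simulated blocks, where the outcome is constructed to load most heavily on the perception block, echoing the 4.1% / 16.6% / 7.7% pattern reported above (all data and coefficients are synthetic):

```python
import numpy as np

def r_squared(X, y):
    """R-squared of an OLS fit with intercept."""
    Z = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    return 1 - (resid ** 2).sum() / ((y - y.mean()) ** 2).sum()

rng = np.random.default_rng(4)
n = 1525                                   # survey sample size, data simulated
demo = rng.normal(size=(n, 4))             # hypothetical demographic block
percep = rng.normal(size=(n, 5))           # hypothetical risk-perception block
comm = rng.normal(size=(n, 5))             # hypothetical communication block

# simulated outcome that loads most heavily on the perception block
y = (0.1 * demo[:, 0]
     + percep @ np.array([0.3, 0.2, 0.2, 0.1, 0.1])
     + 0.15 * comm[:, 0]
     + rng.normal(size=n))

blocks = {"demographic": demo, "perception": percep, "communication": comm}
r2 = {name: r_squared(X, y) for name, X in blocks.items()}
```

Comparing the entries of `r2` reproduces the logic of the factor-wise comparison: the block that generates most of the signal attains the highest coefficient of determination.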
In sum, age, social class, partisanship, knowing someone with a confirmed COVID-19 case, health state, health after COVID-19, perceived risk, stigma, the amount of information, and heuristic processing positively influence belief in fake news. Gender (being female), income, perceived benefit, trust, and the credibility of the information source have negative effects. Second, the influence of these variables can be ranked in the order of stigma, heuristic processing, health state, perceived benefit, and social class. Because all three sets of factors play a part in belief in fake news, public policy must simultaneously consider perceptual, communication, and demographic factors. Third, because the psychometric paradigm variables have the greatest influence among the three factors, policies must strategically consider these factors. Fourth, because the current model explains only 23.0% of the variance in the dependent variable, additional variables are needed to explain belief in fake news.

6. Discussion

Our analysis shows that structural factors determine belief in fake news. How, then, should we respond to the infodemic related to COVID-19? According to the WHO [73], an infodemic cannot be eliminated, but it can be managed. To respond effectively to infodemics, the WHO [73] calls for the adaptation, development, validation, and evaluation of new evidence-based measures and practices to prevent, detect, and respond to mis- and disinformation. Rubin [74] proposes three types of interventions (automation, education, and regulation) to combat fake news. The practical implications of this study are as follows.
How can we help individuals respond effectively to the infodemic? Misinformation and disinformation in the health space are thriving, including for COVID-19. It is important that people rely only on authoritative sources for updated information on the COVID-19 outbreak. The EU suggests that individuals follow the advice of public health authorities and the websites of relevant national and international organizations, for example, the ECDC (European Centre for Disease Prevention and Control) and the WHO.
A variety of support is needed to respond to fake news at the individual level. For example, PAHO [2] has considered how people can help in the fight against the COVID-19 infodemic, emphasizing a more transparent and open information-sharing culture. Moreover, education is a good instrument for countering fake news. Roozenbeek and van der Linden [75] suggest that educational games may be a promising vehicle for inoculating the public against fake news. In response to fake news, it is necessary to provide additional information, refute false claims, and continuously verify information. Based on experimental studies, Lee and Chen [76] show that post-event information, regardless of whether it contains a misleading item, increases the recall of correct information and reduces false recall. Also, Huang [77] shows that rebuttals generally reduce people’s belief in the specific content of rumors. Similarly, Vafeiadis et al. [78] show that attacking the source of fake news decreases the message’s credibility more than denying the fake news does. According to Jahng [79], key practices of information verification include (1) looking for original sources and relying on trusted networks, (2) verifying through multiple sources, and (3) consulting others or crowdsourcing.
To support individuals, a more systematic analysis of, and approach to, the mechanisms behind the operation of fake news is needed. For example, Cinelli et al. [80] identify information spreading from questionable sources, finding different volumes of misinformation on each platform; however, they find no marked difference between the spreading patterns of information from reliable and from questionable sources. In particular, the current crisis highlights the need to think about future pandemics from a population-based management approach [81]. Also, according to Ciampaglia [82] (p. 147), computational social scientists, by their very nature, can play a two-fold role in the fight against fake news. First, they can elucidate the fundamental mechanisms that make people vulnerable to misinformation online, and, second, they can devise effective strategies to counteract misinformation. Bakir and McStay [61] recommend paying greater attention to the role of digital advertising in causing and combating both the contemporary fake news phenomenon and the near-horizon variant of empathically optimized automated fake news. In terms of information technology, Lewandowsky et al. [54] suggest that responses to fake news must involve technological solutions incorporating psychological principles and an interdisciplinary approach, that is, techno-cognitive solutions. Burkle et al. [83] recommend utilizing a new multi-disciplinary model for population-based management (PBM) supported by a GPH (Global Public Health) database. Information and communication techniques that can functionally control fake news should be developed. Zhang et al. [84] propose analytic technological tools that can detect fake news to reduce misinformation risks. In terms of theory, the WHO [73] suggests developing “infodemiology,” defined as the science of managing infodemics.
In order to respond effectively to COVID-19, cooperation with related organizations is required. To help fight disinformation, the EU works in close cooperation with online platforms and has been actively tackling disinformation since 2015. Following a decision of the European Council in March 2015, the East StratCom Task Force was set up in the European External Action Service (EEAS). Also, in December 2018, the EU launched an Action Plan against Disinformation based on principles of transparency and accountability. The EU shares information about fake news through established channels, such as the Rapid Alert System and the EU integrated political crisis response. To fight fake news, the European Commission cooperates with the European External Action Service and with international partners, including the WHO, the G7 Rapid Response Mechanism, NATO, and others. Together, they promote authoritative sources, demote content that is fact-checked as false or misleading, and take down illegal content or content that could cause physical harm. Google, Facebook, Twitter, and Mozilla signed onto an EU voluntary Code of Practice, which contains standards for fighting disinformation [85]. In the USA, the Defense Department works with the State Department, allies, partners, and other agencies to fight disinformation. Also, the United States Agency for Global Media (USAGM), the Cybersecurity and Infrastructure Security Agency (CISA), and the State Department’s Global Engagement Center are responsible for handling disinformation; CISA, for example, distributes a COVID-19 disinformation toolkit.

7. Implications and Limitations

This study suggests that risk communication and risk perception factors must be managed simultaneously in order to cope with the infodemic related to COVID-19. Considering risk perception requires an understanding of people’s perceptions and cognitive structures, whereas considering risk communication requires analysis of the transmission, reception, and message content of socially structured information.
Also, our analysis shows that stigma and heuristic thinking increase belief in fake news. These results suggest that the public’s understanding of fake news is not based on rational and systematic information processing. Thus, when the government communicates with the public, it needs a communication strategy based on a more humane and emotional approach rather than a simple, mechanical, fact-based one.
Our study has several limitations. First, the explanatory power of the model used in this study is 23.0%, which seems low. Further studies are needed to find more significant variables and models. To increase the likelihood of successfully managing future events related to COVID-19, Khorram-Manesh et al. [86] suggest that it is important to consider preexisting health security, valid population-based management approaches, medical decision-making, communication, continuous assessment, triage, treatment, early and complete physical distancing strategies, and logistics. These various variables could be considered as predictors of belief in fake news. Also, although personality is an important variable, this study overlooked it. People with personality traits such as being closed to experience or cautious, introverted, disagreeable or unsympathetic, unconscientious or undirected, and emotionally stable are better able to distinguish disinformation [87]. Second, because we focused only on perception and communication, we disregarded other valuable factors. Because values, politics, and culture matter in causal models [88,89,90,91,92,93,94], their roles need to be examined. Third, although structural context and resources have explanatory power [95,96,97,98,99,100,101,102], we did not consider them when we set up the causal model. In particular, the national context produces differences in both the degree of belief in fake news and the type of fake news that is believed. Humprecht [103] shows that the largest shares of partisan disinformation are found in the US and the UK, while sensationalist stories prevail in Germany and Austria. The variables and models overlooked in this study should be examined in future studies. Also, since the causal model linking attitude, perception, and behavior is equally important [104], this causality should be verified in future work.
Fourth, since our study focused on the rational dimension of knowledge, we dismissed its emotional attributes. However, knowledge has emotional characteristics. Bratianu [105] suggests three kinds of knowledge: rational, emotional, and spiritual. The failure to consider the emotional aspect of knowledge is one of the limitations of this study. Fifth, fake news is closely related to fact-based or related news. For example, Guo [106] showed that websites spreading misinformation had a close relationship with fact-based media in covering Trump, but not in covering Clinton. Finally, we did not test moderators and mediators of belief in fake news. Although our study shows the great influence of emotion on belief in fake news, we did not test its role as a moderator or mediator. Relatedly, Myrick and Erlichman [107] show that audience involvement processes and social norm perceptions influence subsequent emotional and social–cognitive reactions, which in turn influence openness to celebrity-based nutrition misinformation.

8. Conclusions

This study showed that both risk perception and risk communication factors have significant impacts on belief in fake news related to COVID-19. The findings can be summarized as follows.
First, in the analysis of mean values, we found that men, rather than women, and older individuals have higher confidence in false information. Respondents with less than a university education had higher confidence in rumors than those who attended university or beyond. Lower-income respondents accepted disinformation at a higher rate. Interestingly, we found that respondents in higher subjective social classes were relatively more likely to believe fake news.
Second, in the regression analysis, we found that, among the demographic variables, women do not believe in false information as much as men do, and the degree of confidence in fake news increases with age. Respondents who support the Moon Jae-in administration have higher confidence in fake information, so political partisanship seems to be related to trust in rumors. Health factors, such as being in good health, positively impact trust in rumors. Interestingly, however, experiencing deteriorating health after coronavirus also increases confidence in false information.
Among the risk perception factors, perceived risk, perceived benefit, social trust, and stigma have significant effects. Perceived risk and stigma strengthen trust in false information, whereas perceived benefit and social trust weaken it. Among the risk communication factors, the credibility of the information source, the amount of information, and heuristic processing are influential. When the information source is more credible, belief in misinformation decreases, and when heuristic rather than systematic processing is used, belief in false information related to COVID-19 increases. In relation to hypothesis testing, the amount of information works in the direction opposite to that hypothesized: we expected that a greater amount of information would weaken belief in fake news, but the analysis shows that it strengthens it.
Third, stigma has the greatest explanatory power among the variables, followed by heuristic information processing, health status, perceived benefit, and subjective social class.

Author Contributions

Formal analysis and writing—original draft preparation, S.K. (Seoyong Kim); conceptualization, S.K. (Sunhee Kim); methodology, S.K. (Sunhee Kim); writing—review and editing, S.K. (Sunhee Kim). All authors have read and agreed to the published version of the manuscript.

Funding

This research received external funding from the Ministry of Education of the Republic of Korea and the National Research Foundation of Korea (NRF-2018S1A3A2075609) and from Ajou University.

Acknowledgments

This work was supported by the Ministry of Education of the Republic of Korea and the National Research Foundation of Korea (NRF-2018S1A3A2075609). This research was also supported by a research grant from Ajou University.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Kawasaki, A.; Meguro, K.; Hener, M. Comparing the disaster information gathering behavior and post-disaster actions of Japanese and foreigners in the Kanto area after the 2011 Tohoku Earthquake. In Proceedings of the 2012 WCEE, Lisbon, Portugal, 24–28 September 2012; Available online: http://www.iitk.ac.in/nicee/wcee/article/WCEE2012_2649.pdf (accessed on 11 November 2012).
  2. PAHO (Pan American Health Organization). Understanding the Infodemic and Misinformation in the Fight Against COVID-19. Available online: https://www.paho.org/en/documents/understanding-infodemic-and-misinformation-fight-against-covid-19 (accessed on 12 August 2020).
  3. MOHW (Ministry of Health and Welfare). COVID-19. Fact & Issue Check. Available online: http://ncov.mohw.go.kr/factBoardList.do (accessed on 11 November 2020).
  4. Kim, S.; Kim, S. Impact of the Fukushima nuclear accident on belief in rumors: The role of risk perception and communication. Sustainability 2017, 9, 2188. [Google Scholar] [CrossRef] [Green Version]
  5. Fernandes, C.M.; Montuori, C. The misinformation network and health at risk: An analysis of fake news included in ‘The 10 reasons why you shouldn’t vaccinate your child’. RECIIS 2020, 14. [Google Scholar] [CrossRef]
  6. Jacob, B.; Mawson, A.R.; Payton, M.; Guignard, J.C. Disaster mythology and fact: Hurricane Katrina and social attachment. Public Health Rep. 2008, 123, 555–566. [Google Scholar] [CrossRef] [PubMed]
  7. Vosoughi, S.; Roy, D.; Aral, S. The spread of true and false news online. Science 2018, 359, 1146–1151. [Google Scholar] [CrossRef]
  8. Moscadelli, A.; Albora, G.; Biamonte, M.A.; Giorgetti, D.; Innocenzio, M.; Paoli, S.; Lorini, C.; Bonanni, P.; Bonaccorsi, G. Fake news and Covid-19 in Italy: Results of a quantitative observational study. Int. J. Environ. Res. Public Health 2020, 17, 5850. [Google Scholar] [CrossRef]
  9. Halpern, D.; Valenzuela, S.; Katz, J.; Miranda, J.P. From belief in conspiracy theories to trust in others: Which factors influence exposure, believing and sharing fake news. Gabriele Meiselwitz, 2019. In Social Computing and Social Media. Design, Human Behavior and Analytics. HCII 2019, Lecture Notes in Computer Science; Meiselwitz, G., Ed.; Springer: Cham, Switzerland, 2019; Volume 11578. [Google Scholar] [CrossRef]
  10. Lazer, D.; Baum, M.A.; Benkler, Y.; Berinsky, A.J.; Greenhill, K.M.; Menczer, F.; Metzger, M.J.; Nyhan, B.; Pennycook, G.; Rothschild, D.; et al. The science of fake news. Science 2018, 359, 1094–1096. [Google Scholar] [CrossRef]
  11. Rand, D.G.; Rand, D.G. Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition 2019, 188, 39–50. [Google Scholar] [CrossRef]
  12. Knapp, A. Psychology of rumor. Public Opin. Q. 1944, 8, 22–37. [Google Scholar] [CrossRef] [Green Version]
  13. Rosnow, R.L. Psychology of rumor reconsidered. Psychol. Bull. 1980, 87, 578–591. [Google Scholar] [CrossRef]
  14. Di Fonzo, N.; Bordia, P. Rumor Psychology: Social and Organizational Approaches; American Psychological Association: Washington, DC, USA, 2007. [Google Scholar]
  15. Allport, G.W.; Postman, L.J. The Psychology of Rumor; Henry Holt: New York, NY, USA, 1947. [Google Scholar]
  16. House of Commons. Disinformation and “Fake News”: Interim Report: Government Response to the Committee’s Fifth Report of Session 2017–2019. House of Commons. 2018. Available online: https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/1630/1630.pdf (accessed on 11 November 2020).
  17. Jung, D.; Kim, S. Response to risky society and searching for new governance: An analysis of the effects of value, perception, communication, and resource factors on the belief in rumors about particulate matter. Korean J. Public Adm. 2020, 58, 1–36. [Google Scholar] [CrossRef]
  18. DiFonzo, N.; Bordia, P. Rumor, gossip and urban legends. Diogenes 2007, 54, 19–35. [Google Scholar] [CrossRef]
  19. Tandoc, E.C.; Lim, Z.W.; Ling, R. Defining “Fake News”. Digit. Journal. 2017, 6, 137–153. [Google Scholar] [CrossRef]
  20. Rowan, R. Where did that rumor come from? Fortune Mag. Arch. 1979, 100, 130–137. [Google Scholar]
  21. Kimmel, A.J. Rumors and Rumor Control: A Manager’s Guide to Understanding and Combating Rumors; Routledge: London, UK, 2013. [Google Scholar]
  22. Fearn-Banks, K. Crisis Communications: A Casebook Approach; Routledge: London, UK, 1996. [Google Scholar]
  23. Pew Research Center. The Future of Truth and Misinformation Online. 2017. Available online: https://www.pewresearch.org/internet/2017/10/19/the-future-of-truth-and-misinformation-online/ (accessed on 11 November 2020).
  24. Lefevere, J.; De Swert, K.; Walgrave, S. Effects of popular exemplars in television news. Commun. Res. 2012, 39, 103–119. [Google Scholar] [CrossRef]
  25. Vargo, C.J.; Guo, L.; Amazeen, M.A. The agenda-setting power of fake news: A big data analysis of the online media landscape from 2014 to 2016. New Media Soc. 2018, 20, 2028–2049. [Google Scholar] [CrossRef] [Green Version]
  26. Grinberg, N.; Joseph, K.; Friedland, L.; Swire-Thompson, B.; Lazer, D. Fake news on twitter during the 2016 U.S. presidential election. Science 2019, 363, 374–378. [Google Scholar] [CrossRef]
  27. Guess, A.M.; Nagler, J.; Tucker, J.A. Less than you think: Prevalence and predictors of fake news dissemination on Facebook. Sci. Adv. 2019, 5, eaau4586. [Google Scholar] [CrossRef] [Green Version]
  28. Institute for Basic Science (IBS). 2020. Available online: https://www.ibs.re.kr/cop/bbs/BBSMSTR_000000000971/selectBoardArticle.do?nttId=18985 (accessed on 11 November 2020).
  29. Allport, G.W.; Postman, L.J. An analysis of rumor. Public Opin. Q. 1947, 10, 501–517. [Google Scholar] [CrossRef]
  30. Rosnow, R.L. Inside Rumor: A personal journey. Am. Psychol. 1991, 46, 484–496. [Google Scholar] [CrossRef]
  31. Powell, D.; Leiss, W. Mad Cows and Mother’s Milk: Case Studies in Risk Communication; McGill-Queen’s University Press: Montreal, QC, Canada, 1997. [Google Scholar]
  32. Shannon, C.E.; Weaver, W. The Mathematical Theory of Communication; University of Illinois Press: Urbana, IL, USA, 1949. [Google Scholar]
  33. Kasperson, R.E.; Renn, O.; Slovic, P.; Brown, H.S.; Emel, J.; Goble, R.; Kasperson, J.X.; Ratick, S. The social amplification of risk: A conceptual framework. Risk Anal. 1988, 8, 177–187. [Google Scholar] [CrossRef] [Green Version]
  34. Sjöberg, L. The methodology of risk perception research. Qual. Quant. 2000, 34, 407–418. [Google Scholar] [CrossRef]
  35. Slovic, P. The Perception of Risk; Routledge: London, UK, 2000. [Google Scholar]
  36. Fischhoff, B.; Slovic, P.; Lichtenstein, S.; Read, S.; Combs, B. How safe is safe enough? A psychometric study of attitudes towards technological risks and benefits. Policy Sci. 1978, 9, 127–152. [Google Scholar] [CrossRef]
  37. Porter, E.G. Birth control discontinuance as a diffusion process. Stud. Fam. Plan. 1984, 15, 20. [Google Scholar] [CrossRef]
  38. Liang, C.; Chou, W.-S.; Hsu, Y.-L. The factors of influencing college student’s belief in consumption-type internet rumors. Int. J. Cyber Soc. Educ. 2009, 2, 37–46. [Google Scholar]
  39. Buchanan, T.; Benson, V. Spreading disinformation on facebook: Do trust in message source, risk propensity, or personality affect the organic reach of “fake news”? Soc. Media Soc. 2019, 5, 87–96. [Google Scholar] [CrossRef] [Green Version]
  40. Lampinen, J.M.; Smith, V.L. The incredible (and sometimes incredulous) child witness: Child eyewitnesses’ sensitivity to source credibility cues. J. Appl. Psychol. 1995, 80, 621–627. [Google Scholar] [CrossRef]
  41. Visentin, M.; Pizzi, G.; Pichierri, M. Fake news, real problems for brands: The impact of content truthfulness and source credibility on consumers’ behavioral intentions toward the advertised brands. J. Interact. Mark. 2019, 45, 99–112. [Google Scholar] [CrossRef]
  42. Kim, A.; Dennis, A.R. Says Who? The Effects of Presentation Format and Source Rating on Fake News in Social Media. MIS Q. 2019, 43, 1025–1039. [Google Scholar] [CrossRef]
  43. Jin, Y.; Van Der Meer, T.G.L.A.; Lee, Y.-I.; Lu, X. The effects of corrective communication and employee backup on the effectiveness of fighting crisis misinformation. Public Relat. Rev. 2020, 46, 101910. [Google Scholar] [CrossRef]
  44. Moravec, P.; Minas, R.; Dennis, A.R. Fake news on social media: People believe what they want to believe when it makes no sense at all. MIS Q. 2019, 43, 1343–1360. [Google Scholar] [CrossRef]
  45. Bago, B.; Rand, D.G.; Pennycook, G. Fake news, fast and slow: Deliberation reduces belief in false (but not true) news headlines. J. Exp. Psychol. Gen. 2020, 149, 1608–1613. [Google Scholar] [CrossRef] [Green Version]
  46. Trumbo, C.W. Heuristic-systematic information processing and risk judgment. Risk Anal. 1999, 19, 391–400. [Google Scholar] [CrossRef] [PubMed]
  47. Lee, M.; You, M. Psychological and behavioral responses in South Korea during the early stages of coronavirus disease 2019 (COVID-19). Int. J. Environ. Res. Public Health 2020, 17, 2977. [Google Scholar] [CrossRef] [PubMed]
  48. Chen, Z.F.; Cheng, Y. Consumer response to fake news about brands on social media: The effects of self-efficacy, media trust, and persuasion knowledge on brand trust. J. Prod. Brand Manag. 2019, 29, 188–198. [Google Scholar] [CrossRef]
  49. Krishna, A. Motivation with misinformation: Conceptualizing lacuna individuals and publics as knowledge-deficient, issue-negative activists. J. Public Relat. Res. 2017, 29, 176–193. [Google Scholar] [CrossRef]
  50. De Paor, S.; Heravi, B.R. Information literacy and fake news: How the field of librarianship can help combat the epidemic of fake news. J. Acad. Libr. 2020, 46, 102218. [Google Scholar] [CrossRef]
  51. Schulz, A.; Wirth, W.; Müller, P. We are the people and you are fake news: A social identity approach to populist citizens’ false consensus and hostile media perceptions. Commun. Res. 2020, 47, 201–226. [Google Scholar] [CrossRef] [Green Version]
  52. Lewandowsky, S.; Ecker, U.K.; Cook, J. Beyond misinformation: Understanding and coping with the “Post-Truth” Era. J. Appl. Res. Mem. Cogn. 2017, 6, 353–369. [Google Scholar] [CrossRef] [Green Version]
  53. Balmas, M. When fake news becomes real: Combined exposure to multiple news sources and political attitudes of inefficacy, alienation, and cynicism. Commun. Res. 2014, 41, 430–454. [Google Scholar] [CrossRef]
  54. Shin, S.Y.; Lee, B.J.; Cha, S.M. Impact of online restaurant information WOM characteristics on the effect of WOM: Focusing on the mediating role of source credibility. Prev. Nutr. Food Sci. 2011, 24, 217–225. [Google Scholar] [CrossRef] [Green Version]
  55. Yun, H.-J.; Ahn, S.-H.; Lee, C.-C. Determinants of trust in power blogs and their effect on purchase intention. J. Korea Contents Assoc. 2012, 12, 411–419. [Google Scholar] [CrossRef] [Green Version]
  56. Talwar, S.; Dhir, A.; Kaur, P.; Zafar, N.; Arlasheedi, M.A. Why do people share fake news? Associations between the dark side of social media use and fake news sharing behavior. J. Retail. Consum. Serv. 2019, 51, 72–82. [Google Scholar] [CrossRef]
  57. Kumar, K.P.K.; Srivastava, A.; Geethakumari, G. A psychometric analysis of information propagation in online social networks using latent trait theory. Computing 2016, 98, 583–607. [Google Scholar] [CrossRef]
  58. Benková, Z. The spread of disinformation: Why do people believe them and how to combat them. The importance of media literacy. Mark. Identity 2018, 6 Pt 2, 16–27. [Google Scholar]
  59. Kramer, A.; Guillory, J.; Hancock, J. Experimental evidence of massive-scale emotional contagion through social networks. Proc. Natl. Acad. Sci. USA 2014, 111, 8788–8790. [Google Scholar] [CrossRef] [Green Version]
  60. Slovic, P.; Layman, M.; Kraus, N.; Flynn, J.; Chalmers, J.; Gesell, G. Perceived risk, stigma, and potential economic impacts of a high-level nuclear waste repository in Nevada. Risk Anal. 1991, 11, 683–696. [Google Scholar] [CrossRef] [Green Version]
  61. Bakir, V.; McStay, A. Fake news and the economy of emotions. Digit. Journal. 2018, 6, 154–175. [Google Scholar] [CrossRef]
  62. Van Bavel, J.J.; Baicker, K.; Boggio, P.S.; Capraro, V.; Cichocka, A.; Cikara, M.; Crockett, M.J.; Crum, A.J.; Douglas, K.M.; Druckman, J.N.; et al. Using social and behavioural science to support COVID-19 pandemic response. Nat. Hum. Behav. 2020, 4, 460–471. [Google Scholar] [CrossRef]
  63. Paschen, J. Investigating the emotional appeal of fake news using artificial intelligence and human contributions. J. Prod. Brand Manag. 2020, 29, 223–233. [Google Scholar] [CrossRef] [Green Version]
  64. Sangalang, A.; Ophir, Y.; Cappella, J.N. The potential for narrative correctives to combat misinformation. J. Commun. 2019, 69, 298–319. [Google Scholar] [CrossRef] [PubMed]
  65. Berduygina, O.N.; Vladimirova, T.N.; Chernyaeva, E.V. Trends in the spread of fake news in the mass media. Media Watch. 2019, 10, 122–132. [Google Scholar] [CrossRef]
  66. Gerbaudo, P. Fake news and all-too-real emotions: Surveying the social media battlefield. Brown J. World Aff. 2018, 25, 1–16. [Google Scholar]
  67. Borges-Tiago, M.T.; Tiago, F.; Silva, O.; Martínez, J.M.G.; Botella-Carrubi, D. Online users’ attitudes toward fake news: Implications for brand management. Psychol. Mark. 2020, 37, 1171–1184. [Google Scholar] [CrossRef]
  68. Corbu, N.; Oprea, D.-A.; Negrea-Busuioc, E.; Radu, L. ‘They can’t fool me, but they can fool the others’! Third person effect and fake news detection. Eur. J. Commun. 2020, 35, 165–180. [Google Scholar] [CrossRef]
  69. Axt, J.R.; Landau, M.J.; Kay, A.C. The psychological appeal of fake-news attributions. Psychol. Sci. 2020, 31, 848–857. [Google Scholar] [CrossRef]
  70. Jang, S.M.; McKeever, B.W.; McKeever, R.; Kim, J.K. From social media to mainstream news: The information flow of the vaccine-autism controversy in the US, Canada, and the UK. Health Commun. 2019, 34, 110–117. [Google Scholar] [CrossRef]
  71. Krouwel, A.; Kutiyski, Y.; Van Prooijen, J.-W.; Martinsson, J.; Markstedt, E. Does extreme political ideology predict conspiracy beliefs, economic evaluations and political trust? Evidence from Sweden. J. Soc. Polit. Psychol. 2017, 5, 435–462. [Google Scholar] [CrossRef] [Green Version]
  72. Hajli, N.; Lin, X. Exploring the security of information sharing on social networking sites: The role of perceived control of information. J. Bus. Ethics 2016, 133, 111–123. [Google Scholar] [CrossRef]
  73. WHO. 1st WHO Infodemiology Conference. 2020. Available online: https://www.who.int/news-room/events/detail/2020/06/30/default-calendar/1st-who-infodemiology-conference (accessed on 11 November 2020).
  74. Rubin, V.L. Disinformation and misinformation triangle: A conceptual model for “fake news” epidemic, causal factors and interventions. J. Doc. 2019, 75, 1013–1034. [Google Scholar] [CrossRef]
  75. Roozenbeek, J.; Van Der Linden, S. The fake news game: Actively inoculating against the risk of misinformation. J. Risk Res. 2019, 22, 570–580. [Google Scholar] [CrossRef]
  76. Lee, Y.-S.; Chen, K.-N. Post-event information presented in a question form eliminates the misinformation effect. Br. J. Psychol. 2013, 104, 119–129. [Google Scholar] [CrossRef] [PubMed]
  77. Huang, H. A War of (Mis) Information: The political effects of rumors and rumor rebuttals in an authoritarian country. Br. J. Politi. Sci. 2017, 47, 283–311. [Google Scholar] [CrossRef] [Green Version]
  78. Vafeiadis, M.; Bortree, D.S.; Buckley, C.; Diddi, P.; Xiao, A. Refuting fake news on social media: Nonprofits, crisis response strategies and issue involvement. J. Prod. Brand Manag. 2020, 29, 209–222. [Google Scholar] [CrossRef]
  79. Jahng, M.R.; Lee, H.; Rochadiat, A. Public relations practitioners’ management of fake news: Exploring key elements and acts of information authentication. Public Relat. Rev. 2020, 46, 101907. [Google Scholar] [CrossRef]
  80. Cinelli, M.; Quattrociocchi, W.; Galeazzi, A.; Valensise, C.M.; Brugnoli, E.; Schmidt, A.L.; Zola, P.; Zollo, F.; Scala, A. The COVID-19 social media infodemic. Sci. Rep. 2020, 10, 1–10. [Google Scholar] [CrossRef]
  81. Goniewicz, K.; Khorram-Manesh, A.; Hertelendy, A.J.; Goniewicz, M.; Naylor, K.; Burkle, F.M., Jr. Current response and management decisions of the European Union to the COVID-19 outbreak: A Review. Sustainability 2020, 12, 3838. [Google Scholar] [CrossRef]
  82. Ciampaglia, G.L. Fighting fake news: A role for computational social science in the fight against digital misinformation. J. Comput. Soc. Sci. 2018, 1, 147–153. [Google Scholar] [CrossRef]
  83. Burkle, F.M.; Bradt, D.A.; Green, J.; Ryan, B.J. Global public health database support to population-based management of pandemics and global public health crises, Part II: The database. Prehospital Disaster Med. 2020, 1–6. [Google Scholar] [CrossRef]
  84. Zhang, C.; Gupta, A.; Kauten, C.; Deokar, A.V.; Qin, X. Detecting fake news for reducing misinformation risks using analytics approaches. Eur. J. Oper. Res. 2019, 279, 1036–1052. [Google Scholar] [CrossRef]
  85. EC (European Commission). Coronavirus: EU Strengthens Action to Tackle Disinformation. Available online: https://ec.europa.eu/commission/presscorner/detail/en/ip_20_1006.2020 (accessed on 11 November 2020).
  86. Khorram-Manesh, A.; Carlström, E.; Hertelendy, A.J.; Goniewicz, K.; Casady, C.B.; Burkle, F. Does the prosperity of a country play a role in COVID-19 outcomes? Disaster Med. Public Health Prep. 2020, 1–20. [Google Scholar] [CrossRef] [PubMed]
  87. Wolverton, C.; Stevens, D. The impact of personality in recognizing disinformation. Online Inf. Rev. 2019, 44, 181–191. [Google Scholar] [CrossRef]
  88. Kim, S. Irresolvable cultural conflicts and conservation/development arguments: Analysis of Korea’s Saemangeum project. Policy Sci. 2003, 36, 125–149. [Google Scholar] [CrossRef]
  89. Kim, S.; Kim, H. Does cultural capital matter? Cultural divide and quality of life. Soc. Indic. Res. 2009, 93, 295–313. [Google Scholar] [CrossRef]
  90. Kim, S.; Kim, S. Exploring the Effect of Four Factors on Affirmative Action Programs for Women. Asian J. Women’s Stud. 2014, 20, 31–70. [Google Scholar] [CrossRef]
  91. Kim, S.; Kim, S. Analysis of the impact of health beliefs and resource factors on preventive behaviors against the COVID-19 Pandemic. Int. J. Environ. Res. Public Health 2020, 17, 8666. [Google Scholar] [CrossRef]
  92. Ryu, Y.; Kim, S.; Kim, S. Does trust matter? Analyzing the impact of trust on the perceived risk and acceptance of nuclear power energy. Sustainability 2018, 10, 758. [Google Scholar] [CrossRef] [Green Version]
  93. Wang, J.; Kim, S. Analysis of the impact of values and perception on climate change skepticism and its implication for public policy. Climate 2018, 6, 99. [Google Scholar] [CrossRef] [Green Version]
  94. Kwon, S.A.; Kim, S.; Lee, J.E. Analyzing the determinants of individual action on climate change by specifying the roles of six values in South Korea. Sustainability 2019, 11, 1834. [Google Scholar] [CrossRef] [Green Version]
  95. Kim, S.; Lee, J.E.; Kim, D. Searching for the next new energy in energy transition: Comparing the impacts of economic incentives on local acceptance of fossil fuels, renewable, and nuclear energies. Sustainability 2019, 11, 2037. [Google Scholar] [CrossRef] [Green Version]
  96. Kim, S.; Kwon, S.A.; Lee, J.E.; Ahn, B.-C.; Lee, J.H.; Chen, A.; Kitagawa, K.; Kim, D.; Wang, J. Analyzing the role of resource factors in citizens’ intention to pay for and participate in disaster management. Sustainability 2020, 12, 3377. [Google Scholar] [CrossRef] [Green Version]
  97. Kim, S.; Kim, D. Does government make people happy? Exploring new research directions for government’s roles in happiness. J. Happiness Stud. 2011, 13, 875–899. [Google Scholar] [CrossRef]
  98. Kim, S.; Choi, S.-O.; Wang, J. Individual perception vs. structural context: Searching for multilevel determinants of social acceptance of new science and technology across 34 countries. Sci. Public Policy 2014, 41, 44–57. [Google Scholar] [CrossRef]
  99. Ryu, Y.; Kim, S. Testing the heuristic/systematic information-processing model (HSM) on the perception of risk after the Fukushima nuclear accidents. J. Risk Res. 2014, 18, 840–859. [Google Scholar] [CrossRef]
  100. Wang, J.; Kim, S. Comparative Analysis of Public Attitudes toward Nuclear Power Energy across 27 European Countries by Applying the Multilevel Model. Sustainability 2018, 10, 1518. [Google Scholar] [CrossRef] [Green Version]
  101. Kim, S.; Kim, S. Exploring the determinants of perceived risk of Middle East Respiratory Syndrome (MERS) in Korea. Int. J. Environ. Res. Public Health 2018, 15, 1168. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  102. Wang, J.; Kim, S. Searching for new directions for energy policy: Testing the cross-effect of risk perception and cyberspace factors on online/offline opposition to nuclear energy in South Korea. Sustainability 2019, 11, 1368. [Google Scholar] [CrossRef] [Green Version]
  103. Humprecht, E. Where ‘fake news’ flourishes: A comparison across four Western democracies. Inf. Commun. Soc. 2019, 22, 1973–1988. [Google Scholar] [CrossRef]
  104. Kim, B.J.; Kim, S.; Kim, S. Searching for new directions for energy policy: Testing three causal models of risk perception, attitude, and behavior in nuclear energy context. Int. J. Environ. Res. Public Health 2020, 17, 7403. [Google Scholar] [CrossRef]
  105. Bratianu, C. (Ed.) Organizational Knowledge Dynamics: Managing Knowledge Creation, Acquisition, Sharing, and Transformation; IGI Global: Hershey, PA, USA, 2015. [Google Scholar] [CrossRef]
  106. Guo, L.; Vargo, C. “Fake news” and emerging online media ecosystem: An integrated intermedia agenda-setting analysis of the 2016 U.S. Presidential Election. Commun. Res. 2020, 47, 178–200. [Google Scholar] [CrossRef]
  107. Myrick, J.G.; Erlichman, S. How audience involvement and social norms foster vulnerability to celebrity-based dietary misinformation. Psychol. Popul. Media Cult. 2020, 9, 367–379. [Google Scholar] [CrossRef]
Figure 1. GDP per capita and belief in fake news across 40 countries. Source: IBS [28].
Figure 2. Beliefs about fake news.
Figure 3. Mean of belief in fake news by socio-demographic group.
Table 1. Comparison of the risk communication model and the risk perception paradigm.

| | Risk Communication Model | Risk Perception Paradigm |
| --- | --- | --- |
| Discipline | Information theory, communications | Psychology |
| Key variables | Receiver, message (information), source | Perceived benefit, perceived risk, stigma, trust, knowledge |
| Assumed mode of judgment | Interdependent judgment | Independent judgment |
| Methods | Qualitative and quantitative methods | Quantitative methods, mainly surveys |
| Strengths | Highlights the roles and functions of risk communication factors; explains dynamics | Causal explanations; explanatory power for risk judgment |
| Weaknesses | Limited generalizability; oversimplifies complex communication processes | Dismisses the context; inability to explain perception changes |

Source: Kim and Kim [4].
Table 2. Measures and reliability.

| Variable | Items | Reliability |
| --- | --- | --- |
| Perceived risk | I am relatively more likely to get coronavirus disease than others are. / I am more vulnerable to coronavirus disease compared to others. | 0.846 |
| Perceived benefit | If the coronavirus problem is solved, it will be a great benefit to our society. / When the coronavirus disease is overcome, our society will develop greatly. | 0.812 |
| Knowledge | I know a lot about coronavirus disease. / I know more about coronavirus disease than others do. | 0.840 |
| Stigma | People with coronavirus disease are bad people. / People with coronavirus disease are dirty. | 0.919 |
| Source credibility | How much do you trust the following subjects for providing coronavirus-related information? (Response scale: 1 = extremely distrust, 2 = slightly distrust, 3 = neutral, 4 = slightly trust, 5 = extremely trust.) ① Central Disease Control Headquarters, ② Korea Centers for Disease Control and Prevention (KCDC), ③ Jeong Eun-kyeong, Director of KCDC, Korea | 0.809 |
| Quantity of information | I have more coronavirus-related information than others have. / I have obtained a lot of meaningful information related to coronavirus disease. | 0.887 |
| Quality of information | Coronavirus-related information provided by the government is objective and based on facts. / Coronavirus-related information provided by the government is scientifically based and professional. | 0.912 |
| Heuristic processing | Rather than analyzing coronavirus-related information carefully and logically, I make judgments based on intuitive feelings. / I interpret coronavirus-related information emotionally rather than rationally. | 0.816 |
| Receiver’s ability | I can understand coronavirus-related issues. / I have the ability to distinguish between truth and fiction in coronavirus-related information. | 0.664 |
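The reliability column in Table 2 reports Cronbach's alpha for each multi-item scale. As a minimal sketch (the `cronbach_alpha` helper and the Likert responses below are hypothetical illustrations, not the study's data), alpha for a two-item measure can be computed as:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(items)  # number of items (columns), each a list of respondent scores
    item_var_sum = sum(pvariance(col) for col in items)
    totals = [sum(scores) for scores in zip(*items)]  # each respondent's summed score
    return (k / (k - 1)) * (1 - item_var_sum / pvariance(totals))

# Hypothetical 5-point Likert responses to a two-item scale.
item1 = [5, 4, 4, 2, 3, 5, 1, 2]
item2 = [4, 4, 5, 2, 3, 5, 2, 1]
print(round(cronbach_alpha([item1, item2]), 3))  # → 0.931
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which all of the paper's scales except receiver's ability (0.664) clear.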
Table 3. Pearson simple correlations.

| | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1. Belief in fake news | 1 |
| 2. Gender (female) | −0.072 *** | 1 |
| 3. Age | 0.066 ** | −0.003 | 1 |
| 4. Education | −0.065 ** | −0.074 *** | −0.305 *** | 1 |
| 5. Income | −0.028 | −0.017 | −0.072 *** | 0.207 *** | 1 |
| 6. Social class | 0.129 *** | 0.006 | −0.022 | 0.210 *** | 0.343 *** | 1 |
| 7. Ideology (progressive) | −0.012 | 0.059 ** | −0.125 *** | 0.084 *** | 0.007 | 0.059 ** | 1 |
| 8. Partisanship | −0.006 | 0.000 | −0.103 *** | 0.068 *** | 0.023 | 0.052 ** | 0.564 *** | 1 |
| 9. Knowing the confirmed case | 0.059 ** | −0.015 | −0.040 | 0.036 | 0.029 | 0.020 | 0.024 | 0.023 | 1 |
| 10. Health status | 0.091 *** | −0.022 | −0.033 | 0.121 *** | 0.165 *** | 0.271 *** | 0.075 *** | 0.083 *** | 0.004 | 1 |
| 11. Health status after COVID-19 | 0.154 *** | 0.054 ** | 0.019 | −0.049 * | −0.038 | −0.039 | 0.032 | −0.106 *** | 0.000 | −0.146 *** | 1 |
| 12. Perceived risk | 0.158 *** | −0.011 | 0.104 *** | −0.064 ** | −0.079 *** | −0.060 * | 0.020 | −0.031 | 0.035 | −0.264 *** | 0.341 *** | 1 |
| 13. Perceived benefit | −0.156 *** | 0.004 | −0.014 | 0.121 *** | 0.077 *** | 0.044 * | 0.179 *** | 0.287 *** | −0.002 | 0.211 *** | −0.102 *** | −0.058 ** | 1 |
| 14. Trust | −0.074 *** | −0.049 * | 0.076 *** | 0.042 | 0.050 ** | 0.156 *** | 0.055 ** | 0.114 *** | 0.038 | 0.122 *** | −0.094 *** | −0.078 *** | 0.098 *** | 1 |
| 15. Knowledge | 0.071 *** | −0.066 ** | 0.041 | 0.115 *** | 0.088 *** | 0.159 *** | 0.136 *** | 0.123 *** | 0.052 ** | 0.202 *** | 0.077 *** | 0.075 *** | 0.198 *** | 0.105 *** | 1 |
| 16. Stigma | 0.387 *** | −0.073 *** | −0.068 *** | −0.034 | 0.011 | 0.136 *** | 0.018 | −0.027 | −0.005 | 0.024 | 0.224 *** | 0.186 *** | −0.197 *** | −0.119 *** | 0.112 *** | 1 |
| 17. Source credibility | −0.114 *** | 0.012 | 0.090 *** | 0.007 | 0.037 | 0.008 | 0.206 *** | 0.302 *** | 0.021 | 0.109 *** | −0.087 *** | −0.003 | 0.286 *** | 0.098 *** | 0.095 *** | −0.166 *** | 1 |
| 18. Quality of information | −0.080 *** | 0.008 | 0.023 | 0.030 | 0.030 | 0.017 | 0.356 *** | 0.582 *** | −0.001 | 0.162 *** | −0.183 *** | −0.060 * | 0.376 *** | 0.140 *** | 0.194 *** | −0.147 *** | 0.493 *** | 1 |
| 19. Quantity of information | 0.116 *** | −0.045 * | 0.065 ** | 0.057 * | 0.064 ** | 0.093 *** | 0.165 *** | 0.247 *** | 0.019 | 0.188 *** | 0.082 *** | 0.100 *** | 0.199 *** | 0.081 *** | 0.454 *** | 0.051 ** | 0.214 *** | 0.406 *** | 1 |
| 20. Heuristic processing | 0.215 *** | −0.023 | 0.025 | −0.071 *** | −0.025 | 0.024 | 0.077 *** | 0.093 *** | 0.011 | 0.037 | 0.143 *** | 0.217 *** | 0.011 | −0.023 | 0.045 | 0.158 *** | 0.036 | 0.136 *** | 0.318 *** | 1 |
| 21. Receiver’s ability | −0.006 | −0.059 ** | 0.008 | 0.110 *** | 0.080 *** | 0.130 *** | 0.125 *** | 0.187 *** | 0.014 | 0.196 *** | −0.012 | 0.017 | 0.237 *** | 0.084 *** | 0.446 *** | −0.069 *** | 0.203 *** | 0.303 *** | 0.451 *** | 0.042 * |

Note: * p < 0.1, ** p < 0.05, *** p < 0.001.
Table 4. Regression analysis results.

| | | B | S.E. | Beta | T-Value | Sig. |
| --- | --- | --- | --- | --- | --- | --- |
| Controlled Variables | Constant | 0.592 | 0.227 | | 2.607 | 0.009 |
| | Gender (female) | −0.071 ** | 0.036 | −0.046 | −1.988 | 0.047 |
| | Age | 0.004 *** | 0.001 | 0.072 | 2.892 | 0.004 |
| | Education level | −0.056 | 0.039 | −0.036 | −1.431 | 0.153 |
| | Income | −0.093 ** | 0.042 | −0.055 | −2.224 | 0.026 |
| | Social class | 0.042 *** | 0.012 | 0.089 | 3.415 | 0.001 |
| | Ideology | −0.013 | 0.012 | −0.030 | −1.069 | 0.285 |
| | Partisanship | 0.018 ** | 0.008 | 0.072 | 2.243 | 0.025 |
| | COVID-19 confirmed case | 0.271 *** | 0.103 | 0.060 | 2.634 | 0.009 |
| | Health status | 0.118 *** | 0.025 | 0.123 | 4.735 | 0.000 |
| | Health status after COVID-19 | 0.041 * | 0.023 | 0.045 | 1.756 | 0.079 |
| Risk Perception Factors (F1) | Perceived risk | 0.063 ** | 0.023 | 0.071 | 2.730 | 0.006 |
| | Perceived benefit | −0.093 *** | 0.024 | −0.099 | −3.804 | 0.000 |
| | Trust | −0.054 ** | 0.024 | −0.054 | −2.287 | 0.022 |
| | Knowledge | −0.012 | 0.033 | −0.010 | −0.357 | 0.721 |
| | Stigma | 0.276 *** | 0.024 | 0.289 | 11.350 | 0.000 |
| Communication Factors (F2) | Source credibility | −0.050 * | 0.025 | −0.054 | −2.011 | 0.044 |
| | Quality of information | −0.037 | 0.029 | −0.043 | −1.264 | 0.206 |
| | Quantity of information | 0.071 *** | 0.030 | 0.071 | 2.365 | 0.018 |
| | Heuristic processing | 0.115 *** | 0.024 | 0.117 | 4.694 | 0.000 |
| | Receiver’s ability | −0.010 | 0.034 | −0.008 | −0.307 | 0.759 |
| Full model | F-Value/R²/Ad. R² | 33.123 ***/0.230/0.219 | | | | |
| Controlled variables | F-Value/R²/Ad. R² | 7.979 ***/0.041/0.036 | | | | |
| F1 | F-Value/R²/Ad. R² | 59.706 ***/0.166/0.163 | | | | |
| F2 | F-Value/R²/Ad. R² | 25.502 ***/0.077/0.073 | | | | |

Note: * p < 0.1, ** p < 0.05, *** p < 0.001.
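The block-wise F/R² rows in Table 4 come from entering the predictor blocks (controls, risk perception F1, communication F2) sequentially and comparing explained variance. A minimal sketch of that block-entry logic, using synthetic data and hypothetical variable names (not the study's dataset or exact procedure):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
# Synthetic stand-ins for the paper's three predictor blocks.
controls = rng.normal(size=(n, 3))   # e.g. demographics, health status
risk_f1 = rng.normal(size=(n, 2))    # e.g. perceived risk, stigma
comm_f2 = rng.normal(size=(n, 2))    # e.g. heuristic processing, source credibility
belief = (controls @ [0.1, 0.05, 0.1] + risk_f1 @ [0.3, -0.2]
          + comm_f2 @ [0.2, 0.1] + rng.normal(size=n))

def r_squared(X, y):
    """R^2 from an OLS fit with an intercept column."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)

# Enter blocks cumulatively; each increment is the block's added explanatory power.
r2_controls = r_squared(controls, belief)
r2_plus_f1 = r_squared(np.hstack([controls, risk_f1]), belief)
r2_full = r_squared(np.hstack([controls, risk_f1, comm_f2]), belief)
print(r2_controls, r2_plus_f1 - r2_controls, r2_full - r2_plus_f1)
```

Because the models are nested, R² can only rise as blocks are added; the size of each increment is what the paper uses to compare the contribution of risk perception versus communication factors.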
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
