Article

Beyond Information Warfare: Exploring Fact-Checking Research About the Russia–Ukraine War

by Ricardo Morais 1,*, Valeriano Piñeiro-Naval 2 and David Blanco-Herrero 3
1 Department of Communication and Information Sciences, Faculty of Arts and Humanities, University of Porto, 4150-564 Porto, Portugal
2 Department of Sociology and Communication, Faculty of Communication, University of Salamanca, 37007 Salamanca, Spain
3 Faculty of Social and Behavioural Sciences, University of Amsterdam, P.O. Box 19268, 1000 GG Amsterdam, The Netherlands
* Author to whom correspondence should be addressed.
Journal. Media 2025, 6(2), 48; https://doi.org/10.3390/journalmedia6020048
Submission received: 13 January 2025 / Revised: 10 March 2025 / Accepted: 19 March 2025 / Published: 25 March 2025

Abstract: The Russian invasion of Ukraine has also ignited a battleground in the domain of information. The conflict has been accompanied by a relentless disinformation offensive designed to manipulate public opinion and undermine democratic processes. This paper addresses the role of academia and scholars in confronting this information warfare. This study conducts a comprehensive analysis of scientific articles to examine how researchers and institutions have addressed fact-checking initiatives. To this end, performance analysis and literature review are combined to observe the state of academic research on fact-checking during the first thousand days of war in Ukraine (from 24 February 2022 to 19 November 2024). To do this, we identified 595 fact-checking articles in the Web of Science database within the “Social Sciences” category and narrowed the focus to 270 articles in the field of “Communication”. Finally, through an in-depth literature review of eight manuscripts, we seek to understand the specific strategies employed by academics to address the conflict between Russia and Ukraine through fact-checking. Our findings suggest that fact-checking research on the Russia–Ukraine war predominantly examines the impact of disinformation in conflict contexts, the role of media literacy in countering false narratives, and the contribution of citizen journalism to verification efforts. These conclusions can shed light on the crucial role of academia in safeguarding truth and fostering informed public debate in an era of information overload and manipulation.

1. Introduction

Although the concept and existence of mis- and disinformation are not new, their importance has reached an unprecedented dimension in the new media ecosystem. At least since 2016, it has become clear that “the online information infrastructure was particularly permeable to disinformation and misinformation, [and] more and more groups decided to turn their attention to fact-checking” (Mantzarlis, 2018, p. 82). In this context, what is new is that manipulation is now “fueled by new technologies”, including advanced techniques such as deepfakes and AI-generated content, as well as the use of social media as the main channel of dissemination. In addition, Ireton and Posetti argue that we have witnessed a “weaponisation of information on an unprecedented scale. Powerful new technology makes the manipulation and fabrication of content simple, and social networks dramatically amplify falsehoods” which “are shared by uncritical publics” (Ireton & Posetti, 2018, p. 15).
It is precisely the dimension and impact that disinformation and misinformation have today that make this phenomenon a new and disturbing object of study. Following Wardle and Derakhshan (2018), we should highlight the distinction between the two terms: misinformation refers to “information that is false, but the person who is disseminating it believes that it is true”, while “disinformation is information that is false, and the person who is disseminating it knows it is false”. The difference is that, in the second case, disinformation is “deliberate, an intentional lie, and points to people being actively disinformed by malicious actors” (p. 44).
In this context, if it is true that “the spread of disinformation and misinformation is made possible largely through social networks and social messaging, which begs the question of the extent of regulation and self-regulation of companies providing these services” (Berger, 2018, p. 8), we cannot ignore the fact that these platforms continue to be used as the primary source of information by a significant number of people (Cardoso et al., 2023). This shift in news consumption occurs in an environment where false content spreads faster than authentic content, as well as being more attractive and, therefore, more easily shared (Baptista & Gradim, 2022; Vosoughi et al., 2018).
This new disinformation and misinformation scenario, together with broader trends in the communication and information ecosystem, explains the growth of fact-checking initiatives, especially their second wave, “concentrated as much on fact-checking public claims as debunking these viral hoaxes” (Mantzarlis, 2018, p. 82). According to the latest data from Duke Reporter’s Lab, “457 fact-checking projects in 111 countries were active over the past two years” (Stencel et al., 2024).
Hence, our first objective (O1) is to examine the increasing significance of fact-checking in academic research. Moreover, this investigation aims to comprehensively analyze scholarly publications about fact-checking within the Social Sciences and Communication domains. We seek to understand the trend of such articles during the analyzed period, also identifying the most prominent authors, journals, institutions, and countries in this line of research, as well as the main keywords used in the published articles. The decision to study fact-checking in the context of academic publications arises from the relevance that this practice, considered by many to be a new journalistic genre (Almeida-Santos et al., 2023), has gained in recent years within studies on disinformation (Salaverría & Cardoso, 2023). It also results from the fact that several studies have identified gaps in studying the long-term, systemic impacts of fact-checking (Dias & Sippitt, 2020), suggesting that the practice of fact-checking could be more effective and efficient in the context of global misinformation challenges if there were closer collaboration between academia and industry (Tejedor et al., 2024).
While several studies have analyzed fact-checking activities in the context of the COVID-19 pandemic (e.g., Almeida-Santos et al., 2023; Rivas-de-Roca & Pérez-Curiel, 2023; Sousa et al., 2022; Ferreira & Amaral, 2022), more recent studies have highlighted the role of what they call “war fact-check” (Magallón-Rosa et al., 2023), taking the Russian invasion of Ukraine in February 2022 as a starting point and studying the verifications carried out in the frame of this war (Baptista et al., 2023b; Burel & Alani, 2023; Zeng et al., 2024). Following these studies, our second goal (O2) is to comprehend how fact-checking about the war has been tackled by academia during the first thousand days since the beginning of the offensive (from 24 February 2022 to 19 November 2024). The choice of this period for analysis was not random but rather the result of several criteria.
First, when President Putin announced a “special military operation” in Ukraine, on 24 February 2022, the conflict took on a new dimension, with levels of disinformation about this war never seen before (Zecchinon & Standaert, 2025). For fact-checkers themselves, it has become, in many cases, impossible to “fact-check everything” due to “geographical distance and a lack of background information”, but also due to propaganda mechanisms (Dierickx & Lindén, 2024). On the other hand, “the Russian invasion is the first war waged to occur in Europe in the age of social networks” (Zecchinon & Standaert, 2025, p. 62), therefore having very particular characteristics, insofar as “it is not a lack of information that makes this war opaque, but the amount of data disseminated in record time, especially during the first weeks of the war” (Zecchinon & Standaert, 2025, p. 62). These were precisely the reasons that led us to analyze academic research, and its expression through scientific articles, published since February 2022. We considered that we might find research that already referred to the beginning of the tensions in 2014, but also that, as happened with the COVID-19 pandemic (Igartua et al., 2020), and despite the time lag that generally exists between research work and its publication, we could find, from the very beginning, articles about fact-checking and the war in Ukraine.
As for the thousand days, this is a matter of choosing an ephemeris, following one of the news values applied to the selection of events in journalism, the criterion of time, which can be understood as current affairs, the cliffhanger, or the extension of the event (Traquina, 2002). In this case, it is the date, the thousand days, that serves as a temporal benchmark: it constitutes a news peg that justifies the newsworthiness of an event that occurred in the past and that also serves as a temporal limit for the search for investigations addressing fact-checking about the war (Traquina, 2002).
This paper is structured as follows. The first section briefly reviews the history of fact-checking, highlighting its importance in combating disinformation. The second section explains how the ongoing war in Ukraine has not only led to significant geopolitical shifts but has also been accompanied by an unprecedented wave of disinformation and misinformation. In the third section, we describe the methodology used for the study, namely the performance analysis. The fourth section presents and discusses the results of our analysis. Finally, we conclude by discussing the limitations of the research and suggesting directions for future studies.

2. State of the Art

2.1. The Role of Fact-Checking in Modern Journalism

Fact-checking is not a new practice; it has traditionally been seen as an inherent function of newsrooms. It is part of the process of journalistic verification, of confirming sources and facts, a guarantee of the quality of the work produced, which includes checking information before publication (Bigot, 2017; Mantzarlis, 2018; Graves & Amazeen, 2019). Renowned weekly magazines, such as TIME, pioneered this process in the 20th century. However, the economic crisis that affected journalism in the 21st century reduced or abolished many fact-checking departments. Nowadays, only some prestigious magazines still maintain teams dedicated to this function (Mantzarlis, 2018, p. 81).
Thus, while internal verification lost ground, external verification grew. This external verification is what we refer to as fact-checking, or as Mantzarlis (2018) points out, “ex-post” fact-checking. “This form of ‘ex-post’ fact-checking seeks to make politicians and other public figures accountable for the truthfulness of their statements. Fact-checkers in this line of work seek primary and reputable sources that can confirm or negate claims made to the public” (Mantzarlis, 2018, pp. 81–82). Projects like Factcheck.org and Channel 4 Fact Check, launched in the early 2000s (Amazeen, 2020), boosted this practice.
According to Mantzarlis (2018), two moments help to explain the growth that fact-checking has had in recent decades: the 2009 Pulitzer Prize award to PolitiFact, which introduced the “Truth-O-Meter” to rate the veracity of statements; and the rise in fake news in mid-2016, which generated a wave of projects dedicated to combating online disinformation and misinformation. “This second wave often concentrated as much on fact-checking public claims as debunking these viral hoaxes. Debunking is a subset of fact-checking and requires a specific set of skills that are in common with verification” (Mantzarlis, 2018, p. 82).
With the increase in the number of fact-checking projects, international networks were created to help regulate the work of fact-checkers, as well as to ensure their independence and the validity of their methodologies. This is the case of the International Fact-Checking Network (IFCN), founded in 2015 at Poynter, which plays a pivotal role in the global fact-checking movement, mostly due to its code of principles, which includes more than 150 signatories (Lauer & Graves, 2024). The principles are the result “of consultations among fact-checkers from around the world and offer conscientious practitioners’ principles to aspire to in their everyday work” (IFCN, 2024).
More recently, in 2022, the European Fact-Checking Standards Network was launched with over 40 European fact-checking organizations, after six of them had guided the process to create the Code of Professional Integrity for Independent European Fact-checking (European Fact-Checking Standards Network Transparency Centre, 2023). As of December 2024, it has 55 verified members (EFCSN, 2024).
Thus, fact-checking has evolved from an internal newsroom function to a crucial tool in combating misinformation in the digital age. However, Duke Reporter’s Lab claims that “the number of reporting and research teams that routinely intercept political lies and disinformation is plateauing” (Stencel et al., 2024). The data from the last census from the Center for Journalism Research in the Sanford School of Public Policy at Duke University show that “457 fact-checking projects in 111 countries were active over the past two years. But so far in 2024, that number has shrunk to 439” (Stencel et al., 2024).
After a few years of growth, explained mainly by Brexit, Trump’s run for the White House, and the COVID-19 pandemic, the number of new projects began to stagnate globally. The authors conclude that “2023 was the first time there were more departing fact-checking teams than new ones—18 closures to 10 starts” (Stencel et al., 2024).
Building on this historical background, this article examines how academia has approached fact-checking initiatives and strategies. Although fact-checking “has grounded itself as a worldwide movement, particularly in Western, Educated, Industrialized, Rich, and Democratic (WEIRD) countries […], literature on fact-checking beyond the West is still forthcoming” (Vinhas & Bastos, 2022, p. 1). Above all, meta-analyses of fact-checking show that empirical research remains essentially focused on the US (Walter et al., 2020; Vinhas & Bastos, 2022).
In the next section, we try to understand what place studies on fact-checking occupy within the broader scope of research on disinformation. More importantly, we seek to understand to what extent the war in Ukraine, central to this article, has been tackled within this field of studies.

2.2. Academic Approaches to Disinformation Research

As Salaverría and Cardoso (2023) noted, “in parallel with the surge of disinformation, academic interest in this phenomenon has grown in recent years” (p. 2). The authors highlight that “the projects, methodologies, and contexts by which the public dissemination of falsehoods is studied have multiplied, to the point where they now compose a diverse and especially fruitful corpus of research” (p. 2). According to some bibliometric studies, in recent decades, but especially since the COVID-19 pandemic, disinformation and misinformation have become one of the most researched topics in the fields of social sciences and communication (Sandu et al., 2024; Durr-Missau, 2024; Tătaru et al., 2024; Navarro-Sierra et al., 2024; KaabOmeir et al., 2024; Pandey & Ghosh, 2023; Salvador-Mata et al., 2023; X. Li et al., 2023; Pérez-Escolar et al., 2023; García-Marín & Salvat-Martinrey, 2021; Bran et al., 2021).
But in the context of growing studies on disinformation, how much attention has been paid to fact-checking? Salaverría and Cardoso (2023) argue that the main lines of work for disinformation studies include “typological studies”, that is, studies focused on classification and definition of disinformation content, distinguishing between involuntary errors (misinformation) and deliberate falsehoods (disinformation) (e.g., Tandoc et al., 2018; French et al., 2023; Salaverría et al., 2020); “fact-checking studies”, focused on the analysis of the work of fact-checking organizations, including their professional roles, ethical standards, and corporate structures (e.g., López-Pan & Rodríguez-Rodríguez, 2020; Tejedor et al., 2024; Wasike, 2023; Dafonte-Gómez et al., 2022; Moreno-Gil et al., 2021; Rodríguez-Pérez et al., 2021); “studies on disinformation on digital platforms”, with analysis of the “online dissemination of disinformation content”, trying to understand falsification practices (e.g., Diaz Ruiz, 2023; Arce-García et al., 2022; Chan, 2024; Rincón et al., 2022; Di-Domenico et al., 2021; Culloty & Suiter, 2021; Bruns et al., 2020; Freelon et al., 2020; Walker et al., 2020); and “studies in media literacy”, that is, investigations into strategies for teaching critical thinking and fact-checking skills (e.g., Sádaba & Salaverría, 2023; Frau-Meigs, 2022; Dame Adjin-Tettey, 2022; Valverde-Berrocoso et al., 2022; Hameleers, 2020; Richter, 2019; Jones-Jang et al., 2019).
We can see how fact-checking stands out as one of those main lines of work within disinformation studies. However, in the existing bibliometric studies on disinformation, we found few or no references to the war in Ukraine, one of the most striking events in the recent history of Europe and the world. This is perhaps because most of these studies are still very focused on the effects of the pandemic: according to Sandu et al. (2024), for example, the bibliometric analysis conducted shows “the rise of the COVID-19 misinformation theme as a motor theme for the 2019–2022 period” (Sandu et al., 2024, p. 39). Thus, and considering that several fact-checking organizations, projects, and observatories have paid considerable attention to the impacts of misinformation about the war (e.g., Dierickx & Lindén, 2024; EDMO, n.d.; Ukraine War Resource Hub, 2024; Iberifier, 2024; #UkraineFacts, 2024), we attempt in this article to see whether academia has paid attention to fact-checking strategies during the ongoing war in Ukraine. This conflict has seen various actors, including state and non-state entities, engage in information warfare to manipulate public perception and influence international responses, which demands further understanding. Moreover, the importance of this research also lies in the growing recognition of information warfare as a central element of modern conflicts.
Thus, considering the objectives we set for this research, but above all based on studies that highlight “the importance of disinformation as an object of study” and that “future research can be oriented to fill gaps, go a step further than the most common themes and areas and respond to emerging challenges, or those less studied, in this field of study” (Navarro-Sierra et al., 2024, p. 17), we decided to start by examining the evolution of scientific production on fact-checking in recent years. With these data, we could create the context needed to understand to what extent scientific research has accompanied the work of fact-checking organizations (Dierickx & Lindén, 2024) when the topic is the war in Ukraine, considering that, in this case, the effort to combat disinformation is more effective when academic and professional efforts are combined (Tejedor et al., 2024).
Therefore, based on the existing research mentioned in the previous paragraphs, we pose the following research questions to guide the analysis:
  • RQ1: How has scientific production on fact-checking evolved during the analyzed period?
  • RQ2: Who are the authors with the highest number of papers published on fact-checking during the analyzed period?
  • RQ3: Which universities had the highest scientific production on fact-checking during the analyzed period?
  • RQ4: Which countries had the highest scientific production on fact-checking during the analyzed period?
  • RQ5: Which journals published the most scientific articles on fact-checking during the analyzed period?
  • RQ6: What are the most cited articles on fact-checking published during the analyzed period?
  • RQ7: What are the most used keywords in the articles on fact-checking published during the analyzed period?
Moreover, in order to comprehend how academic articles on fact-checking have tackled the Russia–Ukraine War, we pose our last research question:
  • RQ8: What are the main topics and methods used in the academic articles on fact-checking related to the Russia–Ukraine War published during the analyzed period?

3. Materials and Methods

This paper combines descriptive bibliometrics and a literature review to provide a comprehensive overview of the research on fact-checking during the first thousand days of war in Ukraine (from 24 February 2022 to 19 November 2024). As we have already mentioned, the choice of this period results from several criteria. Although the information conflict between Ukraine and Russia began in 2014, it escalated after President Putin announced a “special military operation” on 24 February 2022, leading to unprecedented levels of disinformation regarding the war (Zecchinon & Standaert, 2025). Fact-checkers have found it increasingly difficult to verify claims due to geographical distances, a lack of background information, and robust propaganda mechanisms (Dierickx & Lindén, 2024), but also due to “the virality of (fake) news raised by these events” (Zecchinon & Standaert, 2025, p. 62).
Moreover, the Russian invasion marks the first European war in the age of social media, where the volume and high speed of disseminated data complicate transparency (Zecchinon & Standaert, 2025). Consequently, we aimed to analyze academic research published since February 2022, hypothesizing that initial studies referencing the 2014 tensions might also exist, but also considering that we can witness a movement similar to what happened with the COVID-19 pandemic, which received unprecedentedly quick academic attention within months, also in fields like communication (e.g., Igartua et al., 2020). Given the contextual importance of “a thousand days”, selected as a temporal milestone that underscores the conflict’s ongoing relevance, we set this duration as a framework for our investigation into fact-checking research related to the war.
On the other hand, we look at scientific production in the field of Communication because we want to understand whether, after 24 February 2022, there was an increase in the “number of scientific publications devoted to the Russian-Ukrainian war, both by Ukrainian authors and authors from other countries” (Ostapenko et al., 2023, p. 12) in this field of knowledge, as some studies show has happened in others: “the structural analysis shows the prevalence of publications from social, medical, and economic sciences” (Ostapenko et al., 2023, p. 12). Additionally, following the invasion, “the emergence of such trend as the formation of thematic special issues of journals about the Russian-Ukrainian war should be noted” (Ostapenko et al., 2023, p. 12).
According to C. Li et al. (2025), “bibliometric [analysis] involves the quantitative review and investigation of existing literature within a specific field (…) The process includes collecting data on authors, keywords, journals, countries, affiliations, and references” (p. 314). Moreover, Donthu et al. (2021) and Passas (2024) state that bibliometric analysis involves two main approaches or categories: (1) performance analysis and (2) science mapping. The present study is based on the first approach, that is, on describing the performance of the research production, which requires “evaluating the impact of researchers, institutions, and countries using metrics such as total publications, author contributions, and citation-related indicators” (Passas, 2024, p. 1019).
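To make the performance-analysis step concrete, the sketch below illustrates how indicators of this kind (publications per year, per author, and per journal, plus accumulated citations) could be tallied from an exported record set. It is a minimal illustration only, assuming a tab-delimited Web of Science export and the usual field tags (AU for authors, SO for source title, PY for publication year, TC for times cited); the file name and column labels are assumptions and would need to be checked against the actual export.

```python
# Minimal sketch of the performance-analysis step (not the authors' own script).
# Assumes a tab-delimited Web of Science "Full Record" export; the field tags
# AU (authors), SO (source title), PY (publication year) and TC (times cited)
# are assumptions that may need adjusting to the actual export.
import pandas as pd

records = pd.read_csv("wos_export.txt", sep="\t", dtype=str)

# Publications per year.
per_year = records["PY"].value_counts().sort_index()

# Publications per author: co-authors share one cell, separated by ";".
per_author = (
    records["AU"].dropna()
    .str.split(";").explode().str.strip()
    .value_counts().head(10)
)

# Papers and accumulated citations per journal.
records["TC_num"] = pd.to_numeric(records["TC"], errors="coerce")
per_journal = (
    records.groupby("SO")
    .agg(papers=("SO", "size"), citations=("TC_num", "sum"))
    .sort_values("papers", ascending=False)
    .head(10)
)

print(per_year, per_author, per_journal, sep="\n\n")
```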
To this end, we identified articles about fact-checking published in journals indexed in the Web of Science (hereinafter, WOS) under the general field of “Social Sciences” (n = 595). Then, we focused on the articles published in the “Communication” discipline (n = 270). Finally, through an in-depth literature review of eight manuscripts, we seek to understand the specific strategies employed by academics to address the conflict between Russia and Ukraine through fact-checking. Although there are other indexing databases (e.g., Scopus) or online bibliographic search platforms (such as EBSCOhost), we chose WOS due to its international recognition, its demanding standards for journal inclusion, and the usability of its interface.
The sampling strategy is divided into three steps. In Social Sciences, data on the n = 595 articles were obtained with the following search algorithm: (Topic = “fact-check*” OR “news fact-check*”) AND (Document Type = “Article”) AND (WOS Index = “SSCI” OR “ESCI”) AND (Publication Date: 24 February 2022 to 19 November 2024). The search terms had to be located in the title, abstract, or keywords of the articles.
On the other hand, Communication papers (n = 270) were obtained with this new search algorithm: (Topic = “fact-check*” OR “news fact-check*”) AND (Document Type = “Article”) AND (WOS Index = “SSCI” OR “ESCI”) AND (WOS Categories = “Communication”) AND (Publication Date: 24 February 2022 to 19 November 2024). Again, the search terms had to be located in the title, abstract, or keywords of the articles. The third step led to the selection of those papers in the sample that focused on fact-checking in the context of the Russia–Ukraine War.
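For readers who wish to replicate the topic filter on an exported record set rather than in the WOS interface, a minimal sketch is given below. The columns assumed here (TI = title, AB = abstract, DE = author keywords, PY = publication year) follow common WOS field tags but are assumptions; the day-level publication window is taken to have been applied in the WOS search itself, as described above.

```python
# Minimal sketch of the "fact-check*" topic filter applied to exported records
# (not the authors' own procedure). Column names are assumptions and should be
# checked against the actual export.
import pandas as pd

records = pd.read_csv("wos_export.txt", sep="\t", dtype=str)

# Concatenate title, abstract and author keywords, as a WOS topic search does.
text = (
    records[["TI", "AB", "DE"]]
    .fillna("")
    .apply(" ".join, axis=1)
    .str.lower()
)

# "fact-check*" as a truncated term: match the stem anywhere in the text,
# which covers "fact-checking", "fact-checker(s)", "fact-checked", etc.
sample = records[text.str.contains("fact-check", regex=False)]

print(len(sample))
print(sample["PY"].value_counts().sort_index())
```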
In the following section of the manuscript, results are presented according to the three stages of the selection process: first, the observations regarding fact-checking in Social Sciences will be used as a descriptive and preliminary contextualization; second, the analysis of articles on fact-checking within the Communication domain will allow us to answer the first seven research questions (RQ1 to RQ7); finally, the last research question (RQ8) will be answered in detail using the articles on fact-checking in the context of the Russia–Ukraine War.

4. Results

4.1. General Approach to Social Sciences Production

Starting with the distribution of papers among disciplines, Figure 1 shows that Communication is the field that contributes the most to the study of fact-checking. Together with Communication, other disciplines such as Information Science and Computer Science are also very present in this distribution. This is particularly relevant if we consider that Salaverría and Cardoso (2023) argue that multidisciplinary studies on disinformation are still scarce. The need for more trans- and multidisciplinary studies also arises in the wake of the results of other research focused on disinformation in academic studies (Navarro-Sierra et al., 2024) or even on fact-checking (Tejedor et al., 2024), which highlighted “the importance of reinforcing the interdisciplinary and multi-thematic nature of research on fact-checking” (Tejedor et al., 2024, p. 8).
It is also worth mentioning that papers published in journals indexed in the Social Sciences Citation Index (SSCI) represent 65% (n = 387 papers), and manuscripts in journals in the Emerging Sources Citation Index (ESCI) represent the remaining 35% (n = 208 papers). When referring only to the Communication discipline (n = 270 papers), the distribution of manuscripts is SSCI = 192 (71.11%) and ESCI = 78 (28.89%).

4.2. Performance Analysis of Communication Papers

To answer our first research question on understanding the publication trend of fact-checking studies during the first thousand days of the war in Ukraine (from 24 February 2022 to 19 November 2024), Figure 2 shows the yearly number of publications focusing on fact-checking during the analyzed period both in Social Sciences and in the more specific field of Communication.
Ahead of any interpretation, it should be noted that our period comprises 311 days in 2022, 365 in 2023, and 324 days in 2024, which might help to explain the number of articles published each year, with the highest in 2023. Despite this factor, there appears to be a slight increase in the number of articles on fact-checking between 2022 and 2023, although that increase does not seem to continue in 2024. Even though academic interest in fact-checking has increased significantly in recent years, the practice itself has become democratized and is no longer a novelty, which might lead publication numbers to stabilize.
Next, we wanted to identify the authors with the strongest publication record on fact-checking in the studied period. Figure 3 shows the authors with at least three scientific articles on fact-checking in WOS journals. The top five authors are well-renowned names in the field: Lucas Graves (professor and researcher at the Reuters Institute), Michael Hameleers (associate professor in political communication at the Amsterdam School of Communication Research), Oscar Westlund (professor at the Department of Journalism and Media Studies at Oslo Metropolitan University), Steen Steensen (professor of journalism at the Department of Journalism and Media Studies at Oslo Metropolitan University), and Valerie Belair-Gagnon (associate professor at the Hubbard School of Journalism & Mass Communication at the University of Minnesota).
After identifying the authors who publish the most on the topic of fact-checking, we also sought to determine which universities are most present in the scientific production on this issue in the field of Communication (RQ3). In Figure 4, we can see that two American universities appear at the top of the institutions that publish the most on this subject, but we can also observe that, as with authors, several Spanish universities stand out in this ranking.
These figures, which are consistent with previous studies, and which are sometimes partly explained as a result of collaborative work (Tejedor et al., 2024), are also aligned with the most frequent countries of affiliation. Thus, among the countries that contribute most to the development of this topic through the publication of scientific articles (RQ4), the USA appears at the top of the list, closely followed by Spain. The People’s Republic of China and Brazil come afterwards at a significant distance (Figure 5).
Concerning the journals indexed in the field of Communication publishing the most articles on fact-checking (RQ5), Journalism Practice and Profesional de la Información seem to be the leading publications, with 21 articles each over the studied 1000-day period. Figure 6 shows the 10 most active journals in the Communication field, with a strong predominance of the subfield of Journalism. Together with Anglo-Saxon journals from the USA and the United Kingdom, we can also find some Spanish journals, again aligning with previous studies (Tejedor et al., 2024). It is also important to highlight that, in terms of the WOS index, 71% of the published articles (n = 192) belong to SSCI, while the other 29% (n = 78) are part of ESCI.
Among the papers that comprise the sample, we also sought to identify the most cited ones (RQ6). In Table 1, we present the 24 with the highest number of citations as of December 2024.
One of the first aspects that stands out is that more than half of these most cited papers address issues related to the COVID-19 pandemic or, more generally, misinformation and fact-checking related to health issues. These results align with previous studies that also identified “a predominant research interest in detecting misinformation within the health and healthcare domain” (Sandu et al., 2024, p. 38). Some studies (Blanco-Herrero et al., 2024) have identified growing attention to war-related misinformation due to the wars in Ukraine or Gaza, although the presence of these events, both in fact-checking agencies and in academic research, is far from that of the pandemic. It is also worth noting that only one of the most cited papers addresses the issue of fact-checking in the context of Russia’s invasion of Ukraine (Morejón-Llamas et al., 2022).
Finally, regarding the keywords most used in the papers that make up the sample (RQ7), Table 2 shows that fact-checking is the term that stands out, which is to be expected, as it is the topic of study. Similarly, disinformation, misinformation, and fake news appear next, given the strong interconnection between fact-checking studies and the more general scope of studies on mis- and disinformation, to which they are subordinated. The results are once again in line with other investigations already carried out (Tejedor et al., 2024).
We found social media, journalism, COVID-19, verification, artificial intelligence, and digital in a second group of keywords, showing other recurrent approaches in papers on fact-checking.
Thus, we can see that key themes explored in these articles involve the efficacy of counter-narratives in combating disinformation and misinformation, the role of AI and other new technologies in fact-checking, or the dynamics of misinformation sharing and perception on social media platforms. We can also note that several studies specifically examine how credible sources can be discredited, the significance of public service media, and the impact of media literacy. The articles seem to employ diverse methodologies, including experiments, surveys, and content analysis, although more in-depth studies are needed to assess this.
Regarding emerging trends, there is a noticeable emphasis on the role of technology, particularly AI, in enhancing or complicating fact-checking efforts and how journalists navigate the complexities of information disorder. The main findings across the studies suggest practical implications for journalists, public service media, and policymakers in enhancing their strategies against misinformation and fostering a more informed public. Overall, this corpus of articles produced in the field of Communication encapsulates the multi-faceted approaches to understanding and addressing disinformation and misinformation in contemporary media landscapes, emphasizing the vital role of credible information dissemination and adaptive fact-checking practices.

4.3. In-Depth Review of Fact-Checking Articles About the Russia–Ukraine War

The previous section has drawn a general picture of fact-checking articles in the field of Communication during the studied period (from 24 February 2022 to 19 November 2024). This period, as already explained, was selected because it marks the first thousand days of the full-scale invasion of Ukraine by Russia. Thus, in order to comprehend how scientific articles have tackled fact-checking initiatives in the context of this conflict, it was necessary to identify the corpus we would work with. Considering the 270 articles included in the previous part of the study, and after searching by title, abstract, and keywords, we located eleven articles referencing the conflict. However, in a more detailed analysis, three of those cases were removed from the subsample, as their focus was not really on the war, and mentions of the conflict or the two countries involved were indirect or contextual. The final eight articles that form our corpus of analysis are shown in Table 3. A more detailed analysis of each one is presented in Appendix A.
The first observation relates to the fact that only one of the eight articles has a publication date of 2022, the year that marked the beginning of the conflict. The remaining articles were published in 2023 or 2024. This aspect is relevant, above all because it helps to explain why some of the bibliometric studies carried out on disinformation (Sandu et al., 2024; Durr-Missau, 2024; Tătaru et al., 2024; Navarro-Sierra et al., 2024; KaabOmeir et al., 2024; Pandey & Ghosh, 2023; Salvador-Mata et al., 2023; X. Li et al., 2023; Pérez-Escolar et al., 2023), and even those very concretely about fact-checking (Tejedor et al., 2024), still do not present many studies on the Russia–Ukraine War. This is probably explained by the duration of scientific production, including data collection and analysis, but also by publication processes. However, this is an aspect that deserves to be highlighted, especially because recently, in the context of the COVID-19 pandemic, we have seen a considerably different trend, with articles on misinformation, fact-checking, and the pandemic appearing very quickly in scientific journals. Although the difference between the topics is understood, the truth is that, just as happened with the pandemic, in this case we also witnessed a considerable and never-before-seen increase in false content about the war, so high that it even posed challenges to the work of fact-checkers (Dierickx & Lindén, 2024), leading to the need to create task forces to deal with the situation (EDMO Task Force on Disinformation on the War in Ukraine, n.d.).
Related to this, the data considered for the analyses in most articles date back to the first months of the conflict. This might illustrate what the authors regard as the peak of disinformation during the war in Ukraine or, alternatively, the peak of fact-checking activities, as subsequent events, such as the war in Gaza, might also have shifted the spotlight away from the conflict in Ukraine. However, once again, the time gap between the moment data are produced and their later publication may be related not only to the journals’ operating times but also to the need to collect databases for analysis, especially if we consider that several articles conduct content analyses of the work carried out by fact-checkers.
It should also be noted that some articles address the invasion of Ukraine by Russia only in an instrumental way, that is, the objectives of the articles are more comprehensive, going beyond this single conflict. This is the case of the article by Sacaluga-Rodríguez et al. (2024), which does not analyze only disinformation related to the conflict, but rather the skills needed to identify disinformation, including aspects related to the war. This research is also relevant because it approaches disinformation and fact-checking from an innovative angle, that of neuro-communication, as it seeks to establish a relationship between personality traits, information consumption patterns, and the ability to detect false information. The work thus aligns with others that have sought to study how the brain perceives issues of misinformation (Pennycook & Rand, 2021; Martínez-Costa et al., 2023). By doing so, this investigation falls within the scope of what Salaverría and Cardoso (2023) call “emerging fields for disinformation studies”, also in connection to their call for “multidisciplinary studies of disinformation”.
As noted, Sacaluga-Rodríguez et al.’s article is also one of those least centered on the issue of fact-checking. In this regard, two more articles focus on fact-checking (Springer et al., 2023; Charlton et al., 2024), although not on ex-post fact-checking (Mantzarlis, 2018) but on the practice inherent to journalistic work, that is, verification before publication. Charlton et al. (2024) analyze the role of communities that form verification networks for information that appears online even before it reaches journalists, while Springer et al. (2023) seek to understand how journalists determine whether information is problematic or needs cross-checking, and how they use this information.
Then, we have four articles (Zecchinon & Standaert, 2025; García-Marín et al., 2023; Magallón-Rosa et al., 2023; Morejón-Llamas et al., 2022) studying the work conducted by different fact-checking organizations. Zecchinon and Standaert (2024) use a case study to learn about the practices used by the fact-checking unit of the newspaper Le Monde (Les Décodeurs) to check visual disinformation around the Russia–Ukraine conflict. García-Marín et al. (2023) also address visual disinformation related to the Russian–Ukrainian conflict: besides characterizing this disinformation, they also analyze the reaction time of international fact-checking agencies to false images related to the war, while also comparing the disinformation strategies of Russia and Ukraine. The study by Magallón-Rosa et al. (2023) focuses on the behavior and dissemination patterns of disinformation during the first year of the war by examining the work carried out by a group of Spanish fact-checkers. Finally, the research by Morejón-Llamas et al. (2022) also analyzes the work carried out by a group of Spanish fact-checkers about the war, paying special attention to Twitter (currently X). This platform also provides the sample for the study by Sacaluga-Rodríguez et al. (2024), while the Discord platform is analyzed in another study from the sample (Charlton et al., 2024). Hence, these works fall within what Salaverría and Cardoso (2023) call “studies on disinformation on digital platforms”, although focused on the issue of fact-checking.
The last study in the sample is the one conducted by Tulin et al. (2024), who address fact-checking from the perspective of citizens’ motivation. The research seeks to analyze and understand the motivations behind citizens’ decisions to read fact-checks about the war in Ukraine, and to examine how country-level factors such as polarization, press freedom, and geographical proximity to the war influence citizens’ fact-checking behaviors.
After learning about the topics covered in each article and observing their different foci regarding fact-checking, we now turn to the diversity of methodological approaches. Half of the articles present a mixed-methods approach (Zecchinon & Standaert, 2025; Springer et al., 2023; Magallón-Rosa et al., 2023; Morejón-Llamas et al., 2022). In their case study, Zecchinon and Standaert (2024) carry out a quantitative content analysis of fact-checks, complementing these data with semi-structured interviews with journalists. Springer et al. (2023) adopt a similar strategy, although their content analysis focuses not on fact-checks but on the journalistic coverage carried out by the media. Interviews with journalists from two countries complement the study, which in this case also adopts a comparative perspective between those countries. This study goes back the furthest in terms of the analysis period (it starts in 2017), which leads to a different understanding of the beginning of the conflict. Magallón-Rosa et al. (2023) also carry out a quantitative content analysis that considers the number of verifications, format, time distribution, etc., together with a qualitative study of narrative and discursive strategies. Morejón-Llamas et al. (2022) likewise speak of a mixed-methods approach, combining a quantitative, qualitative, and discursive measurement of the information produced by the selected fact-checkers.
Of the remaining four articles, three use solely quantitative approaches. In two cases (Sacaluga-Rodríguez et al., 2024; Tulin et al., 2024), the technique chosen for data collection is the questionnaire. García-Marín et al. (2023) analyze false images verified by fact-checkers, with the authors then employing descriptive and inferential statistical analyses. Finally, only Charlton et al. (2024) adopt a solely qualitative approach: a qualitative content analysis of a case study.
Overall, even in articles with a mixed-methods approach, the quantitative dimension seems to be privileged (Navarro-Sierra et al., 2024), which shows that the results of our study are in line with others already carried out and focused more broadly on fact-checking (Tejedor et al., 2024).
From the perspective of the theoretical framework, the approaches differ considerably among the eight articles. The literature review of the study by Morejón-Llamas et al. (2022) focuses mainly on the idea of hybrid warfare, highlighting the role of social networks as a battlefield, but it also addresses the concepts of disinformation, the role of fact-checkers, the understanding of fact-checking as a new journalistic practice, and content curation as an important element of fact-checking, considering the role assumed by social networks in the dissemination and circulation of false content.
In the article by García-Marín and Salvat-Martinrey (2023), the literature review focuses mainly on war propaganda, addressing aspects such as the use of different strategies in the production of disinformation, the power of false context as a preferred strategy for visual disinformation, and even the predominant use of Facebook as a platform for disseminating much false content.
The study by Magallón-Rosa et al. (2023) also has a theoretical framework focused on the role of social platforms in war journalism. It further addresses the prevailing narratives in terms of disinformation regarding the war in Ukraine, and, in the literature review, the evolution of the journalist’s role, referring to changes in gatekeeping theories and the emergence of the gatewatcher.
The article by Springer et al. (2023) fits theoretically into the investigation of the factors that influence the verification of news about conflicts. In the literature review, the authors discuss how journalists, geographically dispersed and far from the war, deal with the common ideal of the search for journalistic truth as a professional duty.
In the research by Sacaluga-Rodríguez et al. (2024), the theoretical framework is centered on the intersection between personality traits, information consumption patterns, and the ability to detect fake news. The authors use a neuro-communicative approach.
In turn, the study by Tulin et al. (2024) fits within the scope of studies that seek to identify the motivations for reading fact-checking articles. The authors address different motivations that may help explain the selection of fact-checks about the war in Ukraine. The study also addresses the truth default theory, according to which people tend to accept the honesty of information unless they are alerted to suspicions related to specific content, which calls into question the value of the information.
The article by Charlton et al. (2024) fits theoretically into the interactions between journalism and open-source intelligence and investigation (OSINT) communities. In the literature review, the study also addresses the concepts of monitory democracy, associated with the networked public sphere, and the Dynamic Intermediation Model (D(X)IM).
Finally, the study by Zecchinon and Standaert (2024) addresses in its theoretical framework information disorder and visual propaganda, namely the concept of multimodal disinformation. It also addresses the change in the traditional role of the journalist as a gatekeeper and refers to the concept of gatebouncing, which emerges as a retroactive activity of information selection.
Having analyzed the central theme of the articles, the methodologies adopted, and the theoretical frameworks, we have a better understanding of the scientific production on the topic of fact-checking and the war in Ukraine. However, we should point out that there is not yet a considerable corpus of articles on this specific issue in WOS. This aspect is relevant, especially given that the conflict is entering its third year and that the analysis focuses on one of the most extensive databases of scientific journals and articles. Despite the relevance of the conflict and the extensiveness of the WOS database, the scarcity of research on this event may have several explanations, including the fact that there are publications in journals that are not part of WOS (e.g., Baptista et al., 2023b). The limitations of fact-checking, considered by some to be an ineffective practice (Baptista et al., 2023a; Walter et al., 2020; Walter & Salovich, 2021), may also help to explain these results.
Similarly, we need to consider the complexity of a still ongoing and highly dynamic event and the difficulty of collecting accurate and reliable data, which makes in-depth studies more difficult. It is also essential to consider the duration of the scientific and publishing process itself, which means that the limited number of scientific articles on the topic could change in the coming years as researchers continue to analyze data and develop new methodologies for fact-checking in conflict contexts.

5. Conclusions

In a context of growing concern about the threat posed by mis- and disinformation, studies on fact-checking have gained significant relevance. This study seeks to understand the state of fact-checking research in WOS journals in the context of a striking event such as the Russian invasion of Ukraine. The ongoing war, which began on 24 February 2022, has been fought on the battlefield and in the realm of information. The manipulation of facts and dissemination of disinformation have emerged as critical tactics employed by various actors, leading to significant implications for public perception and international relations.
Although research on fact-checking is predominantly carried out within the field of Communication, the issue is also being approached by other disciplines, such as psychology or neuro-communication. However, during the period of analysis, the first thousand days since Russia’s invasion of Ukraine, the number of articles did not increase significantly, perhaps in line with a certain slowing trend in the number of new fact-checking initiatives (Stencel et al., 2024). Although this study has not sought a comparison of topics, there are hints showing how the interest of fact-checking initiatives, just like that of academic research, shifts depending on the events taking place in the world. Hence, fact-checking and misinformation research grew initially in connection with political events, such as Donald Trump’s first presidency (e.g., Allcott & Gentzkow, 2017); the COVID-19 pandemic then became a clearly predominant theme for some years, as has been observed in this and other studies (Blanco-Herrero et al., 2024); more recently, the war in Ukraine, and later also in Gaza, gained relevance. Although there have been relevant contributions (e.g., Tulin et al., 2024), this topic is still far from the predominance of others, perhaps due to its specific location or to the time lag of academic publications. In fact, the study clearly shows the lag between the moment an event or topic arises and its presence in scientific production. This was not the case with the COVID-19 pandemic, which received unprecedentedly quick academic attention within months, also in fields like Communication (e.g., Igartua et al., 2020), but our study has shown that it is indeed the case with the war in Ukraine, with attention to the topic starting significantly later than the war itself.
Focusing on the main findings of our study, we have identified the leading actors (authors, universities, countries, and journals) behind scientific articles on fact-checking during this period. A key observation is the role played by the USA and Spain. Although the leading role of the USA could be expected (Walter et al., 2020; Vinhas & Bastos, 2022), the role of Spanish scholars and universities might be seen as more surprising. Although more in-depth studies are needed to comprehend this role, one potential explanation is the relevance of the Spanish fact-checking movement, Spain being one of the countries with the most fact-checker signatories of the IFCN code of principles and of the EFCSN (which also had Spanish participation in its origin). Moreover, this also relates to the proportionally higher concern about the topic of mis- and disinformation in this country compared to other European countries (European Commission, 2023; Vara et al., 2022). Finally, and more related to the functioning of academia, there are several Spanish-speaking journals indexed in SSCI and, especially, ESCI, which might also make publication easier for Spanish authors.
The answers to the first questions helped to create the context for the growth of publications on fact-checking in the context of studies on disinformation. However, they also opened doors for the following question, which arises in line with the need identified by some research studies to address emerging challenges and less-explored topics in this field of research, going beyond the most prevalent themes and areas (Navarro-Sierra et al., 2024).
Together with this broader approach to studies on fact-checking, we have paid specific attention to scientific articles on fact-checking in the context of the war in Ukraine. It should be noted that we have identified a limited number of publications on this specific issue. Other topics, such as health (mostly in connection to the COVID-19 pandemic) or political issues, seem to still be dominant; the recency of the still-ongoing conflict, combined with the duration of the research and publication processes, might change this trend in the years to come. Although more research is still needed on this topic, we have observed that the existing articles seem to focus on analyzing the work conducted by fact-checking organizations in different contexts, frequently adopting a quantitative or mixed-methods perspective. There is also a significant interest in examining what happens in social media, a space particularly conducive to the dissemination of disinformation. In addition, contrary to what has been an emerging trend in disinformation studies, we have not identified, within the scope of this work, that artificial intelligence has a central role in fact-checking studies about war.
These observations are relevant in connection to well-established communication theories, such as agenda setting (McCombs & Shaw, 1972). Previous studies have shown that misinformation has the capacity to determine the agenda (e.g., Vargo et al., 2018) and so does fact-checking (e.g., Feng et al., 2021). This helps to explain the relevance of academic research on these topics, given that understanding their spread, practices, and foci helps scholars comprehend the way these types of communication can influence public discourse. Moreover, the analysis of the articles demonstrated that the volume of fact-checks is closely linked to the news cycle and the intensity of coverage of the war in Ukraine. In this context, fact-checkers help shape the public agenda, deciding which claims and information deserve to be verified and which topics receive more attention. Through this process, fact-checkers can influence how the public perceives war-related events and issues. On the other hand, the results also suggest that although academic interest in fact-checking has increased significantly in recent years, the practice itself has become democratized, which could lead to a tendency for fact-checks to stabilize. This stabilization could have implications for the agenda itself, since the attention given to specific topics could also decrease or stagnate.
On the other hand, the results obtained, namely through the analysis of articles that directly address fact-checking on the war in Ukraine, bring important contributions to this field of research, as the findings highlight the importance of media literacy in combating disinformation. The results also address citizens’ motivation to read fact-checks about the war in Ukraine, analyzing how factors such as polarization, freedom of the press, and geographical proximity to the war can influence citizens’ fact-checking behaviors. They further highlight how easily credible sources can be discredited in a war context, reinforcing the role of media literacy.
To conclude, bibliometric studies, while valuable for analyzing research trends and impact, have several limitations that also apply to this work. Bibliometric databases like WOS and Scopus do not capture all publications, particularly those from non-English sources, conference proceedings, or books; thus, a relevant portion of the scientific literature on fact-checking cannot be studied. Related to this, data selection cannot be ignored, since search terms and periods can influence the results. In this study, the period of analysis is determined by the first thousand days of the war in Ukraine, as we sought to contextualize fact-checking during this period, but longitudinal studies with a longer period of analysis might be able to identify more consistent and long-term trends.
We should highlight that our approach is relevant for understanding the interplay between misinformation, fact-checking, and the war in Ukraine, but analyzing scholarly production alone does not capture the full complexity of these factors. We have identified a selection of studies addressing information manipulation during the war from different approaches, but future work should continue to examine social media discourse and direct witness accounts in order to obtain more comprehensive perspectives.
Moreover, it should be acknowledged that focusing on fact-checking offers a limited view of the current misinformation scenario, in which other factors, such as geopolitics or the rise of national-populist forces, also play a determining role. In this context, fact-checking has shown positive but limited effects (Bachmann & Valenzuela, 2023; Walter et al., 2020) and is not, by itself, sufficient to counter a phenomenon that extends beyond communication alone.
Finally, bibliometric data provide descriptive quantitative information but do not capture the qualitative aspects of research, such as its originality, significance, or societal impact. To remedy this limitation, we conducted a qualitative analysis of the articles about fact-checking in the context of the war. By combining both methods, we obtained a broader quantitative understanding of existing patterns in fact-checking studies, complemented by more in-depth observations of fact-checking in the context of the war. As Dias and Sippitt (2020) highlight, significant time and effort have been dedicated to fact-checking as a strategy to combat mis- and disinformation. We must therefore examine its possible effects comprehensively, considering how it could address the global challenge of information disorder. Without this understanding, we cannot fully grasp how fact-checking can enhance our democratic systems.
Thus, this research has not only provided a novel overview of studies on fact-checking but has also offered a more complete understanding of the role of fact-checking in one of the most critical events in Europe’s recent history.

Author Contributions

Conceptualization, R.M.; methodology, R.M. and V.P.-N.; formal analysis, R.M. and V.P.-N.; investigation, R.M., V.P.-N. and D.B.-H.; data curation, R.M. and V.P.-N.; writing—original draft preparation, R.M., V.P.-N. and D.B.-H.; writing—review and editing, R.M., V.P.-N. and D.B.-H.; supervision, R.M., V.P.-N. and D.B.-H.; project administration, R.M. All authors have read and agreed to the published version of the manuscript.

Funding

This manuscript and research were partially conducted within the framework of CITCEM—Transdisciplinary Research Center for Culture, Space and Memory, which is funded by the Portuguese Foundation for Science and Technology (FCT), grant number UIDB/04059.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original data presented in the study are openly available in Web of Science.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Summary of papers about the Russia–Ukraine war from a Communication perspective.
Reference | Principal Goals | Methods and Materials | Main Findings
Charlton et al. (2024)
  • To understand how OSINT communities use digital platforms, such as Discord, to organize and coordinate their activities.
  • To examine the interactions between OSINT communities and journalistic actors on digital platforms.
  • Qualitative method: case study.
  • Corpus: two OSINT communities.
  • Qualitative content analysis: three events that involved OSINT practices and were covered by journalism.
  • Timeframe: May to July 2022.
  • OSINT communities on Discord serve as an intermediary between the initial sources of information and the final public dissemination, verifying and validating the information before passing it on to journalistic actors.
  • OSINT communities engage in a constant monitoring and validation process, shifting between being the recipient and the source of information, in order to sort through the abundance of unverified information in the digital public sphere.
  • The interactions between OSINT communities and journalists represent a new division of labor in the digital public sphere, where both parties engage in a process of co-monitoring the information and knowledge production.
García-Marín and Salvat-Martinrey (2023)
  • To characterize the visual disinformation related to the Russo–Ukrainian conflict.
  • To find the international reach of that content by quantifying the number of countries where each hoax had circulated.
  • To analyze the reaction of international fact-checking agencies in terms of the time taken to check the false images related to the war.
  • To compare the disinformation strategies of Russia and Ukraine.
  • Quantitative method: descriptive and inferential statistical parameters.
  • Corpus: 326 false images.
  • Timeframe: January to April 2022.
  • False context is the most common type of visual disinformation related to the Russo–Ukrainian conflict.
  • Facebook and Twitter are the main platforms used to spread this disinformation.
  • The highest intensity of disinformation occurred in March 2022, shortly after the Russian invasion of Ukraine.
Magallón-Rosa et al. (2023)
  • To collect the verifications of Spanish fact-checkers on the Russian invasion of Ukraine.
  • To study the behavior and patterns of disinformation dissemination during the first year of the war.
  • To analyze the distribution of hoaxes by typology, subject, month, country involved, etc.
  • Quantitative (number of verifications, format, time distribution, etc.) and qualitative content analysis (narrative and discursive strategies).
  • Corpus: 307 fact-checks from six fact-checkers.
  • Timeframe: 24 February 2022 to 23 February 2023.
  • The volume of fact-checking verifications is closely tied to the news cycle and intensity of coverage of the Russo–Ukrainian war, with March 2022 seeing the highest number of verifications.
  • The Spanish fact-checking organization Maldita.es was the most prolific, publishing nearly half of the unique hoaxes identified in the study.
  • Hoaxes related to Ukrainian President Zelensky were one of the most prevalent narratives in the Russian disinformation strategy.
Morejón-Llamas et al. (2022)
  • To conduct a comparative analysis of the activity and impact (likes, retweets, and comments) of the Twitter accounts of the fact-checking organizations Maldito Bulo, Newtral, and EFE Verifica during the early days of the Russian invasion of Ukraine.
  • To understand the ability of these fact-checkers to curate content on social media.
  • To trace the evolution of the verifications and the fact-checkers’ capacity to react to a war marked by intense disinformation.
  • Mixed-methods: quantitative, qualitative, and discursive measurement of the information produced by the selected fact-checkers.
  • Corpus: 397 tweets.
  • Timeframe: 21 to 28 February 2022.
  • Fact-checkers rapidly increased their Twitter activity to counter disinformation surrounding the Russian invasion of Ukraine, with a peak in publications on the day of the invasion.
  • The fact-checkers focused more on expanding and contextualizing information as well as debunking false claims, rather than on improving media literacy.
  • The fact-checkers’ posts, while rapid, were often repetitive in their content and lacked innovation in their visual presentation, which may have limited their impact and engagement on Twitter.
Sacaluga-Rodríguez et al. (2024)
  • To analyze the general disposition to detect disinformation across the different enneatypes.
  • To evaluate competences in identifying disinformation on the topics where fake news is most prevalent, and how the enneatypes differ in detecting it.
  • Quantitative method: survey.
  • Sample: 144 students in their final years of communication and journalism degrees at the European University of Madrid.
  • Timeframe: February to March 2023.
  • The study found that personality type, as measured by the enneagram, influenced the ability to detect disinformation, with some enneatypes showing greater susceptibility than others.
  • The majority of participants were identified as having enthusiast, loyal, and challenger enneatypes, comprising 60% of the sample.
  • The individualist and researcher enneatypes performed significantly worse than other types in detecting disinformation.
Springer et al. (2023)
  • To understand individual and collective verification routines in news journalism.
  • To investigate how journalists determine if information is problematic or needs cross-checking, and how they use this information.
  • To understand how the verification determination process is made transparent to audiences.
  • Mixed-methods: content analysis of media coverage and interviews with journalists.
  • Corpus and sample: 149 news articles from Swedish and Ukrainian media outlets; 7 Swedish and 11 Ukrainian journalists.
  • Timeframe: content analysis between 2017 and 2019; interviews between spring 2019 and spring 2020.
  • Sourcing and verification practices in conflict news reporting are largely individualized, show little innovation, and rely on collective practices of trust that can lead to confirmation bias.
  • Source transparency is often lacking, and journalists may use linguistic framing to delegitimize certain sources or perspectives, which can undermine the audience’s understanding of the information’s reliability.
Tulin et al. (2024)
  • To investigate the role of accuracy-motivated goals versus directionally motivated goals in citizens’ decisions to read fact-checks about the Russian war in Ukraine.
  • To examine how country-level factors such as polarization, press freedom, and geographical proximity to the war influence citizens’ fact-checking behaviors.
  • Quantitative method: survey.
  • Sample: 19,037 citizens from 19 countries.
  • Timeframe: April to May 2022.
  • Accuracy motivations (i.e., seeking to know the truth), rather than directional motivations (i.e., seeking to confirm existing beliefs), drive citizens to read fact-checks about the Russian war in Ukraine.
  • Individuals are more likely to read fact-checks if they perceive false information to be deliberately spread (disinformation), have a higher need for cognition, and find the issue more personally relevant.
  • Country-level press freedom is negatively associated with the likelihood of reading fact-checks, suggesting that citizens in countries with lower press freedom are more motivated to verify information.
Zecchinon and Standaert (2025)
  • To understand how the fact-checking unit of Le Monde (Les Décodeurs) organized its practices in response to visual disinformation surrounding the Russia-Ukraine conflict.
  • To place the practices of Les Décodeurs within the broader framework of previous research on fact-checking and disinformation.
  • To understand the discursive positions that fact-checkers develop about the function of fact-checking in the context of war.
  • Mixed-methods: semi-structured interviews (qualitative) and content analysis of fact-checks (quantitative).
  • Corpus and sample: 48 debunks and five semi-structured interviews.
  • Timeframe: interviews from November 2022 to February 2023; debunks from 24 February 2022 to 31 December 2022.
  • The majority of visual disinformation surrounding the war in Ukraine involved authentic images with misleading captions rather than sophisticated manipulation.
  • Existing categorizations of disinformation were able to effectively classify the visual disinformation analyzed in the study.
  • The study suggests refining the “false context” category to better capture the nuances of how authentic visuals are misused to spread disinformation.
Source: authors’ own elaboration using analysis and data provided by Elicit.

Notes

1. Information retrieved from WOS. https://tinyurl.com/249c5s92, accessed on 28 November 2024.
2. Information retrieved from WOS. https://tinyurl.com/3d3uuxtx, accessed on 28 November 2024.

References

  1. Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211–236. [Google Scholar] [CrossRef]
  2. Almeida-Santos, C., Peixinho, A. T., Lopes, F., & Araújo, R. (2023). Fact-checks: La liquidez de un género. Un estudio de caso portugués en un contexto pandémico. Estudios Sobre el Mensaje Periodístico, 29(2), 259–272. [Google Scholar] [CrossRef]
  3. Amazeen, M. A. (2020). Journalistic interventions: The structural factors affecting the global emergence of fact-checking. Journalism, 21(1), 95–111. [Google Scholar] [CrossRef]
  4. Amazeen, M. A., Krishna, A., & Eschmann, R. (2022). Cutting the Bunk: Comparing the Solo and Aggregate Effects of Prebunking and Debunking COVID-19 Vaccine Misinformation. Science Communication, 44(4), 387–417. [Google Scholar] [CrossRef]
  5. Arce-García, S., Said-Hung, E., & Mottareale-Calvanese, D. (2022). Astroturfing as a strategy for manipulating public opinion on Twitter during the pandemic in Spain. Profesional de la Información, 31(3), e310310. [Google Scholar] [CrossRef]
  6. Bachmann, I., & Valenzuela, S. (2023). Studying the downstream effects of fact-checking on social media: Experiments on correction formats, belief accuracy and media trust. Social Media + Society, 9(2), 1–13. [Google Scholar] [CrossRef]
  7. Badrinathan, S., & Chauchard, S. (2024). “I Don’t Think That’s True, Bro!” Social Corrections of Misinformation in India. International Journal of Press-Politics, 29(2), 394–416. [Google Scholar] [CrossRef]
  8. Baptista, J. P., & Gradim, A. (2022). Online disinformation on Facebook: The spread of fake news during the Portuguese 2019 election. Journal of Contemporary European Studies, 30(2), 297–312. [Google Scholar] [CrossRef]
  9. Baptista, J. P., Gradim, A., Loureiro, M., & Ribeiro, F. (2023a). Fact-checking: Uma prática recente em Portugal? Análise da perceção da audiência. Anuario Electrónico de Estudios en Comunicación Social “Disertaciones”, 16(1), 1–28. [Google Scholar] [CrossRef]
  10. Baptista, J. P., Rivas-de-Roca, R., Gradim, A., & Loureiro, M. (2023b). The disinformation reaction to the Russia–Ukraine war: An analysis through the lens of Iberian fact-checking. KOME—An International Journal of Pure Communication Inquiry, 11(2), 27–48. [Google Scholar] [CrossRef]
  11. Berger, G. (2018). Foreword. In C. Ireton, & J. Posetti (Eds.), Journalism, ‘fake news’ & disinformation: Handbook for journalism education and training (pp. 7–13). UNESCO Series on Journalism Education. United Nations Educational, Scientific and Cultural Organization. [Google Scholar]
  12. Bigot, L. (2017). Le fact-checking ou la réinvention d’une pratique de vérification. Communication & Langages, 192(2), 131–156. [Google Scholar] [CrossRef]
  13. Blanco-Herrero, D., Arcila-Calderón, C., & Tovar-Torrealba, M. (2024). Pandemia, politización y odio: Características de la desinformación en España. Estudios sobre el Mensaje Periodístico, 30(3), 503–515. [Google Scholar] [CrossRef]
  14. Bran, R., Tiru, L., Grosseck, G., Holotescu, C., & Malita, L. (2021). Learning from each other: A bibliometric review of research on information disorders. Sustainability, 13(18), 10094. [Google Scholar] [CrossRef]
  15. Brookes, S., & Waller, L. (2023). Communities of practice in the production and resourcing of fact-checking. Journalism, 24(9), 1938–1958. [Google Scholar] [CrossRef]
  16. Bruns, A., Harrington, S., & Hurcombe, E. (2020). ‘Corona? 5G? or both?’: The dynamics of COVID-19/5G conspiracy theories on Facebook. Media International Australia, 177, 12–29. [Google Scholar] [CrossRef]
  17. Burel, G., & Alani, H. (2023). The fact-checking observatory: Reporting the co-spread of misinformation and fact-checks on social media. In 34th ACM conference on hypertext and social media (HT ’23) (pp. 1–3). Association for Computing Machinery. [Google Scholar] [CrossRef]
  18. Cardoso, G., Paisana, M., & Pinto-Martinho, A. (2023). Digital news report portugal 2023. Obercom—Reuters Institute for the Study of Journalism. Available online: https://tinyurl.com/2jtaa3hn (accessed on 17 December 2024).
  19. Chan, J. (2024). Online astroturfing: A problem beyond disinformation. Philosophy & Social Criticism, 50(3), 507–528. [Google Scholar] [CrossRef]
  20. Charlton, T., Mayer, A.-T., & Ohme, J. (2024). A Common Effort: New Divisions of Labor Between Journalism and OSINT Communities on Digital Platforms. The International Journal of Press/Politics, 0(0), 1–22. [Google Scholar] [CrossRef]
  21. Culloty, E., & Suiter, J. (2021). Disinformation and manipulation in digital media: Information pathologies. Routledge. [Google Scholar] [CrossRef]
  22. Dafonte-Gómez, A., Míguez-González, M., & Ramahí-García, D. (2022). Fact-checkers on social networks: Analysis of their presence and content distribution channels. Communication & Society, 35(3), 73–89. [Google Scholar] [CrossRef]
  23. Dame Adjin-Tettey, T. (2022). Combating fake news, disinformation, and misinformation: Experimental evidence for media literacy education. Cogent Arts & Humanities, 9(1), 2037229. [Google Scholar] [CrossRef]
  24. Dias, N., & Sippitt, A. (2020). Researching fact checking: Present limitations and future opportunities. The Political Quarterly, 91, 605–613. [Google Scholar] [CrossRef]
  25. Diaz Ruiz, C. (2023). Disinformation on digital media platforms: A market-shaping approach. New Media & Society, 14614448231207644. [Google Scholar] [CrossRef]
  26. Di-Domenico, G., Sit, J., Ishizaka, A., & Nunan, D. (2021). Fake news, social media and marketing: A systematic review. Journal of Business Research, 124, 329–341. [Google Scholar] [CrossRef]
  27. Dierickx, L., & Lindén, C. (2024). Fact-checking the war in Ukraine, or when the screens have become battlefields. EDMO. Available online: https://tinyurl.com/yjjd8svx (accessed on 8 March 2025).
  28. Dierickx, L., Lindén, C. G., & Opdahl, A. L. (2023). Automated Fact-Checking to Support Professional Practices: Systematic Literature Review and Meta-Analysis. International Journal of Communication, 17, 5170–5190. Available online: https://ijoc.org/index.php/ijoc/article/view/21071/4287 (accessed on 8 March 2025).
  29. Donthu, N., Kumar, S., Mukherjee, D., Pandey, N., & Lim, W. M. (2021). How to conduct a bibliometric analysis: An overview and guidelines. Journal of Business Research, 133, 285–296. [Google Scholar] [CrossRef]
  30. Durr-Missau, L. (2024). The role of social sciences in the study of misinformation: A bibliometric analysis of web of science and scopus publications (2017–2022). Tripodos, 56, 3. [Google Scholar] [CrossRef]
  31. EDMO. (n.d.). War in Ukraine. Available online: https://tinyurl.com/4mv2zvhj (accessed on 17 December 2024).
  32. EDMO Task Force on Disinformation on the War in Ukraine. (n.d.). Available online: https://tinyurl.com/44yfzmek (accessed on 8 March 2025).
  33. European Commission. (2023). European Commission: Directorate-general for communication. Public opinion in the European Union—First results—Winter 2022–2023. European Commission. Available online: https://data.europa.eu/doi/10.2775/460956 (accessed on 8 March 2025).
  34. European Fact-Checking Standards Network (EFCSN). (2024). European fact-checking standards network (EFCSN). Available online: https://efcsn.com/ (accessed on 17 December 2024).
  35. European Fact-Checking Standards Network Transparency Centre. (2023). Transparency centre. Available online: https://tinyurl.com/34rjkrnt (accessed on 17 December 2024).
  36. Feng, M., Tsang, N. L., & Lee, F. L. (2021). Fact-checking as mobilization and counter-mobilization: The case of the anti-extradition bill movement in Hong Kong. Journalism Studies, 22(10), 1358–1375. [Google Scholar] [CrossRef]
  37. Ferreira, C., & Amaral, I. (2022). Media literacy and critical thinking: Evaluating the impact on combating misinformation. International Journal of Communication, 16, 4567–4585. [Google Scholar]
  38. Frau-Meigs, D. (2022). How disinformation reshaped the relationship between journalism and media and information literacy (mil): Old and new perspectives revisited. Digital Journalism, 10(5), 912–922. [Google Scholar] [CrossRef]
  39. Freelon, D., Bossetta, M., Wells, C., Lukito, J., Xia, Y., & Adams, K. (2020). Black trolls matter: Racial and ideological asymmetries in social media disinformation. Social Science Computer Review, 40, 560–578. [Google Scholar] [CrossRef]
  40. French, A. M., Storey, V. C., & Wallace, L. (2023). A typology of disinformation intentionality and impact. Information Systems Journal, 34, 1324–1354. [Google Scholar] [CrossRef]
  41. García-Marín, D., Pérez-Serrano, M. J., & Santos-Díez, M. T. (2023). Youth, social networks, and political participation: A study on the influence of Instagram. Comunicar, 31(74), 45–55. [Google Scholar]
  42. García-Marín, D., & Salvat-Martinrey, G. (2021). Investigación sobre desinformación en España. Análisis de tendencias temáticas a partir de una revisión sistematizada de la literatura. Fonseca Journal of Communication, 23, 199–225. [Google Scholar] [CrossRef]
  43. García-Marín, D., & Salvat-Martinrey, G. (2022). Viralizar la verdad. Factores predictivos del engagement en el contenido verificado en TikTok. Profesional de la Información, 31(2), e310210. [Google Scholar] [CrossRef]
  44. García-Marín, D., & Salvat-Martinrey, G. (2023). Disinformation and war. Verification of false images about the Russian-Ukrainian conflict. Revista ICONO 14. Revista científica de Comunicación y Tecnologías emergentes, 21(1), 1–23. [Google Scholar] [CrossRef]
  45. Graves, L., & Amazeen, M. (2019). Fact-checking as idea and practice in journalism. Oxford Research Encyclopedia of Communication. Available online: https://tinyurl.com/2twmrxz5 (accessed on 17 December 2024).
  46. Graves, L., Bélair-Gagnon, V., & Larsen, R. (2023). From public reason to public health: Professional implications of the “Debunking Turn” in the global fact-checking field. Digital Journalism, 12(10), 1417–1436. [Google Scholar] [CrossRef]
  47. Hameleers, M. (2020). Separating truth from lies: Comparing the effects of news media literacy interventions and fact-checkers in response to political misinformation in the US and Netherlands. Information, Communication & Society, 25, 110–126. [Google Scholar] [CrossRef]
  48. Hameleers, M. (2023). The (Un)Intended consequences of emphasizing the threats of mis- and disinformation. Media and Communication, 11(2), 5–14. [Google Scholar] [CrossRef]
  49. Herrero-Diz, P., Pérez-Escolar, M., & Aramburu, D. V. (2022). Fact-checking skills: A proposal for Communication studies. Revista de Comunicación, 21(1), 231–249. [Google Scholar] [CrossRef]
  50. Iberifier. (2024). Iberian digital media observatory. Available online: https://iberifier.eu/ (accessed on 17 December 2024).
  51. IFCN. (2024). About. IFCN code of principles. Available online: https://tinyurl.com/2j69jksf (accessed on 17 December 2024).
  52. Igartua, J. J., Ortega, F., & Arcila-Calderón, C. (2020). Communication use in the times of the coronavirus. A cross-cultural study. Profesional de la Información, 29(3), e290318. [Google Scholar] [CrossRef]
  53. Ireton, C., & Posetti, J. (2018). Introduction. In C. Ireton, & J. Posetti (Eds.), Journalism, ’fake news’ & disinformation: Handbook for journalism education and training (pp. 14–31). UNESCO Series on Journalism Education. United Nations Educational, Scientific and Cultural Organization. [Google Scholar]
  54. Jiang, S. (2022). The roles of worry, social media information overload, and social media fatigue in hindering health Fact-checking. Social Media + Society, 8(3), 1–12. [Google Scholar] [CrossRef]
  55. Jones-Jang, S. M., Mortensen, T. M., & Liu, J. (2019). Does media literacy help identification of fake news? Information literacy helps, but other literacies don’t. American Behavioral Scientist, 65, 371–388. [Google Scholar] [CrossRef]
  56. KaabOmeir, F., Khademizadeh, S., Seifadini, R., Balani, S. O., & Khazaneha, M. (2024). Overview of misinformation and disinformation research from 1971 to 2022. Journal of Scientometric Research, 13(2), 430–447. [Google Scholar] [CrossRef]
  57. Lauer, L., & Graves, L. (2024). How to grow a transnational field: A network analysis of the global fact-checking movement. New Media & Society, 15, 14614448241227856. [Google Scholar] [CrossRef]
  58. Lelo, T. (2022). The rise of the Brazilian fact-checking movement: Between economic sustainability and editorial independence. Journalism Studies, 23(9), 1077–1095. [Google Scholar] [CrossRef]
  59. Li, C., Ali, M. N. S., Rizal, A. R. B. A., & Xu, J. (2025). A bibliometric analysis of media convergence in the twenty-first century: Current status, hotspots, and trends. Studies in Media and Communication, 13(1), 313–331. [Google Scholar] [CrossRef]
  60. Li, X., Lyu, W., & Salleh, S. M. (2023). Misinformation in communication studies: A review and bibliometric analysis. Jurnal Komunikasi: Malaysian Journal of Communication, 39(4), 467–488. [Google Scholar] [CrossRef]
  61. López-Pan, F., & Rodríguez-Rodríguez, J. (2020). El fact checking en España. Plataformas, prácticas y rasgos distintivos. Estudios Sobre el Mensaje Periodístico, 26(3), 1045–1065. [Google Scholar] [CrossRef]
  62. Lu, Y., & Shen, C. (2023). Unpacking multimodal fact-checking: Features and engagement of fact-checking videos on Chinese TikTok (Douyin). Social Media + Society, 9(1), 1–16. [Google Scholar] [CrossRef]
  63. Magallón-Rosa, R., Fernández-Castrillo, C., & Garriga, M. (2023). Fact-checking in war: Types of hoaxes and trends from a year of disinformation in the Russo-Ukrainian war. Profesional de la Información, 32(5), e320520. [Google Scholar] [CrossRef]
  64. Mantzarlis, A. (2018). Fact-checking 101. In C. Ireton, & J. Posetti (Eds.), Journalism, ’fake news’ & disinformation: Handbook for journalism education and training (pp. 81–95). UNESCO Series on Journalism Education. United Nations Educational, Scientific and Cultural Organization. [Google Scholar]
  65. Mare, A., & Munoriyarwa, A. (2022). Guardians of truth? Fact-checking the ‘disinfodemic’ in Southern Africa during the COVID-19 pandemic. Journal of African Media Studies, 14(1), 63–79. [Google Scholar] [CrossRef]
  66. Martínez-Costa, M., López-Pan, F., Buslón, N., & Salaverría, R. (2023). Nobody-fools-me perception: Influence of age and education on the overconfidence of spotting disinformation. Journalism Practice, 17(10), 2084–2102. [Google Scholar] [CrossRef]
  67. McCombs, M. E., & Shaw, D. L. (1972). The agenda-setting function of mass media. Public Opinion Quarterly, 36(2), 176–187. [Google Scholar] [CrossRef]
  68. Moon, W. K., Chung, M., & Jones-Jang, S. M. (2023). How can we fight partisan biases in the COVID-19 pandemic? AI source labels on fact-checking messages reduce motivated reasoning. Mass Communication and Society, 26(4), 646–670. [Google Scholar] [CrossRef]
  69. Morejón-Llamas, N., Martín-Ramallal, P., & Micaletto-Belda, J. P. (2022). Twitter content curation as an antidote to hybrid warfare during Russia’s invasion of Ukraine. Profesional de la Información, 31(3), e310308. [Google Scholar] [CrossRef]
  70. Moreno-Gil, V., Ramon-Vegas, X., & Mauri-Ríos, M. (2022). Bringing journalism back to its roots: Examining fact-checking practices, methods, and challenges in the Mediterranean context. Profesional de la Información, 31(2), e310215. [Google Scholar] [CrossRef]
  71. Moreno-Gil, V., Ramon, X., & Rodríguez-Martínez, R. (2021). Fact-checking interventions as counteroffensives to disinformation growth: Standards, values, and practices in Latin America and Spain. Media and Communication, 9(1), 1251–1263. [Google Scholar] [CrossRef]
  72. Navarro-Sierra, N., Magro-Vela, S., & Vinader-Segura, R. (2024). Research on disinformation in academic studies: Perspectives through a bibliometric analysis. Publications, 12(2), 14. [Google Scholar] [CrossRef]
  73. Ostapenko, L., Vorontsova, A., Voronenko, I., Makarenko, I., & Kozmenko, S. (2023). Coverage of the Russian armed aggression against Ukraine in scientific works: Bibliometric analysis. Journal of International Studies, 16(3), 9–33. [Google Scholar] [CrossRef]
  74. Pandey, S., & Ghosh, M. (2023). Bibliometric review of research on misinformation: Reflective analysis on the future of communication. Journal of Creative Communications, 18(2), 149–165. [Google Scholar] [CrossRef]
  75. Passas, I. (2024). Bibliometric analysis: The main steps. Encyclopedia, 4(2), 1014–1025. [Google Scholar] [CrossRef]
  76. Pennycook, G., & Rand, D. G. (2021). The psychology of fake news. Trends in Cognitive Sciences, 25(5), 388–402. [Google Scholar] [CrossRef] [PubMed]
  77. Pérez-Escolar, M., Lilleker, D., & Tapia-Frade, A. (2023). A systematic literature review of the phenomenon of disinformation and misinformation. Media and Communication, 11(2), 76–87. [Google Scholar] [CrossRef]
  78. Richter, A. (2019). Accountability and media literacy mechanisms as a counteraction to disinformation in Europe. Journal of Digital Media & Policy, 10(3), 311–327. [Google Scholar] [CrossRef]
  79. Rincón, A. G., Barbosa, R. L., Segovia-García, N., & Franco, D. R. (2022). Disinformation in social networks and bots: Simulated scenarios of its spread from system dynamics. Systems, 10(2), 34. [Google Scholar] [CrossRef]
  80. Rivas-de-Roca, R., & Pérez-Curiel, C. (2023). Global political leaders during the COVID-19 vaccination: Between propaganda and fact-checking. Politics and the Life Sciences, 42(1), 104–119. [Google Scholar] [CrossRef] [PubMed]
  81. Rodríguez-Ferrándiz, R. (2023). An overview of the fake news phenomenon: From untruth-driven to post-truth-driven approaches. Media and Communication, 11(2), 15–29. [Google Scholar] [CrossRef]
  82. Rodríguez-Pérez, C. V., Paniagua-Rojano, F. J., & Magallón-Rosa, R. (2021). Debunking political disinformation through journalists’ perceptions: An analysis of Colombia’s fact-checking news practices. Media and Communication, 9, 264–275. [Google Scholar] [CrossRef]
  83. Sacaluga-Rodríguez, I., Vargas, J. J., & Sánchez, J. P. (2024). Exploring neurocommunicative confluence: Analysis of the interdependence between personality traits and information consumption patterns in the detection of fake news. A study with university students of journalism and communication using enneagrams. Revista Latina de Comunicación Social, 1–16. [Google Scholar] [CrossRef]
  84. Salaverría, R., Buslón, N., López-Pan, F., León, B., López-Goñi, I., & Erviti, M. (2020). Disinformation in times of pandemic: Typology of hoaxes on COVID-19/Desinformación en tiempos de pandemia: Tipología de los bulos sobre la COVID-19. Profesional de la Información, 29(3), e29031. [Google Scholar] [CrossRef]
  85. Salaverría, R., & Cardoso, G. (2023). Future of disinformation studies: Emerging research fields. Profesional de la Información, 32(5), e320525. [Google Scholar] [CrossRef]
  86. Salvador-Mata, B., Cortiñas-Rovira, S., & Herrero-Solana, V. (2023). La investigación en periodismo y COVID-19 en España: Mayor impacto académico en citas, aproximaciones metodológicas clásicas e importancia temática de la desinformación. Revista Latina de Comunicación Social, 81, 554–574. [Google Scholar] [CrossRef]
  87. Sandu, A., Ioanăș, I., Delcea, C., Geantă, L.-M., & Cotfas, L.-A. (2024). Mapping the landscape of misinformation detection: A bibliometric approach. Information, 15(1), 60. [Google Scholar] [CrossRef]
  88. Sádaba, C., & Salaverría, R. (2023). Combatir la desinformación con alfabetización mediática: Análisis de las tendencias en la Unión Europea. Revista Latina de Comunicación Social, 17–33. [Google Scholar] [CrossRef]
  89. Sousa, Â., Silva, J., & Ferreira, C. (2022). Digital journalism and the challenge of fake news: A study on verification practices. Journal of Media Studies, 34(2), 123–140. [Google Scholar]
  90. Springer, N., Nygren, G., Orlova, D., Taradai, D., & Widholm, A. (2023). Sourcing dis/information: How Swedish and Ukrainian journalists source, verify, and mediate journalistic truth during the Russian-Ukrainian conflict. Journalism Studies, 24(9), 1111–1130. [Google Scholar] [CrossRef]
  91. Steensen, S., Belair-Gagnon, V., Graves, L., Kalsnes, B., & Westlund, O. (2022). Journalism and Source Criticism. Revised Approaches to Assessing Truth-Claims. Journalism Studies, 23(16), 2119–2137. [Google Scholar] [CrossRef]
  92. Steensen, S., Kalsnes, B., & Westlund, O. (2024). The limits of live fact-checking: Epistemological consequences of introducing a breaking news logic to political fact-checking. New Media & Society, 26(11), 6347–6365. [Google Scholar] [CrossRef]
  93. Stencel, M., Ryan, E., & Luther, J. (2024). With half the planet going to the polls in 2024, fact-checking sputters. Available online: https://reporterslab.org/2024/05/30/with-half-the-planet-going-to-the-polls-in-2024-fact-checking-sputters/ (accessed on 17 December 2024).
  94. Sun, Y. Q. (2022). Verification upon exposure to COVID-19 misinformation: Predictors, outcomes, and the mediating role of verification. Science Communication, 44(3), 261–291. [Google Scholar] [CrossRef]
  95. Tandoc, E. C., Jr., Lim, Z., & Ling, R. (2018). Defining ‘fake news’. A typology of scholarly definitions. Digital Journalism, 6(2), 137–153. [Google Scholar] [CrossRef]
  96. Tătaru, G.-C., Domenteanu, A., Delcea, C., Florescu, M. S., Orzan, M., & Cotfas, L. A. (2024). Navigating the disinformation maze: A bibliometric analysis of scholarly efforts. Information, 15(12), 742. [Google Scholar] [CrossRef]
  97. Tejedor, S., Romero-Rodríguez, L. M., & Gracia-Villar, M. (2024). Unveiling the truth: A systematic review of fact-checking and fake news research in social sciences. Online Journal of Communication and Media Technologies, 14(2), e202427. [Google Scholar] [CrossRef]
  98. Traquina, N. (2002). O que é jornalismo. Quimera Editores. [Google Scholar]
  99. Tulin, M., Hameleers, M., de Vreese, C., Aalberg, T., Corbu, N., Van Erkel, P., Esser, F., Gehle, L., Halagiera, D., Hopmann, D. N., Koc-Michalska, K., Matthes, J., Mihelj, S., Schemer, C., Stetka, V., Strömbäck, J., Terren, L., & Theocharis, Y. (2024). Why do citizens choose to read fact-checks in the context of the Russian war in Ukraine? The role of directional and accuracy motivations in nineteen democracies. The International Journal of Press/Politics, 29, 19401612241233533. [Google Scholar] [CrossRef]
  100. #UkraineFacts. (2024). Available online: https://ukrainefacts.org/ (accessed on 17 December 2024).
  101. Ukraine War Resource Hub, EU DisinfoLab. (2024). EU disinfoLab. Available online: https://www.disinfo.eu/ukraine-hub/ (accessed on 17 December 2024).
  102. Valverde-Berrocoso, J., González-Fernández, A., & Acevedo-Borrega, J. (2022). Disinformation and multiliteracy: A systematic review of the literature. Comunicar, 70, 97–110. [Google Scholar] [CrossRef]
  103. Vara, A., Amoedo, A., Moreno, E., Negredo, S., & Kaufmann, J. (2022). Digital news report españa 2022. Servicio de Publicaciones de la Universidad de Navarra. [Google Scholar] [CrossRef]
  104. Vargo, C. J., Guo, L., & Amazeen, M. A. (2018). The agenda-setting power of fake news: A big data analysis of the online media landscape from 2014 to 2016. New Media & Society, 20(5), 2028–2049. [Google Scholar] [CrossRef]
  105. Vinhas, O., & Bastos, M. (2022, November 2–5). When fact-checking is not WEIRD: Challenges in fact-checking beyond the western world. AoIR 2022: The 23rd Annual Conference of the Association of Internet Researchers, Dublin, Ireland. Available online: http://spir.aoir.org (accessed on 17 December 2024).
  106. Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151. [Google Scholar] [CrossRef]
  107. Vu, H. T., Baines, A., & Nguyen, N. (2023). Fact-checking climate change: An analysis of claims and verification practices by fact-checkers in four countries. Journalism & Mass Communication Quarterly, 100(2), 286–307. [Google Scholar] [CrossRef]
  108. Walker, M., McNaughton, N., & Elliott, D. (2020). Journalistic integrity in the age of social media: A case study of the UK press. Digital Journalism, 8(5), 678–695. [Google Scholar]
  109. Walter, N., Cohen, J., Holbert, R. L., & Morag, Y. (2020). Fact-checking: A meta-analysis of what works and for whom. Political Communication, 37, 350–375. [Google Scholar] [CrossRef]
  110. Walter, N., & Salovich, N. A. (2021). Unchecked vs. Uncheckable: How opinion-based claims can impede corrections of misinformation. Mass Communication and Society, 24, 500–526. [Google Scholar] [CrossRef]
  111. Wardle, C., & Derakhshan, H. (2018). Thinking about ‘information disorder’: Formats of misinformation, disinformation, and mal-information. In C. Ireton, & J. Posetti (Eds.), Journalism, ‘fake news’ & disinformation: Handbook for journalism education and training (pp. 43–54). UNESCO Series on Journalism Education. United Nations Educational, Scientific and Cultural Organization. [Google Scholar]
  112. Wasike, B. (2023). You’ve been fact-checked! examining the effectiveness of social media fact-checking against the spread of misinformation. Telematics and Informatics Reports, 11, 100090. [Google Scholar] [CrossRef]
  113. Xiao, X. Z. (2024). Let’s verify and rectify! Examining the nuanced influence of risk appraisal and norms in combatting misinformation. New Media & Society, 26(7), 3786–3809. [Google Scholar] [CrossRef]
  114. Yu, W. T., Payton, B., Sun, M. R., Jia, W. F., & Huang, G. X. (2023). Toward an integrated framework for misinformation and correction sharing: A systematic review across domains. New Media & Society, 25(8), 2241–2267. [Google Scholar] [CrossRef]
  115. Zecchinon, E., & Standaert, O. (2024). Algorithmic transparency in news feeds: Implications for media pluralism. Journalism Studies, 25(1), 89–105. [Google Scholar]
  116. Zecchinon, P., & Standaert, O. (2025). The war in Ukraine through the prism of visual disinformation and the limits of specialized fact-checking. A case-study at Le Monde. Digital Journalism, 13(1), 61–79. [Google Scholar] [CrossRef]
  117. Zeng, Y., Ding, X., Zhao, Y., Li, X., Zhang, J., Yao, C., Liu, T., & Qin, B. (2024). RU22Fact: Optimizing evidence for multilingual explainable fact-checking on Russia–Ukraine conflict. arXiv, arXiv:2403.16662. [Google Scholar] [CrossRef]
Figure 1. Top 10 disciplines of scientific articles on fact-checking in Social Sciences (n). Source: WOS.
Figure 2. Evolution of scientific production on fact-checking during the analyzed period (n). Source: authors’ own elaboration using WOS data.
Figure 3. Top 20 researchers in Communication with most articles on fact-checking published during the studied period (n). Source: authors’ own elaboration using WOS data.
Figure 4. Top 20 universities in Communication (n). Source: WOS.
Figure 5. Top 10 affiliation countries of the articles’ authors on fact-checking published during the studied period (n). Source: authors’ own elaboration using WOS data.
Figure 6. Top 10 journals publishing scientific articles on fact-checking (n). Source: authors’ own elaboration using WOS data.
Table 1. Most cited papers on fact-checking published during the studied period.
Rank | Citation | Complete Reference | Citations
1 | Jiang (2022) | Jiang, S. H. (2022). The Roles of Worry, Social Media Information Overload, and Social Media Fatigue in Hindering Health Fact-Checking. Social Media + Society, 8(3). https://doi.org/10.1177/20563051221113070 | 34
2 | Amazeen et al. (2022) | Amazeen, M. A., Krishna, A., & Eschmann, R. (2022). Cutting the Bunk: Comparing the Solo and Aggregate Effects of Prebunking and Debunking COVID-19 Vaccine Misinformation. Science Communication, 44(4), 387–417. https://doi.org/10.1177/10755470221111558 | 24
3 | García-Marín and Salvat-Martinrey (2022) | García-Marín, D., & Salvat-Martinrey, G. (2022). Viralizing the truth: predictive factors of fact-checkers’ engagement on TikTok. Profesional de la Información, 31(2). https://doi.org/10.3145/epi.2022.mar.10 | 21
4 | Moon et al. (2023) | Moon, W. K., Chung, M., & Jones-Jang, S. M. (2023). How Can We Fight Partisan Biases in the COVID-19 Pandemic? AI Source Labels on Fact-checking Messages Reduce Motivated Reasoning. Mass Communication and Society, 26(4), 646–670. https://doi.org/10.1080/15205436.2022.2097926 | 17
4 | Morejón-Llamas et al. (2022) | Morejón-Llamas, N., Martín-Ramallal, P., & Micaletto-Belda, J. P. (2022). Twitter content curation as an antidote to hybrid warfare during Russia’s invasion of Ukraine. Profesional de la Información, 31(3). https://doi.org/10.3145/epi.2022.may.08 | 17
4 | Xiao (2024) | Xiao, X. Z. (2024). Let’s verify and rectify! Examining the nuanced influence of risk appraisal and norms in combatting misinformation. New Media & Society, 26(7), 3786–3809. https://doi.org/10.1177/14614448221104948 | 17
5 | Badrinathan and Chauchard (2024) | Badrinathan, S., & Chauchard, S. (2024). “I Don’t Think That’s True, Bro!” Social Corrections of Misinformation in India. International Journal of Press-Politics, 29(2), 394–416. https://doi.org/10.1177/19401612231158770 | 16
6 | Brookes and Waller (2023) | Brookes, S., & Waller, L. (2023). Communities of practice in the production and resourcing of fact-checking. Journalism, 24(9), 1938–1958. https://doi.org/10.1177/14648849221078465 | 15
6 | Graves et al. (2023) | Graves, L., Bélair-Gagnon, V., & Larsen, R. (2023). From Public Reason to Public Health: Professional Implications of the “Debunking Turn” in the Global Fact-Checking Field. Digital Journalism. https://doi.org/10.1080/21670811.2023.2218454 | 15
7 | Lu and Shen (2023) | Lu, Y. D., & Shen, C. H. (2023). Unpacking Multimodal Fact-Checking: Features and Engagement of Fact-Checking Videos on Chinese TikTok (Douyin). Social Media + Society, 9(1). https://doi.org/10.1177/20563051221150406 | 14
7 | Sun (2022) | Sun, Y. Q. (2022). Verification Upon Exposure to COVID-19 Misinformation: Predictors, Outcomes, and the Mediating Role of Verification. Science Communication, 44(3), 261–291. https://doi.org/10.1177/10755470221088927 | 14
8 | Bachmann and Valenzuela (2023) | Bachmann, I., & Valenzuela, S. (2023). Studying the Downstream Effects of Fact-Checking on Social Media: Experiments on Correction Formats, Belief Accuracy, and Media Trust. Social Media + Society, 9(2). https://doi.org/10.1177/20563051231179694 | 13
8 | Hameleers (2023) | Hameleers, M. (2023). The (Un)Intended Consequences of Emphasizing the Threats of Mis- and Disinformation. Media and Communication, 11(2), 5–14. https://doi.org/10.17645/mac.v11i2.6301 | 13
8 | Mare and Munoriyarwa (2022) | Mare, A., & Munoriyarwa, A. (2022). Guardians of truth? Fact-checking the ‘disinfodemic’ in Southern Africa during the COVID-19 pandemic. Journal of African Media Studies, 14(1), 63–79. https://doi.org/10.1386/jams_00065_1 | 13
8 | Steensen et al. (2022) | Steensen, S., Belair-Gagnon, V., Graves, L., Kalsnes, B., & Westlund, O. (2022). Journalism and Source Criticism. Revised Approaches to Assessing Truth-Claims. Journalism Studies, 23(16), 2119–2137. https://doi.org/10.1080/1461670x.2022.2140446 | 13
8 | Steensen et al. (2024) | Steensen, S., Kalsnes, B., & Westlund, O. (2024). The limits of live fact-checking: Epistemological consequences of introducing a breaking news logic to political fact-checking. New Media & Society, 26(11), 6347–6365. https://doi.org/10.1177/14614448231151436 | 13
8 | Yu et al. (2023) | Yu, W. T., Payton, B., Sun, M. R., Jia, W. F., & Huang, G. X. (2023). Toward an integrated framework for misinformation and correction sharing: A systematic review across domains. New Media & Society, 25(8), 2241–2267. https://doi.org/10.1177/14614448221116569 | 13
9 | Herrero-Diz et al. (2022) | Herrero-Diz, P., Pérez-Escolar, M., & Aramburu, D. V. (2022). Fact-checking skills: a proposal for Communication studies. Revista de Comunicación, 21(1), 231–249. https://doi.org/10.26441/RC21.1-2022-A12 | 12
9 | Lelo (2022) | Lelo, T. (2022). The Rise of the Brazilian Fact-checking Movement: Between Economic Sustainability and Editorial Independence. Journalism Studies, 23(9), 1077–1095. https://doi.org/10.1080/1461670x.2022.2069588 | 12
9 | Moreno-Gil et al. (2022) | Moreno-Gil, V., Ramon-Vegas, X., & Mauri-Ríos, M. (2022). Bringing journalism back to its roots: examining fact-checking practices, methods, and challenges in the Mediterranean context. Profesional de la Información, 31(2). https://doi.org/10.3145/epi.2022.mar.15 | 12
9 | Rodríguez-Ferrándiz (2023) | Rodríguez-Ferrándiz, R. (2023). An Overview of the Fake News Phenomenon: From Untruth-Driven to Post-Truth-Driven Approaches. Media and Communication, 11(2), 15–29. https://doi.org/10.17645/mac.v11i2.6315 | 12
10 | Dierickx et al. (2023) | Dierickx, L., Lindén, C. G., & Opdahl, A. L. (2023). Automated Fact-Checking to Support Professional Practices: Systematic Literature Review and Meta-Analysis. International Journal of Communication, 17, 5170–5190. | 11
10 | Frau-Meigs (2022) | Frau-Meigs, D. (2022). How Disinformation Reshaped the Relationship between Journalism and Media and Information Literacy (MIL): Old and New Perspectives Revisited. Digital Journalism, 10(5), 912–922. https://doi.org/10.1080/21670811.2022.2081863 | 11
10 | Vu et al. (2023) | Vu, H. T., Baines, A., & Nguyen, N. (2023). Fact-checking Climate Change: An Analysis of Claims and Verification Practices by Fact-checkers in Four Countries. Journalism & Mass Communication Quarterly, 100(2), 286–307. https://doi.org/10.1177/10776990221138058 | 11
Source: authors’ own elaboration using WOS data.
Table 2. Top 20 most used keywords.
Rank | Keywords | n
1 | Fact-Checking | 173
2 | Disinformation | 91
3 | Misinformation | 90
4 | Fake News | 66
5 | Social Media | 50
6 | Journalism | 46
7 | COVID-19 | 41
8 | Verification | 31
9 | Artificial Intelligence | 25
10 | Digital | 22
11 | Media Literacy | 18
12 | Correction; Transparency | 13
13 | Trust | 12
14 | Health Communication | 11
15 | Europe; Hoaxes; Spain | 10
16 | Audiences; Communication; Infodemic | 9
17 | Debunk; Motivated Reasoning; Platforms; Post-Truth; Twitter | 8
18 | Content Analysis; Elections; Engagement; Experiment; Media | 7
19 | Computational Methods; Credibility; Ibero-America; Journalism Practice | 6
20 | Automated Fact-Checking; Climate Change; Comparative Research; Interviews; Partisanship; Perceived Credibility; Propaganda; Survey; WhatsApp | 5
Source: authors’ own elaboration using WOS data.
Table 3. List of articles on fact-checking in the context of the Russia–Ukraine conflict.
Citation | Reference
Charlton et al. (2024) | Charlton, T., Mayer, A.-T., & Ohme, J. (2024). A Common Effort: New Divisions of Labor Between Journalism and OSINT Communities on Digital Platforms. The International Journal of Press/Politics. https://doi.org/10.1177/19401612241271230
García-Marín and Salvat-Martinrey (2023) | García-Marín, D., & Salvat-Martinrey, G. (2023). Disinformation and war. Verification of false images about the Russian-Ukrainian conflict. Icono 14. https://doi.org/10.7195/ri14.v21i1.1943
Magallón-Rosa et al. (2023) | Magallón-Rosa, R., Fernández-Castrillo, C., & Garriga, M. (2023). Fact-checking in war: Types of hoaxes and trends from a year of disinformation in the Russo-Ukrainian war. Profesional de la Información. https://doi.org/10.3145/epi.2023.sep.20
Morejón-Llamas et al. (2022) | Morejón-Llamas, N., Martín-Ramallal, P., & Micaletto-Belda, J. P. (2022). Twitter content curation as an antidote to hybrid warfare during Russia’s invasion of Ukraine. Profesional de la Información. https://doi.org/10.3145/epi.2022.may.08
Sacaluga-Rodríguez et al. (2024) | Sacaluga-Rodríguez, I., Vargas, J. J., & Sánchez, J. P. (2024). Exploring Neurocommunicative Confluence: Analysis of the Interdependence Between Personality Traits and Information Consumption Patterns in the Detection of Fake News. A Study with University Students of Journalism and Communication Using Enneagrams. Revista Latina de Comunicación Social. https://doi.org/10.4185/rlcs-2024-2281
Springer et al. (2023) | Springer, N., Nygren, G., Orlova, D., Taradai, D., & Widholm, A. (2023). Sourcing Dis/Information: How Swedish and Ukrainian Journalists Source, Verify, and Mediate Journalistic Truth During the Russian-Ukrainian Conflict. Journalism Studies. https://doi.org/10.1080/1461670X.2023.2196586
Tulin et al. (2024) | Tulin, M., Hameleers, M., de Vreese, C. et al. (2024). Why do Citizens Choose to Read Fact-Checks in the Context of the Russian War in Ukraine? The Role of Directional and Accuracy Motivations in Nineteen Democracies. The International Journal of Press/Politics. https://doi.org/10.1177/19401612241233533
Zecchinon and Standaert (2025) | Zecchinon, P., & Standaert, O. (2025). The War in Ukraine Through the Prism of Visual Disinformation and the Limits of Specialized Fact-Checking. A Case-Study at Le Monde. Digital Journalism. https://doi.org/10.1080/21670811.2024.2332609
Source: authors’ own elaboration using WOS data.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
