Article

Beyond the Battlefield: A Cross-European Study of Wartime Disinformation

by
Rocío Sánchez-del-Vas
* and
Jorge Tuñón-Navarro
Departamento de Comunicación, Universidad Carlos III de Madrid, 28903 Getafe, Madrid, Spain
*
Author to whom correspondence should be addressed.
Journal. Media 2025, 6(3), 115; https://doi.org/10.3390/journalmedia6030115
Submission received: 21 May 2025 / Revised: 17 June 2025 / Accepted: 1 July 2025 / Published: 24 July 2025

Abstract

Russia’s invasion of Ukraine has profoundly altered the global geopolitical landscape. Owing to its geographical proximity, the conflict has had a considerable impact on Europe. Marked by the professionalisation and democratisation of technology, it has underscored the growing significance of hybrid warfare, in which disinformation and propaganda serve as additional instruments of war. Within this context, the aim of this article is to examine the characteristics of false information related to the war between Russia and Ukraine in four European countries between 2022 and 2023. To this end, a content analysis of 297 hoaxes was conducted across eight fact-checking platforms, complemented by ten in-depth interviews with specialised professionals. The findings indicate that disinformation is characterised by viral audiovisual hoaxes, particularly on Facebook and X (formerly Twitter), with a notable surge in disinformation flows at the onset of the invasion. In the early months, misleading content predominantly consisted of decontextualised images of the conflict, whereas a year later, the focus shifted to narratives concerning international support and alliances. The primary objective of this disinformation is to polarise public opinion against a perceived common enemy. The conclusions provide a broader and more nuanced understanding of wartime disinformation within the European context.

1. Introduction

On 24 February 2022, the President of Russia, Vladimir Putin, announced on television the commencement of a “special military operation” in Ukraine. These words marked the beginning of the invasion of the neighbouring country and, consequently, a war that still persists on its territory. This conflict, which represents a continuation of the military and propaganda campaign through which Russia annexed the Crimean Peninsula in 2014, is particularly paradigmatic, as it has also revealed a new way of employing disinformation in a wartime context. Indeed, the latest technological advances have been used to manipulate public perceptions, making disinformation an effective external weapon to gain international support and polarise audiences (Montes, 2022). One example of this is the so-called “Z Bloggers,” pro-Russian content creators who advocate for armed conflict on social media platforms such as TikTok while also earning significant advertising revenues (Atanesian, 2023).
Russia’s invasion of Ukraine has profoundly altered the global geopolitical landscape, prompting a strategic reconfiguration that has reinstated a bipolar world. This conflict, categorised as a fourth-generation conflict (Baqués, 2015), has underscored the impact of hybrid warfare in the twenty-first century. In this context, and as part of its combat strategy, Russia has, for many years, placed propaganda at the core of its foreign policy (Veebel, 2015), intensifying efforts to promote disinformation regarding the conflict (Pierri et al., 2023). In fact, the European Commission had previously warned of the dangers posed by Russian disinformation campaigns, highlighting their systematic nature, considerable resources, and execution on a greater scale than those of other states such as China, Iran, and North Korea (EUvsDisinfo, 2015; Tuñón-Navarro et al., 2023). This tactic constitutes an extension of post-Soviet propaganda strategies (Vorster, 2021). Over the last decade, the Kremlin has not only strengthened its infrastructures in anticipation of a potential invasion but has also developed an international propaganda strategy (Carrión, 2022). Furthermore, such cyber threats become particularly significant during times of crisis, when access to accurate information is crucial.
In this regard, it has been observed that much of the disinformation about the war disseminated outside Ukraine originates from Russian sources, making this practice an aggressive and calculated tool of Moscow’s foreign policy (Rondeli, 2014). Additionally, numerous rumours and falsehoods about the conflict have been spread internationally, rendering Russian disinformation the principal phenomenon related to the war in Europe (Sánchez-del-Vas & Tuñón-Navarro, 2024a). Over the course of more than three years, Russia has constructed false narratives about alleged peace promises, displays of nuclear and biological power, obsessions with purported Nazis, and criticisms of the West (EUvsDisinfo, 2024).
Accordingly, the European Council and Council of the European Union (Consejo Europeo & Consejo de la Unión Europea, n.d.) have indicated that Russia primarily seeks to reinforce its strategy of destabilising its neighbours and the EU by turning Europeans against the millions of Ukrainian refugees, who represent the largest displacement on the continent since the Second World War (Morris & Oremus, 2022). It is also worth noting that a prominent feature of this conflict is the use of disinformation by both Russian and Ukrainian actors, not only as an offensive tool but also as a means of defence, making it an effective mechanism to garner public support and boost the morale of troops and civilians alike (Tuñón-Navarro et al., 2024).
To counter these strategies, content verification or fact-checking has, in recent years, emerged as one of the most widely used initiatives to combat disinformation (Sánchez-del-Vas, 2025). This practice is based on identifying and refuting falsehoods disseminated in the public sphere. According to data collected by the Duke Reporters’ Lab, as of May 2025, there are a total of 451 registered fact-checking organisations worldwide. Moreover, the invasion prompted a collaborative fact-checking response to disinformation. In fact, on the very day it occurred, the Spanish fact-checking outlet Maldita launched the initiative #UkraineFacts, an international database that brought together thousands of verifications by fact-checkers from across the globe concerning the conflict.

2. Materials and Methods

This research was grounded in a mixed methodology, combining both quantitative and qualitative techniques. This methodological triangulation allowed for the integration and validation of findings, thereby strengthening the study.

2.1. Research Objectives

The following outlines the main research objective and the specific objectives of this study:
  • Research Objective 1: To study the characteristics of disinformation concerning the Russo-Ukrainian war in Spain, Germany, the United Kingdom, and Poland.
  • Specific Objective 1: To examine the format and dissemination platform, as well as the typology and purpose, of the selected falsehoods.
  • Specific Objective 2: To analyse the temporal evolution of the frequency of fact-checked falsehoods, as well as their themes, in March 2022 and March 2023.
The aim is to obtain a broader and more comprehensive view of wartime disinformation within the European context (Sánchez-del-Vas & Tuñón-Navarro, 2023), identifying patterns and trends that may contribute to a better understanding of the phenomenon on a transnational scale.

2.2. Research Questions

This study was based on the following research questions:
  • Research Question 1: How has the volume of verified disinformation related to the war evolved between 2022 and 2023?
  • Research Question 2: What is the most prominent format in the dissemination of falsehoods about the conflict?
  • Research Question 3: What is the predominant typology in the verified disinformation?
  • Research Question 4: Which communication channel has been the main medium for the viral spread of hoaxes about the Russo-Ukrainian war?
  • Research Question 5: What is the predominant purpose of the disinformation disseminated about the armed conflict?
  • Research Question 6: In what way has the theme of disinformation about the war evolved between 2022 and 2023?

2.3. Unit of Analysis and Case Studies

This study has adopted an approach to disinformation through the verifications conducted by eight fact-checking organisations from four European countries. The unit of analysis therefore corresponds to the publications in which these organisations debunk individual falsehoods. The use of verification outlets as a tool to study disinformation has been employed in recent studies such as Salaverría et al. (2020); Almansa-Martínez et al. (2022); Magallón et al. (2023); Ruiz-Incertis et al. (2024); Sánchez-del-Vas and Tuñón-Navarro (2024a, 2024b); Rodríguez-Pérez et al. (2025); and Casero-Ripollés et al. (2025), among others.
Moreover, this article revolves around four case studies corresponding to four European countries, selected according to Hallin and Mancini’s (2004) media systems model: Spain, Germany, the United Kingdom, and Poland. Although Hallin and Mancini’s model does not include Eastern European countries, Poland has been included in order to provide greater diversity and representativeness on a European scale. Furthermore, the comparison of case studies enables generalisations to be drawn from identified similarities and even allows for tentative inferences. For this purpose, a comparison between a limited number of countries has been chosen, following the actor-centred approach and the “most similar systems design” proposed by Landman and Carvalho (2017).

2.4. Content Analysis

2.4.1. Sample Selection Criteria

With the aim of analysing disinformation in Europe relating to the war between Russia and Ukraine from 2022 to 2023, a content analysis was carried out on eight specialised fact-checking organisations, two from each country: Newtral and Maldito Bulo (Spain); CORRECTIV Faktencheck and BR24 Faktenfuchs (Germany); FullFact and Reuters Fact Check (United Kingdom); and Demagog and FakenewsPL (Poland). It is worth noting that all of these outlets are signatories of the International Fact-Checking Network’s (IFCN) Code of Principles, which guarantees their ethical standards.
As this research seeks to analyse disinformation itself through the fact-checkers’ verifications, only those publications that refute falsehoods have been considered. This methodological decision has previously been adopted in studies such as Brennen et al. (2020), Ruiz-Incertis et al. (2024), and Rodríguez-Pérez et al. (2025).
Regarding the time period under study, the months of March in two consecutive years, 2022 and 2023, were selected for analysis. March 2022 witnessed a major outbreak of disinformation on the topic: the invasion occurred at the end of February of that year, making March the first full month of the war and a moment of heightened timeliness, international relevance, and geopolitical impact. Its study is therefore particularly significant for analysing the initial prevailing trends. In line with this, it was deemed relevant to examine how the trajectory of disinformation evolved over the course of one year, so March 2023 was chosen to explore the characteristics of falsehoods twelve months later. This approach analyses disinformation on the same conflict in different temporal contexts in order to trace and understand its development throughout the war.
Accordingly, the evolution of verification frequency and the themes of the falsehoods were examined, as these variables are most affected by current events. For this reason, the results for 2022 and 2023 were presented and compared. Meanwhile, the format, dissemination platform, and purpose of the falsehoods were studied through a combined analysis of the data from both years. This methodological choice was made because these three variables are less dependent on the evolution of the news context.
As for the research sample, since the aim is to study only disinformation related to the war between Russia and Ukraine, all verifications unrelated to this subject were filtered out. Ultimately, the selection of sample publications, based on the aforementioned criteria, resulted in 297 verifications being included in the study.
It is important to note that, in order to examine the platforms through which falsehoods were spread, secondary dissemination units (i.e., specific platforms and social media networks) must be taken into account. For this reason, two alternative samples are considered. Although the total number of verified falsehoods serves as the point of departure for all variables (N = 297), there is no one-to-one correspondence between each falsehood and the dissemination channels used, as a single falsehood may circulate across multiple platforms and, within them, across various social media networks. Consequently, it is common for the number of identified dissemination channels to differ from the number of falsehoods. This discrepancy gives rise to analytical sub-samples, such as the following:
  • Dissemination platforms (N = 309 channels): a count of the media or spaces where the falsehood was spread.
  • Specific social media networks (N = 317): a detailed breakdown within the category of social media, which may reflect simultaneous circulation across more than one network per falsehood.
It is worth noting that this methodological approach has already been applied in other studies of a similar nature (Sánchez-del-Vas & Tuñón-Navarro, 2024b). Likewise, these figures may vary not only due to the dynamics specific to each country, but also due to the coding practices adopted by the fact-checking organisations and the availability or traceability of data.
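To make the relationship between the base sample and these sub-samples concrete, the sketch below (in Python, using invented example records rather than the study’s actual dataset) shows how a one-to-many mapping between falsehoods and dissemination channels naturally produces channel counts (N = 309 platforms; N = 317 social networks) that exceed the number of falsehoods (N = 297).

```python
# Minimal sketch with hypothetical records (not the study's data): each verified
# falsehood may list several dissemination platforms and, within social media,
# several specific networks, so channel counts exceed the number of falsehoods.
import pandas as pd

hoaxes = pd.DataFrame([
    {"hoax_id": 1, "platforms": ["social media"], "networks": ["Facebook", "X"]},
    {"hoax_id": 2, "platforms": ["social media", "news media"], "networks": ["TikTok"]},
    {"hoax_id": 3, "platforms": ["blogs"], "networks": []},
])

n_hoaxes = len(hoaxes)                                                 # base sample (297 in the study)
n_platforms = hoaxes.explode("platforms")["platforms"].notna().sum()  # platform mentions (309 in the study)
n_networks = hoaxes.explode("networks")["networks"].notna().sum()     # network mentions (317 in the study)

print(n_hoaxes, n_platforms, n_networks)  # -> 3 4 3 for this toy example
```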

2.4.2. Variables and Categories of Analysis

To conduct the content analysis of the selected media outlets, a coding manual was developed based on the following variables and their respective categories and subcategories:
  • Variable 1: Frequency of falsehoods verified by fact-checking outlets.
  • Variable 2: Format of the falsehoods. Categories: text; image; combined (text and image; text and video; text and audio); video; and audio.
  • Variable 3: Typology of falsehood according to Wardle (2017). Categories: fabricated content; manipulated content; impostor content; false context; misleading content; and false connection.
  • Variable 4: Platform of falsehoods. Categories: social media (Facebook; X, formerly Twitter; WhatsApp; TikTok; Instagram; Telegram; YouTube; other); blogs; news media.
  • Variable 5: Purpose of the falsehood according to Wardle (2017). Categories: provocation; reinforcement of a common enemy; political influence; parody; economic gain; and poor journalism.
  • Variable 6: Theme of the falsehood. Categories: international support and allies; humanitarian crisis; war and combat; public image and political representation; resources and supplies; and other.
It is important to note that the variables were coded based on the information provided by fact-checking organisations in relation to the debunked falsehoods.
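As an illustration only (the coding itself was carried out manually from the fact-checkers’ publications), the following sketch shows how the codebook for Variables 2 to 6 could be represented programmatically so that each coded record is checked against the permitted categories; the field names are hypothetical.

```python
# Hypothetical representation of the coding manual (Variables 2-6); field names
# are illustrative, not taken from the authors' actual coding sheet.
CODEBOOK = {
    "format": {"text", "image", "combined", "video", "audio"},
    "typology": {"fabricated content", "manipulated content", "impostor content",
                 "false context", "misleading content", "false connection"},
    "platform": {"social media", "blogs", "news media"},
    "purpose": {"provocation", "reinforcement of a common enemy", "political influence",
                "parody", "economic gain", "poor journalism"},
    "theme": {"international support and allies", "humanitarian crisis", "war and combat",
              "public image and political representation", "resources and supplies", "other"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the variables whose coded value falls outside the codebook categories."""
    return [var for var, allowed in CODEBOOK.items()
            if var in record and record[var] not in allowed]

# Example: a correctly coded verification yields no invalid fields.
sample = {"format": "combined", "typology": "false context", "platform": "social media",
          "purpose": "provocation", "theme": "war and combat"}
assert invalid_fields(sample) == []
```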

2.4.3. Semi-Structured Interviews with Specialist Actors

As a secondary and qualitative technique, ten semi-structured interviews were conducted with specialist actors. The semi-structured interview is a qualitative research technique that offers participants greater freedom in the interaction. It should be noted that the interviews were anonymised to protect the privacy of the sources, ensuring that neither the interviewees nor the organisations to which they belong can be identified. The areas of expertise of each interviewee, as well as the nomenclature used throughout the article to cite their contributions, can be found in Table 1.
The anonymised interviews were coded using NVivo software, which facilitated a systematic thematic analysis. An initial round of open coding was conducted to identify recurring patterns and concepts across participants. These codes were then grouped into broader categories through axial coding, allowing key themes to emerge. The selection of themes included in the findings was based on their frequency, their relevance to the research questions, and the depth of insight they provided into participants’ experiences.
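By way of illustration (the actual analysis was performed in NVivo, and the codes below are invented), the open-to-axial step can be thought of as mapping individual open codes onto broader categories and ranking the resulting themes by how often they appear across interviews.

```python
# Illustrative sketch of open -> axial coding with invented codes; the study's
# real coding was done in NVivo on the anonymised transcripts.
from collections import Counter

open_codes_by_interviewee = {
    "Interviewee 1": ["decontextualised video", "visual hoaxes"],
    "Interviewee 2": ["visual hoaxes", "increasing sophistication"],
    "Interviewee 5": ["attention peaks", "algorithmic bubbles"],
}
axial_categories = {  # open code -> broader theme
    "decontextualised video": "format and decontextualisation",
    "visual hoaxes": "format and decontextualisation",
    "increasing sophistication": "professionalisation of disinformation",
    "attention peaks": "frequency and the news cycle",
    "algorithmic bubbles": "platforms and algorithms",
}

theme_counts = Counter(
    axial_categories[code]
    for codes in open_codes_by_interviewee.values()
    for code in codes
)
print(theme_counts.most_common())  # themes ranked by frequency across interviews
```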

3. Results

3.1. Frequency

According to Interviewee 7, disinformation is a phenomenon intrinsically linked to current events and fluctuates in response to them. This is due, as Interviewee 5 notes, to the fact that audiences do not maintain sustained attention on a topic but rather engage in peaks of interest. In this regard, of the 297 verifications analysed in this study, the highest peak in fact-checking activity occurred in March 2022 (N = 256; 86% of the total).
As shown in Table 2, the country reporting the highest proportion of publications throughout the entire period analysed is Spain, in the Spanish language (N = 97; 33%), followed by the United Kingdom, in English (N = 88; 30%), Poland, in Polish (N = 79; 27%), and Germany, in German (N = 33; 11%).
Specifically, in Spain, 89 verifications were published in March 2022, whereas only 8 were recorded in 2023. In Germany, 25 verifications were registered in March 2022, which decreased to 8 in 2023. In the United Kingdom, 76 verifications were conducted in March 2022, dropping significantly to 12 in 2023. In Poland, 66 verifications were carried out in March 2022, while 13 were reported in 2023.
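For transparency, the shares reported above and in Table 2 follow directly from these counts; the short sketch below reproduces the arithmetic (the figures are those reported in this section, not a recomputation from raw data).

```python
# Arithmetic check of the reported frequencies: per-country totals across both
# waves and the share of verifications published in March 2022.
counts = {  # country: (March 2022, March 2023) verifications
    "Spain": (89, 8), "Germany": (25, 8), "United Kingdom": (76, 12), "Poland": (66, 13),
}
total = sum(a + b for a, b in counts.values())   # 297 verifications overall
march_2022 = sum(a for a, _ in counts.values())  # 256 verifications in March 2022
print(round(100 * march_2022 / total))           # 86 (% of the total)
for country, (a, b) in counts.items():
    print(country, a + b, f"{round(100 * (a + b) / total)}%")  # 97 33%, 33 11%, 88 30%, 79 27%
```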

3.2. Format

The war in Ukraine was highly visual in terms of disinformation. Specifically, falsehoods circulated mainly through images and videos taken out of context, as noted by Interviewee 1. Overall, among the studied falsehoods related to the war between Russia and Ukraine, the predominant format was the combined one, specifically image and text.
In particular, as observed in Figure 1, in all countries, the combined format of text and image is the most prevalent. It is worth noting that Spain and Germany exhibit a similar structure in the distribution of formats. The United Kingdom stands out for the prominence of videos as the second most common format, relegating standalone images and text without images to minimal representation. Poland differs due to a notable proportion of falsehoods in an exclusively text-based format, followed by videos and standalone images, showing a variation compared to the trend observed in the other countries.

3.3. Typology

Content analysis shows that, in general terms, the falsehoods analysed were mostly identified under the typology of false context. However, Interviewee 2 highlights that while visual decontextualisation was highly prevalent at the beginning of the war, as the conflict progressed, the falsehoods began to become much more complex and difficult to verify.
As can be seen in Figure 2, in Spain, Germany, and the United Kingdom, false context and fabricated content are the most common categories, followed by manipulated content. Meanwhile, Poland stands out for having fabricated content as the predominant category. It is worth noting that false connection and impostor content are minority categories in all countries, although with slight variations in their incidence.

3.4. Platform

As outlined in the methodology section, due to the multi-channel nature of disinformation dissemination—whereby a single falsehood was often propagated across more than one platform—the 297 falsehoods were disseminated through a total of 309 platforms (N = 309), of which 87% correspond to social media. This represents a predominant trend across the countries analysed, as illustrated in Figure 3.
In this context, as individual falsehoods were at times identified across several social media platforms, a total of 317 instances of social media dissemination were coded (N = 317). In this sense, “The more channels [disinformers] have, and the more open they are, the better,” notes Interviewee 6.
When focusing specifically on the predominant category—social media platforms—it is observed that the vast majority (90%) correspond to public social media services, while 7% pertain to private messaging channels. The remaining 3% are classified under the category of ‘other’ social media.
As shown in Figure 4, in the case of Spain, falsehoods were disseminated primarily via X (formerly Twitter) and Facebook, which together accounted for the majority of identified instances. These were followed, at a considerable distance, by TikTok and WhatsApp, with minimal activity recorded on Instagram, Telegram, and other platforms.
In Germany, dissemination also centred on Facebook, which emerged as the principal platform. X (formerly Twitter) and Telegram played a secondary role, while WhatsApp, TikTok, and YouTube appeared with lower frequency, indicating limited diversification and a clear preference for a small number of dominant channels.
In the United Kingdom, dissemination was largely concentrated on Facebook and X (formerly Twitter), both of which showed comparable prominence. Other platforms—such as TikTok, Instagram, and Telegram—were used to a lesser extent, suggesting a slightly more balanced distribution across platforms, though still marked by a strong concentration on the two leading services.
In Poland, Facebook registered an even higher level of use than in other contexts, indicating a pronounced centralisation of dissemination activity. X (formerly Twitter) followed at a distance, while TikTok and Telegram played only a marginal role. This pattern suggests a narrower use of platforms in the dissemination of falsehoods within this national context.

3.5. Purpose

Across the countries studied, similar results can be observed regarding the purpose of the falsehoods. During the war, most of the analysed falsehoods aimed to provoke the population, particularly by reinforcing a common enemy. “The search for scapegoats always fits into any type of narrative,” emphasises Interviewee 4.
As observed in Figure 5, in Spain, Germany, and the United Kingdom, audience provocation is the primary purpose, followed by the promotion of a common enemy. Meanwhile, in Poland, the predominant aim is to reinforce a common enemy, followed by provocation. Likewise, in all four countries, political influence is also noteworthy, while parody appears only marginally. The objectives of economic gain and poor journalism have minimal representation and appear only in the United Kingdom and Spain.

3.6. Theme

In general, the themes on which disinformative content relating to the Russian invasion of Ukraine went most viral were war and combat, followed by public image and political representation.
In Spain, during March 2022, the most prevalent falsehoods were related to war and combat, particularly focusing on military manoeuvres and Russian attacks. False content was also disseminated concerning public image and political representation, with a particular emphasis on the distortion of Ukraine and its leader. In March 2023, the most widespread falsehoods concerned international support and allies, with a pro-Ukrainian inclination, followed by similar topics related to public image and political representation, again focusing on the distortion of Russia and Ukraine, as well as their respective leaders.
As indicated in Figure 6, in Germany, during the same period, the predominant falsehoods were related to war and combat, especially centred on military manoeuvres and Russian attacks, as well as disinformation about the humanitarian crisis, with a particular focus on refugees. In March 2023, most false information was related to international support and allies, once again with a pro-Ukrainian bias, followed by themes concerning public image and political representation, in this case focusing on the distortion of Russia and its leader.
In the United Kingdom, a similar pattern is observed during both periods, with a prevalence of falsehoods about war and combat, specifically regarding military manoeuvres and Russian attacks. Additionally, erroneous content was spread about the humanitarian crisis, with an emphasis on refugees and civilian victims. In March 2023, falsehoods about war and combat continued to be prominent, along with disinformation about public image and political representation, with a particular focus on the distortion of Ukraine and its leader.
In Poland, during March 2022, most falsehoods were related to war and combat, especially with reference to military manoeuvres and attacks by both Russian and Ukrainian forces. There was also a significant amount of disinformation about the humanitarian crisis, with a focus on health. In March 2023, falsehoods about international support and allies and the humanitarian crisis were equally prominent, along with the continued spread of disinformation related to war and combat, particularly concerning fabricated reports of injuries and deaths.
Figure 6. Theme of analysed falsehoods by country and year. Own elaboration.

4. Discussion and Conclusions

“Disinformation is a weapon of war, and whoever controls the narrative and creates a particular image of the enemy has the upper hand,” states Interviewee 10. In this regard, Interviewee 9 notes that in the past century, disinformation crises have generally been linked to wartime contexts. In fact, according to Interviewee 8, disinformation as we know it in this century is closely associated with Russia’s aggression against Ukraine, and they suggest that Russia’s 2014 annexation of Crimea was “the founding moment of the new era of disinformation.” In other words, from that moment onwards, Russia formalised information manipulation as a core strategy of hybrid warfare.
Concerning the frequency of disinformation verification, as evidenced by the sample analysed in this study, the volume of debunked disinformation regarding the war was considerably higher in March 2022 compared to March 2023. Since then, its frequency has steadily declined. Interviewee 2 argues that when a topic dominates public discourse and garners widespread attention, the spread of falsehoods intensifies. However, when audiences shift focus to other matters, the volume of disinformation diminishes accordingly, remaining “at a different level of the conversation.”
As Interviewee 5 points out, audiences do not maintain sustained attention on a single issue. Thus, falsehoods target areas where there is an active demand for information, directly influencing the work of fact-checkers (Tuñón-Navarro & Sánchez-del-Vas, 2022). Despite this, Interviewee 1 highlights that whenever current events relating to the conflict arise, disinformation resurfaces. In this regard, Interviewee 9 predicts that in the coming years, the volume of disinformation will multiply exponentially. Nonetheless, they note that artificial intelligence tools will be available to identify and monitor disinformation more swiftly and efficiently (Garriga et al., 2024). These insights help address the first research question posed.
Regarding the format of the falsehoods, content analysis shows that the combination of images and text was particularly substantial throughout the period studied. As some researchers (Hameleers et al., 2020) report, there is evidence that multimodal disinformation—i.e., that which uses both text and images—is perceived as slightly more credible than purely textual disinformation. Other authors, such as Gamir-Ríos et al. (2021), also confirm the visual dominance in the viral spread of falsehoods.
Indeed, from the early days of the war in Ukraine, false videos spread rapidly, especially on TikTok, even enabling fake livestreams (Baptista et al., 2023). Interviewee 2 suggests that the predominance of visual content may be due to the mass use of visually oriented platforms such as TikTok and Instagram.
In addition, Interviewee 6 underscores the role of technological professionalisation, which enables the creation of rich multimedia content that is easy to produce. These videos and images—particularly the most striking ones (Amorós-García, 2018)—are processed more superficially by audiences than other formats (Sundar et al., 2021). Another hypothesis proposed by experts is that the difficulty of understanding foreign languages beyond the conflict zone has led to a preference for photos and videos over textual information. “Not being on the ground, [disinformation producers] insert elements that distort the context,” explains Interviewee 4. This evidence directly addresses the second research question.
Derived from the format and in reference to the third research question, the results show that false context was the predominant typology of disinformation in the sample between 2022 and 2023 over other types proposed by Wardle (2017). Hameleers (2024) emphasises that authentic visual content, when recontextualised, interacts with false textual narratives to enhance the perceived credibility of falsehoods. Decontextualisation is a strategy commonly used when disinformation is presented in visual formats.
In this respect, Interviewee 1 stresses that “images are among the most powerful, especially videos taken out of context.” Likewise, Interviewee 3 notes that a significant number of photos from the Syrian war were recycled to falsely illustrate the war in Ukraine. These findings align with those of García-Marín and Salvat-Martinrey (2023) and Baptista et al. (2023), whose studies report that the use of decontextualised videos and images has been a recurring tactic during the conflict. Similarly, Dierickx and Lindén (2024) report that, according to a survey of fact-checkers, video was the format they focused on most when debunking disinformation about the Russia–Ukraine war, followed by images.
Moreover, Interviewee 4 remarks that the most damaging disinformation is that which is strategically planned and gradually executed. “It doesn’t matter what you manipulate,” says Interviewee 3, “what matters is having the will and capacity to do it.” Despite the sophisticated professionalisation of disinformation, Interviewee 7 observes that audiences are increasingly aware of the viral spread of falsehoods. However, they also warn this awareness could be a double-edged sword, as it “might push actors to produce increasingly sophisticated falsehoods.” These insights address the third research question.
With respect to the primary channel of dissemination, the results confirm the near-exclusive predominance of social media, primarily on public platforms, with Facebook and X (formerly Twitter) standing out. Prior studies have already shown that disinformation spreads faster via digital platforms (Bak et al., 2022). Specifically, the most used platforms for disseminating these contents were Facebook (Allcott et al., 2019) and X (formerly Twitter) (Golovchenko, 2020).
Interviewee 4 emphasises X (formerly Twitter) for its influence bubble, as it brings together politicians, journalists, and other influencers, making disinformative messages “more incendiary.” “Algorithms tend to trap you in your bubble, keeping you inside it, feeding you information you might not usually seek,” adds Interviewee 5. In this context, false news contributes to turning social media feeds into battlegrounds of discursive conflict, especially during highly controversial events (Asmolov, 2018). Although still in its early stages, a noticeable presence of hoaxes is beginning to appear on TikTok, a development also highlighted by Interviewee 6. Indeed, some academics argue that the invasion of Ukraine constitutes the “first war narrated on TikTok” (Aso, 2022).
Although public and open social media platforms are the primary channels for debunking disinformation, it would be misleading to assume they are the spaces where the most disinformation circulates. Fact-checkers are also subject to their own biases when selecting which falsehoods to verify. Moreover, some of the fact-checkers in this study are, or have previously been, part of Meta’s Third-Party Fact-Checking Programme and focus on verifying content circulating on Meta platforms. As Interviewee 7 notes, “if [fact-checkers] work for Facebook, they are more likely to debunk disinformation on that platform.” Interviewee 10 even states, “Facebook plays a more important role in our democracies than Russia does”.
Telegram is also beginning to emerge as a relevant platform for content verification, particularly in Germany. In this regard, Interviewee 8 adds that the main responsibility for disinformation dissemination lies with private messaging networks, which are impossible to monitor or control. Interviewee 5 highlights that these networks are especially prolific due to their personal nature—“a falsehood sent by someone you know hits harder than one from a stranger.” These findings help clarify the fourth research question.
As for the purpose of the falsehoods, this study’s results indicate that most were aimed at provoking audiences by reinforcing a common enemy. Based on their experience at a fact-checking organisation, Interviewee 2 states that the typical structure of such falsehoods is to create a villain and find ways to attack them. Interviewee 3 adds that disinformation related to the war targets the West as its primary enemy, aiming to polarise and divide its population.
According to Asmolov (2018), disinformation campaigns serve as technologies that facilitate social polarisation and the erosion of social cohesion, often by appealing to emotion via social media. His study highlights the aggressive nature of Facebook posts about the Russia–Ukraine conflict (even before the 2022 invasion), where users were divided into “friends” and “enemies,” with the latter group more prominently represented. Emotion plays a crucial role in this media battle, as it enables the shaping, influencing, and controlling of public and political opinion, as shown in the Cambridge Analytica and Facebook scandals (Boler, 2019).
Aso (2022) argues that sophisticated algorithms can reach into individuals’ personal lives, bombarding them with polarising content in favour of one side or the other. Therefore, the purpose of falsehoods is closely linked to their channels of dissemination. In this vein, Interviewees 7 and 9 agree that such emotionally driven narratives seek to discredit democracy and damage perceptions of institutions. Examples include messages claiming that governments are wrong to support Ukraine, that refugees pose a problem, or that sanctions against Russia will negatively affect domestic economies.
In this quest for scapegoats, links have emerged between various types of disinformation actors—from COVID-19 and climate change deniers to pro-Russian agents. “They’re the same people,” says Interviewee 5. Interviewee 4 explains that these actors seek to exploit systemic weaknesses, and a common enemy enables alignment between otherwise disparate viewpoints. These conclusions help address the fifth research question.
Regarding the themes of the falsehoods, the invasion of Ukraine has exposed new ways of deploying post-Soviet disinformative narratives, which, according to Interviewee 3, have been in use since 2014. In this context, the analysis confirms that during the initial weeks of the invasion, the most viralised falsehoods focused on the conflict itself, especially alleged Russian military manoeuvres and attacks. One instance of this can be found in the verification conducted by the Spanish media outlet Newtral on 3 March 2022, under the title “The images of a firefighting plane in Turkey that are being shared as if it were an aircraft ‘shot down by Russian forces’ in Ukraine” (Newtral, 2022).
Interestingly, in Poland, false narratives about Ukrainian military manoeuvres were also prominent. Due to its proximity to Ukraine, Poland has been one of the main targets of Russian propaganda. Therefore, the country has seen more anti-Ukrainian narratives than the other nations studied. “Ukraine is a bordering country, so its role in the war is crucial. Poland is thus one of the Kremlin’s main propaganda targets,” says Interviewee 3. A further example is provided by the Polish media outlet Demagog in a verification published on 2 March 2022, entitled “False: A Ukrainian rocket hit the headquarters of the Kharkiv Regional State Administration” (Demagog, 2022).
In March 2022, disinformation strongly centred on the humanitarian crisis. This theme became the second most disseminated narrative across the countries studied—except for Spain. Within this category, falsehoods related to refugees were particularly significant in Germany. Indeed, on 30 March 2022, the fact-checking team of the British outlet Reuters refuted the claim that “A group of sex workers in Berlin is ‘recruiting’ Ukrainian refugees” (Reuters Fact Check, 2022).
In Spain, the second most prevalent disinformative theme focused on public image and political representation, particularly the distortion of Ukraine and its leader, which made up a significant share of this category.
These findings closely mirror those of the European Digital Media Observatory (EDMO) in March 2022, which highlighted the most viral disinformation narratives: conspiracy theories questioning the war’s reality; narratives minimising Ukrainian victims; false military manoeuvres (Russian and Ukrainian); distorted portrayals of Ukrainians and Russians; and the humanitarian crisis (EDMO, 2022a). These trends follow a clear disinformation pattern previously analysed by researchers such as Aso (2022), who notes that one side typically accuses the other of atrocities and provides graphic “evidence.”
As time passed and the conflict became more embedded in public opinion, the presence of war-related falsehoods in the sample declined. Thus, by March 2023, disinformation narratives about international support and allies gained particular prominence. This theme became the most widely spread in the countries analysed—except in the United Kingdom, where war and combat narratives still prevailed. Within the category of international support and allies, falsehoods surrounding support for Ukraine were especially notable. This conflict has a major international component, as the world is split between pro-Russian and pro-Ukrainian allies. Such a case is documented in the verification carried out by the German media outlet CORRECTIV on 31 March 2023, under the title “These military vehicles were not destined for Ukraine, they are being transferred back to the USA” (Thom, 2023).
Additionally, in March 2023, disinformation narratives related to diplomatic matters—especially public image and political representation, commonly referred to as “soft power”—gained relevance. This theme became the second most widely disseminated narrative, except in Poland, where disinformation continued to focus on the humanitarian crisis. Within this category, falsehoods targeting Putin’s image were identified.
Falsehoods about Zelensky also stand out. He has been one of the main disinformation targets and is reported as the most affected figure in the Russia–Ukraine information war, according to Magallón et al. (2023). “Deceptive content has been used to portray him in opposing ways: either as a cowardly enemy in a pro-Russian perspective, or as a national hero from a pro-Ukrainian viewpoint” (EDMO, 2022b).
Thus, the findings of this study align with those reported in EDMO’s March 2023 report (EDMO, 2023), which states that the main disinformation narratives in Europe concerning the Russia–Ukraine war aimed to damage Ukraine’s image, attack NATO and pro-Ukraine countries, and undermine Zelensky’s public reputation. This pattern is also reflected in the debunk published by the Spanish outlet Maldita.es, titled “The hoax that Zelensky has asked the United States to send troops to the war in Ukraine immediately” (Maldita.es, 2023). These conclusions address the sixth and final research question.
Regarding the limitations of this research, it is worth highlighting that this study was based on falsehoods previously selected by fact-checking organisations. Although the analysed fact-checkers hold recognised ethical quality standards, they are nonetheless subject to various biases that could have influenced the research findings. Furthermore, for the coding of publications from German and Polish outlets, automated online dictionaries were used due to a lack of in-depth knowledge of their specific vocabulary and terminology.
In addition, the time constraints associated with conducting this research led to the selection of a limited number of case studies. Broader geographical coverage and a more extensive timeframe would have provided more precise conclusions regarding the object of study. As for future lines of research, it is considered relevant to expand the investigation to other case studies, such as the conflict in the Middle East, since, as has been demonstrated, wars are critical moments in which disinformation can significantly influence public opinion and the course of events.

Author Contributions

Conceptualization, R.S.-d.-V. and J.T.-N.; Methodology, R.S.-d.-V. and J.T.-N.; Software, R.S.-d.-V. and J.T.-N.; Validation, R.S.-d.-V. and J.T.-N.; Formal analysis, R.S.-d.-V. and J.T.-N.; Investigation, R.S.-d.-V. and J.T.-N.; Resources, R.S.-d.-V. and J.T.-N.; Data curation, R.S.-d.-V. and J.T.-N.; Writing—original draft, R.S.-d.-V. and J.T.-N.; Writing—review & editing, R.S.-d.-V. and J.T.-N.; Visualization, R.S.-d.-V.; Supervision, J.T.-N.; Project administration, J.T.-N.; Funding acquisition, J.T.-N. All authors have read and agreed to the published version of the manuscript.

Funding

This article is part of a European Chair funded by the Education, Audiovisual and Culture Executive Agency (EACEA) of the European Commission, Jean Monnet (Erasmus+), “Future of Europe Communication in times of Pandemic Disinformation” (FUTEUDISPAN), Ref: 101083334-JMO-2022-CHAIR, directed by Professor Jorge Tuñón at Universidad Carlos III de Madrid between 2022 and 2025. However, its content is the sole responsibility of the authors and EACEA cannot be held responsible for any use which may be made of the information contained therein. This research was also supported by a University teacher training grant (FPU22/01905), awarded to one of the co-authors by the Spanish Ministry of Universities.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

The interviewees consented to participate in the study. The materials and quotes from the interviews have been anonymised to protect their privacy. This ensures that neither the interviewees nor the organisations they belong to can be identified.

Data Availability Statement

Correspondence and requests for materials should be addressed to the authors.

Acknowledgments

We thank the experts who participated in the interviews for their valuable contributions in supporting and validating the findings of our research.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Allcott, H., Gentzkow, M., & Yu, C. (2019). Trends in the diffusion of misinformation on social media. Research & Politics, 6(2), 1–8. [Google Scholar] [CrossRef]
  2. Almansa-Martínez, A., Fernandez-Torres, M. J., & Rodríguez-Fernández, L. (2022). Desinformación en España un año después de la COVID-19. Análisis de las verificaciones de Newtral y Maldita. Revista Latina de Comunicación Social, 80, 183–200. [Google Scholar] [CrossRef]
  3. Amorós-García, M. (2018). Fake news. La verdad de las noticias falsas. Plataforma Editorial. ISBN 978-84-17114-72-5. [Google Scholar]
  4. Asmolov, G. (2018). The disconnective power of disinformation campaigns. Journal of International Affairs, 71(1.5), 69–76. Available online: https://www.jstor.org/stable/26508120 (accessed on 30 June 2025).
  5. Aso, H. S. (2022). Ucrania 2022: La guerra por las mentes. Revista General de Marina, 283, 563–576. Available online: https://tinyurl.com/3ak9b3nb (accessed on 30 June 2025).
  6. Atanesian, G. (2023, September 4). Los influencers de Putin: El negocio de los blogueros militares rusos que promueven la guerra contra Ucrania. BBC News Mundo. Available online: https://tinyurl.com/44dkdbs4 (accessed on 30 June 2025).
  7. Bak, P., Tveen, M. H., Walter, J., & Bechmann, A. (2022). Research on disinformation about the war in Ukraine and outlook on challenges in times of crisis. In EDMO. Available online: https://tinyurl.com/27kdk9bj (accessed on 30 June 2025).
  8. Baptista, J. P., Rivas-De-Roca, R., Gradim, A., & Loureiro, M. (2023). The disinformation reaction to the Russia–Ukraine war. KOME, 11(2), 27–48. [Google Scholar] [CrossRef]
  9. Baqués, J. (2015). El papel de Rusia en el conflicto de Ucrania: ¿La guerra híbrida de las grandes potencias? Revista de Estudios en Seguridad Internacional, 1(1), 41–60. [Google Scholar] [CrossRef]
  10. Boler, M. (2019). Digital disinformation and the targeting of affect: New frontiers for critical media education. Research in the Teaching of English, 54(2), 187–191. Available online: https://www.jstor.org/stable/26912445 (accessed on 30 June 2025).
  11. Brennen, J., Scott, S., Felix, M., Howard, P. N., & Nielsen, R. (2020). Types, sources, and claims of COVID-19 misinformation. In Reuters Institute for the Study of Journalism Factsheet. Available online: https://tinyurl.com/4pba3zyh (accessed on 30 June 2025).
  12. Carrión, J. (2022). Estamos ante la primera guerra mundial digital. Comunicación: Estudios Venezolanos de Comunicación, 198, 17–20. Available online: https://tinyurl.com/4tjfu7tt (accessed on 30 June 2025).
  13. Casero-Ripollés, A., Alonso-Muñoz, L., & Moret-Soler, D. (2025). Spreading false content in political campaigns: Disinformation in the 2024 European parliament elections. Media and Communication, 13, 9525. [Google Scholar] [CrossRef]
  14. Consejo Europeo & Consejo de la Unión Europea. (n.d.). EU sanctions against Russia explained. European Council. Available online: https://tinyurl.com/4kkrn4y8 (accessed on 30 June 2025).
  15. Demagog. (2022, March 2). False: A Ukrainian rocket hit the headquarters of the Kharkiv regional state administration. Demagog. Available online: https://demagog.org.pl/fake_news/stopfake-falsz-ukrainska-rakieta-uderzyla-w-siedzibe-charkowskiej-obwodowej-administracji-panstwowej/ (accessed on 30 June 2025).
  16. Dierickx, L., & Lindén, C. (2024). Screens as battlefields: Fact-checkers’ multidimensional challenges in debunking Russian-Ukrainian war propaganda. Media and Communication, 12, 8668. [Google Scholar] [CrossRef]
  17. EDMO. (2022a, March 11). The five disinformation narratives about the war in Ukraine. European Digital Media Observatory. Available online: https://edmo.eu/publications/the-five-disinformation-narratives-about-the-war-in-ukraine/ (accessed on 30 June 2025).
  18. EDMO. (2022b, March 18). Weekly insight n°1-disinformation narratives about the war in Ukraine. European Digital Media Observatory. Available online: https://tinyurl.com/bddpbuck (accessed on 30 June 2025).
  19. EDMO. (2023). De la pandemia a la guerra en Ucrania: Año y medio de lucha contra la desinformación: Informe en edición especial. EDMO. Available online: https://tinyurl.com/2mtnmdnn (accessed on 30 June 2025).
  20. EUvsDisinfo. (2015). About. EUvsDisinfo. Available online: https://tinyurl.com/2p8hsxyu (accessed on 30 June 2025).
  21. EUvsDisinfo. (2024, March 4). Reflexiones tras dos años de guerra y desinformación. EUvsDisinfo. Available online: https://tinyurl.com/27rdt2aj (accessed on 30 June 2025).
  22. Gamir-Ríos, J., Tarullo, R., & Ibáñez-Cuquerella, M. (2021). La desinformació multimodal sobre l’alteritat a Internet. Difusió de boles racistes, xenòfobes i islamòfobes el 2020. Anàlisi, 64, 49–64. [Google Scholar] [CrossRef]
  23. García-Marín, D., & Salvat-Martinrey, G. (2023). Desinformación y guerra. Verificación de las imágenes falsas sobre el conflicto ruso-ucraniano. Icono 14, 21(1). [Google Scholar] [CrossRef]
  24. Garriga, M., Ruiz-Incertis, R., & Magallón-Rosa, R. (2024). Propuestas de inteligencia artificial, desinformación y alfabetización mediática en torno a los deepfakes. Observatorio (OBS*), 18(5), 175–194. [Google Scholar] [CrossRef]
  25. Golovchenko, Y. (2020). Measuring the scope of pro-Kremlin disinformation on Twitter. Humanities & Social Sciences Communications, 7(1), 176. [Google Scholar] [CrossRef]
  26. Hallin, D. C., & Mancini, P. (2004). Comparing media systems: Three models of media and politics. Cambridge University Press. [Google Scholar]
  27. Hameleers, M. (2024). The nature of visual disinformation online: A qualitative content analysis of alternative and social media in The Netherlands. Political Communication, 42, 108–126. [Google Scholar] [CrossRef]
  28. Hameleers, M., Powell, T. E., Van Der Meer, T. G., & Bos, L. (2020). A picture paints a thousand lies? The effects and mechanisms of multimodal disinformation and rebuttals disseminated via social media. Political Communication, 37(2), 281–301. [Google Scholar] [CrossRef]
  29. Landman, T., & Carvalho, E. (2017). Issues and methods in comparative politics: An introduction (4th ed.). Routledge. [Google Scholar]
  30. Magallón, R., Fernández-Castrillo, C., & Garriga, M. (2023). Fact-checking in war: Types of hoaxes and trends from a year of disinformation in the Russo-Ukrainian war. Profesional De La Informacion, 32(5), 1–15. [Google Scholar] [CrossRef]
  31. Maldita.es. (2023, March 2). No, Zelensky did not say that “American children will die” for Ukraine. Maldita.es. Available online: https://maldita.es/malditobulo/20230302/Zelenski-Estados-Unidos-Ucrania-hijos/ (accessed on 30 June 2025).
  32. Montes, J. (2022). La desinformación: Un arma moderna en tiempos de guerra. Cuadernos de Periodistas, 44. Available online: https://tinyurl.com/mwwwm76b (accessed on 30 June 2025).
  33. Morris, L., & Oremus, W. (2022, December 8). Russian disinformation is demonizing Ukrainian refugees. Washington Post. Available online: https://tinyurl.com/2ktvhtx2 (accessed on 30 June 2025).
  34. Newtral. (2022, March 3). The images of a firefighting plane in Turkey that are being shared as if it were an aircraft ‘shot down by Russian forces’ in Ukraine. Newtral. Available online: https://www.newtral.es/avion-derribado-fuerzas-rusas-bulo/20220303/ (accessed on 30 June 2025).
  35. Pierri, F., Luceri, L., Jindal, N., & Ferrara, E. (2023, April 30). Propaganda and misinformation on Facebook and Twitter during the Russian invasion of Ukraine. 15th ACM Web Science Conference (pp. 65–74), Austin, TX, USA. [Google Scholar] [CrossRef]
  36. Reuters Fact Check. (2022, March 30). Berlin sex worker group not ‘recruiting’ Ukrainian refugees. Reuters. Available online: https://www.reuters.com/article/fact-check/berlin-sex-worker-group-not-recruiting-ukrainian-refugees-idUSL2N2VW1MJ/ (accessed on 30 June 2025).
  37. Rodríguez-Pérez, C., Sánchez-del-Vas, R., & Tuñón-Navarro, J. (2025). From fact-checking to debunking: The case of elections24check during the 2024 European elections. Media and Communication, 13, 9475. [Google Scholar] [CrossRef]
  38. Rondeli, A. (2014). Moscow’s information campaign and Georgia. Opinion paper, Georgian Foundation for Strategic and International Studies. Available online: https://tinyurl.com/35da8xr9 (accessed on 30 June 2025).
  39. Ruiz-Incertis, R., Sánchez-del-Vas, R., & Tuñón-Navarro, J. (2024). Análisis comparado de la desinformación difundida en Europa sobre la muerte de la reina Isabel II. Revista De Comunicación, 23(1), 507–534. [Google Scholar] [CrossRef]
  40. Salaverría, R., Buslón, N., López-Pan, F., León, B., López-Goñi, I., & Erviti, M. C. (2020). Desinformación en tiempos de pandemia: Tipología de los bulos sobre la COVID-19. El profesional de la Información (EPI), 29(3), 1–15. [Google Scholar] [CrossRef]
  41. Sánchez-del-Vas, R. (2025). Verificar para informar: La consolidación del fact-checking europeo como escudo frente a la desinformación. In J. Tuñón-Navarro, R. Sánchez-del-Vas, & L. Bouza-García (Eds.), Periodismo versus populismo comunicación de la unión Europea frente a la pandemia desinformativa (pp. 153–168). Editorial Comares. [Google Scholar]
  42. Sánchez-del-Vas, R., & Tuñón-Navarro, J. (2023). La comunicación europea del deporte en un contexto desinformativo y post-pandémico. In E. Ortega-Burgos, & M. M. García Caba (Eds.), Anuario de derecho deportivo 2023 (pp. 85–104). Tirant lo Blanch. [Google Scholar]
  43. Sánchez-del-Vas, R., & Tuñón-Navarro, J. (2024a). Disinformation on the COVID-19 pandemic and the Russia-Ukraine war: Two sides of the same coin? Humanities And Social Sciences Communications, 11(1), 851. [Google Scholar] [CrossRef]
  44. Sánchez-del-Vas, R., & Tuñón-Navarro, J. (2024b). Hoaxes’ anatomy: Analysis of disinformation during the coronavirus pandemic in Europe (2020–2022). Communication & Society, 37(4), 1–19. [Google Scholar] [CrossRef]
  45. Sundar, S. S., Molina, M., & Cho, E. (2021). Seeing is believing: Is video modality more powerful in spreading fake news via online messaging apps? Journal of Computer-Mediated Communication, 26(6), 301–319. [Google Scholar] [CrossRef]
  46. Thom, P. (2023, March 31). These military vehicles were not destined for Ukraine, they are being transferred back to the USA. CORRECTIV. Available online: https://correctiv.org/faktencheck/2023/03/31/diese-militaerfahrzeuge-waren-nicht-fuer-die-ukraine-bestimmt-sie-werden-in-die-usa-zurueckverlegt/ (accessed on 30 June 2025).
  47. Tuñón-Navarro, J., & Sánchez-del-Vas, R. (2022). Verificación: ¿la cuadratura del círculo contra la desinformación y las noticias falsas? adComunica. Revista Científica de Estrategias, Tendencias e Innovación en Comunicación, 23, 75–95. [Google Scholar] [CrossRef]
  48. Tuñón-Navarro, J., Sánchez-del-Vas, R., & Ruiz-Incertis, R. (2023). La regulación europea frente a la pandemia desinformativa. In J. A. Nicolás, & F. García-Moriyón (Eds.), El reto de la posverdad. Análisis multidisciplinar, valoración crítica y alternativas (pp. 129–150). Editorial Sindéresis. [Google Scholar]
  49. Tuñón-Navarro, J., Sánchez-del-Vas, R., & Sáenz-de-Ugarte, I. (2024). Desinformación y censura en conflictos internacionales: Los casos de Ucrania y Gaza (V. Palacio, Coord. y Ed.). Fundación Alternativas. Documento de trabajo No. 236/2024. Available online: https://tinyurl.com/2x5apbcm (accessed on 30 June 2025).
  50. Veebel, V. (2015). Russian propaganda, disinformation, and Estonia’s experience. In Foreign Policy Research Institute. Available online: https://tinyurl.com/4ju69tk2 (accessed on 30 June 2025).
  51. Vorster, O. R. (2021). The soviet information machine: The USSR’s influence on modern Russian media practices & disinformation campaigns. LSE Undergraduate Political Review, 4(2), 106–112. Available online: https://tinyurl.com/3vsac4ec (accessed on 30 June 2025).
  52. Wardle, C. (2017, March 14). Noticias falsas. Es complicado. First draft. Available online: https://tinyurl.com/yc7bad8t (accessed on 30 June 2025).
Figure 1. Format of analysed falsehoods by country. Own elaboration. Values shown in this and subsequent figures are rounded; totals may not equal 100% due to rounding.
Figure 2. Typology of analysed falsehoods by country. Own elaboration.
Figure 3. Platform of analysed falsehoods by country. Own elaboration.
Figure 4. Social networks of the analysed falsehoods by country. Own elaboration.
Figure 5. Purpose of analysed falsehoods by country. Own elaboration.
Table 1. Overview of interviewed experts and their professional backgrounds. Own elaboration.
Interviewee Code | Professional Field at the Time of the Interview | Date
Interviewee 1 | Fact-checker | 2023
Interviewee 2 | Fact-checker | 2023
Interviewee 3 | Fact-checker | 2023
Interviewee 4 | University lecturer and researcher specialising in media and disinformation | 2023
Interviewee 5 | University lecturer and researcher specialising in social network analysis | 2023
Interviewee 6 | Researcher specialising in Big Data and artificial intelligence | 2023
Interviewee 7 | Journalist and researcher specialising in disinformation in Europe | 2023
Interviewee 8 | Former member of European institutions | 2023
Interviewee 9 | Researcher at a think tank specialising in disinformation | 2023
Interviewee 10 | University lecturer and researcher specialising in European Affairs | 2023
Table 2. Number of fact-checks by country, language, and fact-checking organisation. Own elaboration.
Country | Language | Fact-Checking Organisation 1 and Number of Fact Checks | Fact-Checking Organisation 2 and Number of Fact Checks | Total | %
Spain | Spanish | Newtral: 45 | Maldito Bulo: 52 | 97 | 33%
Germany | German | CORRECTIV: 29 | BR24 Faktenfuchs: 4 | 33 | 11%
United Kingdom | English | FullFact: 24 | Reuters Fact Check: 64 | 88 | 30%
Poland | Polish | FakenewsPL: 9 | Demagog: 70 | 79 | 27%
