Article

Combatting Visual Fake News with a Professional Fact-Checking Tool in Education in France, Romania, Spain and Sweden

by Thomas Nygren 1,*, Mona Guath 1,2, Carl-Anton Werner Axelsson 1,3 and Divina Frau-Meigs 4

1 Department of Education, Uppsala University, 750 02 Uppsala, Sweden
2 Department of Psychology, Uppsala University, 751 42 Uppsala, Sweden
3 Department of Information Technology, Uppsala University, 751 05 Uppsala, Sweden
4 Digital Humanities, University Sorbonne Nouvelle, 75006 Paris, France
* Author to whom correspondence should be addressed.
Information 2021, 12(5), 201; https://doi.org/10.3390/info12050201
Submission received: 15 March 2021 / Revised: 28 April 2021 / Accepted: 28 April 2021 / Published: 6 May 2021
(This article belongs to the Special Issue Evaluating Methods and Decision Making)

Abstract

Educational and technical resources are regarded as central in combating disinformation and safeguarding democracy in an era of ‘fake news’. In this study, we investigated whether a professional fact-checking tool could be utilised in curricular activity to make pupils more skilled in determining the credibility of digital news and to inspire them to use digital tools to further their transliteracy and technocognition. In addition, we explored how pupils’ performance and attitudes regarding digital news and tools varied across four countries (France, Romania, Spain, and Sweden). Our findings showed that a two-hour intervention had a statistically significant impact on teenagers’ abilities to determine the credibility of fake images and videos. We also found that the intervention inspired pupils to use digital tools in information credibility assessments. Importantly, the intervention did not make pupils more sceptical of credible news. The impact of the intervention was greater in Romania and Spain than among pupils in Sweden and France. The greater impact in these two countries, we argue, is due to cultural context and the fact that pupils in Romania and Spain learned to focus less on ’gut feelings’, increased their use of digital tools, and had a more positive attitude toward the use of the fact-checking tool than pupils in Sweden and France.

1. Introduction

Faced with the challenges that are caused by information disorder and infodemics, there is a demand for educational interventions to support citizens and safeguard democracy [1,2,3]. Education is considered to be key, since automated fact-checking has significant limitations, not least when it comes to debunking visual images and deep fakes [4,5]. In addition, platform companies and fact-checkers struggle to keep pace with the speed and spread of disinformation (e.g., [1,6]), which makes it critical that citizens develop resilience to disinformation by learning to navigate digital news in more up-to-date and autonomous ways.
Disinformation—defined as inaccurate, manipulative, or falsified information that is deliberately designed to mislead people—is intentionally difficult to detect. This poses a challenge not only for professional fact-checkers in mainstream media and digital platforms, but also for media literacy specialists, whose expertise does not extend much beyond imparting basic source verification strategies [3]. Yet, the journalistic profession has been able to benefit from a growing number of fact-checking initiatives that have generated digital tools and novel responses to infodemics. However, such tools have not broadly reached the general public, which has mostly been left to its own devices. This gap between professionals and the general public is further widened by the evolution of disinformation itself; fake news is now not only text-based, but also increasingly image-based, especially on the social media used by young people, and so debunking news requires more sophisticated approaches.
Building resilience to fake news requires navigating online information in new ways and with the support of digital tools, similar to the methods used by professional fact-checkers [7,8,9]. Because new technology makes it hard to see the difference between a fake and a real video [10] or to distinguish a misleading image in a tweet from a credible one [11], teenagers often struggle to determine the credibility of images and videos when these are presented in deceptive ways [12,13,14]. Citizens need a combination of digital knowledge, attitudes, and skills to navigate the complicated digital world of post-truth, as highlighted by theories of media and information literacy, such as transliteracy [15] and technocognition [8].
Young people growing up in an era of online misinformation have been found to struggle to separate fake news from real news [12,14,16,17,18]. Teenagers who state that they are quite skilled at fact-checking may not possess the skills they think they have [13,19]. The idea that young people are digital natives who know how to navigate digital media much better than other generations has no support in the research. Instead, there is a call for educational efforts to promote the media and information literacy of teenagers with diverse backgrounds [14,20,21].
Research has highlighted the existence of a media and information literacy divide between people and pupils in different groups, highlighting a digital inequality between citizens [14,22,23,24,25]. Teenagers with poor socio-economic status may spend more time online on entertainment and simple tasks than peers with better support from home, and they may also find it difficult to separate fake news from real news [14,21,26,27]. Access to computers will not automatically bridge this divide since source-critical thinking has multiple interlinked dimensions and it is very complex and intellectually challenging to determine whom to trust online [28,29]. Pupils need more education designed to promote media and information literacy in general and visual transliteracy in particular in order to overcome this divide in different contexts.
Research indicates that it is possible to support people’s abilities to evaluate online information by giving short instructions on how to identify misleading headlines on Facebook and WhatsApp [7], by the use of games that are designed to alert against manipulative tweets [30], and by educational interventions that support pupils’ civic online reasoning [20,21,31]. However, because technological advances in visual media manipulation are leveraging the spread of false or misleading information, researchers are calling for ‘more intensive digital literacy training models (such as the “lateral reading” approach used by professional fact checkers)’ ([7], p. 7).
In this study, we took on this challenge by evaluating a professional digital fact-checking tool in classroom settings in France, Romania, Spain, and Sweden. The aim of this design-based study was to make the professional plug-in InVID-WeVerify useful in curricular activity in order to improve pupils’ skills in evaluating misleading images and videos. We investigated the potential benefits and challenges of implementing the latest advances in image and video verification in collaboration with teachers across Europe.
The tool, InVID-WeVerify, is a free verification plug-in, available in multiple languages, that is used today by professional journalists and fact-checkers to verify images and videos in newsrooms such as France24, India Today, Canal 1, and Volkskrant [32]. The plug-in has been downloaded more than 40,000 times across the globe, and it is used on a daily basis by, among others, fact-checkers at Agence France-Presse (AFP) to investigate rumours and suspicious content regarding, for example, Covid-19 and politics.

1.1. Educational Interventions to Support Fact-Checking in a Post-Truth Era

International organizations, like UNESCO and the European Union, underscore the importance of education to promote so-called media and information literacy as an important defence against propaganda and disinformation [1,33]. Media and information literacy may be viewed as an umbrella term covering knowledge, skills, and attitudes described by researchers as information, news, media, and digital literacies [33,34,35]. Information literacy—the ability to evaluate and use information wisely—has especially been noted as a ‘survival skill’ [36]. In line with this, the theory of civic online reasoning underscores how “the ability to effectively search for, evaluate, and verify social and political information online” is essential for all citizens ([18], p. 1). The multi-modal aspects of digital information involve new challenges when people search for, find, review, analyse, and create information [37,38,39]. Researchers also call for more research on civic online reasoning with new and more complex tasks and test-items paying attention to pupils’ knowledge, skills, and attitudes in different educational settings [20].
Today, the ability to read, write, and interact across a range of platforms, tools, and media, described as transliteracy, has become a key literacy in a world of digital multimodal information [40]. Transliteracy has been enlarged to embrace the double-meaning of digital convergence: ‘1. the ability to embrace the full layout of multimedia, which encompasses skills for reading, writing, and calculating with all the available tools (from paper to image, from book to wiki); 2. the capacity to navigate through multiple domains, which entails the ability to search, evaluate, test, validate and modify information according to its relevant contexts of use (as code, news, and document)’ ([41], pp. 15–16). Transliteracy echoes technocognition as an emerging interdisciplinary field that involves technological solutions incorporating psychological principles to solve disinformation issues [8]. In the present study, we focus on the latter aspect of transliteracy, more precisely, how tools can facilitate navigating a digital information landscape.
Scholars point out that journalistic principles and technology may support citizens in navigating a digital world of deep fakes and misleading visual and text-based information [8,9]. The use of digital tools to support the verification of news has been discussed in terms of technocognition and civic online reasoning [42]. These prescriptive theories emphasise that citizens need to be better at scrutinising online information, and that this is a psychological, technological, and educational challenge. In a post-truth era, people need to consider that images and videos may be manipulated and also be able to use digital resources, such as text search and reverse image search, to corroborate information. Professional fact-checkers use technology to read laterally, which means that they verify information on a webpage by corroborating it with information on other credible webpages [9]. Researchers note that education and technology that support lateral reading may be key to safeguarding democracy in a post-truth era that is saturated by disinformation [7,8,9]. However, the use of professional fact-checking tools in education to support pupils’ transliteracy, lateral reading, and technocognition has not been studied in previous research to date.
What has been noted in media and information literacy research is that pupils often struggle to separate credible from misleading digital multimodal information [12,14]. Even individuals with proficient news media knowledge may struggle to evaluate evidence online [17,43]. The high expectations of news literacy programmes [3] should be understood in light of these challenges. Scholars also emphasise that technology and educational interventions are not quick fixes for the complex challenge of misinformation [44]. More time in front of computers does not necessarily make pupils more skilled at navigating online information [18,45,46]. Without adequate media and information literacy, pupils may fail to separate credible information from misleading information, because they are not able to use effective and adaptive strategies when evaluating manipulated images and junk news [28]. It is critical that the educational design includes a combination of challenging and stimulating tasks, and different types of hard and soft scaffolds to help pupils use online resources in constructive ways [47,48,49,50,51,52].
While noting the many challenges, we still find a few studies highlighting the ways in which it is possible to support pupils’ lateral reading skills in education. Educational designs for promoting civic online reasoning have made it possible for teenagers at the university and high school level to scrutinise digital news in a similar manner to professional fact-checkers [20,21,31,53]. Previous research has also identified that it is possible for upper secondary school pupils to use digital tools that are designed for professional historians in critical and constructive ways if they are supported by an educational intervention comprising supporting materials and teaching [54,55].

1.2. Design-Based Research

Implementing innovative technology in schools is often linked to design-based research in education, also known as design experiments, design research, or design study [56]. Testing and developing digital tools that may hold new dimensions and practices is often at the core of design-based research [56], not least since this may provide new practical and theoretical insights. Design-based research aims to “test and build theories of teaching and learning, and produce instructional tools that survive the challenges of everyday practice” ([57], p. 25). The usefulness of design-based research comes from methods where researchers and teachers collaborate to identify challenges and test new materials and methods in complex classroom settings, with the purpose of promoting pupils’ learning [58,59]. In line with a previous call for “research that features practitioner co-creation of knowledge as a vehicle for use and uptake” ([60], p. 98) and congruent with Akkerman et al. [61], we acknowledge that dialogue with teachers regarding technology is essential in educational design-based research.
Design-based research advocate Ann Brown [62] stresses the importance of collecting data from messy classrooms in order to measure learning where it usually occurs. She underscores the value of measuring effects through pre- and post-tests designed to fit the research focus, which is central in our study ([62]; see also Figure 1 and the section below on materials and methods). Our research is based on the assumption that the design of materials and methods is important for learning, and we focus on developing new tools and theories for teaching in the complex reality of teaching and learning [63]. In line with the methodology of design-based research, we hold that the materials and methods developed through iterative studies in classrooms should preferably survive the challenges of classroom practice and remain in use in teaching long after the research project is completed [57,64]. Thus, design-based research professionals argue that educational science needs to develop ideas and products that work in thoughtful ways [65] and, in this article, we present some steps in this direction.

1.3. The Present Study

Noting the major challenge of fake news, the limited ability of pupils to navigate information in digital environments and the existence of digital fact-checking tools, we see an opportunity to answer the calls for media and information literacy interventions with a design-based research approach. In this study, we investigated whether a two-hour educational intervention in four European countries, using a computer-based tool that was designed for professional fact-checkers, could stimulate pupils’ visual literacy and make them better at determining the credibility of digital news. We explored the following hypotheses:
Hypothesis 1.
Pupils will become more skilled at assessing digital news after the intervention. Specifically, we expected the following:
(a) The total post-test score will be significantly better than the total pre-test score.
(b) The ability to debunk false and true items separately will also be significantly better in the post-test.
Hypothesis 2.
Better performance in assessing information will be facilitated by the use of digital tools.
In addition, we investigated the following exploratory research questions:
Q1: How do media attitudes and digital information attitudes vary across countries?
Q2: How does performance on pre- and post-test vary across countries?

2. Materials and Methods

2.1. Participants

A total of 373 upper secondary school pupils, aged 16–18, across the four countries participated in the lessons and responded to the questionnaire during the Autumn term of 2020. All of the pupils agreed to complete the pre-test with anonymised responses for research purposes (with informed consent in line with the ethical guidelines of all countries). Of the 373 pupils, 238 took both the pre- and post-test, and these were the participants included in the analyses. The number of complete responses in each country was: 59 in France, 22 in Romania, 47 in Spain, and 110 in Sweden. The gender distribution was 144 girls, 83 boys, and 11 pupils who did not wish to specify their gender or who identified as non-binary. The different sample sizes in each country were primarily due to lockdowns and challenges linked to schooling during the Covid-19 pandemic.

2.2. Material

Media and information literacy theories, such as transliteracy, civic online reasoning, and technocognition, all emphasise the importance of using digital tools when evaluating online information. The plug-in InVID-WeVerify is such a tool and it offers multiple functionalities that provide support to users when verifying videos and images [66,67]. InVID-WeVerify makes it possible to (a) conduct reverse image searches with multiple search engines (e.g., Google, Yandex, Bing); (b) analyse images with forensic filters to detect alterations in their structure, such as quantisation, frequencies, colours, and pixel coherence; (c) scrutinise the details of an image with a magnifying lens; (d) fragment videos into key-frames; and, (e) retrieve metadata about videos and images.
Thus, the tool is designed to support advanced media verification strategies. However, introducing a professional tool for fact-checking in education may have little effect if the tool is not understood or is found to be unusable by teachers or pupils. General media literacy principles reinforce the need for sense-making uses of technology in terms of knowledge acquisition and societal values, recommending that the tool or operational device not be the main entryway to problem-solving [41]. This is consistent with prior research pointing to the fact that dialogue with teachers regarding technology is essential in educational design-based research [61]. Therefore, we initiated our endeavour with a study design phase, in which 34 teachers from France, Romania, Spain, and Sweden participated in focus group discussions with the aim of testing the tool and providing feedback on its usefulness in education (for a complete design overview, see Figure 1). Focus group discussions were organised to assess teachers’ perceptions of disinformation in their local context and their perspectives on InVID-WeVerify functionalities, especially in relation to reverse image search, automated video key frame extraction, and image forensics. The findings from these focus group discussions highlighted that implementing the tool in class may be very complicated and pointed to a need for scaffolds [68]. The results also highlighted some cultural differences; for instance, challenges may be greater in Romania than in Sweden due to the different media cultures and technical resources available in the two countries. Learning from teacher feedback, we designed educational materials to scaffold the use of the tool in classrooms. This was achieved through close collaboration between teachers and researchers. Researchers from the participating countries discussed and created materials and methods for stimulating transliteracy and technocognition. These materials were then introduced to a teacher who tested and piloted them in teaching and then provided feedback. In this phase, we also designed and piloted credible and fake news items for use in pre- and post-tests (see example items in Appendix A). The final educational design, limited to a two-hour intervention, was agreed upon by 16 social studies teachers and eight researchers based in the four countries.
Materials for the educational intervention included a lesson plan for teachers, handouts for use in the classroom, and presentation slides. The educational intervention was introduced with an initial 60 min lesson on the theme ’News in a world of fake news: Definitions, credibility, and identification of different types of misinformation’, and included a combination of lectures (presenting concepts that are linked to misinformation, examples of fake news, and summing up discussions) and active exercises for pupils where they were asked to (a) come up with news sources (see Figure 2) and (b) identify different types of misinformation. The lesson was concluded with a sharing of results and collective discussions about what the pupils learned from the lesson.
The second lesson, which was also 60 min in duration, focused on ’Individual defence against misinformation and disinformation: Understanding image manipulation with InVID-WeVerify’. This lesson started with a short lecture on how fact-checkers use lateral reading and verify information by considering: (a) Who is behind this information? (b) What is the evidence? and (c) What do other sources say? [9]. The teacher acted as a fact-checker by conducting a reverse image search and forensic analysis of an image with InVID-WeVerify. Next, the teacher showed a fake video and verified this using the InVID-WeVerify key frames analysis. Thereafter, the pupils downloaded the plug-in and worked in groups of two to three to verify images and videos with InVID-WeVerify. The pupil task (Figure 3) was provided in PDF format, which made it possible for them to click on hyperlinks and use InVID-WeVerify ’in the wild’ with authentic examples of misleading images and videos.
The step-by-step task was designed to prompt pupils to use different techniques to debunk images and videos. After the group work, the teacher showed slides explaining how to debunk the fake images and video using InVID-WeVerify (see Figure 4) and discussed, in class, what the pupils had learned. Summing up the two lessons, the teacher then presented some information regarding how to navigate online news in clever ways with tips, such as: (a) set filters on credible sources (i.e., reviewed news feeds from established news media); (b) be careful about frames and deceptive design (what looks great may be the very opposite); (c) rely on several search engines; (d) look for other independent sources, double check!; (e) think first then share—share with care!; and, (f) stay cool!

2.3. Procedure

With an aim to develop new methods and materials that are useful in the complexity of everyday classroom practices, we made sure to collect a rich set of data, enabling us to investigate the possibilities and challenges of this educational design [56,64]. The intervention took place in October 2020 during the Covid-19 pandemic, presenting us with a special challenge in conducting the educational effort. It was initially planned for March 2020, but was postponed due to school lockdowns in three out of the four countries. The intervention started and ended with an online questionnaire, which included a test with two fake news test items (a misleading image and a misleading video) and one credible news test item. We made sure to include both credible and fake information, because scholars have noted that exposure to fake news may lead to diminished levels of trust in credible news [69]. We used different items in the pre- and post-tests, and counterbalanced these items between groups to ensure that the results would come from the intervention and not the test items. Test items—one true news item, one item with a manipulated image, and one fake video—were introduced with the following instruction: ’You will now be asked to rate the credibility of news on social media. When you do this, you are free to double check the information online’. The items were presented in a randomised order to minimise order effects. All of the items included were social media posts intended to ’go viral’, that is, they were designed to attract attention, clicks, and shares.
The pre- and post-test also included questions regarding the pupils’ use of digital tools when they assessed the credibility of the information. We asked: ‘When you answered the questions in the survey, did you use any digital tools? For example, did you use Google or reverse image search to examine the credibility of the news? Yes/No’. If they checked ‘Yes’, we asked them: ‘How did you use digital tools to help you examine the credibility of the news? (check multiple boxes if you did multiple things) (a) I used text searches (for instance on Google), (b) I did reverse image searches, (c) I used multiple search engines, and/or (d) other (please specify)’.
We also asked the pupils questions regarding their background, their attitudes towards news, and how they usually determine credibility (see Appendix B and Appendix C). These factors have been identified as important in previous research and in research highlighting the complexity of media and information literacy [13,14,70]. In order to investigate the pupils’ self-perceived skills and attitudes, we asked them to rate their ability to find and evaluate information online and their attitude towards credible information sources, in line with previous research [13,14]. Answers were given on a five-point scale (see Appendix B, Questions 3–6). The participants were then asked to rate statements on their strategies to determine the credibility of news on a scale from 1 (never) to 7 (very often), with questions adapted from Frunzaru and Corbu [70] (see Appendix B, Question 7).
In the post-test, we asked the pupils to rate their user experience of InVID-WeVerify in order to assess their perception of the visual verification tool. All of the questions in the tests were asked in the native language of the pupils. We also interviewed teachers and asked them (a) what worked well in the teaching, (b) what problems arose in the teaching, and (c) what improvements they would suggest.

2.4. Design

The study design was a repeated measurement using pre- and post-tests around the educational intervention with the InVID-WeVerify tool, where the order of the pre- and post-test items was counterbalanced to avoid order effects.

2.5. Analysis

We transformed false item scores by subtracting each false score from the maximum rating for each false item, so that a high score signified good performance. We also reversed the items in the news evaluation test, where higher ratings indicated less awareness of the need to fact-check information. We then summed all of the pre- and post-test items for the respective order conditions to yield two scores, a pre-test score and a post-test score, respectively.
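To make the score preparation concrete, the following is a minimal sketch in Python/pandas of the transformation described above. It is for illustration only and not the authors’ analysis script; the DataFrame `scores`, its column names, and the exact maximum rating used for the reversal (here 7, matching the 1–7 rating scale reported in the Results) are assumptions.
```python
import pandas as pd

MAX_RATING = 7  # assumption: credibility items were rated on a 1-7 scale

# Hypothetical wide-format data: one row per pupil, one column per test item.
false_items = ["pre_fake_image", "pre_fake_video", "post_fake_image", "post_fake_video"]
pre_cols = ["pre_fake_image", "pre_fake_video", "pre_true_news"]
post_cols = ["post_fake_image", "post_fake_video", "post_true_news"]

def prepare_scores(scores: pd.DataFrame) -> pd.DataFrame:
    out = scores.copy()
    # Reverse-code false items so that a high score signifies good performance.
    for col in false_items:
        out[col] = MAX_RATING - out[col]
    # Sum items into one pre-test and one post-test score per pupil.
    out["pre_total"] = out[pre_cols].sum(axis=1)
    out["post_total"] = out[post_cols].sum(axis=1)
    return out
```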
We ran a mixed ANOVA with time as the repeated measure, the use of digital tools on the post-test and language as between-subjects variables, and total test score as the outcome variable, in order to analyse performance in relation to the hypothesised relationships. Because the sample sizes in each country were unequal, we followed Langsrud’s advice [71] and ran ANOVAs with Type II sums of squares for the unbalanced design. Essentially, the Type II sum-of-squares allows lower-order terms to explain as much variation as possible, adjusting for one another. Because of non-normally distributed data, we analysed the post-test scores on false and true items with the Wilcoxon rank-sum test, with the use of digital tools as the independent variable.
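For illustration only, these two inferential steps could be sketched as follows in Python; the authors presumably used dedicated statistical software for the Type II sums of squares, so this is not their implementation. The long-format DataFrame `long_df` and its columns (`pupil_id`, `time`, `language`, `tool_use`, `total_score`, `false_score`, `true_score`) are hypothetical, and pingouin’s mixed_anova accepts only one between-subjects factor, so the sketch covers only part of the full design.
```python
import pingouin as pg
from scipy import stats

# Mixed ANOVA with 'time' (pre/post) as the repeated measure and one
# between-subjects factor; the full design also crosses digital-tool use.
aov = pg.mixed_anova(data=long_df, dv="total_score", within="time",
                     subject="pupil_id", between="language")
print(aov)

# Wilcoxon rank-sum tests on post-test false and true item scores,
# with use of digital tools (yes/no) as the independent variable.
post = long_df[long_df["time"] == "post"]
for col in ["false_score", "true_score"]:
    users = post.loc[post["tool_use"] == "yes", col]
    non_users = post.loc[post["tool_use"] == "no", col]
    w, p = stats.ranksums(users, non_users)
    print(col, round(w, 2), round(p, 3))
```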
Next, we investigated how the attitudes differed between the countries. Summary statistics of all attitudes are provided in Table A1, Table A2, Table A3, Table A4, Table A5, Table A6, Table A7, Table A8, Table A9, Table A10 and Table A11 in Appendix C; only statistically significant differences are discussed here in the text. For the self-rated skills and attitudes in relation to digital proficiency, we ran one-way ANOVAs using the Type II sum-of-squares, with language as the independent variable. For the self-rated attitudes towards news and attitudes towards the digital tool, we conducted pairwise t-tests for each attitude, separately for the pre- and post-test ratings for each language.
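A comparable sketch for the attitude analyses, again with hypothetical column names (`fact_checking_ability`, `gut_feeling_pre`, `gut_feeling_post`) and under the assumption that the non-integer degrees of freedom reported in Section 3.6 reflect Welch’s t-tests, could look as follows; statsmodels’ anova_lm with typ=2 provides the Type II sums of squares.
```python
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm
from scipy import stats

# One-way ANOVA with Type II sums of squares, language as the factor.
model = smf.ols("fact_checking_ability ~ C(language)", data=attitudes).fit()
print(anova_lm(model, typ=2))

# Pre- vs post-test comparison of one news evaluation item, per language.
for lang, grp in attitudes.groupby("language"):
    t, p = stats.ttest_ind(grp["gut_feeling_pre"], grp["gut_feeling_post"],
                           equal_var=False, nan_policy="omit")
    print(lang, round(t, 2), round(p, 3))
```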

3. Results

3.1. Performance Test

The maximum total score on the performance test was 21, with each item providing a maximum score of 7. The mean total score on the pre-test across all countries was 12.2 (SD = 3.2), and the mean total score on the post-test was 13.2 (SD = 2.9), a statistically significant difference (t(467.63) = 3.58, p < 0.001). Table 1 presents the pre- and post-test performance for each language version of the tests. The median total score on false items was 9 (MAD = 3.0) on the pre-test and 10 (MAD = 3.0) on the post-test, a difference that was statistically significant (W = 34,742, p < 0.001). For the true items, the median on the pre-test (3; MAD = 3.0) did not differ statistically significantly (W = 26,283, p = 0.29) from the median on the post-test (3; MAD = 3.0).

3.2. Differences in Pre- and Post-Test Scores in Relation to Use of Digital Tools and Language

Regarding the use of digital tools, 14% of the participants stated that they had used digital tools in the pre-test and 44% stated that they had used digital tools in the post-test when evaluating the news. Table 2 presents the digital tool use in pre- and post-test for each language version of the tests.
We ran a mixed 2 × 2 × 4 ANOVA, with time as the within-subjects variable (pre/post), and use of digital tools (yes/no) and language (French, Romanian, Spanish, and Swedish) as the between-subjects variables (for complete results, refer to Table A13 in Appendix C). The results showed a statistically significant main effect of using digital tools (F(1, 229) = 7.29, MSE = 11.94, ηp² = 0.031, p = 0.007). A post-hoc Bonferroni-corrected t-test showed a statistically significantly higher mean total score (p < 0.001) with digital tools (M = 13.1, 95% CI [12.7, 13.6]) than without (M = 12.2, 95% CI [11.7, 12.6]). There was also a statistically significant main effect of time (F(1, 229) = 18.87, MSE = 6.14, ηp² = 0.076, p < 0.001).
A follow-up Bonferroni-corrected t-test (p = 0.008) showed a statistically significantly higher total score on the post-test (M = 13.4, 95% CI [13.0, 13.8]) than on the pre-test (M = 11.9, 95% CI [11.5, 12.3]). However, the main effect of time was qualified by an interaction between language and time (F(3, 229) = 2.95, MSE = 6.14, ηp² = 0.037, p = 0.033; depicted in Figure 5, Panel B).
A follow-up analysis with Bonferroni-corrected pairwise comparisons resulted in a statistically significant difference (p = 0.021) between pre- (M = 12.2, 95% CI [11.3, 13.0]) and post-test (M = 13.9, 95% CI [13.0, 14.7]) for the Spanish language. There was also a statistically significant difference (p = 0.004) between pre- (M = 10.4, 95% CI [9.24, 11.6]) and post-test (M = 13.3, 95% CI [12.1, 14.5]) for the Romanian language. However, the differences between the pre- and post-test for the Swedish and French languages were not statistically significant. To summarise, using digital tools resulted in higher total post-test scores, and the post-test scores were statistically significantly higher than the pre-test scores for the Spanish and Romanian languages.
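The per-language follow-up comparisons can be sketched in the same hypothetical setup as above; whether the original tests were paired or Welch-type is not stated, so the choice of Welch’s t-test below is an assumption, and the Bonferroni correction is applied across the four languages.
```python
from scipy import stats
from statsmodels.stats.multitest import multipletests

langs = ["French", "Romanian", "Spanish", "Swedish"]
t_vals, raw_p = [], []
for lang in langs:
    sub = long_df[long_df["language"] == lang]
    pre = sub.loc[sub["time"] == "pre", "total_score"]
    post = sub.loc[sub["time"] == "post", "total_score"]
    t, p = stats.ttest_ind(pre, post, equal_var=False)  # Welch's t-test (assumption)
    t_vals.append(t)
    raw_p.append(p)

# Bonferroni correction across the four pre/post comparisons.
reject, p_adj, _, _ = multipletests(raw_p, method="bonferroni")
for lang, t, p in zip(langs, t_vals, p_adj):
    print(lang, round(t, 2), round(p, 3))
```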

3.3. Post-Test Scores on True and False Items When Using Digital Tools

For total scores on false items on the post-test, there was a statistically significant difference (W = 5642, p = 0.028) with an advantage for those using digital tools. For total scores on true post-test items, there was also a statistically significant difference (W = 5383, p = 0.007), with an advantage for those using digital tools. We present medians and median absolute deviations in Table 3. Using digital tools is clearly advantageous; however, there is also greater spread (MAD) in post-test scores within the group using digital tools as compared with the group not using digital tools.

3.4. Attitudes

For the digital attitude scale, we report the means in Table 4 and, for the news evaluation scale and InVID-WeVerify attitudes, we refer to Table A1, Table A2, Table A3, Table A4, Table A5, Table A6, Table A7, Table A8, Table A9, Table A10 and Table A11 and Table A12, respectively, in the Appendix C. For the self-rated attitudes and skills, the participants were provided with a five-point rating scale.
The mean scores were moderate for the media and information literacy attitudes, except for internet info reliability and credibility importance. The pupils were quite sceptical about news on the internet (info reliability) and valued credible news highly (credibility importance), which may be due to the fact that they were just about to go through a fact-checking intervention, but it may also reflect a more general scepticism of online information and a positive attitude towards credible news. In addition, pupils rated their ability to assess online information (fact-checking ability and search ability) quite highly.

3.5. Self-Rated Attitudes and Skills

For self-rated fact-checking ability (see Figure 6A), there was a statistically significant difference between languages (F(3, 234) = 12.36, MSE = 0.72, ηp² = 0.14, p < 0.001). Tukey-corrected pairwise comparisons showed a statistically significant difference between Spanish and Swedish (p < 0.001), as well as between French and Swedish (p < 0.001).
For self-rated search ability (see Figure 6B), there was a statistically significant difference between languages (F(3, 234) = 17.32, MSE = 0.67, ηp² = 0.18, p < 0.001). Tukey-corrected pairwise comparisons showed a statistically significant difference between Spanish and Romanian (p < 0.001), French and Romanian (p < 0.001), Swedish and French (p < 0.001), and Romanian and Swedish (p < 0.001).
For the ratings of internet info reliability (see Figure 6C), there was a statistically significant difference between languages (F(3, 234) = 12.36, MSE = 0.32, ηp² = 0.039, p = 0.024). Tukey-corrected pairwise comparisons showed a statistically significant difference between French and Swedish (p = 0.034). Finally, there were no statistically significant differences (F(3, 234) = 1.08, MSE = 0.86, ηp² = 0.014, p = 0.36) for the ratings of credibility importance.

3.6. News Evaluation Ratings

For the news evaluation attitudes, we made pairwise t-tests for pre- and post-ratings separately for each language. There were no statistically significant differences for French.
For Romanian (see Figure 7A), there was a statistically significant difference (t(40.91) = 3.83, p < 0.001) between pre- and post-test ratings of ‘relying on my gut feeling’, with a lower rating on post-test. There was also a statistically significant difference (t(40.38) = 2.24, p = 0.030) between the pre- and post-test ratings of ‘design of images’, with higher ratings on post-test.
For Spanish (see Figure 7B), there was a statistically significant difference (t(89.48) = 2.92, p = 0.004) between pre- and post-test ratings of ‘relying on my gut feeling’, with lower ratings on post-test. There was a statistically significant difference (t(90.81) = 2.49, p = 0.015) between the pre- and post-test ratings of ‘search for the source’, with higher ratings on post-test. There was a statistically significant difference (t(89.97) = 2.69, p = 0.009) between pre- and post-test ratings of ‘design of images’, with higher ratings on post-test.
For Swedish (see Figure 7C), there was a statistically significant difference (t(215.98) = 2.50, p = 0.013) between the pre- and post-test ratings of ‘relying on journalists’ reputation’, with lower ratings on post-test. There was also a statistically significant difference (t(209.51) = 4.53, p = 0.015) between pre- and post-test ratings of ‘design of images’, with higher ratings on post-test.

3.7. InVID-WeVerify Ratings

For the ratings on InVID-WeVerify, we present the mean total ratings, with error bars for each language, in Figure 8. There were 17 items in total, each with a 1–7 point rating scale, resulting in a maximum total rating of 119. A higher rating represents a more positive attitude towards the digital tool.
A one-way ANOVA (Type II sum-of-squares), with language as the independent variable and total mean ratings as the dependent variable, showed a statistically significant main effect (F(3, 202) = 8.22, MSE = 322.65, ηp² = 0.11, p < 0.001). A follow-up test with Tukey pairwise comparisons showed a statistically significant difference between Spanish and Swedish (t(202) = 2.98, p = 0.017), French and Romanian (t(202) = 3.56, p = 0.003), and Romanian and Swedish (t(202) = 4.52, p < 0.001).

3.8. Teachers’ Impressions from Teaching

Teachers from all countries found the intervention interesting, but most of them also found it difficult to fit into the limited two-hour time frame. Across countries, teachers emphasised that pupils showed interest in the topic of ’fake news’. Teachers in Spain and Romania, in particular, reported that pupils were very excited about using the new technology in the classroom. In contrast, one of the Swedish teachers reported less enthusiasm than normal in his class. In France, teachers experienced technical difficulties due to problems with installing and using the tool, slow connectivity, and dependency on network quality. Spanish teachers experienced few problems in the implementation and called for an app version of the tool to make it more useful for pupils. Teachers from all countries highlighted the usefulness of the forensic analysis functionality of the tool, but they also reported problems with the video analysis functionality (key frame analysis).

3.9. Summary of Results

With regard to performance, there were statistically significantly higher mean total scores on the post-test as compared with the pre-test. Using digital tools on the post-test resulted in higher total scores. Further, for Spanish and Romanian pupils, total post-test scores were statistically significantly higher than pre-test scores, but not for French and Swedish pupils. For separate true and false items, there were statistically significant differences between pre- and post-test for false items, but not for true items. Using digital tools produced higher scores on the post-test for both false and true items. For the attitude ratings, there were statistically significant differences between languages on the media and information literacy attitude ratings. For fact-checking ability, Swedish ratings were statistically significantly higher than Spanish and French ratings. For search ability, Romanian ratings were statistically significantly higher than those of all other languages, and there were statistically significant differences between the Swedish and French ratings. For internet info reliability, Swedish ratings were statistically significantly higher than French ratings. For news evaluation ratings, there were statistically significant differences between pre- and post-test ratings for Romanian and Spanish pupils on ‘gut feeling’, with lower ratings on post-test. There were also statistically significant differences between pre- and post-test ratings for Romanian, Spanish, and Swedish pupils on ‘design of images’, with higher ratings on post-test. Finally, there were statistically significant differences between pre- and post-test ratings on ‘search for the source’ for Spanish pupils, with higher ratings on post-test, and statistically significant differences between pre- and post-test ratings on ‘journalists’ reputation’ for Swedish pupils, with lower ratings on post-test.
For mean total InVID-WeVerify attitude ratings, there were statistically significant differences between languages. Romanian speakers displayed higher ratings than French and Swedish, and Spanish speakers displayed higher ratings than Swedish speakers. The attitudes towards the tool are congruent with the reports from teachers, highlighting that pupils in Romania and Spain were especially positive towards the educational intervention with what they perceived as very new and interesting technology.

4. Discussion

Previous research has highlighted that digital fact-checking strategies are scarce among historians and students, even at elite universities [9]. However, previous research has also highlighted how it is possible, but also an educational challenge, to support teenagers’ abilities to evaluate news [20,21]. In the present study, with regard to our initial hypotheses, we found that the overall performance on the post-test was better than on the pre-test (H1a), and that the overall performance on false items was better on the post-test than on the pre-test, but not on true items (H1b). In line with previous educational interventions designed to support teenagers’ fact-checking abilities, we saw that not all of the pupils learned to evaluate digital news better. However, we found that our intervention supported pupils in groups with poor performance on the pre-test. This indicates that our intervention with a professional fact-checking tool can be implemented in education cross-contextually, and especially in classrooms with pupils who lack the skills to navigate online news. We find that it is possible to stimulate pupils’ technocognition and their transliteracy across texts, images, and videos in updated ways. What scholars call for in theory, we test in practice [8,72]. What seem to be complicated cognitive acts in human–computer interaction, such as using professional fact-checking tools for video and image verification, can be supported in education. Introducing a state-of-the-art fact-checking tool proved feasible and, with the support of the educational design, and not least the teachers, pupils became better at evaluating misleading images and news with the help of technology and training.
In line with our second hypothesis (H2), the results indicate that using digital tools led to better performance on the post-test. In the post-test, the pupils performed better on false and true items when using digital tools. Research has previously highlighted how implementing digital tools that are designed for professionals may be very confusing to pupils and hard for them to use without proper support [55]. We do not see this problem in our results, perhaps due to helpful materials and teaching developed in dialogue between researchers and teachers in this design-based research study. Instead, our results indicate that a complicated tool, like InVID-WeVerify, may help pupils to navigate better in the complex world of misinformation. In future research, it would be interesting to investigate in more detail whether new technology may actually be more useful to pupils with poorer prior performance, and how technocognition and transliteracy play out on an individual level in relation to different tools and digital environments. It is evident that using Google reverse image search can be useful, but different search engines can produce different results, and being aware of other search engines, such as Bing and Yandex, broadens lateral reading strategies. Conducting image analysis with digital tools may also help pupils to see how algorithms work and how automated fact-checking with forensic analysis can identify manipulated images. In future research, it would be interesting to investigate how pupils’ understanding of algorithms and machine learning can be stimulated by using verification tools. The many challenges of misinformation, including misleading images and videos, make it important to ensure that citizens have a rich set of digital tools to support them when navigating online news feeds.

4.1. Total Performance Highlights the Importance of Technocognition and Transliteracy

It is evident that the intervention had a positive effect on pupils’ performance. Pupils rated fake videos and manipulated images as less credible after the intervention, and the intervention did not lead them to rate credible news as less credible, which has previously been noted as a risk that is linked to exposure to fake news [69]. However, only pupils using digital tools in the post-test rated credible news as more trustworthy than before the intervention. This highlights the importance of using digital tools to support the process of verifying digital news, in line with theories of technocognition and transliteracy. Evidently, more pupils used digital tools after the intervention, but many pupils still did not.
The intervention did not have an overall effect on pupils’ rating of true items as more true; instead, the mean score indicates that most pupils landed on the fence between ‘not at all credible’ and ‘extremely credible’. This calls for further research to identify how to help pupils become better at identifying credible news as more than just ’somewhere in between’.

4.2. Variations across Countries, Performances and Attitudes

In relation to our research questions: (Q1) how do media attitudes and digital information attitudes vary across countries? and (Q2) how does performance on pre- and post-test vary across countries?, there are some interesting results. We did not conduct any direct analysis of pupils’ self-rated attitudes and their performance, but we did see some interesting variations across the four countries that may, in an indirect way, help us to understand the impact of the intervention, and the lack thereof.
The fact that the impact on performance was stronger in Romania and Spain than in Sweden and France may be due to the different skills of the pupils participating in the educational intervention. Swedish pupils scored better than other pupils on the pre-test (M = 12.7), while Romanian pupils scored the lowest on the pre-test (M = 10.4). In the post-test, Romanian pupils scored marginally better than Swedish pupils (M = 13.3 versus M = 13.1, respectively). Pupils in Spain also started with lower scores and gained more from this intervention than pupils in France and Sweden. This result may be understood in the light of our previous research, highlighting the great challenge facing education in Romania due to the media culture [68]. It may also be understood in relation to the fact that pupils in Romania and Spain, in particular, stated that they followed their gut feeling significantly less after the intervention. Previous research highlights how reliance on gut feelings and emotions may increase belief in misinformation [73,74]. The positive impact in Spain may also be a result of pupils in Spain learning to see the importance of investigating sources of information. In addition, we found that pupils in three out of four countries learned to pay more attention to the design of images. This could also partly explain the better post-test performance in Spain and Romania, although pupils in Sweden did not seem to benefit from this. Thus, perhaps the combination of not relying on gut feelings and paying attention to design is key. The lack of impact for Swedish pupils may be understood in light of the fact that Swedish pupils rated themselves as very skilled at fact-checking and searching for information, which previous research has highlighted as a problematic attitude in relation to performance [13]. Further, we found that pupils in Romania were more positive overall towards the digital tool than other pupils, particularly those in France and Sweden. Perhaps this also made them more engaged than other pupils in using digital tools for fact-checking. The more positive attitudes towards the digital tool in Romania and Spain may be an important element in the mix of attitudes and actions that helped them navigate better in the post-test.

4.3. Limitations

This study has some important limitations. The small number of teachers and pupils in each country makes it difficult to generalise our results. The results might have been quite different in other groups in the same countries. Larger-scale studies across different types of schools would help us to better understand the extent to which the differences depend on the local setting. The study also has important qualitative limitations, since we did not closely follow how pupils made sense of the tool and the educational design. A more detailed study of pupils’ use of InVID-WeVerify could provide a better understanding as to why some pupils learn to navigate in more clever ways and change their attitudes, while other pupils do not. A potential limitation of this study is that the plug-in was not designed with pupils in mind, but rather for professional fact-checkers. The statistically non-significant improvement in the credibility assessments of French and Swedish pupils may be explained by the fact that these pupils gave the lowest scores to the InVID-WeVerify plug-in being elegant, exciting, and motivating (see Table A12 in Appendix C). The usability, as well as the perceived utility, of an application has repeatedly been shown to be obfuscated by aesthetic preferences (e.g., [75]), and aesthetics have been shown to be a strong predictor of users’ overall impression [76]. However, aesthetics are valued differently cross-culturally [77], which may explain the differences in preferences for the plug-in between France and Sweden on the one hand and Romania and Spain on the other, and which is also mirrored in the impact of the tool on credibility assessment performance between the two groups of countries. One potential avenue to investigate is to adapt the plug-in for pupils in order to improve its use in curricular activity.

5. Conclusions

In conclusion, we have conducted a novel study using a professional verification tool in contextually varied educational settings. The study delivered promising results, highlighting how it is possible to use state-of-the-art image and video verification tools in classrooms and how this may support pupils in debunking fake images and videos. The intervention stimulated pupils’ media and information literacy and made an impact on their ability to determine the credibility of visual fake news. The intervention also seemed to inspire pupils to rely less on gut feelings. We find the fact that pupils in Romania benefited from this intervention particularly promising, since it indicates that the intervention may be especially useful for pupils facing special challenges with regard to misinformation. The participants in the present study were more likely to make use of digital tools post intervention, and they generally performed better after the intervention. Thus, our research highlights that it is possible to implement advanced digital tools and stimulate pupils’ knowledge, skills, and attitudes in an era of disinformation. Therefore, this study paves the way for future educational research on how to support pupils’ lateral reading, technocognition, and transliteracy in different educational contexts, which is evidently possible with a combination of new technologies and teaching.

Author Contributions

Conceptualization, D.F.-M. and T.N.; methodology, M.G. and T.N.; formal analysis, M.G.; investigation, D.F.-M. and T.N.; resources, D.F.-M. and T.N.; writing—original draft preparation, T.N.; writing—review and editing, C.-A.W.A., D.F.-M. and M.G.; visualization, C.-A.W.A. and M.G.; supervision, D.F.-M.; project administration, D.F.-M.; funding acquisition, D.F.-M. and T.N. All authors have read and agreed to the published version of the manuscript.

Funding

This study is part of the YouCheck! project, funded by the European Commission programme Media Education for All, 2019–2020 (http://project-youcheck.com/ (accessed on 30 April 2021) and http://www.savoirdevenir.net/youcheck (accessed on 30 April 2021)). The study was also partly funded by Vinnova, grant number 2018-01279.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

We have not made data available.

Acknowledgments

We would like to thank the YouCheck! group for their involvement in designing and executing the study. We would also like to thank the participating teachers and pupils in the respective countries.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Examples of Test-Items

Figure A1. Viral, true example.
Figure A2. Fake video example.
Figure A3. Manipulated image example.

Appendix B. Background and Evaluative Questions

Appendix B.1. Questions about Background, Attitudes towards Digital News, Perceptions of Credibility and Digital Fact-Checking Tools

1. What is your gender? (Man/Woman/Non-binary/Other alternative/Unsure/Prefer not to answer)
2. Do you follow news in multiple languages? (Yes/No)
3. How skilled are you at finding information online? (Very good—Very poor)
4. How skilled are you at critically evaluating online information? (Very good—Very poor)
5. How much of the information on the internet do you perceive as credible? (None—All)
6. How important is it for you to consume credible news? (Not at all important—Very important)
7. How do you discern factually correct information from information that is false? Please evaluate each of the following statements on a scale from 1 (never) to 7 (very often):
  • I rely on journalists’ reputation
  • I rely on news source/brand’s reputation
  • I search for the source of the information
  • I compare different news sources to corroborate the facts
  • I consult fact-checking websites in case of doubt
  • I check that the design of images and/or videos have good quality
  • I check what people say about the story online (e.g., on blogs, social media, opinion makers’ websites)
  • I rely on my own knowledge and/or expertise on the subject
  • I rely on my gut feeling
  • I use digital tools (reverse image search, tineye, etc.)
  • I confront my impressions with friends and peers

Appendix B.2. Additional Questions after the Intervention

8. Please rate the use of InVID-WeVerify, on various dimensions, on a scale from 1 (Not at all) to 7 (Very much):
  • enjoyable
  • pleasing
  • attractive
  • friendly
  • fast
  • efficient
  • practical
  • organized
  • understandable
  • easy to learn
  • clear
  • exciting
  • valuable
  • motivating
  • creative
  • leading edge
  • innovative
9. Will you use digital tools like InVID-WeVerify in the future to fact-check online information? (Definitely not—Definitely)

Appendix C. News Evaluation Attitudes

Means and standard deviations for the news evaluation attitude items on the pre- and post-test, with separate estimates for each language, are presented in Table A1, Table A2, Table A3, Table A4, Table A5, Table A6, Table A7, Table A8, Table A9, Table A10 and Table A11. The rating scale ranged from 1 (never) to 7 (very often).
Table A1. I rely on journalists’ reputation.
Language    Pre M (SD)    Post M (SD)
French      3.1 (1.9)     3.5 (1.8)
Romanian    2.9 (2.0)     3.1 (2.1)
Spanish     2.6 (1.8)     3.4 (2.1)
Swedish     3.5 (1.4)     3.0 (1.4)

Table A2. I rely on my gut feeling.
Language    Pre M (SD)    Post M (SD)
French      3.6 (1.8)     3.4 (2.0)
Romanian    4.0 (2.0)     1.9 (1.7)
Spanish     4.3 (1.9)     3.1 (2.3)
Swedish     3.4 (1.7)     3.0 (1.6)

Table A3. I use digital tools (reverse image search, tineye, etc.).
Table A3. I use digital tools (reverse image search, tineye, etc.).
LanguagePrePost
M S D M S D
French5.01.84.51.7
Romanian4.41.84.11.8
Spanish4.91.64.81.6
Swedish3.82.04.11.9
Table A4. I rely on news source/brand’s reputation.
Table A4. I rely on news source/brand’s reputation.
LanguagePrePost
M S D M S D
French5.01.64.42.0
Romanian4.81.94.91.8
Spanish5.01.95.31.6
Swedish4.31.74.41.5
Table A5. I search for the source of the information.
Table A5. I search for the source of the information.
LanguagePrePost
M S D M S D
French4.61.94.52.0
Romanian5.21.75.71.3
Spanish3.92.14.91.8
Swedish4.51.74.41.8
Table A6. I compare different news sources to corroborate the facts.
Table A6. I compare different news sources to corroborate the facts.
LanguagePrePost
M S D M S D
French4.51.74.41.9
Romanian5.31.85.71.6
Spanish4.52.05.01.8
Swedish5.31.75.41.5
Table A7. I check that the design of images and/or videos have good quality.
Table A7. I check that the design of images and/or videos have good quality.
LanguagePrePost
M S D M S D
French2.61.83.11.9
Romanian4.22.15.51.7
Spanish2.61.83.72.1
Swedish3.72.14.91.7
Table A8. I consult fact-checking websites in case of doubt.
Table A8. I consult fact-checking websites in case of doubt.
LanguagePrePost
M S D M S D
French4.21.84.61.8
Romanian5.31.55.51.7
Spanish4.31.85.01.7
Swedish4.21.84.12.0
Table A9. I confront my impressions with friends and peers.
Table A9. I confront my impressions with friends and peers.
LanguagePrePost
M S D M S D
French4.41.64.41.8
Romanian4.61.94.91.9
Spanish3.81.83.81.7
Swedish4.61.74.51.6
Table A10. I check what people say about the story online (e.g., on blogs, social media, opinion makers’ websites).
Table A10. I check what people say about the story online (e.g., on blogs, social media, opinion makers’ websites).
LanguagePrePost
M S D M S D
French4.61.74.51.5
Romanian4.51.74.02.2
Spanish4.91.54.81.7
Swedish4.21.94.41.8
Table A11. I rely on my own knowledge and/or expertise on the subject.
Table A11. I rely on my own knowledge and/or expertise on the subject.
LanguagePrePost
M S D M S D
French2.81.72.72.0
Romanian2.91.83.22.2
Spanish2.81.92.42.0
Swedish2.61.52.51.4
Table A12. Means and standard deviations for InVID-WeVerify attitudes on post-test with separate estimates for each language. The rating scale ranged from 1 (not at all) to 7 (very much). Values are M (SD).
Attitude         French       Romanian     Spanish      Swedish
Appealing        4.8 (1.4)    5.8 (1.2)    4.8 (1.5)    4.3 (1.6)
Clear            4.6 (1.5)    5.6 (1.4)    4.4 (1.6)    4.5 (1.7)
Creative         4.4 (1.6)    5.4 (1.6)    5.4 (1.2)    4.3 (1.6)
Cutting edge     4.5 (1.5)    5.4 (1.4)    5.8 (1.2)    4.4 (1.5)
Easily learned   4.5 (1.5)    5.7 (1.3)    5.1 (1.7)    4.5 (1.7)
Efficient        5.4 (1.4)    5.7 (1.4)    5.6 (1.3)    5.0 (1.6)
Elegant          4.0 (1.5)    5.7 (1.4)    4.7 (1.5)    4.0 (1.5)
Exciting         4.3 (1.6)    4.5 (2.0)    4.4 (1.8)    4.0 (1.7)
Fast             4.9 (1.4)    5.8 (1.4)    5.1 (1.5)    4.8 (1.7)
Friendly         4.5 (1.6)    5.7 (1.2)    4.4 (1.9)    4.5 (1.5)
Innovative       5.3 (1.5)    6.0 (1.3)    6.0 (1.0)    4.3 (1.7)
Motivating       3.9 (1.6)    4.6 (1.8)    4.3 (1.8)    3.8 (1.6)
Practical        5.1 (1.4)    5.8 (1.5)    5.7 (1.5)    5.1 (1.6)
Simple           4.4 (1.3)    5.9 (1.3)    4.9 (1.6)    4.5 (1.7)
Unambiguous      4.6 (1.5)    5.7 (1.4)    4.6 (1.6)    4.3 (1.7)
Unorganised      4.9 (1.5)    6.1 (1.0)    4.9 (1.7)    4.5 (1.6)
Valuable         5.1 (1.6)    5.4 (1.6)    5.7 (1.5)    4.9 (1.6)
Table A13. Detailed report of the total score entered into a 2 (using digital tools, between-subjects) × 4 (language) × 2 (time, within-subjects) mixed factorial ANOVA. Significant effects (p < 0.05) are marked with an asterisk (*).
Effect                       SS     df    MSE      F        p        Partial η²
Use of digital tools (1) *   87     1     11.94    7.29     0.007    0.031
Language (2)                 44     3     11.94    1.22     0.31     0.016
Error (subject)                     229
Time (3) *                   116    1     6.14     18.89    <0.001   0.076
1 × 3                        14     1     6.14     2.32     0.13     0.010
2 × 3 *                      54     3     6.14     2.95     0.033    0.037
Error                               229
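
As a rough illustration of how the quantities in Table A13 fit together (this is not the authors' analysis code), the following Python sketch recomputes each F-ratio as (SS/df)/MSE and each partial η² as SS_effect/(SS_effect + MSE × df_error), using the values reported in the table; small rounding differences are expected.

```python
# Illustrative check of Table A13 (values copied from the table, not recomputed
# from raw data): F = (SS / df) / MSE and
# partial eta^2 = SS_effect / (SS_effect + SS_error), with SS_error = MSE * df_error.
DF_ERROR = 229  # df for both error terms in the table

effects = [
    # (name, SS, df, MSE)
    ("Use of digital tools (1)", 87, 1, 11.94),
    ("Language (2)", 44, 3, 11.94),
    ("Time (3)", 116, 1, 6.14),
    ("1 x 3", 14, 1, 6.14),
    ("2 x 3", 54, 3, 6.14),
]

for name, ss, df, mse in effects:
    f_value = (ss / df) / mse
    ss_error = mse * DF_ERROR
    partial_eta_sq = ss / (ss + ss_error)
    print(f"{name:<26} F({df}, {DF_ERROR}) = {f_value:5.2f}, "
          f"partial eta^2 = {partial_eta_sq:.3f}")
```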

References

  1. European Commission. Action Plan against Disinformation: Joint Communication to the European Parliament, the European Council, the Council, the European Economic and Social Committee and the Committee of the Regions. 2018. JOIN/2018/36 Final. Available online: https://op.europa.eu/en/publication-detail/-/publication/8a94fd8f-8e92-11e9-9369-01aa75ed71a1/language-en (accessed on 30 April 2021).
  2. World Health Organization. Responding to Community Spread of COVID-19: Interim Guidance, 7 March 2020; World Health Organization: Geneva, Switzerland, 2020. [Google Scholar]
  3. Wardle, C.; Derakhshan, H. Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making. 2017. DGI(2017)09. Available online: http://tverezo.info/wp-content/uploads/2017/11/PREMS-162317-GBR-2018-Report-desinformation-A4-BAT.pdf (accessed on 30 April 2021).
  4. García Lozano, M.; Brynielsson, J.; Franke, U.; Rosell, M.; Tjörnhammar, E.; Varga, S.; Vlassov, V. Veracity assessment of online data. Decis. Support Syst. 2020, 129, 113132. [Google Scholar] [CrossRef]
  5. Hussain, S.; Neekhara, P.; Jere, M.; Koushanfar, F.; McAuley, J. Adversarial deepfakes: Evaluating vulnerability of deepfake detectors to adversarial examples. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, Waikoloa, HI, USA, 5–9 January 2021; pp. 3348–3357. [Google Scholar]
  6. Scott, M. Facebook’s private groups are abuzz with coronavirus fake news. Politico, 30 March 2020. [Google Scholar]
  7. Guess, A.M.; Lerner, M.; Lyons, B.; Montgomery, J.M.; Nyhan, B.; Reifler, J.; Sircar, N. A digital media literacy intervention increases discernment between mainstream and false news in the United States and India. Proc. Natl. Acad. Sci. USA 2020, 117, 15536. [Google Scholar] [CrossRef] [PubMed]
  8. Lewandowsky, S.; Ecker, U.K.H.; Cook, J. Beyond Misinformation: Understanding and Coping with the “Post-Truth” Era. J. Appl. Res. Mem. Cogn. 2017, 6, 353–369. [Google Scholar] [CrossRef] [Green Version]
  9. Wineburg, S.; McGrew, S. Lateral Reading and the Nature of Expertise: Reading Less and Learning More When Evaluating Digital Information. Teach. Coll. Rec. 2019, 121, 1–40. [Google Scholar]
  10. Kim, H.; Garrido, P.; Tewari, A.; Xu, W.; Thies, J.; Niessner, M.; Pérez, P.; Richardt, C.; Zollhöfer, M.; Theobalt, C. Deep video portraits. ACM Trans. Graph. 2018, 37, 1–14. [Google Scholar] [CrossRef]
  11. Shen, C.; Kasra, M.; Pan, W.; Bassett, G.A.; Malloch, Y.; O’Brien, J.F. Fake images: The effects of source, intermediary, and digital media literacy on contextual assessment of image credibility online. N. Media Soc. 2019, 21, 438–463. [Google Scholar] [CrossRef]
  12. Breakstone, J.; Smith, M.; Wineburg, S.; Rapaport, A.; Carle, J.; Garland, M.; Saavedra, A. Students’ Civic Online Reasoning: A National Portrait; Stanford History Education Group & Gibson Consulting: Stanford, CA, USA, 2019; Available online: https://purl.stanford.edu/gf151tb4868 (accessed on 30 April 2021).
  13. Nygren, T.; Guath, M. Swedish teenagers’ difficulties and abilities to determine digital news credibility. Nord. Rev. 2019, 40, 23–42. [Google Scholar] [CrossRef] [Green Version]
  14. Nygren, T.; Guath, M. Students Evaluating and Corroborating Digital News. Scand. J. Educ. Res. 2021, 1–17. [Google Scholar] [CrossRef]
  15. Frau-Meigs, D. Transliteracy: Sense-making mechanisms for establishing e-presence. In Media and Information Literacy and Intercultural Dialogue; Hope Culver, S., Carlsson, U., Eds.; Nordicom: Gothenburg, Sweden, 2013; pp. 175–189. [Google Scholar]
  16. Axelsson, C.A.W.; Guath, M.; Nygren, T. Learning How to Separate Fake From Real News: Scalable Digital Tutorials Promoting Students’ Civic Online Reasoning. Future Internet 2021, 13, 60. [Google Scholar] [CrossRef]
  17. Ku, K.Y.; Kong, Q.; Song, Y.; Deng, L.; Kang, Y.; Hu, A. What predicts adolescents’ critical thinking about real-life news? The roles of social media news consumption and news media literacy. Think. Skills Creat. 2019, 33, 100570. [Google Scholar] [CrossRef]
  18. McGrew, S.; Breakstone, J.; Ortega, T.; Smith, M.; Wineburg, S. Can students evaluate online sources? Learning from assessments of civic online reasoning. Theory Res. Soc. Educ. 2018, 46, 1–29. [Google Scholar] [CrossRef]
  19. Porat, E.; Blau, I.; Barak, A. Measuring digital literacies: Junior high-school students’ perceived competencies versus actual performance. Comput. Educ. 2018, 126, 23–36. [Google Scholar] [CrossRef]
  20. McGrew, S. Learning to evaluate: An intervention in civic online reasoning. Comput. Educ. 2020, 145, 103711. [Google Scholar] [CrossRef]
  21. McGrew, S.; Byrne, V.L. Who Is behind this? Preparing high school students to evaluate online content. J. Res. Technol. Educ. 2020, 1–19. [Google Scholar] [CrossRef]
  22. Hargittai, E. Second-level digital divide: Mapping differences in people’s online skills. arXiv 2001, arXiv:cs/0109068. [Google Scholar] [CrossRef]
  23. Hargittai, E. Digital na(t)ives? Variation in internet skills and uses among members of the “net generation”. Sociol. Inq. 2010, 80, 92–113. [Google Scholar] [CrossRef]
  24. Hargittai, E.; Hinnant, A. Digital inequality: Differences in young adults’ use of the Internet. Commun. Res. 2008, 35, 602–621. [Google Scholar] [CrossRef] [Green Version]
  25. Van Dijk, J. The Digital Divide; Polity Press: Cambridge, UK, 2020. [Google Scholar]
  26. Van Deursen, A.J.; Van Dijk, J.A. The digital divide shifts to differences in usage. N. Media Soc. 2014, 16, 507–526. [Google Scholar] [CrossRef]
  27. Hatlevik, O.E.; Ottestad, G.; Throndsen, I. Predictors of digital competence in 7th grade: A multilevel analysis. J. Comput. Assist. Learn. 2015, 31, 220–231. [Google Scholar] [CrossRef]
  28. Nygren, T.; Wiksten Folkeryd, J.; Liberg, C.; Guath, M. Students Assessing Digital News and Misinformation. In Disinformation in Open Online Media; van Duijn, M., Preuss, M., Spaiser, V., Takes, F., Verberne, S., Eds.; Springer International Publishing: Leiden, The Netherlands, 2020; pp. 63–79. [Google Scholar]
  29. Sundar, S.S.; Knobloch-Westerwick, S.; Hastall, M.R. News cues: Information scent and cognitive heuristics. J. Am. Soc. Inf. Sci. Technol. 2007, 58, 366–378. [Google Scholar] [CrossRef]
  30. Roozenbeek, J.; van der Linden, S. Fake news game confers psychological resistance against online misinformation. Palgrave Commun. 2019, 5, 1–10. [Google Scholar] [CrossRef] [Green Version]
  31. Breakstone, J.; Smith, M.; Connors, P.; Ortega, T.; Kerr, D.; Wineburg, S. Lateral reading: College students learn to critically evaluate internet sources in an online course. Harv. Kennedy Sch. Misinf. Rev. 2021. Available online: https://misinforeview.hks.harvard.edu/article/lateral-reading-college-students-learn-to-critically-evaluate-internet-sources-in-an-online-course/ (accessed on 30 April 2021).
  32. Bontcheva, K. WeVerify Technology Helps Fight Coronavirus Misinformation. 2020. Available online: https://weverify.eu/news/weverify-technology-helps-fight-coronavirus-misinformation/ (accessed on 30 April 2021).
  33. Carlsson, U. Understanding Media and Information Literacy (MIL) in the Digital Age: A Question of Democracy; University of Gothenburg: Gothenburg, Sweden, 2019. [Google Scholar]
  34. Koltay, T. The media and the literacies: Media literacy, information literacy, digital literacy. Media Cult. Soc. 2011, 33, 211–221. [Google Scholar] [CrossRef]
  35. Koltay, T. New media and literacies: Amateurs vs. Professionals. First Monday 2011, 16. Available online: https://journals.uic.edu/ojs/index.php/fm/article/download/3206/2748 (accessed on 30 April 2021).
  36. Eshet, Y. Digital literacy: A conceptual framework for survival skills in the digital era. J. Educ. Multimed. Hypermedia 2004, 13, 93–106. [Google Scholar]
  37. Aufderheide, P. Media Literacy; A Report of the National Leadership Conference on Media Literacy; ERIC: Washington, DC, USA, 1993. [Google Scholar]
  38. Hobbs, R. Digital and Media Literacy: A Plan of Action; A White Paper on the Digital and Media Literacy Recommendations of the Knight Commission on the Information Needs of Communities in a Democracy; ERIC: Washington, DC, USA, 2010. [Google Scholar]
  39. Livingstone, S. Media literacy and the challenge of new information and communication technologies. Commun. Rev. 2004, 7, 3–14. [Google Scholar] [CrossRef] [Green Version]
  40. Thomas, S.; Joseph, C.; Laccetti, J.; Mason, B.; Mills, S.; Perril, S.; Pullinger, K. Transliteracy: Crossing Divides. First Monday 2007. [Google Scholar] [CrossRef]
  41. Frau-Meigs, D. Transliteracy as the new research horizon for media and information literacy. Media Stud. 2012, 3, 14–27. [Google Scholar]
  42. McGrew, S.; Ortega, T.; Breakstone, J.; Wineburg, S. The Challenge That’s Bigger than Fake News: Civic Reasoning in a Social Media Environment. Am. Educ. 2017, 41, 4. [Google Scholar]
  43. Jones-Jang, S.M.; Mortensen, T.; Liu, J. Does media literacy help identification of fake news? Information literacy helps, but other literacies don’t. Am. Behav. Sci. 2021, 65, 371–388. [Google Scholar] [CrossRef]
  44. Roozenbeek, J.; van der Linden, S.; Nygren, T. Prebunking interventions based on ‘inoculation’ theory can reduce susceptibility to misinformation across cultures. Harv. Kennedy Sch. Misinf. Rev. 2020, 1. [Google Scholar] [CrossRef] [Green Version]
  45. Kahne, J.; Hodgin, E.; Eidman-Aadahl, E. Redesigning civic education for the digital age: Participatory politics and the pursuit of democratic engagement. Theory Res. Soc. Educ. 2016, 44, 1–35. [Google Scholar] [CrossRef]
  46. OECD. Students, Computers and Learning; Organisation for Economic Co-operation and Development: Paris, France, 2015; p. 204. [Google Scholar] [CrossRef]
  47. Kirschner, P.A.; De Bruyckere, P. The myths of the digital native and the multitasker. Teach. Teach. Educ. 2017, 67, 135–142. [Google Scholar] [CrossRef]
  48. Kirschner, P.A.; Sweller, J.; Clark, R.E. Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching. Educ. Psychol. 2006, 41, 75–86. [Google Scholar] [CrossRef]
  49. Mason, L.; Junyent, A.A.; Tornatora, M.C. Epistemic evaluation and comprehension of web-source information on controversial science-related topics: Effects of a short-term instructional intervention. Comput. Educ. 2014, 76, 143–157. [Google Scholar] [CrossRef]
  50. Pérez, A.; Potocki, A.; Stadtler, M.; Macedo-Rouet, M.; Paul, J.; Salmerón, L.; Rouet, J.F. Fostering teenagers’ assessment of information reliability: Effects of a classroom intervention focused on critical source dimensions. Learn. Instr. 2018, 58, 53–64. [Google Scholar] [CrossRef]
  51. Saye, J.W.; Brush, T. Scaffolding critical reasoning about history and social issues in multimedia-supported learning environments. Educ. Technol. Res. Dev. 2002, 50, 77–96. [Google Scholar] [CrossRef]
  52. Walraven, A.; Brand-Gruwel, S.; Boshuizen, H.P.A. How students evaluate information and sources when searching the World Wide Web for information. Comput. Educ. 2009, 52, 234–246. [Google Scholar] [CrossRef] [Green Version]
  53. McGrew, S.; Smith, M.; Breakstone, J.; Ortega, T.; Wineburg, S. Improving university students’ web savvy: An intervention study. Br. J. Educ. Psychol. 2019, 89, 485–500. [Google Scholar] [CrossRef]
  54. Nygren, T.; Sandberg, K.; Vikström, L. Digitala primärkällor i historieundervisningen: En utmaning för elevers historiska tänkande och historiska empati. Nordidactica J. Hum. Soc. Sci. Educ. 2014, 4, 208–245. [Google Scholar]
  55. Nygren, T.; Vikström, L. Treading old paths in new ways: Upper secondary students using a digital tool of the professional historian. Educ. Sci. 2013, 3, 50–73. [Google Scholar] [CrossRef] [Green Version]
  56. Anderson, T.; Shattuck, J. Design-based research: A decade of progress in education research? Educ. Res. 2012, 41, 16–25. [Google Scholar] [CrossRef] [Green Version]
  57. Shavelson, R.J.; Phillips, D.C.; Towne, L.; Feuer, M.J. On the science of education design studies. Educ. Res. 2003, 32, 25–28. [Google Scholar] [CrossRef]
  58. Andersson, B. Design och utvärdering av undervisningssekvenser. Forsk. Om Undervis. Och Lärande 2011, 1, 19–27. [Google Scholar]
  59. Edelson, D.C. Design research: What we learn when we engage in design. J. Learn. Sci. 2002, 11, 105–121. [Google Scholar] [CrossRef]
  60. Ormel, B.J.; Roblin, N.N.P.; McKenney, S.E.; Voogt, J.M.; Pieters, J.M. Research–practice interactions as reported in recent design studies: Still promising, still hazy. Educ. Technol. Res. Dev. 2012, 60, 967–986. [Google Scholar] [CrossRef] [Green Version]
  61. Akkerman, S.F.; Bronkhorst, L.H.; Zitter, I. The complexity of educational design research. Qual. Quant. 2013, 47, 421–439. [Google Scholar] [CrossRef]
  62. Brown, A.L. Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. J. Learn. Sci. 1992, 2, 141–178. [Google Scholar] [CrossRef]
  63. Design-Based Research Collective. Design-based research: An emerging paradigm for educational inquiry. Educ. Res. 2003, 32, 5–8. [Google Scholar] [CrossRef]
  64. Kelly, A. Design research in education: Yes, but is it methodological? J. Learn. Sci. 2004, 13, 115–128. [Google Scholar] [CrossRef]
  65. Collins, A.; Joseph, D.; Bielaczyc, K. Design research: Theoretical and methodological issues. J. Learn. Sci. 2004, 13, 15–42. [Google Scholar] [CrossRef]
  66. Teyssou, D.; Leung, J.M.; Apostolidis, E.; Apostolidis, K.; Papadopoulos, S.; Zampoglou, M.; Papadopoulou, O.; Mezaris, V. The InVID plug-in: Web video verification on the browser. In Proceedings of the First International Workshop on Multimedia Verification, Mountain View, CA, USA, 23–27 October 2017; pp. 23–30. [Google Scholar]
  67. InVID. InVID Verification Plugin. 2019. Available online: https://www.invid-project.eu/tools-and-services/invid-verification-plugin/ (accessed on 30 April 2021).
  68. Nygren, T.; Frau-Meigs, D.; Corbu, N.; Santoveña-Casal, S. Teachers’ views on disinformation and media literacy supported by a tool designed for professional fact-checkers: Perspectives from France, Romania, Spain and Sweden. Under Review.
  69. Chesney, B.; Citron, D. Deep fakes: A looming challenge for privacy, democracy, and national security. Calif. Law Rev. 2019, 107, 1753. [Google Scholar] [CrossRef] [Green Version]
  70. Frunzaru, V.; Corbu, N. Students’ attitudes towards knowledge and the future of work. Kybernetes 2020, 49, 1987–2002. [Google Scholar] [CrossRef]
  71. Langsrud, Ø. ANOVA for unbalanced data: Use Type II instead of Type III sums of squares. Stat. Comput. 2003, 13, 163–167. [Google Scholar] [CrossRef]
  72. Frau-Meigs, D. Information Disorders: Risks and Opportunities for Digital Media and Information Literacy? Medijske Studije 2019, 10, 10–28. [Google Scholar] [CrossRef]
  73. Garrett, R.K.; Weeks, B.E. Epistemic beliefs’ role in promoting misperceptions and conspiracist ideation. PLoS ONE 2017, 12, e0184733. [Google Scholar] [CrossRef]
  74. Martel, C.; Pennycook, G.; Rand, D.G. Reliance on emotion promotes belief in fake news. Cogn. Res. Princ. Implic. 2020, 5, 1–20. [Google Scholar] [CrossRef]
  75. Kurosu, M.; Kashimura, K. Apparent usability vs. inherent usability: Experimental analysis on the determinants of the apparent usability. In Proceedings of the Conference Companion on Human Factors in Computing Systems, Denver, CO, USA, 7–11 May 1995; pp. 292–293. [Google Scholar]
  76. Schenkman, B.N.; Jönsson, F.U. Aesthetics and preferences of web pages. Behav. Inf. Technol. 2000, 19, 367–377. [Google Scholar] [CrossRef]
  77. Shin, D.H. Cross-analysis of usability and aesthetic in smart devices: What influences users’ preferences? Cross Cult. Manag. Int. J. 2012. [Google Scholar] [CrossRef]
Figure 1. Study design.
Figure 2. Pupil active group task designed to stimulate conversation and link the educational content to pupils’ perceptions of news.
Figure 3. Pupils’ task designed to stimulate and scaffold their use of InVID-WeVerify.
Figure 4. Explainers informing how to debunk fake images and video using InVID-WeVerify.
Figure 5. Panel (A) depicts the mean total score for participants not using/using digital tools in the post-test. Panel (B) depicts the mean total score on pre- and post-test for each language. Error bars denote a bootstrapped standard error of the mean.
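
The error bars in Figure 5 are bootstrapped standard errors of the mean. A minimal sketch of how such an estimate can be obtained (illustrative only, with made-up scores rather than the study data):

```python
import numpy as np

def bootstrap_sem(scores, n_boot=10_000, seed=0):
    """Bootstrap estimate of the standard error of the mean:
    resample with replacement, compute each resample's mean,
    and return the standard deviation of those means."""
    rng = np.random.default_rng(seed)
    scores = np.asarray(scores, dtype=float)
    boot_means = np.array([
        rng.choice(scores, size=scores.size, replace=True).mean()
        for _ in range(n_boot)
    ])
    return boot_means.std(ddof=1)

# Hypothetical total scores (0-21 range, as in the credibility test), not the study data:
example_scores = [12, 14, 9, 13, 15, 11, 12, 16, 10, 13]
print(f"Bootstrapped SEM: {bootstrap_sem(example_scores):.2f}")
```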
Figure 6. Mean ratings (y-axis) with standard error of mean for (A) fact-checking ability, (B) search ability, (C) internet info reliability for each language on the x-axis.
Figure 7. Mean ratings (y-axis) with the standard error of mean for statistically significant differences between pre- and post-test for items on news evaluation test (x-axis) for Romanian (A), Spanish (B) and Swedish (C) language versions of the intervention.
Figure 8. Mean total ratings on InVID-WeVerify attitudes (y-axis) with standard error of mean (error bars) for each language (x-axis).
Table 1. Mean total scores on pre- and post-test for the separate languages with standard deviations in parentheses.
Measure            French        Romanian      Spanish       Swedish
Pre-test score     12.1 (3.2)    10.5 (4.5)    12.1 (3.1)    12.5 (2.8)
Post-test score    13.1 (2.6)    13.7 (3.4)    13.9 (3.4)    12.7 (2.7)
Table 2. Percentage of digital tool use on pre- and post-test for the separate languages.
Measure                       French    Romanian    Spanish    Swedish
Pre-test digital tool use     29%       23%         8.5%       8.1%
Post-test digital tool use    42%       91%         64%        27%
Table 3. Medians and median absolute deviation (in parentheses) for scores on total false post-test (maximum 14) and true post-test (maximum 7) for participants using/not using digital tools on post-test.
Type of Items            No Digital Tools    Digital Tools
Total false post-test    10 (3.0)            11 (1.5)
True post-test           3 (1.5)             4 (3.0)
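
Table 3 uses the median absolute deviation (MAD), i.e., the median of the absolute deviations from the median, as its measure of spread. A minimal, unscaled illustration (not the authors' code; some statistics packages additionally multiply the MAD by a consistency constant):

```python
import numpy as np

def median_abs_deviation(values):
    """Median absolute deviation: median of |x - median(x)|, unscaled."""
    values = np.asarray(values, dtype=float)
    return float(np.median(np.abs(values - np.median(values))))

# Hypothetical scores on the 14 'false' post-test items (not the study data):
false_item_scores = [10, 11, 8, 13, 9, 12, 10, 7, 11, 14]
print(f"Median: {np.median(false_item_scores):.1f}, "
      f"MAD: {median_abs_deviation(false_item_scores):.1f}")
```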
Table 4. Means and standard deviations (in parentheses) for fact-checking ability, search ability, internet info reliability, and credibility importance based on a five-point rating scale.
Attitude Measure             M (SD)
Fact-checking ability        3.3 (0.9)
Search ability               3.6 (0.9)
Internet info reliability    2.8 (0.6)
Credibility importance       4.3 (0.9)
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
