Article

Gamification and User Experience in Fake News Detection on Tourism in Primary Education

by Androniki Koutsikou * and Nikos Antonopoulos
Department of Digital Media and Communication, Ionian University, 28100 Argostoli, Greece
* Author to whom correspondence should be addressed.
Electronics 2025, 14(11), 2200; https://doi.org/10.3390/electronics14112200
Submission received: 19 April 2025 / Revised: 19 May 2025 / Accepted: 27 May 2025 / Published: 29 May 2025
(This article belongs to the Section Electronic Multimedia)

Abstract

The concept of gaming is universal and familiar to students worldwide. Gamification, the integration of game elements and mechanics into non-game environments, is a valuable tool for enhancing user engagement and motivation in Human–Computer Interaction, and it is particularly valuable in primary school education. Students are exposed to a great deal of information daily, much of which contains inaccuracies and misinformation regarding the tourism sector. Our research was conducted as part of the Computer Science course to help students aged 9 to 12 understand the concept of fake news in the context of tourism. Bilingual students brought valuable perspectives to the classroom, especially during discussions about cultural representation and media bias. Incorporating intercultural communication into learning activities helped these students enhance their language and critical thinking skills while navigating various cultural contexts. We used an application with gamification elements to engage the students and enhance their learning experience, and we evaluated user experience and usability quantitatively through questionnaires. The results revealed that students found the application easy to use and had a positive experience with it. This study also assessed the effectiveness of the educational intervention by comparing pre-test and post-test scores on four key questions measured on a Likert scale. The intervention was largely successful in enhancing student outcomes. These findings suggest that participants not only maintained stable information literacy behaviors over time but also showed improvements in critical evaluation and skepticism.

1. Introduction

In today’s digital landscape, primary students are often exposed to misleading or false information, particularly in areas such as tourism, where idealized or fabricated portrayals of destinations are common. This exposure challenges young learners whose cognitive and media literacy skills are still developing. According to Stuart Hall’s Encoding/Decoding Model (1980), the process of communication involves not just the transmission but also the interpretation of meaning [1]. In this context, the sender’s intended message, such as a promotional travel post, may be decoded by young audiences in unintended or even harmful ways, especially without critical media literacy skills.
Given the risks of misinterpretation in Hall’s model, especially for young learners, media literacy is crucial for helping students navigate digital content. Media literacy theory provides a framework to address these interpretive challenges. Potter emphasizes that media literacy involves foundational knowledge and skills for effective information processing, focusing on cognitive processes such as attention, perception, comprehension, and evaluation, which are often underdeveloped in younger learners. Our goal is to foster these skills through gamified learning environments, engaging students in analyzing and evaluating digital content to enhance their media literacy and critical thinking [2].
To implement media literacy principles for primary learners, our project uses a gamified learning environment that encourages active interpretation and critique of digital content. By incorporating game elements beyond traditional entertainment, we create an engaging educational experience. Grounded in Hobbs’ (2010) five core competencies of media literacy (access, analyze, evaluate, create, and act), the initiative promotes critical engagement with media rather than passive consumption. This structured, interactive approach enhances students’ critical literacy skills, empowering them to interpret and evaluate the credibility of digital tourism content effectively. In conclusion, while communication theory highlights how media messages can be encoded with specific intent and decoded with unintended interpretations, this process is particularly problematic when young learners lack the media literacy skills to identify such distortions. Prior work has acknowledged the risks of media manipulation, yet interventions at the primary level remain underdeveloped, especially in context-specific domains such as tourism [3,4].
While the theoretical foundation supports the need for media literacy, the practical realization of this educational goal depends heavily on the usability and design of digital learning environments, areas where Human–Computer Interaction (HCI) plays a critical role. Effective HCI emphasizes the creation of intuitive interfaces that resonate with users’ mental models, and in our project we apply these principles not only to interface design but also to interaction patterns. Furthermore, the incorporation of the System Usability Scale (SUS) serves to quantify perceived usability, reinforcing its significance as a determinant of the effectiveness and success of educational technology solutions. Through these strategies, we aim to foster an environment that promotes user satisfaction and effective learning outcomes [5].
Alongside HCI principles, gamification offers the necessary motivational and structural framework to maintain engagement and enhance cognitive learning in digital environments. Recent research in HCI emphasizes adaptive and intelligent advancements in gamification [6]. Well-designed gamification strategies can significantly enrich the user experience in educational environments [7]. Gamification, which refers to the incorporation of game elements in non-game settings, is a powerful tool for helping children learn through educational technologies. When applied effectively alongside teaching methods that encourage student interaction with learning materials, gamification has consistently proven to enhance students’ participation, motivation, engagement, and overall learning outcomes [8].
To ensure that our gamified intervention not only attracts attention but also fosters intrinsic motivation, we incorporate elements from Self-Determination Theory (SDT), which identifies autonomy, competence, and relatedness as the fundamental psychological needs driving intrinsic motivation. In this context, we have intentionally selected game elements such as choice-based narratives and achievable challenges to meet these psychological needs. For example, students are granted autonomy in their navigation of misinformation scenarios and are provided with feedback and achievement systems that enhance their sense of competence. This approach aims to create a motivating learning environment that encourages active participation and self-directed learning [9]. Using these game-inspired experiences encourages people to interact with an application or system more dynamically, focusing on reward-based interactions [10]. It also enhances user engagement by fulfilling psychological needs such as autonomy, competence, and relatedness [11]. Gamification adds game elements, such as storytelling, that create an immersive experience without turning the activity into an actual game [12]. These immersive environments can serve as meaningful platforms for bilingual and culturally diverse learners to bridge cultural gaps, enhance engagement, and develop critical awareness of global contexts [13].
Our research used this technique to engage students with phenomena such as misinformation and fake news. Fake news is not a new phenomenon. It has existed throughout history, with evidence cited by scholars dating back to the eighth century [14]. Fake news is common everywhere, affecting people of all ages and cultures. This term refers to news that is inaccurate or not based on facts. It includes false claims, contradicts evidence, or places facts in misleading contexts. There are six ways that fake news has been operationalized: satire, parody, fabrication, manipulation, propaganda, and advertising. Despite the wealth of typologies developed to categorize such misinformation, ranging from satire and parody to outright fabrication, there has been a noticeable lack of translations of these concepts into practical applications for primary education. Particularly concerning is the emergence of fake tourism content, characterized by manipulated reviews and skewed portrayals of destinations [15,16].
Motivating students through gamification is only one part of the equation; equally important is the content they engage with. One key focus is misinformation, particularly in the tourism sector, where misleading portrayals are common and often overlooked in educational media. ‘Fake news’ has become prominent with digital technology and social media, affecting tourism. It shapes travelers’ perceptions and can benefit or harm destinations and businesses [17]. Fake news has rarely been addressed in tourism academic research. The digitization of news has redefined traditional journalistic concepts. Online platforms enable non-journalists to reach a wide audience [18]. In recent years, the prevalence of fake online reviews has become a significant concern. As online reviews have grown increasingly common, their influence on consumers’ purchasing decisions has surged [19]. As concerns about misinformation on social media rise, experts advocate for news literacy to help people critically evaluate media. Increased media literacy enables individuals to better identify and counter fake news online [20]. News literacy in schools is crucial for helping students analyze and question news content, promoting responsible sharing. Many teachers still view it as a new concept [21]. Digital platforms are competing with traditional media for young audiences, rapidly reshaping how they access and engage with news [22]. Researchers, policymakers, and educators believe that teaching media literacy can help young people recognize, reject, and handle misinformation they see online [20,23].
Several studies have examined how gamification can support media literacy in general. These findings offer useful models for structuring engagement, although few have addressed domain-specific concerns such as tourism misinformation. For example, Campillo-Ferrer et al. created a system that helps students distinguish between credible and unreliable sources through storytelling and role-playing. This study focused on students’ views of motivation, digital competence, and learning, excluding educators’ perspectives [24]. Recent efforts by scholars and organizations such as Capecchi et al. and MediaSmarts have demonstrated promising results with older learners. However, these interventions typically address general digital literacy rather than tourism-specific misinformation in primary education. The research conducted by Capecchi et al. and the quiz from MediaSmarts have effectively utilized assessment tools such as the System Usability Scale (SUS) and GUESS to evaluate gamified educational interventions, primarily targeting older children aged 12 and above. In this work, we adapt these assessment metrics for younger audiences, thereby testing their applicability and relevance within the context of primary education [25,26]. Similarly, while Aldalur’s (2024) work illustrates how gamified learning environments can enhance engagement and community in programming education, it reinforces the broader trend of using gamification for content delivery rather than for critical literacy development [27].
MAthE, for example, is an educational tool designed to enhance news verification skills and raise awareness about misinformation. While it effectively instructs users on authentication tools and verification techniques, it does not specifically target younger learners or emphasize tourism content [28]. Recent efforts, such as the platform by Capecchi et al. [25] and MAthE the Game (2019), represent important strides in addressing misinformation; however, neither specifically addresses the context of tourism or targets primary education students. Our research seeks to bridge this gap in the academic literature by integrating various types of misinformation into interactive game scenarios. This approach not only engages students but also allows them to explore the consequences of misinformation through their decision-making processes, thus enhancing their understanding of the topic in a relevant and relatable context.
The existing landscape of gamified media literacy tools, such as MediaSmarts and MAthE the Game, predominantly emphasizes internet safety and general misinformation detection, and these tools primarily target older learners. This oversight neglects the contextual specificity required in domains such as tourism, which is crucial for developing a comprehensive understanding of misinformation in specific fields. Our approach diverges from these traditional interventions by integrating culturally relevant narratives that specifically address tourism misinformation, a topic often excluded from primary education despite its significant influence on learners’ perceptions and decision-making processes. We create content that aligns with culturally inclusive narratives, making it more relatable and engaging for primary-aged learners and thereby enhancing their understanding and engagement. Additionally, while many existing gamification interventions focus on secondary or post-secondary education, our research specifically targets primary students, taking into account their cognitive development and using age-appropriate teaching strategies. By situating our intervention at the nexus of media literacy theory, motivational psychology, communication theory, and Human–Computer Interaction (HCI), we contribute a novel and theoretically robust framework to educational research. This comprehensive approach allows for a more nuanced evaluation of how gamified applications can effectively equip young learners with the critical skills necessary to navigate and assess the increasingly complex media landscapes they encounter. To address the research objectives, three questions were developed:
RQ1: What is the perceived usability of a gamified educational application among primary school students according to the SUS?
RQ2: How do primary school students perceive their experience while interacting with a gamified educational application?
RQ3: Does an application incorporating gamification elements improve primary students’ ability to critically evaluate online information and detect misinformation?

2. Materials and Methods

The study adopts a quantitative methodological framework, employing an experimental design within the regular Information and Communication Technology (ICT) curriculum. The intervention was implemented in the school’s computer lab, with each class engaging in a single instructional session lasting approximately 45 min. This session was integrated into the official school timetable as part of the ICT lesson. Before the commencement of the educational activities, students were informed that their ICT teacher would also serve as the researcher for the duration of the study. Participants were fully briefed on the structure and objectives of the intervention, ensuring transparency and ethical compliance. All student data were collected by the supervising teacher, who acted as the data processor, and digitized in an anonymized, non-identifiable format. The data were securely stored on an external hard drive, accessible only to the data controller and the supervising teacher. At no point were any personal data shared with third parties. To protect participant privacy, all physical (paper-based) questionnaires were destroyed immediately after digitization. Participation was entirely voluntary; students and their guardians were informed that they could withdraw at any time without facing any penalties. In the event of withdrawal, all related anonymized data were promptly destroyed. The research was carried out in accordance with the guidelines and instructions set by the Institute of Education Policy (IEP) and the school’s internal program. Anonymity and data confidentiality were ensured under the provisions of the General Data Protection Regulation (EU 2016/679, GDPR).
The educational application utilized in this study was constructed with Genially, a cloud-based platform specifically designed for creating interactive and gamified digital content. Genially provides an intuitive authoring environment that allows educators to craft visually captivating learning experiences by incorporating clickable objects, animations, embedded multimedia, branching scenarios, and various assessment tools. The platform supports HTML5 output and operates entirely in a web-based context, thus negating the necessity for software installation or specialized hardware. This accessibility facilitates use on any modern web browser, such as Chrome, Firefox, and Safari, as well as across diverse operating systems, including Windows, macOS, and iOS/Android tablets. The application was hosted and executed online via Genially, ensuring uninterrupted accessibility on frequently employed devices in educational settings, including desktop computers, laptops, tablets, and interactive whiteboards. Thanks to its responsive design, the application adjusts to varying screen sizes and retains functional consistency across devices, which is especially beneficial for implementation in blended or remote learning settings [29].
Specifically, the application developed for the intervention, entitled Fake News Travel Fast, was organized as a progressive series of interactive modules that included multimedia content, gamified tasks, and narrative-driven challenges. By leveraging Genially’s interactive canvas and logic-based navigation, a non-linear learning pathway was created that fosters learner autonomy and engagement. Furthermore, the platform allows for real-time content updates and learner analytics, granting educators valuable insights into student progress and interaction patterns. Genially’s capabilities are particularly advantageous for producing dynamic presentations, quizzes, escape rooms, and learning games through the implementation of clickable, animated, and visually stimulating content. This design framework promotes student engagement by integrating interactivity and narrative elements.
The application employed a combination of challenge-based tasks and interactive storytelling to enhance learner motivation and cognitive engagement. Students were required to identify instances of fake news to progress through a sequence of five gamified modules, each reinforcing critical thinking and media literacy within the context of the tourism sector. The narrative guided students through these five challenges, culminating in the discovery of the fictional “Island of Truth” (Figure 1). To align with current European digital literacy standards, the intervention incorporated guidelines from the European Union on identifying and addressing fake news [30].
The educational intervention was aligned with the curriculum of the cognitive area “ICT as a Social Phenomenon: Building Digital Literacy”, which emphasizes the critical topic of misinformation [31]. At the outset of the session, students completed a pre-test questionnaire designed to assess their ability to critically manage information, drawing on established media literacy frameworks [32,33]. This questionnaire included items related to evaluating the credibility of online content and recognizing fake news [34].
Following the completion of the activities within the gamified application, students were administered a post-test questionnaire. This second questionnaire included parallel items from the pre-test to allow for comparative analysis of learning outcomes. In addition, it featured validated items related to application usability [35,36] and user experience [37,38], enabling a comprehensive assessment of both functional and affective dimensions of the learning tool [39].
Responses were collected using the Smileyometer, a child-friendly evaluation tool derived from the Funometer, developed with direct input from children to assess their subjective experiences in educational contexts (Figure 2). This instrument utilizes a 5-point Likert scale incorporating pictorial representations to facilitate comprehension and engagement among younger participants [40,41]. Upon completion of the instructional session, the questionnaires were administered and collected. The sample consisted of students aged 9 to 12, specifically those enrolled in the 4th, 5th, and 6th grades. Data analysis was conducted after the experimental phase to address the study’s core research questions. Quantitative statistical techniques were employed to process and interpret the collected responses.
The evaluation instruments selected for this study were meticulously aligned with the cognitive and developmental characteristics of the target demographic, which comprises primary school students aged 9 to 12 years. During this critical stage, children are evolving towards more sophisticated cognitive processes and abstract reasoning while still benefiting from concrete and visually enriched tools. To this end, the SUS was employed with age-adjusted modifications, ensuring that the language used was accessible and appropriate for younger users. This approach builds upon prior research that successfully adapted the SUS for educational and Human–Computer Interaction contexts [42].
Additionally, the KIDS-GEQ (Kids’ Game Experience Questionnaire) was incorporated, as it is specifically designed to evaluate gameplay experiences among children. Its established framework effectively captures emotional, cognitive, and social engagement within game-based learning environments. This instrument’s application has been documented in similar studies focused on child–computer interaction. To enhance the validity of self-reported measures among young learners, the Smileyometer was also utilized as a visual Likert-type scale. Created as part of the Fun Toolkit [40], this tool employs emotive pictograms, which aid in comprehension and help mitigate response bias, thereby making it a highly suitable instrument for child participants. The combination of these evaluation tools ensures methodological appropriateness, striking a balance between psychometric rigor and developmental sensitivity. This careful alignment ultimately enhances both the reliability and interpretability of the data collected in this research endeavor. While self-reported measures such as the SUS and KIDS-GEQ offer valuable insights into students’ perceptions, it is essential to recognize the inherent limitations of collecting data from children aged 9 to 12. At this developmental stage, children’s responses may be influenced by social desirability or their limited capacity for introspection, possibly impacting the reliability of their evaluations [43]. To address these concerns, the questionnaires were adapted with age-appropriate language and simplified Likert scales.

3. Results

The research was conducted at a primary school in a provincial town in Northern Greece, involving students from the last three grades of primary school. The participating students were aged between 9 and 12 years old. In the sample of 47 students, the gender distribution was as follows: 40.4% were girls (19 students) and 59.6% were boys (28 students). Most participants (37 students) spoke only one language, 9 students spoke two languages, and one student spoke three languages.

3.1. SUS Questionnaire

In addition to the study-specific questions in the paper-based questionnaire, we used two established instruments. The SUS [41] was used to evaluate how easy the application is to use and learn; it is a simple set of ten questions that provides an overall view of usability from the user’s perspective, and it was used here to assess students’ perceptions of the application’s usability. The findings are presented in Table 1. The SUS consists of 10 items rated on a 5-point Likert scale, traditionally ranging from 1 (Strongly Disagree) to 5 (Strongly Agree). To better suit the age group of primary school students, the response options were adapted to more accessible language: “Strongly Disagree” was replaced with Not at all, “Disagree” with Very little, “Neutral” with Quite, “Agree” with Fairly, and “Strongly Agree” with Very much. Items 1, 3, 5, 7, and 9 assess positive aspects of usability, while items 2, 4, 6, 8, and 10 assess negative aspects, as per the original structure proposed by Brooke (1996) [42]. The SUS yields a single score that measures the overall usability of the system under study and shows how the application ranks in terms of usability and ease of learning.
Students’ responses to the SUS questionnaire are displayed in Figure 3. The final score for the application, which included gamification elements, was calculated to be 75.31, which is considered acceptable; SUS scores above 70 are generally rated as “good”.
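To make the scoring procedure concrete, the following sketch applies Brooke’s standard SUS formula: odd items contribute (response − 1), even items contribute (5 − response), and the summed contributions are multiplied by 2.5 to map onto a 0–100 scale. This is an illustrative Python reconstruction, not the scoring script used in the study, and the example responses are hypothetical.

```python
import numpy as np

def sus_score(responses):
    """Standard SUS score (0-100) for one respondent.

    responses: ten Likert ratings (1-5), ordered Q1..Q10.
    Odd-numbered items are positively worded, even-numbered negatively.
    """
    r = np.asarray(responses)
    odd = r[0::2] - 1    # Q1, Q3, Q5, Q7, Q9: contribution = response - 1
    even = 5 - r[1::2]   # Q2, Q4, Q6, Q8, Q10: contribution = 5 - response
    return (odd.sum() + even.sum()) * 2.5

# Hypothetical respondent: agrees with positive items, disagrees with negative ones
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # -> 75.0, in the "good" range
```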

3.1.1. Reliability Analysis of the SUS Questionnaire

Reliability refers to the extent to which an instrument, such as a questionnaire, yields the same results under consistent conditions. The most common method for measuring internal consistency is Cronbach’s alpha. In the analysis of the dataset, which consisted of 47 completed surveys, the 10-item SUS questionnaire demonstrated good internal consistency, with a Cronbach’s alpha of 0.714. Notably, the overall scale’s alpha would increase slightly, to 0.744, if items Q6 and Q7 were removed. These findings align with the results for the Greek version of the questionnaire reported by Katsanos [44]. We also investigated whether our study aligned with Borsci’s research, which found that the SUS is divided into two main components: Learnability (represented by questions Q4 and Q10) and Usability (comprising the other questions) [45]. The Usability component demonstrated good reliability, achieving a Cronbach’s alpha of 0.711 based on 47 participants. However, the Learnability component displayed insufficient reliability, with a Cronbach’s alpha of only 0.437, which falls below the acceptable standard of 0.70.
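This kind of reliability analysis can be reproduced with a short routine computing Cronbach’s alpha from the item variances and the variance of the total score. The sketch below is a minimal illustration, not the authors’ analysis code; the 47 × 10 response matrix `sus` is a hypothetical stand-in (randomly generated here), and negatively worded items are assumed to be reverse-coded already.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a respondents-by-items matrix."""
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical stand-in for the 47x10 reverse-coded SUS response matrix
rng = np.random.default_rng(0)
sus = pd.DataFrame(rng.integers(1, 6, size=(47, 10)),
                   columns=[f"Q{i}" for i in range(1, 11)])

print(cronbach_alpha(sus))                             # full 10-item scale
print(cronbach_alpha(sus.drop(columns=["Q6", "Q7"])))  # alpha with Q6/Q7 removed
```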

3.1.2. Factor Analysis of the SUS Questionnaire

Factor analysis typically requires 5 to 10 observations per variable; as a general rule, the minimum is at least five times as many observations as the number of variables to be analyzed [45]. We conducted our study using a sample of 47 students, which is close to this minimum of 50. To assess sampling adequacy, we performed the Kaiser–Meyer–Olkin (KMO) test, yielding a KMO value of 0.681. This value indicates that our sample is adequate, as KMO values above 0.6 suggest suitability for factor analysis. Additionally, we performed Bartlett’s test of sphericity to check whether the variables were sufficiently correlated for factor analysis. The results revealed a chi-square (χ2) value of 120, with 45 degrees of freedom (df), and a p-value of less than 0.001. Since the p-value is below 0.001, the results are statistically significant, indicating that the variables are correlated enough to justify conducting factor analysis.
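Both adequacy checks are available, for instance, in the Python factor_analyzer package. The sketch below is an illustration under the same assumption as before: `sus` is a hypothetical stand-in for the real 47 × 10 response matrix, so the printed statistics will not match the study’s values.

```python
import numpy as np
import pandas as pd
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)

# Hypothetical stand-in for the real 47x10 SUS response matrix
rng = np.random.default_rng(0)
sus = pd.DataFrame(rng.integers(1, 6, size=(47, 10)),
                   columns=[f"Q{i}" for i in range(1, 11)])

chi_square, p_value = calculate_bartlett_sphericity(sus)  # inter-item correlation test
kmo_per_item, kmo_total = calculate_kmo(sus)              # sampling adequacy

print(f"Bartlett's test: chi2 = {chi_square:.1f}, p = {p_value:.4f}")
print(f"Overall KMO = {kmo_total:.3f}")  # values above 0.6 suggest adequacy
```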
To investigate multidimensionality, we conducted a Principal Component Analysis (PCA) with oblimin rotation. The PCA results, presented in Table 2, indicated the presence of two components; we excluded question 6 due to its high uniqueness (>0.6), signifying that this item does not strongly align with either component. The findings indicate that the SUS questionnaire does not have reliable subscales: the items Q2, Q3, Q4, Q5, Q8, Q9, and Q10 show a Cronbach’s alpha of 0.763, while the subscale formed by Q1 and Q7 has a significantly lower Cronbach’s alpha of only 0.494. Consequently, all items in the SUS collectively represent a single, reliable scale that assesses perceived usability. Although examining individual items or combinations may be appealing, it is not advisable to do so [43]. Numerous studies have shown that surveys with multiple questions tend to produce more reliable results than those with a single question [42].
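A two-component principal-component extraction with oblimin rotation, of the kind summarized in Table 2, can be sketched with the same package. Again, `sus` is a hypothetical stand-in for the real data, and this is an illustrative reconstruction rather than the study’s actual pipeline.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical stand-in for the real 47x10 SUS response matrix
rng = np.random.default_rng(0)
sus = pd.DataFrame(rng.integers(1, 6, size=(47, 10)),
                   columns=[f"Q{i}" for i in range(1, 11)])

fa = FactorAnalyzer(n_factors=2, method="principal", rotation="oblimin")
fa.fit(sus)

loadings = pd.DataFrame(fa.loadings_, index=sus.columns,
                        columns=["Component 1", "Component 2"])
print(loadings.round(3))
print(pd.Series(fa.get_uniquenesses(), index=sus.columns).round(3))
# In the study's data, Q6 showed uniqueness > 0.6 and was excluded.
```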

3.2. User Experience

The students also completed the KIDS-GEQ. This questionnaire is designed to measure children’s in-game experiences across seven areas: immersion, tension, competence, flow, negative affect, challenge, and positive affect. In our survey, we used six primary measurements along with a seventh question regarding honesty, providing a comprehensive way to assess the main aspects of the gaming experience. While the KIDS-GEQ typically uses a 0–4 Likert scale, we opted for a 1–5 scale in this study to align with the other survey questions. The questionnaire consisted of seven closed-ended questions, employing a 5-point Likert scale ranging from “Not at all” to “Very much” and focusing on key aspects of the participants’ gaming experiences. The results are presented in Table 3. The responses were then analyzed for frequency distributions, mean scores, and standard deviations.
The descriptive statistics from the game experience questionnaire revealed positive responses across most items. Participants reported high levels of enjoyment, with the statement “I thought it was fun to play the game” receiving a mean score of 4.15 (SD = 1.08) and “It was exciting” averaging a score of 3.91 (SD = 1.18). Notably, the highest agreement was observed for the statement “I felt capable”, with a mean score of 4.53 (SD = 0.65), indicating a strong sense of competence among users and low variability in their responses. Similarly, the statement “I answered the above questions honestly” also received a high mean score of 4.53 (SD = 0.86), suggesting that participants felt confident about the reliability of their responses. In contrast, responses to the statement “I could use my fantasy in the game” were more varied, resulting in a lower mean score of 2.72 (SD = 1.53). This indicates that participants were less engaged on a creative or imaginative level. The sense of immersion, as reflected in the statement “I felt like I was inside the game”, showed moderate agreement (M = 3.30, SD = 1.38) with considerable variability, suggesting a mixed experience regarding immersion. Importantly, participants overwhelmingly disagreed with the statement “I found it tiresome”, which had the lowest mean score of 1.30 (SD = 0.66). This highlights the game’s ability to maintain user interest and avoid feelings of fatigue. Overall, the results suggest a generally positive user experience, with particularly strong feelings of competence and enjoyment.

3.3. Comparative Analysis of Game Experience and Usability Perceptions Between Bilingual and Monolingual Students

To examine whether bilingualism influenced students’ perceptions of usability, Independent Samples t-tests were conducted to compare responses between bilingual and monolingual groups across ten usability-related items (Q1 to Q10). The analysis employed Welch’s t-tests to account for unequal variances. None of the usability items achieved statistical significance (all p > 0.05), indicating no statistically reliable differences in usability perceptions, as shown in Table 4, between bilingual and monolingual students within the current sample. Furthermore, to investigate potential differences in user experience, Independent Samples t-tests were performed on seven User Experience (UX)-related questionnaire items (UX1 to UX7). Across all seven UX measures (UX1–UX7), no statistically significant differences emerged between bilingual and monolingual students (p > 0.05 for all comparisons), as shown in Table 5. This suggests that language background did not significantly impact students’ perceptions of the application’s usability or overall enjoyment.
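For illustration, such a per-item comparison can be run with SciPy’s independent-samples t-test with Welch’s correction (equal_var=False), which tolerates the unequal group sizes and variances here. The arrays below are hypothetical placeholders for one item’s ratings, not the study’s data.

```python
import numpy as np
from scipy import stats

# Hypothetical Likert ratings for one usability item, split by language group
rng = np.random.default_rng(0)
bilingual = rng.integers(1, 6, size=10)    # 10 bilingual/trilingual students
monolingual = rng.integers(1, 6, size=37)  # 37 monolingual students

# Welch's t-test: equal_var=False drops the equal-variances assumption
t_stat, p_value = stats.ttest_ind(bilingual, monolingual, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")  # p > 0.05 -> no reliable difference
```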
Overall, the findings indicate that language background, whether bilingual or monolingual, did not significantly affect students’ reported game experiences or their perceptions of usability. This supports the inclusivity and accessibility of the gamified intervention for diverse learners.

3.4. Pre-Test and Post-Test

This study also examines the effects of a gamified educational intervention on the information literacy of primary school students. Specifically, it investigates whether engagement in a gamified activity enhances students’ ability to differentiate between true and false information online. The research also explores whether the intervention impacts their information-seeking behaviors, such as the tendency to consult multiple sources, and whether it cultivates a greater skepticism toward potential misinformation on social media. Furthermore, it aims to determine if students are more inclined to cross-check the information they encounter following the intervention. Lastly, the study analyzes the correlations between students’ pre-test and post-test responses to evaluate changes in key dimensions of information literacy.
Descriptive statistics were calculated for participants’ responses to four questionnaire items, measured both before (pre-test) and after (post-test) the intervention. As shown in Table 6, the mean scores increased across all items from pre- to post-test. Specifically, the mean score for Q1 rose from M = 3.49, SD = 1.08 to M = 3.89, SD = 1.05, and for Q4, from M = 2.98, SD = 1.03 to M = 3.51, SD = 1.28, suggesting a positive change in students’ self-reported skills. Smaller increases were also observed in Q2 (from M = 3.49 to M = 3.55) and Q3 (from M = 3.30 to M = 3.66), indicating modest improvements. To verify the appropriateness of parametric testing, the Shapiro–Wilk test was used to assess the normality of the distribution for each item. Results indicated that none of the variables met the assumption of normality, with W statistics ranging from 0.847 to 0.901 and p-values < 0.001 for all pre- and post-test scores. These statistically significant results reflect notable deviations from a normal distribution. Consequently, non-parametric statistical methods, specifically the Wilcoxon signed-rank test, were employed for further analyses of pre–post differences.
To evaluate the impact of the intervention on participants’ perceptions, Wilcoxon signed-rank tests were conducted for the four questionnaire items due to non-normal data distributions, as confirmed by the Shapiro–Wilk tests. The analysis revealed (Table 7) statistically significant differences in responses for three of the four items. Specifically, Item Q1 showed a significant increase from pre- to post-test (W = 97.0, p = 0.011) with a moderate effect size (rank-biserial correlation = −0.5222), indicating a meaningful improvement in participants’ confidence in distinguishing true from false online information. Similarly, Item Q3 demonstrated a significant change (W = 149.0, p = 0.042) with a moderate effect (−0.3992), reflecting increased skepticism toward social media content, and Item Q4 exhibited the most substantial change (W = 95.5, p = 0.002) alongside a large effect size (−0.6149), suggesting a strengthened habit of cross-checking information following the intervention. In contrast, Item Q2 did not reach statistical significance (W = 190.0, p = 0.768), with a negligible effect size (−0.0640), implying that participants’ reported use of multiple information sources remained largely unchanged.
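A sketch of this pre/post analysis in SciPy follows, using hypothetical `pre` and `post` arrays rather than the study’s data. Because SciPy does not report the matched-pairs rank-biserial correlation, it is computed manually via Kerby’s simple-difference formula; note that sign conventions for this effect size vary across statistical packages.

```python
import numpy as np
from scipy import stats

def pre_post_tests(pre, post):
    """Shapiro-Wilk normality checks plus a Wilcoxon signed-rank test
    with a matched-pairs rank-biserial effect size."""
    for label, x in (("pre", pre), ("post", post)):
        w, p = stats.shapiro(x)
        print(f"Shapiro-Wilk ({label}): W = {w:.3f}, p = {p:.4f}")

    diff = post - pre
    diff = diff[diff != 0]            # Wilcoxon discards zero differences
    res = stats.wilcoxon(diff)
    ranks = stats.rankdata(np.abs(diff))
    # Kerby (2014): (sum of positive ranks - sum of negative ranks) / total rank sum
    r_rb = (ranks[diff > 0].sum() - ranks[diff < 0].sum()) / ranks.sum()
    print(f"Wilcoxon: W = {res.statistic:.1f}, p = {res.pvalue:.3f}, "
          f"rank-biserial r = {r_rb:.4f}")

# Hypothetical item scores for 47 students (1-5 Likert)
rng = np.random.default_rng(1)
pre = rng.integers(1, 6, size=47).astype(float)
post = np.clip(pre + rng.integers(0, 3, size=47) - 1, 1, 5)
pre_post_tests(pre, post)
```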
A Spearman’s rank-order correlation analysis was conducted to investigate the relationships between students’ self-reported information literacy skills before and after the gamified intervention (Table 8). Several statistically significant associations were identified, highlighting consistent patterns in behavior and perceptions. A moderate positive correlation was found between Q1pre (“I can figure out if the information on the internet is false or true”) and Q3pre (“I often suspect that the information I receive on social media is fake news”), ρ = 0.331, p = 0.023. This indicates that students who initially showed higher confidence in identifying misinformation were also more likely to demonstrate skepticism toward social media content. Following the intervention, Q1post was significantly correlated with both Q1pre (ρ = 0.418, p = 0.003) and Q3pre (ρ = 0.457, p = 0.001), indicating that participants maintained or even enhanced their self-perceived ability to evaluate online information credibility and continued to approach online content critically. In terms of information-seeking behavior, Q2post (“I am looking for information from various internet sites”) was positively correlated with Q2pre (ρ = 0.400, p = 0.005), suggesting stability in the use of diverse information sources. The strongest observed association was between Q4pre (“Whenever I receive information on social media, I cross-check it to see if it’s true”) and Q4post, ρ = 0.589, p < 0.001, reflecting a consistent and possibly strengthened habit of verifying information across the study period. Furthermore, Q4post was positively associated with Q2post (ρ = 0.456, p = 0.001), indicating that students who frequently consulted various sources were also more inclined to cross-check information. These results indicate that the gamified educational program may have enhanced students’ ability to critically engage with online material and fostered the growth of more selective information habits.
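Such a correlation matrix can be sketched with SciPy and pandas; the DataFrame `items` below is a hypothetical stand-in for the eight pre/post columns, so the printed coefficients will not match Table 8.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Hypothetical 47x8 stand-in with columns Q1pre..Q4pre, Q1post..Q4post
rng = np.random.default_rng(2)
items = pd.DataFrame(rng.integers(1, 6, size=(47, 8)),
                     columns=[f"Q{i}{t}" for t in ("pre", "post")
                              for i in range(1, 5)])

rho, p = stats.spearmanr(items)  # full rank-order correlation matrix (8x8)
print(pd.DataFrame(rho, index=items.columns, columns=items.columns).round(3))

# Single pair, e.g., stability of cross-checking behavior (Q4pre vs. Q4post)
r, pval = stats.spearmanr(items["Q4pre"], items["Q4post"])
print(f"rho = {r:.3f}, p = {pval:.3f}")
```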

4. Discussion

The present study aimed to assess the usability and perceived effectiveness of an educational application incorporating gamification elements. The application was designed to enhance students’ critical thinking skills and their ability to identify fake news encountered online and on social media, particularly concerning tourism. Utilizing a combination of validated instruments, including the SUS and self-report questionnaires, we examined both user experience and potential cognitive changes before and after the intervention. The SUS results indicated that the application achieved a mean usability score of 75.31, which falls within the “good” range of acceptability. This aligns with established benchmarks, suggesting that SUS scores above 70 reflect favorable perceptions of usability. These findings indicate that students found the application intuitive and easy to navigate. While this study focused on media literacy, the design overlaps with language learning strategies, especially critical reading and evaluation [46].
The KIDS-GEQ results corroborated this positive evaluation, showing strong engagement in dimensions such as competence and enjoyment, with low reported fatigue. However, lower scores in creativity and immersion suggest that the game’s narrative and fantasy elements may not have fully aligned with students’ cognitive preferences or expectations. According to Piaget’s theory of cognitive development, children in the concrete operational stage (ages 7–11) tend to favor logical reasoning over abstract or fantastical elements, which may explain reduced imaginative involvement. This is further supported by SDT, which posits that autonomy, competence, and relatedness are core to intrinsic motivation. The game appeared to satisfy competence but lacked sufficiently immersive components to foster autonomy or identity exploration [9,47].
To address this limitation, future iterations could benefit from enhanced narrative depth, avatar customization, and exploratory environments. These adjustments may improve immersion by aligning more closely with the psychological and developmental needs of the target age group.
No significant differences emerged between bilingual and monolingual participants in terms of usability or game experience, suggesting the application was effective across linguistic groups. However, deeper cultural and cognitive dimensions often missed by quantitative measures warrant further investigation. Drawing on Vygotsky’s sociocultural theory and Gee’s notion of “semiotic domains” [48,49], bilingual learners likely interact with game narratives through multiple cultural frameworks. These unique interpretive strategies may not be fully captured via standardized questionnaires.
The intervention yielded statistically significant gains in students’ critical thinking and skepticism toward online information. Notably, Item Q4, which measured students’ habit of cross-checking information encountered on social media, showed the most substantial improvement, implying that the gamified approach supported deeper cognitive involvement. Although no significant changes were observed in students’ reported use of multiple information sources (Q2), correlations between pre- and post-test responses suggest a strengthened intentionality in verifying digital content, an essential behavior in combating misinformation.
These results underscore the potential of gamified tools to develop higher-order thinking and reflective media engagement, even among younger learners. Students’ increased attention to source verification and analytical thinking, particularly in the context of tourism misinformation, highlights the domain-specific benefits of tailored gamification strategies. Therefore, future studies should employ mixed-methods approaches incorporating interviews, reflections, or classroom dialogue to better understand how language background and cultural identity influence engagement and learning outcomes. Doing so would enrich our understanding of how gamified systems can equitably support diverse learner populations.
While the intervention demonstrated generally positive outcomes, it is essential to further explore the impact of demographic factors such as age, gender, and digital literacy. Although all participants were similar in age and educational background, individual variations in prior technology exposure, learning preferences, and cultural perspectives may have influenced how students engaged with the gamified content. For instance, certain narrative or design elements may resonate more profoundly with specific gender identities or cultural backgrounds, thereby affecting immersion and perceived relevance. These insights are consistent with recent findings in “Gamification for Higher Education Applications”, which underscore the necessity for gamified interventions to consider learner diversity to promote inclusivity and equity. Guided by the gamification taxonomy presented in that work, this intervention incorporated instant feedback and a point-based progression system to sustain motivation through continuous reinforcement [50].
The lack of significant differences between bilingual and monolingual students in this study is promising; however, a more thorough qualitative analysis is required to understand how demographic and sociocultural variables intersect with gamified environments. Future research should break down user experience and performance data by these factors and incorporate student perspectives through interviews or focus groups to enhance inclusive design strategies.
Despite positive findings, several practical limitations warrant discussion. The study relied on a single-session design, limiting the depth and long-term retention of learning. While effective in raising awareness and initial engagement, more longitudinal or multi-session designs are needed to evaluate sustained impact and behavioral change. Moreover, the use of self-reported data introduces possible biases such as social desirability or overestimation of competence.
Another key concern relates to policy environments. Many educational institutions, particularly in Europe, are moving toward restricting smartphone or tablet use during school hours. Although this study was conducted in a controlled classroom setting, such policies could pose barriers to scalability. Alternative platforms (e.g., desktop versions) or offline adaptations may be necessary to ensure broader applicability and compliance with evolving digital education policies. Success in gamified learning also depends on institutional readiness, teacher support, and policy alignment. As shown in recent studies [51], even well-designed tools face barriers without strong curricular integration and platform flexibility.
This study extends the existing literature by exploring gamified media literacy in primary education through a domain-specific lens, specifically tourism misinformation, an area rarely addressed despite its relevance in students’ digital lives. Whereas most prior gamification research has emphasized motivation or general digital literacy in older populations, this work integrates communication theory, cognitive development, and HCI principles to design and assess an intervention tailored to younger learners. Based on our findings and the recent literature [46,52], several key recommendations for implementing gamified educational tools in primary schools are outlined:
  • Choose features like feedback and collaborative challenges that support educational goals while avoiding confusing elements.
  • Combine competitive and cooperative environments to engage both older and younger students effectively.
  • Provide additional help for students with lower digital skills through orientations and tutorials, along with teacher support.
  • Provide ongoing professional development for teachers, as it is vital for effective gamification implementation.
  • Design future gamified tools to be compatible with various platforms and to offer offline options, ensuring alignment with institutional policies.
By combining validated UX tools with a critical literacy framework, the findings highlight the importance of developing age-appropriate, inclusive, and context-sensitive educational technologies. This aligns with the demand for more comprehensive frameworks that go beyond superficial engagement, aiming to develop critical digital literacies within complex media environments. Our findings are consistent with recent research indicating that interactive and immersive game formats can improve learners’ ability to identify misinformation and promote deeper cognitive engagement [53]. This suggests that integrating similar design elements could further enhance educational outcomes.

5. Conclusions

In today’s digital and culturally diverse landscape, it is crucial to equip students with the skills to critically assess information. This study offers initial evidence that gamified educational tools may enhance engagement, motivation, and media literacy among primary school students. By incorporating storytelling, interactivity, and reward-based elements, gamification fosters deeper cognitive engagement and aids in the development of critical thinking skills, especially in the context of digital misinformation. The intervention utilized core mechanics such as immediate feedback, progress tracking, and challenge-based tasks to boost motivation and higher-order thinking. Future implementations should balance competitive elements with cooperative tasks to foster peer collaboration, especially for less competitive learners. Scaffolding strategies, such as step-by-step instructions and peer mentoring, should also support younger learners or those with limited digital skills. These adjustments align with research on tailoring the gamified design to learners’ cognitive and cultural needs. Future gamified learning tools should also incorporate coherent narrative arcs to foster emotional engagement and conceptual retention in misinformation-related educational interventions [54].
While the results are promising, they reflect students’ subjective experiences within a limited timeframe and do not constitute direct assessments of digital literacy or intercultural competence. Nonetheless, the findings align with broader educational trends, indicating that younger learners are increasingly receptive to game-based learning methodologies. The game allowed students, especially bilingual and culturally diverse learners, to reflect on cultural representations and contextual nuances in digital content.
Critical literacy remains a foundational skill in the era of digital misinformation. As online content becomes more pervasive and complex, the ability to assess the credibility of information is especially vital for younger audiences. Educational institutions have a key role in fostering these competencies, underlining the need for digital and media literacy to be integrated into formal curricula from an early age. This study also highlights the potential of mobile applications and gamified platforms to extend learning beyond the classroom, offering scalable and inclusive tools for diverse learner populations.
Furthermore, several limitations should be acknowledged. The relatively small sample size (n = 47) and the single-school setting restrict the generalizability of findings. In addition, reliance on self-report measures introduces potential bias and limits the objectivity of outcome assessments. Although age-appropriate modifications were made to instruments such as the SUS and KIDS-GEQ, children’s limited introspective abilities may have influenced response accuracy. Finally, the study captured only short-term effects and did not assess the sustainability of observed learning outcomes.
Future research should incorporate longitudinal designs, mixed-methods approaches, and larger, more diverse samples to explore the long-term efficacy and scalability of gamified interventions. Triangulating self-report data with behavioral metrics, classroom observations, and educator feedback would provide a more comprehensive understanding of learning processes. Iterative, design-based research frameworks can also help refine educational games to better align with developmental needs and classroom realities.
Overall, this study highlights the effectiveness of gamified digital tools in enhancing critical and intercultural literacy among young learners. As gamified misinformation becomes more common online [55], it is crucial to implement early interventions that foster students’ digital discernment. However, these insights should be interpreted with caution and within the context of the study’s exploratory nature. As digital education evolves, ongoing research is essential to ensure that gamification strategies are pedagogically sound, culturally sensitive, and accessible to all students.

Author Contributions

Conceptualization, A.K.; methodology, A.K.; software, A.K.; validation, A.K.; formal analysis, A.K.; investigation, A.K.; data curation, A.K.; writing—original draft preparation, A.K. and N.A.; writing—review and editing, A.K. and N.A.; visualization, A.K.; supervision, N.A.; project administration, N.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics Committee of Ionian University with protocol code 4893, approved on 13 March 2025.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are not publicly available due to ethical restrictions involving minors and the sensitive nature of the collected information. The data are anonymized and securely stored by the corresponding author. Access to the data may be granted upon reasonable request and subject to approval by an appropriate ethics body, in accordance with the General Data Protection Regulation (GDPR) and national legislation on children’s data protection.

Acknowledgments

The authors would like to express their sincere appreciation to the school for granting permission to conduct the research and for its support throughout the duration of this study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hall, S. Encoding and Decoding in the Television Discourse. In Foundations of Cultural Studies; Morley, D., Ed.; Duke University Press: Durham, NC, USA, 1973; Volume 1.
  2. Potter, W.J. Theory of Media Literacy: A Cognitive Approach; SAGE Publications: Thousand Oaks, CA, USA, 2010; ISBN 978-1-4833-2888-1.
  3. Huotari, K.; Hamari, J. A Definition for Gamification: Anchoring Gamification in the Service Marketing Literature. Electron. Mark. 2017, 27, 21–31.
  4. Hobbs, R. Digital and Media Literacy: A Plan of Action: A White Paper on the Digital and Media Literacy Recommendations of the Knight Commission on the Information Needs of Communities in a Democracy; Aspen Institute: Washington, DC, USA, 2010; ISBN 978-0-89843-535-1.
  5. Carroll, J. Human-Computer Interaction: Psychology as a Science of Design. Annu. Rev. Psychol. 1997, 48, 61–83.
  6. Janson, A.; Schmidt-Kraepelin, M.; Schöbel, S.; Sunyaev, A. Special Issue Editorial: Adaptive and Intelligent Gamification Design. AIS Trans. Hum.-Comput. Interact. 2023, 15, 136–145.
  7. Toledo Palomino, P.; Isotani, S. Enhancing User Experience in Learning Environments: A Narrative Gamification Framework for Education. J. Interact. Syst. 2024, 15, 478–489.
  8. Ribeiro Silva, L.; Maciel Toda, A.; Chalco Challco, G.; Chamel Elias, N.; Ibert Bittencourt, I.; Isotani, S. Effects of a Collaborative Gamification on Learning and Engagement of Children with Autism. Univers. Access Inf. Soc. 2025, 24, 911–932.
  9. Ryan, R.M.; Deci, E.L. Self-Determination Theory: Basic Psychological Needs in Motivation, Development and Wellness; Guilford Press: New York, NY, USA, 2017; Volume 38, p. 231.
  10. Pradhan, D.; Malik, G.; Vishwakarma, P. Gamification in Tourism Research: A Systematic Review, Current Insights, and Future Research Avenues. J. Vacat. Mark. 2023, 31, 130–156.
  11. Suh, A.; Wagner, C.; Liu, L. Enhancing User Engagement through Gamification. J. Comput. Inf. Syst. 2018, 58, 204–213.
  12. Roinioti, E.; Pandia, E.; Konstantakis, M.; Skarpelos, Y. Gamification in Tourism: A Design Framework for the TRIPMENTOR Project. Digital 2022, 2, 191–205.
  13. Wu, C.-H.; Chao, Y.-L.; Xiong, J.-T.; Luh, D.-B. Gamification of Culture: A Strategy for Cultural Preservation and Local Sustainable Development. Sustainability 2023, 15, 650.
  14. Berkowitz, D.; Schwartz, D. Miley, CNN and The Onion. J. Pract. 2015, 10, 1–17.
  15. Tandoc, E.; Lim, Z.; Ling, R. Defining “Fake News”: A Typology of Scholarly Definitions. Digit. J. 2017, 6, 1–17.
  16. Pennycook, G. The Psychology of Fake News. Trends Cogn. Sci. 2021, 25, 388–402.
  17. Fedeli, G. ‘Fake News’ Meets Tourism: A Proposed Research Agenda. Ann. Tour. Res. 2020, 80, 102684.
  18. Robinson, S.; DeShano, C. ‘Anyone Can Know’: Citizen Journalism and the Interpretive Community of the Mainstream Press. Journalism 2011, 12, 963–982.
  19. Choi, S.; Mattila, A.; Van Hoof, H.; Quadri, D. The Role of Power and Incentives in Inducing Fake Reviews in the Tourism Industry. J. Travel Res. 2016, 56, 975–987.
  20. Jones-Jang, S.M.; Mortensen, T.; Liu, J. Does Media Literacy Help Identification of Fake News? Information Literacy Helps, but Other Literacies Don’t. Am. Behav. Sci. 2021, 65, 371–388.
  21. Ku, K.; Fung, T.; Au, A.; Choy, A.; Kajimoto, M.; Song, Y. Helping Young Students Cope with the Threat of Fake News: Efficacy of News Literacy Training for Junior-Secondary School Students in Hong Kong. Educ. Stud. 2025, 51, 1–19.
  22. Clark, L.S.; Marchi, R. Young People and the Future of News: Social Media and the Rise of Connective Journalism; Cambridge University Press: Cambridge, UK, 2017; p. 316; ISBN 978-1-107-19060-3.
  23. Vraga, E.; Tully, M. News Literacy, Social Media Behaviors, and Skepticism toward Information on Social Media. Inf. Commun. Soc. 2019, 24, 150–166.
  24. Campillo-Ferrer, J.-M.; Miralles-Martínez, P.; Sánchez-Ibáñez, R. Gamification in Higher Education: Impact on Student Motivation and the Acquisition of Social and Civic Key Competencies. Sustainability 2020, 12, 4822.
  25. Capecchi, S.; Lieto, A.; Patti, F.; Pensa, R.G.; Rapp, A.; Vernero, F.; Zingaro, S. A Gamified Platform to Support Educational Activities About Fake News in Social Media. IEEE Trans. Learn. Technol. 2024, 17, 1765–1779.
  26. Maqsood, S.; Mekhail, C.; Chiasson, S. A Day in the Life of Jos: A Web-Based Game to Increase Children’s Digital Literacy. In Proceedings of the 17th ACM Conference on Interaction Design and Children, Trondheim, Norway, 19–22 June 2018; Association for Computing Machinery: New York, NY, USA, 2018; pp. 241–252.
  27. Aldalur, I. Enhancing Software Development Education through Gamification and Experiential Learning with Genially. Softw. Qual. J. 2024, 33, 1.
  28. Katsaounidou, A.; Vrysis, L.; Kotsakis, R.; Dimoulas, C.; Veglis, A. MAthE the Game: A Serious Game for Education and Training in News Verification. Educ. Sci. 2019, 9, 155.
  29. Genially | The Easiest Way to Create Interactive Experiences. Available online: https://genially.com/node/742/ (accessed on 9 September 2024).
  30. European Parliament. How to Spot When News Is Fake. 2021. Available online: https://www.europarl.europa.eu/RegData/etudes/ATAG/2017/599386/EPRS_ATA%282017%29599386_EL.pdf (accessed on 2 November 2024).
  31. Institute of Educational Policy (I.E.P.). Guidelines for the Management of the Curriculum for the Subject “Information and Communication Technologies” in Primary School for the Academic Year 2024–2025. Institute of Educational Policy, 2024. Available online: https://iep.edu.gr/images/IEP/EPISTIMONIKI_YPIRESIA/Epist_Monades/tmima_B/Aθμια_2024-2025/2024-09-16_138644_4_ΠΑΡΑΡΤΗΜΑ_2_ΤΠΕ.pdf (accessed on 18 November 2024).
  32. Chen, D.; Wu, J.; Wang, Y. Unpacking New Media Literacy. J. Syst. Cybern. Inform. 2011, 9, 84–88.
  33. Chen, D.-T.; Li, J.-Y.; Lin, T.-B.; Lee, L.; Ye, X. New Media Literacy of School Students in Singapore; National Institute of Education: Singapore, 2014.
  34. Syam, H.M.; Nurrahmi, F. “I Don’t Know If It Is Fake or Real News” How Little Indonesian University Students Understand Social Media Literacy. J. Komun. Malays. J. Commun. 2020, 36, 92–105.
  35. Dilmen, K.; Kert, S.; Uğraş, T. Children’s Coding Experiences in a Block-Based Coding Environment: A Usability Study on Code.Org. Educ. Inf. Technol. 2023, 28, 1–26.
  36. Avouris, N.; Katsanos, C.; Tselios, N.; Moustakas, K. Introduction to Human-Computer Interaction; Kallipos: Lesvos, Greece, 2015; ISBN 978-960-603-407-7. (In Greek)
  37. Poels, K.; de Kort, Y.A.W.; IJsselsteijn, W.A. D3.3: Game Experience Questionnaire: Development of a Self-Report Measure to Assess the Psychological Impact of Digital Games; Technische Universiteit Eindhoven: Eindhoven, The Netherlands, 2007.
  38. De Kort, Y.; IJsselsteijn, W.; Poels, K. Digital Games as Social Presence Technology: Development of the Social Presence in Gaming Questionnaire (SPGQ). In Proceedings of the PRESENCE 2007, Barcelona, Spain, 25–27 October 2007.
  39. Shadish, W.R.; Cook, T.D.; Campbell, D.T. Experimental and Quasi-Experimental Designs for Generalized Causal Inference; Houghton Mifflin: Boston, MA, USA, 2001; ISBN 978-0-395-61556-0. [Google Scholar]
  40. Read, J.; MacFarlane, S.; Casey, C. Endurability, Engagement and Expectations: Measuring Children’s Fun. In Interaction Design and Children; Shaker Publishing: Eindhoven, The Netherlands, 2002; Volume 2, pp. 1–23. [Google Scholar]
  41. Sim, G.; Horton, M. Investigating Children’s Opinions of Games: Fun Toolkit vs. This or That. In Proceedings of the 11th International Conference on Interaction Design and Children, Bremen Germany, 12–15 June 2012; Association for Computing Machinery: New York, NY, USA, 2012; pp. 70–77. [Google Scholar]
  42. Brooke, J. SUS: A Quick and Dirty Usability Scale; Taylor and Francis: Abingdon, UK, 1996; pp. 189–194. [Google Scholar]
  43. Borgers, N.; de leeuw, E.; Hox, J. Children as Respondents in Survey Research: Cognitive Development and Response Quality 1. Bull. Methodol. Sociol. 2000, 66, 60–75. [Google Scholar] [CrossRef]
  44. Katsanos, C.; Tselios, N.; Xenos, M. Perceived Usability Evaluation of Learning Management Systems: A First Step towards Standardization of the System Usability Scale in Greek. In Proceedings of the 2012 16th Panhellenic Conference on Informatics, Piraeus, Greece, 5–7 October 2012. [Google Scholar]
  45. Borsci, S.; Federici, S.; Lauriola, M. On the Dimensionality of the System Usability Scale: A Test of Alternative Measurement Models. Cogn. Process. 2009, 10, 193–197. [Google Scholar] [CrossRef]
  46. Koivisto, J.; Hamari, J. The Rise of Motivational Information Systems: A Review of Gamification Research. Int. J. Inf. Manag. 2019, 45, 191–210. [Google Scholar] [CrossRef]
  47. Piaget, J.; Inhelder, B.; Piaget, J. The Psychology of the Child, 3rd ed.; Harper Torchbooks; Basic Books: New York, NY, USA, 1973; ISBN 978-0-465-09500-1. [Google Scholar]
  48. Gee, J. What Video Games Have to Teach Us About Learning and Literacy. Comput. Entertain. 2003, 1, 20. [Google Scholar] [CrossRef]
  49. Vygotsky, L.S. Mind in Society: The Development of Higher Psychological Processes; Cole, M., John-Steiner, V., Scribner, S., Souberman, E., Eds.; Harvard University Press: Cambridge, MA, USA, 1978. [Google Scholar]
  50. Lo, N.; Chan, S. Gamification for Higher Education Applications: A Social Learning Theory Approach; Brill: Leiden, The Netherlands, 2023; ISBN 978-90-04-70278-3. [Google Scholar]
  51. Toda, A.M.; Cristea, A.I.; Isotani, S. (Eds.) Gamification Design for Educational Contexts: Theoretical and Practical Contributions; Springer: Cham, Switzerland, 2023; ISBN 978-3-031-31948-8. [Google Scholar]
  52. Subhash, S.; Cudney, E.A. Gamified Learning in Higher Education: A Systematic Review of the Literature. Comput. Hum. Behav. 2018, 87, 192–206. [Google Scholar] [CrossRef]
  53. Buchner, J. Playing an Augmented Reality Escape Game Promotes Learning About Fake News. Technol. Knowl. Learn. 2025, 30, 425–445. [Google Scholar] [CrossRef]
  54. Devasia, N.; Zhao, R.; Lee, J.H. Does the Story Matter? Applying Narrative Theory to an Educational Misinformation Escape Room Game. arXiv 2025, arXiv:2503.02135. [Google Scholar]
  55. Appleby, M.F. Trust the Game: Gamification and the Rise of Online Conspiracy Theories. Doctoral Dissertation, Virginia Tech, Blacksburg, VA, USA, 2025. [Google Scholar]
Figure 1. Screenshots of the educational application. (a) The first picture displays the home screen of the educational application, and the text reads: “Fake news travels fast”. The book title shown is “Searching for the truth—The Island of Truth”. The smaller text on the book cover says: “Some things are not as they seem”. (b) The second picture displays the main menu of the application, showing the user’s progress through five challenges. The message at the top reads: “You are ready. You must collect 1000 points to reach the Island of Truth”.
Figure 2. The Smileyometer: A Visual Likert Scale for Child-Friendly Questionnaire Evaluation.
Figure 3. Frequency of responses to the SUS questionnaire.
Table 1. Descriptive statistics of the SUS questionnaire.

Question | Not At All | Very Little | Quite | Fairly | Very Much | Mean | SD
Q1 | 2 (4.3%) | 5 (10.6%) | 14 (29.8%) | 11 (23.4%) | 15 (31.9%) | 3.68 | 1.15
Q2 | 23 (47.9%) | 12 (25.0%) | 6 (12.5%) | 4 (8.3%) | 3 (6.2%) | 1.24 | 1.24
Q3 | 1 (2.1%) | 1 (2.1%) | 6 (12.8%) | 15 (31.9%) | 24 (51.1%) | 4.28 | 0.92
Q4 | 28 (59.6%) | 12 (25.5%) | 6 (12.8%) | 0 (0.0%) | 1 (2.1%) | 1.60 | 0.87
Q5 | 0 (0.0%) | 4 (8.5%) | 11 (23.4%) | 12 (25.5%) | 20 (42.6%) | 4.02 | 1.00
Q6 | 21 (44.7%) | 5 (10.6%) | 10 (21.3%) | 9 (19.1%) | 2 (4.3%) | 2.28 | 1.32
Q7 | 1 (2.1%) | 2 (4.3%) | 10 (21.3%) | 21 (44.7%) | 13 (27.7%) | 3.91 | 0.92
Q8 | 37 (78.7%) | 7 (14.9%) | 1 (2.1%) | 2 (4.3%) | 0 (0.0%) | 1.32 | 0.72
Q9 | 1 (2.1%) | 3 (6.4%) | 10 (21.3%) | 18 (38.3%) | 15 (31.9%) | 3.91 | 0.99
Q10 | 9 (19.1%) | 17 (36.2%) | 14 (29.8%) | 3 (6.4%) | 4 (8.5%) | 2.49 | 1.13
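For readers who want to reproduce the aggregate usability score behind Table 1, the short Python sketch below applies Brooke’s standard SUS scoring rule [42]; the response matrix is hypothetical illustration data, not the study’s raw responses.

import numpy as np

# Hypothetical 1-5 responses to the ten SUS items (one row per respondent).
responses = np.array([
    [4, 1, 5, 1, 4, 2, 4, 1, 4, 2],
    [3, 2, 4, 2, 5, 1, 4, 1, 3, 3],
])

# Standard SUS scoring [42]: odd-numbered items contribute (score - 1),
# even-numbered items contribute (5 - score); the sum is scaled by 2.5
# to give a 0-100 usability score per respondent.
odd_part = responses[:, 0::2] - 1
even_part = 5 - responses[:, 1::2]
sus_scores = (odd_part.sum(axis=1) + even_part.sum(axis=1)) * 2.5
print(sus_scores)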
Table 2. Two-Factor Solution and Derived Subscales (blank cells indicate loadings not reported in the source).

Item | Component 1 | Component 2
Q1 |  | 0.855
Q2 | 0.561 |
Q3 | 0.722 |
Q4 | 0.524 | −0.364
Q5 | 0.786 |
Q7 |  | 0.555
Q8 | 0.595 |
Q9 | 0.667 | 0.338
Q10 | 0.549 | −0.509
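A two-component structure such as the one in Table 2 is typically obtained from a principal component analysis of the item responses. The sketch below, using scikit-learn on a hypothetical response matrix, shows how component loadings are derived; it omits any rotation the original analysis may have applied, so its output will not reproduce Table 2 exactly.

import numpy as np
from sklearn.decomposition import PCA

# Hypothetical standardized responses: 47 respondents x 9 SUS items.
rng = np.random.default_rng(42)
X = rng.normal(size=(47, 9))

pca = PCA(n_components=2).fit(X)

# Loadings = eigenvectors scaled by the square root of the explained
# variance, the quantity usually tabulated as "component loadings".
loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
print(np.round(loadings, 3))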
Table 3. Descriptive statistics of the Game Experience questionnaire.

Question | Not At All | Very Little | Quite | Fairly | Very Much | Mean | SD
1. It was exciting | 2 (4.3%) | 4 (8.5%) | 10 (21.3%) | 11 (23.4%) | 20 (42.6%) | 3.91 | 1.18
2. I felt capable | 0 (0.0%) | 0 (0.0%) | 4 (8.5%) | 14 (29.8%) | 29 (61.7%) | 4.53 | 0.654
3. I felt like I was inside the game | 4 (8.5%) | 12 (25.5%) | 12 (25.5%) | 4 (8.5%) | 15 (31.9%) | 3.30 | 1.38
4. I could use my imagination in the game | 15 (31.9%) | 8 (17.0%) | 8 (17.0%) | 7 (14.9%) | 9 (19.1%) | 2.72 | 1.53
5. I found it tiresome | 37 (78.7%) | 7 (14.9%) | 2 (4.3%) | 1 (2.1%) | 0 (0.0%) | 1.30 | 0.657
6. I thought it was fun to play the game | 0 (0.0%) | 6 (12.8%) | 6 (12.8%) | 10 (21.3%) | 25 (53.2%) | 4.15 | 1.08
7. I answered the above questions honestly | 1 (2.1%) | 1 (2.1%) | 2 (4.3%) | 11 (23.4%) | 32 (68.1%) | 4.53 | 0.856
Table 4. Comparing Usability Perceptions by Language Background.

Item | Welch’s t | df | p
Q1 | −0.2614 | 15.6 | 0.797
Q2 | 0.6729 | 16.0 | 0.511
Q3 | −0.9713 | 13.9 | 0.348
Q4 | 0.8945 | 16.0 | 0.384
Q5 | −1.9056 | 14.4 | 0.077
Q6 | 0.0653 | 15.3 | 0.949
Q7 | −0.4929 | 17.1 | 0.628
Q8 | 1.3883 | 10.5 | 0.194
Q9 | −1.2331 | 12.5 | 0.240
Q10 | 0.7585 | 19.3 | 0.457
Table 5. Comparing User Experience Perceptions by Language Background.

Item | Welch’s t | df | p
ux1 | 0.278 | 16.2 | 0.785
ux2 | −1.058 | 11.7 | 0.311
ux3 | −0.261 | 15.1 | 0.798
ux4 | 0.190 | 15.8 | 0.852
ux5 | 0.528 | 13.5 | 0.606
ux6 | −0.499 | 14.8 | 0.625
ux7 | −1.199 | 12.0 | 0.254
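The comparisons in Tables 4 and 5 use Welch’s t-test, which, unlike Student’s t-test, does not assume equal variances in the two language-background groups and therefore yields fractional degrees of freedom. A minimal SciPy sketch on hypothetical score vectors:

from scipy import stats

# Hypothetical item scores for the two language-background groups.
group_a = [4, 5, 3, 4, 5, 4, 3, 5, 4, 4]
group_b = [3, 4, 4, 5, 3, 4, 5, 4, 3]

# equal_var=False selects Welch's t-test, the statistic reported
# in Tables 4 and 5.
result = stats.ttest_ind(group_a, group_b, equal_var=False)
print(f"Welch's t = {result.statistic:.4f}, p = {result.pvalue:.3f}")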
Table 6. Descriptive statistics and normality test of pre-tests and post-tests.

Question | Mean | Median | SD | W | p
Q1pre: I can figure out if the information on the internet is false or true | 3.49 | 3 | 1.08 | 0.900 | <0.001
Q2pre: I look for information from various sources | 3.49 | 4 | 1.28 | 0.887 | <0.001
Q3pre: I often suspect that the information I receive on social media is fake news | 3.30 | 3 | 1.14 | 0.899 | <0.001
Q4pre: Whenever I receive information on social media, I cross-check it to see if it’s true | 2.98 | 3 | 1.03 | 0.901 | <0.001
Q1post: I can figure out if the information on the internet is false or true | 3.89 | 4 | 1.05 | 0.847 | <0.001
Q2post: I look for information from various sources | 3.55 | 3 | 1.18 | 0.877 | <0.001
Q3post: I often suspect that the information I receive on social media is fake news | 3.66 | 4 | 1.11 | 0.887 | <0.001
Q4post: Whenever I receive information on social media, I cross-check it to see if it’s true | 3.51 | 4 | 1.28 | 0.882 | <0.001
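The W column in Table 6 is presumably the Shapiro–Wilk statistic, the usual W-statistic normality test; with p < 0.001 throughout, normality is rejected for every item, which motivates the non-parametric paired test in Table 7. A minimal SciPy check on hypothetical Likert responses:

from scipy import stats

# Hypothetical 1-5 Likert responses for one pre-test question.
pre_q1 = [3, 4, 2, 5, 3, 3, 4, 1, 5, 4, 3, 2, 4, 5, 3]

w, p = stats.shapiro(pre_q1)
print(f"W = {w:.3f}, p = {p:.3f}")  # a small p rejects normality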
Table 7. Paired Samples Analysis.

Pair | Wilcoxon W | p | Effect Size (Rank-Biserial Correlation)
Q1pre–Q1post | 97.0 | 0.011 | −0.5222
Q2pre–Q2post | 190.0 | 0.768 | −0.0640
Q3pre–Q3post | 149.0 | 0.042 | −0.3992
Q4pre–Q4post | 95.5 | 0.002 | −0.6149
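Given the non-normal distributions in Table 6, Table 7 uses the Wilcoxon signed-rank test with the matched-pairs rank-biserial correlation as the effect size. The sketch below, on hypothetical paired scores, runs the test with SciPy and computes the rank-biserial correlation directly from the signed ranks; with pre − post differences, a negative value indicates higher post-test scores.

import numpy as np
from scipy import stats

# Hypothetical paired pre-/post-test scores for one question.
pre = np.array([3, 2, 4, 3, 5, 2, 3, 4, 2, 3])
post = np.array([4, 3, 4, 4, 5, 3, 4, 4, 3, 4])

w, p = stats.wilcoxon(pre, post)  # zero differences are dropped by default
print(f"Wilcoxon W = {w}, p = {p:.3f}")

# Matched-pairs rank-biserial correlation: (T+ - T-) / (T+ + T-), where
# T+ and T- are the rank sums of positive and negative differences.
diff = pre - post
nonzero = diff[diff != 0]
ranks = stats.rankdata(np.abs(nonzero))
t_pos = ranks[nonzero > 0].sum()
t_neg = ranks[nonzero < 0].sum()
r_rb = (t_pos - t_neg) / (t_pos + t_neg)
print(f"rank-biserial r = {r_rb:.4f}")  # negative: post-test scores higher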
Table 8. Correlation Matrix.

 | Q1pre | Q2pre | Q3pre | Q4pre | Q1post | Q2post | Q3post
Q2pre | −0.016 |  |  |  |  |  |
Q3pre | 0.331 * | 0.117 |  |  |  |  |
Q4pre | −0.135 | 0.342 * | 0.092 |  |  |  |
Q1post | 0.418 ** | 0.163 | 0.457 ** | 0.259 |  |  |
Q2post | −0.061 | 0.400 ** | −0.067 | 0.223 | 0.005 |  |
Q3post | 0.135 | 0.259 | 0.427 ** | 0.131 | 0.253 | 0.063 |
Q4post | 0.005 | 0.221 | 0.038 | 0.589 *** | 0.262 | 0.456 ** | 0.071
Note: * p < 0.05, ** p < 0.01, *** p < 0.001.
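Table 8 does not state which correlation coefficient was used; for 5-point Likert items, a rank-based coefficient such as Spearman’s rho is a common choice. A hypothetical pandas sketch of how such a matrix is produced:

import pandas as pd

# Hypothetical paired responses; Spearman is an assumption here, since
# the correlation method is not stated in Table 8.
df = pd.DataFrame({
    "Q1pre":  [3, 4, 2, 5, 3, 4, 2, 3],
    "Q4pre":  [2, 3, 3, 4, 2, 3, 2, 3],
    "Q1post": [4, 4, 3, 5, 4, 4, 3, 4],
    "Q4post": [3, 4, 3, 5, 3, 4, 3, 4],
})
print(df.corr(method="spearman").round(3))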