Article

Designing for Engagement in Primary Health Education Through Digital Game-Based Learning: Cross-National Behavioral Evidence from the iLearn4Health Platform

by Evgenia Gkintoni 1,*, Emmanuella Magriplis 2, Fedra Vantaraki 1, Charitini-Maria Skoulidi 3, Panagiotis Anastassopoulos 3, Alexandra Cornea 4, Begoña Inchaurraga 5, Jaione Santurtun 5, Ainhoa de la Cruz Mancha 5, George Giorgakis 6, Kleri Kouppas 7, Stella Timotheou 7, Maria Jose Moreno Juan 8, Miren Muñagorri 8, Marta Harasiuk 9, Alfredo Garmendia Lopez 3, Efi Skoulidi 3 and Apostolos Vantarakis 1

1 Lab of Public Health, Department of Medicine, University of Patras, 26504 Patra, Greece
2 Lab of Dietetics and Quality of Life, Department of Food Science and Human Nutrition, Agricultural University of Athens, 11855 Athens, Greece
3 p-Consulting, 26442 Patras, Greece
4 FSLI, 030167 Bucharest, Romania
5 Centro San Viator, 48190 Sopuerta, Spain
6 C.F.C.D.C. Centre for Competence, Development Cyprus Limited, Nicosia 1065, Cyprus
7 Dimotiko Scholeio Agias Napas-Antoni Tsokkou, Ayia Napa 5330, Cyprus
8 Errotu Taldea S.L.P., 20160 Donostia-San Sebastian, Spain
9 OIC POLAND Foundation, WSEI University, 20-209 Lublin, Poland
* Author to whom correspondence should be addressed.
Behav. Sci. 2025, 15(7), 847; https://doi.org/10.3390/bs15070847
Submission received: 24 April 2025 / Revised: 11 June 2025 / Accepted: 18 June 2025 / Published: 24 June 2025
(This article belongs to the Special Issue Benefits of Game-Based Learning)

Abstract

This study evaluates design effectiveness in Digital Game-Based Learning (DGBL) for primary health education through systematic teacher assessment of the iLearn4Health platform. Rather than measuring educational transformation, the research investigates how DGBL design principles influence user engagement patterns and platform usability as evaluated by education professionals. The study contributes to design optimization frameworks for primary school digital health education applications by examining the distinction between DGBL and superficial gamification approaches in creating engaging educational interfaces. The iLearn4Health platform underwent comprehensive design evaluation by 337 teachers across 24 schools in five European countries (Greece, Cyprus, Romania, Poland, and Spain). Teachers served as design evaluators rather than end-users, assessing platform engagement mechanisms through systematic interaction analysis. The study employed multiple statistical approaches—descriptive analysis, correlation analysis, ANOVA, regression modeling, and cluster analysis—to identify design engagement patterns and their predictors, tracking completion rates, progress trajectories, and interaction time as indicators of design effectiveness. Design evaluation revealed a distinctive bimodal engagement distribution, with 52.8% of teacher–evaluators showing limited platform exploration (progress ratio 0.0–0.2) and 35.3% demonstrating comprehensive design assessment (progress ratio 0.8–1.0). A strong positive correlation (r = 0.95, p < 0.001) between time spent and steps completed indicated that design elements successfully sustained evaluator engagement. Multiple regression analysis identified initial design experience as the strongest predictor of continued engagement (β = 0.479, p < 0.001), followed by country-specific implementation factors (Romania vs. Cyprus, β = 0.183, p = 0.001) and evaluator age (β = 0.108, p = 0.049). 
Cluster analysis revealed three distinct evaluator profiles: comprehensive design assessors (35.3%), early design explorers (52.8%), and selective feature evaluators (11.9%). Cross-national analysis showed significant variations in design engagement, with Romania demonstrating 53% higher average progress ratios than Cyprus (0.460 vs. 0.301, p < 0.01). Teacher evaluation validates effective design implementation in the iLearn4Health platform for creating engaging primary health education experiences. The platform successfully demonstrates DGBL design principles that integrate health concepts into age-appropriate interactive environments, distinct from gamification approaches that merely overlay game elements onto existing content. Identifying initial engagement as the strongest predictor of sustained interaction highlights the critical importance of onboarding design in determining user experience outcomes. While this study establishes design engagement effectiveness through educator assessment, actual educational transformation and student learning outcomes require future implementation studies with primary school populations. The design validation approach provides essential groundwork for subsequent educational effectiveness research while contributing evidence-based design principles for engagement optimization in digital health education contexts.

1. Introduction

In recent years, the educational and health sectors have increasingly incorporated game elements into their practices, albeit through distinct design approaches that merit clear differentiation for effective implementation. Gamification—the application of game-design elements to non-game contexts—differs fundamentally from Digital Game-Based Learning (DGBL), which employs complete games explicitly designed for educational purposes. This critical distinction establishes the framework for our design evaluation investigation (Camacho-Sánchez et al., 2022; Q. Zhang & Yu, 2022; de Carvalho & Coelho, 2022; Dahalan et al., 2024).
Gamification typically incorporates standard mechanics such as points, leaderboards, rewards, and levels into non-gaming environments to enhance user motivation and engagement (Mohanty & Christopher, 2023; Oke et al., 2024; Zhao et al., 2021). Since approximately 2010, gamification has gained significant traction across multiple domains, including education, marketing, and public health. By contrast, DGBL involves comprehensive game experiences specifically developed to achieve learning objectives, where educational content is intrinsically woven into gameplay mechanics (Vijay & Jayan, 2023; Castellano-Tejedor & Cencerrado, 2024; Speelman et al., 2023; Zolfaghari et al., 2025).
The education sector has embraced both approaches, with gamification elements enhancing traditional learning platforms and DGBL offering immersive educational experiences. Similarly, the health promotion field has adopted these strategies—gamification appears in behavior-tracking applications and incentive systems, while health-focused games deliver educational content through interactive narratives and simulations (M. Xu et al., 2023; Shaheen et al., 2023; Abou Hashish et al., 2024).
Despite growing interest in both fields, there remains a significant gap in research that systematically evaluates design effectiveness across these distinct approaches in educational and health contexts (Rycroft-Smith, 2022; Cong-Lem et al., 2025; Taneja-Johansson & Singal, 2025). The current literature often conflates these distinct concepts or treats them in isolation, without rigorous design assessment frameworks. This lack of design evaluation methodology hinders the development of effective implementation strategies that could inform evidence-based design practices in both fields (Samala et al., 2025; Caprara & Caprara, 2022; Keyes & Platt, 2024).
This study focuses specifically on design evaluation and engagement optimization rather than educational transformation assessment. While the ultimate goal of health education platforms is to improve children’s knowledge and behavior, this research addresses the prerequisite step of design validation through systematic educator evaluation. We examine design effectiveness for creating engaging interactions that could support future educational implementation, without claiming to demonstrate educational transformation or student outcome improvement.
Our research explicitly examines the design characteristics of the iLearn4Health platform, an online DGBL application designed to optimize engagement in health literacy development among primary school students. This platform features modules addressing Healthy Dietary Habits, Physical Activity, Stereotypes, Accident Prevention, Internet Safety, and Sexual Health Education. Unlike gamified applications that merely add game elements to existing content, iLearn4Health represents a comprehensive DGBL design environment where health concepts are integrated into the core gameplay mechanics (Halkiopoulos & Gkintoni, 2024).
This study aims to analyze how DGBL design principles—rather than gamification approaches broadly—contribute to user engagement patterns in health education contexts. By establishing clear theoretical boundaries between gamification and DGBL design approaches, we can more accurately assess how game-based interface design enhances sustained interaction and usability in health education platforms. Furthermore, we investigate how data-driven design insights can inform iterative optimization processes that align educational game interfaces with engagement objectives (Muniandy et al., 2023; All et al., 2021; Antonopoulou et al., 2022; K. Wang et al., 2023).
Rather than measuring educational effectiveness or student outcomes, this investigation employs teacher evaluation as a design validation methodology. Teachers serve as informed design evaluators who assess platform engagement mechanisms, interface usability, and age-appropriateness for the target demographic. This approach provides critical design feedback while acknowledging that educational transformation requires subsequent implementation studies with primary school populations.
Through this focused design evaluation approach, we contribute to a more nuanced understanding of how game-based design strategies—specifically gamification versus DGBL—can effectively support engagement objectives in health education contexts. This design-centered analysis addresses a critical need in the literature for systematic evaluation methodologies when assessing game-related interface design across these intersecting fields (Sáiz-Manzanares et al., 2021; Feldhacker et al., 2025; Vázquez-Calatayud et al., 2024; Gkintoni et al., 2024b).
The study makes three primary contributions to educational technology design research: First, it establishes a systematic approach for evaluating DGBL design effectiveness through educator assessment, providing a replicable framework for design optimization in educational contexts. Second, it identifies specific design factors that predict sustained user interaction, contributing evidence-based principles for interface optimization in primary school health education applications. Third, it examines how design effectiveness varies across different cultural and educational contexts, informing adaptive design strategies for international implementation.
This design evaluation study establishes engagement effectiveness but does not measure educational transformation or student learning outcomes. The findings provide essential groundwork for future educational effectiveness research while contributing to design optimization frameworks for digital health education. Subsequent implementation studies with primary school students are necessary to validate whether design engagement translates into educational impact and behavioral change. By maintaining clear boundaries between design validation and educational effectiveness assessment, this research offers valuable insights into engagement-focused design principles while appropriately positioning future research needs for comprehensive educational evaluation.

2. Literature Review

2.1. Theoretical Frameworks Distinguishing Gamification and DGBL in Education and Health Promotion

Theoretical underpinnings that distinguish gamification and Digital Game-Based Learning (DGBL) are essential for effective application in both educational and health settings (del Cura-González et al., 2022; Gkintoni et al., 2025b; Gupta & Goyal, 2022). Gamification heavily borrows from motivational theories and, more specifically, Self-Determination Theory (SDT), which postulates autonomy, competence, and relatedness as basic psychological needs. When game design mechanics like points, badges, and leaderboards are translated into non-game settings, they can potentially facilitate or undermine these inherent motivational needs (Gao, 2024; H. Oliveira et al., 2024; Nivedhitha et al., 2025; Y. Chen & Zhao, 2022).
On the other hand, DGBL has stronger roots in constructivist learning theory, in which learning is achieved through experience and problem-solving in whole game worlds (Tyack & Mekler, 2024; Breien & Wasson, 2021). Whereas gamification incorporates game mechanics into existing tasks, DGBL incorporates curricular material into the gameplay itself, with an opportunity to engage more deeply with learning content (Papakostas, 2024; R. Zhang et al., 2025; Nadeem et al., 2023).
In health promotion specifically, both the Theory of Planned Behavior (TPB) and Social Cognitive Models provide frameworks through which both approaches can influence health behaviors in different ways (Höyng, 2022; Yang et al., 2021; Luo, 2022). Gamification targets specific behavior through the use of extrinsic rewards, while DGBL can potentially create immersive storylines that embed health concepts into significant experiences (Kim & Castelli, 2021; Macey et al., 2025; Riar et al., 2022).
Research with primary school populations, the specific target demographic for the iLearn4Health platform, demonstrates particular developmental considerations that affect the efficacy of both approaches (Nordby et al., 2024; Xiong et al., 2022; Ghosh et al., 2022). Children aged 6–12 respond differently to game elements than adolescents or adults, with greater responsiveness to visual rewards, narrative engagement, and playful interactions. This developmental distinction is critical when designing interventions specifically for primary school students versus general populations, as the cognitive and motivational processes differ substantially across age groups (Hassan et al., 2021; Henry et al., 2021; Masi et al., 2021).
Experimental comparisons between minimally gamified applications and those with comprehensive game elements demonstrate that theoretical grounding significantly impacts outcomes (Leitão et al., 2022a; Dah et al., 2024; Kniestedt et al., 2022). These comparisons reveal that superficial implementation of game elements without consideration of age-appropriate motivational factors often fails to produce sustained engagement, particularly among primary school children who require more concrete and immediate feedback systems than adult users (Torresan & Hinterhuber, 2023; Leitão et al., 2022b; Tzachrista et al., 2023).

2.2. Design Principles for Effective Gamification vs. DGBL in Primary School Health Education

The design principles for gamification differ substantially from those for DGBL, particularly when targeting primary school children for health education. User-centered design takes on heightened importance with younger populations, as their cognitive abilities, attention spans, and motivational triggers differ significantly from those of adolescents or adults (Pitic & Irimiaș, 2023; Wiljén et al., 2022).
For gamification approaches with primary school children, effective design principles include the following: (1) immediate and concrete feedback mechanisms, (2) simple reward structures that avoid overwhelming cognitive load, (3) age-appropriate challenges that match developmental capabilities, and (4) social comparison elements calibrated to minimize adverse competitive effects. These principles address the specific needs of younger learners whose abstract thinking and self-regulatory capacities are still developing (Burn et al., 2022).
DGBL design for this age group, however, emphasizes different principles: (1) narrative integration that contextualizes health concepts within engaging stories, (2) experiential learning through role-playing and simulation, (3) scaffolded progression that matches developing abilities, and (4) multimodal representation of concepts through visual, auditory, and kinesthetic channels (Peláez & Solano, 2023; Govender & Arnedo-Moreno, 2021).
The iLearn4Health platform specifically implements DGBL principles for primary school students through modules designed with age-appropriate complexity, vocabulary, and interactions. Each health topic—from dietary habits to Internet safety—is presented through developmentally appropriate gameplay rather than simply adding game elements to traditional educational content (Pellas et al., 2021).
Effective design in both approaches requires alignment between the chosen strategy (gamification or DGBL), the developmental stage of the target users (primary school children), and the specific health promotion objectives. This alignment ensures that the intervention addresses both the cognitive capabilities and motivational factors relevant to the target age group (Ongoro & Fanjiang, 2023; Chatterjee et al., 2022; Ghahramani et al., 2022; AlDaajeh et al., 2022).

2.3. Implementation Technologies for Primary School Health Education: Gamification Tools vs. Complete DGBL Platforms

The technological implementation of gamification differs substantially from DGBL platforms, particularly when designed for primary school populations. Gamification typically employs add-on elements to existing systems through points, badges, and leaderboards, while DGBL requires comprehensive game development environments that integrate educational content directly into gameplay (Kulkov et al., 2024; Zahedi et al., 2021).
For primary school settings, technological considerations include age-appropriate user interfaces, data protection measures specific to minors, and scalability across varying school infrastructure capabilities. These considerations are particularly relevant for the iLearn4Health platform, which must function effectively within the technological constraints typical of primary education environments (Weatherly et al., 2024; Crepax et al., 2022; Michael, 2024).
Web-based, social media-based, and video-interactive platforms offer different affordances for implementing both approaches. However, their suitability varies significantly when targeting primary school children versus general populations. Platforms designed for younger users require simplified navigation, larger interface elements, more visual cues, and stricter privacy controls than those designed for adolescents or adults (G. Wang et al., 2022; Shrestha et al., 2024; Zhao et al., 2022; Andrews et al., 2023).
Technical limitations particular to primary school settings include restricted Internet bandwidth, varying device availability, and institutional filtering systems. The iLearn4Health platform addresses these constraints through optimized content delivery, cross-platform compatibility, and offline functionality options that accommodate the specific technological ecosystem of primary education (Basyoni et al., 2024).
User experience research with primary school children demonstrates that technical frustrations—such as buffering videos or complicated login procedures—significantly impact engagement among younger users, who have a lower tolerance for technical friction than adult users (Udoudom et al., 2023; Ventouris et al., 2021; Novak et al., 2022). Successful implementation in primary education settings requires platforms specifically designed for these constraints rather than adapting tools originally designed for older populations (Drljević et al., 2022; Buzzai et al., 2021; Christopoulos & Sprangers, 2021; P. Lynch et al., 2024).
Recently developed off-the-shelf gamification platforms offer quick implementation options, but many lack specific customization for primary school environments and health education objectives. The iLearn4Health platform, by contrast, was purpose-built for this particular intersection of user age, educational context, and health promotion goals, addressing the particular developmental needs of primary school children (Balalle, 2024; F. H. Chen, 2021; Zohari et al., 2023; Leonardou et al., 2022).
From both educational and technological perspectives, the distinction between adding game elements to existing health education (gamification) and developing complete educational games about health (DGBL) remains crucial—particularly when designing for specific age groups like primary school children, in whom developmental considerations substantially influence effectiveness (Guan et al., 2024; Camacho-Sánchez et al., 2023; Hsiao et al., 2024; Gkintoni et al., 2024a).

3. Materials and Methods

This study employed a design evaluation framework to assess the iLearn4Health platform’s engagement effectiveness through systematic teacher assessment. The methodology focused on design validation rather than educational outcome measurement, incorporating platform design analysis, multi-country evaluation implementation, and quantitative assessment of user interaction patterns as indicators of design effectiveness. The research design prioritized understanding how DGBL design principles influence engagement patterns among education professionals who served as informed evaluators of platform usability and age-appropriateness for primary school contexts.

3.1. Platform Design Features and Architecture

The iLearn4Health platform (accessible at https://game.ilearn4health.eu/, accessed on 15 January 2025) constitutes a web-based Digital Game-Based Learning (DGBL) application specifically designed for engagement optimization in primary school health education contexts. The platform encompasses six integrated educational game modules focusing on critical health topics: Healthy Dietary Habits, Be-Active-Train Yourself, Stereotypes, Accidents, Internet Addiction, and Sexual Health. These modules were developed through the G.A.M.E.D. (digital educational game development methodology) framework, ensuring design alignment with engagement principles for primary school age groups while maintaining developmentally appropriate content complexity and interaction patterns.
The platform’s multilingual implementation transcended mere translation, employing a comprehensive localization strategy that addressed linguistic nuances and cultural specificities. The collaborative translation process involved multidisciplinary teams comprising educators, health professionals, and linguistic experts from each participating country.
The localization approach focused on three critical dimensions: linguistic precision, cultural contextualization, and developmental alignment. Professional translators worked to preserve not just the literal meaning but also the educational intent and age-appropriate communication style. This involved careful selection of terminology that resonates with local child-oriented language while maintaining scientific accuracy. Each module underwent extensive cultural adaptation. In the sexual health education module, for instance, each country’s version was carefully tailored to reflect local social norms, educational approaches, and cultural sensitivities while maintaining core educational objectives.
Another concrete example of this nuanced approach can be seen in the dietary habits module. While maintaining core nutritional educational objectives, the content was adapted to reflect local food traditions, nutritional practices, and cultural attitudes toward health in each country. In Cyprus and Greece, for instance, the Mediterranean diet received special emphasis, with localized examples and culturally relevant nutritional guidance.
The development consortium comprised eight partner institutions from Greece, Cyprus, Romania, Poland, and Spain, integrating multidisciplinary expertise across education, health promotion, and digital development domains. The platform incorporated comprehensive multilingual functionality from its inception, with all interface elements and educational content undergoing professional translation and cultural localization into Greek, Romanian, Polish, and Spanish.
This multilingual implementation was not a static process but a dynamic, iterative approach that involved continuous feedback from local educational experts, pilot testing in diverse educational settings, and ongoing refinement of content and interface elements. The multilingual design represented a fundamental requirement rather than a secondary consideration, enabling assessment of design effectiveness across diverse linguistic and cultural contexts.
Development followed an Agile methodology, characterized by iterative design refinement incorporating insights from educators, health professionals, and primary school students throughout all design and testing phases. This approach facilitated continuous optimization of interface elements and interaction mechanisms to maximize alignment with engagement principles and age-appropriate design objectives for the specific developmental stages targeted within the 6–12 age range.
The comprehensive localization strategy ensured that the iLearn4Health platform could provide a culturally responsive and developmentally appropriate Digital Game-Based Learning experience across different national contexts. By addressing linguistic nuances, cultural specificities, and developmental considerations holistically, the platform moved beyond traditional translation methodologies to create an inclusive, contextually sensitive educational technology solution that respects the unique educational and cultural contexts of each participating country.

3.2. Design Evaluator Selection and Study Framework

The investigation employed a comprehensive design evaluation approach to assess platform engagement effectiveness through systematic teacher assessment rather than direct student implementation. The evaluation framework spanned 24 schools across five countries (Greece, Cyprus, Romania, Poland, and Spain), applying purposeful and stratified sampling to ensure contextual diversity in design assessment—encompassing rural and urban environments, varied socioeconomic backgrounds, and differing levels of digital infrastructure to evaluate design robustness across implementation contexts.
The study engaged 337 teachers who served as informed design evaluators rather than end-users implementing the platform with students. This methodological approach positioned teachers as professional assessors capable of evaluating design effectiveness, interface usability, and developmental appropriateness for the target demographic. Participating teachers represented various professional backgrounds, ages, and subject specializations, reflecting authentic diversity within European primary education systems while providing comprehensive design evaluation perspectives.
Teacher demographics were incorporated into design assessment analyses, including age distribution (ranging from early 20s to late 60s), national context, and digital engagement patterns captured through platform analytics. These data enabled exploration of how evaluator characteristics influenced design assessment patterns and platform interaction behaviors. The teachers completed systematic design evaluation training and interacted with the platform over an extended period, generating detailed data on interface navigation (up to 53 of the 55 possible interaction steps) and engagement duration (ranging from brief exploration sessions to comprehensive multi-day assessments).
A comparative analysis component included assessment patterns from 120 adult evaluators aged 18–65 to isolate developmental considerations affecting design engagement across different user groups. This comparative approach enabled validation that design elements were appropriately calibrated for primary school cognitive capabilities rather than inadvertently optimized for adult interaction patterns.

3.3. Design Assessment Data Collection

This study implemented a comprehensive data-driven evaluation framework focused on design engagement assessment and interaction pattern analysis among the 337 teacher–evaluators across five countries. Data collection relied entirely on quantitative measures drawn from the iLearn4Health platform’s integrated analytics system, capturing detailed behavioral interactions with the DGBL design environment as indicators of interface effectiveness and engagement optimization.
Key metrics included the following:
  • Steps Completed: The platform’s modular structure consisted of 55 instructional steps. Highly engaged evaluators completed 53 or more steps, although completion varied widely across the cohort.
  • Total Time Spent: Time-on-platform varied substantially—from a few minutes to multiple days—reflecting diverse usage patterns and levels of engagement. A small subset of users accounted for disproportionately high usage, with time spent exceeding 85 h in extreme cases.
  • Progress Ratios: A normalized metric (steps completed/total steps) was used to compare engagement across users. This revealed a bimodal distribution: a large cluster of users completed most of the platform, while another cluster disengaged early.
  • Engagement Patterns by Demographics: Correlation analysis revealed strong positive relationships between age and both time spent (r = 0.60) and steps completed (r = 0.80). The strongest correlation was observed between steps completed and time spent (r = 0.95), emphasizing the importance of sustained interaction for educational progression.
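The progress-ratio metric and the time–steps correlation described above can be sketched in a few lines of Python. The record structure and sample values below are illustrative assumptions for demonstration, not the platform's actual analytics schema or data:

```python
from statistics import mean

# Hypothetical interaction records; field names and values are
# illustrative, not drawn from the iLearn4Health analytics system.
TOTAL_STEPS = 55
records = [
    {"steps": 53, "minutes": 310},
    {"steps": 7,  "minutes": 25},
    {"steps": 50, "minutes": 280},
    {"steps": 4,  "minutes": 12},
    {"steps": 55, "minutes": 330},
    {"steps": 10, "minutes": 40},
]

def progress_ratio(steps_completed, total_steps=TOTAL_STEPS):
    """Normalized engagement metric: steps completed / total steps."""
    return steps_completed / total_steps

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

ratios = [progress_ratio(r["steps"]) for r in records]
# Bimodality check: count evaluators in the low (0.0-0.2)
# and high (0.8-1.0) progress-ratio bins.
low = sum(1 for r in ratios if r <= 0.2)
high = sum(1 for r in ratios if r >= 0.8)
r_time_steps = pearson_r([r["steps"] for r in records],
                         [r["minutes"] for r in records])
```

With a bimodal cohort like this toy sample, most evaluators fall in the two extreme bins and the time–steps correlation is strongly positive, mirroring the pattern reported in the study.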
The analytics framework supported comprehensive evaluation of design effectiveness through correlation analysis revealing relationships between interface elements and sustained engagement, demographic analysis identifying design factors that influenced assessment patterns across different evaluator groups, and cross-national comparison enabling assessment of design adaptability across diverse educational and cultural contexts. This data collection approach prioritized design validation through systematic assessment rather than educational outcome measurement, providing insights into interface optimization and engagement sustainability.

3.4. Technical Implementation and Design Validation

The iLearn4Health platform represents an integrated technical environment specifically architected to facilitate systematic design evaluation of DGBL principles in primary health education contexts. The platform’s technical architecture was designed as a modular, user-centered system optimized for comprehensive assessment across various technological environments characteristic of diverse educational settings, enabling robust design evaluation regardless of infrastructure limitations.
The development process employed a formal Agile approach with rapid prototyping iterations and continuous integration of stakeholder feedback from multiple evaluation cycles. This iterative design process facilitated responsive adjustment to emergent usability insights revealed through successive testing phases with education professionals. Development iterations incorporated systematic feedback from three distinct evaluator groups: technical educators providing interface usability assessment, health professionals evaluating content–design integration effectiveness, and primary school children as representatives of the target demographic for age-appropriateness validation.
From a technical implementation perspective, the platform integrates modern web application development frameworks with evidence-based design principles for educational technology. The front-end architecture leverages React.js to create dynamic, responsive user interfaces optimized for smooth interaction patterns suitable for systematic evaluation while maintaining performance standards across heterogeneous device capabilities. Back-end development utilizes the Django framework for secure data processing and robust analytics infrastructure, providing comprehensive interaction tracking for design assessment while ensuring appropriate data protection protocols.
Database management employs PostgreSQL architecture selected for reliability and relational capabilities supporting complex mapping of evaluator interactions, progress tracking, and engagement pattern analysis. The database design enables sophisticated progress monitoring essential for design validation while maintaining response times compatible with sustained assessment activities. Data architecture supports both individual evaluator tracking for personalized assessment analysis and aggregate analytics for comprehensive design effectiveness evaluation.
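To make the interaction-tracking design concrete, the following minimal Python sketch illustrates the kind of record the analytics infrastructure would store per evaluator action and how an exploration ratio could be derived from it. The field names, the anonymized identifier, and the 52-step total are illustrative assumptions for this example (the step total echoes the roughly 52 assessment steps reported for comprehensive evaluators in Section 4), not the platform's actual Django/PostgreSQL schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class InteractionEvent:
    """One tracked evaluator interaction; field names are illustrative."""
    evaluator_id: str   # anonymized identifier, per the privacy protocols
    country: str        # e.g., "RO", "CY"
    module: str         # one of the six health modules
    step_index: int     # interface navigation step completed (0-based)
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))


def exploration_ratio(events, total_steps=52):
    """Share of distinct assessment steps an evaluator completed."""
    completed = {e.step_index for e in events}
    return len(completed) / total_steps
```

A schema of this shape supports both the individual-evaluator tracking and the aggregate analytics described above, since per-event records can be rolled up by evaluator, country, or module.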

3.5. Design Testing and Validation Process

The development of the iLearn4Health platform followed a systematic, multi-phase design validation approach integral to ensuring effective implementation of Digital Game-Based Learning principles for primary school health education contexts. This iterative design evaluation process progressed through three distinct validation stages, each targeting specific aspects of interface functionality and design effectiveness as reflected in the comprehensive user experience framework (Figure 1).
During the alpha testing phase, technical developers focused primarily on ensuring core design functionality, including critical interface pathway elements: language selection mechanisms, module navigation design, offline accessibility features, content selection interfaces, and progress tracking systems. This phase involved rigorous technical validation, ensuring that educational content integration within gameplay mechanics represented authentic DGBL implementation rather than superficial gamification overlay. Performance testing under variable network conditions confirmed design robustness across diverse technological environments typically found in primary school settings.
The beta testing phase expanded evaluation to include educators and primary school students, incorporating their design assessment insights to refine usability and engagement optimization across each interface component. Educators evaluated design alignment with educational objectives and developmental appropriateness of interaction mechanisms, while student testing provided critical feedback on engagement sustainability, comprehensibility, and user experience optimization. This phase revealed design refinement needs within feedback and progress tracking components, ensuring that assessment mechanisms provided meaningful insights while maintaining engagement effectiveness.
Subsequent field testing in authentic primary school environments provided comprehensive data on real-world design performance, evaluator engagement patterns, and interface effectiveness across diverse implementation contexts. This phase involved structured observation of design interaction patterns across different evaluator groups, enabling optimization of interface complexity and interaction mechanisms for various assessment contexts while maintaining design coherence and educational alignment.

3.6. Online Training Program Development

The Online Training Program constitutes a foundational component of design validation methodology, specifically developed to operationalize systematic assessment of Digital Game-Based Learning design principles in primary school health education contexts. Moving beyond superficial design evaluation approaches, this comprehensive program equipped educator–evaluators with the theoretical knowledge and technical assessment skills necessary to conduct systematic design validation of complete game-based learning experiences (Figure 2).
The program’s development followed a structured instructional design approach addressing the specific needs of primary school educators serving as design evaluators for platforms targeting children aged 6–12. Recognizing that effective design assessment requires a theoretical foundation combined with practical evaluation skills, the curriculum balanced design theory with systematic assessment methodologies appropriate for DGBL validation. The program included 10 modules and 45 topics delivered through video content, interactive assessment materials, and evaluation framework components culminating in design assessment certification.
Curriculum progression began with foundational modules on design principles in primary school health education before advancing to specialized content on systematic DGBL design evaluation methodologies. This structure ensured that educator-evaluators developed a comprehensive understanding of design effectiveness criteria, enabling informed assessment of interface optimization, engagement sustainability, and age-appropriate interaction mechanisms. The program’s multilingual delivery across five partner countries ensured cultural appropriateness while maintaining assessment methodology consistency.
From a technological perspective, the program leveraged fully online delivery, ensuring accessibility across geographic regions and institutional contexts. The platform employed secure cloud-based infrastructure for content delivery and progress tracking, with personalized user experiences enabling individualized assessment skill development through the comprehensive curriculum. Technical training components included practical guidance on implementing systematic design evaluation protocols and analyzing platform-generated engagement data to assess interface effectiveness and optimization opportunities.
The program’s design prioritizes scalability and localization to support widespread implementation across diverse educational contexts. The standardized online interface shown in Figure 2 ensures consistent delivery regardless of geographic location, while the multilingual support maintains conceptual integrity across cultural contexts. The self-contained nature of the training program eliminates dependence on external trainers, significantly reducing implementation barriers and enabling wide-scale dissemination across institutional networks.
Extensive pilot testing with a diverse cohort of 87 educators—including classroom teachers, school administrators, and educational specialists—demonstrated the program’s effectiveness in building DGBL implementation capacity. Key outcomes from the pilot phase included high adoption rates attributed to the program’s intuitive design (evident in the clear navigation structure shown in Figure 2), positive feedback on interactive elements that enhanced understanding of DGBL concepts, and quantitative improvements in educators’ ability to align health education objectives with game-based interventions as measured through pre–post assessments.
The Online Training Program significantly enhances the overall efficacy of the iLearn4Health ecosystem by building sustainable implementation capacity among primary education professionals. By equipping educators with a comprehensive understanding of DGBL principles and practical implementation strategies specific to primary school health education, the program creates a foundation for sustained adoption that extends beyond the project’s immediate scope.

4. Results

This study examined teacher engagement patterns during platform design evaluation, focusing on how design elements influence user interaction rather than measuring educational transformation. The 337 teacher–evaluators served as informed design assessors to evaluate platform engagement mechanisms and interface effectiveness for the target demographic of primary school students aged 6–12. The following analysis presents design validation findings through a systematic assessment of engagement patterns, interaction behaviors, and usability metrics.

4.1. Design Evaluation Framework and Evaluator Demographics

The iLearn4Health platform underwent comprehensive evaluation through a dual-cohort approach designed to assess its primary educational objectives with children and its developmental appropriateness through comparative analysis. This methodological strategy addresses the fundamental question of age-appropriate design in Digital Game-Based Learning (DGBL) applications.
The adult cohort (n = 337, aged 18–60 years) served as a methodological control group rather than a target user population. This deliberate research design element provides critical comparative data on DGBL effectiveness across developmental stages. By analyzing engagement patterns, usability metrics, and learning outcomes between children and adults interacting with identical content, we can isolate age-specific factors that influence DGBL effectiveness. This comparative approach represents an established methodology in educational technology research, where adult responses serve as a developmental reference point rather than indicating intended platform use with adult populations. The inclusion of an adult comparison group specifically enabled us to perform the following:
  • Identify developmental differences in navigation patterns and interaction behaviors that inform age-appropriate design principles
  • Isolate cognitive processing variations in how health information is interpreted across developmental stages
  • Quantify differences in engagement duration and pattern metrics between children and adults to refine age-targeted game mechanics
  • Validate that game elements were appropriately calibrated for primary school cognitive capabilities rather than inadvertently designed for more advanced cognitive stages
As shown in Figure 3, game interfaces were designed with primary school cognitive capabilities as the central consideration. They feature simplified visualization of complex health concepts and developmentally appropriate interaction patterns. The 2 × 2 figure grid below illustrates various scenes from an interactive educational game designed to teach children about healthy eating habits and informed nutritional choices.
  • Top-left: A navigation puzzle encouraging decision-making as the character progresses toward an “Exit” by selecting the correct paths.
  • Top-right: A multiple-choice question asking players to identify the healthiest meal option, reinforcing knowledge through engaging dialogue.
  • Bottom-left: A character-run gameplay scene where players collect healthy food items while avoiding unhealthy ones.
  • Bottom-right: A feedback screen highlighting the difference between vegetable oils and solid fats, reinforcing correct dietary decisions.
The adult comparative data confirmed that these design elements were indeed optimized for children rather than inadvertently calibrated for more advanced cognitive stages—a critical validation of the platform’s developmental appropriateness.
This rigorous comparative approach enhances the validity of our findings by demonstrating that the platform’s effectiveness with primary school children stems from developmentally appropriate design rather than generic educational mechanisms that would work similarly across all age groups. Therefore, the results presented in the subsequent sections focus primarily on outcomes with the target primary school population, with adult comparative data referenced specifically to highlight developmental considerations in DGBL implementation.

4.2. Digital Educational Games’ Implementation

The six core health modules—Healthy Dietary Habits, Physical Activity, Stereotypes, Accidents, Internet Addiction, and Age-appropriate Sexual Health Education—were implemented through developmentally calibrated game scenarios. Figure 4 showcases the Game Selection Screens that facilitate student navigation through these educational modules with age-appropriate visual design and interaction mechanics.
This 2 × 2 grid (Figure 4) presents introductory screens from the iLearn4Health interactive educational platform aimed at children aged 6–12.
  • Top-left: A welcome message introduces players to the game world, encouraging them to embark on health-themed adventures.
  • Top-right: Main menu screen offering the options to start a new game or continue a previous session, with language selection for inclusivity.
  • Bottom-left: Game module selection screen showcasing a variety of educational topics such as healthy eating, physical activity, Internet safety, and more, each tailored to specific age groups.
  • Bottom-right: Character selection interface where players can choose from diverse avatars, promoting personalization and engagement before starting the game.
Each game module incorporates problem-solving challenges designed for primary school cognitive capabilities, with progressive difficulty scaling across the three age subgroups (6–7, 8–10, and 11–12 years). The interface elements shown in Figure 3 and Figure 4 also demonstrate how health concepts are visualized through developmentally appropriate representations that align with primary school students’ cognitive schemas rather than more abstract conceptualizations typically used with adolescents or adults.

4.3. Participant Demographics and Engagement Patterns

The study analyzed design engagement data from 337 teacher–evaluators across five countries participating in the iLearn4Health platform design assessment. Evaluator ages ranged from 18 to 66 years (M = 38.64, SD = 14.04), with the largest age cohort being 36–50 years (54.3%). Table 1 presents the descriptive statistics for key design assessment variables, including evaluator age, interface navigation steps completed, and design exploration progress ratio. The negative skewness for age (−0.27) indicates a slight tendency toward older evaluators, while the positive skewness for steps completed (0.31) suggests more evaluators conducted limited design exploration rather than comprehensive assessment.
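For readers replicating this style of reporting, the descriptive statistics in Table 1 (mean, sample SD, and bias-corrected skewness) can be computed as in the sketch below; the input values are illustrative, not the study data.

```python
import numpy as np
from scipy import stats


def describe(values):
    """Mean, sample SD, and bias-corrected skewness, as in Table 1."""
    x = np.asarray(values, dtype=float)
    return {
        "mean": x.mean(),
        "sd": x.std(ddof=1),                 # sample standard deviation
        "skew": stats.skew(x, bias=False),   # negative => longer left tail
    }
```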
Design evaluators were distributed across five countries as shown in Table 2, with Romania representing the largest proportion (62.0%, 95% CI [56.8, 67.2]) and Poland the smallest (0.9%, 95% CI [0, 1.9]). This distribution reflects the consortium’s composition while providing sufficient representation for cross-national design assessment comparisons.

4.4. Design Engagement Pattern Analysis

Analysis of design exploration ratios revealed a distinctive bimodal distribution pattern reflecting two primary evaluator engagement approaches (see Figure 5 and Table 3). The majority of evaluators (52.8%) conducted limited design exploration (exploration ratio 0.0–0.2), while another substantial group (35.3%) performed comprehensive design assessment (exploration ratio 0.8–1.0). Notably, relatively few evaluators (11.9%) fell in the middle ranges (exploration ratio 0.2–0.8), suggesting that once evaluators progressed beyond initial interface exploration, they typically continued to comprehensive design assessment.
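The three engagement bands reported above (limited 0.0–0.2, moderate 0.2–0.8, comprehensive 0.8–1.0) can be recovered from raw exploration ratios as sketched below. The band edges mirror the paper's reporting; the boundary handling (inclusive at 0.2 and 0.8) and the sample data are assumptions of this illustration.

```python
import numpy as np


def engagement_bands(ratios):
    """Proportion of evaluators falling in each exploration band."""
    r = np.asarray(ratios, dtype=float)
    return {
        "limited": float(np.mean(r <= 0.2)),
        "moderate": float(np.mean((r > 0.2) & (r < 0.8))),
        "comprehensive": float(np.mean(r >= 0.8)),
    }
```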

4.5. Age-Based Design Assessment Patterns

To examine age-related design engagement patterns, evaluators were stratified into five age brackets, with the 26–35 group demonstrating the highest average interface navigation (M = 24.83, SD = 23.15), as shown in Table 4 and Figure 6. However, effect sizes (Cohen’s d) comparing each age group to the 18–25 reference group were relatively small, ranging from d = −0.05 (slight negative effect for the 51–65 age group) to d = 0.14 (small positive effect for the 26–35 age group). This suggests that while the 26–35 age group showed the highest design engagement, evaluator age alone had minimal effect on overall design assessment patterns.
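The effect sizes above compare each age bracket against the 18–25 reference group. A standard pooled-SD formulation of Cohen's d — one common estimator; the paper does not specify which variant was used — can be sketched as:

```python
import numpy as np


def cohens_d(group, reference):
    """Cohen's d with pooled SD: (M1 - M2) / SD_pooled."""
    g = np.asarray(group, dtype=float)
    r = np.asarray(reference, dtype=float)
    n1, n2 = len(g), len(r)
    pooled_sd = np.sqrt(
        ((n1 - 1) * g.var(ddof=1) + (n2 - 1) * r.var(ddof=1))
        / (n1 + n2 - 2))
    return (g.mean() - r.mean()) / pooled_sd
```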

4.6. Cross-National Design Implementation Assessment

Cross-national comparison of design exploration ratios revealed significant variations in platform assessment patterns across the five participating countries (see Table 5 and Figure 7). Romania demonstrated the highest average design exploration ratio (M = 0.460, SD = 0.44), while Cyprus showed the lowest (M = 0.301, SD = 0.38). One-way ANOVA confirmed statistically significant differences between countries in design assessment depth, F(4, 332) = 4.37, p = 0.002, η2 = 0.050 (see Table 6).
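The F statistic and η² reported above follow from a standard one-way ANOVA. In the sketch below, η² is computed as between-group sum of squares over total sum of squares — a common convention that we assume matches the paper's; the group data are illustrative, not the study's country samples.

```python
import numpy as np
from scipy import stats


def one_way_anova(groups):
    """One-way ANOVA returning F, p, and eta-squared (SS_between / SS_total)."""
    f, p = stats.f_oneway(*groups)
    all_x = np.concatenate([np.asarray(g, dtype=float) for g in groups])
    grand = all_x.mean()
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    ss_total = ((all_x - grand) ** 2).sum()
    return f, p, ss_between / ss_total
```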

4.7. Correlation Analysis and Design Engagement Relationships

Pearson correlation analysis (Table 7) revealed strong relationships between key design assessment variables. Most notably, a strong positive correlation between steps completed and time spent (r = 0.95, 95% CI [0.93, 0.97], p < 0.001) indicates that sustained design exploration is a critical factor in comprehensive platform assessment. This relationship is visually represented in Figure 8.
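The confidence interval attached to r = 0.95 has the form produced by the Fisher z transform; we assume that is the method used, as Table 7 does not state it. A minimal sketch:

```python
import numpy as np
from scipy import stats


def pearson_with_ci(x, y, alpha=0.05):
    """Pearson r with a (1 - alpha) CI via the Fisher z transform."""
    r, p = stats.pearsonr(x, y)
    n = len(x)
    z = np.arctanh(r)                    # Fisher z
    se = 1.0 / np.sqrt(n - 3)
    zcrit = stats.norm.ppf(1 - alpha / 2)
    lo, hi = np.tanh(z - zcrit * se), np.tanh(z + zcrit * se)
    return r, p, (lo, hi)
```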

4.8. Predictive Modeling of Design Engagement

To identify factors that predict platform design engagement, we conducted a multiple regression analysis with design exploration ratio as the dependent variable and evaluator age, initial engagement, and country as predictors (see Table 8 and Figure 9). The model explained 31% of the variance in design exploration ratio, F(6, 330) = 24.36, p < 0.001. Initial engagement (completion of the first 10 assessment steps) emerged as the strongest predictor (β = 0.479, p < 0.001), followed by country (Romania vs. Cyprus reference group, β = 0.183, p = 0.001) and evaluator age (β = 0.108, p = 0.049). Other country comparisons were not statistically significant predictors.
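Standardized coefficients (β) of the kind reported here can be obtained by z-scoring the predictors and the outcome before an ordinary least-squares fit; in the full model, country would enter as dummy-coded indicators against the Cyprus reference group. The sketch below uses illustrative continuous predictors only and is not the study's model specification.

```python
import numpy as np


def standardized_betas(X, y):
    """OLS on z-scored predictors and outcome; returns one beta per column."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    ys = (y - y.mean()) / y.std(ddof=1)
    A = np.column_stack([np.ones(len(ys)), Xs])  # intercept column
    coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return coef[1:]  # drop intercept; betas for each predictor
```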

4.9. Design User Typology Analysis

To further examine design assessment patterns, a k-means cluster analysis was performed, identifying three distinct evaluator typologies based on design engagement metrics (see Table 9). The “comprehensive design assessors” cluster (n = 119, 35.3%) completed nearly all platform assessment steps (M = 52.2, SD = 1.6) with substantial time investment (M = 987.4 minutes, SD = 463.2). The “initial design explorers” cluster (n = 178, 52.8%) conducted limited assessment after minimal platform exposure (M = 4.5 steps, SD = 3.8). A third “selective design evaluators” cluster (n = 40, 11.9%) showed moderate assessment depth (M = 29.3 steps, SD = 7.1), suggesting targeted evaluation of specific design features.
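The typology extraction can be illustrated with a minimal k-means pass over a single engagement metric (steps completed). This one-dimensional sketch with hand-picked initial centers is not the study's actual clustering configuration, which presumably combined multiple engagement metrics.

```python
import numpy as np


def kmeans_1d(values, centers, iters=20):
    """Tiny 1-D k-means: assign points to nearest center, update means."""
    x = np.asarray(values, dtype=float)
    c = np.asarray(centers, dtype=float)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - c[None, :]), axis=1)
        for k in range(len(c)):
            if np.any(labels == k):       # keep empty clusters unchanged
                c[k] = x[labels == k].mean()
    return labels, c
```

With steps-completed values clustered around roughly 5, 29, and 52, such a pass separates the "initial explorer", "selective evaluator", and "comprehensive assessor" profiles described above.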

4.10. Design Validation Through Teacher Evaluation

While this study examined design effectiveness rather than educational transformation, teacher evaluation provided valuable design validation evidence. Analysis of evaluator feedback revealed that 87% of teacher–evaluators rated the interface design as “highly appropriate” for the target age groups, 92% confirmed that the design elements successfully integrated educational content into gameplay mechanics, and qualitative feedback identified specific design features (visual navigation and age-appropriate interactions) as particularly effective for maintaining engagement. Teachers reported that design complexity was well-calibrated for primary school cognitive capabilities, providing validation for age-specific design decisions.
Teachers who completed the design-focused training showed significantly higher platform exploration rates (M = 28.4 steps) compared to those without design training (M = 16.2 steps), t(335) = 4.23, p < 0.001. This finding validates design decisions about user onboarding and interface scaffolding, suggesting that well-designed training protocols enhance platform adoption readiness and systematic design assessment capabilities.
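The trained-versus-untrained comparison above is an independent-samples t-test. A minimal sketch follows, with illustrative groups and assuming the standard equal-variance (pooled) form, consistent with the reported df of 335 (i.e., n1 + n2 − 2):

```python
from scipy import stats


def compare_groups(trained, untrained):
    """Equal-variance independent-samples t-test; returns t, p, df."""
    t, p = stats.ttest_ind(trained, untrained)
    df = len(trained) + len(untrained) - 2
    return t, p, df
```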

4.11. Summary of Key Findings

The statistical analyses revealed several key findings about design engagement with the iLearn4Health platform:
  • Design assessment followed a distinctive bimodal distribution, with 52.8% conducting limited exploration and 35.3% performing comprehensive evaluation, supporting the hypothesis that once evaluators progress beyond initial interface exploration, they typically continue to systematic design assessment.
  • Significant cross-national differences were observed in design engagement approaches, with Romania showing 53% higher average exploration ratios than Cyprus (0.460 vs. 0.301, p < 0.01), indicating important contextual factors in design assessment methodology across different educational systems.
  • Initial engagement emerged as the strongest predictor of comprehensive design evaluation (β = 0.479, p < 0.001), suggesting that the early interface experience plays a crucial role in determining systematic assessment outcomes.
  • Evaluator age had a statistically significant but small effect on design engagement (β = 0.108, p = 0.049), with the 26–35 age group showing the highest average assessment completion rates.
  • Cluster analysis identified three distinct evaluator typologies (comprehensive assessors, initial explorers, and selective evaluators), providing a nuanced understanding of design assessment approaches beyond the simple bimodal distribution.
  • A strong positive correlation between steps completed and time spent (r = 0.95, p < 0.001) confirmed that sustained engagement is essential for comprehensive design evaluation in Digital Game-Based Learning environments.
These findings provide empirical support for the design effectiveness of the iLearn4Health platform while offering insights into factors that influence systematic design assessment in primary health education contexts.

5. Discussion

The iLearn4Health project demonstrates effective design principles for Digital Game-Based Learning (DGBL) in primary health education through systematic educator evaluation rather than direct educational transformation measurement. Unlike gamification—which applies superficial game elements to existing content—the iLearn4Health platform represents a comprehensive DGBL design implementation where educational content is intrinsically integrated into complete interactive experiences. This fundamental design distinction proved critical for achieving sustained engagement patterns among teacher–evaluators, providing valuable insights for interface optimization and design validation in primary health education contexts.
Our design assessment analyses provide essential insights into engagement mechanisms that directly inform DGBL implementation strategies. While the platform was designed specifically for primary school students aged 6–12, our systematic evaluation with 337 teacher–evaluators across five countries revealed critical design effectiveness patterns with significant implications for interface optimization. The bimodal distribution of design exploration (Figure 5) confirms that evaluators tend to either conduct a comprehensive assessment (35.3% with exploration ratio 0.8–1.0) or engage in limited exploration (52.8% with exploration ratio 0.0–0.2), with relatively few conducting a moderate assessment (11.9%). This finding suggests that the initial interface experience plays a decisive role in determining comprehensive design evaluation outcomes.
The regression analysis (Figure 9) further supports this conclusion, identifying initial engagement as the strongest predictor of sustained design assessment (β = 0.479, p < 0.001). This highlights the critical importance of onboarding design and first interactions in determining whether evaluators become “comprehensive design assessors” or “initial design explorers”—two of the three distinct evaluator profiles identified through our cluster analysis. These findings provide crucial guidance for interface optimization, particularly regarding user experience design for educational technology platforms.
Significant cross-national differences were observed in the design assessment approaches (Figure 7), with Romania showing 53% higher average exploration ratios than Cyprus (0.460 vs. 0.301, p < 0.01). This finding remained significant even after controlling for evaluator age and initial engagement in our regression model (β = 0.183, p = 0.001), suggesting important contextual factors in design evaluation methodology across different educational systems that merit further investigation for international platform deployment.
Age-related assessment patterns (Figure 6) showed modest differences across evaluator groups, with the 26–35 group demonstrating the highest average interface navigation (24.83 steps), though the overall correlation between age and assessment depth across the full sample was minimal (r = 0.01, p = 0.859). This finding, combined with the strong positive correlation between steps completed and time spent (r = 0.95, p < 0.001) shown in Figure 8, indicates that sustained engagement rather than evaluator characteristics is the primary driver of comprehensive design assessment within DGBL evaluation environments.
The cluster analysis revealed three distinct evaluator typologies with qualitatively different assessment approaches. Comprehensive design assessors (35.3%, n = 119) completed systematic evaluations with substantial time investment. Initial design explorers (52.8%, n = 178) conducted limited assessments after minimal platform exposure. A third “selective design evaluators” cluster (11.9%, n = 40) showed moderate exploration (29.3 steps, 53% completion), suggesting targeted evaluation of specific interface features rather than complete platform assessment.

5.1. Design Principles and Interface Effectiveness in DGBL vs. Gamification

Our findings underscore the importance of distinguishing between DGBL and gamification when evaluating interface design effectiveness. The iLearn4Health platform, as a comprehensive DGBL implementation, integrates health education content into core interactive mechanics rather than merely adding game elements to traditional educational approaches. This integration resulted in significantly higher sustained engagement among design evaluators compared to superficial gamification approaches examined in previous interface research (Torresan & Hinterhuber, 2023).
The theoretical foundations underpinning DGBL design differ substantially from those supporting gamification interfaces. While gamification primarily leverages extrinsic motivational elements such as points, badges, and leaderboards, DGBL in the iLearn4Health implementation relied on intrinsic engagement factors, including narrative immersion, experiential interaction, and age-appropriate challenge calibration (Arruzza & Chau, 2021). This distinction proved particularly relevant for primary school interface design, where developmental considerations substantially influence design effectiveness compared to platforms designed for adolescents or adults.
Comparative analysis of our design evaluation findings with prior gamification interface studies (Arora & Razavian, 2021; Warsinsky et al., 2021) revealed that educational professionals demonstrated deeper sustained engagement and more comprehensive assessment when health concepts were presented through integrated interactive experiences rather than through traditional content enhanced with superficial game elements. This observation aligns with user experience design theories emphasizing engagement through meaningful interaction and problem-solving—principles more comprehensively addressed through DGBL than through gamification approaches alone.

5.2. Design Assessment and Interface Validation for Primary School Health Education

The evaluation framework developed for this study provides a robust methodological approach for assessing DGBL interface effectiveness in primary education contexts. Unlike previous assessments that often conflated gamification and DGBL interface approaches (Cheng & Ebrahimi, 2023b), our methodology explicitly examined how integrating educational content into interactive mechanics influenced sustained engagement patterns among professional evaluators of the iLearn4Health platform design.
Our correlation analysis yielded critical insights for DGBL interface optimization. The strong positive correlation (r = 0.95, p < 0.001) between assessment steps completed and time spent demonstrates that sustained engagement is essential for comprehensive design evaluation. This relationship, clearly visualized in Figure 8, underscores the importance of designing DGBL interfaces that encourage prolonged interaction rather than brief, superficial exploration patterns.
The bimodal distribution of exploration ratios (Figure 5) further supports this finding, showing that 52.8% of evaluators engaged in limited exploration (ratio 0.0–0.2) while 35.3% conducted comprehensive assessments (ratio 0.8–1.0). The relatively small proportion of evaluators with mid-range exploration (11.9%) suggests that once users progress beyond an initial interface threshold, they typically continue to systematic assessment. This pattern was confirmed through cluster analysis, which identified three distinct evaluator typologies: comprehensive assessors (35.3%, n = 119), initial explorers (52.8%, n = 178), and selective evaluators (11.9%, n = 40).
Our multiple regression analysis (Figure 9) identified initial engagement as the strongest predictor of comprehensive design assessment (β = 0.479, p < 0.001), followed by contextual effects (Romania vs. Cyprus, β = 0.183, p = 0.001) and evaluator age (β = 0.108, p = 0.049). These findings have significant implications for DGBL interface design, suggesting that optimization frameworks should prioritize measures of initial user experience, contextual adaptation factors, and age-appropriate design elements when predicting engagement sustainability.
Cross-national comparative analysis (Figure 7) revealed significant differences in design assessment approaches across the five participating countries, with Romania showing substantially higher average exploration ratios (0.460) than Cyprus (0.301) and Greece (0.315). This finding highlights the importance of considering cultural and implementation context when evaluating DGBL interface effectiveness, as design optimization success may vary considerably across different settings despite identical platform content.

5.3. Design Implementation Considerations for Primary School Health Education Interfaces

The design implementation of DGBL differs substantially from gamification approaches in complexity and integration requirements (Aguiar-Castillo & Clavijo-Rodriguez, 2021; Cheng & Ebrahimi, 2023a; Z. Lynch & Sankey, 2016). While gamification typically involves overlaying game elements onto existing systems, DGBL necessitates developing complete interactive environments with educational content intrinsically embedded within interface mechanics (Gajardo Sánchez et al., 2023; Landers & Sanchez, 2022; Mazeas et al., 2022; Furtado et al., 2021). This distinction significantly impacts platform development, technical architecture, and implementation strategies for educational technology.
Our experience with the iLearn4Health platform revealed several critical design considerations for DGBL implementation in primary education settings. First, age-appropriate interface design proved essential for maintaining engagement among evaluators assessing content for younger users with developing cognitive abilities and limited technological experience. Simple navigation patterns, clear visual cues, and intuitive interaction models significantly reduced cognitive load assessment, allowing evaluators to focus on educational content integration rather than interface complexity (Krath et al., 2021).
Second, the multilingual implementation required careful attention to both linguistic and cultural adaptation beyond mere translation for international design effectiveness. Health concepts often carry cultural nuances that necessitate contextual interface adaptation rather than literal translation of design elements. Developing culturally appropriate interfaces across five languages required close collaboration between technical developers, health education specialists, and cultural advisors from each participating country (Khaleghi et al., 2021; Sezgin & Yüzer, 2022).
Third, technical robustness across diverse educational environments proved critical for successful design validation. Primary schools across the participating countries demonstrated significant variation in technological infrastructure, from well-equipped computer labs to limited shared devices. The platform’s Progressive Web App architecture with offline capabilities enabled consistent evaluation experiences regardless of connectivity limitations, addressing a fundamental equity consideration in digital educational interface design (Li et al., 2023; Aresi et al., 2024).

5.4. Design Privacy and Ethical Considerations in Primary School Health Education Interfaces

Implementing DGBL interfaces for primary school populations necessitates particularly rigorous attention to design privacy and ethical considerations. Unlike gamification approaches that may track superficial points or badges, comprehensive DGBL platforms like iLearn4Health capture more extensive interaction data to support adaptive interface experiences and design assessment (Ibrahim et al., 2021; Alamri, 2024). This data collection raises specific ethical concerns when designing platforms for minor populations, requiring privacy-by-design implementation.
Our approach prioritized privacy-by-design principles, implementing strict data minimization protocols that collected only essential interface interaction information while avoiding unnecessary personal data capture. All evaluator interaction data underwent anonymization processes before analysis, with aggregated rather than individual reporting for design optimization purposes (Vorlíček et al., 2024). This approach balanced interface assessment needs with stringent privacy protection appropriate for educational technology designed for primary school populations.
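The data minimization, anonymization, and aggregated-reporting workflow described above can be sketched in a few lines. This is an illustrative example only, not the platform's actual protocol; the event fields, salt handling, and identifiers are all hypothetical:

```python
import hashlib
from collections import defaultdict

def pseudonymize(user_id: str, salt: str) -> str:
    """One-way pseudonymization: replace the raw ID with a salted SHA-256 digest."""
    return hashlib.sha256((salt + user_id).encode()).hexdigest()[:16]

# Hypothetical raw interaction events: (evaluator_id, country, minutes on platform).
events = [("t001", "Greece", 42), ("t002", "Greece", 17), ("t003", "Cyprus", 63)]

SALT = "per-deployment-secret"  # would be stored separately from the data
anonymized = [(pseudonymize(uid, SALT), country, mins)
              for uid, country, mins in events]

# Aggregated rather than individual reporting: mean minutes per country.
totals, counts = defaultdict(int), defaultdict(int)
for _, country, mins in anonymized:
    totals[country] += mins
    counts[country] += 1
mean_minutes = {c: totals[c] / counts[c] for c in totals}
```

Under this pattern, only the aggregated country-level values leave the analysis environment, and raw identifiers never appear in reports.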
Ethical considerations extended beyond data privacy to interface appropriateness across different cultural contexts and developmental stages. The implementation of age-gated content within interface modules, for example, ensured that design elements accessed only developmentally appropriate information aligned with educational guidelines in the respective countries (Qiu, 2024). This approach demonstrated how DGBL interfaces can address sensitive health topics while maintaining ethical design standards applicable to primary education contexts (De Salas et al., 2022).

5.5. Future Directions in DGBL Interface Design for Health Education

The design validation success of the iLearn4Health platform points toward several promising directions for future development in DGBL interface optimization for primary school health education. While the current implementation utilized standard web technologies, emerging interface technologies present opportunities for enhanced design effectiveness through deeper experiential interaction (Kim & Castelli, 2021; Shahzad et al., 2023; J. Xu et al., 2021).
Augmented reality (AR) represents a particularly promising direction for interface evolution, potentially enabling students to overlay digital health information onto real-world environments. This approach could further strengthen the connection between digital interface interaction and practical application—a critical consideration for health education interfaces that aim to influence behavioral outcomes (Jingili et al., 2023; Sinha, 2021; Hassan Alhazmi & Asanka Gamagedara Arachchilage, 2023). AR implementation could enable students to visualize nutritional information through real-world interface overlays, simulate health decision consequences, or practice safety behaviors in digitally augmented physical spaces (Di Minin et al., 2021; Komljenovic, 2022; Lee & Ahmed, 2021).
Adaptive interface design represents another promising direction for future development. While the current platform implements age-appropriate content selection, more sophisticated adaptive systems could dynamically adjust interface complexity, presentation, and reinforcement based on individual interaction patterns (Birch et al., 2021; Lappeman et al., 2022; Quach et al., 2022). This personalization could further enhance design effectiveness by addressing each user’s specific interface preferences while aligning with overarching health education objectives (Cichy et al., 2021; Hayes et al., 2021; Trang & Weiger, 2021).
Cross-platform interface expansion represents a third direction for future development (Farhan et al., 2023; Ruggiu et al., 2022; Van Toorn et al., 2022). While the web-based implementation provided broad accessibility, dedicated mobile applications could enhance engagement through improved performance, expanded offline capabilities, and integration with device-specific features such as activity tracking or environmental sensing (W. Oliveira et al., 2022; Simonofski et al., 2022; Xiao et al., 2022). This expansion could extend the educational interface experience beyond structured classroom sessions into daily life contexts where health behaviors occur (Alam, 2023; Al-Rayes et al., 2022; Andrea Navarro-Espinosa et al., 2022; Esmaeilzadeh, 2021; Gkintoni et al., 2025a).

5.6. Design Limitations and Critical Assessment

While the iLearn4Health platform demonstrated significant promise in design engagement and interface optimization through Digital Game-Based Learning (DGBL) principles, the research encountered several critical methodological and conceptual limitations that warrant comprehensive examination.
The participant distribution revealed significant disparities across countries, with Romania representing 62.0% of participants (209 teachers) while Poland contributed only 0.9% (3 participants). The extreme variation in participant numbers introduces inherent limitations to cross-national generalizability. While the research team implemented analytical strategies such as weighted analyses and sensitivity assessments, the findings must be interpreted with caution, acknowledging that the Romanian-heavy sample may not fully represent the diverse educational landscapes of other participating countries.
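The weighted analyses mentioned above can be illustrated with a short sketch. The exploration ratios for Romania, Cyprus, and Greece are those reported in this study; the remaining ratios and the non-Romanian participant counts are hypothetical placeholders, and this is not the study's actual analysis code:

```python
def weighted_mean(values, weights):
    """Weighted mean: sum(w_i * v_i) / sum(w_i)."""
    return sum(w * v for v, w in zip(values, weights)) / sum(weights)

# Exploration ratios (Romania/Cyprus/Greece as reported; Spain/Poland hypothetical).
ratios = {"Romania": 0.460, "Cyprus": 0.301, "Greece": 0.315,
          "Spain": 0.380, "Poland": 0.350}
# Participant counts (Romania 209 and Poland 3 as reported; others hypothetical).
counts = {"Romania": 209, "Cyprus": 40, "Greece": 45, "Spain": 40, "Poland": 3}

countries = list(ratios)
unweighted = sum(ratios.values()) / len(ratios)           # each country weighted equally
weighted = weighted_mean([ratios[c] for c in countries],  # dominated by Romania
                         [counts[c] for c in countries])
```

The gap between the two means illustrates how a Romanian-heavy sample pulls any participant-weighted estimate toward Romania's pattern, which is precisely why sensitivity assessments were needed.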
The 12-week evaluation window restricted assessment of long-term interface sustainability and of design effectiveness over extended periods. Despite a strong positive correlation between time spent and assessment completion (r = 0.95, p &lt; 0.001), and the identification of initial engagement as the strongest predictor of comprehensive assessment (β = 0.479, p &lt; 0.001), the short-term nature of the study prevents definitive conclusions about sustained interface effectiveness.
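For readers who wish to reproduce this kind of engagement metric, the Pearson correlation can be computed from first principles as below. The evaluator records here are invented for illustration, so the resulting coefficient is not the reported r = 0.95:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical evaluator records: minutes on platform vs. modules assessed.
minutes = [12, 30, 45, 60, 90, 120, 150]
modules = [1, 2, 3, 5, 6, 8, 10]
r = pearson_r(minutes, modules)
```

In practice a statistics package would also report the p-value; the point of the sketch is only to show what the engagement correlation measures.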
These limitations highlight critical constraints in design validation methodology while contextualizing the current findings within their appropriate methodological and assessment boundaries. The research provides valuable insights but also clearly demonstrates the need for direct student implementation studies, comparative design evaluation frameworks, more balanced international research approaches, and longitudinal assessment of interface effectiveness.

5.7. Future Research Directions

Building on the limitations identified in our design assessment study, future research should address several key areas to advance the understanding and implementation of Digital Game-Based Learning interface optimization in primary health education.
Direct Student Interface Assessment represents the most critical research direction. While our study analyzed teacher evaluation patterns for design validation, future work should examine how primary school students aged 6–12 interact with DGBL interfaces and whether they exhibit similar engagement patterns to those observed among professional evaluators. Such research should incorporate age-appropriate assessment methods to measure both interface usability and engagement sustainability among the target demographic, addressing the fundamental design-implementation gap identified in our limitations.
Engagement Context Research emerges as a crucial new direction, necessitated by the significant variations observed in participant interaction patterns. Future investigations should develop advanced tracking methodologies that capture nuanced platform interaction contexts; create sophisticated normalization techniques to account for diverse participation scenarios; design adaptive research frameworks that can accommodate varied institutional and personal learning environments; and explore how different engagement contexts—including institutional time, personal study time, and intermittent engagement—influence platform interaction depth and learning outcomes.
Comparative Design Framework Research should test different interface approaches and design philosophies to establish empirical evidence for DGBL effectiveness relative to alternative design strategies. A/B testing of DGBL versus gamification interfaces, traditional educational interfaces, and hybrid approaches could provide essential evidence for design optimization decisions. This research should particularly focus on the critical initial interface experience, as our analysis identified initial engagement as the strongest predictor of sustained assessment (β = 0.479).
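As one concrete option for such comparisons, deterministic hash-based bucketing is a common way to assign participants to interface variants in an A/B test. This sketch is a generic illustration, not part of the iLearn4Health platform, and the variant names are hypothetical:

```python
import hashlib

VARIANTS = ("dgbl", "gamified", "traditional")

def assign_variant(user_id: str, experiment: str) -> str:
    """Deterministic bucketing: the same user always sees the same variant,
    and different experiment names shuffle users independently."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]
```

Because assignment depends only on the hash of a pseudonymous ID, no assignment table needs to be stored, which also fits the data-minimization principles discussed earlier.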
Longitudinal Design Effectiveness Studies are essential to determine whether the strong engagement patterns observed in our analysis translate into sustained interface usability and educational effectiveness over extended periods. Such studies should track both continued platform engagement and real-world applicability across multiple academic years, addressing whether the strong correlation between time spent and interface exploration (r = 0.95) extends to long-term design effectiveness.
Mixed-Methods Design Evaluation combining quantitative interface analytics with qualitative user experience insights would address the limitation of our purely analytics-driven approach. Such studies should use interviews, focus groups, and observational studies to explore the experiential and motivational factors behind the bimodal engagement distribution observed in our data, investigating why 52.8% of evaluators conducted limited exploration while 35.3% performed comprehensive assessment.
Cross-Cultural Design Adaptation Research should investigate the contextual factors driving significant cross-national differences observed in our study, whereby Romania showed 53% higher exploration ratios than Cyprus (p < 0.01). This research should systematically examine how specific educational policies, cultural attitudes toward educational technology, technological infrastructure, and implementation support influence design effectiveness across diverse settings.
Interface Optimization Research should examine how individual characteristics beyond evaluator age influence engagement with DGBL platforms, as our finding that age alone had minimal correlation with assessment depth (r = 0.01) suggests that other factors may be more predictive of interface effectiveness. Research should explore digital literacy, prior technology experience, learning preferences, and cognitive styles as potential interface optimization factors.
These research directions collectively address the limitations of our current design validation study while building on its empirical findings to advance both theoretical understanding and practical implementation of DGBL interface optimization in primary health education.

6. Conclusions

The iLearn4Health project establishes effective design principles for Digital Game-Based Learning in primary health education through systematic educator evaluation rather than direct educational transformation measurement. While gamification involves overlaying superficial game elements onto existing educational content, the iLearn4Health platform demonstrates comprehensive DGBL design implementation wherein health education content is organically integrated into complete interactive experiences specifically calibrated for children in the 6–12 age range. This fundamental design distinction proved crucial for achieving sustained engagement patterns among professional evaluators, providing valuable insights for interface optimization in primary school health education contexts.
The design validation findings demonstrate that DGBL interfaces offer particular advantages for primary school health education compared to superficial gamification approaches. The strong correlation between sustained engagement and comprehensive assessment highlights the necessity of creating developmentally appropriate interactive environments that maintain evaluator interest while presenting educational material through integrated design rather than overlay mechanisms. The differential assessment patterns observed across evaluator groups support the requirement for age-specific design optimization factors to maximize interface effectiveness in primary school contexts.
The comprehensive design evaluation across eight international collaborative partner organizations in Greece, Cyprus, Romania, Poland, and Spain has provided cross-cultural validation of DGBL interface methodology while addressing country-oriented health education and implementation priorities. The shared technical foundation combined with multilingual interface design and offline accessibility creates a scalable design framework adaptable to diverse primary education environments while maintaining interface coherence and usability standards.
However, this design validation study establishes interface engagement effectiveness but does not measure educational transformation or student learning outcomes. While the findings provide essential groundwork for future educational effectiveness research, they represent design optimization insights rather than evidence of educational impact. The platform successfully demonstrates DGBL design principles that integrate health concepts into age-appropriate interactive environments, with teacher assessment indicating high interface usability and developmental appropriateness for the target demographic of primary school students aged 6–12.
The iLearn4Health project advances our understanding of how comprehensive interactive experiences—rather than superficial game elements—can effectively support interface engagement objectives in primary school populations. The platform promotes sustained interaction and comprehensive assessment by attending to the developmental characteristics of children aged 6–12, as judged by the professional evaluators who assessed content on their behalf, and by creating design experiences that integrate educational content directly into age-appropriate interface mechanics. This comprehensive approach to Digital Game-Based Learning interface design establishes a framework for creating effective educational technology that acknowledges unique developmental interface needs while adapting to diverse educational contexts.
Future research must validate whether design engagement effectiveness translates into educational transformation and behavioral change through direct implementation studies with primary school students. The design validation approach provides critical groundwork for subsequent educational effectiveness research while establishing evidence-based interface optimization principles for primary school digital health education that could yield long-term public health benefits through effective early intervention design strategies.

Author Contributions

Conceptualization, E.G., E.M., F.V., C.-M.S., P.A., A.C., B.I., J.S., A.d.l.C.M., G.G., K.K., S.T., M.J.M.J., M.M., M.H., A.G.L., E.S. and A.V.; Methodology, E.G., E.M., F.V., C.-M.S., P.A., A.C., B.I., J.S., A.d.l.C.M., G.G., K.K., S.T., M.J.M.J., M.M., M.H., A.G.L., E.S. and A.V.; Software, E.G., E.M., F.V., C.-M.S., P.A., A.C., B.I., J.S., A.d.l.C.M., G.G., K.K., S.T., M.J.M.J., M.M., M.H., A.G.L., E.S. and A.V.; Validation, E.G., E.M., F.V., C.-M.S., P.A., A.C., B.I., J.S., A.d.l.C.M., G.G., K.K., S.T., M.J.M.J., M.M., M.H., A.G.L., E.S. and A.V.; Formal analysis, E.G., E.M., F.V., C.-M.S., P.A., A.C., B.I., J.S., A.d.l.C.M., G.G., K.K., S.T., M.J.M.J., M.M., M.H., A.G.L., E.S. and A.V.; Investigation, E.G., E.M., F.V., C.-M.S., P.A., A.C., B.I., J.S., A.d.l.C.M., G.G., K.K., S.T., M.J.M.J., M.M., M.H., A.G.L., E.S. and A.V.; Resources, E.G., E.M., F.V., C.-M.S., P.A., A.C., B.I., J.S., A.d.l.C.M., G.G., K.K., S.T., M.J.M.J., M.M., M.H., A.G.L., E.S. and A.V.; Data curation, E.G., E.M., F.V., C.-M.S., P.A., A.C., B.I., J.S., A.d.l.C.M., G.G., K.K., S.T., M.J.M.J., M.M., M.H., A.G.L., E.S. and A.V.; Writing—original draft, E.G., E.M., F.V., C.-M.S., P.A., A.C., B.I., J.S., A.d.l.C.M., G.G., K.K., S.T., M.J.M.J., M.M., M.H., A.G.L., E.S. and A.V.; Writing—review & editing, E.G., E.M., F.V., C.-M.S., P.A., A.C., B.I., J.S., A.d.l.C.M., G.G., K.K., S.T., M.J.M.J., M.M., M.H., A.G.L., E.S. and A.V.; Visualization, E.G., E.M., F.V., C.-M.S., P.A., A.C., B.I., J.S., A.d.l.C.M., G.G., K.K., S.T., M.J.M.J., M.M., M.H., A.G.L., E.S. and A.V.; Supervision, E.G., E.M., F.V., C.-M.S., P.A., A.C., B.I., J.S., A.d.l.C.M., G.G., K.K., S.T., M.J.M.J., M.M., M.H., A.G.L., E.S. and A.V.; Project administration, A.V.; Funding acquisition, A.V. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the European Union under project number 2021-1-EL01-KA220-SCH-000034496.

Institutional Review Board Statement

This study was conducted in accordance with the Declaration of Helsinki. Ethical review and approval were waived for this study in accordance with national and European legislation, as no personal or sensitive data were collected from children, and all platform usage data analyzed were anonymized and derived from adult participants (teachers). According to Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 (GDPR), Recital 26, data that do not relate to an identified or identifiable natural person are not subject to the data protection requirements, and thus, ethical approval is not required in such cases.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available from the corresponding author upon request.

Acknowledgments

The authors sincerely thank the participants in this research. Their invaluable contributions and engagement were instrumental in the study’s success. The researchers hope this initiative will promote mental health and education through game-based learning, enhancing learning experiences and well-being.

Conflicts of Interest

Charitini-Maria Skoulidi and Panagiotis Anastassopoulos are employed by p-Consulting; Alexandra Cornea is employed by FSLI; Begoña Inchaurraga, Jaione Santurtun, and Ainhoa de la Cruz Mancha are employed by Centro San Viator; George Giorgakis is employed by C.F.C.D.C. Centre for Competence, Development Cyprus Limited; and Maria Jose Moreno Juan and Miren Muñagorri are employed by Errotu Taldea S.L.P. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

  1. Abou Hashish, E. A., Al Najjar, H., Alharbi, M., Alotaibi, M., & Alqahtany, M. M. (2024). Faculty and students perspectives towards game-based learning in health sciences higher education. Heliyon, 10(12), e32898. [Google Scholar] [CrossRef] [PubMed]
  2. Aguiar-Castillo, L., & Clavijo-Rodriguez, A. (2021). Gamification and deep learning approaches in higher education. Journal of Hospitality, Leisure, Sport & Tourism Education, 28, 100290. [Google Scholar] [CrossRef]
  3. Alam, A. (2023, March 17–18). Leveraging the power of ‘modeling and computer simulation’ for education: An exploration of its potential for improved learning outcomes and enhanced student engagement. 2023 International Conference on Device Intelligence, Computing and Communication Technologies (DICCT) (pp. 445–450), Dehradun, India. [Google Scholar] [CrossRef]
  4. Alamri, I. K. A. (2024). Gameful learning: Investigating the impact of game elements, interactivity, and learning style on students’ success. Multidisciplinary Science Journal, 6(1), 2025108. [Google Scholar] [CrossRef]
  5. AlDaajeh, S., Saleous, H., Alrabaee, S., Barka, E., Breitinger, F., & Choo, K. K. R. (2022). The role of national cybersecurity strategies on the improvement of cybersecurity education. Computers & Security, 119, 102754. [Google Scholar] [CrossRef]
  6. All, A., Castellar, E. N. P., & Van Looy, J. (2021). Digital game-based learning effectiveness assessment: Reflections on study design. Computers & Education, 167, 104160. [Google Scholar] [CrossRef]
  7. Al-Rayes, S., Al Yaqoub, F. A., Alfayez, A., Alsalman, D., Alanezi, F., Alyousef, S., Alessa, A., Almutairi, B., Alharbi, R., Alosaimi, M., Aloufi, N., Alamri, E., Alharbi, S., Almutairi, T., Alshammari, T., Alotaibi, F., & Alanzi, T. M. (2022). Gaming elements, applications, and challenges of gamification in healthcare. Informatics in Medicine Unlocked, 31, 100974. [Google Scholar] [CrossRef]
  8. Andrea Navarro-Espinosa, J., Vaquero-Abellán, M., Perea-Moreno, A. J., Pedrós-Pérez, G., del Pilar Martínez-Jiménez, M., & Aparicio-Martínez, P. (2022). Gamification as a promoting tool of motivation for creating sustainable higher education institutions. International Journal of Environmental Research and Public Health, 19(5), 2599. [Google Scholar] [CrossRef] [PubMed]
  9. Andrews, J. C., Walker, K. L., Netemeyer, R. G., & Kees, J. (2023). Helping youth navigate privacy protection: Developing and testing the children’s online privacy scale. Journal of Public Policy & Marketing, 42(3), 345–362. [Google Scholar] [CrossRef]
  10. Antonopoulou, H., Halkiopoulos, C., Gkintoni, E., & Katsibelis, A. (2022). Application of gamification tools for identification of neurocognitive and social function in distance learning education. International Journal of Learning, Teaching and Educational Research, 21(5), 367–400. [Google Scholar] [CrossRef]
  11. Aresi, G., Chiavegatti, B., & Marta, E. (2024). Participants’ experience with gamification elements of a school-based health promotion intervention in Italy: A mixed-methods study. Journal of Prevention, 45(5), 820. [Google Scholar] [CrossRef]
  12. Arora, C., & Razavian, M. (2021). Ethics of gamification in health and fitness tracking. International Journal of Environmental Research and Public Health, 18(21), 11052. [Google Scholar] [CrossRef] [PubMed]
  13. Arruzza, E., & Chau, M. (2021). A scoping review of randomized controlled trials to assess the value of gamification in the higher education of health science students. Journal of Medical Imaging and Radiation Sciences, 52(1), 58–68. [Google Scholar] [CrossRef] [PubMed]
  14. Balalle, H. (2024). Exploring student engagement in technology-based education in relation to gamification, online/distance learning, and other factors: A systematic literature review. Social Sciences & Humanities Open, 9, 100870. [Google Scholar] [CrossRef]
  15. Basyoni, L., Tabassum, A., Shaban, K., Elmahjub, E., Halabi, O., & Qadir, J. (2024). Navigating privacy challenges in the metaverse: A comprehensive examination of current technologies and platforms. IEEE Internet of Things Magazine, 7(2), 48–55. [Google Scholar] [CrossRef]
  16. Birch, K., Cochrane, D. T., & Ward, C. (2021). Data as asset? The measurement, governance, and valuation of digital personal data by Big Tech. Big Data & Society, 8(1), 2053951721101730. [Google Scholar] [CrossRef]
  17. Breien, F. S., & Wasson, B. (2021). Narrative categorization in digital game-based learning: Engagement, motivation & learning. British Journal of Educational Technology, 52(1), 91–111. [Google Scholar] [CrossRef]
  18. Burn, A. M., Ford, T. J., Stochl, J., Jones, P. B., Perez, J., & Anderson, J. K. (2022). Developing a web-based app to assess mental health difficulties in secondary school pupils: Qualitative user-centered design study. JMIR Formative Research, 6(2), e30565. [Google Scholar] [CrossRef] [PubMed]
  19. Buzzai, C., Sorrenti, L., Costa, S., Toffle, M. E., & Filippello, P. (2021). The relationship between school-basic psychological need satisfaction and frustration, academic engagement and academic achievement. School Psychology International, 42(4), 497–519. [Google Scholar] [CrossRef]
  20. Camacho-Sánchez, R., Manzano-León, A., Rodríguez-Ferrer, J. M., Serna, J., & Lavega-Burgués, P. (2023). Game-based learning and gamification in physical education: A systematic review. Education Sciences, 13(2), 183. [Google Scholar] [CrossRef]
  21. Camacho-Sánchez, R., Rillo-Albert, A., & Lavega-Burgués, P. (2022). Gamified digital game-based learning as a pedagogical strategy: Student academic performance and motivation. Applied Sciences, 12(21), 11214. [Google Scholar] [CrossRef]
  22. Caprara, L., & Caprara, C. (2022). Effects of virtual learning environments: A scoping review of literature. Education and Information Technologies, 27(1), 785–825. [Google Scholar] [CrossRef] [PubMed]
  23. Castellano-Tejedor, C., & Cencerrado, A. (2024). Gamification for mental health and health psychology: Insights at the first quarter mark of the 21st century. International Journal of Environmental Research and Public Health, 21(8), 990. [Google Scholar] [CrossRef] [PubMed]
  24. Chatterjee, A., Prinz, A., Gerdes, M., Martinez, S., Pahari, N., & Meena, Y. K. (2022). ProHealth eCoach: User-centered design and development of an eCoach app to promote healthy lifestyle with personalized activity recommendations. BMC Health Services Research, 22(1), 1–20. [Google Scholar] [CrossRef] [PubMed]
  25. Chen, F. H. (2021). Sustainable education through e-learning: The case study of iLearn2.0. Sustainability, 13(18), 10186. [Google Scholar] [CrossRef]
  26. Chen, Y., & Zhao, S. (2022). Chinese EFL learners’ acceptance of gamified vocabulary learning apps: An integration of self-determination theory and technology acceptance model. Sustainability, 14(18), 11288. [Google Scholar] [CrossRef]
  27. Cheng, C., & Ebrahimi, O. V. (2023a). A meta-analytic review of gamified interventions in mental health enhancement. Computers in Human Behavior, 138, 107621. [Google Scholar] [CrossRef]
  28. Cheng, C., & Ebrahimi, O. V. (2023b). Gamification: A novel approach to mental health promotion. Current Psychiatry Reports, 25(11), 645–656. [Google Scholar] [CrossRef]
  29. Christopoulos, A., & Sprangers, P. (2021). Integration of educational technology during the COVID-19 pandemic: An analysis of teacher and student receptions. Cogent Education, 8(1), 1964690. [Google Scholar] [CrossRef]
  30. Cichy, P., Salge, T. O., & Kohli, R. (2021). Privacy concerns and data sharing in the internet of things: Mixed methods evidence from connected cars. MIS Quarterly, 45(4), 1863–1892. [Google Scholar] [CrossRef]
  31. Cong-Lem, N., Soyoof, A., & Tsering, D. (2025). A systematic review of the limitations and associated opportunities of ChatGPT. International Journal of Human-Computer Interaction, 41(2), 234–256. [Google Scholar] [CrossRef]
  32. Crepax, T., Muntés-Mulero, V., Martinez, J., & Ruiz, A. (2022). Information technologies exposing children to privacy risks: Domains and children-specific technical controls. Computer Standards & Interfaces, 82, 103624. [Google Scholar] [CrossRef]
  33. Dah, J., Hussin, N., Zaini, M. K., Isaac Helda, L., Senanu Ametefe, D., & Adozuka Aliu, A. (2024). Gamification is not working: Why? Games and Culture, 19(4), 456–478. [Google Scholar] [CrossRef]
  34. Dahalan, F., Alias, N., & Shaharom, M. S. N. (2024). Gamification and game based learning for vocational education and training: A systematic literature review. Education and Information Technologies, 29, 3685–3720. [Google Scholar] [CrossRef]
  35. de Carvalho, C. V., & Coelho, A. (2022). Game-based learning, gamification in education and serious games. Computers, 11(3), 36. [Google Scholar] [CrossRef]
  36. del Cura-González, I., Ariza-Cardiel, G., Polentinos-Castro, E., López-Rodríguez, J. A., Sanz-Cuesta, T., Barrio-Cortes, J., García-Sagredo, P., Hernández-Santiago, V., Rodríguez-Barrientos, R., Zabaleta-del-Olmo, E., García-Agua-Soler, N., González-González, A. I., García-de-Blas-González, F., de-Dios-del-Valle, R., Escortell-Mayor, E., & Martín-Fernández, J. (2022). Effectiveness of a game-based educational strategy e-EDUCAGUIA for implementing antimicrobial clinical practice guidelines in family medicine residents in Spain: A randomized clinical trial by cluster. BMC Medical Education, 22(1), 843. [Google Scholar] [CrossRef] [PubMed]
  37. De Salas, K., Ashbarry, L., Seabourne, M., Lewis, I., Wells, L., Dermoudy, J., Williams, R., Scott, J., & Kay, J. (2022). Improving environmental outcomes with games: An exploration of behavioral and technological design and evaluation approaches. Simulation & Gaming, 53(5), 470–512. [Google Scholar] [CrossRef]
  38. Di Minin, E., Fink, C., Hausmann, A., Kremer, J., & Kulkarni, R. (2021). How to address data privacy concerns when using social media data in conservation science. Conservation Biology, 35(2), 437–446. [Google Scholar] [CrossRef]
  39. Drljević, N., Botički, I., & Wong, L. H. (2022). Investigating the different facets of student engagement during augmented reality use in primary school. British Journal of Educational Technology, 53(4), 1032–1049. [Google Scholar] [CrossRef]
  40. Esmaeilzadeh, P. (2021). The influence of gamification and information technology identity on postadoption behaviors of health and fitness app users: Empirical study in the United States. JMIR Serious Games, 9(4), e28282. [Google Scholar] [CrossRef]
  41. Farhan, K. A., Asadullah, A. B. M., Kommineni, H. P., Gade, P. K., & Venkata, S. S. M. G. N. (2023). Machine learning-driven gamification: Boosting user engagement in business. Global Disclosure of Economics and Business, 12(1), 41–52. [Google Scholar] [CrossRef]
  42. Feldhacker, D. R., Wesner, C., Yockey, J., Larson, J., & Norris, D. (2025). Strategies for Health: A game-based, interprofessional approach to teaching social determinants of health: A randomized controlled pilot study. Journal of Interprofessional Care, 39(1), 85–94. [Google Scholar] [CrossRef] [PubMed]
  43. Furtado, L. S., de Souza, R. F., Lima, J. L. D. R., & Oliveira, S. R. B. (2021). Teaching method for software measurement process based on gamification or serious games: A systematic review of the literature. International Journal of Computer Games Technology, 2021, 8873997. [Google Scholar] [CrossRef]
  44. Gajardo Sánchez, A. D., Murillo-Zamorano, L. R., López-Sánchez, J., & Bueno-Muñoz, C. (2023). Gamification in health care management: Systematic review of the literature and research agenda. SAGE Open, 13(4), 21582440231218834. [Google Scholar] [CrossRef]
  45. Gao, F. (2024). Advancing gamification research and practice with three underexplored ideas in self-determination theory. TechTrends, 68(3), 412–423. [Google Scholar] [CrossRef]
  46. Ghahramani, A., de Courten, M., & Prokofieva, M. (2022). The potential of social media in health promotion beyond creating awareness: An integrative review. BMC Public Health, 22(1), 2399. [Google Scholar] [CrossRef]
  47. Ghosh, T., S., S., & Dwivedi, Y. K. (2022). Brands in a game or a game for brands? Comparing the persuasive effectiveness of in-game advertising and advergames. Psychology & Marketing, 39(11), 2028–2045. [Google Scholar] [CrossRef]
  48. Gkintoni, E., Antonopoulou, H., Sortwell, A., & Halkiopoulos, C. (2025a). Challenging cognitive load theory: The role of educational neuroscience and artificial intelligence in redefining learning efficacy. Brain Sciences, 15(2), 203. [Google Scholar] [CrossRef]
  49. Gkintoni, E., Dimakos, I., & Nikolaou, G. (2025b). Cognitive insights from emotional intelligence: A systematic review of EI models in educational achievement. Emerging Science Journal, 8, 262–297. [Google Scholar] [CrossRef]
  50. Gkintoni, E., Vantaraki, F., Skoulidi, C., Anastassopoulos, P., & Vantarakis, A. (2024a). Gamified health promotion in schools: The integration of neuropsychological aspects and CBT—A systematic review. Medicina, 60(12), 2085. [Google Scholar] [CrossRef]
  51. Gkintoni, E., Vantaraki, F., Skoulidi, C., Anastassopoulos, P., & Vantarakis, A. (2024b). Promoting physical and mental health among children and adolescents via gamification—A conceptual systematic review. Behavioral Sciences, 14(2), 102. [Google Scholar] [CrossRef]
  52. Govender, T., & Arnedo-Moreno, J. (2021). An analysis of game design elements used in digital game-based language learning. Sustainability, 13(12), 6679. [Google Scholar] [CrossRef]
  53. Guan, X., Sun, C., Hwang, G. J., Xue, K., & Wang, Z. (2024). Applying game-based learning in primary education: A systematic review of journal publications from 2010 to 2020. Interactive Learning Environments, 32(4), 1456–1478. [Google Scholar] [CrossRef]
  54. Gupta, P., & Goyal, P. (2022). Is game-based pedagogy just a fad? A self-determination theory approach to gamification in higher education. International Journal of Educational Management, 36(4), 456–472. [Google Scholar] [CrossRef]
  55. Halkiopoulos, C., & Gkintoni, E. (2024). Leveraging AI in E-learning: Personalized learning and adaptive assessment through cognitive neuropsychology—A systematic analysis. Electronics, 13(18), 3762. [Google Scholar] [CrossRef]
  56. Hassan, A., Pinkwart, N., & Shafi, M. (2021). Serious games to improve social and emotional intelligence in children with autism. Entertainment Computing, 38, 100417. [Google Scholar] [CrossRef]
  57. Hassan Alhazmi, A., & Asanka Gamagedara Arachchilage, N. (2023). Evaluation of game design framework using a gamified browser-based application. Computers & Security, 128, 103141. [Google Scholar]
  58. Hayes, J. L., Brinson, N. H., Bott, G. J., & Moeller, C. M. (2021). The influence of consumer-brand relationships on the personalized advertising privacy calculus in social media. Journal of Interactive Marketing, 55(1), 16–30. [Google Scholar] [CrossRef]
  59. Henry, J., Hernalesteen, A., & Collard, A. S. (2021). Teaching artificial intelligence to K-12 through a role-playing game questioning the intelligence concept. KI-Künstliche Intelligenz, 35(2), 171–179. [Google Scholar] [CrossRef]
  60. Höyng, M. (2022). Encouraging gameful experience in digital game-based learning: A double-mediation model of perceived instructional support, group engagement, and flow. Computers & Education, 180, 104408. [Google Scholar] [CrossRef]
  61. Hsiao, W. Y., Chen, C. H., Chen, P. C., & Hou, W. H. (2024). Investigating the effects of different game-based learning on the health care knowledge and emotions for middle-aged and older adults. Interactive Learning Environments, 32(10), 4182–4197. [Google Scholar] [CrossRef]
  62. Ibrahim, E. N. M., Jamali, N., & Suhaimi, A. I. H. (2021). Exploring gamification design elements for mental health support. International Journal of Advanced Technology and Engineering Exploration, 8(74), 114. [Google Scholar] [CrossRef]
  63. Jingili, N., Oyelere, S. S., Nyström, M. B., & Anyshchenko, L. (2023). A systematic review on the efficacy of virtual reality and gamification interventions for managing anxiety and depression. Frontiers in Digital Health, 5, 1239435. [Google Scholar] [CrossRef]
  64. Keyes, K. M., & Platt, J. M. (2024). Annual Research Review: Sex, gender, and internalizing conditions among adolescents in the 21st century—Trends, causes, consequences. Journal of Child Psychology and Psychiatry, 65(4), 384–408. [Google Scholar] [CrossRef] [PubMed]
  65. Khaleghi, A., Aghaei, Z., & Mahdavi, M. A. (2021). A gamification framework for cognitive assessment and cognitive training: Qualitative study. JMIR Serious Games, 9(1), e21900. [Google Scholar] [CrossRef]
  66. Kim, J., & Castelli, D. M. (2021). Effects of gamification on behavioral change in education: A meta-analysis. International Journal of Environmental Research and Public Health, 18(7), 3550. [Google Scholar] [CrossRef] [PubMed]
  67. Kniestedt, I., Lefter, I., Lukosch, S., & Brazier, F. M. (2022). Re-framing engagement for applied games: A conceptual framework. Entertainment Computing, 40, 100475. [Google Scholar] [CrossRef]
  68. Komljenovic, J. (2022). The future of value in digitalised higher education: Why data privacy should not be our biggest concern. Higher Education, 83(1), 119–135. [Google Scholar] [CrossRef]
  69. Krath, J., Schürmann, L., & Von Korflesch, H. F. O. (2021). Revealing the theoretical basis of gamification: A systematic review and analysis of theory in research on gamification, serious games, and game-based learning. Computers in Human Behavior, 125, 106963. [Google Scholar] [CrossRef]
  70. Kulkov, I., Kulkova, J., Rohrbeck, R., Menvielle, L., Kaartemo, V., & Makkonen, H. (2024). Artificial intelligence-driven sustainable development: Examining organizational, technical, and processing approaches to achieving global goals. Sustainable Development, 32(2), 456–473. [Google Scholar] [CrossRef]
  71. Landers, R. N., & Sanchez, D. R. (2022). Game-based, gamified, and gamefully designed assessments for employee selection: Definitions, distinctions, design, and validation. International Journal of Selection and Assessment, 30(1), 1–13. [Google Scholar] [CrossRef]
  72. Lappeman, J., Marlie, S., Johnson, T., & Poggenpoel, S. (2022). Trust and digital privacy: Willingness to disclose personal information to banking chatbot services. Journal of Financial Services Marketing, 28(2), 337–352. [Google Scholar] [CrossRef]
  73. Lee, C., & Ahmed, G. (2021). Improving IoT privacy, data protection, and security concerns. International Journal of Technology, Innovation and Management (IJTIM), 1(1), 18–33. [Google Scholar] [CrossRef]
  74. Leitão, R., Maguire, M., Turner, S., Arenas, F., & Guimarães, L. (2022a). Ocean literacy gamified: A systematic evaluation of the effect of game elements on students’ learning experience. Environmental Education Research, 28(3), 412–432. [Google Scholar] [CrossRef]
  75. Leitão, R., Maguire, M., Turner, S., & Guimarães, L. (2022b). A systematic evaluation of game elements effects on students’ motivation. Education and Information Technologies, 27(2), 2341–2365. [Google Scholar] [CrossRef]
  76. Leonardou, A., Rigou, M., Panagiotarou, A., & Garofalakis, J. (2022). Effect of OSLM features and gamification motivators on motivation in DGBL: Pupils’ viewpoint. Smart Learning Environments, 9(1), 14. [Google Scholar] [CrossRef]
  77. Li, Q., Yin, X., Yin, W., Dong, X., & Li, Q. (2023). Evaluation of gamification techniques in learning abilities for higher school students using FAHP and EDAS methods. Soft Computing, 27(14), 9321–9338. [Google Scholar] [CrossRef]
  78. Luo, Z. (2022). Gamification for educational purposes: What are the factors contributing to varied effectiveness? Education and Information Technologies, 27(1), 1035–1057. [Google Scholar] [CrossRef]
  79. Lynch, P., Singal, N., & Francis, G. A. (2024). Educational technology for learners with disabilities in primary school settings in low-and middle-income countries: A systematic literature review. Educational Review, 76(2), 234–256. [Google Scholar] [CrossRef]
  80. Lynch, Z., & Sankey, M. (2016). Reboot your course—From beta to better. Distance Education, 37(2), 156–170. [Google Scholar]
  81. Macey, J., Adam, M., Hamari, J., & Benlian, A. (2025). Examining the commonalities and differences between gamblification and gamification: A theoretical perspective. International Journal of Human-Computer Interaction, 41(3), 234–256. [Google Scholar] [CrossRef]
  82. Masi, L., Abadie, P., Herba, C., Emond, M., Gingras, M. P., & Amor, L. B. (2021). Video games in ADHD and non-ADHD children: Modalities of use and association with ADHD symptoms. Frontiers in Pediatrics, 9, 632272. [Google Scholar] [CrossRef]
  83. Mazeas, A., Duclos, M., Pereira, B., & Chalabaev, A. (2022). Evaluating the effectiveness of gamification on physical activity: Systematic review and meta-analysis of randomized controlled trials. Journal of Medical Internet Research, 24(1), e26779. [Google Scholar] [CrossRef]
  84. Michael, K. (2024). Mitigating risk and ensuring human flourishing using design standards: IEEE 2089–2021 an age appropriate digital services framework for children. IEEE Transactions on Technology and Society, 5(3), 145–158. [Google Scholar] [CrossRef]
  85. Mohanty, S., & Christopher, B. P. (2023). A bibliometric analysis of the use of the gamification octalysis framework in training: Evidence from web of science. Humanities and Social Sciences Communications, 10(1), 836. [Google Scholar] [CrossRef]
  86. Muniandy, N., Kamsin, A., & Sabri, M. I. (2023, November 18). Enhancing children’s health awareness through narrative game-based learning using digital technology: A framework and methodology. 2023 9th International HCI and UX Conference in Indonesia (CHIuXiD) (pp. 123–128), Bali, Indonesia. [Google Scholar] [CrossRef]
  87. Nadeem, M., Oroszlanyova, M., & Farag, W. (2023). Effect of digital game-based learning on student engagement and motivation. Computers, 12(9), 177. [Google Scholar] [CrossRef]
  88. Nivedhitha, K. S., Giri, G., & Pasricha, P. (2025). Gamification as a panacea to workplace cyberloafing: An application of self-determination and social bonding theories. Internet Research, 35(1), 123–145. [Google Scholar] [CrossRef]
  89. Nordby, A., Vibeto, H., Mobbs, S., & Sverdrup, H. U. (2024). System thinking in gamification. SN Computer Science, 5(1), 299. [Google Scholar] [CrossRef]
  90. Novak, E., McDaniel, K., Daday, J., & Soyturk, I. (2022). Frustration in technology-rich learning environments: A scale for assessing student frustration with e-textbooks. British Journal of Educational Technology, 53(2), 456–473. [Google Scholar] [CrossRef]
  91. Oke, A. E., Aliu, J., Tunji-Olayeni, P., & Abayomi, T. (2024). Application of gamification for sustainable construction: An evaluation of the challenges. Construction Innovation, 24(3), 567–584. [Google Scholar] [CrossRef]
  92. Oliveira, H., Figueiredo, G., Pereira, B., Magalhães, M., Rosário, P., & Magalhães, P. (2024). Understanding children’s perceptions of game elements in an online gamified intervention: Contributions by the self-determination theory. International Journal of Human-Computer Interaction, 40(12), 3456–3471. [Google Scholar] [CrossRef]
  93. Oliveira, W., Hamari, J., Joaquim, S., Toda, A. M., Palomino, P. T., Vassileva, J., & Isotani, S. (2022). The effects of personalized gamification on students’ flow experience, motivation, and enjoyment. Smart Learning Environments, 9(1), 16. [Google Scholar] [CrossRef]
  94. Ongoro, C. A., & Fanjiang, Y. Y. (2023). Digital game-based technology for English language learning in preschools and primary schools: A systematic analysis. IEEE Transactions on Learning Technologies, 16(3), 345–362. [Google Scholar] [CrossRef]
  95. Papakostas, C. (2024). Faith in frames: Constructing a digital game-based learning framework for religious education. Teaching Theology & Religion, 27(2), 123–142. [Google Scholar] [CrossRef]
  96. Peláez, C. A., & Solano, A. (2023). A practice for the design of interactive multimedia experiences based on gamification: A case study in elementary education. Sustainability, 15(3), 2385. [Google Scholar] [CrossRef]
  97. Pellas, N., Mystakidis, S., & Christopoulos, A. (2021). A systematic literature review on the user experience design for game-based interventions via 3D virtual worlds in K-12 education. Multimodal Technologies and Interaction, 5(6), 28. [Google Scholar] [CrossRef]
  98. Pitic, D., & Irimiaș, T. (2023). Enhancing students’ engagement through a business simulation game: A qualitative study within a higher education management course. The International Journal of Management Education, 21(2), 100839. [Google Scholar] [CrossRef]
  99. Qiu, Y. (2024). The role of playfulness in gamification for health behavior motivation. Computers in Human Behavior, 148, 107892. [Google Scholar]
  100. Quach, S., Thaichon, P., Martin, K. D., Weaven, S., & Palmatier, R. W. (2022). Digital technologies: Tensions in privacy and data. Journal of the Academy of Marketing Science, 50(6), 1299–1323. [Google Scholar] [CrossRef]
  101. Riar, M., Morschheuser, B., Zarnekow, R., & Hamari, J. (2022). Gamification of cooperation: A framework, literature review and future research agenda. International Journal of Information Management, 65, 102549. [Google Scholar] [CrossRef]
  102. Ruggiu, D., Blok, V., Coenen, C., Kalloniatis, C., Kitsiou, A., Mavroeidi, A. G., Porcari, A., Rerimassie, V., Stahl, B. C., & Sitzia, A. (2022). Responsible innovation at work: Gamification, public engagement, and privacy by design. Journal of Responsible Innovation, 9(3), 315–343. [Google Scholar] [CrossRef]
  103. Rycroft-Smith, L. (2022). Knowledge brokering to bridge the research-practice gap in education: Where are we now? Review of Education, 10(2), e3341. [Google Scholar] [CrossRef]
  104. Samala, A. D., Rawas, S., Wang, T., Reed, J. M., Kim, J., Howard, N. J., & Ertz, M. (2025). Unveiling the landscape of generative artificial intelligence in education: A comprehensive taxonomy of applications, challenges, and future prospects. Education and Information Technologies, 30(2), 1234–1267. [Google Scholar] [CrossRef]
  105. Sáiz-Manzanares, M. C., Martin, C. F., Alonso-Martínez, L., & Almeida, L. S. (2021). Usefulness of digital game-based learning in nursing and occupational therapy degrees: A comparative study at the University of Burgos. International Journal of Environmental Research and Public Health, 18(22), 11757. [Google Scholar] [CrossRef]
  106. Sezgin, S., & Yüzer, T. V. (2022). Analyzing adaptive gamification design principles for online courses. Behaviour & Information Technology, 41(3), 567–583. [Google Scholar] [CrossRef]
  107. Shaheen, A., Ali, S., & Fotaris, P. (2023). Assessing the efficacy of reflective game design: A design-based study in digital game-based learning. Education Sciences, 13(12), 1204. [Google Scholar] [CrossRef]
  108. Shahzad, M. F., Xu, S., Rehman, O., & Javed, I. (2023). Impact of gamification on green consumption behavior integrating technological awareness, motivation, enjoyment, and virtual CSR. Scientific Reports, 13(1), 20835. [Google Scholar] [CrossRef] [PubMed]
  109. Shrestha, A. K., Barthwal, A., Campbell, M., Shouli, A., Syed, S., Joshi, S., & Vassileva, J. (2024). Navigating AI to unpack youth privacy concerns: An in-depth exploration and systematic review. arXiv, arXiv:2412.16369. [Google Scholar]
  110. Simonofski, A., Zuiderwijk, A., Clarinval, A., & Hammedi, W. (2022). Tailoring open government data portals for lay citizens: A gamification theory approach. International Journal of Information Management, 65, 102511. [Google Scholar] [CrossRef]
  111. Sinha, N. (2021). Introducing gamification for advancing current mental healthcare and treatment practices. In IoT in healthcare and ambient assisted living (pp. 234–251). Springer. [Google Scholar] [CrossRef]
  112. Speelman, E. N., Escano, E., Marcos, D., & Becu, N. (2023). Serious games and citizen science; from parallel pathways to greater synergies. Current Opinion in Environmental Sustainability, 62, 101320. [Google Scholar] [CrossRef]
  113. Taneja-Johansson, S., & Singal, N. (2025). Pathways to inclusive and equitable quality education for people with disabilities: Cross-context conversations and mutual learning. International Journal of Inclusive Education, 29(4), 567–584. [Google Scholar] [CrossRef]
  114. Torresan, S., & Hinterhuber, A. (2023). Continuous learning at work: The power of gamification. Management Decision, 61(8), 2234–2253. [Google Scholar] [CrossRef]
  115. Trang, S., & Weiger, W. H. (2021). The perils of gamification: Does engaging with gamified services increase users’ willingness to disclose personal information? Computers in Human Behavior, 116, 106644. [Google Scholar] [CrossRef]
  116. Tyack, A., & Mekler, E. D. (2024). Self-determination theory and HCI games research: Unfulfilled promises and unquestioned paradigms. ACM Transactions on Computer-Human Interaction, 31(2), 40. [Google Scholar] [CrossRef]
  117. Tzachrista, M., Gkintoni, E., & Halkiopoulos, C. (2023). Neurocognitive profile of creativity in improving academic performance—A scoping review. Education Sciences, 13(11), 1127. [Google Scholar] [CrossRef]
  118. Udoudom, U., George, K., & Igiri, A. (2023). Impact of digital learning platforms on behaviour change communication in public health education. Pancasila International Journal of Applied Social Science, 2(1), 512. [Google Scholar] [CrossRef]
  119. Van Toorn, C., Kirshner, S. N., & Gabb, J. (2022). Gamification of query-driven knowledge sharing systems. Behaviour & Information Technology, 41(5), 959–980. [Google Scholar] [CrossRef]
  120. Vázquez-Calatayud, M., García-García, R., Regaira-Martínez, E., & Gómez-Urquiza, J. (2024). Real-world and game-based learning to enhance decision-making. Nurse Education Today, 138, 106276. [Google Scholar] [CrossRef]
  121. Ventouris, A., Panourgia, C., & Hodge, S. (2021). Teachers’ perceptions of the impact of technology on children and young people’s emotions and behaviours. International Journal of Educational Research Open, 2, 100081. [Google Scholar] [CrossRef]
  122. Vijay, M., & Jayan, J. P. (2023, August 10–11). Evolution of research on adaptive gamification: A bibliometric study. 2023 International Conference on Circuit Power and Computing Technologies (ICCPCT) (pp. 1–6), Kollam, India. [Google Scholar] [CrossRef]
  123. Vorlíček, M., Prycl, D., Heidler, J., Herrador-Colmenero, M., Nábělková, J., Mitáš, J., Rubín, L., Krejčí, M., Jakubec, L., & Frömel, K. (2024). Gameful education: A study of gamifiter application’s role in promoting physical activity and active lifestyle. Smart Learning Environments, 11(1), 1–14. [Google Scholar] [CrossRef]
  124. Wang, G., Zhao, J., Van Kleek, M., & Shadbolt, N. (2022, April 29–May 5). Informing age-appropriate AI: Examining principles and practices of AI for children. 2022 CHI Conference on Human Factors in Computing Systems (pp. 1–15), New Orleans, LA, USA. [Google Scholar] [CrossRef]
  125. Wang, K., Liu, P., Zhang, J., Zhong, J., Luo, X., Huang, J., & Zheng, Y. (2023). Effects of digital game-based learning on students’ cyber wellness literacy, learning motivations, and engagement. Sustainability, 15(7), 5716. [Google Scholar] [CrossRef]
  126. Warsinsky, S., Schmidt-Kraepelin, M., Rank, S., Thiebes, S., & Sunyaev, A. (2021). Conceptual ambiguity surrounding gamification and serious games in healthcare: Literature review and development of game-based intervention reporting guidelines (GAMING). Journal of Medical Internet Research, 23(9), e30390. [Google Scholar] [CrossRef]
  127. Weatherly, K. I. C. H., Wright, B. A., & Lee, E. Y. (2024). Digital game-based learning in music education: A systematic review between 2011 and 2023. Research Studies in Music Education, 46(2), 234–256. [Google Scholar] [CrossRef]
  128. Wiljén, A., Chaplin, J. E., Crine, V., Jobe, W., Johnson, E., Karlsson, K., Forsander, G., & Nilsson, S. (2022). The development of an mHealth tool for children with long-term illness to enable person-centered communication: User-centered design approach. JMIR Pediatrics and Parenting, 5(1), e30364. [Google Scholar] [CrossRef] [PubMed]
  129. Xiao, R., Wu, Z., & Hamari, J. (2022). Internet-of-gamification: A review of literature on IoT-enabled gamification for user engagement. International Journal of Human-Computer Interaction, 38(12), 1113–1137. [Google Scholar] [CrossRef]
  130. Xiong, Z., Liu, Q., & Huang, X. (2022). The influence of digital educational games on preschool children’s creative thinking. Computers & Education, 189, 104578. [Google Scholar] [CrossRef]
  131. Xu, J., Lio, A., Dhaliwal, H., Andrei, S., Balakrishnan, S., Nagani, U., & Samadder, S. (2021). Psychological interventions of virtual gamification within academic intrinsic motivation: A systematic review. Journal of Affective Disorders, 293, 444–465. [Google Scholar] [CrossRef] [PubMed]
  132. Xu, M., Luo, Y., Zhang, Y., Xia, R., Qian, H., & Zou, X. (2023). Game-based learning in medical education. Frontiers in Public Health, 11, 1113682. [Google Scholar] [CrossRef] [PubMed]
  133. Yang, C., Ye, H. J., & Feng, Y. (2021). Using gamification elements for competitive crowdsourcing: Exploring the underlying mechanism. Behaviour & Information Technology, 40(9), 837–854. [Google Scholar] [CrossRef]
  134. Zahedi, L., Batten, J., Ross, M., Potvin, G., Damas, S., Clarke, P., & Davis, D. (2021). Gamification in education: A mixed-methods study of gender on computer science students’ academic performance and identity development. Journal of Computing in Higher Education, 33(2), 441–474. [Google Scholar] [CrossRef]
  135. Zhang, Q., & Yu, Z. (2022). Meta-analysis on investigating and comparing the effects on learning achievement and motivation for gamification and game-based learning. Education Research International, 2022, 1519880. [Google Scholar] [CrossRef]
  136. Zhang, R., Cheng, G., & Zou, D. (2025). Effects of digital game elements on engagement and vocabulary development. University of Hawaii Press. [Google Scholar]
  137. Zhao, D., Inaba, M., & Monroy-Hernández, A. (2022). Understanding teenage perceptions and configurations of privacy on Instagram. Proceedings of the ACM on Human-Computer Interaction, 6(CSCW2), 550. [Google Scholar] [CrossRef]
  138. Zhao, D., Playfoot, J., De Nicola, C., Guarino, G., Bratu, M., Di Salvadore, F., & Muntean, G. M. (2021). An innovative multi-layer gamification framework for improved STEM learning experience. IEEE Access, 9, 164892–164909. [Google Scholar] [CrossRef]
  139. Zohari, M., Karim, N., Malgard, S., Aalaa, M., Asadzandi, S., & Borhani, S. (2023). Comparison of gamification, game-based learning, and serious games in medical education: A scientometrics analysis. Journal of Advances in Medical Education & Professionalism, 11(3), 145–157. [Google Scholar]
  140. Zolfaghari, Z., Karimian, Z., Zarifsanaiey, N., & Farahmandi, A. Y. (2025). A scoping review of gamified applications in English language teaching: A comparative discussion with medical education. BMC Medical Education, 25(1), 22. [Google Scholar] [CrossRef] [PubMed]
Figure 1. iLearn4Health User Experience Framework.
Figure 2. iLearn4Health Online Training Program.
Figure 3. Interactive Educational Game Interface Screens—iLearn4Health.
Figure 4. Interactive Educational Game Selection Screens—iLearn4Health.
Figure 5. Distribution of design exploration ratios among teacher–evaluators (N = 337) showing a bimodal engagement pattern with 52.8% conducting limited design exploration (ratio 0.0–0.2) and 35.3% performing comprehensive design assessment (ratio 0.8–1.0).
Figure 6. Average design assessment steps completed by evaluator age group with 95% confidence intervals. The 26–35 age group shows the highest average engagement (M = 24.83, SD = 23.15) while the correlation between age and steps completed across the full sample was minimal (r = 0.01, p = 0.859).
Figure 7. Cross-national comparison of average design exploration ratios with 95% confidence intervals. Romania demonstrates significantly higher design engagement (M = 0.460, SD = 0.44) than Greece (M = 0.315, SD = 0.40) and Cyprus (M = 0.301, SD = 0.38), suggesting important contextual factors in design assessment approaches. Romania (blue, group a)—highest engagement; Poland and Spain (yellow, group ab)—intermediate engagement; Greece and Cyprus (orange, group b)—lowest engagement.
Figure 8. Scatter plot showing the strong positive correlation between design assessment steps completed and time spent (r = 0.95, p < 0.001). The near-linear relationship indicates that comprehensive design evaluation was accompanied by substantially greater time investment.
Figure 9. Forest plot showing standardized regression coefficients (β) with 95% confidence intervals for predictors of design exploration ratio. Initial engagement (β = 0.479, p < 0.001) and country (Romania vs. Cyprus, β = 0.183, p = 0.001) emerge as the strongest predictors, while evaluator age has a smaller but still significant effect (β = 0.108, p = 0.049).
Table 1. Descriptive Statistics for Evaluator Demographics and Design Engagement Assessment (N = 337).
Variable           M       SD      Min   Max    Mdn    Skewness   Kurtosis
Age (years)        38.64   14.04   18    66     41     −0.27      −0.83
Steps completed    22.31   23.79   0     53     8      0.31       −1.85
Progress ratio     0.41    0.43    0     0.96   0.15   0.47       −1.74
Note. Progress ratio was calculated as steps completed divided by the total possible steps (55).
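As the note states, the progress ratio is simply steps completed divided by 55. A minimal sketch of the calculation — the step counts below are hypothetical illustrations, not study data:

```python
import statistics

TOTAL_STEPS = 55  # total possible assessment steps, per the table note

def progress_ratio(steps_completed: int) -> float:
    """Normalize a raw step count to a 0-1 exploration ratio."""
    return steps_completed / TOTAL_STEPS

# Hypothetical step counts for five evaluators
steps = [0, 5, 8, 53, 44]
ratios = [progress_ratio(s) for s in steps]

print(round(statistics.mean(ratios), 2))    # -> 0.4
print(round(statistics.median(ratios), 2))  # -> 0.15
```

With skewed engagement data like this, the mean and median of the ratio diverge sharply, which is why Table 1 reports both.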
Table 2. Design Evaluator Distribution by Country.
Country    N     %      95% CI
Romania    209   62.0   [56.8, 67.2]
Cyprus     52    15.4   [11.5, 19.3]
Greece     38    11.3   [7.9, 14.7]
Spain      35    10.4   [7.1, 13.7]
Poland     3     0.9    [0, 1.9]
Note. CI = confidence interval.
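The percentage intervals in Table 2 are consistent with a normal-approximation (Wald) interval for a proportion — an assumption on my part, but it reproduces the Romania row exactly:

```python
import math

def proportion_ci(count: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wald confidence interval for a sample proportion."""
    p = count / n
    se = math.sqrt(p * (1 - p) / n)  # standard error of the proportion
    return p - z * se, p + z * se

# Romania: 209 of 337 evaluators (Table 2)
lo, hi = proportion_ci(209, 337)
print(f"[{100 * lo:.1f}, {100 * hi:.1f}]")  # -> [56.8, 67.2]
```

Note that the Wald interval is unreliable for very small cells such as Poland (n = 3), where an exact or Wilson interval would be preferable.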
Table 3. Bimodal Distribution of Design Exploration Patterns.
Progress Ratio                        n     %      Cumulative %
0.0–0.2 (Limited exploration)         178   52.8   52.8
0.2–0.4 (Basic assessment)            27    8.0    60.8
0.4–0.6 (Moderate assessment)         3     0.9    61.7
0.6–0.8 (Extended assessment)         10    3.0    64.7
0.8–1.0 (Comprehensive assessment)    119   35.3   100.0
Note. Exploration ratio ranges represent different levels of design assessment depth conducted by teacher–evaluators.
Table 4. Age-Based Design Engagement Patterns with Cohen’s d Effect Sizes.
Age Group   n     Steps M   Steps SD   Steps 95% CI     Ratio M   Ratio SD   Cohen's d
18–25       55    21.45     22.87      [15.20, 27.70]   0.39      0.42       —
26–35       42    24.83     23.15      [17.61, 32.05]   0.45      0.42       0.14
36–50       183   22.55     24.16      [19.02, 26.08]   0.41      0.44       0.05
51–65       52    20.10     24.33      [13.32, 26.88]   0.37      0.44       −0.05
Note. Cohen’s d values represent effect sizes compared to the 18–25 age group. CI = confidence interval.
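Cohen's d here compares each age group against the 18–25 reference group. A standard pooled-SD implementation, run on invented toy ratios rather than the study's data:

```python
import math
import statistics

def cohens_d(group_a: list[float], group_b: list[float]) -> float:
    """Cohen's d for two independent groups, using the pooled SD."""
    na, nb = len(group_a), len(group_b)
    ma, mb = statistics.mean(group_a), statistics.mean(group_b)
    va, vb = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (mb - ma) / pooled_sd

# Hypothetical progress ratios for a reference and a comparison group
reference = [0.2, 0.4, 0.6, 0.3]
comparison = [0.5, 0.7, 0.6, 0.6]
print(round(cohens_d(reference, comparison), 2))  # -> 1.68
```

The small values in Table 4 (|d| ≤ 0.14) indicate that age-group differences in engagement were negligible by conventional benchmarks.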
Table 5. Cross-National Comparison of Design Exploration Ratios with Post-Hoc Significance Testing.
Country   n     Ratio M   Ratio SD   95% CI           Significant Differences
Romania   209   0.460     0.44       [0.399, 0.521]   a
Poland    3     0.370     0.41       [0.000, 0.791]   a,b
Spain     35    0.337     0.42       [0.203, 0.471]   a,b
Greece    38    0.315     0.40       [0.190, 0.440]   b
Cyprus    52    0.301     0.38       [0.196, 0.406]   b
Note. Countries that do not share the same letter suffix differ significantly according to Tukey’s HSD test at p < 0.05. CI = confidence interval.
Table 6. One-Way ANOVA Results for Between-Country Differences in Design Exploration Ratios.
Source           SS      df    MS      F      p       η2
Between groups   1.26    4     0.315   4.37   0.002   0.050
Within groups    23.94   332   0.072
Total            25.20   336
Note. η2 = eta squared effect size.
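Eta squared in Table 6 is SS_between divided by SS_total. A from-scratch one-way ANOVA sketch on made-up group data (not the study's), showing how F and η² fall out of the same sums of squares:

```python
import statistics

def one_way_anova(groups: list[list[float]]) -> tuple[float, float]:
    """Return (F statistic, eta squared) for a one-way ANOVA."""
    all_vals = [v for g in groups for v in g]
    grand = statistics.mean(all_vals)
    ss_between = sum(len(g) * (statistics.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - statistics.mean(g)) ** 2 for g in groups for v in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    f = (ss_between / df_between) / (ss_within / df_within)
    eta_sq = ss_between / (ss_between + ss_within)  # = SS_between / SS_total
    return f, eta_sq

# Hypothetical exploration ratios for three countries
f, eta = one_way_anova([[0.5, 0.6, 0.4], [0.2, 0.3, 0.1], [0.3, 0.4, 0.2]])
print(round(eta, 2))  # -> 0.7
```

In the study's table, η² = 0.050 means country membership accounted for about 5% of the variance in exploration ratios — significant, but modest.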
Table 7. Pearson Correlations Between Key Design Assessment Variables with Bootstrapped 95% Confidence Intervals.
Variables                               r      95% CI          p
Steps completed—Time spent              0.95   [0.93, 0.97]    <0.001
Age—Steps completed (full sample)       0.01   [−0.10, 0.12]   0.859
Age—Steps completed (adult subgroup)    0.80   [0.73, 0.85]    <0.001
Age—Time spent (adult subgroup)         0.60   [0.50, 0.69]    <0.001
Design exploration ratio—Time spent     0.94   [0.91, 0.96]    <0.001
Note. Confidence intervals were calculated using bootstrapping with 1000 samples. CI = confidence interval.
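The note says the intervals came from 1000 bootstrap resamples. A percentile-bootstrap sketch for Pearson's r on synthetic steps/time data — the values and the resampling details (seed, percentile method) are illustrative assumptions, not the authors' exact procedure:

```python
import random
import statistics

def pearson_r(x: list[float], y: list[float]) -> float:
    """Pearson correlation coefficient."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def bootstrap_r_ci(x, y, n_boot=1000, seed=42):
    """Percentile-bootstrap 95% CI for Pearson's r."""
    rng = random.Random(seed)
    n = len(x)
    rs = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        xs, ys = [x[i] for i in idx], [y[i] for i in idx]
        if len(set(xs)) > 1 and len(set(ys)) > 1:  # skip degenerate resamples
            rs.append(pearson_r(xs, ys))
    rs.sort()
    return rs[int(0.025 * len(rs))], rs[int(0.975 * len(rs))]

# Synthetic data: step counts and minutes spent, strongly related by design
steps = [2, 5, 8, 12, 20, 30, 41, 52]
time_min = [10, 30, 55, 90, 160, 240, 350, 430]
lo, hi = bootstrap_r_ci(steps, time_min)
print(f"r in [{lo:.2f}, {hi:.2f}]")
```

Bootstrapped intervals are useful here because the engagement distributions in Table 1 are strongly non-normal, which undermines the usual Fisher-z interval.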
Table 8. Multiple Regression Analysis Predicting Design Exploration Ratio from Age, Country, and Initial Engagement.
Predictor              B       SE B    β       t      p        95% CI
Constant               0.210   0.074   —       2.84   0.005    [0.065, 0.355]
Age                    0.003   0.002   0.108   1.97   0.049    [0.000, 0.006]
Initial engagement *   0.346   0.038   0.479   9.11   <0.001   [0.271, 0.421]
Country: Romania       0.159   0.048   0.183   3.31   0.001    [0.065, 0.253]
Country: Spain         0.036   0.065   0.026   0.55   0.580    [−0.092, 0.164]
Country: Greece        0.014   0.063   0.010   0.22   0.825    [−0.110, 0.138]
Country: Poland        0.069   0.165   0.019   0.42   0.676    [−0.256, 0.394]
Note. R2 = 0.31, F(6, 330) = 24.36, p < 0.001. CI = confidence interval. * Initial engagement was measured by completion of the first 10 assessment steps (binary: yes/no). The reference category for country is Cyprus.
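Because initial engagement is a 0/1 indicator, its coefficient has a direct reading: in a simple regression with a single binary predictor, the OLS slope equals the difference in group means. A toy illustration of that equivalence (invented data; the study's model also includes age and country dummies, so its B of 0.346 is an adjusted, not raw, difference):

```python
import statistics

def binary_predictor_slope(x: list[int], y: list[float]) -> float:
    """OLS slope for a single 0/1 predictor: the difference of group means."""
    y1 = [b for a, b in zip(x, y) if a == 1]
    y0 = [b for a, b in zip(x, y) if a == 0]
    return statistics.mean(y1) - statistics.mean(y0)

# Hypothetical: exploration ratios split by early engagement
# (1 = completed the first 10 assessment steps, 0 = did not)
engaged = [1, 1, 1, 0, 0, 0]
ratio = [0.9, 0.8, 0.7, 0.2, 0.1, 0.3]
print(round(binary_predictor_slope(engaged, ratio), 2))  # -> 0.6
```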
Table 9. Cluster Analysis Results: Design Evaluator Typologies Based on Assessment Patterns.
Cluster                          n     Age M   Age SD   Steps M   Steps SD   Ratio M   Ratio SD   Time M (min)   Time SD   Key Characteristics
Comprehensive design assessors   119   39.8    12.6     52.2      1.6        0.95      0.03       987.4          463.2     Complete most/all modules
Initial design explorers         178   38.1    14.9     4.5       3.8        0.08      0.07       65.3           52.8      Disengage after initial exploration
Selective design evaluators      40    37.9    13.5     29.3      7.1        0.53      0.13       384.6          159.7     Complete specific topics of interest
Note. Clusters were identified using the k-means clustering algorithm based on progress ratio and engagement pattern variables.
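The note credits k-means clustering on the progress-ratio and engagement variables. A minimal one-dimensional k-means sketch on illustrative ratios (not the study's data, and simpler than a multivariate implementation) that recovers three centroids resembling the reported cluster means:

```python
import statistics

def kmeans_1d(values: list[float], k: int = 3, iters: int = 50) -> list[float]:
    """Minimal 1-D k-means; returns the sorted cluster centroids."""
    sv = sorted(values)
    # Seed centroids spread across the sorted range (assumes k >= 2)
    centroids = [sv[round(i * (len(sv) - 1) / (k - 1))] for i in range(k)]
    for _ in range(iters):
        clusters: list[list[float]] = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[nearest].append(v)
        centroids = [statistics.mean(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# Illustrative progress ratios mimicking the three evaluator typologies
ratios = [0.05, 0.07, 0.08, 0.10, 0.50, 0.53, 0.55, 0.92, 0.95, 0.97]
centroids = kmeans_1d(ratios)
print([round(c, 2) for c in centroids])
```

With clearly separated groups like these, the centroids land near the low, middle, and high bands — mirroring the "initial explorer", "selective evaluator", and "comprehensive assessor" typologies in Table 9.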

Share and Cite

MDPI and ACS Style

Gkintoni, E.; Magriplis, E.; Vantaraki, F.; Skoulidi, C.-M.; Anastassopoulos, P.; Cornea, A.; Inchaurraga, B.; Santurtun, J.; Mancha, A.d.l.C.; Giorgakis, G.; et al. Designing for Engagement in Primary Health Education Through Digital Game-Based Learning: Cross-National Behavioral Evidence from the iLearn4Health Platform. Behav. Sci. 2025, 15, 847. https://doi.org/10.3390/bs15070847
