Article

EDTF: A User-Centered Approach to Digital Educational Games Design and Development

by Raluca Ionela Maxim *,† and Joan Arnedo-Moreno *,†

Department of Multimedia and Game Design, Faculty of IT, Multimedia and Telecommunications (IMT), Universitat Oberta de Catalunya, 08018 Barcelona, Spain

* Authors to whom correspondence should be addressed.
† These authors contributed equally to this work.
Information 2025, 16(9), 794; https://doi.org/10.3390/info16090794
Submission received: 30 May 2025 / Revised: 27 August 2025 / Accepted: 8 September 2025 / Published: 12 September 2025
(This article belongs to the Special Issue Recent Advances and Perspectives in Human-Computer Interaction)

Abstract

The creation of digital educational games often lacks strong user-centered design despite available frameworks, which tend to focus on technical and instructional aspects. This paper presents the Empathic Design Thinking Framework (EDTF), a structured methodology tailored to digital educational game creation. Rooted in human–computer interaction (HCI) principles, the EDTF integrates continuous co-design and iterative user research from ideation to deployment, involving both learners and instructors throughout all phases; it positions empathic design (ED) principles as an important component of HCI, focusing not only on identifying user needs but also on understanding users’ lived experiences, motivations, and frustrations. Developed through design science research, the EDTF offers step-by-step guidance, comprising 10 steps, that reduces uncertainty for novice and experienced designers, developers, and HCI experts alike. The framework was validated in two robust phases. First, it was evaluated by 60 instructional game experts, including designers, developers, and HCI professionals, using an adapted questionnaire covering dimensions such as clarity, problem-solving, consistency, and innovation, as well as standardized scales such as UMUX-Lite for perceived ease of use and usefulness and SUS for perceived usability. This was followed by in-depth interviews with 18 experts to understand the feasibility and conceptualization of EDTF applicability. The strong validation results highlight the framework’s potential to guide the design and development of educational games that take into account HCI principles and are usable, efficient, and impactful.

Graphical Abstract

1. Introduction

Digital serious games (DSGs) have firmly established themselves as valuable educational resources, combining interactive digital environments with pedagogically grounded learning objectives [1]. Their application spans various domains—from formal K-12 and higher education settings to corporate and military training—highlighting their versatility and relevance. Examples include medical simulations for nursing students, language acquisition tools, and cybersecurity training environments, each leveraging game-based mechanics to increase learner motivation, cognitive engagement, and knowledge retention [2,3,4]. Particularly because of their digital format, they have shown potential for scalable, self-paced, and adaptive learning experiences, often outperforming traditional passive approaches in terms of learner participation and outcomes. The focus has therefore shifted to optimizing their design and implementation for meaningful, user-centered learning [5,6].
Despite the growing body of evidence supporting the pedagogical potential of digital serious games (DSGs), their effectiveness in educational settings continues to be constrained by a range of systemic and infrastructural challenges. One major limitation lies in the insufficient integration of DSGs into formal curricula. Even when games are well-aligned with learning objectives, educators often encounter difficulties incorporating them into rigid curricular structures and standardized assessment frameworks, owing to the lack of clear strategies for DSG adoption in academia. This means that without clear curricular mappings and instructional scaffolding, DSGs are frequently perceived as supplementary rather than integral to the learning process [7,8].
Another critical barrier is the lack of comprehensive teacher training. While digital fluency among educators is increasing, many still report feeling ill-equipped to select, adapt, and meaningfully integrate DSGs into classroom practice. Some authors found that pre-service teachers, even those with high levels of digital competence, expressed uncertainty about the pedagogical application of DSGs in the absence of explicit training and methodological guidance [9]. Additionally, resource constraints significantly hinder the implementation of DSGs in many educational contexts. These include limited access to updated digital infrastructure and insufficient technical support. Several studies identified the technological barriers as a predominant challenge, often outweighing educators’ willingness or the perceived relevance of DSGs to the curriculum [10,11].
While these structural and institutional barriers merit substantial attention, the present work focuses on a more nuanced but equally critical challenge: the design limitations on human–computer interaction (HCI) integration. Specifically, many DSGs underemphasize the emotional, experiential, and social dimensions of gameplay—factors that are increasingly recognized as essential to deep learning and sustained engagement. Integrating affective computing elements, such as emotion-aware interactions and empathetic feedback systems, can significantly enhance user engagement and learning outcomes. However, such considerations remain underexplored in current game design practices, which often prioritize cognitive content delivery at the expense of user experience design. According to the literature, issues such as unclear interaction flows, poorly designed feedback systems, and emotionally flat experiences are common across DSGs [12,13], resulting in products that may functionally deliver content but lack the experiential richness required to fully engage learners. This gap in human-centered approaches represents a critical barrier to unlocking DSGs’ full educational potential and calls for more comprehensive design frameworks.
Recognizing the diversity of educational game contexts, it is important to highlight the specific HCI needs within the domain of educational games. The demands placed on such games extend beyond gameplay mechanics—they are shaped by the learner’s environment, cognitive load, and interaction expectations. In that regard, addressing the shortcomings of DSGs requires bridging the pedagogical with the experiential—an effort where the general field of HCI converges with more specific ones such as empathic design (ED) and design thinking (DT), providing a multidisciplinary foundation that emphasizes usability, accessibility, and interaction design tailored to users’ cognitive, emotional, and behavioral patterns. Within HCI, ED emerges as a specialized approach that focuses on understanding users’ emotions, motivations, and contexts through immersive and observational research [14,15]. Complementing ED, DT offers an iterative, collaborative process for problem-solving that includes ideation, prototyping, and user testing—stages deeply rooted in empathy and co-creation [16]. Together, ED and DT function as methodological extensions of HCI, reinforcing its user-centered foundation while adding emotional depth and iterative innovation. Applied to DSGs, these approaches promise to enhance not only what learners know, but also how they feel and interact during the learning process. The intersection of HCI, ED, and DT thus presents a compelling response to the limitations identified in current DSG design practice.
Taking all these aspects into account, this paper proposes the Empathic Design Thinking Framework (EDTF)—a structured approach that merges empathic design and design thinking within the broader principles of HCI to support the development of digital serious games that are pedagogically sound, emotionally engaging, and experientially rich. The EDTF emphasizes iterative co-design with learners and educators, prioritizing usability, inclusivity, and affective engagement across 10 well-defined methodological steps. These steps guide practitioners through phases such as contextual inquiry, empathy mapping, participatory design, and iterative testing, ensuring the final product reflects both pedagogical goals and user experiences. By operationalizing abstract concepts like empathy and human-centeredness into actionable design tasks, the EDTF fills a gap in current educational game design methodologies. It is accessible to both novice and expert designers, making it suitable for use in diverse educational and training contexts. This framework is especially relevant for teams working across disciplines, providing a common language and process for aligning pedagogical, technical, and emotional dimensions in DSG design.
This paper is structured as follows: Section 2 presents a brief review of the literature on the current state of digital serious games and identifies key design challenges that help frame the current proposal. Section 3 introduces the EDTF framework’s methodological components, including relevant HCI, ED, and DT principles. Section 4 outlines the evaluation method, while Section 5 discusses the results of the framework’s evaluation with experts. Finally, Section 6 and Section 7 conclude this paper with reflections on the evaluation results, implications, limitations, and future directions for the use of the EDTF in serious game design.

2. The Design Challenges of Digital Serious Games

Digital serious games (DSGs) are interactive, digitally mediated learning experiences that leverage game-based structures to deliver instructional content across a wide array of fields, including science, healthcare, business, and the arts [17,18]. By incorporating audiovisual media, interactivity, and narrative structures, DSGs aim to create immersive environments that engage learners in challenges mirroring real-world complexity. Though structurally similar to entertainment games, DSGs diverge in purpose: Their primary objective is instructional, not recreational. This dual imperative—to both engage and educate—introduces distinct design complexities that position DSGs as a unique area of applied research and development [19].
To navigate these complexities, design frameworks for digital serious games (DSGDFWs) have emerged as guiding instruments that translate pedagogical objectives into interactive, game-based experiences. For the purposes of this analysis, a framework is understood as a structured and repeatable design approach that offers defined stages, guiding principles, and practical tools to support the development of serious games—distinct from more general models, taxonomies, or theoretical discussions that do not provide actionable design guidance [20,21]. These frameworks aim to harmonize instructional goals with gameplay mechanics, incorporating motivational design techniques such as challenge loops, progression structures, feedback systems, and social elements. In practice, DSG frameworks are often interdisciplinary in scope, integrating learning theories, development workflows, and game design principles. However, no single universal framework exists. Most are tailored to specific disciplinary domains or emphasize certain design dimensions—such as educational content, production methodology, or technical implementation—over others [22,23].

2.1. State of the Art on Design Frameworks

The design of digital serious games (DSGs) for education relies on frameworks that bridge pedagogy, technology, and user experience. This section provides an overview of existing frameworks for the design of DSGs exclusively, a theoretical framework for ED-DT, and insights on how the latter can be used to improve the former.

2.1.1. Overview of Existing Frameworks

A formal review under rigorous academic standards allows the identification of different educational game design frameworks in the literature. The selected models and frameworks were required to (i) meet empirical validation through peer-reviewed studies of serious games frameworks or models; (ii) have an explicit alignment with formal educational objectives and assessment strategies; and (iii) demonstrate alignment—or lack thereof—with empathic design (ED), design thinking (DT), and human–computer interaction (HCI) principles, such as empathy mapping, affective feedback, or co-design with both educators and learners in alignment with designers and developers through collaborative and efficient communication. This boundary ensured the synthesis focused on a central research question: how rigorously validated, education-ready, HCI-informed serious game frameworks can address the empathic, affective, and experiential gaps in digital serious games—without diluting the evidence base with conceptually rich but empirically untested guidance.
The results include the Triadic Framework [24], which emphasizes balancing reality, meaning, and play; the Balanced Design Framework [25], which aligns curriculum standards, pedagogy, and game mechanics; the Game Rules Scenario Model [26], which engages players through structured, rule-based scenarios; and the Game-Based Learning Framework [27], which integrates learning theory, pedagogical models, and game mechanics. The review also included iteration-focused models such as the Transdisciplinary Model [28], the iMPOS2inG Model for interdisciplinary serious games in healthcare and social care [29], and the Tandem Transformational Game Design Framework [30], which introduces two lead designers—one for gameplay and one for content—working in parallel to ensure alignment between fun and learning. While the latter considers player needs, involvement occurs indirectly through designers’ interpretation rather than direct user participation and co-design during early stages.
Outside the strictly peer-reviewed research papers, the game-design literature offers several highly influential practitioner handbooks and conceptual models for entertainment game design—notably, Schell’s Lenses [31], Koster’s Theory of Fun [32] and Fullerton’s Game Design Workshop [33], among others. These works foreground audience enjoyment, iterative play-testing, and stakeholder balance, and have shaped commercial practice, but omit actionable ED/DT tools for educational contexts. Fundamentally, they are either not focused on designing educational games or do not strictly meet the aforementioned requirements.
A synthesis of the results shows that, despite differences in emphasis, existing frameworks tend to share a number of foundational assumptions. First, they prioritize instructional alignment, aiming to ensure that game mechanics directly serve educational goals. This often involves embedding assessments within gameplay and structuring progression to scaffold knowledge acquisition. Second, motivational design is a common thread across frameworks, frequently drawing on extrinsic motivators such as badges, points, or levels, alongside intrinsic motivators like narrative immersion or autonomy. Third, most frameworks follow an iterative production logic, advocating design cycles that include ideation, prototyping, testing, and refinement. Finally, they typically define role distributions across teams, with clear distinctions between educators, developers, designers, and other contributors—although these roles rarely intersect in deeply collaborative or co-creative ways [21].

2.1.2. Properties of DSG Design Frameworks

While these frameworks provide essential scaffolding for the design and development of DSGs, the current literature identifies several avenues for improvement. One of the most prominent aspects is that they tend to emphasize technical and instructional elements, often at the expense of the experiential quality of interaction. In most cases, frameworks describe what the learner should learn and how to embed that into gameplay, but they offer little insight into what it feels like for learners to engage with the system. The affective, sensory, and emotional dimensions of gameplay, the lived experience of the user, are underexplored or entirely absent [34]. Another significant issue lies in the underdeveloped integration of empathic and co-design practices. While many frameworks advocate involving users in usability testing or pilot phases, few offer concrete strategies for engaging learners, educators, or marginalized users throughout the design process. Co-creative approaches and empathic design—approaches that actively seek to understand and respond to users’ emotional, contextual, and cognitive needs—remain peripheral, rather than core, to most DSGDFWs [35,36]. As a result, usability, accessibility, and learner agency are often only considered after key design decisions have been made.
Nevertheless, existing frameworks for digital serious game (DSG) design offer valuable conceptual foundations that align learning goals with game mechanics and instructional strategies. Many of these approaches present structured design elements and taxonomies that help educators and developers ensure pedagogical consistency and domain relevance [7,37]. However, the practical adoption of such frameworks in real educational contexts reveals several areas where further development may enhance usability, inclusivity, and sustainability. The importance of empathic interface design remains an area deserving greater attention, since some frameworks implicitly assume a baseline of technical literacy or defer interface design decisions to developers, which may inadvertently exclude lecturers and learners from early-stage design conversations. This limited involvement may reduce the relevance or accessibility of the resulting DSGs, particularly in diverse educational settings where digital proficiency and device availability vary [38,39].
Recent research has emphasized the growing importance of interface-related factors such as visibility, reliability, and memorability in shaping user experience. These parameters are especially critical in environments where learners are expected to interact with complex visual or navigational systems. However, these components are not always prominently featured in existing frameworks, suggesting room for better integration between theory and practice in terms of usability [40,41]. For instance, empirical studies suggest that a significant proportion of educators discontinue the use of DSGs due to unintuitive interfaces—a challenge often intensified by limited usability testing with target users such as teachers and students, as well as constraints on iterative refinement cycles. This issue tends to be more pronounced in frameworks that prioritize instructional alignment over front-end usability, potentially creating a misalignment between theoretical design and classroom realities [7,37].
Another avenue for improvement lies in the early-stage collaboration with educational institutions. While interdisciplinary teamwork is often encouraged, co-design processes with schools and teachers are not always formalized, which may result in interfaces that are misaligned with real-world pedagogical workflows. Moreover, practical development challenges—such as limited funding—can push teams to focus on back-end functionalities like data storage or system performance, sometimes at the expense of user-facing interface quality [42]. There is also an opportunity to more effectively support non-technical stakeholders in the design process. Currently, few accessible and educator-friendly prototyping tools exist, making it difficult for teachers or administrators to engage with or shape early design iterations. As a result, valuable experiential insights may be underutilized, and a cycle of exclusion may be unintentionally reinforced [43,44].
Cultural responsiveness represents another area with potential for further development. While many frameworks acknowledge the importance of motivational strategies and engaging narratives, these are often grounded in Western-centric assumptions. Frameworks could benefit from more explicit mechanisms to adapt storylines, characters, or visual design elements to reflect diverse cultural, linguistic, and socio-economic learner backgrounds—an essential step toward fostering greater inclusivity and global applicability [36]. The ethical dimensions of DSG design, particularly in the context of emerging technologies, also warrant closer integration. Despite increasing concerns about privacy, psychological safety, algorithmic bias, and consent in digital learning environments, such issues are rarely embedded within the core structure of most frameworks. As DSGs begin to incorporate AI and learner analytics, the need for embedded ethical guidance becomes more pressing [19,45].
Finally, while multidisciplinary collaboration is often cited as a best practice, frameworks rarely provide explicit strategies or tools to support deep, transdisciplinary integration. Instead, collaboration is often limited to input from distinct roles working in parallel rather than fostering a shared and iterative design practice. Additionally, many frameworks tend to operate as fixed taxonomies or rigid checklists. While this offers clarity and structure, it may limit adaptability in innovative or experimental educational environments that require greater contextual sensitivity and iterative freedom [23].
In summary, existing frameworks have made important strides in aligning instructional objectives with game mechanics, offering valuable structure for the development of pedagogically grounded serious games. However, certain experiential dimensions—such as emotional engagement, ethical awareness, and cultural responsiveness—receive comparatively less emphasis. Many frameworks tend to foreground functional elements, procedural design, and standardization, which, while essential, may inadvertently limit adaptability, inclusivity, and learner-centered innovation. As the field continues to evolve, there is growing recognition of the potential to more fully integrate the emotional, motivational, and experiential perspectives of both educators and learners to create richer, more immersive, and co-creative game-based learning environments [22,46].

2.2. Empathic Design and Design Thinking: A Theoretical Lens

Empathic design, as grounded in human-centered methodologies [47], transcends traditional usability-focused design by emphasizing emotional immersion in learners’ lived experiences. Its key pillars include affective mapping (the observation of emotional responses during gameplay), latent need discovery (surfacing unspoken struggles or desires), and co-creation (involving learners and educators in prototyping and ideation). Design thinking complements ED by providing iterative cycles of ideation, testing, and refinement [48]; however, existing game design frameworks, such as Fullerton’s Game Design Workshop, often lack mechanisms for integrating ED in a structured, research-grounded manner [33].
ED encourages designers to understand users by stepping into their perspective—observing, listening, and analyzing experiences in context [47,49]. It surpasses basic needs analysis by uncovering latent needs, emotional barriers, and sociocultural contexts that influence user behavior. In parallel, design thinking (DT) has emerged as a structured but flexible methodology for addressing complex design challenges. It integrates user research, creativity, and experimentation in iterative cycles that foster innovation and responsiveness [48]. When applied to educational technology, DT emphasizes rapid prototyping and feedback loops with users, allowing educational game designers to continuously refine both learning mechanics and emotional resonance. Together, ED and DT offer a robust toolkit for tackling the core challenges outlined earlier—namely, the neglect of experiential design, emotional connection, and real-world relevance in DSGs. Their integration, when embedded within HCI, supports the foundation for new methodological frameworks.
To address these HCI-related considerations within digital serious game (DSG) design, it is useful to draw on a set of core principles from empathic design (ED) and design thinking (DT), which offer structured guidance for designing interactive systems that are both usable and meaningful in educational contexts. First, user-centered design emphasizes the importance of identifying authentic user needs through the observation of real behaviors, including latent or unspoken expectations, in order to support intuitive and accessible user experiences [50,51]. Second, the principle of user experience foregrounds emotional connection and perceived usefulness, advocating for inclusive co-creation practices grounded in shared understanding between designers and users [52]. Third, the learning journey supports the design of emotionally responsive mechanisms that sustain attention, curiosity, and motivation over time—elements shown to be important for promoting meaningful engagement and learning outcomes [53]. Finally, the visual aspect addresses the role of aesthetics, immersive design, and visual engagement in shaping emotional resonance and overall user satisfaction during gameplay [54]. Together, these principles offer a foundation for approaching DSG design in a way that balances functional usability with pedagogical alignment and emotional relevance.
When applied to classroom-integrated scenarios, three human–computer interaction (HCI) needs emerge as particularly relevant: (1) usability that accommodates diverse cognitive demands, (2) coherent integration of ED and DT principles with instructional content, and (3) design flexibility that supports both educator- and learner-driven use. Focusing on these dimensions enables the development of tools that are grounded in formal learning environments while remaining adaptable to adjacent educational contexts.

2.3. Application of ED-DT Principles in Educational Game Design

The synthesis of existing DSG design frameworks and the principles of empathic design and design thinking (ED-DT) reveals several promising opportunities for enhancement. While current frameworks provide strong structural alignment between pedagogical goals and game mechanics, certain experiential, emotional, and ethical dimensions remain underdeveloped or conceptually framed rather than methodologically embedded. For instance, many frameworks encourage interdisciplinary collaboration and recognize narrative as a key component of engagement, yet these elements are often presented at a high level of abstraction, without offering concrete methods to implement them in practice [24,26,27,30]. As discussed in Section 2.1.2, this can lead to challenges in addressing learner diversity, cultural responsiveness, and emotional resonance—particularly when educators and designers face time, resource, or logistical constraints.
ED-DT may offer valuable methodological scaffolding to bridge these challenges. For example, the Empathize and Co-Design stages of ED-DT provide actionable strategies—such as contextual observation, emotional immersion, and participatory ideation—for involving users meaningfully throughout the design process. These practices align with the need for earlier and deeper integration of user perspectives, particularly in settings where access to end users is limited or where traditional prototyping tools are inaccessible to non-technical stakeholders [51,52].
Similarly, the Define and Ideate stages can help translate abstract design principles into tangible outputs that address usability, adaptability, and motivational appeal—areas highlighted in Section 2.1.2 as frequently overlooked in existing frameworks [43,44]. In this way, ED-DT does not seek to replace current frameworks but to enhance them by embedding emotional and experiential considerations alongside pedagogical ones [42,43].
By emphasizing iterative, human-centered approaches, ED-DT may also support better alignment between instructional objectives and user engagement. Rather than treating affective and cognitive dimensions as separate, ED-DT encourages their integration, potentially leading to designs that are both educationally effective and experientially rich [21]. These principles could help address avenues for improvement identified in the literature, particularly around ethical design, inclusive storytelling, and co-creation.

3. Empathic Design Thinking Framework (EDTF)

The Empathic Design Thinking Framework (EDTF) emerged from a comprehensive synthesis of existing literature across two major design domains: digital serious games (DSGs) and digital entertainment games (DEGs) [21], analyzing their main design principles, commonalities, and gaps (see also Section 2). These insights laid the conceptual groundwork for the EDTF, which was developed as a response to persistent limitations in traditional serious game development frameworks. The proposed framework seeks to establish a holistic, user-centric methodology that embeds empathy, interdisciplinary collaboration, and iterative evaluation across the full lifecycle of serious game design. Figure 1 provides an overview of the phases involved, which include conceptualization, design, development, testing, and deployment. Each phase plays a critical role in ensuring that the game not only delivers effective instruction but also fosters meaningful, immersive learner experiences.
In accordance with the challenges discussed in Section 2, empathic design thinking was adopted as the methodological lens for uncovering and incorporating these human-centered insights into the design process. By identifying common design phases in existing frameworks (exploration, design, development, and assessment), and weaving empathic design principles into them, the EDTF repositions users not just as end recipients, but as active participants in shaping game experiences. Additionally, it is fundamentally grounded in the established body of tools and methods from the fields of user experience (UX) and human–computer interaction (HCI). Each phase of the EDTF—from Empathize and Co-Design to UX Evaluation and Assessment—draws upon proven instruments for human-centered, iterative, and context-aware design. These roots ensure that the EDTF offers a scientifically informed yet practically applicable structure for serious game development and educational technology design [55,56].
The EDTF is proposed as a structured, human-centered approach to DSGs that explicitly integrates methods from empathic design (ED) and design thinking (DT) into all phases of development. While many educational or game design frameworks focus on aligning learning outcomes with mechanics or balancing content and play with indirect consideration of users’ needs, the EDTF distinguishes itself in three ways:
(a) 
Emotion and User Experience as Core Drivers: The EDTF positions emotional resonance, educator and learner (player) perception, and contextual user needs as design anchors, not afterthoughts. While traditional models or frameworks often include players in playtesting or final usability stages, the EDTF embeds their affective needs from the first step via empathic research techniques such as shadowing, empathy mapping, journey maps, co-creation workshops, and value-centered interviews.
(b) 
Systematic Integration of Empathic Design and Design Thinking Methods: Unlike models that rely on generalized “user-centered” claims, the EDTF outlines specific tools and methods from empathic design [47,57,58] and design thinking [48] in each phase. It supports structured problem framing, user segmentation, emotional mapping, prototyping, and affective feedback, with design checkpoints based on empathic insight.
(c) 
Designed for Educational Constraints: The EDTF is tailored for resource-constrained educational contexts, acknowledging that small teams, limited funding, and institutional silos often inhibit full-cycle design. Therefore, the framework prioritizes lightweight, high-impact tools and participatory shortcuts (e.g., lean empathy mapping, emotional card sorting, stakeholder personas) that make empathic co-design feasible in practice.
The EDTF integrates principles from HCI, ED, and DT to support user experience and engagement; however, it is foremost an educational framework. Its structure explicitly incorporates instructional design elements aimed at supporting the development of clearly defined learning outcomes, competencies, and skills. As such, the EDTF seeks to balance engagement with instructional effectiveness, addressing both learner needs and educational objectives throughout the design process.
The EDTF is not limited to pre-design analysis or confined to rapid prototyping. Rather, it is a full-cycle design and development framework that spans from early discovery through implementation and redesign. To operationalize empathic design in the EDTF, the framework provides a set of methodological guiding questions and tools for each stage to serve as a “readiness checklist” in Section 3.1.
EDTF’s initial phases, Empathize and Co-Design Instructional Content, are directly informed by empathy mapping, personas, and user journey mapping—core UX tools that help teams understand users’ mental models, motivations, and pain points. These methods support designers in stepping into and out of users’ experiences, a hallmark of empathic design. Some researchers and authors [14] underscore the value of “stepping into the user’s life” to foster creative design that genuinely resonates with user needs, while others outline journey maps as critical for capturing user behaviors and emotional states over time [14,59].
In the Co-Design Scenario and Instructional Narrative phases, the EDTF applies storyboarding, co-creation workshops, and sketch-based tools to enable educators, learners, and designers to collaborate on ideating experiences. These visual thinking techniques help teams experiment with structure, flow, and user tasks before committing to full prototypes. Participatory design methods, as advocated by the d.school [60], empower users and stakeholders to co-own the design process, anchoring the EDTF in democratized innovation. The UX Evaluation (Storyboard) and Low-Fidelity Prototype phases leverage well-established usability inspection techniques, including Nielsen’s Heuristics [59], Playability Heuristics [61], and think-aloud protocols [62]. These tools enable EDTF practitioners to diagnose early-stage usability breakdowns, test player engagement, and iterate design concepts before production. Think-aloud protocols and cognitive walkthroughs—especially when paired with instructional design experts—offer insight into users’ learning pathways and interaction frustrations.
Later EDTF phases—High-Fidelity Prototype and Evaluation and Assessment—build on models of engagement, enjoyment, and flow in games [63], as well as UX behavioral analytics and learning outcome metrics. The use of game flow aligns with the EDTF’s goal to balance cognitive load, immersion, and educational motivation, while tools such as usability metrics help operationalize experience data and refine game iterations through evidence [64].
Overall, the EDTF synthesizes key human-centered and empathic practices that are deeply embedded in the fields of design thinking and interaction design. Some researchers highlight that good design requires respecting the user’s needs, capabilities, and emotions; openness; an emphasis on iteration and ideation [65]; and radical collaboration—all principles that the EDTF embodies in its application to serious games and instructional experiences.

3.1. EDTF 10 Phases

As a digital educational game design framework, the EDTF integrates principles from empathic design thinking and instructional design. It supports not only engagement and usability, but also the articulation of clear learning outcomes, alignment with pedagogical goals, and the evaluation of skill, knowledge, or attitudinal change.
The EDTF framework foregrounds the application of ED and DT by embedding learners’ emotional and motivational dimensions directly into the structural scaffolding of serious game development, combining educational content with playful elements across the end-to-end learning experience. Unlike traditional DSG design models that emphasize either instructional alignment or entertainment engagement, the EDTF uniquely positions empathic insights—not as peripheral inspiration, but as core design constraints [21]. It operationalizes ED principles through structured learner immersion, persona co-creation, emotion-driven journey mapping, and reflective iteration loops [14,66,67]. From DT, the EDTF adapts iterative prototyping, problem reframing from the learner’s perspective, and collaborative sensemaking to educational contexts. What makes this framework distinctively ED-/DT-based is its emphasis on designing for emotional resonance and experiential alignment before solution ideation, ensuring that learning goals are not just delivered but felt and owned by learners. This deep integration of ED/DT grounds the framework in inclusivity, learner identity, and intrinsic motivation—elements often overlooked in instructional game development.
The EDTF framework acknowledges the challenge of learners’ expectations shaped by commercial entertainment games and addresses it by setting transparent design intents early in the process. ED principles guide the framework to focus not on replicating commercial polish, but on creating experiences that are emotionally meaningful and cognitively engaging. To avoid the pitfall of over-prioritizing entertainment at the expense of learning outcomes, the EDTF defines “resonance” not in terms of graphics or complexity, but in terms of felt relevance, narrative ownership, and sustained motivation. By integrating co-design sessions, user storyboards, and expectation management as part of the empathic inquiry phase, the framework ensures that learners are part of setting realistic, co-owned expectations [68]. Furthermore, it makes explicit that empathic engagement and experiential memorability are not distractions but learning enhancers. The framework separates hedonic appeal from extrinsic gratification, focusing instead on serious games that are immersive because they are contextually relevant—not merely visually impressive.
To address conflicting user input (e.g., varying thematic preferences), the EDTF framework applies a structured empathic synthesis approach. First, learner preferences and needs are observed, collected, and analyzed using thematic clustering to identify major narrative or emotional anchors across groups [69]. The game design team then employs a prioritization matrix that weighs multiple factors: frequency of preference, alignment with learning goals, potential for modular implementation, and technical feasibility. In cases where thematic preferences are diverse but balanced, EDTF supports modular or personalized entry points (e.g., letting users select a theme at the beginning), maintaining agency without diluting focus. This avoids design chaos by embedding user data into systematic design decision-making [70,71]. High-impact, emotionally resonant options are prioritized, but within scope-aware boundaries. This strategy allows the framework to stay empathically grounded while delivering a coherent, pedagogically aligned experience.
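The prioritization matrix described above can be sketched as a simple weighted-scoring routine. This is an illustrative sketch only: the factor names, weights, and theme ratings below are hypothetical examples, not values prescribed by the EDTF.

```python
# Illustrative sketch of an EDTF-style prioritization matrix for resolving
# conflicting thematic preferences. All factor names, weights, and ratings
# are hypothetical.

CRITERIA = {                      # weight of each factor (sums to 1.0)
    "frequency": 0.35,            # how often learners voiced the preference
    "learning_alignment": 0.35,   # fit with the stated learning goals
    "modularity": 0.15,           # potential for modular implementation
    "feasibility": 0.15,          # technical feasibility
}

def priority_score(theme_ratings: dict) -> float:
    """Weighted sum of normalized 0-1 ratings for one candidate theme."""
    return sum(CRITERIA[c] * theme_ratings[c] for c in CRITERIA)

# Two hypothetical candidate themes rated 0-1 on each factor.
themes = {
    "space_exploration": {"frequency": 0.8, "learning_alignment": 0.9,
                          "modularity": 0.6, "feasibility": 0.7},
    "medieval_quest":    {"frequency": 0.6, "learning_alignment": 0.5,
                          "modularity": 0.9, "feasibility": 0.9},
}

# Rank themes by weighted score; the top entries become primary design anchors,
# while lower-ranked but balanced themes may become optional entry points.
ranked = sorted(themes, key=lambda t: priority_score(themes[t]), reverse=True)
```

In practice the weights themselves would be negotiated by the design team, with learning-goal alignment typically kept at or above the weight given to raw preference frequency so that popularity cannot override pedagogy.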
Although the framework emphasizes emotional engagement and empathic design, it does so to support, not replace, learning outcomes. The intent is not to mimic commercial games, but to use emotional design as a mechanism for making content more memorable and meaningful, in line with cognitive and motivational learning theories [21].
The EDTF unfolds across 10 interconnected phases, which collectively establish a responsive and replicable blueprint for serious game development. It begins with (1) Empathize, which resolves the often-overlooked need to deeply understand users’ motivations, emotions, and contextual constraints through qualitative research.
The (2) Co-Design Scenario phase ensures meaningful collaboration with lecturers and learners, countering the common lack of stakeholder involvement and promoting shared design ownership. During this phase, designers collaboratively define core learning objectives and expected competencies the serious game aims to foster [72]. (3) Co-Design Instructional Content bridges the gap between gameplay and pedagogy by aligning content with clear instructional goals and learner expectations. Design teams brainstorm game mechanics and narratives that can support the development of target skills or knowledge areas, ensuring that engagement strategies align with intended learning outcomes [73].
Moving into the validation stages, (4) UX Evaluation using storyboards addresses the absence of early narrative and flow testing, allowing adjustments before costly development. The (5) Low-Fidelity Prototype phase enables early feedback and rapid iteration, solving the problem of late-stage usability discoveries. Building on this, (6) Iterative UX Evaluation introduces continuous user testing to refine mechanics and interface elements—filling the gap in ongoing UX refinement.
The (7) High-Fidelity Prototype phase delivers a functional version for realistic testing, responding to the lack of integrated environments for assessing learning efficacy and engagement. In (8) Development, the framework fosters cross-functional collaboration between design, technical, and pedagogical teams, mitigating issues from siloed development practices. (9) Evaluation and Assessment focuses on learning analytics, satisfaction, and behavioral data, addressing the typical weakness in measuring educational impact. This phase evaluates not only user engagement and emotional resonance, but also the game’s effectiveness in meeting its learning goals. The emphasis is on evidencing educational transformation and knowledge transfer, consistent with established instructional evaluation models [74]. Finally, (10) Deployment and Change Management supports strategic rollout, long-term relevance, and adaptability, countering the common neglect of implementation and sustainability planning.
Further, this section provides a closer look at each of the 10 phases in the framework. It outlines the purpose and key activities of each phase and shows how they contribute to a more user-centered and pedagogically grounded approach to serious game design.

3.1.1. Phase 1: Empathize

The process begins with a deep investigation of user needs, motivations, and contextual constraints through qualitative insight gathering. At this foundational stage, the focus is on understanding user behaviors, emotional responses, and pain points in the context of game conceptualization. By exploring the attitudes and experiences of learners, instructors, and stakeholders, the team uncovers opportunities to build empathy and identify real-world challenges that the game could address. Empathy maps are created to visualize users’ thoughts, feelings, and behaviors, which serve as a foundation for all future design decisions [75].
This phase focuses on identifying and segmenting diverse user types—including learners, educators, and stakeholders—to ground the design in emotional and contextual understanding as per ED and DT principles.
Key Goal: The goal is to gain a deep understanding of users’ needs by exploring their experiences, emotional states, and behavioral patterns. By segmenting the user base, the design process can more effectively address distinct motivations and challenges.
ED and DT Design Questions: Key questions guiding this phase include the following: Who are the users? What are their fears, motivations, and pain points? How do they experience the current system?
ED and DT Tools: The objective is to cultivate deep user understanding using tools such as Empathy Maps for gamestorming [76], Contextual Interviews [77], and Emotional Journey Maps [14]. Additionally, the Divergence–Convergence Matrix from the Game Design Matrix [78] supports creative synthesis in this phase.
Context: Game conceptualization and foundational mechanics.
UX Focus: Evaluating user behaviors, values, pain points, emotional responses, and motivations.
Opportunity Potential: Uncover visionary potential to address real user challenges.
Gaps/Limitations Addressed: Overcomes the lack of attention to users’ implicit motivations, emotional context, and experiential needs identified in prior approaches.
Key Activities:
  • Identify and segment target users (learners, instructors, and stakeholders)
  • Conduct qualitative research to understand user attitudes and behaviors
  • Cultivate deep empathy to inform user-centered design
Artifact: Empathy Maps—visualizing user thoughts, feelings, and behaviors.

3.1.2. Phase 2: Co-Design Scenario

This phase shifts toward collaboratively mapping the learning journey and personas. It involves co-creation sessions with lecturers and learners to define how the educational narrative and user experience will unfold. Through collaborative mapping of user journeys and decision points, the team ensures alignment between instructional goals and user expectations. The purpose here is to foster a sense of shared ownership while encouraging new perspectives on how learning can be approached through storytelling. The output is a journey map that captures the key milestones and interactions users will encounter during gameplay [67,79].
This phase focuses on translating user context into meaningful gameplay situations through narrative scaffolding according to ED and DT principles.
Key Goal: The aim is to build emotionally relevant, engaging scenarios anchored in real-life situations that resonate with users’ experiences and emotions.
ED & DT Design Questions: What real-life situations can anchor gameplay? How can stories evoke empathy and motivation?
ED & DT Tools: Build emotionally relevant, engaging scenarios using Scenario Mapping [59], Narrative Prototyping and Storyboarding [66], and Emotional Arc Diagrams [80,81].
Context: Structuring the learning journey and narrative.
UX Focus: Aligning story progression with educational goals through collaborative mapping.
Opportunity Potential: Encourage new perspectives in learning via interactive storytelling (paradigm shift).
Gaps/Limitations Addressed: The limited integration of learner perspectives and educational alignment in narrative design found in existing frameworks.
Key Activities:
  • Map decision points within the user journey;
  • Evolve personas based on real-world data;
  • Facilitate co-design sessions with users and educators.
Artifact: Journey Map—illustrating key interactions and learning milestones.

3.1.3. Phase 3: Co-Design Instructional Content

The game’s educational core is structured to meet pedagogical standards. This phase ensures that learning content is carefully synchronized with gameplay mechanics and narrative flow. By co-developing instructional goals with subject-matter experts and aligning them with user expectations uncovered in earlier phases, the design team ensures that content is both engaging and educationally sound. The instructional design document produced here outlines learning objectives, content flow, and embedded assessments [82].
This phase involves co-developing instructional goals and values with users to ensure aligned and relevant learning experiences as per ED and DT principles.
Key Goal: The objective is to align learning content with users’ real-world needs and diverse perspectives.
ED and DT Design Questions: What content matters to users? How can learning goals reflect real-world needs and diverse perspectives?
ED and DT Tools: Align learning with emotional and social context using Persona Building [83], Journey Mapping [67], Participatory Content Workshops [84], and the Value Proposition Canvas [85].
Context: Syncing gameplay mechanics with learning goals.
UX Focus: Ensuring instructional strategies meet learner needs.
Opportunity Potential: Innovate through learning technology integration (technically challenging).
Gaps/Limitations Addressed: Overcomes insufficient alignment between game mechanics and pedagogical objectives common in prior frameworks.
Key Activities:
  • Define measurable learning objectives;
  • Design structured content for engagement and pedagogy;
  • Validate with subject-matter experts.
Artifact: Instructional design document—outlining outcomes and content flow.

3.1.4. Phase 4: UX Evaluation (Storyboard)

This phase involves validating the game’s narrative and visual flow through early testing. Storyboards are reviewed with users and stakeholders to ensure alignment between instructional intent and user engagement. By visualizing how scenes and interactions will unfold, the team can identify mismatches between what users need and what the design currently offers. This step is critical for shaping a coherent and emotionally engaging experience, and the storyboard serves as both a design reference and a testing artifact [86,87].
This phase identifies pain points in logic, structure, and emotional flow through walkthroughs of key story flows according to ED and DT principles.
Key Goal: Map the user journey and evaluate emotional engagement and logical consistency across interactions.
ED and DT Design Questions: Where might confusion, frustration, or disengagement occur? What steps break immersion or flow?
ED and DT Tools: Evaluate storyboard logic and emotional journey using Storyboard Evaluation [66], Scenario Role-play [88], and Emotional Journey Overlay [14].
Context: Ensuring visual and narrative coherence.
UX Focus: Assessing storytelling clarity and user engagement.
Opportunity Potential: Promote cross-functional insight (multidisciplinary collaboration).
Gaps/Limitations Addressed: Solves the lack of early narrative validation and coordination between story and instructional design found in existing approaches.
Key Activities:
  • Review narrative flow with stakeholders;
  • Identify and resolve conflicts between narrative and instructional goals;
  • Iterate storyboard based on usability feedback.
Artifact: Storyboard—visualizing scenes, transitions, and interactions.

3.1.5. Phase 5: Low-Fidelity Prototyping

Moving into this phase, the team creates simple mockups or paper prototypes to quickly test basic interaction flows. These low-cost prototypes enable early testing without committing to resource-intensive development. Users are invited to interact with rough wireframes to evaluate the clarity, logic, and usability of the intended design. This early feedback loop helps surface usability issues, misconceptions, or mismatches in expectations, allowing the team to iterate quickly and efficiently [87,89].
This phase explores early design interactions with a focus on emotional and functional value in relation to ED and DT principles.
Key Goal: Quickly explore design directions and test user intuitions and reactions at low cost and risk.
ED and DT Design Questions: What aspects of interaction are intuitive or confusing? Does it feel right to play?
ED and DT Tools: Explore design alternatives using low-fidelity Paper Prototypes [90], Think-Aloud Protocols [91], and Wizard-of-Oz Testing [92].
Context: Early-stage concept validation.
UX Focus: Cost-effective usability testing.
Opportunity Potential: Enable fast iteration and feedback (actionable).
Gaps/Limitations Addressed: Resolves the issue of late usability testing by enabling early detection of design flaws and faster iteration cycles.
Key Activities:
  • Develop basic wireframes or paper prototypes;
  • Conduct usability sessions with target users;
  • Pinpoint friction points and usability gaps.
Artifact: Low-fidelity prototype—sketches, paper models, or digital mockups.

3.1.6. Phase 6: Iterative UX Evaluation

This phase builds on the low-fidelity feedback by refining the user experience through repeated usability testing. It focuses on validating interaction mechanics, feedback systems, and UI components in progressively higher fidelity. Both qualitative feedback and behavioral data are analyzed to pinpoint where users struggle, where they succeed, and how the overall experience can be enhanced. This process helps shape a product that feels intuitive and responsive to user needs, well before the final development stage with usability test reports [93].
This phase validates the logic and emotional engagement of early narrative flows according to ED and DT principles.
Key Goal: Test early structures to ensure they support coherent and emotionally engaging designs.
ED and DT Design Questions: Are the scenarios coherent? Do they emotionally engage or confuse?
ED and DT Tools: Use Cognitive Walkthroughs [94], Heuristic Evaluation [95], and Playability Heuristics [61] to improve flow and engagement.
Context: Ongoing interaction design refinement.
UX Focus: Optimizing user experience through repeated validation.
Opportunity Potential: Ensure adaptability through feedback (continuous improvement).
Gaps/Limitations Addressed: Insufficient iterative testing and refinement common in existing design processes.
Key Activities:
  • Conduct structured usability tests;
  • Analyze qualitative and quantitative feedback;
  • Improve UI/UX before advancing to high-fidelity build.
Artifact: Usability test reports—synthesizing findings and change recommendations.

3.1.7. Phase 7: High-Fidelity Prototyping

In this phase, the game is developed into a fully interactive, near-final version. This build includes refined visuals, interactions, and audio elements. At this point, the team rigorously tests the game to assess engagement levels, learning effectiveness, and user satisfaction. Technical feasibility and gameplay flow are validated, and insights from this phase are used to make final design adjustments before launch. The high-fidelity prototype serves as the closest simulation of the final game and is often used for summative testing [96].
This phase playtests emotional and cognitive engagement using complete interaction models as per ED and DT principles.
Key Goal: Evaluate the game’s ability to maintain flow and emotional resonance with players.
ED and DT Design Questions: Where does flow break down? Are players frustrated, bored, or lost?
ED and DT Tools: Test engagement using the GameFlow Framework [63], Game Experience Questionnaire [97], Eye Tracking, and Emotion Recognition [98].
Context: Engagement testing and interaction validation.
UX Focus: Fine-tune interactivity, aesthetic appeal, and educational impact.
Opportunity Potential: Resolve technical hurdles pre-launch (technically challenging).
Gaps/Limitations Addressed: The lack of realistic, integrated testing environments for engagement and learning effectiveness before full development.
Key Activities:
  • Develop near-final interactive prototypes;
  • Assess engagement, retention, and technical performance;
  • Refine gameplay before full-scale development.
Artifact: High-fidelity prototype—functional, polished demo.

3.1.8. Phase 8: Development

This is the full-scale production phase, in which all aspects of the game (mechanics, content, and interface) are implemented. Cross-functional collaboration ensures that the instructional goals, user experience insights, and technical requirements are integrated into a polished product. Quality assurance testing is conducted to ensure performance, reliability, and fidelity to the original design intent. The final outcome is a fully functional game ready for deployment.
This phase integrates emotional, learning, and usability feedback into the game system according to ED and DT principles.
Key Goal: Refine mechanics and interaction to align purposefully with learning and emotional goals.
ED and DT Design Questions: Are mechanics aligned with learning and emotion? Does it feel purposeful?
ED and DT Tools: Finalize the interaction system using the System Usability Scale (SUS) [99], Task-Based Performance Testing [100], and Affective Feedback Loops.
Context: Transition from prototype to production.
UX Focus: Ensure fidelity to tested user needs and educational intent.
Opportunity Potential: Enable execution at scale (multidisciplinary execution).
Gaps/Limitations Addressed: Challenges in maintaining design integrity and quality control during full-scale production and handoff.
Key Activities:
  • Implement full game content, mechanics, and visuals;
  • Conduct comprehensive QA testing;
  • Prepare assets for deployment.
Artifact: Final game solution—complete, ready-to-deploy product.

3.1.9. Phase 9: Evaluation and Assessment

This phase focuses on measuring the impact of the game in real-world learning environments. Through a mix of learning analytics, user feedback, and performance metrics, the team evaluates whether the game meets its educational and engagement goals. Post-launch assessments capture user satisfaction, knowledge retention, and behavioral change. Evaluation reports are produced to guide further refinements and determine the overall success of the intervention.
This phase validates the impact of the game on knowledge, emotion, and behavior as per ED and DT principles.
Key Goal: Demonstrate the game’s effectiveness and transformative impact on users.
ED and DT Design Questions: Did the game meet learning and emotional goals? What was transformed?
ED and DT Tools: Evaluate impact using Pre/Post Tests, Learning Analytics [101], and Behavioral Observation [102].
Context: Post-deployment learning outcome evaluation.
UX Focus: Assess real-world educational and experiential impact.
Opportunity Potential: Drive long-term effectiveness (Far-Reaching).
Gaps/Limitations Addressed: Overcomes the lack of comprehensive impact evaluation and feedback loops after deployment in existing frameworks.
Key Activities:
  • Run post-launch assessments;
  • Track retention, engagement, and skill acquisition;
  • Identify areas for revision or enhancement.
Artifact: Evaluation reports—documenting impact with qualitative and quantitative data.

3.1.10. Phase 10: Deployment and Change Management

This phase ensures that the game is implemented smoothly and continues to evolve over time. This includes developing rollout strategies, supporting large-scale adoption, and monitoring long-term use. Feedback loops are established to gather user input for future updates, and maintenance plans are developed to ensure sustainability. System updates and release notes track improvements and support continued relevance, ensuring that the game remains a valuable learning tool well into the future.
This final phase captures insights to inform ongoing design and future scaling related to ED and DT principles.
Key Goal: Reflect on design outcomes and prepare for broader implementation or reuse.
ED and DT Design Questions: What worked, what didn’t, and why? How can future games embed empathy better?
ED and DT Tools: Improve future designs through Retrospective Interviews, Affinity Mapping, Co-reflection Workshops, and Thematic Analysis [103].
Context: Real-world scaling and lifecycle support.
UX Focus: Maintain and adapt the product based on user feedback.
Opportunity Potential: Secure sustainability and adoption (actionable).
Gaps/Limitations Addressed: Resolves the insufficient focus on ongoing maintenance, scalability, and user-driven evolution in current design approaches.
Key Activities:
  • Deploy to large-scale environments;
  • Monitor performance and user feedback;
  • Release updates and continuous improvements.
Artifact: System updates and release notes—documenting iterations and user-driven enhancements.

4. Evaluation Method

As part of the proposal, the Empathic Design Thinking Framework (EDTF) was evaluated by addressing the following research questions:
  • RQ1: How robust, usable, and applicable is the EDTF framework?
  • RQ2: What is the perceived value attributed to each of the EDTF phases?
To address these questions, a two-stage mixed-methods evaluation approach was employed. Stage 1 consisted of a structured quantitative evaluation using standardized and custom-built instruments, while Stage 2 focused on qualitative insights gathered through in-depth interviews. This section details the instruments used, participant recruitment, and data collection procedures for both stages.
The evaluation of the EDTF in this study is limited to expert validation, rather than field testing. It employs a mixed-methods approach to assess perceived utility, usability, and relevance for design practice. While future field testing within live educational projects is necessary for comprehensive and longitudinal validation, this initial evaluation follows a well-established tradition in foundational HCI research, where early-stage frameworks are commonly assessed through expert review and theoretical applicability before implementation studies are conducted [104].
Following the methodological progression seen in HCI research, we aimed to verify the framework’s clarity and usability before broader implementation with an evaluation conducted in two stages. The first stage consisted of a quantitative assessment with a sample of 60 domain experts in instructional game design, development, and human–computer interaction (HCI). Of these participants, 40% were instructional game designers, 44% instructional game developers, and 16% HCI experts. They were asked whether they could be contacted later for participation in an interview or workshop. The second stage involved a qualitative evaluation with a subsample of 18 participants drawn from the same pool of 60 experts. This group mirrored the disciplinary and career-stage composition of the larger sample, consisting of 44% instructional game developers, 33% instructional game designers, and 22% HCI experts. Further details on participant profiles are provided in Section 5.

4.1. Stage 1: Quantitative Evaluation Through Questionnaires

The first stage of the study aimed to gather broad, systematic feedback on the EDTF using a set of structured, quantitative instruments. To achieve a multi-faceted understanding of the framework’s usability and perceived value, three key tools were employed: the Usability Metric for User Experience (UMUX-Lite), the System Usability Scale (SUS), and a custom Perception Evaluation Questionnaire (PEQ).
The UMUX-Lite scale [105] is a concise, two-item measure derived from the Technology Acceptance Model [106]. This tool captures perceived ease of use (PEOU) and perceived usefulness (PU), two core predictors of user adoption, using a 7-point Likert scale. Its high correlation with SUS and minimal cognitive burden made it particularly suitable for expert respondents. To complement this, the study also incorporated the SUS [99], a 10-item Likert-scale questionnaire, to assess the general usability of the EDTF. Its scores range from 0 to 100, with values above 68 considered above average [107].The SUS was selected for its well-established reliability and simplicity in capturing a broad view of system usability.
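Both standardized instruments have simple closed-form scoring. The sketch below implements the standard SUS scoring rule (odd items contribute r − 1, even items 5 − r; the sum is multiplied by 2.5) and a plain linear rescaling of the two 7-point UMUX-Lite items to a 0–100 range; the regression adjustment sometimes used to align UMUX-Lite with SUS norms is deliberately omitted here.

```python
def sus_score(responses):
    """Standard SUS scoring: ten 1-5 Likert items. Odd-numbered items
    contribute (r - 1), even-numbered items contribute (5 - r); the sum
    is scaled by 2.5 to yield a 0-100 score (above 68 = above average)."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

def umux_lite_score(ease_of_use, usefulness):
    """Rescale the two 7-point UMUX-Lite items (PEOU, PU) to 0-100
    via a simple linear transform (no SUS-regression adjustment)."""
    return ((ease_of_use - 1) + (usefulness - 1)) / 12 * 100
```

For example, a respondent who strongly agrees with every positively worded SUS item and strongly disagrees with every negatively worded one scores 100, while uniformly neutral answers score 50.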
In addition to these standardized tools, a customized Perception Evaluation Questionnaire (PEQ) was developed to assess nine framework-specific dimensions informed by prior evaluation models in game and educational design [108]. These dimensions included Clarity, Flexibility, Comprehensiveness, Innovation, Ease of Learning, Internal Consistency, Support for Problem-Solving, Real-World Applicability, and Support for Testing and Iteration. Each dimension was rated on a 5-point Likert scale. To further understand the perceived utility of the EDTF’s structure, participants were also asked to rate the value of each of the framework’s 10 phases individually, using the same scale. A full breakdown of PEQ items and the phase evaluation scale is provided in Appendix A.
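Ratings of this kind are typically reduced to per-dimension descriptive statistics before comparison. The following minimal sketch shows one way to do this; the dimension names match the PEQ, but the response values are invented for demonstration and do not reflect the study’s actual data.

```python
from statistics import mean, median, stdev

# Hypothetical 5-point Likert responses per PEQ dimension
# (illustrative values only, not the study's data).
peq = {
    "Clarity":    [4, 5, 4, 3, 5],
    "Innovation": [5, 4, 4, 4, 3],
}

def summarize(ratings):
    """Per-dimension descriptive statistics for Likert-scale data."""
    return {
        dim: {"mean": round(mean(vals), 2),
              "median": median(vals),
              "sd": round(stdev(vals), 2)}
        for dim, vals in ratings.items()
    }

summary = summarize(peq)
```

Because Likert responses are ordinal, the median is often reported alongside the mean; the mean and standard deviation are included here as the conventional summary for cross-dimension comparison.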
Participants for this phase were recruited from a wide-ranging pool of professionals in game development, design, learning science, and human–computer interaction (HCI). Recruitment strategies prioritized diversity in disciplinary expertise and were carried out via email invitations and professional online communities. All participants enrolled voluntarily and provided informed consent before beginning the study. The evaluation was administered through an online survey platform. Prior to filling out the questionnaire, participants were asked to independently review the EDTF; a brief instructional guide was provided to ensure they could navigate the framework effectively. The survey was then completed in a single session, with no time constraints imposed. The vast majority of responses were submitted within the first week of the dissemination period. All data were collected anonymously and in digital format. This approach supported accessibility across geographic locations and minimized barriers to participation. Upon collection, the data were securely exported for statistical analysis in a controlled research environment in Python 3.11.13.

4.2. Stage 2: Qualitative Evaluation Through In-Depth Interviews

To complement the breadth of the quantitative analysis, Stage 2 of the study adopted a two-part qualitative approach through in-depth, semi-structured interviews and collaborative group discussion. This phase aimed to unpack participants' deeper perceptions of the EDTF as an end-to-end framework.
In the first part of the qualitative assessment, each interview followed a clear protocol based on reflective inquiry [109], in which participants independently explored the EDTF framework and responded to open-ended prompts regarding its clarity, usability, feasibility, usefulness, and ease of use. These constructs were grounded in core principles of usability engineering and design thinking, enabling consistent comparison across participants. This method was chosen to elicit both abstract reflections on the EDTF’s structure and practical insights into its potential use. The interview design encouraged participants to critically assess the framework’s strengths and limitations, revealing barriers and affordances not captured through the quantitative questionnaires.
The interviewees were selected from the original Stage 1 respondent pool based on their domain expertise and willingness to participate further. A total of 18 experts were invited, ensuring a balanced mix of professionals across design, development, and HCI domains. Each participant was contacted individually and scheduled for a one-and-a-half-hour session conducted via Zoom. Prior to the session, participants were informed about the research purpose and data protection measures and gave consent to be recorded. Interviewers facilitated the session by prompting participants to describe their thought processes and design logic as they walked through each phase of the EDTF. Participants were encouraged to reference relevant design contexts or previously used project materials, promoting continuity between the two phases of the study. All sessions were recorded, transcribed, and anonymized. Data analysis was carried out manually in a spreadsheet environment and refined into five core evaluation dimensions: (a) comprehensibility, (b) usefulness, (c) usability, (d) feasibility, and (e) ease of use, across all 10 phases.
To further deepen the evaluation, the qualitative phase included a second part consisting of a collaborative workshop, where participants collectively applied the EDTF to hypothetical scenarios grounded in the experts' previous projects and the challenges they had faced in practice [110]. This two-part structure—individual interviews followed by a group workshop—enabled the triangulation of data, combining personal perspectives with dynamic group interactions. During the workshop, experts reflected on each phase of the EDTF and shared common challenges they had faced when creating serious digital games (SDGs). Using a lightweight discussion format, participants explored how these challenges typically manifest throughout the design and development process and discussed how the EDTF could help teams address them more effectively.
Together, these two parts provided a comprehensive qualitative evaluation of the EDTF, balancing individual expert critique with collaborative reflection. The integration of individual and group-based data enhanced the robustness of the findings and informed a nuanced understanding of how the EDTF can guide educational game design in practice.
Overall, through this mixed-methods approach, combining quantitative and qualitative stages, the study provided both a high-level validation of the EDTF's usability and a grounded understanding of how experts envision applying it in real-world educational game development.

5. Results

This section presents the results of the mixed-methods evaluation, combining quantitative and qualitative data and insights from the EDTF expert validation and evaluation process.
The EDTF has been rigorously evaluated through both qualitative and quantitative methods, including expert validation, usability assessment, and user experience evaluation. The evaluation involved instructional game designers, instructional game developers, and HCI experts reflecting on the framework across its 10 phases. Findings indicate that the EDTF enhances clarity in the design process, improves the emotional resonance and usability of SDGs, and supports meaningful collaboration among stakeholders. Usability scores indicated strong perceived usability, clarity, and satisfaction among the experts. Expert feedback confirmed that the framework fills a critical gap in current practices by bridging the pedagogical with the experiential. This contribution provides actionable guidance for educational practitioners and designers seeking to create high-impact SDGs, while advancing the theoretical discourse on integrating human-centered methodologies into educational technology design. Ultimately, the EDTF fosters more meaningful human–technology interactions, supporting learner motivation, engagement, and knowledge retention.

5.1. Quantitative Evaluation Results

The quantitative evaluation of the EDTF framework involved 60 domain experts with professional backgrounds in instructional game design, development, and human–computer interaction (HCI).
Of the participants, 40% identified as instructional game designers, 44% as instructional game developers, and 16% as HCI experts. Most were early- to mid-career professionals, with 50% having 1–3 years of experience, 28% having 3–5 years, and 22% having more than 5 years in the field. The gender distribution skewed male (68%), with female participants representing 32% of the sample. Geographically, respondents represented a moderately diverse set of regions: 34% from North America, 30% from Europe, 8% from Africa, 6% from Latin America, 4% from the Middle East, and 4% from Asia-Pacific.

5.1.1. UMUX-Lite Results

The UMUX-Lite assessment offers valuable insight into how the 60 experts rated the EDTF in terms of perceived usefulness and ease of use. With a strong composite score of 79.5 out of 100, the framework demonstrates considerable strengths, as per Figure 2.
The usefulness dimension received a particularly high rating of an average of 4.43 out of 5. This suggests that most experts strongly agree that the EDTF would effectively support their needs in digital serious game design. Users found that the framework’s 10 phases and components align well with real-world design challenges, and they recognized genuine value in what the framework would offer for their professional work. This positive score reflects the framework’s success in addressing core functional requirements, providing comprehensive support for various aspects of digital educational game creation.
The ease of use score of an average of 3.93 out of 5 for the EDTF tool reflects a generally positive perception among experts, especially considering the framework’s depth and complexity. This score is a strong indicator that participants find it accessible and navigable, even while engaging with a comprehensive and structured process that spans the entire serious game design and development lifecycle. Given that the EDTF is built around a robust 10-phase methodology, it naturally introduces a level of sophistication that supports professional-grade work. Each phase, ranging from initial needs analysis to post-deployment evaluation, offers a detailed and methodical approach that helps designers and developers stay focused, organized, and aligned with pedagogical and gameplay goals. The richness of this framework contributes to its overall effectiveness, but it also means that experts must engage with a multistep process that requires time, effort, and familiarity.
Overall, the UMUX-Lite score of 79.5 places the EDTF well above the average benchmark of 68, marking it as a strong performer [105]. It is nearing the “excellent” threshold of 80, and it stands competitively alongside other professional tools, solutions, and methodologies used in the game development space. This result reinforces the framework’s value as a robust and effective solution for digital serious game design, particularly in terms of functionality.
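For transparency, the standard UMUX-Lite scoring procedure can be expressed in a few lines of code. The sketch below is a minimal illustration, assuming the original two 7-point items and the common 0–100 rescaling; the regression adjustment onto the SUS scale (coefficients from Lewis, Utesch, and Maher's UMUX-Lite work) is included as an optional variant. The item values in the example are illustrative placeholders, not the study's data.

```python
def umux_lite_raw(peou: int, pu: int) -> float:
    """Rescale the two 7-point UMUX-Lite items (each 1-7) to a 0-100 score."""
    for item in (peou, pu):
        if not 1 <= item <= 7:
            raise ValueError("UMUX-Lite items must be on a 1-7 scale")
    # Subtract the two minimum points, divide by the 12-point range, scale to 100.
    return (peou + pu - 2) / 12 * 100


def umux_lite_sus_adjusted(peou: int, pu: int) -> float:
    """Optional regression adjustment mapping UMUX-Lite onto the SUS scale
    (coefficients reported by Lewis, Utesch & Maher, 2013)."""
    return 0.65 * umux_lite_raw(peou, pu) + 22.9


# Illustrative respondent (placeholder values, not actual study data):
print(round(umux_lite_raw(6, 7), 1))
print(round(umux_lite_sus_adjusted(6, 7), 1))
```

Averaging such per-respondent scores across the sample yields a composite of the kind reported above.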

5.1.2. System Usability Scale (SUS) Results

The usability of the EDTF for digital serious game design and development was evaluated using the System Usability Scale (SUS). The analysis revealed a mean SUS score of 78.50, indicating a high level of perceived usability among participants, as per Figure 2. This score falls within the “Good to Excellent” range according to industry benchmarks and places the framework well above the commonly accepted average SUS benchmark of 68 [99,107].
The score range observed across all respondents was between 70.00 and 87.50, suggesting moderate variability in individual experiences but consistently positive perceptions overall. A score of 78.50 translates to a grade of B+ or even A- on SUS grading scales, which is generally interpreted as users finding the system usable, intuitive, and well-structured. Participants were able to navigate the framework efficiently and comprehend its phases, and they felt it could be applied meaningfully in their design and development processes. Importantly, this score suggests that the EDTF framework supports core usability principles such as learnability, efficiency, and satisfaction.
To contextualize this result, a benchmarking approach was adopted with the primary goal of establishing a baseline threshold for usability—rather than drawing direct comparisons with other frameworks, which often rely on varied and incompatible validation standards. The quantitative evaluation served as a foundational checkpoint: if the framework had failed to meet essential usability and usefulness criteria (e.g., SUS, TAM), it would have raised concerns about its practical viability. In this case, however, the EDTF's SUS score confirms its overall usability. Additionally, the score surpasses the ≥75 benchmark recommended by Nielsen [111] for solutions or frameworks designed for rapid uptake and low-friction integration, particularly in constrained or time-sensitive environments. This is particularly relevant for a framework rooted in empathic and design thinking principles, where intuitiveness and ease of use are essential to support non-technical users and interdisciplinary teams.
Experts likely found it straightforward to understand the framework’s empathic and iterative structure, and they appreciated the guidance it provided in designing meaningful game-based learning experiences. However, the score also leaves room for minor improvements, perhaps in the area of flexibility of use, to push the usability into the “Excellent” bracket (>80.3 SUS score). Overall, the SUS results reinforce the EDTF framework’s viability and acceptance among serious game designers, developers, and HCI experts.
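The SUS scores discussed above follow Brooke's standard scoring procedure: each odd (positively worded) item contributes its rating minus one, each even (negatively worded) item contributes five minus its rating, and the 0–40 total is rescaled to 0–100. A minimal sketch of that computation is shown below; the example responses are illustrative placeholders, not participant data.

```python
def sus_score(responses: list[int]) -> float:
    """Score one SUS questionnaire: 10 items on a 1-5 Likert scale,
    odd items positively worded, even items negatively worded."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = 0
    for i, r in enumerate(responses, start=1):
        if not 1 <= r <= 5:
            raise ValueError("SUS items must be on a 1-5 scale")
        # Odd items contribute (r - 1); even items contribute (5 - r).
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # rescale the 0-40 total to 0-100


# Illustrative respondent (placeholder values, not actual study data):
print(sus_score([5, 2, 4, 1, 5, 2, 4, 2, 5, 1]))
```

The sample mean of such per-respondent scores gives the aggregate reported above.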

5.1.3. PEQ: General Evaluation Results

Experts provided valuable feedback on several key dimensions of the EDTF framework. Overall, the responses were highly positive, with several areas standing out as particular strengths, as per Figure 3.
  • Clarity: Respondents generally found the framework clearly explained and easy to understand (Mean = 4.34, SD = 0.84). Most participants agreed that the structure was logical and well-presented, though a few indicated they could benefit from more detailed guidance.
  • Flexibility: Flexibility emerged as one of the strongest aspects of the framework (Mean = 4.67, SD = 0.70). Participants consistently felt the EDTF could be adapted to a wide variety of learning game types, audiences, and design contexts, highlighting its broad applicability.
  • Comprehensiveness: Most respondents felt the framework effectively covered essential components of serious game development (Mean = 4.16, SD = 1.04). While the overall feedback was positive, some suggested there may be room to expand or clarify certain areas for a more complete process.
  • Innovation: Innovation received a strong positive rating (Mean = 4.36, SD = 0.99), indicating that most experts recognized its value and saw the EDTF as offering fresh, unique, and useful perspectives in digital serious game design and development.
  • Ease of Learning: Participants found the framework easy to learn and grasp (Mean = 4.52, SD = 0.82), especially after reviewing its phases. The overall consensus was strong, although a few users noted they would benefit from additional learning resources.
  • Internal Consistency: Internal consistency was the highest-rated dimension (Mean = 4.76, SD = 0.62). Nearly all participants agreed that the framework’s components worked well together and formed a coherent whole. This was seen as a major strength, reinforcing its structured and integrated nature.
  • Support for Problem-Solving: Feedback on this aspect was more varied (Mean = 3.72, SD = 1.12). While some found the framework helpful for addressing design challenges, others felt it could offer more targeted guidance.
  • Real-World Applicability: Participants largely agreed that the framework could be effectively applied in real-world scenarios (Mean = 4.36, SD = 0.99). While most viewed it as practical, a few expressed the need for more real-life examples or case studies to better understand how it performs outside of theoretical settings.
  • Support for Testing and Iteration: Opinions on testing and iteration were the most divided (Mean = 3.92, SD = 1.23). Some experts found the guidance sufficient, while others felt the framework needed more emphasis on iterative processes.
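The per-dimension figures above are ordinary descriptive statistics (arithmetic mean and sample standard deviation) over 5-point ratings. As a minimal sketch of how such a summary can be produced with Python's standard library, the ratings below are fabricated placeholders, not the study's responses:

```python
from statistics import mean, stdev

# Hypothetical 5-point PEQ ratings for two dimensions (placeholder data):
peq_ratings = {
    "Clarity": [5, 4, 4, 5, 3, 5],
    "Flexibility": [5, 5, 4, 5, 4, 5],
}

# Report mean and sample standard deviation per dimension.
for dimension, scores in peq_ratings.items():
    print(f"{dimension}: Mean = {mean(scores):.2f}, SD = {stdev(scores):.2f}")
```

Note that `stdev` computes the sample (n − 1) standard deviation, the usual choice when survey respondents are treated as a sample from a larger expert population.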

5.1.4. PEQ: Phase-by-Phase Evaluation Results

The EDTF consists of 10 phases, each rated by experts from 1 to 5 based on its importance, as per Figure 4.
Four phases received the highest rating of 5/5:
  • Phase 1: Empathize (1_Emp) was unanimously recognized as the cornerstone of the framework. It anchors the process in real user needs, ensuring alignment across all subsequent phases. While its abstract nature may challenge beginners, its human-centered focus justifies its top rating.
  • Phase 5: Low-Fidelity Prototyping (5_LFP) was applauded for turning ideas into testable forms early in the process, striking a practical balance between creativity and feasibility. Although resource-intensive for smaller teams, its value in supporting rapid, cost-effective iteration made it stand out.
  • Phase 7: High-Fidelity Prototyping (7_HFP) played a pivotal role in validating user experience and learning effectiveness, bridging the gap between design and development. Despite being resource-heavy and technically demanding, it reduces downstream risk.
  • Phase 9: Evaluation and Assessment (9_E&A) addressed a frequent shortcoming in other frameworks by emphasizing the measurement of learning outcomes. Experts appreciated its focus on educational goals and called for better tools to assess cognitive impact.
Five phases consistently scored a 4/5, reflecting strong but slightly constrained value:
  • Phase 2: Co-Design Scenario (2_CDS) helped align stakeholders early through learner journey mapping, though success heavily depends on facilitation skills. Experts recommended the use of templates to support teams with less experience.
  • Phase 3: Co-Design Instructional Content (3_CDIC) ensured synergy between instructional content and gameplay. Its effectiveness, however, relied on strong interdisciplinary collaboration, which can be difficult to sustain in resource-limited settings.
  • Phase 4: UX Evaluation – Storyboard (4_UXE) was appreciated for its ability to visualize mechanics and engage non-technical stakeholders. Practical and collaborative, it was seen more as a transitional tool than a primary driver of innovation.
  • Phase 6: Iterative UX Evaluation (6_IUXE) was considered highly impactful for product refinement but laborious, often lacking clear protocols and requiring substantial user access and time.
  • Phase 10: Deployment and Change Management (10_D&CM) was acknowledged for its importance in long-term adoption and sustainability. Experts noted that it is often underfunded and viewed as a “post-launch” phase rather than an integral part of the process. Calls were made for embedded feedback systems to improve its effectiveness.
Finally, Phase 8: Development (8_Dev) received the lowest rating (3/5), primarily due to execution challenges. While technically indispensable, it was perceived as a bottleneck rather than an opportunity for innovation. Experts cited delays, underfunding, and collaboration breakdowns, often caused by siloed development efforts disconnected from earlier design phases. Better project management and integration were seen as necessary to improve its value and reliability within the process.
Overall, the findings highlight the importance of user-centered thinking, continuous evaluation, and cross-functional collaboration throughout the design and development journey.

5.2. Qualitative Evaluation Results

A total of 18 domain experts, drawn from the initial sample of 60 participants in the quantitative phase, participated in the qualitative evaluation. This subsample mirrored the disciplinary composition of the larger sample, consisting of 44% instructional game developers (eight participants), 33% instructional game designers (six participants), and 22% HCI experts (four participants). The subsample was selected to reflect the same balance of expertise and career-stage distribution as the quantitative participants, ensuring consistency and representativeness between the two evaluation phases. Participants similarly represented early- to mid-career levels, with 50% reporting 1–3 years of experience, 28% reporting 3–5 years, and 22% reporting over 5 years in the field. Gender distribution was 70% male and 30% female, closely aligning with the quantitative sample. Geographical representation followed the same pattern, with the majority from North America and Europe, and additional participants from Africa, Latin America, the Middle East, and Asia-Pacific. Most experts had hands-on experience with widely used game engines such as Unity, Unreal Engine, and GameMaker, grounding their feedback in practical development environments. Detailed participant demographics, including gender, role, years of experience range, and region, are provided in Table 1.
Participants reviewed the EDTF by reflecting on how they would apply it within their own design and development workflows, consistently projecting that its greatest value lies in the early empathic immersion phase. This phase encourages a deeper, emotion-focused understanding of users, helping designers uncover latent needs often overlooked in traditional design methods. This empathic inquiry was perceived as a unique contribution of the EDTF, fostering a critical shift from assumption-driven to empathy-driven design.
In the subsequent ideation and prototyping phases, participants found that the EDTF offers structured guidance to translate empathic insights into concrete design concepts. They anticipated that this clarity would reduce ambiguity and facilitate interdisciplinary collaboration, thereby enhancing overall design efficiency and coherence. While feedback was predominantly positive, participants also candidly acknowledged challenges, particularly balancing empathic depth with practical constraints such as time pressure and cognitive load. This tension reflects the realistic difficulties inherent in integrating empathic design (ED) with design thinking (DT), which the EDTF navigates by combining empathic reflection with iterative prototyping. These reflections highlight opportunities for future enhancements to increase the framework’s flexibility and adaptability.
Importantly, participants identified that the EDTF adds distinct value beyond other design frameworks by emphasizing empathic engagement as a core, actionable process rather than an abstract objective. Unlike frameworks that focus mainly on cognitive or structural design aspects, the EDTF positions itself as a more holistic methodology fostering a mindset shift—from solution-centric problem-solving to empathy-driven co-creation.
The EDTF achieves this by positioning empathic immersion, grounded in ED’s focus on emotional and contextual depth, as the foundational phase, which then flows into DT-inspired iterative cycles of ideation and prototyping. This integration ensures that empathic insights are not merely preliminary observations but are actively translated into design decisions through structured reflection and iteration, helping designers deeply understand users’ emotions and needs before moving into ideation. This emphasis on emotional and experiential understanding sets the EDTF apart from many other serious game design frameworks.
For example, frameworks like MDA [112] focus mainly on game mechanics and player experience without directly addressing user emotions early in the process. The Open Group Architecture Framework (TOGAF) ADM provides a structured methodology for enterprise architecture development, emphasizing systematic planning and stakeholder alignment, but it does not account for the emotional, empathic dimensions of user experience or co-creation [23]. Other frameworks such as the Triadic Framework [24] and the Balanced Design Framework [25] focus on balancing reality, curriculum, and gameplay, but they lack structured support for empathic engagement. Models like the Tandem Transformational Game Design Framework [30] promote collaboration between designers but rely on indirect user input rather than early-stage co-design and empathy. In contrast, the EDTF embeds empathic inquiry as a core, actionable process, guiding designers to engage directly with users’ emotional and contextual experiences. This integration of ED and DT helps shift the design mindset from purely problem-solving to empathy-driven co-creation, which many existing frameworks do not explicitly support.

5.2.1. Core Dimensions Evaluation

The expert review of the Empathic Design Thinking Framework (EDTF) through reflective interviews offered rich insights across five core dimensions: (a) comprehensibility, (b) usefulness, (c) usability, (d) feasibility, and (e) ease of use, spanning all 10 phases.
  • Comprehensibility: Experts found the framework to be highly comprehensible, with 89% of participants feeling the structure was coherent. Its phased layout—from early ideation to final deployment—provided a clear and logical pathway. The progression through phases was especially appreciated for its alignment with typical development workflows. A minority (11%) suggested fine-tuning some of the terminology used in specific phases to further enhance clarity and understanding.
  • Usefulness: In terms of this dimension, 83% of experts recognized its tangible value in guiding the creation of digital educational games. Phases 7 to 9, which emphasize iterative testing and refinement, were particularly praised for promoting high-quality outcomes. Nonetheless, 17% of experts highlighted potential friction between the framework’s iteration cycles and the fast pace of agile development environments, pointing to possible scheduling difficulties under tight deadlines.
  • Usability: For this dimension of the framework, around 78% of participants indicated they could easily apply the EDTF in their workflows. However, some experts (22%) expressed concern that the testing-heavy stages could become resource-intensive, especially for smaller development teams operating with limited time or staff.
  • Feasibility: The majority of experts (80%) believed the EDTF could be implemented successfully with adequate resources. However, participants also pointed out that effective use of the framework may require multidisciplinary capabilities—combining skills from design, pedagogy, technology, and user research—making it more challenging for teams lacking such diversity.
  • Ease of Use: Lastly, ease of use emerged as one of the strongest areas, with a remarkable 94% of experts appreciating the clear, step-by-step format, noting that it made the framework intuitive and accessible. Only a small fraction (6%) mentioned that the final phases, often involving more complex technical development, could be difficult for teams less familiar with advanced tools or processes.
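The percentages in the list above correspond to whole-number proportions of the 18 interviewees. A small helper makes the conversion explicit; the counts used here are illustrative, not the study's actual tallies:

```python
def as_percentage(count: int, total: int = 18) -> int:
    """Express a participant count as a whole-number percentage of the sample."""
    if not 0 <= count <= total:
        raise ValueError("count must lie between 0 and total")
    return round(count / total * 100)


# e.g., 16 of 18 interviewees agreeing corresponds to 89%:
print(as_percentage(16))
```
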

5.2.2. Reflections on EDTF Phases

A more detailed explanation of the framework’s phases reveals how each step strategically builds toward creating effective, learner-centered educational games. This allows teams to ground their design decisions in a deep understanding of user needs, foster interdisciplinary collaboration, and iteratively refine their solutions with practical evaluation. By following this structured approach, teams can better anticipate challenges, align stakeholders, and ensure that both educational goals and user experience are thoughtfully integrated throughout the development process.
Phase 1—Empathize: All experts agreed this opening phase set a crucial human-centered foundation. They consistently emphasized the necessity of understanding users’ needs, motivations, and challenges, positioning this insight as the driving force for the entire design process. While some noted that beginners may struggle with the abstract nature of this step, its importance in shaping user-relevant solutions was unanimously recognized.
Understanding user pain points isn’t just a box to check—it’s the compass for everything that follows.
(P12)
Phase 2—Co-Design Scenario: Experts valued this phase for fostering early interdisciplinary collaboration by mapping the learner’s journey and crafting user personas. It was seen as a strategic step to bring diverse stakeholders into alignment. Several participants recommended adding structured templates to support teams unfamiliar with scenario design, especially to encourage productive dialogue between subject-matter experts and designers.
It forces subject-matter experts and designers to speak the same language early on.
(P9)
Phase 3—Co-Design Instructional Content: This phase was praised for anchoring instructional decisions to learner needs. Experts highlighted its role in preventing misalignment between content and gameplay, an issue they described as a common pitfall. Some noted challenges in maintaining collaboration between educators, learners, and developers, especially in low-resource environments. They advocated for clearer, role-specific guidelines to support interdisciplinary communication.
Without this step, you risk creating a fun game that teaches nothing—or a lesson nobody wants to play.
(P7)
Phase 4—UX Evaluation (Storyboard): The storyboard phase stood out for its practical utility. Experts applauded its clarity and effectiveness in helping teams visualize and align game mechanics with educational goals. Storyboards were particularly appreciated as a low-risk method for gathering early input and as a bridge for involving non-technical stakeholders in feedback.
Catching flaws on paper saves months of coding dead-ends.
(P17)
Phase 5—Low-Fidelity Prototyping: The transition from ideas to tangible outputs through simple prototypes was widely supported. Experts valued this phase for its role in early testing and iteration. However, some raised concerns about the time and resources required to prototype, especially for smaller or less experienced teams. They proposed incorporating easy-to-use prototyping tools to lower the barrier to entry.
Sketching things out helped us catch major misunderstandings before wasting development time.
(P2)
Phase 6—Iterative UX Evaluation: Experts identified this as one of the most impactful phases of the framework. They appreciated the structured opportunity to integrate real user feedback into ongoing design improvements. While the benefits were clear, the phase was also described as intensive—requiring dedicated time, access to users, and solid research skills. Participants requested clearer protocols and checklists to help streamline implementation.
This is where things get real—you stop designing for yourself and start designing for users.
(P10)
Phase 7—High-Fidelity Prototype: Experts noted that this phase marks a critical shift from conceptual design to realistic simulations. High-fidelity prototypes were valued for their ability to provide deep insight into both user experience and learning effectiveness. However, concerns were raised about the technical and resource demands, with smaller teams potentially struggling to produce detailed prototypes without specialized support.
You can’t test real learning without realistic interactions, and this is where that starts.
(P4)
Phase 8—Development: Feedback on this phase was only somewhat positive. Experts acknowledged its significance in integrating design components into a unified product but also recognized it as a stage prone to delays and team breakdowns due to resource limitations or poor planning. Strong project management and clearly defined roles were cited as essential to success.
This is where interdisciplinary teams often fracture without strong project management.
(P13)
Phase 9—Evaluation and Assessment: This phase received high praise for emphasizing the measurement of learning outcomes—an aspect often missing from similar frameworks. Experts appreciated the inclusion of both formative and summative approaches. However, they noted the need for practical tools and appropriate metrics to assess cognitive and behavioral impact, which are often difficult to quantify.
Most frameworks stop at deployment, but measuring actual learning impact is revolutionary.
(P5)
Phase 10—Deployment and Change Management: The final phase was seen as forward-thinking and vital for long-term sustainability. Experts applauded its focus on adoption, ongoing improvement, and scalability. They noted that launching an educational game is only the beginning, and this phase ensures its relevance over time. Recommendations included embedding continuous feedback mechanisms and monitoring systems to support iterative change management post-launch.
Launching is just the beginning—this phase ensures the game evolves with user needs.
(P16)
Overall, the expert feedback across all 10 phases highlights a comprehensive, human-centered design framework that strategically balances empathy, collaboration, and iteration. From grounding design in user understanding to fostering interdisciplinary co-creation, and from early visualization through storyboards to rigorous prototyping and evaluation, each phase was seen as essential in aligning educational effectiveness with user experience. While challenges around resources, collaboration, and implementation were noted, the framework’s emphasis on iterative improvement, outcome assessment, and sustainable deployment was widely praised as both visionary and practical. To further explore the projected practicality of the framework, we deepened the expert walkthrough in a group workshop aimed at identifying potential challenges across its steps and gathering feedback on how these might be addressed in real-world contexts. This exploratory approach offered valuable insights into the usability and adaptability of the framework when applied to complex, time-constrained scenarios.

5.2.3. Expert Walkthrough: Challenges

To understand how the EDTF can support real-world design work, experts reflected on each phase of the framework and shared common challenges they had previously encountered during serious digital game (SDG) development, based on real-life experience. Through a thematic analysis, these challenges were clustered into key themes per phase, highlighting recurring barriers and illustrating how the EDTF can help teams address them effectively. This approach provides a structured evaluation of the framework’s practical value in guiding SDG development and mitigating common design pitfalls.
Table 2 presents a comprehensive overview of the diverse use cases proposed by each expert during the workshop. Each participant contributed a unique application field aligned with their professional background and interests. This variety—from chemistry puzzles and language learning to financial literacy and programming skills—demonstrates the broad relevance and flexibility of the EDTF across educational domains. Moreover, these proposed use cases reflect how experts envision applying the framework to real-world serious game projects, reinforcing the practical value highlighted throughout the thematic walkthrough.
Phase 1: Empathize with End Users
Themes: Assumption Bias, Access Limitations, and Articulation of Learning Needs
Experts frequently noted the risk of assumption bias, where designers rely on stereotypes or personal experience instead of real user data, leading to misaligned solutions. Limited access to school environments and stakeholders was another major barrier, restricting opportunities for authentic user insights. Additionally, the vague articulation of student learning difficulties—often expressed through surface complaints rather than underlying problems—complicates design targeting. The EDTF addresses these issues by embedding structured user research methods such as classroom observations, interviews, and surveys, alongside co-design with teachers to ensure educational accuracy and grounding in classroom realities. For example, P1 highlighted that students’ statements like “I just hate formulas” masked deeper conceptual misunderstandings, which were only uncovered through direct observation during gameplay.
Phase 2: Co-Design Scenario
Themes: Balancing Educational Rigor and Engagement, Stakeholder Conflicts, and Scope Management
A primary challenge identified was the tension between curriculum alignment and engaging gameplay, often leading to stakeholder conflicts—educators prioritize learning accuracy while designers emphasize immersive experiences. This dynamic sometimes results in scope creep or misaligned expectations. The EDTF mitigates these issues through collaborative workshops where educators and designers jointly define core learning goals and acceptable gameplay mechanics, ensuring a learning-first design focus. Early scenario validation during this phase was noted as critical to reducing scope creep. P14 recalled how a “fantasy alchemy” mechanic was removed after educators rejected its scientific inaccuracy, highlighting the framework’s role in aligning stakeholder visions from the outset.
Phase 3: Co-Design Instructional Content
Themes: Content Dryness, Curriculum Misalignment, and Cognitive Overload
Experts reported that educational content often risks being overly dry, misaligned with curricula, or cognitively overwhelming for learners. The EDTF counters these challenges through the involvement of subject-matter experts (SMEs) and by promoting the chunking of material into manageable segments. Moreover, the framework encourages the use of interactive storytelling and simulations rather than passive text-based instruction, enhancing engagement without compromising educational goals. P10 described how, in a math game project, a textual explanation of quadratic equations was ignored by players, necessitating a redesign into a visual puzzle format that improved interaction and learning retention.
Phase 4: UX Evaluation (Storyboard)
Themes: Cognitive Overload, Misaligned Expectations, and Early Validation
Storyboards are a critical evaluation tool but can sometimes misrepresent gameplay dynamics, causing misunderstandings among development teams and leading to costly redesigns. Experts emphasized the value of early UX testing and teacher feedback loops to verify clarity and appropriateness for the target age group before development progresses. P2 shared how an IDE-style menu led to student disengagement, which was remedied after iterative testing and simplification. The EDTF promotes the use of interactive sketches for rapid iteration and early validation to prevent such issues.
Phase 5: Low-Fidelity Prototyping
Themes: Resource Constraints, Learning Outcome Assessment, and Conflicting Feedback
Small teams often face limitations in resources and struggle to evaluate learning outcomes clearly. They also find it challenging to process diverse or conflicting feedback effectively. The EDTF recommends rapid prototyping methods (e.g., paper sketches, no-code tools) and focused testing sessions that isolate key learning objectives, prioritizing feedback aligned with educational goals. This pragmatic approach enables meaningful iteration without overwhelming teams and ensures feedback reflects authentic user engagement.
Phase 6: High-Fidelity Prototyping
Themes: Production Costs, Technological Limitations, and Premature Polishing
High-fidelity prototyping introduces challenges related to high costs and technical constraints, with a common temptation to invest heavily in visuals too early. Experts highlighted the EDTF’s guidelines for modular development and the use of temporary placeholders to maintain agility and prioritize gameplay quality over polish. P8 recounted investing significant resources in detailed art only to find core mechanics unclear, illustrating the need for EDTF’s phased focus.
Phase 7: Iterative UX Evaluation
Themes: Feedback Bias, Limited Tester Access, and Authenticity of Insights
Obtaining authentic and diverse user feedback is difficult; feedback often comes from friends, family, or unrepresentative users, leading to skewed or overly polite responses. The EDTF incorporates techniques to observe in-game behavior and promote testing with diverse learner groups, enhancing iteration speed and insight quality. P4 noted discrepancies between student verbal feedback and actual engagement data, underscoring the importance of behavior tracking and open-ended feedback interpretation.
Phase 8: Development
Themes: Technical Debt, Platform Compatibility, and Scope Management
Development often suffers from technical debt, compatibility issues, and late-stage feature creep. The EDTF advocates for agile sprint planning and strict scope gating, ensuring new features are added only after testing and approval. Experts praised the framework’s emphasis on early technical feasibility testing (“tech spikes”) to identify constraints and avoid costly rework. P1 shared challenges with real-time molecular simulations causing lag, resolved by simplifying models, illustrating the value of early technical assessments.
Phase 9: Evaluation and Assessment
Themes: Measuring Learning Outcomes and Aligning Engagement and Instructional Quality
Experts agreed that evaluating true learning outcomes remains challenging, as high engagement does not always translate to knowledge gains. The EDTF integrates pre- and post-assessments, lecturer reports, and usability testing to provide a robust evidence base. P13 shared how students memorized shortcuts in a game without comprehension, revealed only through post-game quizzes. The framework supports embedded assessments and A/B testing to verify instructional effectiveness before deployment.
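Pre- and post-assessment comparisons of this kind are often summarized as a normalized learning gain, i.e., the fraction of the possible improvement a learner actually achieved. The sketch below is illustrative only and not part of the EDTF specification; the function name and the 0–100 scoring scale are assumptions.

```python
def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Normalized learning gain: the share of the achievable improvement
    realized between pre- and post-assessment (Hake-style gain).

    Returns 0.0 when there is no room for improvement."""
    if pre >= max_score:
        return 0.0
    return (post - pre) / (max_score - pre)

# A learner moving from 40 to 70 points realizes half the possible gain:
# normalized_gain(40, 70) -> 0.5
```

Averaging this gain separately for the variants of an A/B test, as suggested above, helps distinguish genuine learning effects from mere engagement.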
Phase 10: Deployment and Change Management
Themes: Adoption Resistance, Maintenance, and Scalability
Finally, adoption hurdles, ongoing maintenance, and scaling demands pose challenges post-launch. The EDTF recommends targeted lecturer training, cloud-based infrastructure, and roadmap planning informed by user feedback to sustain products and build institutional support. Experts highlighted these as essential for long-term success in educational settings.
Across all phases, experts consistently emphasized the EDTF’s value as a structured, empathic guide that prevents common pitfalls in educational game design. They also noted that integration of AI and LLMs can accelerate many steps—such as content generation, evaluation, and sentiment analysis—without compromising quality. Most importantly, co-design with educators was seen not as optional, but as central to ensuring meaningful learning outcomes. The walkthrough session ultimately affirmed that while full empirical validation remains future work, the EDTF demonstrates clear projected practicality and alignment with real-world constraints in educational settings.

5.2.4. Deep Dive into Phase 1: Empathize with End Users

During this part of the evaluation, special emphasis was given to Phase 1—Empathize with End Users, because it lays the human-centered foundation for the Empathic Design Thinking Framework (EDTF). While all phases are important, Phase 1 distinguishes itself by grounding the process in direct, inclusive involvement and engagement with learners, educators, and users with diverse needs. Unlike conventional approaches that begin with pre-defined problems or assumptions, Phase 1 prioritizes the first-hand exploration and discovery of user realities, often revealing unarticulated barriers and unmet needs. These early empathic insights anchor subsequent design choices, such as real needs in both the educational and gamified aspects, emotional tone, interaction fidelity, and accessibility features, making it essential to provide in-depth guidance on its execution. The following subsection offers a structured breakdown of proposed methods, thresholds, and recruitment strategies used to ensure this phase is both inclusive and analytically robust.
Experts were asked to reflect on how they currently identify user needs in their own game design processes and to critically assess the EDTF strategies proposed in Phase 1. They widely acknowledged a systemic flaw in relying too heavily on educators to represent student needs. While teachers are curriculum experts, they are not the primary users of educational games—students are. Decisions based on teacher assumptions risk missing students’ real challenges, emotional responses, and learning strategies. As the experts put it, empathy is not just about listening; it is about listening to the right voices early and deeply. Experts endorsed triangulating perspectives (learners, educators, designers, UX researchers, etc.) to uncover friction points from different angles. This requires methodological diversity, not just a single “user interview”; different formats elicit different truths. Experts emphasized designing for “non-gamers” and underrepresented users from the start—those who are often excluded or brought in late. They confirmed the importance of accessibility and inclusive persona creation as part of early-phase discovery. Inclusion is not reactive; it must be embedded into empathy activities and reflected in the design DNA from Phase 1 onward. Further, experts pointed out that Phase 1 insights must not just sit in research reports—they should shape feature development, emotional pacing, and UI design.
Phase 1 of the Empathic Design Thinking Framework (EDTF) focuses on building a rich, inclusive understanding of user needs through a structured, practical approach. This phase begins with defining participant criteria and employing sampling strategies to ensure diverse and representative perspectives, including those of underserved or differently abled learners. Data collection involves semi-structured interviews, observations, and scenario-based prompts designed to capture authentic user experiences and emotions. To determine when sufficient data have been gathered, a saturation threshold is applied, concluding collection once fewer than 5% of new insights emerge across three consecutive sessions. Throughout this process, accessibility considerations are embedded to inform inclusive design decisions. By grounding the design process in genuine empathic engagement rather than assumptions, Phase 1 anchors the EDTF by immersing designers in the real-world contexts and challenges of end users, providing a solid foundation for all subsequent phases.
  • Detailed Flow
1. 
Participant Recruitment and Diversity Strategy
In educational design and game development projects, designers often work mainly with educators, who are experts in instructional alignment and curriculum integration [45]. While educators provide valuable input, they are not the ones using the tools—students are. This means important decisions are often made based on what educators think students need, not on what students actually experience. Students are usually included only at the end, during testing, when it is too late to make major changes. But students use these tools every day, and their real challenges, learning styles, and emotions can be very different from what educators expect. Relying only on formal teaching formats can make learning slower, less engaging, and more frustrating [2,113].
To capture a broad and diverse range of user perspectives, a structured sampling plan is implemented:
  • Targeted segments: a total of 8–12 participants per user group (e.g., high-school students, college instructors), covering both educators and learners. This range is chosen to balance diversity of perspectives with practical constraints, and it aligns with established qualitative research guidelines, which suggest that 8–12 participants are typically sufficient to capture the breadth of user experiences and reach thematic saturation in heterogeneous groups [114].
  • Inclusion goals: at least two participants with cognitive or sensory disabilities and at least three from underrepresented backgrounds, following the example given in Table 3 (Participant Matrix) [115].
  • Incentives: provided via partnerships with NGOs, schools, or universities (e.g., course credits or gift cards).
2. 
Techniques for Empathizing
  • Contextual Interviews [77]: Participants share recent learning challenges (e.g., “Tell me about the last time you struggled to understand a concept”). Interviews are adapted with real-time captioning and sign-language interpreters as needed.
  • Empathy Mapping [76]: Collected insights are segmented using a template to identify patterns across user groups. For example, “Color-blind users rely on texture and shape” informs UI design in later phases by avoiding color-only cues.
  • Emotional Mapping [14]: Emotional “heatmaps” highlight moments of frustration (e.g., dense instructions or unclear rewards), guiding emotional pacing in subsequent phases.
3. 
Clarifying Data Saturation and Sufficiency
To ensure insight diversity and representativeness, the EDTF uses a saturation threshold: Data collection continues until fewer than 5% of new insights emerge across three consecutive interviews [116]. This approach determines when empathic data gathering can conclude without missing important perspectives.
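Operationally, this rule can be checked by coding each session’s insights and tracking how many are new relative to the cumulative codebook. The sketch below is one possible reading of the threshold, assuming “new insights” are measured as a fraction of the codes seen so far; the function and data shapes are illustrative, not prescribed by the EDTF.

```python
def reached_saturation(session_codes, threshold=0.05, window=3):
    """session_codes: one set of insight codes per interview/session, in order.
    Saturation holds when, in each of the last `window` sessions, the share of
    previously unseen codes stays below `threshold` of the cumulative codebook."""
    if len(session_codes) < window:
        return False
    seen, new_fracs = set(), []
    for codes in session_codes:
        new = codes - seen          # codes never observed in earlier sessions
        seen |= codes
        new_fracs.append(len(new) / len(seen) if seen else 0.0)
    return all(frac < threshold for frac in new_fracs[-window:])
```

With this reading, data collection stops once three consecutive sessions each contribute under 5% new codes to the cumulative set.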
4. 
Interpreting Diverging Preferences in Outcome Analysis
As part of the data collection and interpretation in Phase 1 of the EDTF, diverging learner preferences can appear and should be seen not as design problems but as valuable insights. For example, if some students prefer a space theme while others prefer a fantasy setting, these differences help to uncover emotional and cognitive drivers, rather than simply pointing to the most popular theme.
These variations are expected in empathic research and reflect diverse learning styles and imaginative triggers. Rather than viewing them as fragmented, the EDTF looks for patterns, like a shared desire for adventure, familiarity, or real-world connection. This informs flexible design strategies such as customizable themes or blended narrative options [68].
Conflicting preferences are interpreted alongside other data (like emotional maps with pain points) to prioritize what learners need over what they say they want. For example, while students may ask for commercial-quality graphics, deeper analysis may reveal that they really want engaging feedback, a sense of progress, or autonomy.
5. 
Designing for Inclusion from the Start
Inclusivity is embedded early through actionable design elements such as the following:
  • High-contrast visual preferences and text-to-speech support.
  • Personas that reflect sensory and cognitive diversity.
  • Adaptive mechanisms that evolve based on early user feedback.
6. 
Inclusive Design Strategies from Empathic Insights
  • “Select Your World” Preferences [117]: Interface options like high-contrast mode, text-to-speech toggles, and simplified inputs are gathered during onboarding and inform ongoing design iterations.
  • Accessibility-First Personas [118]: Personas include accessibility needs, technology literacy, and preferred learning styles (visual, auditory, or tactile). These personas remain living documents, continuously updated and used throughout co-design activities in all phases.
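To keep such personas “living documents”, teams can maintain them as lightweight structured records amended after each co-design session. The sketch below is purely illustrative; every field name is a hypothetical choice, not part of the EDTF.

```python
from dataclasses import dataclass, field

@dataclass
class AccessibilityPersona:
    """Hypothetical accessibility-first persona record (illustrative fields)."""
    name: str
    learning_style: str                              # "visual", "auditory", or "tactile"
    accessibility_needs: list = field(default_factory=list)  # e.g., ["high-contrast UI"]
    tech_literacy: str = "medium"                    # "low" | "medium" | "high"
    history: list = field(default_factory=list)      # insights logged per phase

    def log_insight(self, phase: int, insight: str) -> None:
        """Append a phase-tagged note so the persona evolves across phases."""
        self.history.append(f"Phase {phase}: {insight}")
```

A record like this can be updated in Phase 1 (`log_insight(1, ...)`) and re-read during the co-design activities of later phases, keeping early empathic findings visible to the whole team.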
7. 
Downstream Influence on Later Phases
The empathic insights and data foundation from Phase 1 directly influence decisions in future steps, such as the following:
  • Accurate user personas and learner journeys that guide Phase 2 co-design scenarios.
  • Anchoring instructional content in Phase 3 to real learner challenges and preventing misalignment between gameplay and education.
  • Enhancing Phase 4 storyboards with emotional and cognitive maps to visualize potential friction points early.
  • Informing Phase 5 low-fidelity prototypes to reflect diverse user needs and support early design decisions.
This ensures the entire process remains responsive to real user constraints and needs.

6. Discussion

The evaluation of the proposed framework provides strong initial evidence of its value as a comprehensive and structured methodology for designing digital serious games. By integrating empathic design (ED), instructional theory, design thinking (DT), and human–computer interaction (HCI) principles, the Empathic Design Thinking Framework (EDTF) demonstrates how educational goals can be meaningfully aligned with engaging gameplay through a user-centered and iterative development process. This discussion reflects on the framework’s contribution across three key dimensions, validation, theoretical advancement, and practical relevance, while directly addressing the research questions (RQs) of this study.
Regarding RQ1 (“How robust, usable, and applicable is the EDTF framework?”), the results of the expert evaluations and validation strongly support the usability and practicality of the EDTF. The framework achieved high usability scores, with UMUX-Lite (79.5) and SUS (78.5) ratings that place the EDTF well above industry benchmarks. These scores confirm that the EDTF is not only easy to use but also effective in guiding designers and developers through the complex task of serious game creation. Further, PEQ ratings showed that experts found the framework adaptable (4.67/5) and coherent (4.76/5), which is especially important given the wide variety of learning contexts and game genres. With experts recognizing the EDTF’s perceived usefulness, these insights can inform its future use and refinement. The findings highlight the framework’s potential to offer structure and to promote clarity and alignment within inherently multidisciplinary design teams.
With regard to RQ2 (“What is the perceived value attributed to each of the EDTF phases?”), expert feedback emphasized how each phase plays an important role in supporting more thoughtful, learner-centered development. Importantly, the phases work together to bridge long-standing gaps in the literature, namely, the lack of structured, validated, and iterative design frameworks that fully integrate HCI and educational goals in serious game development. Traditional approaches tend to rely on implicit expertise, offer limited user involvement, or treat pedagogical alignment as secondary to game mechanics. The EDTF explicitly counters these limitations by embedding an empathic approach to users’ needs and motivations, continuous iteration, and co-design across all stages.
In the first phase, Empathize, developers and designers begin by understanding the real needs, motivations, and struggles of educators and learners. Experts agreed that this empathic foundation is essential and often missing in other models and frameworks. It ensures that design is grounded in users’ lived experiences rather than assumptions, aligning directly with both HCI’s emphasis on user-centeredness and DT’s core principle of empathy.
The second phase, Co-Design Scenario, involves collaboratively mapping learners’ journeys and building user personas. This phase supports shared understanding across disciplines, helping teams visualize the learning context early on. Experts appreciated how this fosters intentional collaboration between game designers, educators, and learners, an alignment that bridges gaps between educational theory and practice, and reflects DT’s focus on inclusive ideation.
In Co-Design Instructional Content, the third phase, the team collaboratively defines what learners should learn and how. Experts emphasized that this phase anchors content decisions in real-world learning needs rather than retrofitting education into game mechanics. This step addresses a major critique in the field: that serious games often lack pedagogical coherence. By deliberately integrating learning goals early, the EDTF ensures that instructional design is not an afterthought, but a co-driver of the game’s structure and engagement model.
The fourth phase, UX Evaluation with Storyboards, helps teams quickly visualize and evaluate how a user would interact with the game before investing in development. Experts found this method especially useful for testing early ideas and collecting feedback from stakeholders, involving those with less technical background. It promotes iteration at low cost and risk, aligning with HCI’s emphasis on early usability testing and DT’s prototyping mindset.
In the Low-Fidelity Prototyping phase, teams translate ideas into tangible prototypes using simple tools. Experts valued this phase for enabling early testing and quick adjustments. It makes abstract concepts real, allowing for rapid iteration. However, concerns were raised about the skills and resources required, especially for small teams. Even so, this phase embodies the HCI principle of iterative design and DT’s build-to-think approach.
The sixth phase, High-Fidelity Prototyping, advances the project into more polished versions that simulate the final experience. Experts appreciated this stage for enabling deeper evaluations of user experience and educational impact, though they acknowledged the increased technical demands. This phase serves as a bridge to full development, and its timing ensures only well-tested ideas are carried forward, reflecting DT’s principle of iterative refinement and HCI’s concern for experiential quality.
The seventh phase, Iterative UX Evaluation, was seen as one of the most impactful. It offers structured opportunities to test how users interact with the game and to refine the design accordingly. Experts stressed that regular testing with real users improves the game’s clarity, engagement, and educational effectiveness. This phase brings all three traditions, HCI, ED, and DT, into alignment by creating a feedback loop that continuously improves both usability and learning outcomes.
The eighth phase, Development, marks the formal build of the game. Experts agreed that this step often encounters delays and misalignments if not grounded in prior collaboration and testing. The EDTF’s structured lead-up ensures that development is built on well-defined, user-tested foundations, reducing the risk of rework and ensuring alignment with both user needs and educational goals.
In Evaluation and Assessment, the ninth phase, the framework emphasizes measuring learning outcomes, not just usability or engagement. Experts praised this focus, which fills a major gap in many existing models. By encouraging both formative and summative evaluations, the EDTF ensures that the game achieves its intended educational goals. This phase reflects ED’s emphasis on measurable learning and HCI’s tradition of evidence-based evaluation.
Finally, Deployment and Change Management, the 10th phase, supports the long-term success of the game by promoting sustainable adoption and continuous improvement. Experts highlighted the value of this forward-looking phase, especially in educational settings where change is slow. Including mechanisms for feedback and iteration even after deployment ensures that the game evolves with users’ needs—a core value of DT and HCI alike.
While experts highlighted significant and recurrent challenges across all SDG development phases, such as assumption bias, stakeholder misalignment, cognitive overload, and premature polishing, they emphasized the EDTF’s value in mitigating these pitfalls through its structured, participatory methods. In particular, phases like “Empathize with End Users” and “Co-Design Scenario” were praised for shifting the design paradigm from one driven by assumptions, or by designers’ interpretations of what users might need, to one that is evidence-based, fostering early involvement and alignment between educators, learners, designers, and developers, thereby ensuring inclusive representation from the outset.
Despite some differences in expert perspectives, such as varying tolerance for resource constraints or disagreement over the timing of UX validation, most acknowledged that the EDTF’s phased, empathic approach provides a pragmatic roadmap to manage trade-offs between educational depth and gameplay engagement. For instance, while some experts struggled with vague learning needs or dry content in past projects, they viewed the EDTF’s use of co-design workshops, educator and learner observations, and emotional heatmapping as transformative tools to surface latent user needs and adapt design fidelity across iterations.
Moreover, from the experts’ perspective, Phase 1—Empathize with End Users, is viewed as the essential foundation for successful educational game design. They emphasize the importance of engaging directly with a diverse range of learners, including those with cognitive differences, to capture authentic needs and experiences that often differ from educator and designer assumptions. Experts highlight that relying solely on educators risks overlooking critical end-user challenges. By employing varied methods such as contextual interviews and empathy mapping, this phase uncovers nuanced insights that inform accurate personas and learner journeys. Experts see Phase 1 as a pillar for fostering interdisciplinary alignment early on and ensuring that subsequent phases, from Co-Design to Development and Evaluation, remain grounded in real user constraints and motivations, ultimately driving a truly user-centered design process.
The structure of the EDTF ensures that empathic insights, co-design practices, and iterative validation are embedded throughout. It directly addresses existing challenges by unifying educational design and user experience into a coherent, actionable process. The EDTF synthesizes pedagogy and technology into an iterative framework that centers educators and learners, fosters collaboration, and builds for impact.
Overall, this study validates the EDTF as a robust, user-centered approach that successfully integrates principles from ED, HCI, and DT. Expert validation affirms its value in aligning learning objectives with game design and underscores its potential for guiding the creation of serious games that are both usable and educationally effective. Future work should explore how this framework supports long-term learning outcomes and how it can be further adapted for different teams, contexts, and tools to promote broader adoption in the field.

7. Conclusions

This study positions the Empathic Design Thinking Framework (EDTF) as a thoroughly validated and user-centered methodology for the design of digital serious games. Grounded in empathic, iterative, and multidisciplinary practices, the EDTF offers a comprehensive approach that helps design teams navigate the complexity of creating educational games that are not only pedagogically sound but also engaging and aligned with real learner needs.
The EDTF’s main contribution lies in its synthesis of insights from instructional design, game development, and human–computer interaction (HCI), resulting in a flexible yet structured 10-phase model. It promotes an empathic foundation (Phase 1), followed by phases dedicated to co-creating game mechanics (Phase 2), defining learning outcomes (Phase 3), and embedding ongoing user feedback (Phases 4–9). This integrated perspective helps bridge persistent gaps in the field where existing frameworks tend to focus narrowly on either pedagogical structure (e.g., LM-GM, Learning Mechanics–Game Mechanics model [28]) or technical implementation (e.g., DODDEL, Digital Object Design and Development for Engaged Learning [119]).
In doing so, the EDTF responds to long-standing critiques within the literature, such as those articulated by Wouters et al. [120], by prioritizing authentic user involvement and empirical iteration throughout the design process. This highlights the potential of combining user-centered design principles with serious game development in a cohesive, evidence-informed manner.
Further work is warranted to explore the EDTF’s application beyond controlled settings. In particular, deploying the framework in real-world, longitudinal educational contexts, across diverse learner populations, institutions, and learning goals, could provide deeper insights into its scalability, adaptability, and long-term impact. Moreover, the development of lightweight, modular adaptations and practical toolkits could support wider adoption, especially among resource-constrained teams.

Author Contributions

The authors, R.I.M. and J.A.-M., contributed equally to the conceptualization, methodology, validation, formal analysis with visualizations, writing—original draft preparation, and writing—review and editing. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data is contained within the article.

Acknowledgments

During the preparation of this manuscript/study, the author(s) used Python 3.11.13, Miro 3.26.17, and Figma 125.4.9 to create the visual graphics. The authors have reviewed and edited the output and take full responsibility for the content of this publication.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AI	Artificial Intelligence
DSG	Digital Serious Games
DSGDFW	Digital Serious Game Design Framework
DT	Design Thinking
ED	Empathic Design
EDTF	Empathic Design Thinking Framework
HCI	Human–Computer Interaction
PEQ	Perception Evaluation Questionnaire
SUS	System Usability Scale
VR	Virtual Reality
UI	User Interface
UX	User Experience
UMUX-Lite	Usability Metric for User Experience—Lite Version

Appendix A

This appendix presents the evaluation instruments and the specific items used for the quantitative assessment of the Empathic Design Thinking Framework (EDTF). The following standardized tools were adapted to evaluate perceived usefulness, usability, and framework perception.

Appendix A.1. Adapted UMUX-Lite

  • Perceived Usefulness: I believe the Empathic Design Thinking Framework (EDTF) would support my needs when designing serious games.
  • Perceived Ease of Use: I think the EDTF would be easy to use in a real-world game design context.
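As context for the two items above, the sketch below shows how UMUX-Lite ratings are conventionally converted to a 0–100 score. The original instrument uses a 7-point agreement scale; the `points` parameter is an assumption added here so that a 5-point administration could also be scored, since the article does not state the scale length used for this adaptation.

```python
def umux_lite_score(usefulness: int, ease_of_use: int, points: int = 7) -> float:
    """Convert the two UMUX-Lite item ratings to a 0-100 score.

    Each item contributes (rating - 1); the sum is scaled against the
    maximum achievable total for the chosen scale length.
    """
    for item in (usefulness, ease_of_use):
        if not 1 <= item <= points:
            raise ValueError(f"rating {item} outside 1..{points}")
    max_raw = 2 * (points - 1)  # highest achievable summed (rating - 1) total
    return 100 * ((usefulness - 1) + (ease_of_use - 1)) / max_raw

print(umux_lite_score(7, 7))  # 100.0: both items at the top of a 7-point scale
print(umux_lite_score(4, 4))  # 50.0: both items at the midpoint
```

This is an illustrative scoring sketch, not code from the study.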

Appendix A.2. System Usability Scale (SUS)—Perceived Usability

Participants responded to the following SUS items on a 5-point Likert scale:
  • I think I would like to use the EDTF regularly in designing learning games.
  • I found the EDTF unnecessarily complex. (reverse scored)
  • I thought the EDTF was easy to understand.
  • I think I would need support from someone experienced to use the EDTF. (reverse scored)
  • I found the different components of the EDTF well integrated.
  • I thought there was too much inconsistency in the EDTF. (reverse scored)
  • I imagine most people would learn to use the EDTF quickly.
  • I found the EDTF very cumbersome to navigate. (reverse scored)
  • I feel confident that I could explain how the EDTF works to someone else.
  • I would need to learn a lot before I could effectively apply the EDTF. (reverse scored)
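The ten items above follow the standard SUS convention: odd-numbered items are positively worded and even-numbered items are reverse scored. A minimal Python sketch of the conventional score conversion (illustrative, not taken from the article's analysis code):

```python
def sus_score(responses: list[int]) -> float:
    """Compute the 0-100 SUS score from ten 1-5 Likert responses.

    Odd-numbered items (positively worded) contribute (r - 1);
    even-numbered items (reverse scored) contribute (5 - r).
    The summed contributions (0-40) are scaled by 2.5.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        if not 1 <= r <= 5:
            raise ValueError(f"response {r} outside 1..5")
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# All positive items at 5 and all reverse-scored items at 1 yield a perfect score.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```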

Appendix A.3. Perception Evaluation Questionnaire (PEQ)

This instrument was used for respondents who explored and understood the EDTF framework but had not yet applied it in practice. Responses were recorded on a 5-point Likert scale (1 = Strongly Disagree to 5 = Strongly Agree).
  • Clarity: The structure and terminology of the EDTF are clearly explained and easy to understand.
  • Flexibility: The EDTF appears adaptable to different types of learning games, audiences, and design contexts.
  • Comprehensiveness: The EDTF covers all the essential aspects needed to design and develop a serious game.
  • Innovation: The EDTF introduces new and innovative ideas to game design and development, with continuous focus on user experiences.
  • Ease of Learning: It was easy for me to learn and make sense of how the EDTF works after reviewing its components.
  • Internal Consistency: The components of the EDTF are logically connected and reinforce each other in a coherent way.
  • Support for Problem-Solving: The framework provides helpful guidance for addressing common design challenges in serious game development.
  • Real-World Applicability: I believe the EDTF could be effectively applied in real-world game design projects, not just theoretical ones.
  • Support for Testing and Iteration: The EDTF provides clear support for testing and iterating design ideas throughout all stages of game development, not just at the end.
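Because the PEQ yields one 5-point rating per dimension per respondent, a typical aggregation (as reflected in the per-dimension results reported in the article) is a mean score per dimension plus an overall mean. The sketch below illustrates that aggregation; the `peq_responses` values are hypothetical example data, not the study's results.

```python
from statistics import mean

# Hypothetical 5-point Likert responses keyed by PEQ dimension;
# each list holds one rating per respondent.
peq_responses = {
    "Clarity": [5, 4, 5, 4],
    "Flexibility": [4, 4, 3, 5],
    "Comprehensiveness": [4, 5, 4, 4],
}

# Mean rating per dimension, rounded for reporting.
dimension_means = {dim: round(mean(r), 2) for dim, r in peq_responses.items()}
# Grand mean across every individual rating.
overall = round(mean(v for r in peq_responses.values() for v in r), 2)
print(dimension_means, overall)
```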

Figure 1. EDTF 10-phase flow diagram.
Figure 2. UMUX and SUS benchmarking comparison score on EDTF.
Figure 3. EDTF perceived value with PEQ dimensions.
Figure 4. Perceived value of the 10 EDTF phases on a 1–5 scale.
Table 1. Participant demographics and expertise.
ID | Gender | Role | Experience (Years) | Region | Primary Expertise/Tools
P1 | Male | Instr. Game Developer | 1–3 | North America | Game Dev: Unity, Unreal Engine
P2 | Male | Instr. Game Designer | 3–5 | Europe | Game Design: Unity, GameMaker
P3 | Female | Instr. Game Developer | >5 | North America | Game Dev: Unreal Engine
P4 | Male | Instr. Game Designer | 1–3 | Europe | Game Design: Unity
P5 | Male | Instr. Game Developer | 1–3 | Asia-Pacific | Game Dev: Unity
P6 | Female | HCI Expert | 3–5 | North America | HCI: Various UX Tools
P7 | Male | Instr. Game Developer | 3–5 | Latin America | Game Dev: Unity, Unreal Engine
P8 | Male | Instr. Game Designer | 1–3 | Europe | Game Design: GameMaker
P9 | Male | HCI Expert | >5 | North America | HCI: UX Research Tools
P10 | Male | Instr. Game Developer | 1–3 | Africa | Game Dev: Unity
P11 | Female | Instr. Game Designer | 1–3 | Middle East | Game Design: Unreal Engine
P12 | Male | Instr. Game Developer | 3–5 | Europe | Game Dev: Unity, Unreal Engine
P13 | Female | HCI Expert | 1–3 | Asia-Pacific | HCI: UX Research Tools
P14 | Male | Instr. Game Designer | >5 | North America | Game Design: Unity
P15 | Male | Instr. Game Developer | 1–3 | Europe | Game Dev: GameMaker
P16 | Male | Instr. Game Designer | 3–5 | North America | Game Design: Unity, Unreal Engine
P17 | Male | HCI Expert | >5 | Europe | HCI: Various UX Tools
P18 | Female | Instr. Game Developer | 1–3 | North America | Game Dev: Unity
Table 2. Overview of use cases shared by experts.
# | Use Case Title | Brief Description
1 | ChemSolve: Mystery Lab | Puzzle game to mix chemical substances and solve real-world problems.
2 | Climate Crisis Lab | Strategy game for managing resources and policy decisions to combat climate change.
3 | Anti-Fraud Challenge | Learn to identify online scams using AI-driven feedback in realistic scenarios.
4 | Language Learning World | Mini-games for reading, writing, listening, with teacher-student tracking.
5 | Alphabet Explorer | Learn the alphabet through engaging visuals and interactions.
6 | Preschool Fun | Games for young children to learn numbers, shapes, and colors.
7 | University Prep Quiz | Game to test knowledge for university admission tests.
8 | ChessF | Chess-based learning for strategy, logic, and pattern recognition.
9 | Fruit or Vegetable? | Game for ages 4–6 to distinguish between fruits and vegetables.
10 | Math Quest: Kingdom of Numbers | Fantasy math puzzle game that incorporates adaptive difficulty.
11 | CyberShield: Threat Hunter | Simulation game for responding to cybersecurity threats.
12 | Brain Boost: Lecture Dash | Timed challenge turning lectures into memory-based mini-quests.
13 | CodeMaster | Teaches programming through exercises with scalable complexity.
14 | STEM Adventure | Problem-solving game about math, physics, and coding.
15 | Call Center Simulation | Trains staff in software use and company procedures.
16 | Alphabet Memory Game | A simple memory matching game to learn the alphabet.
17 | Python for Beginners | Engaging game to teach basic Python programming.
18 | Financial Literacy Hero | Navigate budgeting, saving, and investing through life scenarios.
Table 3. Example of participant matrix for inclusive recruitment.
User Type | Recruitment Source | Target | Inclusion Needs
High-school learners | Local schools | 6 | 2 cognitive disabilities
College instructors | University partners | 4 | 2 sensory disabilities
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
