3.1. Research Design
This study adopts a design-based research methodology, systematically guided by the ADDIE instructional design model, to develop and evaluate a gamified intergenerational mentorship learning application, “Digital Bridge”, with the aim of enhancing digital literacy among older adults. Rather than treating the application design itself as the primary research objective, this study focuses on evaluating the effectiveness of a collaborative gamified mentorship mechanism in a real-life learning context, grounded in both cognitive and user-experience theories.
Originally developed in the field of instructional design, the ADDIE model provides a structured process for creating effective, learner-centered educational programs. The model consists of five stages: Analysis, Design, Development, Implementation, and Evaluation, as illustrated in Figure 1. It is widely recognized as an effective methodological framework for developing training programs, ensuring scientific rigor, systematic organization, and practical applicability in instructional content [14].
3.1.1. Phase One: Analysis
During the analysis phase, this study combined questionnaire surveys and focus group discussions to comprehensively understand older adults’ digital skills status, needs, barriers, and preferences, providing data support and practical guidance for the design of the “Digital Reverse Mentorship” gamified application.
First, to ensure the scientific validity and content accuracy of the questionnaire, the research team consulted eight experts: three gerontology professors, two design professors, two elderly university instructors, and one community center staff member. Based on their suggestions, a preference and needs assessment questionnaire was developed, covering six sections: basic information, current digital skills, digital skill needs, learning barriers, willingness to engage in gamified learning, and open-ended suggestions.
Data collection was conducted in Taiyuan, Shanxi Province, China, via the “Wenjuanxing” online survey platform from 15 November to 5 December 2024, with 305 older adults (aged 60 and above) participating. All respondents provided written informed consent. To enhance comprehension, researchers explained each question during the survey to ensure that older participants could respond accurately.
To further explore user needs and ensure a multidimensional research approach, a focus group discussion was organized at a community center in Taiyuan, Shanxi Province. The discussion involved eight participants from diverse backgrounds, including older adults (2), family members (2), community center staff (1), elderly university instructors (1), experienced aging-friendly designers (1), and gerontology experts specializing in the digital divide (1). The discussion focused on the following key topics:
① How can gamification help older adults overcome learning barriers?
② How can game-based learning content and interaction be optimized to reduce learning anxiety, boost confidence, and promote intergenerational interaction?
③ What is the specific role and impact of younger individuals in assisting older adults with digital skill learning?
④ What is the current state of older adults’ digital literacy, and what are the strengths and limitations of existing “digital reverse mentorship” models?
⑤ What is the potential of cognitive games in improving older adults’ digital literacy and learning motivation?
The discussion followed an open-ended format, encouraging participants to share real experiences and insights. Sufficient background information was provided to facilitate meaningful dialogue. By incorporating perspectives from different stakeholders, the study refined the design direction of the gamified intergenerational digital mentorship application “Digital Bridge”, ensuring that it meets the practical needs of older users while also achieving broader social and educational objectives, such as enhancing intergenerational interaction and improving the effectiveness and sustainability of digital skills learning.
3.1.2. Phases Two and Three: Design and Development
Based on the findings from the analysis phase, including user needs exploration and focus group interviews, and guided by Cognitive Aging Theory and Aging-Friendly Experience Design, this study aimed to develop a “Digital Reverse Mentorship” application, “Digital Bridge”, that balances practicality and engagement. The core design philosophy of this application is to accommodate older users’ cognitive, emotional, and social needs by using gamification strategies to lower learning barriers, stimulate intrinsic motivation, and promote active intergenerational interaction. To achieve these goals, the application underwent comprehensive optimization in terms of workflow and structural design, UI (user interface) enhancements, interaction modes, and feedback mechanisms, creating a holistic digital literacy learning platform that combines education, entertainment, and social engagement to help older adults integrate into the digital world.
This study introduced “Intergenerational Digital Reverse Mentorship” as the core concept of the application, emphasizing the use of digital technology to bridge the intergenerational digital divide by enabling younger users to pass on digital skills to older adults while fostering mutual learning and interaction. Based on this concept, the application’s functional design includes two user modes: the “Digital Learner” mode (for seniors) and the “Tech Mentor” mode (for juniors). It integrates four key modules: a digital skills learning module, an intergenerational interaction module (e.g., a “Call for Help” feature), a gamified reward system (including points, virtual achievements, and emotional rewards), and an optional AI voice assistant (providing game hints or acting as a virtual junior). The workflow design follows standard operational logic, such as onboarding screens for first-time users, registration pages, and senior–junior binding pages, while ensuring a clear, progressive learning path from simple to complex and from basic to advanced content, with logical continuity between topics. The interface is simple and intuitive, tailored to the cognitive characteristics of older users, for example, by reducing the number of interactive elements per page and enhancing their visibility. A “Call for Help” function allows older users facing difficulties to request assistance from their junior counterparts, who can then provide real-time guidance via voice or text, enhancing interactivity and collaboration. Additionally, a task challenge mechanism encourages older users to complete various digital skill challenges to earn achievement-based rewards, thereby boosting learning engagement and motivation.
In UI (user interface) and interaction design, this study optimized the layout based on Cognitive Aging Theory to make the application more intuitive, simplified, and user-friendly. The interface uses large fonts, high-contrast color schemes, and clear navigation structures to ensure that older users can browse and operate the app effortlessly. Additionally, for critical operations (such as payments and information submissions), confirmation dialogs were introduced to reduce the risk of errors and improve safety and reliability. To enhance learning interest and user experience, gamification-based interactions were further optimized. For instance, upon completing a task, users receive instant feedback (e.g., animated prompts, voice guidance), helping them track their progress and achievements while boosting confidence. At the same time, operation processes were simplified, reducing multi-level menus and improving touch interactions to ensure that tasks can be completed with minimal steps.
In the development phase, Android Studio was used as the primary development environment. Given the operational habits of older users, the application’s interface interactions adopted a “tap-to-enter” navigation method, allowing users to easily access different functional modules without complex operations. Moreover, this study employed an iterative feedback loop model, collecting continuous user feedback throughout the development process to optimize operational fluency, interaction experience, and visual design. For example, early user testing revealed that some older adults were unfamiliar with swipe gestures, leading to an adjustment to a “single-tap confirmation” mode to reduce learning difficulty and error rates. Additionally, voice-assisted functions were incorporated into key learning tasks, further reducing cognitive load and improving interactive accessibility.
At present, the application has only completed the initial stage of basic development, with five levels created for testing purposes. Core features such as account registration, senior–junior pairing, interaction, and real-time communication have not yet been implemented. Therefore, during the testing phase of this study, each group used an Android device with the test version of the app installed. To simulate the guidance from juniors to seniors, a volunteer was assigned to observe the senior’s gameplay in person. When the senior encountered difficulties and pressed the app’s “Request Help” button, the on-site volunteer would provide prompts and guidance, simulating the app’s intended intergenerational interaction functionality.
3.1.3. Phases Four and Five: Implementation and Evaluation
This study adopted a between-group experimental design to evaluate the effectiveness of the collaborative gamified digital mentorship application in improving digital literacy among older adults. The research focused on measuring digital skill acquisition and examining the impact of collaborative gamification on user experience and learning motivation.
First, a total of 90 older adults aged 60 and above were recruited from a senior university in Taiyuan, Shanxi Province, China. Participants were recruited after we obtained permission from the university administration. Recruitment was conducted through both online announcements and offline flyers distributed on campus. Inclusion criteria included: age ≥ 60 years, basic literacy skills, voluntary participation with signed informed consent, and confirmation in the pre-test that they had not yet mastered the seven digital skills involved in the experiment. To ensure the representativeness of the sample, recruitment considered factors such as educational background, digital skill levels, and smartphone usage experience. Exclusion criteria included: (1) individuals with severe cognitive impairment, mental disorders, or physical conditions that hinder learning, as well as those unable to complete a continuous two-hour learning task; (2) older adults who had participated in similar digital skills training in the past year. No participants dropped out during the experiment. To encourage participation, each participant received a small gift valued at approximately 30 RMB as a token of appreciation after completing the sessions.
Second, this study adopted a three-arm parallel randomized controlled trial (RCT) design, with participants randomly assigned to one experimental group and two control groups (N = 30 per group), each receiving a different learning intervention (Figure 2).
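The three-arm allocation can be sketched as a simple seeded shuffle followed by an equal split. The helper below is an illustrative assumption about how such an assignment might be implemented, not the study's actual randomization procedure; the participant IDs and seed are placeholders.

```python
import random

def assign_groups(participant_ids, groups=("A", "B", "C"), seed=42):
    """Randomly assign participants to equally sized groups.

    Shuffles the ID list with a fixed seed (for reproducibility) and
    slices it into len(groups) consecutive blocks of equal size.
    """
    ids = list(participant_ids)
    rng = random.Random(seed)
    rng.shuffle(ids)
    per_group = len(ids) // len(groups)
    allocation = {}
    for i, group in enumerate(groups):
        for pid in ids[i * per_group:(i + 1) * per_group]:
            allocation[pid] = group
    return allocation

# 90 hypothetical participant IDs split into three arms of 30 each.
allocation = assign_groups(range(1, 91))
counts = {g: sum(1 for v in allocation.values() if v == g) for g in "ABC"}
print(counts)  # each arm receives 30 participants
```

A fixed seed makes the allocation reproducible for auditing; in practice, allocation concealment (e.g., sealed envelopes or a third-party list) would accompany any such procedure.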
Since intergenerational interaction was involved in the experiment, the “tech mentor mode” was carried out by trained staff members. All mentors completed a two-day standardized training program, which covered the digital skills instructional script, communication techniques, and strategies for supporting older adults in learning. After the training, mentors were required to pass a teaching simulation assessment to ensure consistency in instructional content and methods, as well as the ability to adapt flexibly to learners’ progress.
The independent variable in this study was the three learning intervention modes (A/B/C models), characterized by differences in knowledge transfer medium (offline instruction/human–computer interaction/human–computer collaboration) and interaction dimension (one-way teaching/self-directed exploration/intergenerational collaboration). The dependent variables included:
① Digital Skill Acquisition: seven core digital skills (e.g., mobile payments, social media use) were selected based on the needs assessment and were evaluated using a dichotomous scoring system (1 = success, 0 = failure). Participants were required to perform skill-related tasks independently in a standardized environment, with double-blind observers recording accuracy.
② User Experience: measured using the User Experience Questionnaire—Short (UEQ-S), an eight-item scale divided into pragmatic quality (e.g., efficiency, clarity) and hedonic quality (e.g., attractiveness, innovativeness). Each item was rated on a 7-point semantic differential scale (e.g., “inefficient—efficient”), with scores ranging from 1 (negative experience) to 7 (positive experience). Mean scores for both dimensions were calculated separately to reflect user experience levels.
③ Learning Motivation: measured using the ARCS Model of Motivation (Attention, Relevance, Confidence, Satisfaction), assessed through a 16-item scale (4 items per dimension) on a 5-point Likert scale (1 = strongly disagree, 5 = strongly agree), with a total score ranging from 16 to 80.
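The three dependent variables reduce to simple scoring rules, which the sketch below expresses directly from the descriptions above: a dichotomous sum over the seven skill tasks, dimension means for the eight UEQ-S items on a 1–7 scale, and a total over sixteen 5-point ARCS items. The assumption that the first four UEQ-S items are pragmatic and the last four hedonic is illustrative, as is the example data.

```python
def skill_score(results):
    """Digital skill acquisition: count of the seven tasks completed
    (each task scored 1 = success, 0 = failure)."""
    return sum(results)

def ueq_s_scores(ratings):
    """UEQ-S: eight items on a 1-7 scale. Returns (pragmatic, hedonic)
    dimension means, assuming items 1-4 are pragmatic and 5-8 hedonic."""
    pragmatic = sum(ratings[:4]) / 4
    hedonic = sum(ratings[4:]) / 4
    return pragmatic, hedonic

def arcs_total(ratings):
    """ARCS motivation: sixteen 5-point Likert items (4 per dimension);
    the total therefore ranges from 16 to 80."""
    return sum(ratings)

# Hypothetical responses from one participant.
print(skill_score([1, 1, 0, 1, 0, 1, 1]))       # 5 of 7 tasks succeeded
print(ueq_s_scores([6, 5, 7, 6, 4, 5, 6, 5]))   # (pragmatic, hedonic) means
print(arcs_total([4] * 16))                      # 64 of a possible 80
```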
Experimental tasks were selected based on the needs assessment, covering high-frequency digital skills commonly required by older adults, including online shopping (76.7%), calling/messaging (73.4%), social media use (68.2%), photo/video editing (20.7%), and video chat (20.3%). Skills were selected based on a progressive difficulty principle, ensuring fairness and scientific rigor in comparing different learning models. All selected skills were confirmed through pre-testing to be unmastered by participants before the experiment, ensuring the reliability of results. The experiment was conducted in a standardized environment, with the venue set up as a quiet, comfortable, independent space to maintain consistent environmental conditions and lighting. All participants used the same model of Android smartphone, pre-installed with the standardized test application “Digital Bridge”, and were provided with a uniform network configuration to minimize external factors influencing the results.
The experiment was divided into a pre-test phase, an intervention phase, and a post-test phase (as illustrated in Figure 2), with the following procedures:
① Pre-Test Phase: participants’ baseline digital skills were assessed to ensure that none had prior knowledge of the seven experimental tasks. Additionally, this phase included device operation training to minimize the impact of unfamiliarity with the equipment.
② Intervention Phase: participants engaged in their assigned learning tasks according to group allocation. Researchers used a standardized observation record sheet to document learning duration, types and frequency of encountered issues, help-seeking behaviors, emotional responses, and learning strategies, ensuring comprehensive data collection.
③ Post-Test Phase: immediately after learning completion, the following assessments were conducted:
User Experience Measurement (UEQ-S).
Learning Motivation Measurement (ARCS Model of Motivation Scale).
Digital Skill Acquisition Assessment: participants performed skill-related tasks independently in a standardized environment, with double-blind observers recording their performance.
This study adopted an immediate testing approach to minimize the effects of short-term memory decay on test results. Additionally, all tests strictly followed a double-blind design, ensuring that both assessors and participants were unaware of their specific group allocation, thus maintaining data objectivity and experimental validity. Furthermore, all experiment staff underwent standardized training to ensure uniform instructional content and interaction methods across groups, adhering to a predefined teaching script to guarantee consistency in the experiment.
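As a minimal illustration of how the post-test skill scores might be tabulated across the three arms before any formal analysis, the sketch below computes per-group means. The group labels follow the A/B/C intervention modes described above, but all score values are hypothetical placeholders, not study data.

```python
from statistics import mean

# Hypothetical post-test records: group label -> per-participant
# skill scores (0-7 tasks completed). Values are illustrative only.
results = {
    "A (offline instruction)": [5, 4, 6, 5],
    "B (self-directed app use)": [4, 5, 5, 4],
    "C (collaborative gamified app)": [6, 6, 5, 7],
}

# Descriptive summary per arm, printed to two decimal places.
for group, scores in results.items():
    print(f"{group}: mean skill score = {mean(scores):.2f}")
```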