Proceeding Paper

Generative Artificial Intelligence-Based Gamified Programming Teaching System: Promoting Peer Competition and Learning Motivation †

1 Undergraduate Program in College of Electrical Engineering and Computer Science, Chung Yuan Christian University, Taoyuan 320314, Taiwan
2 Computer Engineering, Chung Yuan Christian University, Taoyuan 320314, Taiwan
* Author to whom correspondence should be addressed.
Presented at the 2024 4th International Conference on Social Sciences and Intelligence Management (SSIM 2024), Taichung, Taiwan, 20–22 December 2024.
Eng. Proc. 2025, 98(1), 9; https://doi.org/10.3390/engproc2025098009
Published: 12 June 2025

Abstract

In traditional programming education, teachers typically design fixed questions and standard answers, manually grading the solutions submitted by students. This process not only requires significant time and effort from educators but may also fail to provide timely and personalized feedback due to limited teaching resources. To alleviate these burdens and enhance teaching efficiency, this study leverages generative artificial intelligence (AI) technology to develop a system capable of automatically generating questions and grading answers. Students engage in programming exercises through a gamified approach, with the system providing instant feedback on their answers. Additionally, student performance is displayed via leaderboards, incorporating peer competition to boost learning motivation. According to a user survey, the gamified system demonstrates significant advantages: 56.67% of students found the system easy to use; 40% considered the system well-integrated; 60% indicated that they quickly mastered the system’s functionality, and over half (53.33%) believed that the leaderboard effectively enhanced their competitive awareness and motivation. These results suggest that the system not only reduces teachers’ workload but also increases student engagement and learning outcomes through gamified design.

1. Introduction

Recent advancements in information technology, particularly artificial intelligence (AI), have significantly impacted various fields, including programming education. Because AI can generate code, programming has become more accessible than before, and students can rely on AI to write basic programs. However, this convenience presents challenges: students may become overly dependent on AI, which hinders the development of their problem-solving abilities and the improvement of their programming skills. Moreover, traditional programming education often involves one-way instruction from teachers, which fails to engage students, resulting in poor interaction and reduced learning motivation [1,2].
In the current educational landscape, programming education still largely relies on conventional teacher-centered approaches in which students passively listen to lectures, even though innovative teaching methods such as problem-based learning (PBL) and flipped classrooms have been introduced. This lack of interaction diminishes classroom appeal and fails to effectively stimulate students’ learning motivation.
This study aims to address these issues by developing a gamified learning system that integrates programming education with gamification. To ensure user-friendliness, the system draws inspiration from the popular game Super Mario Bros., known for its simple controls and intuitive gameplay that minimize cognitive load for players. By combining AI technology with immersive gameplay, the system creates an engaging learning environment in which players solve AI-generated programming problems while avoiding obstacles in the game. This approach encourages students to think critically and apply their programming knowledge as they play. Teachers only need to provide keywords, and the AI automatically generates programming questions and test data, ensuring students receive personalized and challenging tasks.
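As a rough illustration of this keyword-driven workflow, the following Python sketch shows one way the system might prompt a generative AI model and parse its reply into a question with matching test data. The call_llm helper, the prompt wording, and the JSON reply format are assumptions made for illustration; the paper does not specify the model or API used.

```python
import json

def build_prompt(keywords, difficulty="beginner"):
    """Compose a prompt asking the model for one programming exercise plus test data."""
    return (
        f"Generate one {difficulty}-level programming exercise about: {', '.join(keywords)}.\n"
        "Return JSON with the fields 'question', 'input_examples', and 'expected_outputs'."
    )

def generate_question(call_llm, keywords, difficulty="beginner"):
    """call_llm is any function that sends a prompt to a generative AI model and
    returns its text reply (hypothetical; the paper does not name the model)."""
    raw = call_llm(build_prompt(keywords, difficulty))
    data = json.loads(raw)  # assumes the model follows the JSON format requested above
    return {
        "question": data["question"],
        "tests": list(zip(data["input_examples"], data["expected_outputs"])),
    }
```

A teacher entering keywords such as ["loops", "lists"] would then obtain a question together with paired input/output cases that the system can grade automatically.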
To further enhance competitiveness, this system incorporates a scoring mechanism and leaderboards. Students accumulate points by answering questions correctly and lose points for incorrect answers, motivating them to review their responses to maintain their cumulative scores. At the end of each game, scores are calculated to determine if a new high score is achieved and to rank students on the leaderboard. This motivates students to improve their performance. Additionally, the system enables teachers to track students’ progress and learning outcomes, providing valuable insights into their development. Through this gamified learning process, the system aims to increase students’ interest in programming and enhance overall learning efficiency and educational outcomes.
To systematically evaluate the system’s practicality and effectiveness, we explored the relationships between system usability and user experience and between learning outcomes and peer competition. The results showed that the gamified learning system significantly enhanced students’ learning motivation and engagement and that the game’s leaderboard mechanism positively influenced students’ learning behavior.

2. Literature Review

2.1. Game-Based Learning

The concept of gamification has been widely applied across various areas, such as business and marketing. In education, gamification is integrated with digital learning and has become a significant part of educational technology. Gamified learning applies the motivational aspects of games to educational activities, tapping humans’ inherent desire for communication and achievement while managing students’ attention through goal-setting [3]. By gamifying learning, mundane content becomes engaging and lively, attracting students and fostering deeper involvement.
Game-based learning offers numerous advantages over traditional learning approaches, such as increasing motivation, enhancing learning outcomes, improving memory retention, and providing immediate feedback. In a gamified learning environment, students pursue clear objectives that, unlike monotonous examinations, provide immediate feedback and a sense of accomplishment, fostering enthusiasm for learning. Gamification significantly improves students’ motivation and academic performance by making learning more interactive and engaging [4].
However, while gamified learning enhances student engagement and motivation, it also presents potential challenges. Over-reliance on extrinsic rewards, such as points or badges, leads students to focus on these external incentives rather than intrinsic learning motivations [4]. In such cases, when external rewards are reduced or removed, students lose the drive for continued learning. Additionally, poorly designed games negatively impact learning. Overly challenging tasks frustrate students and diminish their interest in learning, while tasks that are too simple may fail to sufficiently motivate them [5].
Therefore, designing appropriately challenging tasks and personalizing difficulty levels based on students’ varying abilities are crucial for successful gamified learning. Integrating gamification with AI and big data technologies is also important: by analyzing students’ learning behaviors, these technologies can deliver more personalized learning experiences, offering precise feedback on progress and individualized learning paths [4].

2.2. Programming Questions

The difficulty levels of programming questions must be carefully calibrated [6]. Teachers typically adjust difficulty based on their teaching experience or on students’ learning outcomes, but such approaches often fall short of accurately reflecting reality [7]. Similarly, when teachers rely on their domain expertise to set question difficulty, the inherently subjective nature of cognitive difficulty often results in inaccuracies [8]. This suggests that neither students’ performance nor teachers’ experience and expertise can ensure precision in question design, let alone tailor difficulty levels for students of varying proficiency.
Students learning to program bring diverse skill levels and experiences. Designing appropriately challenging questions therefore prevents them from becoming overwhelmed or disengaged, balancing challenge and progress [9]. Previous research on programming education has examined how question difficulty influences test-takers, including its effects on submission behavior, the sequence of difficulty, and cognitive load. Difficult questions led to fewer overall submissions but more submissions per individual. Additionally, arranging questions from easier to harder resulted in better accuracy and lower cognitive load than the reverse order. Thus, the order of difficulty significantly impacts response behavior, confidence, and motivation [10].
When using generative AI to design questions, it is critical to carefully construct keywords to prevent poor learning outcomes. Additionally, assigning appropriate response times and attempt limits based on difficulty levels helps ensure learning effectiveness.
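As a concrete, purely illustrative reading of the second point, difficulty-dependent limits could be kept in a simple configuration table; the values below are assumptions, not figures reported in the study.

```python
# Illustrative mapping from difficulty level to answering constraints
# (the values are assumptions; the paper does not report the limits it used).
DIFFICULTY_LIMITS = {
    "easy":   {"time_limit_sec": 120, "max_attempts": 3},
    "medium": {"time_limit_sec": 240, "max_attempts": 2},
    "hard":   {"time_limit_sec": 420, "max_attempts": 1},
}

def limits_for(difficulty: str) -> dict:
    """Fall back to the easiest setting if an unknown difficulty label is given."""
    return DIFFICULTY_LIMITS.get(difficulty, DIFFICULTY_LIMITS["easy"])
```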

3. Methods

3.1. Experimental Design

We selected students who had completed a minimum number of gameplay sessions and course learning hours. The students answered programming questions generated by generative AI in the game. Correct answers earned points and in-game benefits (e.g., bonus points or invincibility), while incorrect answers resulted in point deductions. A leaderboard mechanism provided real-time rankings, fostering a sense of competition to enhance student engagement and learning motivation.
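A minimal sketch of the scoring and ranking logic described above is given below. The point values, the high-score check, and the data structures are assumptions for illustration; the paper does not state the exact scoring rules it used.

```python
from dataclasses import dataclass

@dataclass
class Player:
    name: str
    score: int = 0
    high_score: int = 0

def apply_answer(player: Player, correct: bool, gain: int = 10, penalty: int = 5) -> None:
    """Award points for a correct answer and deduct points for an incorrect one
    (the point values are illustrative assumptions)."""
    player.score += gain if correct else -penalty
    player.score = max(player.score, 0)

def end_of_game(player: Player) -> bool:
    """Update the personal high score and report whether a new record was set."""
    if player.score > player.high_score:
        player.high_score = player.score
        return True
    return False

def leaderboard(players: list[Player], top_n: int = 10) -> list[Player]:
    """Rank players by high score for the real-time leaderboard."""
    return sorted(players, key=lambda p: p.high_score, reverse=True)[:top_n]
```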

3.2. Questionnaire Design

The questionnaire was created to assess students’ subjective experiences and learning perceptions of the gamified learning system from the following aspects:
  • System usability experience: Evaluate whether the system is user-friendly and intuitive, including interface design, feature integration, and overall ease of use (e.g., how quickly students could master the system’s operations);
  • Learning motivation and competitive influence: Focus on whether the competitive mechanisms in the system (e.g., the point system and leaderboard) effectively encouraged students’ learning engagement. Questions included evaluations of whether “competition motivated learning” and “efforts are made to improve ranking”;
  • Subjective evaluation of learning effectiveness: Measure the extent to which the system improved students’ programming knowledge, such as enhancing logical thinking and problem-solving skills. Items were rated on a five-point Likert scale (1 = strongly disagree; 5 = strongly agree), and open-ended sections were included for feedback on system improvements.

3.3. Data Collection and Processing

The questionnaire was administered during class, allowing students to reflect on their overall experience and provide feedback. The data collected were analyzed to evaluate the system’s features, learning outcomes, and competitive mechanisms as perceived by students. Metrics such as students’ highest scores, the number of questions attempted, and their answer accuracy rates were calculated to examine usage patterns.
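The usage metrics mentioned above could be computed from per-answer records roughly as follows; the record fields and helper name are assumptions introduced for illustration.

```python
from collections import defaultdict

# Each record is assumed to look like:
# {"student": "s01", "question_id": 3, "correct": True, "score": 40}
def summarize(records):
    """Aggregate per-student highest score, questions attempted, and accuracy rate."""
    stats = defaultdict(lambda: {"attempted": 0, "correct": 0, "best_score": 0})
    for r in records:
        s = stats[r["student"]]
        s["attempted"] += 1
        s["correct"] += int(r["correct"])
        s["best_score"] = max(s["best_score"], r["score"])
    for s in stats.values():
        s["accuracy"] = s["correct"] / s["attempted"] if s["attempted"] else 0.0
    return dict(stats)
```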

4. Results and Analysis

We analyzed the impact of the generative AI-based gamified programming education system on students’ learning outcomes and motivation through user surveys. The results were analyzed across several dimensions, including system usability, motivation enhancement, peer competition, and improvements in learning performance.

4.1. System Usability and Integration

4.1.1. Ease of Use

A total of 56.67% of the students found the system simple and user-friendly, indicating that the design adequately considered user experience. The intuitive interface and functions allowed students to quickly adapt and engage in learning. For students unfamiliar with interface operations, the streamlined workflow significantly reduced technical barriers, minimizing frustration unrelated to learning (Table 1).

4.1.2. Functional Integration

A total of 40% of students indicated that the system’s functionality was well-integrated, showing success in combining gamification elements with learning objectives. Moreover, 60% found the game easy to play, reflecting the effectiveness of the “learning-as-gaming” model. The game promoted immersive learning, enabling students to acquire programming knowledge during gameplay (Table 2 and Table 3).

4.2. Learning Motivation and Peer Competition Effects

4.2.1. Leaderboard

The leaderboard, a key mechanism for encouraging peer competition, demonstrated significant effects. A total of 53.33% of the students agreed that the leaderboard motivated their competitive intent and inspired them to improve their rankings. This extrinsic motivation drove students to focus more on their learning, pay attention to details, and seek better problem-solving methods (Table 4).

4.2.2. Competition

We analyzed the positive impact of competition on learning. The students were more eager to challenge themselves in the game, frequently checking the rankings after completing AI-generated tasks and improving their strategies. This continuous-improvement mindset enhanced short-term learning outcomes and fostered long-term study habits.

4.3. Learning Performance Enhancement Using Generative AI

4.3.1. Dynamic Question Generation and Differentiated Learning

Generative AI dynamically generated programming questions and adapted their difficulty based on student performance, making tasks challenging yet attainable. For high-achieving students, the system offered advanced challenges, while for beginners, it built confidence through basic tasks. This “differentiated instruction” met diverse learner needs and mitigated learning fatigue.
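One simple way to realize this kind of adaptation is to move a student up or down a difficulty ladder according to recent accuracy; the levels, thresholds, and recent-results window below are illustrative assumptions rather than the study’s actual rules.

```python
LEVELS = ["beginner", "intermediate", "advanced"]

def next_difficulty(current: str, recent_results: list[bool],
                    promote_at: float = 0.8, demote_at: float = 0.4) -> str:
    """Promote after high recent accuracy, demote after low accuracy
    (the thresholds are illustrative assumptions)."""
    if not recent_results:
        return current
    accuracy = sum(recent_results) / len(recent_results)
    idx = LEVELS.index(current)
    if accuracy >= promote_at and idx < len(LEVELS) - 1:
        return LEVELS[idx + 1]
    if accuracy <= demote_at and idx > 0:
        return LEVELS[idx - 1]
    return current
```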

4.3.2. Balance Between Challenge and Achievement

Challenging questions effectively boosted engagement and motivation. By successfully solving moderately difficult problems, students obtained a strong sense of achievement, further increasing their interest in programming. Additionally, the system’s real-time feedback mechanism encouraged students to reflect on errors and refine their strategies, fostering critical thinking skills.

4.4. In-Depth Exploration of Student Learning Behaviors

4.4.1. Changes in Learning Behavior

The gamified learning system prompted changes in student behavior. Compared with traditional classroom teaching, students displayed enhanced initiative. During gameplay, they faced time pressures and were required to think quickly to solve programming problems. These immediate challenges enhanced focus and problem-solving abilities.

4.4.2. Value of Data-Driven Learning Records

The system’s built-in database was used to track students’ performance, including answer accuracy and completion time for each question. These records helped teachers monitor progress and helped students realize their strengths and weaknesses, enabling targeted improvements on key challenges.
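The per-question records described above could be stored and queried roughly as follows, using SQLite as a stand-in; the table schema and column names are assumptions, since the paper does not describe its database layout.

```python
import sqlite3

def init_db(path="learning_records.db"):
    """Create a simple attempts table for per-question learning records."""
    conn = sqlite3.connect(path)
    conn.execute("""
        CREATE TABLE IF NOT EXISTS attempts (
            student_id  TEXT,
            question_id INTEGER,
            correct     INTEGER,  -- 1 = correct, 0 = incorrect
            seconds     REAL      -- completion time for the question
        )""")
    return conn

def student_summary(conn, student_id):
    """Per-student accuracy and average completion time, for teacher monitoring."""
    accuracy, avg_time, attempts = conn.execute(
        "SELECT AVG(correct), AVG(seconds), COUNT(*) FROM attempts WHERE student_id = ?",
        (student_id,),
    ).fetchone()
    return {"accuracy": accuracy, "avg_seconds": avg_time, "attempts": attempts}
```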

5. Conclusions

We developed an innovative system that integrates generative AI with gamified learning to enhance students’ motivation and programming skills. From the system’s implementation and evaluation, the following results were obtained.
The results demonstrated that dynamically generated questions and differentiated difficulty adjustments, implemented through generative AI, provided students with tailored learning experiences. This improved learning efficiency, critical thinking, and problem-solving skills. Additionally, the incorporation of gamification elements, such as point mechanisms, further boosted learner motivation, establishing a trend of sustained learning progress.
External mechanisms, such as leaderboards, strengthened peer competition. Backend data and survey feedback showed that students actively worked to improve their leaderboard rankings. Over half of the students considered ranking a key factor in their willingness to continue learning through the system and competing with peers. This competitive environment facilitated concept retention and growth through continuous engagement. Therefore, incorporating external mechanisms into a learning environment can yield significant benefits.
To further enhance the educational value of the system, the integration of gamified learning and generative AI should be explored in subjects such as mathematics, science, and other disciplines, offering diverse learning opportunities to more students. Team collaboration or competition features also need to be introduced to foster student interaction and cooperative learning. In addition, occasional misjudgments by generative AI that may lead to minor data misinterpretations must be addressed. Improved AI model training can significantly reduce such errors, helping teachers track student behavior and outcomes more effectively and make timely teaching interventions.
Compared with previous learning systems, which often relied too heavily on external rewards (e.g., badges or points) and potentially weakened intrinsic motivation, the system developed in this study presents a higher level of integration between game elements and educational content. Students increased their interest in learning while challenging themselves. Furthermore, whereas previous applications of generative AI in education focused primarily on theoretical discussions, this system provides empirical evidence of its value in real educational environments.
The developed gamified system demonstrates the potential of combining generative AI with gamified learning and offers a reference for the future development of educational technology. This innovative system promotes student progress in programming education and can be extended to broader educational applications.

Author Contributions

Conceptualization, C.-H.L.; methodology, Y.-J.C.; software, Z.-P.C. and Y.-J.C.; validation, Y.-J.C., Z.-P.C. and C.-H.L.; formal analysis, Y.-J.C. and Z.-P.C.; investigation, Y.-J.C. and Z.-P.C.; resources, Y.-J.C.; data curation, Y.-J.C. and Z.-P.C.; writing—original draft preparation, Y.-J.C. and Z.-P.C.; writing—review and editing, Y.-J.C., Z.-P.C. and C.-W.P.; visualization, Y.-J.C. and Z.-P.C.; supervision, Y.-J.C.; project administration, Y.-J.C.; funding acquisition, Z.-P.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in this study.

Data Availability Statement

The original contributions presented in this study are included in this article; further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Robins, A.; Rountree, J.; Rountree, N. Learning and teaching programming: A review and discussion. Comput. Sci. Educ. 2003, 13, 137–172. [Google Scholar] [CrossRef]
  2. Lai, C.-H. Design Interactive Response System Based on Hot Option for Improving the Participation of Students: A Case Study. Educ. J. 2023, 51, 77–99. [Google Scholar]
  3. Wood, L.C.; Reiners, T. Gamification. In Encyclopedia of Information Science and Technology, 3rd ed.; Khosrow-Pour, M., Ed.; Information Science Reference: Hershey, PA, USA, 2015; pp. 3039–3047. [Google Scholar]
  4. Jaramillo-Mediavilla, L.; Basantes-Andrade, A.; Cabezas-González, M.; Casillas-Martín, S. Impact of Gamification on Motivation and Academic Performance: A Systematic Review. Educ. Sci. 2024, 14, 639. [Google Scholar] [CrossRef]
  5. Majuri, J.; Koivisto, J.; Hamari, J. Gamification of Education and Learning: A Review of Empirical Literature. In Proceedings of the GamiFIN Conference 2018, Pori, Finland, 21–23 May 2018. [Google Scholar]
  6. Lai, C.-H. Apply Difficulty Perception Classification System to Improve Students’ Learning Achievement. In Proceedings of the Global Chinese Society for Computers in Education 2021 (GCCCE 2021), Taipei, Taiwan, 11–15 September 2021. [Google Scholar]
  7. Liu, J.; Liu, C.; Yuan, X.; Belkin, N.J. Understanding Searchers’ Perception of Task Difficulty: Relationships with Task Type. Proc. Am. Soc. Inf. Sci. Technol. 2011, 48, 1–10. [Google Scholar] [CrossRef]
  8. Verdú, E.; Verdú, M.J.; Regueras, L.M.; de Castro, J.P.; García, R. A Genetic Fuzzy Expert System for Automatic Question Classification in a Competitive Learning Environment. Expert Syst. Appl. 2012, 39, 7471–7478. [Google Scholar] [CrossRef]
  9. Wauters, K.; Desmet, P.; Van Den Noortgate, W. Adaptive Item-Based Learning Environments Based on the Item Response Theory: Possibilities and Challenges. J. Comput. Assist. Learn. 2010, 26, 549–562. [Google Scholar] [CrossRef]
  10. Wang, J.; Lin, P.; Tang, Z.; Chen, S. How Problem Difficulty and Order Influence Programming Education Outcomes in Online Judge Systems. Heliyon 2023, 9, e20947. [Google Scholar] [CrossRef] [PubMed]
Table 1. Response to statement “I Find the Game System Easy to Use”.
Response            Percentage
Strongly Disagree   3.33%
Disagree            13.33%
Neutral             26.67%
Agree               46.67%
Strongly Agree      10%
Table 2. Response to statement “I Feel I Need Experienced Help To Use The Game System”.
Response            Percentage
Strongly Disagree   10%
Disagree            50%
Neutral             33.33%
Agree               6.67%
Strongly Agree      0%
Table 3. Response to statement “I Feel The Game System Integrates Its Functions Well”.
Response            Percentage
Strongly Disagree   0%
Disagree            10%
Neutral             50%
Agree               30%
Strongly Agree      10%
Table 4. Response to statement “Seeing The Leaderboard Rankings Motivates Me To Aim For A Higher Position”.
Response            Percentage
Strongly Disagree   0%
Disagree            0%
Neutral             46.67%
Agree               30%
Strongly Agree      23.33%