Detecting Credit-Seeking Behavior with Programmed Instruction Framesets in a Formal Languages Course
Abstract
1. Introduction
- Unlike previous studies that broadly categorize credit-seeking behavior, our research provides a fine-grained analysis of how different question types (multi-choice, single-choice, and true/false) influence credit-seeking behavior. We find that multi-choice questions are particularly prone to triggering credit-seeking, a significant insight for the design of educational software.
- Our study provides empirical evidence of a negative correlation between credit-seeking behavior and learning outcomes. Students who engaged in credit-seeking behavior achieved lower exam scores and lower overall academic performance than their peers. This finding underscores the importance of addressing credit-seeking behavior to improve educational outcomes.
- We used an unsupervised learning technique to cluster work sessions based on student interactions. This approach identified patterns of credit-seeking behavior without relying on pre-labeled data, making our model more adaptable to different educational contexts.
2. Background
3. Literature Review
4. Materials and Methods
4.1. Interaction Logs Dataset
4.2. Dataset Preprocessing
- We removed any non-student interactions from the dataset, such as interactions by teaching staff and those automatically generated by the system.
- We standardized frameset names over the course of the three semesters, as some names were changed as the content was steadily improved.
- We excluded interactions that occurred after the end of the semester, because they had no association with credit-seeking: once the semester ended, there was no longer any credit reward.
- We transformed the raw interactions into work sessions, each consisting of a series of chronologically ordered activities. We identified five types of activities that could appear in a session: sessions started with SESSION_START and ended with SESSION_END; a correct answer was labeled CRRCT, and an incorrect answer was labeled X; the activity BACK occurred when the student moved backward in the same frameset and attempted a question they had previously solved. If more than two minutes elapsed between two consecutive interactions, we ended the current session and started a new one beginning with the next interaction.
- We removed sessions with fewer than three interactions, since sessions with only one or two interactions were not useful for distinguishing credit-seeking from active learning patterns.
- We aggregated each session's activities into a single data point by computing a set of per-session attributes. These attributes are the final features used in the clustering process for the coarse-grained analysis; they are explained next.
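The sessionization and aggregation steps above can be sketched as follows. This is a minimal illustration, not the paper's exact implementation: in particular, the reading of "consecutive" correct/incorrect attempts (an attempt whose immediate predecessor had the same outcome) is our assumption, inferred from the attribute names.

```python
from statistics import median

GAP_SECONDS = 120      # split sessions on gaps of more than two minutes
MIN_INTERACTIONS = 3   # shorter sessions are discarded

def build_sessions(events):
    """Group (timestamp_seconds, activity) pairs into work sessions.

    A new session starts whenever the gap between consecutive
    interactions exceeds GAP_SECONDS; sessions with fewer than
    MIN_INTERACTIONS interactions are dropped.
    """
    sessions, current = [], []
    for ts, act in sorted(events):
        if current and ts - current[-1][0] > GAP_SECONDS:
            if len(current) >= MIN_INTERACTIONS:
                sessions.append(current)
            current = []
        current.append((ts, act))
    if len(current) >= MIN_INTERACTIONS:
        sessions.append(current)
    return sessions

def session_attributes(session):
    """Aggregate one session into the clustering features.

    CRRCT marks a correct answer and X an incorrect one; activities
    such as SESSION_START and BACK are not counted as attempts.
    """
    times = [ts for ts, _ in session]
    answers = [a for _, a in session if a in ("CRRCT", "X")]
    n = len(answers) or 1

    def consec(label):
        # Assumed definition: attempts whose predecessor had the same outcome.
        return sum(1 for i in range(1, len(answers))
                   if answers[i] == label and answers[i - 1] == label)

    gaps = [b - a for a, b in zip(times, times[1:])]
    return {
        "pct_incorrect": answers.count("X") / n,
        "pct_correct": answers.count("CRRCT") / n,
        "pct_consec_incorrect": consec("X") / n,
        "pct_consec_correct": consec("CRRCT") / n,
        "median_gap_s": median(gaps) if gaps else 0.0,
    }
```

A session dominated by rapid, repeated incorrect attempts (high `pct_consec_incorrect` together with a small `median_gap_s`) is the kind of signature the coarse-grained clustering can separate from active learning.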
4.3. Work Session Attributes
5. Results
5.1. Coarse-Grained Analysis
5.2. Fine-Grained Analysis
6. Discussion
7. Threats to Validity
8. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Definition |
---|---|
PI | Programmed Instruction |
FLA | Formal Languages and Automata |
PCA | Principal Components Analysis |
FPC | Fuzzy Partition Coefficient |
References
Student ID | Session ID | Frameset Name | Percentage of Consecutive Incorrect Attempts | Percentage of Consecutive Correct Attempts | Percentage of Incorrect Attempts | Percentage of Correct Attempts | Median Time Between (s) |
---|---|---|---|---|---|---|---|
812 | 4 | GrammarIntroFS | 0.053 | 0.579 | 0.211 | 0.790 | 19.5 |
5112 | 2185 | PDAFS | 0.667 | 0.200 | 0.733 | 0.267 | 3.5 |
6557 | 3323 | ClosureConceptFS | 0.000 | 0.538 | 0.231 | 0.770 | 7.0 |
6660 | 4603 | RemoveUselessFS | 0.541 | 0.054 | 0.757 | 0.243 | 2.0 |
Attribute | PC1 | PC2 |
---|---|---|
% of Incorrect Attempts | −0.993 | 0.0477 |
% of Correct Attempts | 0.995 | −0.0603 |
% of Consecutive Incorrect Attempts | 0.905 | 0.0846 |
% of Consecutive Correct Attempts | 0.904 | −0.0799 |
Median Time Between (s) | 0.266 | 0.964 |
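Loadings of the kind shown above (each entry close to the correlation between an attribute and a component) can be obtained from the correlation matrix of the standardized session features. A minimal numpy sketch, assuming the common convention that loadings are eigenvectors scaled by the square roots of their eigenvalues:

```python
import numpy as np

def pca_loadings(X):
    """Return the loading matrix (variables x components).

    Loadings are eigenvectors of the correlation matrix scaled by
    sqrt(eigenvalue), so each entry is the correlation between an
    original attribute and a principal component.
    """
    R = np.corrcoef(X, rowvar=False)       # correlation matrix of features
    vals, vecs = np.linalg.eigh(R)         # eigenvalues in ascending order
    order = np.argsort(vals)[::-1]         # sort components by variance
    vals, vecs = vals[order], vecs[:, order]
    return vecs * np.sqrt(np.maximum(vals, 0.0))
```

As a sanity check, two perfectly correlated attributes both load on PC1 with magnitude 1, mirroring how the four attempt-percentage attributes dominate PC1 while the median time between interactions dominates PC2.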
Measure | Semester | Credit-Seeking Mean | Credit-Seeking Std | Credit-Seeking Count | Actively Learning Mean | Actively Learning Std | Actively Learning Count | p-Value | Effect Size |
---|---|---|---|---|---|---|---|---|---|
Total Exam Score | Fall 2020 | 265.55 | 26.71 | 11 | 289.85 | 31.58 | 56 | 0.0083 | 0.79 |
Total Exam Score | Spring 2021 | 268.77 | 45.04 | 23 | 297.72 | 32.30 | 41 | 0.0051 | 0.78 |
Total Exam Score | Spring 2022 | 234.90 | 51.50 | 10 | 269.07 | 49.79 | 58 | 0.037 | 0.68 |
Overall Score | Fall 2020 | 82.81 | 7.24 | 11 | 85.59 | 8.12 | 56 | 0.14 | 0.35 |
Overall Score | Spring 2021 | 78.14 | 10.26 | 23 | 85.44 | 8.94 | 41 | 0.0034 | 0.77 |
Overall Score | Spring 2022 | 79.96 | 6.77 | 10 | 83.22 | 7.36 | 58 | 0.094 | 0.45 |
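The reported effect sizes are consistent with Cohen's d computed from the group summaries using the pooled standard deviation; the table does not state the convention explicitly, so the following sketch assumes it:

```python
from math import sqrt

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d from group summary statistics, using the pooled
    standard deviation weighted by each group's degrees of freedom."""
    pooled_var = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)
    return abs(m1 - m2) / sqrt(pooled_var)

# Fall 2020 total exam scores: credit-seeking vs. actively learning
d = cohens_d(265.55, 26.71, 11, 289.85, 31.58, 56)  # about 0.79
```

Plugging in the Spring 2021 summaries likewise recovers the reported 0.78, supporting the pooled-SD reading.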
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Elnady, Y.; Farghally, M.; Mohammed, M.; Shaffer, C.A. Detecting Credit-Seeking Behavior with Programmed Instruction Framesets in a Formal Languages Course. Educ. Sci. 2025, 15, 439. https://doi.org/10.3390/educsci15040439