Putting It All Together: Combining Learning Analytics Methods and Data Sources to Understand Students’ Approaches to Learning Programming
Abstract
1. Introduction
2. Background
2.1. Temporality in Learning Analytics
2.2. Learning Analytics in Programming Education
3. Methods
3.1. Context
3.2. Data Sources
3.2.1. LMS Data
3.2.2. Automated Assessment Tool Data
3.3. Data Preparation
3.4. Data Analysis
3.4.1. Identification of Learning Tactics
3.4.2. Identification of Types of Learners According to Their Learning Strategies
4. Results
4.1. Identification of Learning Tactics
4.1.1. Identification of Learning Tactics in the LMS
- Slide-oriented instruction (n = 9258, 56.2%): This cluster is dominated by actions related to studying the course materials, with viewing slideshows as the prevailing learning activity. This tactic was the most frequent among the students.
- Video-oriented instruction (n = 2054, 12.5%): The predominant action in this cluster was video-watching. These were sessions in which students consumed the videos to acquire the knowledge and skills needed to tackle the assignments.
- Assignment-viewing (n = 2784, 16.9%): This cluster contains single-activity sessions in which the only traced learning activity was viewing the assignment instructions.
- Help-seeking (n = 2379, 14.4%): This cluster is dominated by forum consumption. Students’ first action in this learning tactic was reading the forum messages, followed by reviewing the assignment instructions and sometimes consulting the slides or videos, or even writing a forum post.
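Each tactic above is characterized by the action that dominates its sessions. A minimal sketch of that interpretation step, using hypothetical session logs and a simple dominant-action labeling in place of the paper's process-mining and clustering pipeline:

```python
from collections import Counter

# Hypothetical session logs: each session is the list of LMS actions it contains.
sessions = [
    ["View slideshow", "View slideshow", "Watch video"],
    ["Watch video", "Watch video", "View slideshow"],
    ["View assignment instructions"],
    ["View forum post", "View forum post", "View assignment instructions", "View slideshow"],
]

def dominant_action(session):
    """Return the most frequent action in a session."""
    return Counter(session).most_common(1)[0][0]

labels = [dominant_action(s) for s in sessions]
print(labels)
```

The four labels correspond, in order, to the slide-oriented, video-oriented, assignment-viewing and help-seeking patterns described above.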
4.1.2. Identification of Learning Tactics in the Automated Assessment Tool
- Assignment approaching (n = 3136, 46.4%): This cluster comprises learning sessions in which students progressively improved their score until obtaining an A. Most students quickly scored at least a B and often achieved an A towards the end of the session, at which point they submitted the assignment.
- Assignment struggling (n = 2280, 32.7%): This cluster is initially dominated by students obtaining an F score and shows a much less straightforward path towards succeeding at the assignment than the previous cluster, indicating that students encountered difficulties when completing the assignment.
- Assignment exploring (n = 941, 13.9%): This cluster includes sessions in which students only downloaded the code template for an assignment and ran the tests once to get an idea of what the assignment looked like, so they could watch out for similar content when viewing the slides or watching the videos. Compared to the previous tactics, this one was not very common among students.
- Assignment succeeding (n = 468, 6.9%): This cluster shows sessions in which students immediately scored an A and submitted the assignment.
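The four tactic labels can be illustrated with a toy rule over a session's sequence of letter scores. This heuristic is an illustrative stand-in for the paper's actual clustering, not the authors' method:

```python
def classify_session(scores):
    """Classify a session by its sequence of test scores ('F' < 'C' < 'B' < 'A').

    Toy heuristic for illustration only; the paper derived these labels
    by clustering session sequences, not with hand-written rules.
    """
    if not scores:
        return "Assignment exploring"      # template downloaded, no scored runs
    if scores[0] == "A":
        return "Assignment succeeding"     # top score on the first test run
    if scores[-1] == "A":
        return "Assignment approaching"    # progressive improvement up to an A
    return "Assignment struggling"         # no A reached within the session

print(classify_session(["F", "C", "B", "A"]))  # Assignment approaching
```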
4.2. Putting It All Together
4.2.1. Learning Tactics Applied When Studying (Not Working on the Assignments)
4.2.2. Learning Tactics Applied When Struggling
4.2.3. Learning Tactics Applied When Not Showing Signs of Struggle
4.3. Identification of Types of Learners According to Their Learning Strategies
- Determined (n = 27, 9.2%): Determined students had the highest frequency of learning sessions in both the LMS and the automated assessment tool. They struggled with the assignments more often than the other two groups. However, the results also show that they were the group with the most sessions of the type Assignment approaching, meaning that they also had very productive working sessions in which they successfully solved the assignments. This group also showed increased help-seeking behavior and made intensive use of the slideshows and videos, especially the latter.
- Strategists (n = 148, 50.7%): The students in the strategists group had an intermediate number of learning sessions both in the LMS and in the automated assessment tool. They surpassed determined learners in only two learning tactics, Assignment exploring and Assignment succeeding, which were the shortest session types, indicating that they did not need as many working sessions to complete their assignments. Moreover, these learners seem to have used the slideshows as their go-to self-instruction tactic.
- Low-effort (n = 117, 40.1%): Lastly, low-effort students had the fewest interactions with both the LMS and the automated assessment tool. The most common session type for these students was Assignment succeeding, which may suggest avoidance of the automated assessment tool. They showed a low frequency of self-instruction, with a slight preference for video-oriented tactics over slide-oriented ones.
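The three learner types differ mainly in how many sessions of each tactic a student accumulates. A minimal sketch of building per-student tactic-frequency profiles and ranking students by overall activity — made-up data, and a deliberate simplification of the clustering the paper applied to these profiles:

```python
# Hypothetical per-student tactic frequencies (sessions per tactic).
students = {
    "alice": {"Slide-oriented": 40, "Help-seeking": 12, "Assignment approaching": 20},
    "bob":   {"Slide-oriented": 18, "Assignment succeeding": 9},
    "carol": {"Video-oriented": 4,  "Assignment succeeding": 3},
}

# Total sessions per student, the kind of feature that separates
# determined, strategist and low-effort profiles.
totals = {name: sum(freqs.values()) for name, freqs in students.items()}

# Stand-in for the paper's clustering: rank students by overall activity.
ranked = sorted(totals, key=totals.get, reverse=True)
print(ranked)
```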
5. Discussion
6. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Learning actions recorded in the LMS:

| Learning Action | Description |
|---|---|
| View assignment instructions | View the written instructions of a programming assignment, including the grading rubric |
| View course information | View course instructions and planning |
| View forum post | View a forum post related to the assignments |
| View sample exam | View a solved exam from a previous academic year |
| View slideshow | View lesson slides, including explanations and code examples |
| Watch video | Watch a video lesson |
| Write forum post | Write a forum post |
Learning actions recorded in the automated assessment tool:

| Learning Action | Description |
|---|---|
| Start assignment | Download the assignment |
| Score F | Run the tests and get a score under 5 |
| Score C | Run the tests and get a score between 5 and 7 |
| Score B | Run the tests and get a score between 7 and 10 |
| Score A | Run the tests and get a score of 10 |
| Submit assignment | Submit the assignment to the LMS |
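The score bands above overlap at their boundaries ("between 5 and 7" and "between 7 and 10" both name 7). A sketch of the score-to-letter mapping, with half-open intervals as an assumption:

```python
def grade(score):
    """Map a numeric test score (0-10) to the letter codes logged by the tool.

    Assumption: boundaries are half-open (5 -> C, 7 -> B, 10 -> A), since the
    source table's 'between' ranges overlap at 5, 7 and 10.
    """
    if score >= 10:
        return "A"
    if score >= 7:
        return "B"
    if score >= 5:
        return "C"
    return "F"

print(grade(7.5))  # B
```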
| Context | Learning Action | 25% | Median | 75% | Total |
|---|---|---|---|---|---|
| LMS | View assignment instructions | 21.00 | 30.00 | 40.00 | 9381 |
| | View course information | 7.00 | 11.00 | 15.00 | 3391 |
| | View forum post | 9.75 | 23.00 | 42.25 | 8744 |
| | View sample exam | 3.00 | 8.00 | 14.00 | 2997 |
| | View slideshow | 17.00 | 33.00 | 50.00 | 10,934 |
| | Watch video | 10.00 | 17.00 | 30.00 | 6375 |
| | Write forum post | 0.00 | 0.00 | 1.00 | 224 |
| Automated assessment tool | Start assignment | 7.00 | 8.00 | 9.00 | 2336 |
| | Score A | 23.00 | 32.00 | 40.00 | 9405 |
| | Score B | 9.00 | 18.00 | 29.00 | 6848 |
| | Score C | 6.00 | 13.00 | 20.00 | 4770 |
| | Score F | 19.75 | 36.50 | 58.00 | 12,903 |
| | Submit assignment | 11.00 | 12.00 | 14.00 | 3678 |
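Summary statistics like the 25%, Median and 75% columns above can be reproduced with Python's standard library; the counts below are made up for illustration:

```python
import statistics

# Made-up per-student counts of a single learning action (for illustration).
counts = [21, 25, 30, 34, 40, 42, 18, 29]

# statistics.quantiles with n=4 returns the three quartile cut points,
# i.e. the 25%, 50% (median) and 75% values.
q1, med, q3 = statistics.quantiles(counts, n=4)
print(q1, med, q3)
```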
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
López-Pernas, S.; Saqr, M.; Viberg, O. Putting It All Together: Combining Learning Analytics Methods and Data Sources to Understand Students’ Approaches to Learning Programming. Sustainability 2021, 13, 4825. https://doi.org/10.3390/su13094825