Content-Focused Formative Feedback Combining Achievement, Qualitative and Learning Analytics Data
Abstract
1. Introduction
2. Theoretical References
2.1. Using Learning Analytics to Predict Student Achievement
2.2. Using Learning Analytics to Target Formative Feedback
- Always begin with the positive. Comments to students should first point out what students did well and recognize their accomplishments.
- Identify specific aspects of student performance that need improvement. Students need to know precisely where to focus their improvement efforts.
- Offer specific guidance and direction for making improvements. Students need to know what steps to take to make their product, performance or demonstration better and more aligned with established learning criteria.
- Express confidence in student abilities to achieve at the highest level. Students need to know their teachers believe in them, are on their side, see value in their work, and are confident they can achieve the specified learning goals.
3. Research Methodology
3.1. The Case Study
3.2. Data Collection and Analysis
3.3. First Round of Data Collection: Understanding Learning Challenges through LA and Student Interviews
3.4. Second Round of Data Collection: Analysing the Effects of Content-Focused Formative Feedback on Student Performance
4. Results
4.1. Phase 1: Learning Challenges
- Students want to receive feedback early and frequently.
“I think it’s vital to see [the progress] every week. So SLT was a good example that you go to SLT to check whether you have done it the right way. Furthermore, if you have not, then you say that you have not understood something and you have to go back. So I think weekly progress is a very good thing. Because if you see them only in the midterm, which is usually after a month, it’s too late to start trying to go back and do everything again”.
“For me personally it [the feedback] would be yes. Earliest possible, so I can try to face the other stuff within the eight weeks. Instead of struggling with the course not understanding what I’m missing.”
- Students need clarification on EM as an area of study and on the prior content knowledge required for the course.
“In Electromechanics it was basically motors, which are an application of EM. So I thought maybe something on those lines would be in EMII, but it was more related to communication. So I think I misunderstood that.”
“If I knew [students] beforehand…[I would tell them] different aspects they are lacking or are not up to date with, so you can point them out to them. Well, you should probably read into this a little more or try to understand this because we’ll get a lot of this in this course, and it will help a lot if you know more about it.”
“When we do EMI in quartile 1 [first two months of the academic year], we are fresh. That is the knowledge, and we are in the flow and then in Quartile 2 and Quartile 3 we do not. Then, in Quartile 4 they kind of expect us to completely remember Maxwell’s equations and build on that. From a logistics perspective, if it’s possible [to move EMII to Quartile 2], I think that might increase student satisfaction/performance in the course.”
“It’s not like the knowledge was insufficient. We were taught the basics we needed in EMI, I would say, because the core mathematical concepts are divergence and gradient. These three concepts were really taught in succession in a week or a week and a half was given for each topic. So it was fine but then, you know, by the time Quartile 4 came, we were not too fresh with that knowledge and it was not the best.”
“I do not think there is a good chance of passing this course if you do not understand the principles behind the first course. So that is very essential. Of course there is a lot of calculus as well”
- Students reported having many misconceptions.
“For example, circuits or signals, do you think that really matters in this course? Yeah, for circuits at some point. yeah. What you have been taught is a bit wrong because when you look at it from a physics standpoint it works differently. So that is the relation to circuits and signals. Of course there is a very last part of electromagnetics that is related to antennas. So in some way it is related to signals as well.”
“I would say that the normal problems regarding bounce diagrams. Even though the problem itself is simple it left an impact on me because I realized that these are cultural flaws that we have been learning right from the start, right from high school. In fact, they are not completely correct. Because they can only be applied once everything is in steady state. Furthermore, to reach a steady state, we need those bouncing and reflections of the wave. So, those questions kind of helped me. Helped me understand that in real life, how the real phenomena works that we were thinking that everything is instantaneous, but these fields are traveling at the speed of light and, yeah, they reflect a lot.”
“It depends on how you define it. I guess you could define it either direction but it depends on how you define the time. Furthermore, I think, in my mind I would say, like, if the Time and displacement are opposite signs and you’re traveling in the positive direction I think. That is in the end but like in the electric was like he wrote the time vector with the opposite sign and he said, Oh yeah., but then it means it turns around and it was like, Okay wait. Oh, you can do that, right? Okay. It just depends on how you define the notation, for that.”
- Student interactions contributed to learning and to keeping up with the course pace.
“For the exercises, I just met up with a couple of other students and we did most of them together, like in a small group. Furthermore, the student led tutorials really helped not fall that much behind”
“I would recommend not wasting too much time on being stuck. If you’re too stuck for an hour or two, start asking questions to anyone, using the chat channel or write an email to the professor.”
- Revise the reader, online lectures and Discord Q&A.
- Provide hints or counterintuitive questions that help in interpreting the problem, for example by:
- (a) Breaking a problem into sub-problems;
- (b) Identifying what variables are provided.
“Start by seeing what variables you have. What kind of things can you do with them? Then, you can often see where I have four values from one machine and one value from the other. So I probably have to calculate everything from the first machine and then transfer it over to the second machine.”
“Understanding how to apply the concepts we have learned to that particular problem because it was quite extraordinary, which was which ones we should use and how to get the data. Apply the data they have provided to make, essentially, a circuit.”
- Understanding the exercise questions was a challenge for students.
“One of the questions felt [that] it missed some explanation. So it takes a bit more to really read and understand what [the teachers] want from you, and it takes time, and then you cannot really solve it in the exam. That is something [that occurred]. From many of the students, they really missed the drawing as the explanation was in writing and they really missed the drawing or schematic of how the problem looks like. So it was hard to do calculations from it, if you do not really have an understanding of what you need to calculate”
“If I’m stuck in an exercise, it will probably be things like trying to use these equations or just pointing out some intermediate steps. Explain which route you should take.”
“The easiest thing for me is to either point to some easier problem that has the same meaning or basis. Or say (point out that) you solved it already in the other question. So I know where to start to approach it.” (This was mentioned by two students).
- An understanding of how things work.
“I also enjoy physics and it’s a good mix between physics and electricity and understanding how the electrons and charges work from the physical standpoint. Not just from applications”
“We also had knowledge clips and additional material which we could explore in our free time or get even more explanations about some topics. That is the best way to attract students to a particular subject that you give more than it’s necessary.”
- Include the historical and social context of problems.
4.2. Phase 2: Developing the System
4.3. Phase 3: Analysis of the Intervention
- Offered ideas to study EMII;
- Motivated them to learn;
- Helped them recover prior knowledge needed to learn EMII;
- Provided specific learning strategies;
- Clarified the learning content needed for the course.
“It really motivated me whenever I would receive the email, it’s a small thing, but it’s really nice. Because of this I only missed 4 ticks [in SLTs], and the reason was that they were too hard, but I personally liked this project! Keep it up EM team!”
- Comparing the content of the email with their own learning practices;
- Working harder on SLTs;
- Returning to lectures and videos.
“I did most of the exercises. Preparing them does not mean that you know it or understand it. The feedback says “you are doing great” but does not address doubts.”
5. Discussions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A. Feedback E-mail Formats
- Details about EMII as an area of knowledge
- Feedback based on your survey
- Evidence-based study strategies
- Details about EMII as an area of knowledge:
- Object of study:
- The relation between EMII and Electrical Engineering:
- What is new in this course:
- Relevance of EMII for engineers:
- Prior knowledge you will need:
Full Feedback E-mail Example
“I think the discussions in the SLT were the reason I kept up with the course, and extended my understanding of the material of the week by discussing it with others. Because if we didn’t have the SLTs, it would not have been discussed that much.”
| | 2021–2022: with Feedback | 2021–2022: without Feedback | 2022–2023: with Feedback | 2022–2023: without Feedback |
|---|---|---|---|---|
| Sample | 94 | 74 | 112 | 86 |
| Dropped out | 10 (11%) | 14 (19%) | 13 (12%) | 15 (17%) |
| Took pretest | 44 (47%) | 10 (14%) | 56 (50%) | 9 (10%) |
| Reached 60% SLT participation | 87 (93%) | 61 (82%) | 101 (90%) | 76 (88%) |
| Passing students | 54 (57%) | 27 (36%) | 73 (65%) | 43 (50%) |
| Average grade | 5.45 | 4.67 | 6.05 | 5.73 |
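For readers who wish to verify the percentages reported in the table above, the minimal Python sketch below recomputes them from the raw counts. It is illustrative only: the cohort labels, variable names and dictionary layout are assumptions made for this sketch and are not taken from the study's analysis code.

```python
# Illustrative recomputation of the cohort percentages in the table above.
# Counts are copied from the table; the names and structure are assumptions.

cohorts = {
    "2021-2022 with feedback":    {"sample": 94,  "dropped": 10, "pretest": 44, "slt60": 87,  "passed": 54},
    "2021-2022 without feedback": {"sample": 74,  "dropped": 14, "pretest": 10, "slt60": 61,  "passed": 27},
    "2022-2023 with feedback":    {"sample": 112, "dropped": 13, "pretest": 56, "slt60": 101, "passed": 73},
    "2022-2023 without feedback": {"sample": 86,  "dropped": 15, "pretest": 9,  "slt60": 76,  "passed": 43},
}

for name, c in cohorts.items():
    n = c["sample"]
    print(f"{name}: "
          f"dropped out {c['dropped'] / n:.0%}, "
          f"took pretest {c['pretest'] / n:.0%}, "
          f"reached 60% SLT participation {c['slt60'] / n:.0%}, "
          f"passed {c['passed'] / n:.0%}")
```

Running the sketch reproduces the rounded percentages shown in the table (e.g., 54/94 ≈ 57% passing students in the 2021–2022 feedback group).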