A Systematic Review of the Role of Learning Analytics in Supporting Personalized Learning
Abstract
1. Introduction
2. Methodology
- RQ1: How can LA be leveraged to support personalized learning?
- RQ2: What are the challenges in leveraging LA in personalized learning?
3. Findings and Discussion
3.1. How LA Supports Personalized Learning (RQ1)
3.1.1. Extracted Analytics
- Individual Level
- Classroom/Group Level
- Structural Level
3.1.2. Embedded Analytics
3.2. The Challenges of LA in Personalized Learning (RQ2)
- Accuracy
- Privacy
- Fairness
- Opportunity Cost
4. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Chatti, M.A.; Jarke, M.; Specht, M. The 3P Learning Model. Educ. Technol. Soc. 2010, 13, 74–85. [Google Scholar]
- Makhambetova, A.; Zhiyenbayeva, N.; Ergesheva, E. Personalized Learning Strategy as a Tool to Improve Academic Performance and Motivation of Students. Int. J. Web-Based Learn. Teach. Technol. 2021, 16, 1–17. [Google Scholar] [CrossRef]
- Fung, C.Y.; Abdullah, M.N.L.Y.; Hashim, S. Improving Self-regulated Learning through personalized weekly e-Learning Journals: A time series quasi-experimental study. E-J. Bus. Educ. Scholarsh. Teach. 2019, 13, 30–45. [Google Scholar]
- Ali, A. Exploring the Transformative Potential of Technology in Overcoming Educational Disparities. Int. J. Multidiscip. Sci. Arts 2023, 2, 107–117. [Google Scholar]
- Li, K.C.; Wong, B.T.-M. Features and trends of personalised learning: A review of journal publications from 2001 to 2018. Interact. Learn. Environ. 2021, 29, 182–195. [Google Scholar] [CrossRef]
- Gross, B.; Tuchman, S.; Patrick, S. A National Landscape Scan of Personalized Learning in K-12 Education in the United States. iNACOL, 2018. Available online: https://eric.ed.gov/?id=ED589851 (accessed on 1 May 2023).
- Yuyun, I.; Suherdi, D. Components and Strategies for Personalized Learning in Higher Education: A Systematic Review. In Proceedings of the 20th AsiaTEFL-68th TEFLIN-5th iNELTAL Conference (ASIATEFL 2022), Malang, Java, Indonesia, 5–7 August 2022; Atlantis Press: Dordrecht, The Netherlands, 2023; pp. 271–290. [Google Scholar] [CrossRef]
- Deci, E.L.; Ryan, R.M. The “What” and “Why” of Goal Pursuits: Human Needs and the Self-Determination of Behavior. Psychol. Inq. 2000, 11, 227–268. [Google Scholar] [CrossRef]
- Yeager, S.; Dweck, C. Mindsets that promote resilience: When students believe that personal characteristics can be developed. Educ. Psychol. 2012, 47, 302–314. [Google Scholar] [CrossRef]
- Bulger, M. Personalized Learning: The Conversations We’re Not Having. Data Soc. 2016, 22, 1–29. [Google Scholar]
- Roschelle, J. Learning by Collaborating: Convergent Conceptual Change. J. Learn. Sci. 1992, 2, 235–276. [Google Scholar] [CrossRef]
- Hardison, H. How Teachers Spend Their Time: A Breakdown. Available online: https://www.edweek.org/teaching-learning/how-teachers-spend-their-time-a-breakdown/2022/04 (accessed on 19 April 2023).
- Brown, M. Student to Teacher Ratio in High Schools. Available online: https://www.learner.com/blog/student-to-teacher-ratio-in-high-schools#:~:text=billion%20(Technavio).-,What%20Is%20the%20Average%20Student%2DTeacher%20Ratio%20in%20the%20United,education%20codes%2C%20and%20grade%20level (accessed on 2 November 2022).
- Conole, G.; Gašević, D.; Long, P.; Siemens, G. Message from the LAK 2011 General & Program Chairs. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge, Banff, AB, Canada, 27 February–1 March 2011; Association for Computing Machinery: New York, NY, USA, 2011. [Google Scholar] [CrossRef]
- Mangaroska, K.; Vesin, B.; Giannakos, M. Cross-platform analytics: A step towards personalization and adaptation in education. In Proceedings of the 9th International Conference on Learning Analytics & Knowledge, Tempe, AZ, USA, 4–8 March 2019; pp. 71–75. [Google Scholar]
- Khor, E.T.; Looi, C.K. A Learning Analytics Approach to Model and Predict Learners’ Success in Digital Learning. In Proceedings of the 36th International Conference on Innovation, Practice and Research in the Use of Educational Technologies in Tertiary Education, Singapore, 2–5 December 2019; pp. 476–480. [Google Scholar] [CrossRef]
- Anderson, J.R.; Corbett, A.T.; Koedinger, K.R.; Pelletier, R. Cognitive Tutors: Lessons Learned. J. Learn. Sci. 1995, 4, 167–207. [Google Scholar] [CrossRef]
- Sleeman, D.; Brown, J.S. Intelligent Tutoring Systems; Academic Press: Cambridge, MA, USA, 1982. [Google Scholar]
- Siemens, G.; Baker, R.S.J.D. Learning analytics and educational data mining. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, Vancouver, BC, Canada, 29 April–2 May 2012. [Google Scholar]
- Mangaroska, K.; Giannakos, M.N. Learning Analytics for Learning Design: A Systematic Literature Review of Analytics-Driven Design to Enhance Learning. IEEE Trans. Learn. Technol. 2018, 12, 516–534. [Google Scholar] [CrossRef]
- Carroll, C. CASP Selection Criteria and Completion. Social Care Institute for Excellence. Available online: https://www.scie.org.uk/publications/briefings/methodology/files/methodology-casp.pdf?res=true (accessed on 7 October 2004).
- Leitner, P.; Khalil, M.; Ebner, M. Learning analytics in higher education—A literature review. In Learning Analytics: Fundaments, Applications, and Trends: A View of the Current State of the Art to Enhance E-Learning; Springer International Publishing: Cham, Switzerland, 2017; pp. 1–23. [Google Scholar]
- Sousa, E.; Mello, R. Learning analytics to support teachers in the challenge of overcoming the learning gaps in k-12 students. In Proceedings of the Doctoral Consortium of Seventeenth European Conference on Technology Enhanced Learning, Toulouse, France, 12–16 September 2022; pp. 1–7. [Google Scholar]
- Ruiperez-Valiente, J.A.; Gomez, M.J.; Martinez, P.A.; Kim, Y.J. Ideating and Developing a Visualization Dashboard to Support Teachers Using Educational Games in the Classroom. IEEE Access 2021, 9, 83467–83481. [Google Scholar] [CrossRef]
- Vahdat, M.; Oneto, L.; Ghio, A.; Anguita, D. Advances in Learning Analytics and Educational Data Mining. In Proceedings of the European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN), Bruges, Belgium, 22–23 April 2015; pp. 297–306. [Google Scholar]
- Kleinman, E.; Shergadwala, M.; Teng, Z.; Villareale, J.; Bryant, A.; Zhu, J.; El-Nasr, M.S. Analyzing Students’ Problem-Solving Sequences. J. Learn. Anal. 2022, 9, 138–160. [Google Scholar] [CrossRef]
- Kurilovas, E. Advanced machine learning approaches to personalise learning: Learning analytics and decision making. Behav. Inf. Technol. 2018, 38, 410–421. [Google Scholar] [CrossRef]
- Adebodun, M. The Predictive Value of Academic Analytics and Learning Analytics for Students’ Academic Success in Higher Education. Doctoral dissertation, Texas Southern University, Houston, TX, USA, 2020. [Google Scholar]
- Alshehri, M.; Alamri, A.; Cristea, A.I. Predicting Certification in MOOCs Based on Students’ Weekly Activities. In Proceedings of the Intelligent Tutoring Systems: 17th International Conference, ITS 2021, Virtual Event, 7–11 June 2021; Proceedings 17. Springer International Publishing: Berlin/Heidelberg, Germany, 2021; pp. 173–185. [Google Scholar]
- Alwadei, A. Adaptive Learning Analytics: Understanding Student Learning Behavior and Predicting Academic Success. Doctoral Dissertation, University of Illinois, Chicago, IL, USA, 2019. [Google Scholar]
- Heo, J.; Chung, K.-M.; Yoon, S.; Yun, S.B.; Ma, J.W.; Ju, S. Spatial-Data-Driven Student Characterization in Higher Education. In Proceedings of the 1st ACM SIGSPATIAL Workshop on Prediction of Human Mobility, Redondo Beach, CA, USA, 7–10 November 2017; pp. 1–4. [Google Scholar]
- Pardo, A.; Mirriahi, N.; Martinez-Maldonado, R.; Jovanovic, J.; Dawson, S.; Gašević, D. Generating actionable predictive models of academic performance. In Proceedings of the sixth international conference on learning analytics & knowledge, Edinburgh, UK, 25–29 April 2016; pp. 474–478. [Google Scholar]
- Sweta, S.; Mahato, S.; Pathak, L.K. Prediction of Learner’s Performance in Adaptive E-Learning System using Learning Analytics. In IOP Conference Series: Materials Science and Engineering; IOP Publishing: Bristol, UK, 2021; Volume 1049, p. 012006. [Google Scholar] [CrossRef]
- Cen, L.; Ruta, D.; Ng, J. Big education: Opportunities for Big Data analytics. In Proceedings of the 2015 IEEE international conference on digital signal processing (DSP), Singapore, 21–24 July 2015; pp. 502–506. [Google Scholar]
- Antonova, A.; Bontchev, B. Designing Smart Services to Support Instructors to Create Personalized and Adaptable Video Games for Learning. Educ. Res. Inf. Soc. 2022, 3372, 9–16. [Google Scholar]
- Wen, Y.; Song, Y. Learning Analytics for Collaborative Language Learning in Classrooms: From the Holistic Perspective of Learning Analytics, Learning Design and Teacher Inquiry. Educ. Technol. Soc. 2021, 24, 1–15. [Google Scholar]
- Saqr, M.; Nouri, J.; Jormanainen, I. A Learning Analytics Study of the Effect of Group Size on Social Dynamics and Performance in Online Collaborative Learning. In Proceedings of the European Conference on Technology Enhanced Learning, Delft, The Netherlands, 16–19 September 2019; Springer International Publishing: Cham, Switzerland, 2019; pp. 466–479. [Google Scholar]
- Llurba, C.; Fretes, G.; Palau, R. Pilot study of real-time Emotional Recognition technology for Secondary school students. Interact. Des. Arch. 2022, 52, 61–80. [Google Scholar] [CrossRef]
- Troussas, C.; Krouska, A.; Virvou, M. Using a Multi Module Model for Learning Analytics to Predict Learners’ Cognitive States and Provide Tailored Learning Pathways and Assessment. In Machine Learning Paradigms; Virvou, M., Alepis, E., Tsihrintzis, G., Jain, L., Eds.; Intelligent Systems Reference Library; Springer: Cham, Switzerland, 2020; Volume 158. [Google Scholar] [CrossRef]
- Colasante, M.; Bevacqua, J.; Muir, S. Flexible hybrid format in university curricula to offer students in-subject choice of study mode: An educational design research project. J. Univ. Teach. Learn. Pract. 2020, 17, 119–136. [Google Scholar] [CrossRef]
- Chen, L.; Yoshimatsu, N.; Goda, Y.; Okubo, F.; Taniguchi, Y.; Oi, M.; Konomi, S.; Shimada, A.; Ogata, H.; Yamada, M. Direction of collaborative problem solving-based STEM learning by learning analytics approach. Res. Pract. Technol. Enhanc. Learn. 2019, 14, 24. [Google Scholar] [CrossRef]
- Coussement, K.; Phan, M.; De Caigny, A.; Benoit, D.F.; Raes, A. Predicting student dropout in subscription-based online learning environments: The beneficial impact of the logit leaf model. Decis. Support Syst. 2020, 135, 113325. [Google Scholar] [CrossRef]
- Moltudal, S.H.; Krumsvik, R.J.; Høydal, K.L. Adaptive Learning Technology in Primary Education: Implications for Professional Teacher Knowledge and Classroom Management. Front. Educ. 2022, 7, 830536. [Google Scholar] [CrossRef]
- Niemelä, P.; Silverajan, B.; Nurminen, M.; Hukkanen, J.; Järvinen, H.-M. LAOps: Learning Analytics with Privacy-aware MLOps. In Proceedings of the 14th International Conference on Computer Supported Education, Virtual Event, 22–24 April 2022. [Google Scholar]
- Meacham, S.; Pech, V.; Nauck, D. AdaptiveVLE: An Integrated Framework for Personalized Online Education Using MPS JetBrains Domain-Specific Modeling Environment. IEEE Access 2020, 8, 184621–184632. [Google Scholar] [CrossRef]
- Roberts, L.D.; Howell, J.A.; Seaman, K. Give Me a Customizable Dashboard: Personalized Learning Analytics Dashboards in Higher Education. Technol. Knowl. Learn. 2017, 22, 317–333. [Google Scholar] [CrossRef]
- Blumenstein, M. Synergies of Learning Analytics and Learning Design: A Systematic Review of Student Outcomes. J. Learn. Anal. 2020, 7, 13–32. [Google Scholar] [CrossRef]
- Kumar, K.; Vivekanandan, V. Advancing learning through smart learning analytics: A review of case studies. Asian Assoc. Open Univ. J. 2018, 13, 1–12. [Google Scholar] [CrossRef]
- Vesin, B.; Mangaroska, K.; Giannakos, M. Learning in smart environments: User-centered design and analytics of an adaptive learning system. Smart Learn. Environ. 2018, 5, 24. [Google Scholar] [CrossRef]
- Wilson, A.; Watson, C.; Thompson, T.L.; Drew, V.; Doyle, S. Learning analytics: Challenges and limitations. Teach. High. Educ. 2017, 22, 991–1007. [Google Scholar] [CrossRef]
- Banihashem, S.K.; Aliabadi, K.; Pourroostaei Ardakani, S.; Delaver, A.; Nili Ahmadabadi, M. Learning analytics: A systematic literature review. Interdiscip. J. Virtual Learn. Med. Sci. 2018, 9, 1–10. [Google Scholar]
- Reimers, G.; Neovesky, A. Student Focused Dashboards—An Analysis of Current Student Dashboards and What Students Really Want. In Proceedings of the 7th International Conference on Computer Supported Education, Lisbon, Portugal, 23–25 May 2015; pp. 399–404. [Google Scholar]
- Rubel, A.; Jones, K.M.L. Student privacy in learning analytics: An information ethics perspective. Inf. Soc. 2016, 32, 143–159. [Google Scholar] [CrossRef]
- Wintrup, J. Higher Education’s Panopticon? Learning Analytics, Ethics and Student Engagement. High. Educ. Policy 2017, 30, 87–103. [Google Scholar] [CrossRef]
- Uttamchandani, S.; Quick, J. An introduction to fairness, absence of bias, and equity in learning analytics. In The Handbook of Learning Analytics; Society for Learning Analytics Research: Alberta, Canada, 2022; pp. 205–212. [Google Scholar] [CrossRef]
- Riazy, S.; Simbeck, K.; Schreck, V. Fairness in Learning Analytics: Student At-risk Prediction in Virtual Learning Environments. In Proceedings of the 12th International Conference on Computer Supported Education, Prague, Czech Republic, 2–4 May 2020; pp. 15–25. [Google Scholar]
- Gardner, J.; Brooks, C.; Baker, R. Evaluating the Fairness of Predictive Student Models Through Slicing Analysis. In Proceedings of the 9th International Conference on Learning Analytics & Knowledge, Tempe, AZ, USA, 4–8 March 2019; pp. 225–234. [Google Scholar]
- Bayer, V.; Hlosta, M.; Fernandez, M. Learning analytics and fairness: Do existing algorithms serve everyone equally? In International Conference on Artificial Intelligence in Education; Springer International Publishing: Cham, Switzerland, 2021; pp. 71–75. [Google Scholar]
- Knight, S.; Shibani, A.; Shum, S.B. A reflective design case of practical micro-ethics in learning analytics. Br. J. Educ. Technol. 2023, 54, 1837–1857. [Google Scholar] [CrossRef]
- Alamri, H.A.; Watson, S.; Watson, W. Learning Technology Models that Support Personalization within Blended Learning Environments in Higher Education. TechTrends 2020, 65, 62–78. [Google Scholar] [CrossRef]
- Carlson, E. Rethinking Learning Design: Learning Analytics to Support Instructional Scaffolding in International Schools. Doctoral Dissertation, University of West Georgia, Carroll County, GA, USA, 2020. [Google Scholar]
Inclusion | Exclusion |
---|---|
Empirical studies | Non-empirical studies |
Studies that address educational practices | Studies that do not address educational practices |
Studies with full text | Studies without full text |
Studies that were published after 2015 | Studies that were published before 2015 |
Studies that were published in English | Studies that were not published in English |
Quality Test (QT) | Question |
---|---|
QT1 | Does the study have results related to the research questions? |
QT2 | Is there a clear statement of the research problem? |
QT3 | Does the study clearly determine the research methods? |
QT4 | Is there a clear statement of findings? |
Extracted Analytics | Articles | Count |
---|---|---|
Individual Level | [15,23,24,25,26,27,28,29,30,31,32,33,34] | 13 |
Classroom/Group Level | [23,24,35,36,37] | 5 |
Structural Level | [38,39,40,41,42] | 5 |
Article | Data | Methods |
---|---|---|
[15] | Examples, challenges, and coding exercises | A cross-platform architecture that incorporates and makes use of analytics was used to gather rich, real-world data from connected learning spaces, harmonize it, and present it visually as a tool for students and teachers. This could then increase teachers’ and students’ awareness of their own behavior and the settings in which learning takes place. |
[23] | Educational data generated from digital tools (Google Classroom and Khan Academy) | Learning Analytics Dashboard (LAD) was created using information gathered from online resources like Khan Academy and Google Classroom. LAD assists teachers in locating, tracking, and suggesting solutions to address the learning gaps of their students. The data produced by student usage of digital tools can be utilized to track, evaluate, forecast, intervene, suggest, and enhance the effectiveness of teaching and learning. |
[24] | Students’ interactions with the game (how many puzzles they solve, how long they spend in the game world, and the events that each student creates) | Through a visualization dashboard, teachers access real-time analytics about how their students are interacting with the Shadowspect game. The dashboard assists teachers in keeping tabs on the general operation of the classroom as well as the progress of individual students, identifying problem areas, and giving each student tailored feedback. The dashboard also assists teachers in changing their instructional approaches, which can result in more efficient and personalised learning opportunities for students. |
[25] | Programme for International Student Assessment (PISA) data and student population | LA and Educational Data Mining were applied to collect empirical data on the various variables that can impact learning and to apply computational methods to characterize, identify, and comprehend the preconditions of effective learning. This facilitates the customization and adaptation of technology enhanced learning systems to enhance instructional design and pedagogical decisions depending on the needs of students. |
[26] | Data were collected from a learning game called Parallel that was equipped to record students’ actions at a fine-grained action-to-action level as they completed a task or problem. | Human-in-the-loop method was used to sequence analysis to enhance personalized learning with LA. By visualizing the results of a clustering algorithm, this method enables stakeholders to better comprehend the data and the methodology. A deeper understanding of the algorithm’s interpretation of the sequences is also made possible by the displayed output, which can be utilized to modify the algorithm’s settings. The findings indicate that clustered sequences that more closely resemble an expert’s assessment of the data are produced when stakeholders are given the opportunity to examine the displayed sequences and iteratively modify the algorithm. In the end, this strategy can support the identification of underperforming students and the creation of suitable and efficient personalized learning settings. |
[27] | Data on students’ behaviors in learning environments | Decision-making models were used to assess the applicability, acceptance, and use of personalized learning units in virtual learning environments. The methodology uses probabilistic appropriateness indices, the educational technology acceptance and satisfaction model, and multiple criteria decision analysis to determine whether learning components are appropriate for a given student’s needs in accordance with their learning preferences. |
[28] | The websites of the individual universities were used to compile the archival (secondary) data used in the study. The information was gathered from a sample of universities’ websites, and the accuracy of the information was determined by the websites’ content | Social networks adapting pedagogical practice (SNAPP) and Graphical Interactive Student Monitoring (GISMO) were used to gather information about students and their environments, providing both students and teachers the feedback. Students’ motivation may rise due to the feedback’s ability to control their work and make them aware of their progress. |
[29] | FutureLearn MOOC platform data (unique IDs and time stamps with students, weekly-based step visits, completions, comments added, and attempted questions) | Student behaviors were utilized to forecast whether they would decide to enroll in a MOOC and receive course certification. This forecast can aid in planning future runs and determining their profitability. The analysis may also reveal what personalization options could be offered for students. |
[30] | Usage data, participation data, response time-related measures, navigation logs | An adaptive platform was used to gather information on the contexts of the students. By contrasting the learning outcomes of students who received education using adaptive media with those who received traditional instruction, the efficacy of the adaptive intervention was determined. Students using the adaptive platform performed better academically than those receiving traditional education. |
[31] | Student-related datasets, such as traditional questionnaire surveys, and student activity log data from LMS are examined as well as unstructured datasets like SNS activities, text data, and other transactional data | Data mining and machine learning algorithms were utilized to identify key features of students based on student data. A brand-new paradigm for research on student characterization that is driven by spatial data was provided to characterize students and forecast their outcomes. |
[32] | Students’ interactions with online activities | A predictive model was developed using a recursive partitioning technique that can recognize student sub-populations based on their anticipated exam results. This model employs indications that are directly drawn from the learning design to categorize students and give each subset of students tailored feedback. The methodology can assist teachers in creating personalised feedback for various student sub-populations and pedagogical interventions that are informed by data. |
[33] | Student activities and the learning process throughout the course | Support vector machine (SVM), a machine learning method, was utilized to determine where students are falling behind and how the suggested strategy enables them to dynamically improve their performance. During adaptive e-learning, the SVM technique combines prior demographic data with dynamic inputs to deliver feedback, special assistance, and tailored recommendations that help students avoid failure and improve performance. The performance of students is enhanced by the provision of adaptive feedback, personalization, and customization of the responses in accordance with their preferences. |
[34] | Educational data (students’ profiles, historical performance, and demographics data) combined with external data gathered from social networks | The behavioral and attitudinal characteristics of students were examined to reveal significant factors affecting various aspects of each student’s learning process. The presentation of content, the order and difficulty level of knowledge items, and the format, style, and pace of learning can all be modified to enhance knowledge understanding and retention for specific students at distinct learning stages. When a higher risk of failure is anticipated, early warnings and additional tutoring can be given. |
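The recursive-partitioning approach in [32], which splits students into sub-populations and attaches tailored feedback to each subset, can be sketched in a few lines. The records, feature, threshold, and feedback messages below are hypothetical illustrations, not the study's actual model:

```python
# Hypothetical LMS-style records: weekly logins and a quiz score per student.
students = [
    {"id": "s1", "logins": 2, "quiz": 40},
    {"id": "s2", "logins": 9, "quiz": 85},
    {"id": "s3", "logins": 3, "quiz": 55},
    {"id": "s4", "logins": 8, "quiz": 78},
]

def split(records, feature, threshold):
    """One partitioning step: divide students into two sub-populations
    on a single engagement feature drawn from the learning design."""
    low = [r for r in records if r[feature] < threshold]
    high = [r for r in records if r[feature] >= threshold]
    return low, high

def feedback_for(subgroup, at_risk):
    """Attach one subgroup-level message per student, mirroring how a
    predicted sub-population receives tailored feedback."""
    msg = ("Consider revisiting this week's materials."
           if at_risk else "Keep up the steady engagement.")
    return [{"id": r["id"], "feedback": msg} for r in subgroup]

at_risk, on_track = split(students, "logins", 5)
messages = feedback_for(at_risk, True) + feedback_for(on_track, False)
```

A real recursive partitioner would choose features and thresholds from the data (e.g., by minimizing misclassification) and recurse on each subset; the single split above only shows the shape of the idea.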
Article | Data | Methods |
---|---|---|
[23] | Educational data generated from digital tools (Google Classroom and Khan Academy) | Learning Analytics Dashboard (LAD) was created using information gathered from online resources like Khan Academy and Google Classroom. LAD assists teachers in locating, tracking, and suggesting solutions to address the learning gaps of their students. The data produced by student usage of digital tools can be utilized to track, evaluate, forecast, intervene, suggest, and enhance the effectiveness of teaching and learning. |
[24] | Students’ interactions with the game (how many puzzles they solve, how long they spend in the game world, and the events that each student creates) | Through a visualization dashboard, teachers access real-time analytics about how their students are interacting with the Shadowspect game. The dashboard assists teachers in keeping tabs on the general operation of the classroom as well as the progress of individual students, identifying problem areas, and giving each student tailored feedback. The dashboard also assists teachers in changing their instructional approaches, which can result in more efficient and personalised learning opportunities for students. |
[35] | Smart devices data | Smart services were used to provide user-centered, context-aware, automated, and data-enabled digital solutions. Customizable and flexible instructional labyrinth video games are created through the APOGEE platform. Teachers can more easily see the benefits of various learning methodologies, learning personalization, and dynamic adaptation. Smart services can help teachers keep track of students’ progress toward learning objectives, individual and group development, and necessary corrections. Smart services can set up learning goals that better match students’ changing interests and motivations, or they might indicate learning paths to overcome risks and problems. |
[36] | Lesson plans, field notes from each post-lesson discussion, teacher interviews, screenshots of LA findings, and screenshots of student-generated artifacts | LA was used in collaborative language learning classrooms. The authors contend that LA can aid personalised instruction by giving language teachers helpful data regarding particular curricula or learning settings. When the learning design (LD) for language learning is based on social constructivist theories, LA can successfully inform pedagogical refinement. On the premise that the teacher has innovation-oriented beliefs and is enthusiastic about working with researchers and professional development, LA can also enhance teacher inquiry and LD. |
[37] | Students’ interaction in online problem-based learning | Social network analysis was leveraged to investigate how social dynamics and performance in online collaborative learning are affected by group size. This offers insightful data on how students learn and interact with one another to increase the efficiency of collaborative learning settings. |
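The social network analysis in [37] treats student interactions as a graph and relates per-member interaction to group size. A minimal degree-centrality sketch, using invented reply data and plain Python rather than the study's actual tooling:

```python
from collections import Counter

# Invented forum-reply pairs (who replied to whom) in a four-student group.
interactions = [("a", "b"), ("b", "a"), ("c", "a"), ("a", "c"), ("d", "a")]

out_degree = Counter(src for src, _ in interactions)  # replies sent
in_degree = Counter(dst for _, dst in interactions)   # replies received

def degree_centrality(student, n_students):
    """Interactions normalized by the number of possible partners; as group
    size grows, the same number of interactions yields lower centrality."""
    return (out_degree[student] + in_degree[student]) / (n_students - 1)

scores = {s: degree_centrality(s, 4) for s in "abcd"}
most_central = max(scores, key=scores.get)
```

Comparing such scores across groups of different sizes is one simple way to quantify how group size dilutes or concentrates social dynamics in collaborative settings.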
Article | Data | Methods |
---|---|---|
[38] | Data obtained from a camera | A camera for emotion recognition was utilized to track students’ emotional states. The teacher’s decision-making in the classroom, as well as maximizing attention to students, modifying their methods, or concentrating on a particular student, can all be improved with the use of this information. |
[39] | Student performance, demographic, student behavior, and engagement data | A multi-module model that includes the identification of target content, curriculum enhancement, cognitive state and behavior prediction, and personalization was used. This model aids in better understanding learning and the settings in which it takes place. To deliver personalized learning routes and evaluation resources, the approach makes use of data about students and their environments. A hierarchical clustering was used to separate students’ data into distinct clusters to enhance students’ learning experiences, and students can be supported with appropriate instructional designs. |
[40] | StudyFlex trial data (subjects designed and taught in a flexible hybrid format at a university) | Mechanisms for capturing analytics on student involvement and learning behaviors were built into the subject designs, reflecting the finding that attendance data alone cannot gauge involvement in flexible hybrid formats and that more complex metrics are needed. Longitudinal information was used to help institutions prepare for future hybrid learning projects, for example by planning ahead for booking learning spaces, staffing, and timetabling. |
[41] | Learning tests, survey responses, and learning logs | Personalized learning with LA was used in collaborative problem-solving to support key elements of STEM education, such as learning strategy and learning behaviors. |
[42] | Student learning activities interactions, demographics, cognitive, academic, and behavioral engagement data | A logit leaf model (LLM) algorithm was employed to improve student dropout predictions by identifying elements that can encourage students to continue their education. The authors employed five different types of indicators to accurately predict student dropout in a subscription-based online learning environment, including demographics, classroom features, and cognitive, intellectual, and behavioral forms of participation. The LLM algorithm performs better than any alternative method in striking a balance between comprehensibility and predictive performance. Additionally, in contrast to a conventional LLM visualization, the authors provided a new multilayer informative representation of the LLM that provides fresh benefits. By analyzing LLM segments, numerous insights for distinct student segments with different learning styles become apparent. These insights can be leveraged to tailor student retention strategies. |
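The hierarchical clustering step described for [39], which separates students' data into distinct clusters so each cluster can receive an appropriate instructional design, can be sketched with single-linkage agglomeration over one engagement score. The scores and target cluster count below are illustrative assumptions:

```python
# Hypothetical normalized engagement scores per student.
scores = {"s1": 0.10, "s2": 0.15, "s3": 0.80, "s4": 0.90}

def agglomerate(points, n_clusters):
    """Repeatedly merge the two closest clusters until n_clusters remain."""
    clusters = [[k] for k in points]

    def gap(c1, c2):  # single linkage: closest pair across the two clusters
        return min(abs(points[a] - points[b]) for a in c1 for b in c2)

    while len(clusters) > n_clusters:
        i, j = min(
            ((i, j) for i in range(len(clusters))
             for j in range(i + 1, len(clusters))),
            key=lambda ij: gap(clusters[ij[0]], clusters[ij[1]]),
        )
        clusters[i].extend(clusters.pop(j))  # j > i, so pop is safe
    return [sorted(c) for c in clusters]

groups = agglomerate(scores, 2)
```

In practice such clustering runs over multi-feature vectors (performance, behavior, engagement) rather than a single score, but the merge loop is the same idea.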
Article | Data | Methods |
---|---|---|
[39] | Student performance, demographic, student behavior, and engagement data | A multi-module model that includes the identification of target content, curriculum enhancement, cognitive state and behavior prediction, and personalization was used. This model aids in better understanding learning and the settings in which it takes place. To deliver personalized learning routes and evaluation resources, the approach makes use of data about students and their environments. A hierarchical clustering was used to separate students’ data into distinct clusters to enhance students’ learning experiences, and students can be supported with appropriate instructional designs. |
[43] | Data about learning progress, performance, and behavior, as well as feedback from teachers | Adaptive learning technology (ALT) was integrated into classroom management and teachers’ professionalism in a real-world primary education context. ALT is an inherent opportunity to enhance teacher-facilitated learning and to individually tailor the curriculum and learning experiences for each student. |
[44] | Data from the LMS and Gitlab | LAOps, a cutting-edge course-scope analytics technology, was used to enhance personalised learning with learning analytics. Data from the LMS and Gitlab, which serves as a gateway for student submissions, are used in this program. After the data have been securely protected and moved to the cloud, models are trained, and analysis are carried out. Both teachers and students receive the results, which are then used to tailor and differentiate exercises based on students’ ability levels. |
[45] | Easy and medium quiz questions, weekly contents, and student profiles | An adaptive virtual learning environment (AdaptiveVLE) framework, leveraging the MPS JetBrains Domain-Specific Modeling Environment, was used by teachers to design adaptive VLEs that are customized for their needs and help develop a more general foundation for adaptive systems. The framework is made up of the following stages: data collection configuration by the teachers, application of the adaptive VLE, data processing, and learning path adaptation. The framework enables the teachers to configure the data collected and the way the data are processed without having any prior understanding of software development and/or the implementation specifics of data science techniques. This enables quick experimentation with various approaches. |
[46] | Data from focus groups, content analysis of dashboard drawings made by students | Student dashboards were used to support personalized learning where students have access to information on extra resources, scholarships, personalized links to assignments they have already completed, and opportunities to catch up. |
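The final step described for LAOps [44], tailoring and differentiating exercises by ability level once model outputs are available, reduces to a mapping from predicted mastery to an exercise tier. A hypothetical sketch; the thresholds, tier names, and mastery values are assumptions, not the tool's actual rules:

```python
def exercise_tier(predicted_mastery):
    """Map a model's predicted mastery (0..1) to an exercise variant."""
    if predicted_mastery < 0.4:
        return "scaffolded"   # extra worked examples and hints
    if predicted_mastery < 0.75:
        return "standard"
    return "challenge"        # open-ended extension tasks

# Invented per-student mastery predictions from an upstream model.
predicted = {"s1": 0.30, "s2": 0.60, "s3": 0.90}
assignments = {s: exercise_tier(p) for s, p in predicted.items()}
```

The interesting engineering in such a pipeline lies upstream (securely moving LMS and Gitlab data to the cloud and training the models); the differentiation step itself stays deliberately simple so teachers can inspect and adjust it.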
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Khor, E.T.; K, M. A Systematic Review of the Role of Learning Analytics in Supporting Personalized Learning. Educ. Sci. 2024, 14, 51. https://doi.org/10.3390/educsci14010051