Leveraging Learning Analytics to Improve the User Experience of Learning Management Systems in Higher Education Institutions
Abstract
1. Introduction
- Identify and evaluate learning analytics approaches utilised within Learning Management Systems to enhance the user experience of students.
- Examine the challenges hindering the successful integration of learning analytics for user experience improvement in Learning Management Systems environments.
2. Materials and Methods
2.1. Literature Retrieval, Screening, and Eligibility Criteria
- Population (P): Higher education learners (undergraduate, postgraduate programmes) utilising LMS. Inclusion: Studies explicitly identifying undergraduate and postgraduate learners within an LMS context. Exclusion: Studies focusing on pre-university education or general populations without LMS-specific data.
- Intervention (I): Enhance UX via LA in LMS. Inclusion: Studies applying LA to improve learner experience within LMS. Exclusion: Purely technical LA studies without LMS UX focus.
- Comparison (C): LA intervention comparison within LMS. Preference: Studies comparing LA effectiveness within LMS. Screening: Prioritised comparative studies within LMS; non-comparative studies not prioritised.
- Outcome (O): UX and learner success in LMS. Inclusion: Studies measuring UX and learner outcomes (academic, engagement, satisfaction) within LMS. Exclusion: Studies without UX or learner outcome data within LMS [24].
- Language: Studies published in English to ensure accessibility and consistency in analysis.
- Publication Period: Studies published from January 2015 to February 2025, reflecting the rapid evolution of LA technologies and their application within LMS environments. This timeframe ensures the inclusion of recent advancements relevant to UX enhancement.
- Methodological Rigour and Conceptual Clarity: Research articles demonstrating a robust understanding of LA concepts and their application within LMS environments. Studies should employ rigorous research methodologies, clearly define the learner sample within the LMS context, and present findings that contribute to the understanding of how LA enhances UX and student success in this specific digital learning environment. The PRISMA flow diagram (Figure 1) illustrates the systematic search and study selection process.
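For illustration only, the language, publication-period, and LMS-context screens above can be expressed as a simple filter. The record fields (`year`, `language`, `uses_lms`) are hypothetical and not part of the review protocol, and a year-level check only approximates the January 2015–February 2025 window:

```python
def is_eligible(record: dict) -> bool:
    """Apply the language, publication-period, and LMS-context screens.

    Hypothetical record fields; the year-level comparison approximates
    the January 2015 - February 2025 publication window.
    """
    in_window = 2015 <= record["year"] <= 2025
    in_english = record["language"].strip().lower() == "english"
    lms_context = record.get("uses_lms", False)  # explicit LMS setting required
    return in_window and in_english and lms_context
```

In practice these screens were applied by the reviewers in Rayyan rather than programmatically; the sketch simply makes the conjunction of criteria explicit.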
2.1.1. Stage 1: Automated Deduplication and Machine Learning-Assisted Prioritisation
- Rayyan’s deduplication functionality was employed to identify and remove duplicate and non-English records [26]. Using algorithms that compare attributes such as title, authors, abstract, and publication details, 1547 records were excluded, reducing the dataset to 1560 records for subsequent evaluation.
- Subsequently, ASReview was used to prioritise the remaining 1560 records for full-text review [27]. Its machine learning algorithms predicted each record’s relevance by comparing titles and abstracts against the predetermined inclusion and exclusion criteria, allowing the reviewers to concentrate on the most promising records first. This automated prioritisation substantially accelerated the screening process [26,27].
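Rayyan’s matching algorithm itself is not exposed, but attribute-based deduplication of the kind described above can be sketched as follows; the normalised (title, year) key is an assumption for illustration, not Rayyan’s actual method:

```python
import re

def dedup_key(record: dict) -> tuple:
    """Normalise the title (lowercase, alphanumerics and spaces only) and
    pair it with the publication year, so trivial formatting differences
    do not hide duplicates."""
    title = re.sub(r"[^a-z0-9 ]", "", record["title"].lower()).strip()
    return (title, record.get("year"))

def deduplicate(records: list[dict]) -> list[dict]:
    """Keep only the first occurrence of each (normalised title, year) pair."""
    seen, unique = set(), []
    for rec in records:
        key = dedup_key(rec)
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique
```

A production tool would additionally compare authors and abstracts with fuzzy matching; the exact-key sketch captures only the basic idea.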
2.1.2. Stage 2: Independent Full-Text Review and Selection
- Following the ASReview prioritisation, the two independent reviewers conducted a full-text review using Rayyan. Each reviewer independently applied the predetermined eligibility criteria to the prioritised records.
- Rigorous application of the eligibility criteria excluded 1490 records for lack of peer review or non-compliance with the research objectives, leaving 70 records directly relevant to the application of LA within LMS environments to enhance UX. These 70 records were retained for the methodological rigour and validity assessment, which constituted the final step in the selection process.
2.2. Methodological Rigour and Validity Assessment
- Conceptual Clarity and Theoretical Foundation: The reviewers carefully considered how LA fundamentals were articulated, understood, and applied in higher education. This required a thorough examination of the introduction and literature review sections to determine the authors’ familiarity with pertinent theoretical frameworks, current research, and the unique educational possibilities and challenges present in this field. This component addressed the construct validity of the examined literature.
- Methodological Soundness: Each article’s methodology sections were carefully examined. The reviewers assessed the suitability of data analysis methods, the validity and reliability of data collection tools (such as surveys and interviews), the researchers’ attempts to address potential sources of bias, and the appropriateness of the research design with respect to the stated research questions. Prioritising studies that demonstrated well-reasoned and methodologically sound procedures highlighted the research’s internal validity.
- Sampling Adequacy and Generalisability: As the CASP checklist emphasises, clearly defined sampling frames are important. The reviewers evaluated how well the authors defined the target group of students, the sampling procedure used, and the rationale behind the sample size and representativeness. To address the external validity and transferability of the results, studies with clearly defined and representative sampling frames were given greater credibility.
- Importance and Interpretive Complexity of Results: The results and discussion sections were analysed thoroughly to assess the coherence, clarity, and relevance of the reported findings to the study objectives and methodology. The reviewers evaluated how well the authors articulated the implications of their findings and how deeply they interpreted them. Articles offering fresh perspectives and practical recommendations on the use of LA in HEIs were particularly noteworthy. This element addressed the research’s ecological validity and practical usefulness.
2.3. Inter-Rater Reliability and Consistency
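Agreement between the two reviewers on include/exclude decisions is conventionally quantified with Cohen’s kappa [28,29]. A minimal sketch, assuming each reviewer’s decisions are recorded as parallel lists:

```python
from collections import Counter

def cohens_kappa(rater_a: list[str], rater_b: list[str]) -> float:
    """Chance-corrected agreement between two raters.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e the agreement expected by chance from each rater's label
    frequencies. Assumes p_e < 1 (the raters are not both constant).
    """
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in labels)
    return (p_o - p_e) / (1 - p_e)
```

By the common Landis and Koch convention, values above 0.8 indicate almost perfect agreement; lower values signal that the eligibility criteria need discussion and refinement between reviewers.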
2.4. Extraction and Synthesis of Data
- Bibliographic details, such as author(s), year of publication, and place of publication.
- Methodological design.
- LA tools and methods used.
- Significant findings about the benefits and difficulties of LA in HEIs.
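For illustration, the extraction fields above map onto a simple record structure; the class and field names below are hypothetical, not the review’s actual coding sheet:

```python
from dataclasses import dataclass, field

@dataclass
class ExtractionRecord:
    """One row of the data-extraction sheet (hypothetical field names)."""
    authors: str
    year: int
    place_of_publication: str   # journal or venue
    design: str                 # methodological design
    la_tools: list[str] = field(default_factory=list)  # LA tools and methods used
    findings: str = ""          # benefits and difficulties of LA in HEIs
```

For example, the first appendix entry would be recorded as `ExtractionRecord("Park & Jo", 2017, "Assess. Eval. High. Educ.", "Quantitative analysis of LMS log data")`, with tools and findings filled in during synthesis.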
3. Results
4. Discussion
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Definition
---|---
AI | Artificial Intelligence
CASP | Critical Appraisal Skills Programme
DA | Data Analytics
HEIs | Higher Education Institutions
IRR | Inter-Rater Reliability
LA | Learning Analytics
LMS | Learning Management Systems
ML | Machine Learning
OSF | Open Science Framework
PICO | Population, Intervention, Comparison, Outcome
PRISMA | Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews
UX | User Experience
Appendix A
Author(s) | Research Title | Research Design | Place of Publication | Learning Analytics Tools Used | Major Findings
---|---|---|---|---|---
[32] | Using log variables in a learning management system to evaluate learning activity using the lens of activity theory | Quantitative analysis of LMS log data | Assessment and Evaluation in Higher Education | Moodle log data analysis, statistical analysis | Low overall LMS usage; significant variation in activity patterns across courses and colleges. Contradictions within the activity system hinder effective LMS use. |
[33] | Towards actionable learning analytics using dispositions | Quantitative analysis, incorporating self-reported data | IEEE Transactions on Learning Technologies | Demographic, trace (LMS), and self-reported data analysis | Incorporation of dispositional data (e.g., procrastination, boredom) into LA models enhances understanding of student behaviour and enables more actionable interventions. |
[34] | Predicting time-management skills from learning analytics | Quantitative (linear and multilevel regression) | Journal of Computer-Assisted Learning | Canvas LMS trace data, questionnaire data | LMS trace data can predict self-reported time-management skills, but models are not readily transferable between courses. Further research is needed to improve portability. |
[35] | Evaluation of usability in Moodle learning management system through analytics graphs: University of Applied Sciences teacher’s perspective in Finland | Quantitative analysis of LMS log data | International Journal of Education and Development using Information and Communication Technology | Moodle log data, analytics graphs plugin | Analytics graphs in Moodle provide insights into student activity and enable identification of student profiles. This aids teachers and management in tracking and improving student performance. |
[36] | Analytics-informed design: Exploring visualisation of learning management systems recorded data for learning design | Educational design research, qualitative interviews, dashboard pilot evaluation | SAGE Open | Visualisation dashboard development, LMS data visualisation | Educational design research can effectively develop user-friendly LA visualisation dashboards to support data-informed learning design. Preliminary design principles were identified. |
[37] | Using analytics to predict students’ interactions with learning management systems in online courses | Quantitative, Multiple Linear Regression (MLR) and Decision Tree (DT) | Education and Information Technologies | LMS analytics and log data analysis | MLR and DT models can effectively predict learner–LMS interactions. Key predictors include submission, content access, and assessment access metrics. |
[38] | Learning analytics and data ethics in performance data management: A benchlearning exercise involving six European universities | Qualitative, benchlearning exercise | Quality in Higher Education | Analysis of institutional data management models, ethical review | Learning analytics are present in European universities but are primarily based on traditional data. Ethical risks are generally covered by regulations. Learning analytics offers opportunities for improved data and quality management. |
[39] | Pre-class learning analytics in flipped classroom: Focusing on resource management strategy, procrastination, and repetitive learning | Quantitative, log data, survey, and exam data analysis. | Journal of Computer-Assisted Learning | Learning analytics of pre-class video viewing data, statistical analysis | Resource management strategies (time, study environment) significantly influence pre-class video engagement and learning achievement in flipped classrooms. Procrastination significantly decreases video engagement. |
[40] | From data to action: Faculty experiences with a university-designed learning analytics system | Qualitative case study, surveys, and focus groups | International Journal on E-Learning | University-designed LA system, cloud-based data collection, dashboards, alert emails | Faculty will use LA to make data-driven changes to teaching, including feedback and communication. Implementation challenges include learning curves and LMS integration issues. Ongoing training and clear policies are needed. |
[41] | The mediating role of learning analytics: Insights into student approaches to learning and academic achievement in Latin America | Quantitative, analysis of LMS trace data | Journal of Learning Analytics | LMS trace data analysis, statistical analysis | Most LA indicators do not mediate the effect between learning approaches and performance, but fine-grained indicators can. Organised learning approaches are effective in Latin American higher education. |
[42] | Using learning analytics and student perceptions to explore student interactions in an online construction management course | Case study, learning analytics, surveys | Journal of Civil Engineering Education | Canvas LMS analytics, survey data | Student interactions with course materials decreased after the midterm. Students found lecture videos and slides most helpful. LA can inform course design. |
[43] | Student device usage, learning management system tool usage, and self-regulated learning | Non-intervention descriptive research design, LMS data logs, surveys | ProQuest Dissertations and Theses Global (University of Nevada, Las Vegas) | LMS data logs, Online Self-Regulated Learning Questionnaire (OSLQ) | Device usage varies; low-performing students report similar or higher SRL but use LMS tools less. SRL instruction and tool/device effectiveness are crucial. |
[44] | Leveraging complexity science to promote learning analytics adoption in higher education: An embedded case study | Embedded case study | ProQuest Dissertations and Theses Global (University of Maryland, College Park) | Analysis of learning analytics practices, application of CAS framework | Learning analytics implementation requires consideration of higher education institutions as complex adaptive systems. Emergent, ground-up approaches are more effective than top-down. |
[45] | Beyond learning management systems: Teaching digital fluency | Pedagogical reflections, qualitative analysis | Journal of Political Science Education | Analysis of pedagogical approaches, platforms beyond LMS | Teaching digital fluency requires platforms beyond LMS. Innovative assignments improve digital skills and content retention for Generation Z learners. |
[46] | Increasing student engagement with course content in graduate public health education: A pilot randomised trial of behavioral nudges | Pilot randomised controlled trial | Education and Information Technologies | LMS data analysis, behavioural nudges | Behavioural nudges based on LA did not significantly change student engagement. Future work should focus on qualitative assessment of motivations and richer analysis of learning behaviours. |
[47] | Learning management systems for higher education: A brief comparison | Comparative analysis | Discover Education | Evaluation criteria based on SQTL (Software Quality and Teaching–Learning tools) | Paradiso and Moodle are the top-rated LMS based on SQTL criteria, with high scores in interoperability, accessibility, and learning tools. |
[48] | Learners’ needs in online learning environments and third-generation learning management systems (LMS 3.0) | Qualitative, open-ended questionnaire, semi-structured interviews | Technology, Knowledge and Learning | Content analysis of questionnaire and interview data | Learners desire entertaining, self-monitoring LMS environments with gamification. Needs align with LMS 3.0, which can be developed using data mining and LA. |
[49] | Examining learning management system success: A multiperspective framework | Quantitative, survey, structural equation modelling (SEM) | Education and Information Technologies | TAM3 and ISS framework, SEM analysis | LMS success depends on content, system, and output quality, leading to student satisfaction and perceived usefulness. User satisfaction negatively impacts system and output quality. |
[50] | LearnSphere: A learning data and analytics cyberinfrastructure | Use-driven design, case studies | Journal of Educational Data Mining | LearnSphere, Tigris | LearnSphere facilitates discoveries about learning (active learning vs. passive, discussion board quality), supports research reproducibility, and enables workflow combinations for analytics. |
[51] | Pragmatic monitoring of learning recession using log data of learning management systems and machine learning techniques | Analysis of system log data, machine learning application | International Journal of Education and Development using Information and Communication Technology | Machine learning techniques (unspecified) applied to LMS log data. | System log data can be used for machine learning-based monitoring of students’ learning recession. Proposed indicators and visualisations for proactive intervention. |
[52] | Evidence-based multimodal learning analytics for feedback and reflection in collaborative learning | Two-year longitudinal study, survey, Evaluation Framework for Learning Analytics | British Journal of Educational Technology | Multimodal Learning Analytics (MMLA) system | MMLA solution enhances feedback and reflection in collaborative learning. Positive perceptions from teachers and students, but complexity and qualitative measures need improvement. Importance of data accuracy, transparency, and privacy. |
[53] | Students’ use of learning management systems and desired e-learning experiences: Are they ready for next generation digital learning environments? | Survey | Higher Education Research and Development | Analysis of LMS usage data | Students’ LMS usage for content and discussion correlates with desired engagement in student-centred e-learning. Students desire systems supporting content curation, group management, and mobile interoperability. |
[54] | Architecting analytics across multiple e-learning systems to enhance learning design | Cross-platform architecture development, regression and classification techniques | IEEE Transactions on Learning Technologies | Cross-platform architecture for integrating data from multiple e-learning systems. | Combining data across multiple e-learning systems improves classification accuracy. Cross-platform analytics provides broader insights into learner behaviour. |
[55] | Utilizing clickstream data to reveal the time management of self-regulated learning in a higher education online learning environment | Analysis of clickstream data, learning analytics | Interactive Learning Environments | StarC system log analysis | Clickstream data reveals time management aspects of self-regulated learning (SRL). Differences in time management among students with varying academic performance. |
[56] | Effectiveness of machine learning algorithms on predicting course level outcomes from learning management system data | Quantitative comparative study, machine learning algorithm comparison | ProQuest Dissertations and Theses Global (Doctoral Dissertation) | Naive Bayes, decision tree, neural network, support vector machine | Decision tree effectively predicts students with poor course outcomes using LMS data. Decision trees outperformed other algorithms. |
[57] | Detecting learning strategies with analytics: Links with self-reported measures and academic performance | Analysis of trace data, correlation with self-reported measures | Journal of Learning Analytics | Analysis of trace data from a flipped classroom environment. | Learning strategies extracted from trace data correlate with deep and surface approaches to learning. Deep approach to learning correlates with higher academic performance. |
[58] | Individual differences related to college students’ course performance in Calculus II | Dominance analysis, correlation analysis | Journal of Learning Analytics | Analysis of LMS data, discussion forum data, quiz attempts. | Math importance, approximate number system (ANS) ability, discussion forum posting, and workshop submission time are significant predictors of final grades in Calculus II. |
[59] | How flexible is your data? A comparative analysis of scoring methodologies across learning platforms in the context of group differentiation | Comparative analysis of scoring methodologies, resampling approach | Journal of Learning Analytics | Analysis of ASSISTments and Cognitive Tutor data. | Partial credit scoring offers more efficient group differentiation than binary accuracy measures in learning platforms. Partial credit increases analytic power. |
[60] | Designing Analytics for Collaboration Literacy and Student Empowerment | Survey | Journal of Learning Analytics | BLINC (collaborative analytics tool) | Student collaboration concerns fall into seven dimensions: Climate, Compatibility, Communication, Conflict, Context, Contribution, and Constructive. These dimensions should inform collaboration analytics design. |
[61] | A Novel Deep Learning Model for Student Performance Prediction Using Engagement Data | Deep learning model development and evaluation | Journal of Learning Analytics | ASIST (Attention-aware convolutional Stacked BiLSTM network) | ASIST, a deep learning model, predicts student performance using engagement data from VLEs. It outperforms baseline models. |
[62] | Utilizing Student Time Series Behaviour in Learning Management Systems for Early Prediction of Course Performance | Deep learning approach, comparison with machine learning classifiers | Journal of Learning Analytics | LSTM (Long Short-Term Memory) networks | LSTM networks effectively predict course performance using LMS time series data, outperforming traditional machine learning classifiers. |
[63] | The Positive Impact of Deliberate Writing Course Design on Student Learning Experience and Performance | Analysis of LMS data, correlation with course design decisions | Journal of Learning Analytics | Analysis of LMS usage data | Course design influences learner interaction patterns. Discussion entry length predicts final grades, highlighting the impact of writing practice. |
[64] | Privacy-driven Design of Learning Analytics Applications—Exploring the Design Space of Solutions for Data Sharing and Interoperability | Conceptual model development | Journal of Learning Analytics | Learning Analytics Design Space model | Privacy-driven design is crucial for learning analytics systems. The Learning Analytics Design Space model aids in designing privacy-conscious solutions. |
[65] | RiPPLE: A Crowdsourced Adaptive Platform for Recommendation of Learning Activities | Platform development and pilot study | Journal of Learning Analytics | RiPPLE (Recommendation in Personalised Peer-Learning Environments) | RiPPLE, a crowdsourced adaptive platform, recommends personalised learning activities and shows measurable learning gains. |
[66] | Leveraging learning analytics for student reflection and course evaluation | Faculty utilisation of learning analytics, curriculum evaluation | Journal of Applied Research in Higher Education | Learning analytics tools within LMS | Learning analytics enables student reflection, remediation, and curriculum evaluation, providing detailed data for stakeholders. |
[67] | Applying learning analytics for the early prediction of students’ academic performance in blended learning | Predictive modelling, principal component regression | Educational Technology and Society | Analysis of LMS data, video-viewing, practice behaviours, homework, quizzes | Learning analytics predicts student performance in blended learning. Online and traditional factors contribute to prediction accuracy. |
[68] | Learning management system and course influences on student actions and learning experiences | Comparative study of LMS and course influences | Educational Technology Research and Development | Analysis of LMS usage data | Course type and LMS design influence student actions and experiences. Discussion-focused systems increase perceived learning support. |
[69] | Toward Precision Education: Educational Data Mining and Learning Analytics for Identifying Students’ Learning Patterns with Ebook Systems | Clustering approach, analysis of ebook system data | Educational Technology and Society | Analysis of ebook system data | Clustering identifies subgroups of students with different learning patterns. Learning patterns correlate with learning outcomes. |
[70] | A Bayesian Classification Network-based Learning Status Management System in an Intelligent Classroom | System development and experiment | Educational Technology and Society | Bayesian classification network, sensor technology, image recognition | Learning status management system using sensors and image recognition. Bayesian network infers student learning status, with feedback to teachers and students. |
[71] | Student perceptions of privacy principles for learning analytics | Exploratory study, survey | Educational Technology Research and Development | Analysis of student perceptions | Students desire adaptive, personalised dashboards in learning analytics systems but are conservative about data sharing. Stakeholder involvement is crucial for successful implementation. |
[72] | Fostering evidence-based education with learning analytics: Capturing teaching-learning cases from log data | System development, case studies | Educational Technology and Society | Learning analytics framework, statistical modelling of learning logs | Automated capture of teaching–learning cases (TLCs) using learning analytics. Statistical modelling identifies intervention effectiveness. |
References
- Marks, A.; Al-Ali, M. Analytics within UAE higher education context. In Proceedings of the 2016 3rd MEC International Conference on Big Data and Smart City (ICBDSC), Muscat, Oman, 15–16 March 2016; pp. 1–6.
- Saleh, A.M.; Abuaddous, H.Y.; Alansari, I.S.; Enaizan, O. The Evaluation of User Experience on Learning Management Systems Using UEQ. Int. J. Emerg. Technol. Learn. 2022, 17, 145–162.
- Mohd Kasim, N.N.; Khalid, F. Choosing the Right Learning Management System (LMS) for the Higher Education Institution Context: A Systematic Review. Int. J. Emerg. Technol. Learn. 2016, 11, 55–61.
- de Kock, E.; van Biljon, J.; Botha, A. User Experience of Academic Staff in the Use of a Learning Management System Tool. In Proceedings of SAICSIT ’16, the Annual Conference of the South African Institute of Computer Scientists and Information Technologists, Johannesburg, South Africa, 26–28 September 2016; pp. 1–10.
- Maslov, I.; Nikou, S.; Hansen, P. Exploring user experience of learning management system. Int. J. Inf. Learn. Technol. 2021, 38, 344–363.
- Arqoub, M.A.; El-Khalili, N.; Hasan, M.A.-S.; Banna, A.A. Extending Learning Management System for Learning Analytics. In Proceedings of the 2022 International Conference on Business Analytics for Technology and Security (ICBATS), Dubai, United Arab Emirates, 16–17 February 2022; pp. 1–6.
- Hernández-Leo, D.; Martinez-Maldonado, R.; Pardo, A.; Muñoz-Cristóbal, J.A.; Rodríguez-Triana, M.J. Analytics for learning design: A layered framework and tools. Br. J. Educ. Technol. 2019, 50, 139–152.
- Society for Learning Analytics Research. What Is Learning Analytics? 2024. Available online: https://www.solaresearch.org/about/what-is-learning-analytics/ (accessed on 19 November 2024).
- Ismail, S.N.; Hamid, S.; Ahmad, M.; Alaboudi, A.; Jhanjhi, N. Exploring Students Engagement Towards the Learning Management System (LMS) Using Learning Analytics. Comput. Syst. Sci. Eng. 2021, 37, 73–87.
- Viberg, O.; Hatakka, M.; Bälter, O.; Mavroudi, A. The current landscape of learning analytics in higher education. Comput. Hum. Behav. 2018, 89, 98–110.
- Goode, C.; Terry, A.; Harlow, H.; Cash, R. Mining for Gold: Learning Analytics and Design for Learning: A Review. Scope Teach. Learn. 2021, 7474, 10.
- Tzimas, D.; Demetriadis, S.N. The Impact of Learning Analytics on Student Performance and Satisfaction in a Higher Education Course. In Proceedings of the Educational Data Mining, Paris, France, 29 June–2 July 2021; Available online: https://api.semanticscholar.org/CorpusID:247321827 (accessed on 18 November 2024).
- Mkpojiogu, E.O.C.; Okeke-Uzodike, O.E.; Emmanuel, E.I. Quality Attributes for an LMS Cognitive Model for User Experience Design and Evaluation of Learning Management Systems. In Proceedings of the 3rd International Conference on Integrated Intelligent Computing Communication & Security (ICIIC 2021), Bangalore, India, 6–7 August 2021; Atlantis Highlights in Computer Sciences. Atlantis Press: Dordrecht, The Netherlands, 2021; Volume 4.
- Almusharraf, A.I. An Investigation of University Students’ Perceptions of Learning Management Systems: Insights for Enhancing Usability and Engagement. Sustainability 2024, 16, 10037.
- Maluleke, A.F. Enhancing Learning Analytics through Learning Management Systems Engagement in African Higher Education. J. Educ. Learn. Technol. 2024, 5, 130–149.
- Ncube, M.M.; Ngulube, P. Optimising Data Analytics to Enhance Postgraduate Student Academic Achievement: A Systematic Review. Educ. Sci. 2024, 14, 1263.
- El Alfy, S.; Marx Gómez, J.; Dani, A. Exploring the benefits and challenges of learning analytics in higher education institutions: A systematic literature review. Inf. Discov. Deliv. 2019, 47, 25–34.
- Samuelsen, J.; Chen, W.; Wasson, B. Integrating multiple data sources for learning analytics—Review of literature. Res. Pract. Technol. Enhanc. Learn. 2019, 14, 11.
- Pan, Z.; Biegley, L.; Taylor, A.; Zheng, H. A Systematic Review of Learning Analytics: Incorporated Instructional Interventions on Learning Management Systems. J. Learn. Anal. 2024, 11, 52–72.
- Adeniran, I.A.; Efunniyi, C.P.; Osundare, O.S.; Abhulimen, A.O. Integrating data analytics in academic institutions: Enhancing research productivity and institutional efficiency. Int. J. Sch. Res. Multidiscip. Stud. 2024, 5, 77–87.
- Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. J. Clin. Epidemiol. 2021, 134, 178–189.
- Pieper, D.; Rombey, T. Where to prospectively register a systematic review. Syst. Rev. 2022, 11, 8.
- MacFarlane, A.; Russell-Rose, T.; Shokraneh, F. Search Strategy Formulation for Systematic Reviews: Issues, Challenges and Opportunities. Intell. Syst. Appl. 2022, 15, 200091.
- Methley, A.M.; Campbell, S.; Chew-Graham, C.; McNally, R.; Cheraghi-Sohi, S. PICO, PICOS and SPIDER: A Comparison Study of Specificity and Sensitivity in Three Search Tools for Qualitative Systematic Reviews. BMC Health Serv. Res. 2014, 14, 579.
- Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, n71.
- Rayyan. Faster Systematic Reviews. 2024. Available online: https://www.rayyan.ai/ (accessed on 7 December 2024).
- ASReview. Join the Movement Towards Fast, Open, and Transparent Systematic Reviews. 2024. Available online: https://asreview.nl/ (accessed on 9 December 2024).
- Critical Appraisal Skills Programme (CASP). CASP Checklists. 2024. Available online: https://casp-uk.net/casp-tools-checklists/ (accessed on 13 December 2024).
- Li, M.; Gao, Q.; Yu, T. Kappa Statistic Considerations in Evaluating Inter-Rater Reliability between Two Raters: Which, When and Context Matters. BMC Cancer 2023, 23, 799.
- Mandrekar, J.N. Measures of Interrater Agreement. Biostat. Clin. 2011, 6, 6–7.
- CADIMA. Evidence Synthesis Tool and Database. 2025. Available online: https://www.cadima.info/ (accessed on 17 January 2025).
- Park, Y.; Jo, I.-H. Using log variables in a learning management system to evaluate learning activity using the lens of activity theory. Assess. Eval. High. Educ. 2017, 42, 531–547.
- Tempelaar, D.T.; Rienties, B.; Nguyen, Q. Towards actionable learning analytics using dispositions. IEEE Trans. Learn. Technol. 2017, 10, 6–17.
- Sluijs, M.; Matzat, U. Predicting time-management skills from learning analytics. J. Comput. Assist. Learn. 2024, 40, 525–537.
- Olaleye, S.; Agjei, R.; Jimoh, B.; Adoma, P. Evaluation of usability in Moodle learning management system through analytics graphs: University of Applied Sciences teacher’s perspective in Finland. Int. J. Educ. Dev. Using Inf. Commun. Technol. 2023, 19, 85–107. Available online: http://files.eric.ed.gov/fulltext/EJ1413526.pdf (accessed on 11 October 2024).
- Liu, Q.; Gladman, T.; Muir, J.; Wang, C.; Grainger, R. Analytics-informed design: Exploring visualization of learning management systems recorded data for learning design. SAGE Open 2023, 13, 1–10. [Google Scholar] [CrossRef]
- Alshammari, A. Using analytics to predict students’ interactions with learning management systems in online courses. Educ. Inf. Technol. 2024, 29, 20587–20612. [Google Scholar] [CrossRef]
- Rosa, M.J.; Williams, J.; Claeys, J.; Kane, D.; Bruckmann, S.; Costa, D.; Rafael, J.A. Learning analytics and data ethics in performance data management: A bench learning exercise involving six European universities. Qual. High. Educ. 2022, 28, 65–81. [Google Scholar] [CrossRef]
- Doo, M.Y.; Park, Y. Pre-class learning analytics in flipped classroom: Focusing on resource management strategy, procrastination and repetitive learning. J. Comput. Assist. Learn. 2024; advance online publication. [Google Scholar] [CrossRef]
- Fuller, J.; Lokey-Vega, A. From data to action: Faculty experiences with a university-designed learning analytics system. Int. J. E-Learn. 2024, 23, 471–487. Available online: https://www.learntechlib.org/primary/p/225169/ (accessed on 11 November 2024). [CrossRef]
- Villalobos, E.; Hilliger, I.; Gonzalez, C.; Celis, S.; Pérez-Sanagustín, M.; Broisin, J. The mediating role of learning analytics: Insights into student approaches to learning and academic achievement in Latin America. J. Learn. Anal. 2024, 11, 6–20. Available online: http://files.eric.ed.gov/fulltext/EJ1423426.pdf (accessed on 11 November 2024). [CrossRef]
- West, P.; Paige, F.; Lee, W.; Watts, N.; Scales, G. Using learning analytics and student perceptions to explore student interactions in an online construction management course. J. Civ. Eng. Educ. 2022, 148, 05022001. [Google Scholar] [CrossRef]
- Webb, N.L. Student Device Usage, Learning Management System Tool Usage, and Self-Regulated Learning (Publication No. 30310232). Doctoral Dissertation, University of Nevada, Las Vegas, NV, USA, 2023. ProQuest Dissertations & Theses Global. Available online: http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqm&rft_dat=xri:pqdiss:29996357 (accessed on 11 November 2024).
- Moses, P.S. Leveraging Complexity Science to Promote Learning Analytics Adoption in Higher Education: An Embedded Case Study (Publication No. 30814981). Doctoral Dissertation, University of Maryland, College Park, MD, USA, 2023. ProQuest Dissertations & Theses Global. Available online: https://www.proquest.com/docview/3113526225 (accessed on 10 November 2024).
- Le, D.; Pole, A. Beyond learning management systems: Teaching digital fluency. J. Polit. Sci. Educ. 2023, 19, 134–153. [Google Scholar] [CrossRef]
- Garbers, S.; Crinklaw, A.D.; Brown, A.S.; Russell, R. Increasing student engagement with course content in graduate public health education: A pilot randomized trial of behavioral nudges. Educ. Inf. Technol. 2023, 28, 13405–13421. [Google Scholar] [CrossRef]
- Sanchez, L.; Penarreta, J.; Poma, X.S. Learning management systems for higher education: A brief comparison. Discov. Educ. 2024, 3, 58. [Google Scholar] [CrossRef]
- Sahin, M.; Yurdugül, H. Learners’ needs in online learning environments and third generation learning management systems (LMS 3.0). Technol. Knowl. Learn. 2022, 27, 33–48. [Google Scholar] [CrossRef]
- Becirovic, S. Examining learning management system success: A multiperspective framework. Educ. Inf. Technol. 2024, 29, 11675–11699. [Google Scholar] [CrossRef]
- Stamper, J.; Moore, S.; Rosé, C.P.; Pavlik, P.I., Jr.; Koedinger, K. LearnSphere: A learning data and analytics cyberinfrastructure. J. Educ. Data Min. 2024, 16, 141–163. [Google Scholar]
- Kalegele, K. Pragmatic monitoring of learning recession using log data of learning management systems and machine learning techniques. Int. J. Educ. Dev. Using Inf. Commun. Technol. 2023, 19, 177–190. [Google Scholar]
- Yan, L.; Echeverria, V.; Jin, Y.; Fernandez-Nieto, G.; Zhao, L.; Li, X.; Alfredo, R.; Swiecki, Z.; Gašević, D.; Martinez-Maldonado, R. Evidence-based multimodal learning analytics for feedback and reflection in collaborative learning. Br. J. Educ. Technol. 2024, 55, 1900–1925. [Google Scholar] [CrossRef]
- Koh, J.H.L.; Kan, R.Y.P. Students’ use of learning management systems and desired e-learning experiences: Are they ready for next generation digital learning environments? High. Educ. Res. Dev. 2021, 40, 995–1010. [Google Scholar] [CrossRef]
- Mangaroska, K.; Vesin, B.; Kostakos, V.; Brusilovsky, P.; Giannakos, M.N. Architecting analytics across multiple e-learning systems to enhance learning design. IEEE Trans. Learn. Technol. 2021, 14, 173–188. [Google Scholar] [CrossRef]
- Cao, T.; Zhang, Z.; Chen, W.; Shu, J. Utilizing clickstream data to reveal the time management of self-regulated learning in a higher education online learning environment. Interact. Learn. Environ. 2023, 31, 6555–6572. [Google Scholar] [CrossRef]
- Ashby, M.W. Effectiveness of Machine Learning Algorithms on Predicting Course Level Outcomes from Learning Management System Data (Publication No. 30182602). Doctoral Dissertation, National University, San Diego, CA, USA, 2022. ProQuest Dissertations & Theses Global. Available online: https://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqm&rft_dat=xri:pqdiss:31241309 (accessed on 18 November 2024).
- Gašević, D.; Jovanović, J.; Pardo, A.; Dawson, S. Detecting learning strategies with analytics: Links with self-reported measures and academic performance. J. Learn. Anal. 2017, 4, 10–27. [Google Scholar] [CrossRef]
- Hart, S.; Daucourt, M.; Ganley, C. Individual differences related to college students’ course performance in Calculus II. J. Learn. Anal. 2017, 4, 28–44. [Google Scholar] [CrossRef]
- Ostrow, K.S.; Wang, Y.; Heffernan, N.T. How flexible is your data? A comparative analysis of scoring methodologies across learning platforms in the context of group differentiation. J. Learn. Anal. 2017, 4, 1–9. [Google Scholar] [CrossRef]
- Worsley, M.; Anderson, K.; Melo, N.; Jang, J.Y. Designing Analytics for Collaboration Literacy and Student Empowerment. J. Learn. Anal. 2021, 8, 30–48. [Google Scholar] [CrossRef]
- Fazil, M.; Rísquez, A.; Halpin, C. A Novel Deep Learning Model for Student Performance Prediction Using Engagement Data. J. Learn. Anal. 2024, 11, 23–41. [Google Scholar] [CrossRef]
- Chen, F.; Cui, Y. Utilizing Student Time Series Behaviour in Learning Management Systems for Early Prediction of Course Performance. J. Learn. Anal. 2020, 7, 1–17. [Google Scholar] [CrossRef]
- Lancaster, A.; Moses, P.S.; Clark, M.; Masters, M.C. The Positive Impact of Deliberate Writing Course Design on Student Learning Experience and Performance. J. Learn. Anal. 2020, 7, 48–63. [Google Scholar] [CrossRef]
- Hoel, T.; Chen, W. Privacy-driven Design of Learning Analytics Applications—Exploring the Design Space of Solutions for Data Sharing and Interoperability. J. Learn. Anal. 2016, 3, 139–158. [Google Scholar] [CrossRef]
- Khosravi, H.; Kitto, K.; Williams, J.J. RiPPLE: A Crowdsourced Adaptive Platform for Recommendation of Learning Activities. J. Learn. Anal. 2019, 6, 91–105. [Google Scholar] [CrossRef]
- Ozdemir, D.; Opseth, H.M.; Taylor, H. Leveraging learning analytics for student reflection and course evaluation. J. Appl. Res. High. Educ. 2020, 12, 27–37. [Google Scholar] [CrossRef]
- Lu, O.H.T.; Huang, A.Y.Q.; Lin, A.J.Q.; Ogata, H.; Yang, S.J.H. Applying learning analytics for the early prediction of students’ academic performance in blended learning. Educ. Technol. Soc. 2018, 21, 220–232. Available online: https://www.jstor.org/stable/26388400 (accessed on 22 November 2024).
- Epp, C.D.; Phirangee, K.; Hewitt, J.; Perfetti, C.A. Learning management system and course influences on student actions and learning experiences. Educ. Technol. Res. Dev. 2020, 68, 3263–3297. [Google Scholar] [CrossRef]
- Yang, C.C.Y.; Chen, I.Y.L.; Ogata, H. Toward Precision Education: Educational Data Mining and Learning Analytics for Identifying Students’ Learning Patterns with Ebook Systems. Educ. Technol. Soc. 2021, 24, 152–163. [Google Scholar]
- Chiu, C.-K.; Tseng, J.C.R. A Bayesian Classification Network-based Learning Status Management System in an Intelligent Classroom. Educ. Technol. Soc. 2021, 24, 256–267. [Google Scholar]
- Ifenthaler, D.; Schumacher, C. Student perceptions of privacy principles for learning analytics. Educ. Technol. Res. Dev. 2016, 64, 923–938. [Google Scholar] [CrossRef]
- Kuromiya, H.; Majumdar, R.; Ogata, H. Fostering evidence-based education with learning analytics: Capturing teaching-learning cases from log data. Educ. Technol. Soc. 2020, 23, 14–29. [Google Scholar]
- Jin, Y.; Echeverria, V.; Yan, L.; Zhao, L.; Alfredo, R.; Tsai, Y.-S.; Gašević, D.; Martinez-Maldonado, R. FATE in MMLA: A Student-Centred Exploration of Fairness, Accountability, Transparency, and Ethics in Multimodal Learning Analytics. arXiv 2024, arXiv:2402.19071. [Google Scholar] [CrossRef]
- Kasun, M.; Ryan, K.; Paik, J.; Lane-McKinley, K.; Bodin Dunn, L.; Weiss Roberts, L.; Paik Kim, J. Academic machine learning researchers’ ethical perspectives on algorithm development for health care: A qualitative study. J. Am. Med. Inform. Assoc. 2024, 31, 563–573. [Google Scholar] [CrossRef] [PubMed]
- Tan, F.Z.; Lim, J.Y.; Chan, W.H.; Idris, M.I.T. Computational intelligence in learning analytics: A mini review. ASEAN Eng. J. 2024, 14, 121–129. [Google Scholar] [CrossRef]
Tool/Technique Category | Key Observations | Studies |
---|---|---|
LMS Log Data Analysis (General) | Core of many studies; fundamental data for LA. | [32,41,44,45,46,53,55,56,60,63,65,66,67,68,69] |
Statistical Analysis (Regression, Correlation, etc.) | Used to identify correlations and relationships, as well as make predictions. | [33,34,37,39,49,58,64,72] |
Survey Data Analysis | Combines behavioural data with self-reported experiences. | [33,39,42,43,52,53,59,71] |
Machine Learning (ML) | Increasingly used for predictive modelling and pattern recognition. | [51,56,61,62] |
Deep Learning (Long Short-Term Memory networks, etc.) | Demonstrates the application of advanced AI techniques. | [50,57,61,62] |
Qualitative Analysis (Interviews, Questionnaires) | Provides context and deeper insights into user experiences. | [36,38,48] |
Visualisation Dashboards | Aids in data interpretation and decision making. | [36,40] |
Moodle-Specific Tools | Highlights the use of specific LMS features for analytics. | [35,47] |
Cross-Platform Data Integration | Focuses on combining data from different learning systems. | [54] |
Bayesian Networks | Utilises probabilistic models for learning status inference. | [70] |
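Several of the tool categories above converge on the same workflow: engagement features are derived from LMS event logs and then fed to a statistical or machine-learning model. The sketch below illustrates that general pattern only; the column names, actions, labels, and model choice are hypothetical and are not drawn from any of the cited studies.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical LMS event log: one row per recorded student action.
events = pd.DataFrame({
    "student_id": [1, 1, 1, 2, 2, 3, 3, 3, 3, 4],
    "action":     ["login", "view_page", "submit_quiz",
                   "login", "view_page",
                   "login", "view_page", "view_page", "submit_quiz",
                   "login"],
})

# Derive simple per-student engagement features from the log.
features = events.groupby("student_id")["action"].agg(
    total_events="count",
    quiz_submissions=lambda a: (a == "submit_quiz").sum(),
).reset_index()

# Hypothetical outcome labels: 1 = passed the course.
features["passed"] = [1, 0, 1, 0]

# Fit a logistic regression linking engagement features to the outcome.
model = LogisticRegression()
model.fit(features[["total_events", "quiz_submissions"]], features["passed"])

# Score a new student's activity counts to flag likely outcomes early.
new_student = pd.DataFrame({"total_events": [5], "quiz_submissions": [1]})
print(model.predict(new_student)[0])
```

In practice the feature set (session counts, time on task, forum activity) and the predictive model vary widely across the cited studies; the design choice shown here, keeping feature extraction separate from modelling, is what allows the same log pipeline to feed regression, ML, or deep-learning approaches alike.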
LMS Platform/Category | Key Research Themes/Observations | Studies |
---|---|---|
General LMS Usage/Data | Studies examining broad LMS usage patterns, general educational data, or theoretical concepts without specifying a particular LMS platform. | [33,36,37,38,39,40,41,44,45,46,47,48,49,50,51,52,53,54,56,57,58,59,60,61,62,63,64,66,67,68,69,70,71,72] |
Moodle | Investigations into the utilisation of specific Moodle features and plugins, emphasising platform-specific functionality. | [32,35,47] |
Canvas | Analyses of Canvas functionalities and the impact of Canvas system implementations on course design and delivery. | [34,42,43] |
StarC | Focused analysis of clickstream data generated within the StarC LMS, highlighting user interaction patterns. | [55] |
RiPPLE | Application-focused study of RiPPLE, a personalised peer-learning environment, emphasising its unique pedagogical implementation. | [65] |
Challenge Category | Specific Challenges Identified | Impact on UX Improvement | Studies |
---|---|---|---|
Complexity and Implementation Issues | Learning curve for faculty; LMS integration difficulties; complexity of multimodal learning analytics (MMLA) systems and of implementing complex adaptive systems. | Creates barriers to user adoption and hinders the development of intuitive and seamless UX solutions. | [40,44,48,52] |
Data Interpretation and Actionable Insights | Difficulty in deriving actionable insights from data; lack of effective visualisations; need for qualitative assessments and for translating data into applicable improvements. | Leads to inefficient use of LA; reduces the ability to transform data into meaningful UX enhancements that positively impact learning outcomes. | [36,46,66,71] |
Data Privacy and Ethical Concerns | Lack of clarity on data usage; concerns about data sharing, ensuring compliance with ethical guidelines, as well as maintaining transparency, data accuracy, and privacy. | Limits stakeholder trust in LA systems; inhibits data sharing, thus impacting the ability to create personalised and effective UX improvements. | [38,52,64] |
Lack of Transferability and Generalisability | Models not readily transferable between courses, variability in course design affecting learning patterns and context-specific results. | Restricts scalability and limits the creation of generalised UX improvements that benefit diverse user populations. | [34,41,63] |
User Preferences and Personalisation | Balancing user desires for personalised dashboards with concerns about data sharing, defining desired engagement and system feature requests, and content curation vs. monitoring. | Challenges in designing personalised UX solutions that resonate with diverse user preferences and address the complexities around individual customisation within data constraints. | [48,53,71] |
Technology and System Quality | System output and quality issues regarding integration and interoperability between LMSs, data accuracy, and system stability. | Impacts user confidence in the systems, and reduces reliable information used to provide UX improvements. | [47,49,52] |
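A minimal, commonly used first step against the data privacy and ethical concerns listed above is pseudonymising learner identifiers before log data reach analysts. The sketch below is illustrative only and is not a compliance recipe; the salt value and field names are assumptions, and in a real deployment the salt would be stored and governed separately from the data.

```python
import hashlib
import hmac

# Secret salt: in practice, stored and access-controlled separately
# from the analytics data it protects.
SALT = b"institution-secret-salt"

def pseudonymise(student_id: str) -> str:
    """Replace a student ID with a keyed hash that is stable across
    records (so behaviour stays linkable) but not reversible
    without the salt."""
    digest = hmac.new(SALT, student_id.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

# Hypothetical raw LMS log records.
log_records = [
    {"student_id": "s1001", "action": "login"},
    {"student_id": "s1001", "action": "view_page"},
    {"student_id": "s1002", "action": "login"},
]

# Replace identifiers before the records leave the LMS boundary.
safe_records = [
    {**rec, "student_id": pseudonymise(rec["student_id"])}
    for rec in log_records
]

# Same student maps to the same pseudonym; different students differ.
assert safe_records[0]["student_id"] == safe_records[1]["student_id"]
assert safe_records[0]["student_id"] != safe_records[2]["student_id"]
```

Pseudonymisation of this kind preserves the longitudinal linkage that most LA models need while limiting direct identifiability, which speaks to the trust and data-sharing barriers noted in the table.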
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Ngulube, P.; Ncube, M.M. Leveraging Learning Analytics to Improve the User Experience of Learning Management Systems in Higher Education Institutions. Information 2025, 16, 419. https://doi.org/10.3390/info16050419