Systematic Review

Leveraging Learning Analytics to Improve the User Experience of Learning Management Systems in Higher Education Institutions

by Patrick Ngulube and Mthokozisi Masumbika Ncube *
Department of Interdisciplinary Research and Postgraduate Studies, University of South Africa, Pretoria 0003, South Africa
* Author to whom correspondence should be addressed.
Information 2025, 16(5), 419; https://doi.org/10.3390/info16050419
Submission received: 13 April 2025 / Revised: 13 May 2025 / Accepted: 19 May 2025 / Published: 20 May 2025

Abstract
This systematic review examines the application of learning analytics to enhance user experience within Learning Management Systems in higher education institutions. Addressing a salient knowledge gap regarding the optimal integration of learning analytics for diverse learner populations, this study identifies analytical approaches and delineates implementation challenges that contribute to data misinterpretation and underutilisation. The absence of a systematic evaluation of analytical methodologies impedes the capacity of higher education institutions to tailor learning processes to individual student needs. Adhering to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a search was conducted across five academic databases. Studies employing learning analytics within Learning Management System environments to improve user experience in higher education institutions were included, while purely theoretical or non-higher education institution studies were excluded, resulting in a final corpus of 41 studies. Methodological rigour was assessed using the Critical Appraisal Skills Programme Checklist. This study revealed diverse learning analytics methodologies and a dual research focus on specific platforms and broader impacts on Learning Management Systems. However, ethical, implementation, generalisability, interpretation, personalisation, and system quality challenges impede effective learning analytics integration for user experience improvement, demanding rigorous and contextually aware strategies. This study’s reliance on existing literature introduces potential selection and database biases. As such, future research should prioritise empirical validation and cross-institutional studies to address these limitations.

1. Introduction

The distinctive context of higher education, characterised by advanced qualifications such as undergraduate degrees and postgraduate programmes (postgraduate diplomas, master’s degrees, and doctoral studies), necessitates a tailored approach to data analytics within Learning Management System (LMS) platforms to effectively improve the user experience (UX) and support academic activities [1,2]. An LMS is a digital platform designed to facilitate online learning and educational administration [3]. Features such as discussion boards, assignment submission tools, gradebooks, and multimedia integration make it a hub for course content delivery, communication, assessment, and student engagement [1,3]. LMS platforms in higher education must meet the complex requirements of students who frequently collaborate on group projects, perform independent research, and complete specialised coursework [4]. LMS platforms must, therefore, be reliable, adaptable, and user-friendly to effectively support students’ academic endeavours and enhance their UX [4,5,6].
UX, in this context, refers to learners’ overall interaction with and perception of an LMS [2,5]. It encompasses factors such as ease of navigation, clarity of information, efficiency of task completion, and the overall satisfaction derived from the platform. A positive UX is crucial for learners, as it can significantly impact their engagement, motivation, and, ultimately, their academic performance [2,5,6]. For instance, an LMS with a cluttered interface or a convoluted assignment submission process can lead to frustration and wasted time, detracting from the learner’s focus on their research or coursework. Conversely, an LMS with a streamlined, intuitive design can enhance productivity, facilitate seamless communication, and foster a more positive learning environment. In this regard, by utilising learning analytics (LA), higher education institutions (HEIs) can better understand how learners use their LMS, pinpoint areas for development, and tailor the learning process to each student’s unique requirements and preferences [2,3,4,5]. In LA, data about learners and their contexts are measured, gathered, analysed, and reported in order to better understand and optimise learning and the settings in which it takes place [7,8]. In essence, LA turns the unprocessed data produced by learner interactions inside the LMS into useful insights. This involves monitoring a number of student behaviour indicators, including how often students log in [4], how much time they spend on particular learning resources, how often they participate in discussion boards, how often they submit assignments, and how well they perform on tests and quizzes [9,10,11]. By examining these data, HEIs can obtain a detailed picture of how students interact with their LMS and identify both successful and problematic patterns of engagement [9,10]. In particular, the rapid evolution of digital learning environments calls for a deeper understanding of how LA can be adapted to the particular difficulties encountered by students, who frequently balance rigorous schedules, varied learning preferences, and intricate research goals [11,12]. The complex requirements of this group are frequently not met by the conventional one-size-fits-all approach to LMS design, which results in less-than-ideal engagement and may even impede academic achievement [2,12,13]. Additionally, the growing amount of data available within LMS platforms presents an opportunity to optimise and customise the learning process [1,14,15].
Even though data analytics has the potential to improve academic performance, HEIs confront numerous obstacles and constraints when attempting to effectively adopt it [16,17]. In particular, there is a significant gap in the way LA are standardised and integrated into LMS platforms to maximise learners’ UX [3,18,19]. In the context of LMS, this lack of best practices causes HEIs to be unclear about data gathering tactics, efficient analysis methods, and the application of data-driven insights for programme enhancement [16,20]. The resultant inconsistency in LA implementation across programmes and institutions can lead to variable impacts on student learning outcomes and UX [10,11]. Furthermore, the extant literature on LA often focuses on isolated analytical techniques, such as descriptive analytics or predictive modelling, rather than exploring the integrated application of diverse approaches within the LMS environment [18,19]. This integrated approach, which could offer nuanced insights into student learning behaviour and performance within the digital learning platform [20], remains under-explored, potentially limiting the development of targeted interventions and improved academic achievement through enhanced LMS user experience.
In addition, despite the widespread adoption of LMS in education [3], a significant knowledge gap persists regarding the effective integration of LA to enhance UX. This gap impedes the development of LMS platforms optimised for learners’ specific requirements and learning styles [16]. Consequently, this systematic review aimed to synthesise the empirical literature to identify and evaluate LA approaches that have demonstrated efficacy in improving the UX of LMS platforms for students. Furthermore, this study assessed the impact of these approaches on academic outcomes, student engagement, and satisfaction, thereby producing an evidence-based synthesis to inform future research and practice. Without a systematic evaluation of effective analytical methods, institutions risk misinterpreting or underutilising the potential of LA [7]. This review therefore addresses this critical gap by synthesising the methodological rigour of existing research, establishing best practices in the application of LA for UX enhancement, and providing practical insights for educators, instructional designers, and LMS developers. These insights will facilitate the creation of more user-friendly, engaging, and productive learning environments for students. In this regard, this review is guided by the following research objectives:
  • Identify and evaluate learning analytics approaches utilised within Learning Management Systems to enhance the user experience of students.
  • Examine the challenges hindering the successful integration of learning analytics for user experience improvement in Learning Management Systems environments.

2. Materials and Methods

In performing this systematic review, the researchers followed the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 2020 statement (Supplementary Materials) to guarantee integrity and methodological rigour [21]. To bolster the methodological robustness and clarity of the review process, the study protocol was prospectively registered on the Open Science Framework (OSF) [22]. This preregistration created a publicly available, time-stamped account of the intended aims, eligibility criteria, search approach, data extraction techniques, and planned synthesis, thereby mitigating the risk of biases emerging after data collection and enhancing trust in the reliability of this review’s conclusions [21].

2.1. Literature Retrieval, Screening, and Eligibility Criteria

The literature retrieval was conducted across five prevalent academic databases: Web of Science, ERIC, PsycINFO, Scopus, and the ACM Digital Library. The initial search returned a broad range of scholarly outputs, including peer-reviewed journal articles, conference papers, and book chapters, addressing the nexus between data analytics and higher education. In particular, the search approach was carefully crafted to find research concentrating on the application of LA in LMS environments to improve students’ UX. This approach combined keywords and subject headings [23] pertaining to LA, higher education, LMS platforms, and UX. The following syntax was used for each database:
  • ACM Digital Library: ((“undergraduate” OR “postgraduate education” OR “masters” OR “doctoral” OR “graduate education”) AND (“learning analytics” OR “educational data mining” OR “data analytics”) AND (“learning management system” OR “LMS” OR “user experience” OR “UX” OR “student engagement” OR “academic performance”)).
  • ERIC: (TI = (“undergraduate” OR “postgraduate education” OR “masters” OR “doctoral” OR “graduate education”) OR AB = (“learning analytics” OR “educational data mining” OR “data analytics”)) AND (KW = (“learning management system” OR “LMS” OR “user experience” OR “UX” OR “student engagement” OR “academic performance”)).
  • PsycINFO: ((TX = “undergraduate” OR “postgraduate education” OR TX = “masters” OR TX = “doctoral” OR TX = “graduate education”) AND (TX = “learning analytics” OR TX = “educational data mining” OR TX = “data analytics”) AND (TX = “learning management system” OR TX = “LMS” OR TX = “user experience” OR TX = “UX” OR TX = “student engagement” OR TX = “academic performance”)).
  • Scopus: TITLE-ABS-KEY((“undergraduate” OR “postgraduate education” OR “masters” OR “doctoral” OR “graduate education”) AND (“learning analytics” OR “educational data mining” OR “data analytics”) AND (“learning management system” OR “LMS” OR “user experience” OR “UX” OR “student engagement” OR “academic performance”)).
  • Web of Science: TS = ((“undergraduate” OR “postgraduate education” OR “masters” OR “doctoral” OR “graduate education”) AND (“learning analytics” OR “educational data mining” OR “data analytics”) AND (“learning management system” OR “LMS” OR “user experience” OR “UX” OR “student engagement” OR “academic performance”)).
To enhance search recall and precision, Boolean operators (AND, OR) were employed to combine these search terms [23].
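For illustration, the shared core of these queries can be assembled programmatically from the three concept groups used above. The following Python sketch is purely illustrative and is not part of the review protocol; the group names (population, intervention, context) are ours, and each database would still wrap the resulting string in its own field tags (e.g., TS or TITLE-ABS-KEY).

```python
# Minimal sketch: assembling the Boolean search string shared across the
# five databases from the three concept groups reported above.

population = ["undergraduate", "postgraduate education", "masters",
              "doctoral", "graduate education"]
intervention = ["learning analytics", "educational data mining", "data analytics"]
context = ["learning management system", "LMS", "user experience", "UX",
           "student engagement", "academic performance"]

def or_group(terms):
    """Join quoted terms with OR and wrap the group in parentheses."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

# The concept groups are intersected with AND, mirroring the reported syntax.
query = " AND ".join(or_group(g) for g in (population, intervention, context))
print(query)
```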
Subsequently, a targeted and pertinent literature search was ensured through the development of a structured search strategy and the establishment of eligibility criteria, guided by the Population, Intervention, Comparison, Outcome (PICO) framework [24]. Specifically, the PICO framework served as an instrumental tool in streamlining the initial stages of the study selection process, facilitating the identification of eligible articles [24]. Accordingly, the following eligibility criteria were formulated based on the PICO framework:
  • Population (P): Higher education learners (undergraduate and postgraduate programmes) utilising LMS. Inclusion: Studies explicitly identifying undergraduate and postgraduate learners within an LMS context. Exclusion: Studies focusing on pre-tertiary education or general populations without LMS-specific data.
  • Intervention (I): Enhance UX via LA in LMS. Inclusion: Studies applying LA to improve learner experience within LMS. Exclusion: Purely technical LA studies without LMS UX focus.
  • Comparison (C): LA intervention comparison within LMS. Preference: Studies comparing LA effectiveness within LMS. Screening: Prioritised comparative studies within LMS; non-comparative studies not prioritised.
  • Outcome (O): UX and learner success in LMS. Inclusion: Studies measuring UX and learner outcomes (academic, engagement, satisfaction) within LMS. Exclusion: Studies without UX or learner outcome data within LMS [24].
Building on the fundamental PICO framework, the following further requirements for eligibility were created:
  • Language: Studies published in English to ensure accessibility and consistency in analysis.
  • Publication Period: Studies published from January 2015 to February 2025, reflecting the rapid evolution of LA technologies and their application within LMS environments. This timeframe ensures the inclusion of recent advancements relevant to UX enhancement.
  • Methodological Rigour and Conceptual Clarity: Research articles demonstrating a robust understanding of LA concepts and their application within LMS environments. Studies should employ rigorous research methodologies, clearly define the learner sample within the LMS context, and present findings that contribute to the understanding of how LA enhances UX and student success in this specific digital learning environment. The PRISMA flow diagram (Figure 1) illustrates the systematic search and study selection process.
As shown in Figure 1, an initial retrieval of 3107 records was conducted across the designated databases. To manage this volume of data and ensure a rigorous selection process, a two-stage screening methodology was implemented, utilising Rayyan software version 1.5.0 (a web-based deduplication and screening platform) and ASReview software version 2.0 (an open-source machine-learning-assisted screening tool).

2.1.1. Stage 1: Automated Deduplication and Machine Learning-Assisted Prioritisation

  • Rayyan’s deduplication functionality was employed to identify and remove redundant and non-English records [26]. Using algorithms that compare attributes such as title, authors, abstract, and publication details, the reviewers excluded 1547 records, reducing the dataset to 1560 records for subsequent evaluation.
  • Subsequently, ASReview was utilised to prioritise the remaining 1560 records for full-text review [27]. ASReview’s machine learning algorithms estimated each record’s relevance from its title and abstract against the predetermined inclusion and exclusion criteria, allowing the reviewers to concentrate on the most promising records. This automated prioritisation made the screening process considerably faster [26,27]; a minimal sketch of the underlying relevance-ranking idea follows this list.
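The sketch below illustrates the general idea behind machine-learning-assisted screening prioritisation: a lightweight classifier trained on a few labelled seed records ranks the remaining records by predicted relevance. This is a minimal illustration in the spirit of such tools, not ASReview’s actual model or API; all record texts and labels are hypothetical.

```python
# Minimal sketch of ML-assisted screening prioritisation (illustrative only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical seed records already labelled by the reviewers.
seed_texts = [
    "Learning analytics dashboard improves LMS user experience",   # relevant
    "Data mining of Moodle logs to support student engagement",    # relevant
    "Soil moisture sensing with wireless sensor networks",         # irrelevant
    "Supply chain optimisation in manufacturing plants",           # irrelevant
]
seed_labels = [1, 1, 0, 0]  # 1 = include, 0 = exclude

# Hypothetical unlabelled records awaiting screening.
unlabelled = [
    "Predicting course performance from LMS clickstream data",
    "Marketing analytics for retail loyalty programmes",
]

# Vectorise titles/abstracts and train a simple relevance classifier.
vec = TfidfVectorizer().fit(seed_texts + unlabelled)
clf = LogisticRegression().fit(vec.transform(seed_texts), seed_labels)

# Rank the unlabelled records so reviewers see the most promising ones first.
scores = clf.predict_proba(vec.transform(unlabelled))[:, 1]
for text, score in sorted(zip(unlabelled, scores), key=lambda p: -p[1]):
    print(f"{score:.2f}  {text}")
```

In practice such tools retrain the classifier after every reviewer decision, so the ranking improves as screening proceeds.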

2.1.2. Stage 2: Independent Full-Text Review and Selection

  • Following the ASReview prioritisation, two independent reviewers conducted a full-text review using Rayyan. Each reviewer independently applied the predetermined eligibility criteria to the prioritised records.
  • Through this rigorous application of the eligibility criteria, 1490 records were excluded due to lack of peer review and non-compliance with the research objectives, resulting in the identification of 70 records directly relevant to the application of LA within LMS environments to enhance UX. These 70 records were retained for methodological rigour and validity assessment. Therefore, the final step in the selection process involved applying the quality criteria.

2.2. Methodological Rigour and Validity Assessment

A thorough assessment of study quality was conducted to ensure the epistemic integrity and robustness of this systematic review. This study made use of the Critical Appraisal Skills Programme (CASP) checklist, a proven tool that provides an organised and systematic framework for evaluating research articles [28]. The CASP checklist enabled a uniform approach across all included research, reducing subjective bias and promoting transparency in the selection process [28]. It directed attention to four cardinal dimensions of study quality and reliability:
  • Conceptual Clarity and Theoretical Foundation: The reviewers carefully considered how LA fundamentals were articulated and understood, as well as how they were applied in higher education. This required a thorough examination of the introduction and literature review sections to determine the authors’ familiarity with pertinent theoretical frameworks, current research, and the unique educational possibilities and challenges present in this field. This component dealt with the examined literature’s construct validity.
  • Methodological Soundness: Each article’s methodology sections were carefully examined. The reviewers assessed the suitability of data analysis methods, the validity and reliability of data collection tools (such as surveys and interviews), the researchers’ attempts to address potential sources of bias, and the appropriateness of the research design with respect to the stated research questions. Prioritising studies that demonstrated well-reasoned and methodologically sound procedures highlighted the research’s internal validity.
  • Sampling Adequacy and Generalisability: Clearly defined sampling frames are important, as the CASP checklist emphasised. The reviewers evaluated how well the authors defined the target group of students, the sampling procedure they used, and the rationale behind the sample size and representativeness. In order to address the external validity and transferability of the results, studies with clearly defined and representative sampling frames were given greater credibility.
  • Importance and Interpretive Complexity of Results: To assess the offered findings’ coherence, clarity, and applicability to the study objectives and methodology, a thorough analysis of the results and discussion sections was conducted. The reviewers evaluated how well the authors articulated the implications of their findings and how deeply they interpreted them. Particularly noteworthy were articles that offered fresh perspectives and useful suggestions about the use of LA in HEIs. This element had to do with the research’s ecological validity and usefulness.
The CASP checklist’s methodical implementation made it easier to thoroughly assess each article’s benefits and drawbacks in relation to these four important quality metrics [28]. This procedure led to the identification of a cohort of eligible research publications, which served as the systematic review’s evidentiary foundation [28]. Following the application of the pre-established quality criteria, 29 studies were deemed ineligible, resulting in the selection of 41 studies that formed the evidentiary base of this review.

2.3. Inter-Rater Reliability and Consistency

After the first independent screening of articles using the Rayyan and ASReview software, the reviewers worked together to settle any disagreements that surfaced throughout the selection process. This cooperative strategy guaranteed a fair and impartial assessment of each study’s applicability. Inter-rater reliability (IRR) tests were used to determine the screening process’s consistency and reduce inter-reviewer variability [29]. As the primary metric, the reviewers’ percentage of agreement produced a high concordance rate of 80%. Additionally, Cohen’s kappa (κ), a more robust metric that takes chance agreement into consideration, was calculated [29,30]. The resulting kappa coefficient of 0.75 indicated substantial agreement among reviewers, further supporting the dependability of the screening procedure [30]. These quantitative measurements of IRR increased the review’s credibility by demonstrating the rigour used to guarantee a thorough and objective selection of studies [29,30].
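To make the two reported metrics concrete, the sketch below computes percentage agreement and Cohen’s kappa from a hypothetical 2 × 2 table of screening decisions. The actual decision tables are not published, so the counts (and the resulting kappa of about 0.52) are illustrative only and do not reproduce this review’s figures.

```python
# Worked example of the two inter-rater agreement metrics reported above,
# using hypothetical screening decisions for two reviewers over 200 records.
both_include = 40
both_exclude = 120
a_only = 20   # reviewer A includes, reviewer B excludes
b_only = 20   # reviewer B includes, reviewer A excludes
n = both_include + both_exclude + a_only + b_only

# Percentage agreement: the share of records on which the reviewers concur.
p_o = (both_include + both_exclude) / n            # 0.80 with these counts

# Chance agreement from each reviewer's marginal inclusion rate.
p_a_inc = (both_include + a_only) / n
p_b_inc = (both_include + b_only) / n
p_e = p_a_inc * p_b_inc + (1 - p_a_inc) * (1 - p_b_inc)

# Cohen's kappa corrects observed agreement for agreement expected by chance:
# kappa = (p_o - p_e) / (1 - p_e)
kappa = (p_o - p_e) / (1 - p_e)
print(f"agreement = {p_o:.2f}, kappa = {kappa:.2f}")   # 0.80, ~0.52 here
```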

2.4. Extraction and Synthesis of Data

To ensure adherence to best practices in systematic review methodology, data extraction was conducted in compliance with the PRISMA checklist. The researchers made use of CADIMA, an open-access web application, to establish a consistent data extraction procedure [31]. This promoted consistency and efficiency by making it easier to systematically collect relevant information from the chosen studies [31]. Key components of the data extraction form included the following:
  • Bibliographic details, such as the author, the year of publication, and the location.
  • Methodological design.
  • LA tools and methods used.
  • Significant findings about the benefits and difficulties of LA in HEIs.
Manual data coding involved extracting and classifying crucial information, including the authors, the research design, the place of publication, and the main conclusions. Appendix A [32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72] offers a thorough audit trail of the data extraction and coding procedure.
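As an illustration of the structure captured by the extraction form, the sketch below models one extraction record as a Python dataclass. The field names are our own shorthand for the components listed above and do not reproduce CADIMA’s actual export schema; the example values are hypothetical.

```python
# Hypothetical sketch of one data extraction record mirroring the form fields
# listed above (illustrative only; not CADIMA's schema).
from dataclasses import dataclass

@dataclass
class ExtractionRecord:
    authors: str
    year: int
    place_of_publication: str        # journal, proceedings, or repository
    research_design: str             # e.g., "quantitative log data analysis"
    la_tools_and_methods: list[str]  # e.g., ["Moodle logs", "regression"]
    major_findings: str              # benefits and challenges of LA in HEIs

record = ExtractionRecord(
    authors="Example, A.; Sample, B.",
    year=2021,
    place_of_publication="Journal of Learning Analytics",
    research_design="Quantitative, analysis of LMS trace data",
    la_tools_and_methods=["LMS trace data analysis", "statistical analysis"],
    major_findings="Fine-grained indicators relate to learning approaches.",
)
print(record.authors, record.year)
```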

3. Results

The subsequent section delineates the analysed results, with a specific focus on addressing the core objectives of this study. These objectives entail, firstly, the identification and evaluation of diverse LA approaches employed within LMS to enhance student UX; and secondly, the examination of challenges that hinder the successful integration of LA for the explicit purpose of UX improvement within LMS environments. As such, this analysis sought to provide a comprehensive understanding of how LA are currently being utilised to optimise the student learning experience, while also critically assessing the obstacles that impede their effective implementation. Table 1 provides a structured overview of the LA tools and techniques identified across the reviewed studies.
Table 1 demonstrates a diverse application of LA tools in LMS research. Studies commonly employ general log data analysis, statistical methods (regression, correlation), and machine learning for predictive modelling. The table also shows that visualisation dashboards aid data interpretation, while mixed-methods approaches integrate behavioural and survey data. Further trends include the use of Moodle-specific tools, advanced artificial intelligence (AI) techniques, and cross-platform data integration for holistic learner analysis.
In addition to analysing the LA techniques and tools, the reviewed studies also demonstrate a focus on specific LMS platforms. Table 2 presents a summary of these LMS platforms.
Table 2 highlights a dual focus in LMS research: a platform-centric approach, analysing specific systems like Moodle and Canvas with data-driven methods, and a broader analysis of general LMS usage and educational technology impacts. This dichotomy reveals a field concerned with both granular system analysis and the overarching theoretical and practical implications of technology integration in education, including innovative frameworks like RiPPLE.
To address the second research objective, an examination of the challenges impeding the efficacious integration of LA for UX enhancement within LMS environments was conducted. Table 3 presents the findings of this analysis.
Table 3 presents six key challenges in leveraging LA for UX improvement: ethical data handling, implementation complexity, limited generalisability, difficulty in deriving actionable insights, personalisation conflicts, and system quality issues. These challenges, collectively, hinder the effective translation of LA data into meaningful UX enhancements.

4. Discussion

Table 1 highlights the diverse methodological approaches prevalent within LA research [16], illustrating a broad adoption of various tools and techniques aimed at comprehending and improving learning environments [2,3]. A fundamental component of this methodological landscape is the widespread utilisation of general LMS log data, which offers substantial behavioural insights into student interactions within these platforms [4,6]. Although this quantitative approach provides valuable data, it is crucial to acknowledge its inherent limitations, as it may not fully encapsulate the nuanced and intricate nature of learning processes. Consequently, statistical analyses, such as regression and correlation, are frequently applied to discern relationships and generate predictions, signifying a clear inclination towards quantifying the influence of diverse factors on student learning outcomes [9,10,12].
The growing adoption of machine learning (ML) marks a significant transition towards more data-centric and automated methodologies within LA, facilitating sophisticated predictive modelling and pattern identification (Table 1). While ML presents considerable potential for the development of personalised interventions, it also introduces critical ethical considerations pertaining to data privacy and algorithmic bias, thereby underscoring the necessity for transparency and explainability in its application [3,73,74]. Visualisation dashboards are also playing an increasingly vital role in rendering complex data comprehensible and actionable, emphasising the importance of effective data representation for facilitating informed decision making [16]. Complementing these quantitative approaches, the integration of survey data analysis and qualitative methods, such as interviews and questionnaires, underscores the recognised value of incorporating subjective perspectives and contextual insights into LA research [33], thus moving beyond purely quantitative analyses [10,12,16]. Furthermore, as evidenced in Table 1, the specific utilisation of Moodle-centric tools highlights the significance of context-sensitive analyses and the recognition of the unique affordances offered by different LMS platforms [75]. The increasing application of advanced AI techniques, including deep learning and Bayesian networks, further demonstrates the escalating sophistication of LA methodologies [18], enabling the modelling of intricate temporal dependencies and probabilistic inferences [7,9]. Moreover, the emphasis on cross-platform data integration indicates a progressive movement towards achieving a holistic understanding of learner behaviour across disparate learning systems [18]. Taken together, these observations underscore the inherently multifaceted nature of LA research [16], advocating for the synergistic combination of quantitative and qualitative methods alongside advanced analytical techniques to attain a comprehensive understanding of student learning, while concurrently emphasising critical ethical considerations and the imperative for transparent and responsible data utilisation [38,73,74].
Table 2 delineates the diverse landscape of LMS research, revealing a dual focus encompassing both platform-specific analyses and broader examinations of LMS usage [18]. Studies centred around Moodle demonstrate a significant interest in capitalising on its platform-specific features and plugins, thereby underscoring the importance of comprehending the unique functionalities inherent in individual LMS environments [3,35]. Similarly, research investigating Canvas explores its influence on course design and delivery, highlighting a focus on the practical ramifications of implementing particular LMS systems [34,43]. The analysis of StarC, employing clickstream data, further exemplifies the increasing application of quantitative methodologies to elucidate user interaction patterns within specific LMS platforms [55]. Conversely, a substantial body of research transcends these platform-specific boundaries, examining general LMS usage [9], educational data more broadly, and underlying theoretical concepts. These studies offer a wider perspective on the impact of technology on education without restricting their analyses to a singular LMS [6,7]. The inclusion of RiPPLE, a personalised peer-learning environment, signals a growing interest in innovative pedagogical frameworks that operate within or in conjunction with traditional LMS structures [65]. Collectively, these findings suggest a multifaceted research landscape, encompassing both granular analyses of specific LMS platforms and macro-level investigations into the broader implications of technology integration within educational contexts [15,18]. This dual emphasis underscores the necessity for both specialised expertise in individual LMS platforms and a comprehensive understanding of the theoretical and practical implications of LMS usage across diverse learning environments [17].
Table 3 articulates several critical challenges that impede the effective utilisation of LA for UX enhancement. Initially, concerns surrounding data privacy and ethical considerations [3,73], including a lack of clarity regarding data usage protocols and anxieties about data sharing, significantly erode stakeholder trust and restrict data accessibility. Consequently, this hinders the development of personalised and impactful UX improvements that could otherwise be informed by comprehensive data insights [72]. Secondly, the inherent complexity and practical implementation issues [40,48], such as the steep learning curve faced by faculty and difficulties encountered in LMS integration, present substantial obstacles to user adoption and the creation of seamless, intuitive UX solutions [5,13]. Furthermore, the limited transferability and generalisability of LA models across diverse educational contexts restricts the scalability and widespread applicability of UX enhancements, potentially confining effective solutions to specific environments [41,63]. The challenges associated with data interpretation and the derivation of actionable insights further compound these issues [46,74]. These challenges are often exacerbated by inadequate visualisations and the essential yet time-consuming necessity for qualitative assessments, leading to inefficient LA utilisation and impeding the translation of raw data into meaningful UX improvements that demonstrably enhance learning outcomes [2,4,5]. Navigating the inherent complexities of user preferences and personalisation [48], particularly in striking a balance between individualised customisation and existing data constraints alongside diverse user desires, poses a significant design challenge for effective UX implementation [5,13]. Moreover, technology and system quality issues, encompassing aspects such as system stability, data accuracy, and interoperability across different platforms, can erode user confidence and compromise the reliability of the information intended to inform UX improvements [49,52]. Collectively, these multifaceted challenges underscore the imperative for the adoption of rigorous, ethically sound, and contextually aware approaches to LA implementation. Such approaches are crucial to effectively realise the inherent potential of LA in meaningfully enhancing learning experiences and ensuring responsible innovation in educational technology [73,74].
This study, while providing a comprehensive overview of LA tools and challenges within LMS environments, is inherently limited by its reliance on the existing literature, which may introduce biases based on study selection and database limitations. The generalisability of findings could be constrained by the diverse contexts and populations represented in the reviewed studies, and a potential overemphasis on technical aspects might overlook crucial pedagogical considerations. The subjective nature of data interpretation and the absence of original empirical data further contribute to potential limitations.

5. Conclusions

This study examined the application of LA within LMS to enhance UX, revealing a diverse methodological landscape encompassing both quantitative and qualitative approaches, including log data analysis, statistical modelling, machine learning, and visualisation. It also highlighted a dual research focus on platform-specific analyses and broader LMS usage, demonstrating the field’s concern for both granular systems understanding and the wider impact of educational technology. However, significant challenges, including ethical concerns, implementation complexities, limited generalisability, data interpretation difficulties, personalisation conflicts, and system quality issues, hinder the effective integration of LA for UX improvement, necessitating rigorous, ethical, and contextually aware implementation strategies to realise its full potential. Given the limitations inherent in the study, future research should prioritise empirical investigations to validate findings in real-world settings, address ethical concerns related to data privacy and algorithmic bias, develop practical implementation strategies, and explore personalised UX solutions. Further studies should also focus on enhancing the generalisability of LA models, integrating qualitative insights, conducting longitudinal studies, and investigating the integration of emerging technologies and the development of interoperability standards.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/info16050419/s1, PRISMA 2020 Checklist. Reference [25] is cited in the Supplementary Materials file.

Author Contributions

P.N. and M.M.N. jointly conceived the study and developed its methodological framework. P.N. and M.M.N. were responsible for software development, undertaking formal analysis, investigation, and meticulous data curation, and also crafted the initial manuscript and created visual representations to illustrate key findings. Also, both P.N. and M.M.N. collaborated to validate the results, ensuring the study’s conclusions were rigorously tested and substantiated. P.N. successfully secured essential resources and funding, providing overarching supervision for the project’s duration, and reviewed and edited the manuscript to refine its content and clarity. The final manuscript received the approval of all authors, confirming their collective endorsement of the research outcomes and conclusions. All authors have read and agreed to the published version of the manuscript.

Funding

The National Research Foundation (South Africa) (grant SRUG2205025721) and the University of South Africa (Unisa) funded the APC.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analysed in this study.

Acknowledgments

The authors acknowledge two postdoctoral fellows from Unisa for their support in checking the data coding.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AI: Artificial Intelligence
CASP: Critical Appraisal Skills Programme
DA: Data Analytics
HEIs: Higher Education Institutions
IRR: Inter-Rater Reliability
LA: Learning Analytics
LMS: Learning Management Systems
ML: Machine Learning
OSF: Open Science Framework
PICO: Population, Intervention, Comparison, Outcome
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
UX: User Experience

Appendix A

| Author(s) | Research Title | Research Design | Place of Publication | Learning Analytics Tools Used | Major Findings |
| --- | --- | --- | --- | --- | --- |
| [32] | Using log variables in a learning management system to evaluate learning activity using the lens of activity theory | Quantitative analysis of LMS log data | Assessment and Evaluation in Higher Education | Moodle log data analysis, statistical analysis | Low overall LMS usage; significant variation in activity patterns across courses and colleges. Contradictions within the activity system hinder effective LMS use. |
| [33] | Towards actionable learning analytics using dispositions | Quantitative analysis, incorporating self-reported data | IEEE Transactions on Learning Technologies | Demographic, trace (LMS), and self-reported data analysis | Incorporation of dispositional data (e.g., procrastination, boredom) into LA models enhances understanding of student behaviour and enables more actionable interventions. |
| [34] | Predicting time-management skills from learning analytics | Quantitative (linear and multilevel regression) | Journal of Computer-Assisted Learning | Canvas LMS trace data, questionnaire data | LMS trace data can predict self-reported time-management skills, but models are not readily transferable between courses. Further research is needed to improve portability. |
| [35] | Evaluation of usability in Moodle learning management system through analytics graphs: University of Applied Sciences teacher’s perspective in Finland | Quantitative analysis of LMS log data | International Journal of Education and Development using Information and Communication Technology | Moodle log data, analytics graphs plugin | Analytics graphs in Moodle provide insights into student activity and enable identification of student profiles. This aids teachers and management in tracking and improving student performance. |
| [36] | Analytics-informed design: Exploring visualisation of learning management systems recorded data for learning design | Educational design research, qualitative interviews, dashboard pilot evaluation | SAGE Open | Visualisation dashboard development, LMS data visualisation | Educational design research can effectively develop user-friendly LA visualisation dashboards to support data-informed learning design. Preliminary design principles were identified. |
| [37] | Using analytics to predict students’ interactions with learning management systems in online courses | Quantitative, Multiple Linear Regression (MLR) and Decision Tree (DT) | Education and Information Technologies | LMS analytics and log data analysis | MLR and DT models can effectively predict learner–LMS interactions. Key predictors include submission, content access, and assessment access metrics. |
| [38] | Learning analytics and data ethics in performance data management: A benchlearning exercise involving six European universities | Qualitative, benchlearning exercise | Quality in Higher Education | Analysis of institutional data management models, ethical review | Learning analytics are present in European universities but are primarily based on traditional data. Ethical risks are generally covered by regulations. Learning analytics offers opportunities for improved data and quality management. |
| [39] | Pre-class learning analytics in flipped classroom: Focusing on resource management strategy, procrastination, and repetitive learning | Quantitative, log data, survey, and exam data analysis | Journal of Computer-Assisted Learning | Learning analytics of pre-class video viewing data, statistical analysis | Resource management strategies (time, study environment) significantly influence pre-class video engagement and learning achievement in flipped classrooms. Procrastination significantly decreases video engagement. |
| [40] | From data to action: Faculty experiences with a university-designed learning analytics system | Qualitative case study, surveys, and focus groups | International Journal on E-Learning | University-designed LA system, cloud-based data collection, dashboards, alert emails | Faculty will use LA to make data-driven changes to teaching, including feedback and communication. Implementation challenges include learning curves and LMS integration issues. Ongoing training and clear policies are needed. |
| [41] | The mediating role of learning analytics: Insights into student approaches to learning and academic achievement in Latin America | Quantitative, analysis of LMS trace data | Journal of Learning Analytics | LMS trace data analysis, statistical analysis | Most LA indicators do not mediate the effect between learning approaches and performance, but fine-grained indicators can. Organised learning approaches are effective in Latin American higher education. |
| [42] | Using learning analytics and student perceptions to explore student interactions in an online construction management course | Case study, learning analytics, surveys | Journal of Civil Engineering Education | Canvas LMS analytics, survey data | Student interactions with course materials decreased after the midterm. Students found lecture videos and slides most helpful. LA can inform course design. |
| [43] | Student device usage, learning management system tool usage, and self-regulated learning | Non-intervention descriptive research design, LMS data logs, surveys | ProQuest Dissertations and Theses Global (University of Nevada, Las Vegas) | LMS data logs, Online Self-Regulated Learning Questionnaire (OSLQ) | Device usage varies; low-performing students report similar or higher SRL but use LMS tools less. SRL instruction and tool/device effectiveness are crucial. |
| [44] | Leveraging complexity science to promote learning analytics adoption in higher education: An embedded case study | Embedded case study | ProQuest Dissertations and Theses Global (University of Maryland, College Park) | Analysis of learning analytics practices, application of CAS framework | Learning analytics implementation requires consideration of higher education institutions as complex adaptive systems. Emergent, ground-up approaches are more effective than top-down. |
| [45] | Beyond learning management systems: Teaching digital fluency | Pedagogical reflections, qualitative analysis | Journal of Political Science Education | Analysis of pedagogical approaches, platforms beyond LMS | Teaching digital fluency requires platforms beyond LMS. Innovative assignments improve digital skills and content retention for Generation Z learners. |
| [46] | Increasing student engagement with course content in graduate public health education: A pilot randomised trial of behavioral nudges | Pilot randomised controlled trial | Education and Information Technologies | LMS data analysis, behavioural nudges | Behavioural nudges based on LA did not significantly change student engagement. Future work should focus on qualitative assessment of motivations and richer analysis of learning behaviours. |
| [47] | Learning management systems for higher education: A brief comparison | Comparative analysis | Discover Education | Evaluation criteria based on SQTL (Software Quality and Teaching–Learning tools) | Paradiso and Moodle are the top-rated LMS based on SQTL criteria, with high scores in interoperability, accessibility, and learning tools. |
| [48] | Learners’ needs in online learning environments and third-generation learning management systems (LMS 3.0) | Qualitative, open-ended questionnaire, semi-structured interviews | Technology, Knowledge and Learning | Content analysis of questionnaire and interview data | Learners desire entertaining, self-monitoring LMS environments with gamification. Needs align with LMS 3.0, which can be developed using data mining and LA. |
| [49] | Examining learning management system success: A multiperspective framework | Quantitative, survey, structural equation modelling (SEM) | Education and Information Technologies | TAM3 and ISS framework, SEM analysis | LMS success depends on content, system, and output quality, leading to student satisfaction and perceived usefulness. User satisfaction negatively impacts system and output quality. |
| [50] | LearnSphere: A learning data and analytics cyberinfrastructure | Use-driven design, case studies | Journal of Educational Data Mining | LearnSphere, Tigris | LearnSphere facilitates discoveries about learning (active learning vs. passive, discussion board quality), supports research reproducibility, and enables workflow combinations for analytics. |
| [51] | Pragmatic monitoring of learning recession using log data of learning management systems and machine learning techniques | Analysis of system log data, machine learning application | International Journal of Education and Development using Information and Communication Technology | Machine learning techniques (unspecified) applied to LMS log data | System log data can be used for machine learning-based monitoring of students’ learning recession. Proposed indicators and visualisations for proactive intervention. |
| [52] | Evidence-based multimodal learning analytics for feedback and reflection in collaborative learning | Two-year longitudinal study, survey, Evaluation Framework for Learning Analytics | British Journal of Educational Technology | Multimodal Learning Analytics (MMLA) system | MMLA solution enhances feedback and reflection in collaborative learning. Positive perceptions from teachers and students, but complexity and qualitative measures need improvement. Importance of data accuracy, transparency, and privacy. |
| [53] | Students’ use of learning management systems and desired e-learning experiences: Are they ready for next generation digital learning environments? | Survey | Higher Education Research and Development | Analysis of LMS usage data | Students’ LMS usage for content and discussion correlates with desired engagement in student-centred e-learning. Students desire systems supporting content curation, group management, and mobile interoperability. |
| [54] | Architecting analytics across multiple e-learning systems to enhance learning design | Cross-platform architecture development, regression and classification techniques | IEEE Transactions on Learning Technologies | Cross-platform architecture for integrating data from multiple e-learning systems | Combining data across multiple e-learning systems improves classification accuracy. Cross-platform analytics provides broader insights into learner behaviour. |
| [55] | Utilizing clickstream data to reveal the time management of self-regulated learning in a higher education online learning environment | Analysis of clickstream data, learning analytics | Interactive Learning Environments | StarC system log analysis | Clickstream data reveals time management aspects of self-regulated learning (SRL). Differences in time management among students with varying academic performance. |
| [56] | Effectiveness of machine learning algorithms on predicting course level outcomes from learning management system data | Quantitative comparative study, machine learning algorithm comparison | ProQuest Dissertations and Theses Global (Doctoral Dissertation) | Naive Bayes, decision tree, neural network, support vector machine | Decision tree effectively predicts students with poor course outcomes using LMS data. Decision trees outperformed other algorithms. |
| [57] | Detecting learning strategies with analytics: Links with self-reported measures and academic performance | Analysis of trace data, correlation with self-reported measures | Journal of Learning Analytics | Analysis of trace data from a flipped classroom environment | Learning strategies extracted from trace data correlate with deep and surface approaches to learning. Deep approach to learning correlates with higher academic performance. |
| [58] | Individual differences related to college students’ course performance in Calculus II | Dominance analysis, correlation analysis | Journal of Learning Analytics | Analysis of LMS data, discussion forum data, quiz attempts | Math importance, approximate number system (ANS) ability, discussion forum posting, and workshop submission time are significant predictors of final grades in Calculus II. |
| [59] | How flexible is your data? A comparative analysis of scoring methodologies across learning platforms in the context of group differentiation | Comparative analysis of scoring methodologies, resampling approach | Journal of Learning Analytics | Analysis of ASSISTments and Cognitive Tutor data | Partial credit scoring offers more efficient group differentiation than binary accuracy measures in learning platforms. Partial credit increases analytic power. |
| [60] | Designing Analytics for Collaboration Literacy and Student Empowerment | Survey | Journal of Learning Analytics | BLINC (collaborative analytics tool) | Student collaboration concerns fall into seven dimensions: Climate, Compatibility, Communication, Conflict, Context, Contribution, and Constructive. These dimensions should inform collaboration analytics design. |
| [61] | A Novel Deep Learning Model for Student Performance Prediction Using Engagement Data | Deep learning model development and evaluation | Journal of Learning Analytics | ASIST (Attention-aware convolutional Stacked BiLSTM network) | ASIST, a deep learning model, predicts student performance using engagement data from VLEs. It outperforms baseline models. |
| [62] | Utilizing Student Time Series Behaviour in Learning Management Systems for Early Prediction of Course Performance | Deep learning approach, comparison with machine learning classifiers | Journal of Learning Analytics | LSTM (Long Short-Term Memory) networks | LSTM networks effectively predict course performance using LMS time series data, outperforming traditional machine learning classifiers. |
| [63] | The Positive Impact of Deliberate Writing Course Design on Student Learning Experience and Performance | Analysis of LMS data, correlation with course design decisions | Journal of Learning Analytics | Analysis of LMS usage data | Course design influences learner interaction patterns. Discussion entry length predicts final grades, highlighting the impact of writing practice. |
| [64] | Privacy-driven Design of Learning Analytics Applications—Exploring the Design Space of Solutions for Data Sharing and Interoperability | Conceptual model development | Journal of Learning Analytics | Learning Analytics Design Space model | Privacy-driven design is crucial for learning analytics systems. The Learning Analytics Design Space model aids in designing privacy-conscious solutions. |
| [65] | RiPPLE: A Crowdsourced Adaptive Platform for Recommendation of Learning Activities | Platform development and pilot study | Journal of Learning Analytics | RiPPLE (Recommendation in Personalised Peer-Learning Environments) | RiPPLE, a crowdsourced adaptive platform, recommends personalised learning activities and shows measurable learning gains. |
| [66] | Leveraging learning analytics for student reflection and course evaluation | Faculty utilisation of learning analytics, curriculum evaluation | Journal of Applied Research in Higher Education | Learning analytics tools within LMS | Learning analytics enables student reflection, remediation, and curriculum evaluation, providing detailed data for stakeholders. |
| [67] | Applying learning analytics for the early prediction of students’ academic performance in blended learning | Predictive modelling, principal component regression | Educational Technology and Society | Analysis of LMS data, video-viewing, practice behaviours, homework, quizzes | Learning analytics predicts student performance in blended learning. Online and traditional factors contribute to prediction accuracy. |
| [68] | Learning management system and course influences on student actions and learning experiences | Comparative study of LMS and course influences | Educational Technology Research and Development | Analysis of LMS usage data | Course type and LMS design influence student actions and experiences. Discussion-focused systems increase perceived learning support. |
| [69] | Toward Precision Education: Educational Data Mining and Learning Analytics for Identifying Students’ Learning Patterns with Ebook Systems | Clustering approach, analysis of ebook system data | Educational Technology and Society | Analysis of ebook system data | Clustering identifies subgroups of students with different learning patterns. Learning patterns correlate with learning outcomes. |
| [70] | A Bayesian Classification Network-based Learning Status Management System in an Intelligent Classroom | System development and experiment | Educational Technology and Society | Bayesian classification network, sensor technology, image recognition | Learning status management system using sensors and image recognition. Bayesian network infers student learning status, with feedback to teachers and students. |
| [71] | Student perceptions of privacy principles for learning analytics | Exploratory study, survey | Educational Technology Research and Development | Analysis of student perceptions | Students desire adaptive, personalised dashboards in learning analytics systems but are conservative about data sharing. Stakeholder involvement is crucial for successful implementation. |
| [72] | Fostering evidence-based education with learning analytics: Capturing teaching-learning cases from log data | System development, case studies | Educational Technology and Society | Learning analytics framework, statistical modelling of learning logs | Automated capture of teaching–learning cases (TLCs) using learning analytics. Statistical modelling identifies intervention effectiveness. |

References

  1. Marks, A.; Al-Ali, M. Analytics within UAE higher education context. In Proceedings of the 2016 3rd MEC International Conference on Big Data and Smart City (ICBDSC), Muscat, Oman, 15–16 March 2016; pp. 1–6. [Google Scholar] [CrossRef]
  2. Saleh, A.M.; Abuaddous, H.Y.; Alansari, I.S.; Enaizan, O. The Evaluation of User Experience on Learning Management Systems Using UEQ. Int. J. Emerg. Technol. Learn. 2022, 17, 145–162. [Google Scholar] [CrossRef]
  3. Mohd Kasim, N.N.; Khalid, F. Choosing the Right Learning Management System (LMS) for the Higher Education Institution Context: A Systematic Review. Int. J. Emerg. Technol. Learn. 2016, 11, 55–61. [Google Scholar] [CrossRef]
  4. de Kock, E.; van Biljon, J.; Botha, A. User Experience of Academic Staff in the Use of a Learning Management System Tool. In Proceedings of SAICSIT ’16: Annual Conference of the South African Institute of Computer Scientists and Information Technologists, Johannesburg, South Africa, 26–28 September 2016; pp. 1–10. [Google Scholar] [CrossRef]
  5. Maslov, I.; Nikou, S.; Hansen, P. Exploring user experience of learning management system. Int. J. Inf. Learn. Technol. 2021, 38, 344–363. [Google Scholar] [CrossRef]
  6. Arqoub, M.A.; El-Khalili, N.; Hasan, M.A.-S.; Banna, A.A. Extending Learning Management System for Learning Analytics. In Proceedings of the 2022 International Conference on Business Analytics for Technology and Security (ICBATS), Dubai, United Arab Emirates, 16–17 February 2022; pp. 1–6. [Google Scholar] [CrossRef]
  7. Hernández-Leo, D.; Martinez-Maldonado, R.; Pardo, A.; Muñoz-Cristóbal, J.A.; Rodríguez-Triana, M.J. Analytics for learning design: A layered framework and tools. Br. J. Educ. Technol. 2019, 50, 139–152. [Google Scholar] [CrossRef]
  8. Society for Learning Analytics Research. What Is Learning Analytics? 2024. Available online: https://www.solaresearch.org/about/what-is-learning-analytics/ (accessed on 19 November 2024).
  9. Ismail, S.N.; Hamid, S.; Ahmad, M.; Alaboudi, A.; Jhanjhi, N. Exploring Students Engagement Towards the Learning Management System (LMS) Using Learning Analytics. Comput. Syst. Sci. Eng. 2021, 37, 73–87. [Google Scholar] [CrossRef]
  10. Viberg, O.; Hatakka, M.; Bälter, O.; Mavroudi, A. The current landscape of learning analytics in higher education. Comput. Hum. Behav. 2018, 89, 98–110. [Google Scholar] [CrossRef]
  11. Goode, C.; Terry, A.; Harlow, H.; Cash, R. Mining for Gold: Learning Analytics and Design for Learning: A Review. Scope Teach. Learn. 2021, 7474, 10. [Google Scholar] [CrossRef]
  12. Tzimas, D.; Demetriadis, S.N. The Impact of Learning Analytics on Student Performance and Satisfaction in a Higher Education Course. In Proceedings of the Educational Data Mining, Paris, France, 29 June–2 July 2021; Available online: https://api.semanticscholar.org/CorpusID:247321827 (accessed on 18 November 2024).
  13. Mkpojiogu, E.O.C.; Okeke-Uzodike, O.E.; Emmanuel, E.I. Quality Attributes for an LMS Cognitive Model for User Experience Design and Evaluation of Learning Management Systems. In Proceedings of the 3rd International Conference on Integrated Intelligent Computing Communication & Security (ICIIC 2021), Bangalore, India, 6–7 August 2021; Atlantis Highlights in Computer Sciences. Atlantis Press: Dordrecht, The Netherlands, 2021; Volume 4. [Google Scholar]
  14. Almusharraf, A.I. An Investigation of University Students’ Perceptions of Learning Management Systems: Insights for Enhancing Usability and Engagement. Sustainability 2024, 16, 10037. [Google Scholar] [CrossRef]
  15. Maluleke, A.F. Enhancing Learning Analytics through Learning Management Systems Engagement in African Higher Education. J. Educ. Learn. Technol. 2024, 5, 130–149. [Google Scholar] [CrossRef]
  16. Ncube, M.M.; Ngulube, P. Optimising Data Analytics to Enhance Postgraduate Student Academic Achievement: A Systematic Review. Educ. Sci. 2024, 14, 1263. [Google Scholar] [CrossRef]
  17. El Alfy, S.; Marx Gómez, J.; Dani, A. Exploring the benefits and challenges of learning analytics in higher education institutions: A systematic literature review. Inf. Discov. Deliv. 2019, 47, 25–34. [Google Scholar] [CrossRef]
  18. Samuelsen, J.; Chen, W.; Wasson, B. Integrating multiple data sources for learning analytics—Review of literature. Res. Pract. Technol. Enhanc. Learn. 2019, 14, 11. [Google Scholar] [CrossRef]
  19. Pan, Z.; Biegley, L.; Taylor, A.; Zheng, H. A Systematic Review of Learning Analytics: Incorporated Instructional Interventions on Learning Management Systems. J. Learn. Anal. 2024, 11, 52–72. [Google Scholar] [CrossRef]
  20. Adeniran, I.A.; Efunniyi, C.P.; Osundare, O.S.; Abhulimen, A.O. Integrating data analytics in academic institutions: Enhancing research productivity and institutional efficiency. Int. J. Sch. Res. Multidiscip. Stud. 2024, 5, 77–87. [Google Scholar] [CrossRef]
  21. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. J. Clin. Epidemiol. 2021, 134, 178–189. [Google Scholar] [CrossRef] [PubMed]
  22. Pieper, D.; Rombey, T. Where to prospectively register a systematic review. Syst. Rev. 2022, 11, 8. [Google Scholar] [CrossRef]
  23. MacFarlane, A.; Russell-Rose, T.; Shokraneh, F. Search Strategy Formulation for Systematic Reviews: Issues, Challenges and Opportunities. Intell. Syst. Appl. 2022, 15, 200091. [Google Scholar] [CrossRef]
  24. Methley, A.M.; Campbell, S.; Chew-Graham, C.; McNally, R.; Cheraghi-Sohi, S. PICO, PICOS and SPIDER: A Comparison Study of Specificity and Sensitivity in Three Search Tools for Qualitative Systematic Reviews. BMC Health Serv. Res. 2014, 14, 579. [Google Scholar] [CrossRef]
  25. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef]
  26. Rayyan. Faster Systematic Reviews. 2024. Available online: https://www.rayyan.ai/ (accessed on 7 December 2024).
  27. ASReview. Join the Movement Towards Fast, Open, and Transparent Systematic Reviews. 2024. Available online: https://asreview.nl/ (accessed on 9 December 2024).
  28. Critical Appraisal Skills Programme (CASP). CASP Checklists. 2024. Available online: https://casp-uk.net/casp-tools-checklists/ (accessed on 13 December 2024).
  29. Li, M.; Gao, Q.; Yu, T. Kappa Statistic Considerations in Evaluating Inter-Rater Reliability between Two Raters: Which, When and Context Matters. BMC Cancer 2023, 23, 799. [Google Scholar] [CrossRef]
  30. Mandrekar, J.N. Measures of Interrater Agreement. Biostat. Clin. 2011, 6, 6–7. [Google Scholar] [CrossRef] [PubMed]
  31. CADIMA. Evidence Synthesis Tool and Database. 2025. Available online: https://www.cadima.info/ (accessed on 17 January 2025).
  32. Park, Y.; Jo, I.-H. Using log variables in a learning management system to evaluate learning activity using the lens of activity theory. Assess. Eval. High. Educ. 2017, 42, 531–547. [Google Scholar] [CrossRef]
  33. Tempelaar, D.T.; Rienties, B.; Nguyen, Q. Towards actionable learning analytics using dispositions. IEEE Trans. Learn. Technol. 2017, 10, 6–17. [Google Scholar] [CrossRef]
  34. Sluijs, M.; Matzat, U. Predicting time-management skills from learning analytics. J. Comput. Assist. Learn. 2024, 40, 525–537. [Google Scholar] [CrossRef]
  35. Olaleye, S.; Agjei, R.; Jimoh, B.; Adoma, P. Evaluation of usability in Moodle learning management system through analytics graphs: University of Applied Sciences teacher’s perspective in Finland. Int. J. Educ. Dev. Using Inf. Commun. Technol. 2023, 19, 85–107. Available online: http://files.eric.ed.gov/fulltext/EJ1413526.pdf (accessed on 11 October 2024).
  36. Liu, Q.; Gladman, T.; Muir, J.; Wang, C.; Grainger, R. Analytics-informed design: Exploring visualization of learning management systems recorded data for learning design. SAGE Open 2023, 13, 1–10. [Google Scholar] [CrossRef]
  37. Alshammari, A. Using analytics to predict students’ interactions with learning management systems in online courses. Educ. Inf. Technol. 2024, 29, 20587–20612. [Google Scholar] [CrossRef]
  38. Rosa, M.J.; Williams, J.; Claeys, J.; Kane, D.; Bruckmann, S.; Costa, D.; Rafael, J.A. Learning analytics and data ethics in performance data management: A bench learning exercise involving six European universities. Qual. High. Educ. 2022, 28, 65–81. [Google Scholar] [CrossRef]
  39. Doo, M.Y.; Park, Y. Pre-class learning analytics in flipped classroom: Focusing on resource management strategy, procrastination and repetitive learning. J. Comput. Assist. Learn. 2024; advance online publication. [Google Scholar] [CrossRef]
  40. Fuller, J.; Lokey-Vega, A. From data to action: Faculty experiences with a university-designed learning analytics system. Int. J. E-Learn. 2024, 23, 471–487. Available online: https://www.learntechlib.org/primary/p/225169/ (accessed on 11 November 2024). [CrossRef]
  41. Villalobos, E.; Hilliger, I.; Gonzalez, C.; Celis, S.; Pérez-Sanagustín, M.; Broisin, J. The mediating role of learning analytics: Insights into student approaches to learning and academic achievement in Latin America. J. Learn. Anal. 2024, 11, 6–20. Available online: http://files.eric.ed.gov/fulltext/EJ1423426.pdf (accessed on 11 November 2024). [CrossRef]
  42. West, P.; Paige, F.; Lee, W.; Watts, N.; Scales, G. Using learning analytics and student perceptions to explore student interactions in an online construction management course. J. Civ. Eng. Educ. 2022, 148, 05022001. [Google Scholar] [CrossRef]
  43. Webb, N.L. Student Device Usage, Learning Management System Tool Usage, and Self-Regulated Learning (Publication No. 30310232). Doctoral Dissertation, University of Nevada, Las Vegas, NV, USA, 2023. ProQuest Dissertations & Theses Global. Available online: http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqm&rft_dat=xri:pqdiss:29996357 (accessed on 11 November 2024).
  44. Moses, P.S. Leveraging Complexity Science to Promote Learning Analytics Adoption in Higher Education: An Embedded Case Study (Publication No. 30814981). Doctoral Dissertation, University of Maryland, College Park, MD, USA, 2023. ProQuest Dissertations & Theses Global. Available online: https://www.proquest.com/docview/3113526225 (accessed on 10 November 2024).
  45. Le, D.; Pole, A. Beyond learning management systems: Teaching digital fluency. J. Polit. Sci. Educ. 2023, 19, 134–153. [Google Scholar] [CrossRef]
  46. Garbers, S.; Crinklaw, A.D.; Brown, A.S.; Russell, R. Increasing student engagement with course content in graduate public health education: A pilot randomized trial of behavioral nudges. Educ. Inf. Technol. 2023, 28, 13405–13421. [Google Scholar] [CrossRef]
  47. Sanchez, L.; Penarreta, J.; Poma, X.S. Learning management systems for higher education: A brief comparison. Discov. Educ. 2024, 3, 58. [Google Scholar] [CrossRef]
  48. Sahin, M.; Yurdugül, H. Learners’ needs in online learning environments and third generation learning management systems (LMS 3.0). Technol. Knowl. Learn. 2022, 27, 33–48. [Google Scholar] [CrossRef]
  49. Becirovic, S. Examining learning management system success: A multiperspective framework. Educ. Inf. Technol. 2024, 29, 11675–11699. [Google Scholar] [CrossRef]
  50. Stamper, J.; Moore, S.; Rosé, C.P.; Pavlik, P.I., Jr.; Koedinger, K. LearnSphere: A learning data and analytics cyberinfrastructure. J. Educ. Data Min. 2024, 16, 141–163. [Google Scholar]
  51. Kalegele, K. Pragmatic monitoring of learning recession using log data of learning management systems and machine learning techniques. Int. J. Educ. Dev. Using Inf. Commun. Technol. 2023, 19, 177–190. [Google Scholar]
  52. Yan, L.; Echeverria, V.; Jin, Y.; Fernandez-Nieto, G.; Zhao, L.; Li, X.; Alfredo, R.; Swiecki, Z.; Gašević, D.; Martinez-Maldonado, R. Evidence-based multimodal learning analytics for feedback and reflection in collaborative learning. Br. J. Educ. Technol. 2024, 55, 1900–1925. [Google Scholar] [CrossRef]
  53. Koh, J.H.L.; Kan, R.Y.P. Students’ use of learning management systems and desired e-learning experiences: Are they ready for next generation digital learning environments? High. Educ. Res. Dev. 2021, 40, 995–1010. [Google Scholar] [CrossRef]
  54. Mangaroska, K.; Vesin, B.; Kostakos, V.; Brusilovsky, P.; Giannakos, M.N. Architecting analytics across multiple e-learning systems to enhance learning design. IEEE Trans. Learn. Technol. 2021, 14, 173–188. [Google Scholar] [CrossRef]
  55. Cao, T.; Zhang, Z.; Chen, W.; Shu, J. Utilizing clickstream data to reveal the time management of self-regulated learning in a higher education online learning environment. Interact. Learn. Environ. 2023, 31, 6555–6572. [Google Scholar] [CrossRef]
  56. Ashby, M.W. Effectiveness of Machine Learning Algorithms on Predicting Course Level Outcomes from Learning Management System Data (Publication No. 30182602). Doctoral Dissertation, National University, San Diego, CA, USA, 2022. ProQuest Dissertations & Theses Global. Available online: https://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqm&rft_dat=xri:pqdiss:31241309 (accessed on 18 November 2024).
  57. Gašević, D.; Jovanović, J.; Pardo, A.; Dawson, S. Detecting learning strategies with analytics: Links with self-reported measures and academic performance. J. Learn. Anal. 2017, 4, 10–27. [Google Scholar] [CrossRef]
  58. Hart, S.; Daucourt, M.; Ganley, C. Individual differences related to college students’ course performance in Calculus II. J. Learn. Anal. 2017, 4, 28–44. [Google Scholar] [CrossRef]
  59. Ostrow, K.S.; Wang, Y.; Heffernan, N.T. How flexible is your data? A comparative analysis of scoring methodologies across learning platforms in the context of group differentiation. J. Learn. Anal. 2017, 4, 1–9. [Google Scholar] [CrossRef]
  60. Worsley, M.; Anderson, K.; Melo, N.; Jang, J.Y. Designing Analytics for Collaboration Literacy and Student Empowerment. J. Learn. Anal. 2021, 8, 30–48. [Google Scholar] [CrossRef]
  61. Fazil, M.; Rísquez, A.; Halpin, C. A Novel Deep Learning Model for Student Performance Prediction Using Engagement Data. J. Learn. Anal. 2024, 11, 23–41. [Google Scholar] [CrossRef]
  62. Chen, F.; Cui, Y. Utilizing Student Time Series Behaviour in Learning Management Systems for Early Prediction of Course Performance. J. Learn. Anal. 2020, 7, 1–17. [Google Scholar] [CrossRef]
  63. Lancaster, A.; Moses, P.S.; Clark, M.; Masters, M.C. The Positive Impact of Deliberate Writing Course Design on Student Learning Experience and Performance. J. Learn. Anal. 2020, 7, 48–63. [Google Scholar] [CrossRef]
  64. Hoel, T.; Chen, W. Privacy-driven Design of Learning Analytics Applications—Exploring the Design Space of Solutions for Data Sharing and Interoperability. J. Learn. Anal. 2016, 3, 139–158. [Google Scholar] [CrossRef]
  65. Khosravi, H.; Kitto, K.; Williams, J.J. RiPPLE: A Crowdsourced Adaptive Platform for Recommendation of Learning Activities. J. Learn. Anal. 2019, 6, 91–105. [Google Scholar] [CrossRef]
  66. Ozdemir, D.; Opseth, H.M.; Taylor, H. Leveraging learning analytics for student reflection and course evaluation. J. Appl. Res. High. Educ. 2020, 12, 27–37. [Google Scholar] [CrossRef]
  67. Lu, O.H.T.; Huang, A.Y.Q.; Lin, A.J.Q.; Ogata, H.; Yang, S.J.H. Applying learning analytics for the early prediction of students’ academic performance in blended learning. Educ. Technol. Soc. 2018, 21, 220–232. Available online: https://www.jstor.org/stable/26388400 (accessed on 22 November 2024).
  68. Epp, C.D.; Phirangee, K.; Hewitt, J.; Perfetti, C.A. Learning management system and course influences on student actions and learning experiences. Educ. Technol. Res. Dev. 2020, 68, 3263–3297. [Google Scholar] [CrossRef]
  69. Yang, C.C.Y.; Chen, I.Y.L.; Ogata, H. Toward Precision Education: Educational Data Mining and Learning Analytics for Identifying Students’ Learning Patterns with Ebook Systems. Educ. Technol. Soc. 2021, 24, 152–163. [Google Scholar]
  70. Chiu, C.-K.; Tseng, J.C.R. A Bayesian Classification Network-based Learning Status Management System in an Intelligent Classroom. Educ. Technol. Soc. 2021, 24, 256–267. [Google Scholar]
  71. Ifenthaler, D.; Schumacher, C. Student perceptions of privacy principles for learning analytics. Educ. Technol. Res. Dev. 2016, 64, 923–938. [Google Scholar] [CrossRef]
  72. Kuromiya, H.; Majumdar, R.; Ogata, H. Fostering evidence-based education with learning analytics: Capturing teaching-learning cases from log data. Educ. Technol. Soc. 2020, 23, 14–29. [Google Scholar]
  73. Jin, Y.; Echeverria, V.; Yan, L.; Zhao, L.; Alfredo, R.; Tsai, Y.-S.; Gašević, D.; Martinez-Maldonado, R. FATE in MMLA: A Student-Centred Exploration of Fairness, Accountability, Transparency, and Ethics in Multimodal Learning Analytics. arXiv 2024, arXiv:2402.19071. [Google Scholar] [CrossRef]
  74. Kasun, M.; Ryan, K.; Paik, J.; Lane-McKinley, K.; Bodin Dunn, L.; Weiss Roberts, L.; Paik Kim, J. Academic machine learning researchers’ ethical perspectives on algorithm development for health care: A qualitative study. J. Am. Med. Inform. Assoc. 2024, 31, 563–573. [Google Scholar] [CrossRef] [PubMed]
  75. Tan, F.Z.; Lim, J.Y.; Chan, W.H.; Idris, M.I.T. Computational intelligence in learning analytics: A mini review. ASEAN Eng. J. 2024, 14, 121–129. [Google Scholar] [CrossRef]
Figure 1. Flowchart and description of the literature search and study selection protocol (Adapted from PRISMA Statement, Page et al., 2021 [25]).
Table 1. Learning analytics tools and techniques used.
Tool/Technique Category | Key Observations | Studies
LMS Log Data Analysis (General) | Core of many studies; fundamental data for LA (a minimal sketch of this pipeline follows the table). | [32,41,44,45,46,53,55,56,60,63,65,66,67,68,69]
Statistical Analysis (Regression, Correlation, etc.) | Used to find trends and relationships, as well as to make predictions. | [33,34,37,39,49,58,64,72]
Survey Data Analysis | Combines behavioural data with self-reported experiences. | [33,39,42,43,52,53,59,71]
Machine Learning (ML) | Increasingly used for predictive modelling and pattern recognition. | [51,56,61,62]
Deep Learning (Long Short-Term Memory networks, etc.) | Demonstrates the application of advanced AI techniques. | [50,57,61,62]
Qualitative Analysis (Interviews, Questionnaires) | Provides context and deeper insights into user experiences. | [36,38,48]
Visualisation Dashboards | Aids in data interpretation and decision making. | [36,40]
Moodle-Specific Tools | Highlights the use of specific LMS features for analytics. | [35,47]
Cross-Platform Data Integration | Focuses on combining data from different learning systems. | [54]
Bayesian Networks | Utilises probabilistic models for learning status inference. | [70]
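Several of the rows in Table 1 describe variations of one recurring pipeline: raw LMS log events are aggregated into per-student engagement features, which then feed a statistical or machine-learning model. The following minimal sketch, in Python with pandas and scikit-learn, illustrates that pattern; the event types, column names, and synthetic data are hypothetical stand-ins rather than the design of any reviewed study.

```python
# A minimal sketch of the recurring pipeline behind Table 1: aggregate raw
# LMS log events into per-student engagement features, then fit a simple
# predictive model. Event types, columns, and data are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Synthetic stand-in for an LMS event log (one row per logged interaction).
n_events = 5000
logs = pd.DataFrame({
    "student_id": rng.integers(0, 200, n_events),
    "event_type": rng.choice(
        ["view_resource", "post_forum", "submit_assignment"], n_events),
    "duration_s": rng.exponential(120.0, n_events),
})

# Feature engineering: per-student counts of each event type plus total time.
features = (
    logs.pivot_table(index="student_id", columns="event_type",
                     aggfunc="size", fill_value=0)
        .join(logs.groupby("student_id")["duration_s"].sum())
)

# Hypothetical pass/fail outcome, loosely coupled to engagement for the demo.
signal = features["submit_assignment"] + 0.01 * features["duration_s"]
passed = (signal + rng.normal(0.0, 2.0, len(features)) > signal.median())

X_train, X_test, y_train, y_test = train_test_split(
    features, passed.astype(int), test_size=0.3, random_state=0)

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out AUC: {auc:.3f}")
```

Logistic regression stands in here only as the simplest of the predictive techniques listed; the reviewed studies themselves range from regression models to LSTM-based deep learning.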
Table 2. Focus on Learning Management Systems specificity.
LMS Platform/Category | Key Research Themes/Observations | Studies
General LMS Usage/Data | Studies examining broad LMS usage patterns, general educational data, or theoretical concepts without specifying a particular LMS platform. | [33,36,37,38,39,40,41,44,45,46,47,48,49,50,51,52,53,54,56,57,58,59,60,61,62,63,64,66,67,68,69,70,71,72]
Moodle | Investigations into the utilisation of specific Moodle features and plugins, emphasising platform-specific functionality (a harmonisation sketch follows the table). | [32,35,47]
Canvas | Analyses of Canvas functionalities and the impact of Canvas system implementations on course design and delivery. | [34,42,43]
StarC | Focused analysis of clickstream data generated within the StarC LMS, highlighting user interaction patterns. | [55]
RiPPLE | Application-focused study of RiPPLE, a personalised peer-learning environment, emphasising its unique pedagogical implementation. | [65]
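Because the reviewed studies draw on platform-specific data sources (Moodle logs, Canvas analytics, StarC clickstreams), any cross-platform analysis first requires harmonising event records into a common schema. The sketch below illustrates one such mapping; Moodle's backslash-namespaced event names and Unix timestamps follow its standard log store conventions, while the Canvas-style field names and the shared action vocabulary are assumptions made for illustration.

```python
# A hypothetical sketch of harmonising event records from two LMS platforms
# into one schema, a precondition for cross-platform analyses. The Moodle
# eventname/timecreated fields mirror its standard log store; the
# Canvas-style fields and the action vocabulary are assumed.
import pandas as pd

moodle = pd.DataFrame({
    "userid": [1, 2],
    "eventname": ["\\mod_forum\\event\\post_created",
                  "\\mod_assign\\event\\submission_created"],
    "timecreated": [1714039200, 1714042800],  # Unix seconds
})
canvas = pd.DataFrame({
    "user_id": [3],
    "event_type": ["discussion_entry_created"],  # assumed label
    "created_at": ["2024-04-25T10:00:00Z"],
})

# Map platform-specific event labels onto a shared action vocabulary (assumed).
MOODLE_MAP = {"\\mod_forum\\event\\post_created": "forum_post",
              "\\mod_assign\\event\\submission_created": "assignment_submit"}
CANVAS_MAP = {"discussion_entry_created": "forum_post"}

unified = pd.concat([
    pd.DataFrame({
        "student_id": moodle["userid"],
        "action": moodle["eventname"].map(MOODLE_MAP),
        "timestamp": pd.to_datetime(moodle["timecreated"], unit="s", utc=True),
        "platform": "moodle",
    }),
    pd.DataFrame({
        "student_id": canvas["user_id"],
        "action": canvas["event_type"].map(CANVAS_MAP),
        "timestamp": pd.to_datetime(canvas["created_at"], utc=True),
        "platform": "canvas",
    }),
], ignore_index=True)

print(unified.sort_values("timestamp"))
```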
Table 3. Challenges hindering successful LA integration for LMS UX improvement.
Challenge Category | Specific Challenges Identified | Impact on UX Improvement | Studies
Complexity and Implementation Issues | Learning curve for faculty; LMS integration difficulties; complexities of multimodal learning analytics (MMLA) systems and of implementing complex adaptive systems. | Creates barriers to user adoption and hinders the development of intuitive and seamless UX solutions. | [40,44,48,52]
Data Interpretation and Actionable Insights | Difficulty in deriving actionable insights from data; lack of effective visualisations; need for qualitative assessments and for translating data into applicable improvements. | Leads to inefficient use of LA; reduces the ability to transform data into meaningful UX enhancements that positively impact learning outcomes. | [36,46,66,71]
Data Privacy and Ethical Concerns | Lack of clarity on data usage; concerns about data sharing, compliance with ethical guidelines, and maintenance of transparency, data accuracy, and privacy (a suppression sketch follows the table). | Limits stakeholder trust in LA systems; inhibits data sharing, thus impacting the ability to create personalised and effective UX improvements. | [38,52,64]
Lack of Transferability and Generalisability | Models not readily transferable between courses; variability in course design affecting learning patterns; context-specific results. | Restricts scalability and limits the creation of generalised UX improvements that benefit diverse user populations. | [34,41,63]
User Preferences and Personalisation | Balancing user desires for personalised dashboards against concerns about data sharing; defining desired engagement and system feature requests; content curation versus monitoring. | Challenges in designing personalised UX solutions that resonate with diverse user preferences and address the complexities of individual customisation within data constraints. | [48,53,71]
Technology and System Quality | System output and quality issues regarding integration and interoperability between LMSs, data accuracy, and system stability. | Undermines user confidence in the systems and reduces the reliability of the information used to drive UX improvements. | [47,49,52]
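Several of the privacy and personalisation challenges in Table 3 can be partially mitigated at the dashboard layer, for example by suppressing aggregates computed over very small groups so that no individual student can be inferred from a group average. The sketch below shows a minimal k-anonymity-style guard of this kind; the threshold, field names, and data are assumptions for illustration, and a real deployment would pair such suppression with the broader consent and transparency safeguards examined in the privacy-focused studies above.

```python
# A minimal k-anonymity-style guard for an analytics dashboard: group
# statistics are suppressed when fewer than K_MIN students contribute,
# so no individual score can be inferred from a small-group average.
# Threshold, field names, and data are assumptions for illustration.
import pandas as pd

K_MIN = 5  # minimum group size before a statistic may be displayed

grades = pd.DataFrame({
    "module": ["A", "A", "A", "A", "A", "A", "B", "B"],
    "score": [55, 62, 71, 48, 90, 66, 81, 77],
})

summary = grades.groupby("module")["score"].agg(["count", "mean"])
# Blank out means for under-sized groups before rendering the dashboard.
summary.loc[summary["count"] < K_MIN, "mean"] = float("nan")
print(summary.rename(columns={"mean": "displayed_mean"}))
```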