Article

Implementing Learning Analytics in Education: Enhancing Actionability and Adoption

by Dimitrios E. Tzimas * and Stavros N. Demetriadis
School of Informatics, Aristotle University of Thessaloniki, 541 24 Thessaloniki, Greece
* Author to whom correspondence should be addressed.
Computers 2026, 15(1), 56; https://doi.org/10.3390/computers15010056
Submission received: 30 November 2025 / Revised: 8 January 2026 / Accepted: 11 January 2026 / Published: 14 January 2026
(This article belongs to the Special Issue Recent Advances in Computer-Assisted Learning (2nd Edition))

Abstract

The broader aim of this research is to examine how Learning Analytics (LA) can become ethically sound, pedagogically actionable, and realistically adopted in educational practice. To address this overarching challenge, the study investigates three interrelated research questions: ethics by design, learning impact, and adoption conditions. Methodologically, the research follows an exploratory sequential multi-method design. First, a meta-synthesis of 53 studies is conducted to identify key ethical challenges in LA and to derive an ethics-by-design framework. Second, a quasi-experimental study examines the impact of interface-based LA guidance (strong versus minimal) on students’ self-regulated learning skills and academic performance. Third, a mixed-methods adoption study, combining surveys, focus groups, and ethnographic observations, investigates the factors that encourage or hinder teachers’ adoption of LA in K–12 education. The findings indicate that strong LA-based guidance leads to statistically significant improvements in students’ self-regulated learning skills and academic performance compared to minimal guidance. Furthermore, the adoption analysis reveals that performance expectancy, social influence, human-centred design, and positive emotions facilitate LA adoption, whereas effort expectancy, limited facilitating conditions, ethical concerns, and cultural resistance inhibit it. Overall, the study demonstrates that ethics by design, effective pedagogical guidance, and adoption conditions are mutually reinforcing dimensions. It argues that LA can support intelligent, responsive, and human-centred learning environments when ethical safeguards, instructional design, and stakeholder involvement are systematically aligned.

1. Introduction

Learning Analytics (LA) has emerged as a global field of research and practice, attracting substantial investment and policy attention across higher education and K-12. Despite this global market for learning analytics, recent multi-institutional studies reveal that classroom teachers rarely use data dashboards to guide their instruction [1,2]. This contrast exposes a critical paradox: education invests in data tools yet fails to integrate them into practice, suggesting that many LA tools measure activity but do not generate actionable insights [3,4].
This research contributes to the broader field of decision-making through LA in education. Learning analytics is an interdisciplinary practice that collects, analyses, and visualises large-scale learning data to offer real-time feedback and sustain engagement, integrating multiple disciplines to enhance the interaction between technology and humans in educational contexts. In recent years, ethical issues arising from the use of LA have garnered research interest. Critics argue that ethical concerns surrounding LA may hinder its effective implementation, limiting its potential benefits for students and educators [5,6], yet the literature on addressing these issues in LA teaching interventions remains limited [7]. Another area of interest among educational researchers is the guidance and influence of LA on the learning process [8,9,10,11]. Furthermore, researchers suggest that LA adoption remains limited [4,12,13,14,15]. Overall, the identified research gaps concern (a) specific strategies to address ethical challenges in LA, (b) measurable outcomes from successful LA implementation in educational settings, and (c) how to improve training for educators to adopt and effectively utilise LA in their teaching practices.
Therefore, the main problem statement is: “What conditions are required for LA interventions—designed around ethical safeguards and instructional guidance—to become pedagogically actionable and realistically adopted in formal educational settings?” Rather than attempting to cover the entire LA landscape exhaustively, this study focuses on three tightly scoped and complementary contributions: (a) ethics-by-design principles, (b) interface-based guidance effects, and (c) teacher adoption conditions. Accordingly, the research questions (RQs) addressed in this paper are as follows.
First, to address the gap in ethical challenges, we ask (RQ1): How can a framework for ethics by design be developed to address ethical concerns in learning analytics implementation?
Second, to respond to the limited evidence of measurable outcomes, we ask (RQ2): What is the impact of interface-based learning analytics guidance (strong versus minimal) on students’ self-regulated learning skills and academic performance?
Concerning the second research question, we propose two hypotheses.
Hypothesis 1.
The self-regulated learning skills of the experimental group do not differ significantly from those of the control group.
Hypothesis 2.
The academic performance of the experimental group, as measured by course grade, does not differ significantly from that of the control group.
Third, to respond to the challenge of limited teacher training and adoption, we ask (RQ3): What factors encourage or hinder the adoption of learning analytics in educational environments?
Overall, this investigation aims to examine the ethical and pedagogical challenges presented by LA to better understand its impact on learning outcomes. Furthermore, the paper probes the frontiers of educational technology by emphasising the actionability and adoption of LA, while also addressing related ethical issues. In addition, this study makes three main contributions: (a) a synthesised ethics-by-design framework for LA, derived from a meta-synthesis of 53 studies; (b) causal evidence from a quasi-experimental study showing that strong LA-based guidance improves SRL skills and academic performance compared with minimal guidance; and (c) a multi-layer adoption model for K-12 teachers, extending the Unified Theory of Acceptance and Use of Technology (UTAUT) with emotional, cultural, and human-centred constructs, validated through mixed methods (surveys, focus groups, and ethnographic observations).
This study builds on a series of prior empirical investigations conducted by the authors; however, it goes beyond reproducing earlier results. Specifically, while RQ1 and RQ2 draw on previously published empirical work, they are re-analysed and re-contextualised here to support a unified investigation of learning analytics actionability and adoption. The empirical data for RQ3 are newly collected, and the primary contribution of this article lies in the integrative synthesis of ethical design (RQ1), pedagogical effectiveness (RQ2), and adoption mechanisms (RQ3) into a single coherent socio-technical framework.
Concerning the structure of the paper, Section 2 describes the methods, Section 3 reviews ethical challenges and introduces ethics by design, Section 4 examines the impact of interface-based guidance, and Section 5 analyses adoption factors.

2. Methods

To address the three research questions holistically, we adopted a multi-method design in which each component corresponds to one research question but also informs the others. Taken together, these approaches provide a cohesive picture of LA as both a technological and socio-cultural innovation. Specifically, we frame our study as a sequence of mutually informing phases: Phase 1 maps ethical issues and develops the ethics-by-design framework; Phase 2 tests whether LA can deliver actionable pedagogical impact through interface-based guidance; and Phase 3 investigates adoption conditions, with ethics and pedagogical values as inputs.
Regarding the methodology employed, this study analyses data collected in the authors’ earlier surveys [11,15,16]. We aim to revisit the original research questions through an alternative methodology and to explore additional aspects of them. Specifically, we conduct a meta-synthesis of mixed-methods surveys to deepen understanding, interpret findings, and generate new knowledge in the field of LA.
The research design used in this study aligns with the research questions: a literature meta-synthesis examined ethical issues, a quasi-experimental intervention evaluated the effects of strong versus minimal LA guidance on student outcomes, and a mixed-methods approach investigated factors influencing teachers’ adoption of LA. Each component is theoretically grounded in self-regulated learning theory and the UTAUT. The combination of quantitative and qualitative methods, including surveys, focus groups, and ethnographic observations, offers both breadth and depth, thereby enhancing the robustness and credibility of the findings. Notably, the intervention study extends beyond descriptive analysis by testing the causal effect of LA guidance on student learning outcomes and offering practical implementation insights.
Finally, ethical safeguards condition adoption, pedagogical effectiveness drives adoption, and adoption feeds back into the need for ethics by design. Hence, combining these approaches allows us to capture LA as a socio-technical ecosystem rather than a narrow intervention. Although each component could stand alone, their integration is necessary to address the core research problem of LA actionability and adoption, which cannot be understood through a single methodological lens.

Survey Designs and Instruments

This study employed multiple instruments: pre- and post-questionnaires, focus groups, and ethnographic observations. The following subsections describe the survey designs, item structure, participant characteristics, and analysis procedures, which together ensure transparency regarding the data collection and interpretation processes.
Instruments for RQ2—SRL Questionnaire (Pre/post)
To evaluate the impact of strong versus minimal LA guidance, a pre–post questionnaire measured students’ self-regulated learning (SRL) skills. Items used a 7-point Likert scale (1 = strongly disagree, 7 = strongly agree). Pre- and post-questionnaires were administered in weeks 1 and 13 of the semester. The internal consistency reliability across subscales ranged from Cronbach’s α = 0.73 to 0.87 in the original instrument; similar reliability was observed in this study. The questionnaire, adapted from SOL-Q-R, consisted of 42 items covering the subscales:
  • Metacognitive regulation (e.g., “I plan my study tasks before starting the lesson.”).
  • Time management (e.g., “I structure my study time effectively.”).
  • Perseverance (e.g., “I continue working even when the material is difficult.”).
  • Help-seeking (e.g., “I reach out for assistance when I do not understand a concept.”).
Regarding participants and procedure, 93 undergraduate students participated: 47 in the strong-guidance (SG) condition and 46 in the minimal-guidance (MG) condition. All participants were enrolled in the same degree programme, attended the same online course, and completed identical assignments. The only difference between groups was the level of LA-based guidance. ANCOVA was used to compare post-test SRL scores, with pre-test scores as covariates. Independent t-tests were used to compare course performance between groups.
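As an illustration of this analysis, the following Python sketch specifies the ANCOVA as a linear model with the guidance condition as the factor and the pre-test score as the covariate. It assumes a hypothetical tidy file srl_scores.csv with columns group, srl_pre, and srl_post; these names are illustrative and not taken from the original study.

```python
# Minimal ANCOVA sketch for the SG vs. MG comparison (hypothetical data layout).
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("srl_scores.csv")  # one row per student: group, srl_pre, srl_post

# Post-test SRL score modelled from the guidance condition plus the pre-test covariate.
model = smf.ols("srl_post ~ C(group) + srl_pre", data=df).fit()

# Type II ANOVA table: the C(group) row gives the group effect adjusted for pre-test.
print(sm.stats.anova_lm(model, typ=2))
```

An independent-samples t-test on course grades (scipy.stats.ttest_ind) would complete the pair of analyses described above.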
Instruments for RQ3—Teacher Survey Based on the UTAUT Questionnaire
The adoption study used a structured questionnaire adapted from the Unified Theory of Acceptance and Use of Technology (UTAUT). The survey was administered during a national webinar titled “Analysis of Learning Data in Education,” which attracted over 70 K-12 teachers from mathematics, literature, computer science, and economics. The instrument consisted of Likert-scale items grouped into the following constructs (a scoring sketch follows the list):
  • Performance expectancy (e.g., “LA can help me improve my teaching strategies.”).
  • Effort expectancy (e.g., “Learning to use LA tools would require significant effort.”).
  • Facilitating conditions (e.g., “My school provides adequate technical support for new digital tools.”).
  • Social influence (e.g., “My colleagues would encourage me to use LA.”).
  • Emotions (motivation, anxiety, technophobia).
  • Self-efficacy (e.g., “I feel confident in interpreting learning data.”).
  • Human-centredness (e.g., “LA tools should be designed with teacher participation.”).
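The scoring sketch below illustrates, in Python, how per-construct scores and internal consistency can be computed from such Likert items. The file name, item codes, and construct mapping are hypothetical; the published instrument’s exact item grouping is not reproduced here.

```python
# Hypothetical sketch: construct scores and Cronbach's alpha for Likert items.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a block of Likert items (one column per item)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

responses = pd.read_csv("utaut_survey.csv")  # one row per teacher

# Illustrative item-to-construct mapping (column names are invented).
constructs = {
    "performance_expectancy": ["pe1", "pe2", "pe3"],
    "effort_expectancy": ["ee1", "ee2", "ee3"],
    "facilitating_conditions": ["fc1", "fc2", "fc3"],
}
for name, cols in constructs.items():
    score = responses[cols].mean(axis=1)     # per-teacher construct score
    alpha = cronbach_alpha(responses[cols])  # internal consistency of the block
    print(f"{name}: mean = {score.mean():.2f}, alpha = {alpha:.2f}")
```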
Focus Groups & Ethnographic Study
Five focus groups were conducted immediately after the webinar, each lasting approximately 45–60 min. Guiding prompts included: “What motivates you to adopt LA in your teaching?” and “What barriers prevent you from applying LA in your school?” The ethnographic study included observations and interviews with five teachers. Observations focused on everyday teaching practices, communication channels, and attitudes toward LA. Interviews explored teachers’ values, perceived risks, technical readiness, and expectations for future use.
Regarding analysis procedures, qualitative data from focus groups and ethnographic interviews were analysed using thematic analysis, following open coding and iterative theme refinement. Quantitative survey data were analysed descriptively and used to triangulate the qualitative results, resulting in the combined adoption model presented in Figure 1. Where relevant, previously published datasets are reused to support cross-question synthesis rather than result replication, enabling theoretical integration across the three research questions.

3. Background—Ethical Challenges in Learning Analytics

To address the first research question (How can a framework for ethics by design be developed to address ethical concerns in learning analytics implementation?), a review was conducted to map the critical concerns across the dimensions of LA, particularly its ethics. In the age of big data, social networks, and cloud computing, every action is recorded, leaving a digital footprint that increases the volume and variety of educational learning data [17]. Ethics is a framework of moral principles that concern what is right for the individual and the community [18]; it can also be defined as a set of complex rules that vary across cultures [19]. The literature addresses numerous technological and pedagogical controversies that educational stakeholders encounter in LA. This section analyses these tensions by emphasising the diverse perspectives that must be considered.
For the review [16], an extensive survey of the LA literature was conducted to understand and document current trends in LA and the ethical issues that have arisen. Regarding the article selection process, after searching through more than 500 articles that initially met the selection criteria and examining their abstracts and results, we selected a corpus of 53 articles that covered the ethical principles of LA. Finally, a bottom-up comparative analysis of the selected literature resulted in a classification scheme that describes the dimensions of LA: Object of analysis, Backend data processing technology, Target of intervention, Stakeholders, and Ethics. To answer the first research question, we present the results as a list of instructional values related to data management goals, including the significant ethical issues of LA highlighted in the articles studied.
Labelling and Algorithmic Fairness. Although data-driven instruction offers the benefits of enhancing learning outcomes and reducing student attrition, it raises concerns about labelling learners, specifically the risk of students being unfairly stereotyped. Scholes [20] claims that the emphasis on self-regulation and personalisation may inadvertently overlook the diverse needs and learning styles of all students.
Therefore, instructors must ensure that their feedback does not discourage or manipulate students. Additionally, any LA intervention should be implemented according to a specific instructional design framework, such as self-regulated learning theory [21].
There are various reasons for errors in data analysis, such as misinterpretation of data and the use of misleading models [22]. Systems depend on data, so incomplete, noisy, or unrepresentative data or models can cause incorrect decisions. Standard statistical methods may be inaccurate when used on unstructured textual data. As a result, from a teaching perspective, the outcomes might be inconsistent for learners.
Data Privacy and Ownership. Privacy is a fundamental human need; however, a significant challenge posed by big data is its global and persistent nature. In the past, stakeholders have addressed privacy through trust; however, some stakeholders lack mutual trust in LA. In addition, many educational institutions do not control the storage of trainee data because it is managed outside their institutions or even outside the country where they are based, where different laws may apply [23]. Furthermore, views on privacy vary across cultures, and different countries have distinct perceptions of what constitutes ethical behaviour.
Data ownership is a complex legal and ethical issue in data management. The primary data belongs to its creator. However, in practice, the processed data no longer belongs to the learner. Ownership pertains to the data collected, the analytics used, and the resulting output of those analytics. Finally, Hoel et al. [24] referred to the “learners’ right to be forgotten,” which relates to minimising the data and limiting its use.
Transparency and Duty to Act. Consent is an ongoing process that involves the individual permitting data collection and making decisions based on the outcomes of data processing. Arnold and Sclater [25] stated that the ethical duty of educational institutions is to obtain the highest quality educational data to ensure they provide optimal support. This suggests that allowing learners to opt out could itself be ethically problematic, as opting out might create significant gaps in the dataset. Finally, Herder and Kawase [26] emphasised that knowledge and confidentiality are essential prerequisites for learners to provide informed consent for data collection.
The cost of studying is high due to tuition fees and the time and effort required. Therefore, from both managerial and pedagogical perspectives, educational institutions should actively support and motivate their students [20]. It is unethical to neglect the predictive value of managing learning data (e.g., performance) [27]. Additionally, Prinsloo and Slade [17] argued that educational institutions have an ethical duty to act when instructional data highlight the need for instructional interventions.
These ethical safeguards set the stage for testing whether LA can actually deliver actionable pedagogical value. Next, the role of LA-based guidance in the educational process and the factors that influence or impede its adoption in everyday educational practice are examined.

4. Actionable Guidance via Learning Analytics

Regarding the second research question (What is the impact of interface-based learning analytics guidance on students’ self-regulated learning skills and academic performance?), several studies [3,28] have reported a slight improvement in learning outcomes through LA. Our focus is on LA’s contribution to improved learning outcomes, and the specific question we address is whether strong (SG) versus minimal (MG) guidance [7,11] makes a practical difference. In particular, we emphasise the role of SG versus MG based on LA in the development of self-regulated learning skills and learning performance. Our intervention involves students in two comparative conditions. The MG group followed a low-prompting (reflection) approach, informing participants with visualised information. The SG group followed a high-prompting approach: the instructor implemented an intervention protocol that included posting a traffic-signal message indicating each student’s performance and facilitating online interactions among students and the instructor. The research objective is to investigate whether minimal and strong guidance based on the LA interface have the same impact on learning outcomes.
Participants and instructional context. The research [11] was carried out as part of an undergraduate course in the seventh semester. An IT department at a technological education institution in Greece offered this course online. We chose this course because of the high dropout and failure rates in previous semesters. According to our design, 93 students participated—47 formed the experimental group, which received the LA intervention with SG. The control group served as a baseline and comprised 46 students, who received the LA intervention with MG.

Results (RQ2)

Pre-analysis procedures. Before analysing group differences, all SRL subscale scores were screened for normality and internal consistency, and reliability was acceptable across scales (α = 0.73–0.87). Before conducting the ANCOVA, its assumptions were examined. Independence of observations was ensured by the study design, as participants were assigned to only one experimental condition and contributed a single set of measurements. Normality of residuals was assessed by visual inspection of Q–Q plots and the Shapiro–Wilk test, which indicated no substantial deviations from normality. Homogeneity of variances was evaluated using Levene’s test and found to be satisfactory. In addition, the assumption of homogeneity of regression slopes was examined by testing the interaction between the covariate (pre-test scores) and group membership, which was not statistically significant. These results indicate that the ANCOVA assumptions were adequately met. ANCOVA was then applied to compare post-intervention SRL scores between groups while controlling for pre-intervention measurements, and course performance was analysed using independent-samples t-tests.
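These assumption checks can be reproduced along the following lines, again assuming the hypothetical srl_scores.csv layout used earlier; this is a sketch rather than the authors’ actual analysis script.

```python
# Sketch of the ANCOVA assumption checks (hypothetical data layout).
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from scipy import stats

df = pd.read_csv("srl_scores.csv")
fit = smf.ols("srl_post ~ C(group) + srl_pre", data=df).fit()

# 1. Normality of residuals (Shapiro-Wilk; complement with a Q-Q plot).
print("Shapiro-Wilk:", stats.shapiro(fit.resid))

# 2. Homogeneity of variances across the SG and MG groups (Levene's test).
by_group = [g["srl_post"].to_numpy() for _, g in df.groupby("group")]
print("Levene:", stats.levene(*by_group))

# 3. Homogeneity of regression slopes: the group-by-covariate interaction
#    should be non-significant before the additive ANCOVA is interpreted.
slopes = smf.ols("srl_post ~ C(group) * srl_pre", data=df).fit()
print(sm.stats.anova_lm(slopes, typ=2))
```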
Hypothesis 1.
The self-regulated learning skills of the experimental group do not differ significantly from those of the control group.
Regarding Hypothesis 1 (the self-regulated learning skills of the experimental group do not differ significantly from those of the control group), ANCOVAs were conducted on the self-regulated learning skills scores from the post-questionnaire, with the pre-questionnaire scores as covariates. Statistically significant differences (p < 0.01) were found between the experimental and control conditions. Specifically, students in the experimental group scored significantly higher on the metacognitive activity subscales (before and after the learning process), as well as on time management, perseverance, and help-seeking.
Hypothesis 2.
The academic performance of the experimental group, as measured by course grade, does not differ significantly from that of the control group.
Regarding Hypothesis 2 (Performance, as measured by the course grade of the experimental group, does not differ significantly from that of the control group), the mean score of the experimental group (M = 7.22, SD = 2.71) was higher than that of the control group (M = 5.28, SD = 3.95). Independent t-tests comparing group scores showed a statistically significant difference (t = 2.75, p = 0.007). Overall, the null hypothesis is rejected.
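The reported statistic can be checked directly from the published summary statistics with a pooled-variance independent-samples t-test; small discrepancies reflect rounding of the means and standard deviations given above.

```python
# Reproducing the group comparison from the published summary statistics only.
from scipy.stats import ttest_ind_from_stats

t, p = ttest_ind_from_stats(
    mean1=7.22, std1=2.71, nobs1=47,  # experimental (SG) group
    mean2=5.28, std2=3.95, nobs2=46,  # control (MG) group
    equal_var=True,
)
print(f"t = {t:.2f}, p = {p:.3f}")  # about t = 2.77, p = 0.007, matching the text up to rounding
```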
Discussing the above findings, this research highlights intervention strategies that improve learning outcomes and provides evidence that the SG mode is more effective than the MG mode. Specifically, the experimental group achieved higher scores than the control group, in line with previous studies [29] reporting that learners tend to perform better when they receive well-targeted LA interventions.
Furthermore, the results showed that the degree of mentoring significantly influenced participants’ development of self-regulated learning skills [30]. The effect size (partial eta squared, ηp²) indicated that variations in skill development were attributable to the level of mentoring, in line with related studies [28]. In conclusion, since the only difference between the groups was the level of mentoring, we suggest that effective mentoring leads to better learning outcomes and, consequently, to more actionable learning.
Overall, this study demonstrated that the SG group outperformed the MG group in enhancing learning outcomes. As both groups experienced the same instructional environment, we attribute this difference to the level of instruction. Having demonstrated learning impact, we next examine whether teachers are willing and able to adopt such tools in practice. The following section examines K-12 teachers’ perceptions and readiness for adopting LA.

5. Teachers’ Perceptions of Learning Analytics Adoption

Regarding the third research question (What factors encourage or hinder the adoption of learning analytics in educational environments?), this section synthesises two studies [15,31] that focus on LA adoption in K-12 education. The introduction of LA in schools is considered necessary for social, economic, and pedagogical reasons [14]. Despite increasing interest in implementing LA in schools, teachers are often characterised as technophobic and sceptical about adopting it [13]. Few studies have provided empirical evidence on the educational challenges of adopting LA from classroom teachers’ perspectives. Specifically, few have examined the unified theory of acceptance and use of technology (UTAUT) [32] to investigate teachers’ adoption of LA. Guided by the UTAUT [33], we conducted a survey [15] using a questionnaire and held five focus group interviews with K-12 teachers. We explored their perceptions of LA adoption, followed by an ethnographic study with observations and interviews.
A holistic theoretical framework can provide a structure for understanding the factors underlying LA adoption. We draw on the UTAUT model, which integrates earlier models such as the Technology Acceptance Model, the Theory of Reasoned Action, the Theory of Planned Behaviour, and Innovation Diffusion Theory, as a framework that sheds light on how teachers perceive LA adoption. Finally, by extending the UTAUT acceptance model to the educational setting, we link a well-established information systems framework to empirical studies in education.
Method. The survey was carried out in Greece through a webinar titled “Analysis of Learning Data in Education.” The participants included over 70 teachers specialising in mathematics, literature, and computer science. Additionally, the second (ethnographic) study involved five teachers of mathematics, literature, and economics, making the inquiry content-independent. In ethnographic research, there is a history of studies employing small-N designs, in which each teacher is treated as a replicated unit [34]. Furthermore, ethnographic qualitative research investigates teachers’ beliefs and values in their daily practices. The data collection instruments used to gauge teachers’ perceptions included a research questionnaire and five focus groups conducted after the webinar. Moreover, in our ethnographic research, we utilised observation, interviews, and field notes.

Results (RQ3)

Survey responses (N = 73) were analysed descriptively and used to identify UTAUT-related patterns. Qualitative data from five focus groups and five ethnographic interviews were analysed using thematic analysis, with two researchers independently coding transcripts to enhance consistency and validity. Observational notes were used to triangulate self-reported perceptions.
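Although the article reports independent coding without a specific agreement statistic, a common way to quantify inter-coder consistency on matched transcript segments is Cohen’s kappa; the sketch below is purely illustrative, and the code labels are invented.

```python
# Illustrative inter-coder agreement check (hypothetical code assignments).
from sklearn.metrics import cohen_kappa_score

coder_a = ["effort", "feelings", "ethics", "effort", "data_culture"]
coder_b = ["effort", "feelings", "ethics", "data_culture", "data_culture"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa = {kappa:.2f}")  # 1.0 = perfect agreement, 0 = chance level
```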
Survey results. The survey questionnaire focused on teachers’ perceptions regarding the UTAUT model. Regarding perceived usefulness (performance expectancy), participants believe that LA enhances students’ self-regulated learning skills, particularly metacognitive skills, time management, help-seeking, and perseverance. Additionally, they stated that adopting LA could improve outcomes, including attendance, performance, and satisfaction. Lastly, the items respondents valued most were the obligation to act and the importance of understanding learning.
Focus group results. The questionnaire directed the focus groups in exploring teachers’ motivations for adopting LA. The results (theory-centred themes with teacher-centred concepts in parentheses) are presented as follows:
  • Performance expectancy (participation, performance, and self-reflection);
  • Effort expectancy (workload, complexity);
  • Feelings (technophobia, anxiety, satisfaction);
  • Future use intentions (scepticism, guidance);
  • Facilitating conditions (training, technical infrastructure);
  • Social influence (community of practice, professional development);
  • Anthropocentricity (sense-making);
  • Self-efficacy (data literacy skills);
  • Data culture (resistance, comfort zone).
Ethnographic interview results. The interviews captured participating teachers’ readiness for adopting LA. The results—comprising theory-centred themes with teacher-centred concepts in parentheses—are presented as follows:
  • Performance expectancy (time management, awareness);
  • Effort expectancy (workload);
  • Feelings (anxiety, scepticism, satisfaction, expectation);
  • Future intentions for use (training, guidance, impact);
  • Human-centred LA (well-being, co-design, explainability);
  • Data culture (resistance, age differentiation);
  • Ethics (privacy, surveillance, trust, algorithm accuracy);
  • Social context (facilitating conditions, communication, professional development).
Observation results. During the ethnographic study, we engaged with the school community and documented numerous observations of teachers’ perceptions, guided by the UTAUT model. Regarding perceived usefulness (performance expectancy), participants believed that LA improved students’ persistence, suggesting that adopting LA could enhance student engagement. Specifically, Teacher 1 (T1) appeared interested in LA and requested training; she seeks professional development and uses LA for group work activities. Teacher 2 utilises LA for absence management interventions, while Teacher 3 shows a reserved interest, and Teacher 4 employs LA for communication purposes.
Mixed-method results. Figure 1 illustrates the classification model for the adoption process. The boxes display the (theory-centred) constructs along with some overlapping (participant-centred) concepts. Specifically, the green boxes are derived solely from the first study (questionnaire and focus groups). In contrast, the blue boxes come exclusively from the second/ethnographic study (observations and ethnographic interviews). Lastly, the white boxes are shared between both studies.
Specifically, Figure 1 synthesises quantitative survey findings and qualitative evidence (focus groups and ethnographic observations) into a theory-building classification model extending UTAUT with emotional, cultural, ethical, and human-centred constructs. Arrows indicate theoretically inferred influences and empirically convergent relationships, not statistically tested causal paths. The model is intended as a conceptual and integrative framework, rather than a structural equation model.
Building on the above findings, we examine LA adoption from the perspective of the following constructs, thereby broadening the UTAUT framework.
Facilitating Conditions and Self-efficacy. Teachers report a lack of confidence in their data literacy skills [34,35] and express a need for practical training within schools to make LA more actionable. Additionally, the teaching community has yet to develop a shared understanding of LA, which emphasises a shift from human mediation (agency) to algorithmic mediation.
Expected Performance. According to our research, most teachers recognise LA’s usefulness (perceived relative advantage). In addition, teachers reported several benefits, including providing early support to students at risk of failure and improving teaching strategies (target of intervention) [36,37]. Furthermore, teachers firmly believe that LA enhances students’ metacognitive skills (effectiveness of intervention). Conversely, some educators feel that the data does not adequately capture the subtle differences in teaching.
Human-centred Learning Analytics and Expected Effort. The success of LA cannot be judged solely by technical metrics; it must also be evaluated based on its effectiveness within educational institutions [38,39]. Anthropocentricity is a characteristic of systems designed by identifying critical stakeholders and their relationships [40]. Involving teachers in the design of LA can be complex and time-consuming. However, engaging them through participatory and co-creation methods (shared experience) can transform an impersonal prototype into a successfully adopted system. Therefore, shifting LA from something imposed on teachers to a collaborative effort involving teachers exemplifies a human-centred approach [41]. Finally, we emphasise the importance of human-centred learning analytics, highlighting student and teacher participation, system explainability, and ethical safeguards such as fairness and privacy, which add depth and societal relevance.
Usability and time were taken into account when evaluating LA’s usefulness relative to its perceived complexity [42]. A common challenge faced by some teachers is workload and time constraints, which lead to reluctance to adopt new technologies [43]. Lastly, we consider the view that the complexities of educational settings might hinder the adoption of LA, making its benefits less clear.
Data Culture and the Intention to Use Learning Analytics. The implementation of analytical methods is not an inherent part of school culture, as analytics originated outside educational contexts. Specifically, when LA enters the school community, it arrives as an externally driven need for change. Therefore, there is an additional challenge: teachers must accept the necessity of change and adapt the school’s environment to align with this new culture [44]. Ultimately, we argue that the challenge of cultural change in educational institutions is a fundamental barrier that diminishes the potential benefits of LA.
Feelings and Social Influence. The most commonly reported positive emotions (psychological factors) were motivation and satisfaction, while the prevalent negative emotions included irritation and confusion. A comparison of the findings from the questionnaire, focus groups, and ethnographic interviews reveals notable similarities. Encouragement, confidence, and perseverance are consistent concepts (perceived compatibility). To explore contrasts, some teachers reported dissatisfaction and anxiety.
Additionally, teachers are motivated to use LA to enhance their professional profile and social influence [14], which pertains to how their colleagues view them. Furthermore, interpersonal communication channels are crucial for spreading LA adoption through participatory design (shared experience). Lastly, we observe that both teachers’ lack of training and ethical considerations are significant barriers to the effective implementation of LA.
Overall, Figure 2 illustrates the conceptual interdependence between ethics, pedagogical impact, and adoption in learning analytics. The bidirectional arrows do not imply strict causal relationships, but rather reciprocal influences and feedback mechanisms. Ethical safeguards constrain and shape pedagogical interventions, while demonstrated pedagogical impact reinforces trust and willingness to adopt learning analytics. Conversely, adoption practices feed back into ethical considerations and pedagogical refinement. The figure should therefore be interpreted as a conceptual model highlighting mutual dependencies rather than a causal pathway.

6. Discussion and Conclusions

The three dimensions examined in this study—ethics, pedagogical impact, and adoption—are not treated as independent outcomes but as interrelated components of a socio-technical system. Ethical safeguards (RQ1) serve as enabling conditions for trust and legitimacy, prerequisites for both pedagogical effectiveness and institutional acceptance of learning analytics. Demonstrating pedagogical impact through actionable guidance (RQ2) provides empirical justification for adoption, as stakeholders are unlikely to adopt systems that do not show measurable educational value. Finally, adoption conditions (RQ3) feed back into both ethics and pedagogy by shaping how learning analytics are interpreted, enacted, and sustained in real-world contexts.
Our findings suggest that ethics by design (RQ1), effective guidance (RQ2), and adoption conditions (RQ3) are mutually reinforcing dimensions: without ethical safeguards, adoption is undermined; without demonstrated pedagogical value, adoption stalls; and without adoption, ethical and pedagogical advances remain theoretical. The multi-method design, therefore, allows us to capture the interdependence of these factors and propose a more holistic model of LA actionability.
In RQ2, we asked whether strong vs. minimal guidance affects SRL skills and performance. Our findings show clear gains under strong guidance. This extends earlier correlational research [28] by providing causal evidence and supports self-regulated learning theory, which predicts that external scaffolds promote metacognitive regulation. However, unlike previous studies, we also found improvements in perseverance and help-seeking, suggesting that LA-based interventions can influence both motivational and cognitive aspects of SRL.
When the groups’ perceptions are contrasted, secondary school teachers gave more positive responses regarding their readiness to adopt LA. The higher cognitive maturity of secondary students compared to primary students enables the use of more innovative teaching approaches. Additionally, computer science teachers responded even more positively regarding their expectations of adopting LA: their advanced ICT and data literacy skills provide them with greater resources and expertise for innovative teaching techniques. In this sense, we conclude that early adopters who invest time to integrate technology into their teaching are more likely to adopt innovative techniques and serve as agents of change.
According to UTAUT and studies on technology adoption, we found that the main factors encouraging the adoption of LA were (a) added value that makes teachers’ monitoring practices more structured (performance expectation), (b) emotions such as motivation, satisfaction, and confidence, and (c) social influence. Conversely, factors that inhibit LA adoption included: (a) effort expectancy involving time and effort, (b) self-efficacy due to lack of data handling skills, (c) facilitating conditions like training challenges and technological infrastructure issues, and (d) cultural change, notably the lack of recognition of LA’s added value. Furthermore, ethical concerns related to LA may outweigh its potential educational advantages. Overall, the findings are somewhat contradictory, reflecting both scepticism and a lack of confidence alongside expectations and enthusiasm for new opportunities. We identified numerous potentials for LA to improve learning experiences, but also significant challenges, particularly in K-12 environments.
Our experimental results (RQ2) demonstrate that the mere collection and visualisation of learning data (Minimal Guidance) is insufficient to improve learning outcomes. Significant improvements in SRL and academic performance were achieved only when data were transformed into proactive, personalised guidance (Strong Guidance). Therefore, we conclude that the pedagogical design of the intervention—specifically, the level and quality of guidance—is a more critical factor for enhancing teaching and learning than the volume of data alone.
Based on our mixed-methods findings (RQ3), we conclude that two core perceptions primarily drive teacher adoption of LA: (1) its perceived pedagogical usefulness (Performance Expectancy) for tasks like early intervention, and (2) the extent to which tools are designed with and for teachers (Human-Centredness). Qualitative data reveal that without co-design, even useful tools may be rejected as top-down impositions. Conversely, the most significant barriers are not simply technical but socio-cultural: the perceived increase in workload (Effort Expectancy) and a deep-seated resistance to data-driven cultural change within schools, which emerged as a dominant theme in our ethnographic work.
The ethics-by-design framework (RQ1) is not merely a theoretical recommendation but a prerequisite for adoption. Our qualitative findings (RQ3) explicitly show that teachers’ ethical concerns—regarding privacy, surveillance, and algorithmic fairness—are not abstract worries but concrete barriers to adoption. Teachers expressed scepticism and anxiety about tools that lack transparency. Therefore, we conclude that addressing these ethical issues through participatory design (e.g., co-creating consent protocols, explainable dashboards) is not ancillary but central to overcoming adoption resistance.
While our quasi-experiment was conducted in higher education and the adoption study focused on K-12 teachers, the interdependence of ethics, pedagogy, and adoption appears to be a foundational principle across educational contexts. However, the relative weight of specific barriers may differ. For instance, ‘data culture’ resistance may be more pronounced in K-12 settings with established traditional practices, whereas in higher education, ‘facilitating conditions’ such as institutional support may be a more salient factor. Future research should test this model across levels.
Next, we outline key points for research and practice derived from these results.
  • Big data alone cannot enhance teaching; more research into the pedagogical aspects of LA is necessary.
  • Ethics acts as a mediating factor in the impact and adoption of LA.
  • When we implement strong guidance through LA, the results indicate improved final achievement and enhanced students’ self-regulated learning skills.
  • Students favour the participatory and human-centred design of LA.
  • The limited adoption of LA may stem from insufficient teacher training rather than from LA’s effectiveness.
  • The factors that promote the adoption of LA are performance expectancy, anthropocentricity, social influence, and emotions.
  • The factors that hinder the adoption of LA are effort expectancy, facilitating conditions, ethical issues, and cultural change.
Overall, stakeholder co-design steps, consent artefacts, transparency notices, and bias checks could improve teachers’ trust and awareness and the explainability of LA systems. Below are summary tables (Table 1, Table 2, Table 3 and Table 4) that synthesise the article’s main findings, providing a structured overview.
Consequently, these findings suggest that teacher adoption of LA is not merely a rational cost–benefit calculation (as assumed in UTAUT), but is shaped by affective factors (confidence, anxiety) and cultural norms. Extending adoption models to include these dimensions could yield a more realistic account of LA uptake in schools.

7. Future Research

The broader aim of this research was to explore the ethical evaluation, learning impact, and adoption of learning data analysis in education. Specifically, LA is situated within an instructional design framework that emphasises its ethical (ethics-by-design) agency and, together with appropriate instructor guidance, improves learning outcomes (such as performance, satisfaction, engagement, and self-regulatory skills), serving as an anthropocentric instructional tool. Our mixed-methods research has demonstrated that LA, when supported by suitable guidance and targeted adoption models (UTAUT) that promote participatory and co-design approaches involving stakeholders, can be used actionably by learners, teachers, and institutions across K-12 and higher education. Moreover, the proposed adoption model provides a theoretically grounded basis for future Structural Equation Modelling-based testing, particularly to examine mediation effects between human-centredness, ethical concerns, and intention to adopt LA.
In conclusion, interesting initial hypotheses can serve as valuable inputs for future research, guided by our insights. Future work in the examined research areas could build upon these findings and conclusions to further investigate the conditions necessary for the adoption of LA. Specifically, it could explore the attitudes of acceptance and resistance among secondary school students towards implementing LA through a multi-site ethnographic study. Additionally, we might examine whether the limited adoption of LA is due to a lack of institutional support rather than the technology’s effectiveness. Furthermore, future research could explore how ethical concerns might be balanced with the educational benefits of LA, potentially increasing its practical adoption. Finally, it is important to investigate how professional development activities can improve teachers’ competence with LA tools [45].
While this study provides valuable insights, its scope is limited to data collected in Greece. This focus was chosen because Greece represents a context where LA adoption is still emerging, offering a valuable opportunity to study early-stage implementation challenges. However, institutional and cultural factors—such as data governance frameworks, teacher training systems, and prevailing attitudes toward technology—vary significantly across educational systems. As a result, our findings should be interpreted with caution when transferred to other contexts. Future studies should replicate and extend this research in diverse cultural and institutional settings to examine whether the factors we identified (e.g., performance expectancy, human-centred design, ethical safeguards) hold consistent relevance internationally.
Furthermore, the intervention is provided via a single online Greek course. Future studies should address generalisability (across different disciplines, institutions, and delivery methods) and outline a replication plan (e.g., cluster-randomised or stepped-wedge across sections). Subsequently, replication studies could use data of a size sufficient to be considered big data.
While our study focused on actionable insights from traditional analytics, the growing field of Artificial Intelligence (AI) offers transformative potential to address the challenges of adoption, actionability, and personalisation we have identified. Specifically, AI advances LA beyond descriptive and predictive analytics towards prescriptive and adaptive systems capable of creating truly personalised learning experiences at scale. In this context, machine learning techniques such as classification, clustering, semantic analysis, and summarisation (including LSA, LDA, and keyword extraction methods) can be employed for student profiling, evaluation, personalised feedback, participation, and sentiment analysis (e.g., satisfaction and anxiety). Therefore, the impact of AI on personalised educational analytics could be further explored.
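As a concrete illustration of one technique named above, the following sketch applies LDA topic modelling to short free-text feedback; the corpus, parameters, and output are hypothetical and serve only to show the shape of such an analysis.

```python
# Hypothetical LDA sketch for surfacing themes in student feedback.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

feedback = [
    "the dashboard helped me plan my study time each week",
    "I felt anxious about being monitored through the platform",
    "the weekly feedback messages kept me motivated to continue",
]
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(feedback)  # document-term counts

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
terms = vectorizer.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top_terms = [terms[j] for j in topic.argsort()[-3:]]
    print(f"topic {i}: {top_terms}")  # three highest-weight terms per topic
```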
Our study highlighted a key issue: a primary obstacle to adopting LA is teachers’ lack of self-efficacy and data literacy skills, along with a need for practical training. A future study will examine how a theory-based professional development programme affects teachers’ competence, confidence, and willingness to use LA tools in their practice.
Future research should involve key educational stakeholders by examining the findings in various social contexts. We believe that the effectiveness of LA might be overstated due to a lack of comprehensive studies on its long-term impact across different educational environments. Future research could also investigate the types of support that enhance teachers’ external and internal motivations to adopt LA. Additionally, while this study offers an analysis of teachers’ perceptions, it lacks longitudinal evidence on how students experience and respond to LA over time. Incorporating student attitudes and acceptance would improve the thoroughness of the adoption analysis. Finally, meta-analyses should examine the implementation and validation stages to assess the sustainability of the LA adoption process.

Author Contributions

Conceptualization, D.E.T. and S.N.D.; Data curation, D.E.T. and S.N.D.; Formal analysis, D.E.T. and S.N.D.; Investigation, D.E.T. and S.N.D.; Methodology, D.E.T. and S.N.D.; Resources, D.E.T. and S.N.D.; Supervision, D.E.T. and S.N.D.; Validation, D.E.T. and S.N.D.; Visualization, D.E.T. and S.N.D.; Writing—original draft, D.E.T. and S.N.D.; Writing—review & editing, D.E.T. and S.N.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research did not receive any external funding.

Institutional Review Board Statement

The study was conducted in accordance with the ethical guidelines of the Declaration of Helsinki. All data were anonymized at the point of collection, and no identifiable or sensitive personal information was collected or stored, thereby ensuring that participant confidentiality and privacy were protected in compliance with institutional data protection policies.

Informed Consent Statement

Informed consent was obtained from all subjects involved in this study. Prior to any data collection, all potential participants were fully informed about the purpose, procedures, and nature of their voluntary participation in the research (including surveys, focus groups, and classroom observations). They were explicitly assured of data anonymity, the non-collection of any identifiable or sensitive information, and their right to withdraw at any time without consequence.

Data Availability Statement

The datasets utilised and analysed during this study are accessible from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare they have no conflicts of interest.

References

  1. Knobbout, J.; van der Stappen, E.; Versendaal, J.; van de Wetering, R. Supporting Learning Analytics Adoption: Evaluating the Learning Analytics Capability Model in a Real-World Setting. Appl. Sci. 2025, 13, 3236. [Google Scholar] [CrossRef]
  2. Weidlich, J.; Fink, A.; Frey, A.; Jivet, I.; Gombert, S.; Menzel, L.; Giorgashvili, T.; Yau, J.; Drachsler, H. Highly informative feedback using learning analytics: How feedback literacy moderates student perceptions of feedback. Int. J. Educ. Technol. High. Educ. 2025, 22, 43. [Google Scholar] [CrossRef]
  3. Guzmán-Valenzuela, C.; Gómez-González, C.; Tagle, A.R.-M.; Lorca-Vyhmeister, A. Learning analytics in higher education: A preponderance of analytics but very little learning? Int. J. Educ. Technol. High. Educ. 2021, 18, 23. [Google Scholar] [CrossRef]
  4. Wang, M.; Chen, Z.; Xu, Y.; Maheshi, B.; Gašević, D. Intelligent teaching analytics for collaborative reflection: Investigating pre-service teachers’ perceptions, experiences and shared regulation processes. Int. J. Educ. Technol. High. Educ. 2025, 22, 45. [Google Scholar] [CrossRef]
  5. Rubel, A.; Jones, K.M.L. Student privacy in learning analytics: An information ethics perspective. Inf. Soc. 2016, 32, 143–159. [Google Scholar] [CrossRef]
  6. Tzimas, D.; Demetriadis, S. Culture of ethics in adopting learning analytics. In Augmented Intelligence and Intelligent Tutoring Systems. ITS 2023; Frasson, C., Mylonas, P., Troussas, C., Eds.; Lecture notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2023; Volume 13891. [Google Scholar] [CrossRef]
  7. Tzimas, D.; Demetriadis, S. The impact of learning analytics on student performance and satisfaction in a higher education course. In Proceedings of the 14th International Conference on Educational Data Mining (EDM21), Virtual, 29 June–2 July 2021; International Educational Data Mining Society: Mountain View, CA, USA; pp. 654–660.
  8. Kirschner, P.; Hendrick, C.; Heal, J. How Teaching Happens: Seminal Works in Teaching and Teacher Effectiveness and What They Mean in Practice; Routledge: London, UK, 2022. [Google Scholar] [CrossRef]
  9. Li, T.; Fan, Y.; Tan, Y.; Wang, Y.; Singh, S.; Li, X.; Raković, M.; van der Graaf, J.; Lim, L.; Yang, B.; et al. Analytics of self-regulated learning scaffolding: Effects on learning processes. Front. Psychol. 2023, 14, 1206696. [Google Scholar] [CrossRef] [PubMed]
  10. Matcha, W.; Gašević, D.; Uzir, N.A.; Jovanović, J.; Pardo, A. Analytics of learning strategies: Associations with academic performance and feedback. In ACM International Conference Proceeding Series, Proceedings of the 9th International Conference on Learning Analytics and Knowledge (LAK19), Tempe, AZ, USA, 4–8 March 2019; ACM: New York, NY, USA, 2019; pp. 461–470. [Google Scholar] [CrossRef]
  11. Tzimas, D.E.; Demetriadis, S.N. Impact of learning analytics guidance on student self-regulated learning skills, performance, and satisfaction: A mixed methods study. Educ. Sci. 2024, 14, 92. [Google Scholar] [CrossRef]
  12. Falcão, T.P.; Mello, R.F.; Rodrigues, R.L.; Diniz, J.R.B.; Tsai, Y.S.; Gaševic, D. Perceptions and expectations about learning analytics from a brazilian higher education institution. In ACM International Conference Proceeding Series, Proceedings of the LAK ′20: 10th International Conference on Learning Analytics and Knowledge, Frankfurt am Main, Germany, 23–27 March 2020; ACM: New York, NY, USA, 2020; pp. 240–249. [Google Scholar] [CrossRef]
  13. Hilliger, I.; Ortiz-Rojas, M.; Pesántez-Cabrera, P.; Scheihing, E.; Tsai, Y.-S.; Muñoz-Merino, P.J.; Broos, T.; Whitelock-Wainwright, A.; Gašević, D.; Pérez-Sanagustín, M. Towards learning analytics adoption: A mixed methods study of data-related practices and policies in Latin American universities. Br. J. Educ. Technol. 2020, 51, 915–937. [Google Scholar] [CrossRef]
  14. Tsai, Y.-S.; Moreno-Marcos, P.M.; Jivet, I.; Scheffel, M.; Tammets, K.; Kollom, K.; Gašević, D. The SHEILA framework: Informing institutional strategies and policy processes of learning analytics. J. Learn. Anal. 2018, 5, 5–20. [Google Scholar] [CrossRef]
  15. Tzimas, D.; Demetriadis, S. K-12 teachers’ acceptance and resistance perceptions of learning analytics adoption: A mixed-methods approach. TechTrends 2025, 69, 385–399. [Google Scholar] [CrossRef]
  16. Tzimas, D.; Demetriadis, S. Ethical issues in learning analytics: A review of the field. Educ. Technol. Res. Dev. 2021, 69, 1101–1133. [Google Scholar] [CrossRef]
  17. Prinsloo, P.; Slade, S. An elephant in the learning analytics room. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference on—LAK’17, Vancouver, BC, Canada, 13–17 March 2017; ACM: New York, NY, USA, 2017; pp. 46–55. [Google Scholar] [CrossRef]
  18. Gray, C.M.; Boling, E. Inscribing ethics and values in designs for learning: A problematic. Educ. Technol. Res. Dev. 2016, 64, 969–1001. [Google Scholar] [CrossRef]
  19. Spector, J.M. Ethics in educational technology: Towards a framework for ethical decision making in and for the discipline. Educ. Technol. Res. Dev. 2016, 64, 1003–1011. [Google Scholar] [CrossRef]
  20. Scholes, V. The ethics of using learning analytics to categorize students on risk. Educ. Technol. Res. Dev. 2016, 64, 939–955. [Google Scholar] [CrossRef]
  21. Pardo, A.; Han, F.; Ellis, R.A. Combining university student self-regulated learning indicators and engagement with online learning events to predict academic performance. IEEE Trans. Learn. Technol. 2017, 10, 82–92. [Google Scholar] [CrossRef]
  22. Fynn, A. Ethical considerations in the practical application of the Unisa socio-critical model of student success. Int. Rev. Res. Open Distance Learn. 2016, 17, 206–220. [Google Scholar] [CrossRef]
  23. Willis, J.E.; Slade, S.; Prinsloo, P. Ethical oversight of student data in learning analytics: A typology derived from a cross-continental, cross-institutional perspective. Educ. Technol. Res. Dev. 2016, 64, 881–901. [Google Scholar] [CrossRef]
24. Hoel, T.; Griffiths, D.; Chen, W. The influence of data protection and privacy frameworks on the design of learning analytics systems. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (LAK '17), Vancouver, BC, Canada, 13–17 March 2017; ACM: New York, NY, USA, 2017; pp. 243–252. [Google Scholar] [CrossRef]
25. Arnold, K.E.; Sclater, N. Student perceptions of their privacy in learning analytics applications. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (LAK '17), Vancouver, BC, Canada, 13–17 March 2017; ACM: New York, NY, USA, 2017; pp. 66–69. [Google Scholar] [CrossRef]
  26. Herder, E.; Kawase, R. Considerations for recruiting contributions to anonymised data sets. Int. J. Technol. Enhanc. Learn. 2012, 4, 85–98. [Google Scholar] [CrossRef]
  27. West, D.; Huijser, H.; Heath, D. Putting an ethical lens on learning analytics. Educ. Technol. Res. Dev. 2016, 64, 903–922. [Google Scholar] [CrossRef]
  28. Wong, J.; Baars, M.; de Koning, B.B.; van der Zee, T.; Davis, D.; Khalil, M.; Houben, G.; Paas, F. Educational theories and learning analytics: From data to knowledge. In Utilizing Learning Analytics to Support Study Success; Springer: Berlin/Heidelberg, Germany, 2019; pp. 3–25. [Google Scholar] [CrossRef]
  29. Khalil, M.; Ebner, M. Clustering patterns of engagement in Massive Open Online Courses (MOOCs): The use of learning analytics to reveal student categories. J. Comput. High. Educ. 2017, 29, 114–132. [Google Scholar] [CrossRef]
  30. Luo, J.; Zheng, C.; Yin, J.; Teo, H.H. Design and assessment of AI-based learning tools in higher education: A systematic review. Int. J. Educ. Technol. High. Educ. 2025, 22, 42. [Google Scholar] [CrossRef]
31. Tzimas, D.; Demetriadis, S. Students' perceptions of adopting learning analytics. In Generative Intelligence and Intelligent Tutoring Systems. ITS 2024; Sifaleras, A., Lin, F., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2024; Volume 14798. [Google Scholar] [CrossRef]
  32. Dwivedi, Y.K.; Rana, N.P.; Jeyaraj, A.; Clement, M.; Williams, M.D. Re-examining the Unified Theory of Acceptance and Use of Technology (UTAUT): Towards a Revised Theoretical Model. Inf. Syst. Front. 2019, 21, 719–734. [Google Scholar] [CrossRef]
  33. Williams, M.D.; Rana, N.P.; Dwivedi, Y.K. The unified theory of acceptance and use of technology (UTAUT): A literature review. J. Enterp. Inf. Manag. 2015, 28, 443–488. [Google Scholar] [CrossRef]
  34. Barker, C.; Pistrang, N.; Elliott, R. Small-N Designs. In Research Methods in Clinical Psychology; Barker, C., Pistrang, N., Elliott, R., Eds.; John Wiley & Sons, Ltd.: Chichester, UK, 2002. [Google Scholar] [CrossRef]
  35. Jin, F.J.Y.; Nath, D.; Guan, R.; Li, T.; Li, X.; Mello, R.F.; Rodrigues, L.; Junior, C.P.; Abuzayyad-Nuseibeh, H.; Raković, M.; et al. Analytics of Self-Regulated Learning in Learning Analytics Feedback Processes: Associations with Feedback Literacy in Secondary Education. J. Comput. Assist. Learn. 2025, 41, e70076. [Google Scholar] [CrossRef]
  36. Herodotou, C.; Maguire, C.; Hlosta, M.; Mulholland, P. Predictive Learning Analytics and University Teachers: Usage and perceptions three years post implementation. In ACM International Conference Proceeding Series, Proceedings of the LAK 2023: 13th International Learning Analytics and Knowledge Conference, Tempe, AZ, USA, 13–17 March 2023; ACM: New York, NY, USA, 2023; pp. 68–78. [Google Scholar] [CrossRef]
  37. Šimić, D.; Šlibar, B.; Munđar, J.G.; Rako, S. Making Sense of Learning Analytics Use Cases in Higher Education: Development of Scientific Communities and Topics. Technol. Knowl. Learn. 2025, 30, 1249–1268. [Google Scholar] [CrossRef]
  38. Lim, L.-A.; Atif, A.; Heggart, K.; Sutton, N. In Search of Alignment between Learning Analytics and Learning Design: A Multiple Case Study in a Higher Education Institution. Educ. Sci. 2023, 13, 1114. [Google Scholar] [CrossRef]
  39. Nazaretsky, T.; Cukurova, M.; Alexandron, G. An Instrument for Measuring Teachers’ Trust in AI-Based Educational Technology. In ACM International Conference Proceeding Series, Proceedings of the LAK22: 12th International Learning Analytics and Knowledge Conference, Bonn, Germany, 21–25 March 2022; ACM: New York, NY, USA, 2022; pp. 56–66. [Google Scholar] [CrossRef]
  40. Dimitriadis, Y.; Martínez-Maldonado, R.; Wiley, K. Human-Centered Design Principles for Actionable Learning Analytics. In Research on E-Learning and ICT in Education; Tsiatsos, T., Demetriadis, S., Mikropoulos, A., Dagdilelis, V., Eds.; Springer: Berlin/Heidelberg, Germany, 2021; pp. 277–296. [Google Scholar] [CrossRef]
  41. Buckingham Shum, S.; Ferguson, R.; Martinez-Maldonado, R. Human-Centred Learning Analytics. J. Learn. Anal. 2019, 6, 1–9. [Google Scholar] [CrossRef]
  42. Herodotou, C.; Rienties, B.; Boroowa, A.; Zdrahal, Z.; Hlosta, M. A large-scale implementation of predictive learning analytics in higher education: The teachers’ role and perspective. Educ. Technol. Res. Dev. 2019, 67, 1273–1306. [Google Scholar] [CrossRef]
43. Michos, K.; Lang, C.; Hernández-Leo, D.; Price-Dennis, D. Involving teachers in learning analytics design: Lessons learned from two case studies. In ACM International Conference Proceeding Series, Proceedings of the LAK '20: 10th International Conference on Learning Analytics and Knowledge, Frankfurt am Main, Germany, 23–27 March 2020; ACM: New York, NY, USA, 2020; pp. 94–99. [Google Scholar] [CrossRef]
  44. Bahari, M.; Arpaci, I.; Azmi, N.F.M.; Shuib, L. Predicting the Intention to Use Learning Analytics for Academic Advising in Higher Education. Sustainability 2023, 15, 15190. [Google Scholar] [CrossRef]
  45. Sharif, H.; Atif, A. The Evolving Classroom: How Learning Analytics Is Shaping the Future of Education and Feedback Mechanisms. Educ. Sci. 2024, 14, 176. [Google Scholar] [CrossRef]
Figure 1. Integrated conceptual/classification model for the adoption of learning analytics.
Figure 2. Conceptual framework illustrating the triadic interdependence between ethics, pedagogy, and adoption in learning analytics.
Table 1. Summary of ethical challenges (RQ1).
| Ethical Dimension | Core Issue | Key Concerns & Implications |
|---|---|---|
| Labelling & Algorithmic Fairness | The risk of stereotyping students based on data. | Unfair categorisation and stigmatisation; overlooking diverse learning needs and styles; incorrect decisions due to flawed data or biased models. |
| Data Privacy & Ownership | Who controls and has rights to learner data? | Data stored in external jurisdictions with different laws; ambiguity over who owns processed data and analytics outputs; conflict with the "right to be forgotten". |
| Transparency & Consent | The process of informing and obtaining permission for data use. | Meaningful, ongoing consent is difficult to obtain; allowing opt-outs poses an ethical dilemma and creates data gaps; true informed consent requires adequate knowledge of how data are used. |
| Duty to Act | The institutional obligation to intervene based on data. | Withholding predictive support from students at risk of failure is unethical; institutions have a responsibility to use data to improve student support. |
Table 2. Summary of guidance intervention impact (RQ2).
| Aspect | Minimal Guidance (MG) Group | Strong Guidance (SG) Group | Result & Significance |
|---|---|---|---|
| Intervention | Low-promoting; basic visualisations of performance data. | High-promoting; visualisations plus personalised prompts. | The only variable that differed between the two groups. |
| SRL Skills | Baseline measurement. | Statistically significant improvement in metacognition, time management, perseverance, and help-seeking. | Hypothesis 1 rejected; SG was significantly more effective at developing SRL skills. |
| Academic Performance | Mean grade: 5.28 (SD = 3.95). | Mean grade: 7.22 (SD = 2.71). | Hypothesis 2 rejected; the SG group performed significantly better. |

Overall conclusion: strong guidance leads to more actionable outcomes, directly improving both learning skills and final grades.
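To make the magnitude of the grade difference in Table 2 concrete, the sketch below estimates a standardised effect size (Cohen's d) from the reported group means and standard deviations. Group sizes are not reported in this summary table, so the pooled standard deviation assumes equal-sized groups; the result is illustrative only, not a statistic from the original study.

```python
import math

# Group statistics as reported in Table 2.
mean_sg, sd_sg = 7.22, 2.71   # Strong Guidance (SG) group
mean_mg, sd_mg = 5.28, 3.95   # Minimal Guidance (MG) group

# Pooled standard deviation, assuming equal group sizes
# (an assumption: group sizes are not given in this summary).
sd_pooled = math.sqrt((sd_sg ** 2 + sd_mg ** 2) / 2)

# Cohen's d: the standardised mean difference between the groups.
d = (mean_sg - mean_mg) / sd_pooled
print(f"Cohen's d = {d:.2f}")  # prints: Cohen's d = 0.57
```

Under Cohen's conventional benchmarks (0.2 small, 0.5 medium, 0.8 large), a value near 0.57 corresponds to a medium effect, consistent with the table's conclusion that the SG group performed significantly better.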
Table 3. Summary of factors influencing LA adoption (RQ3).
| Factor Category | Specific Construct | Encourages Adoption | Hinders Adoption |
|---|---|---|---|
| UTAUT Core Factors | Performance Expectancy | Belief that LA improves outcomes, SRL skills, and teaching strategies. | Belief that data fail to capture the nuances of teaching. |
| | Effort Expectancy | | Perceived high workload, complexity, and time required. |
| | Social Influence | Peer encouragement, community of practice, professional development. | |
| | Facilitating Conditions | | Lack of training, technical infrastructure, and institutional support. |
| Extended & Contextual Factors | Emotions | Motivation, satisfaction, confidence. | Anxiety, technophobia, scepticism, irritation. |
| | Self-Efficacy | | Lack of data literacy skills. |
| | Human-Centredness | Involvement in co-design, explainability, and ethical safeguards. | Perceived as an impersonal, top-down imposition. |
| | Data Culture | | Resistance to change, comfort with existing methods, and age differentiation. |
| | Ethical Concerns | | Issues of privacy, surveillance, trust, and algorithm accuracy. |
Table 4. Overall conclusions and future research directions.
| Theme | Key Conclusion | Proposed Future Research |
|---|---|---|
| Pedagogy & Ethics | Big data alone is insufficient; pedagogical integration and an "ethics-by-design" approach are crucial. | Balance ethical concerns with educational benefits; investigate long-term impact across different environments. |
| Actionability | The level of guidance (strong vs. minimal) is a critical factor in making LA actionable and practical. | Replicate the intervention study across different disciplines, institutions, and delivery methods. |
| Adoption | Adoption is a socio-technical challenge requiring a human-centred approach, co-design with teachers, and institutional support. | Investigate student perceptions and acceptance of LA; study the types of support that enhance teachers' motivation to adopt LA. |