Systematic Review

Learning Analytics in Supporting Student Agency: A Systematic Review

1 Centre for Educational Technology, Tallinn University, 10120 Tallinn, Estonia
2 Center for Digitalisation in Lifelong Learning, University for Continuing Education Krems, 3500 Krems an der Donau, Austria
3 School of Educational Sciences, Tallinn University, 10120 Tallinn, Estonia
* Author to whom correspondence should be addressed.
Sustainability 2023, 15(18), 13662; https://doi.org/10.3390/su151813662
Submission received: 6 July 2023 / Revised: 18 August 2023 / Accepted: 20 August 2023 / Published: 13 September 2023
(This article belongs to the Collection Technology-Enhanced Learning and Teaching: Sustainable Education)

Abstract

Student agency, or agency for learning, refers to an individual’s ability to act and cause changes during the learning process. Recently, learning analytics (LA) has demonstrated its potential in promoting agency, as it enables students to take an active role in their learning process and supports the development of their self-regulatory skills. Despite the growing interest and potential for supporting student agency, there have yet to be any studies reviewing the extant works dealing with the use of LA in supporting student agency. We systematically reviewed the existing related works in eight major international databases and identified 15 articles. Analysis of these articles revealed that most of the studies aimed to investigate students’ or educators’ agency experiences, propose design principles for LA, and, to a lesser extent, develop LA methods/dashboards to support agency. Of those studies developing LA, none initially explored student agency experiences and then utilized their findings to develop evidence-based LA methods and dashboards for supporting student agency. Moreover, we found that the included articles largely rely on descriptive and diagnostic analytics, paying less attention to predictive analytics and completely overlooking the potential of prescriptive learning analytics in supporting agency. Our findings also shed light on nine key design elements for effective LA support of student agency, including customization, decision-making support, consideration of transparency and privacy, and facilitation of co-design. Surprisingly, we found that no studies have considered the use of LA to support student agency in K–12 education, while higher education has been the focal point of the LA community. Finally, we highlighted the fields of study and data visualization types that the studies mostly targeted and, more importantly, identified eight crucial challenges facing LA in its support of student agency.

1. Introduction

In today’s education, a key challenge is to provide students with equal opportunities that facilitate the acquisition of skills and competencies required in the modern labor market [1]. Recent research has highlighted agency as a crucial element of professionalism [2], which aligns closely with the principles of sustainable education. Many researchers have found agency to play an important role in developing personal well-being and a meaningful career [3,4], creativity and transformed practices of expert work (e.g., [5]), as well as coping with work–life changes and lifelong learning [6]. Despite being acknowledged as an established educational objective at the policy level, student agency development has not yet received enough attention, especially in higher education [1]. For instance, according to Trede et al. [7], university students are not well prepared for a working world that demands agency, as universities devote more attention to the acquisition of formal and theoretical knowledge.
There exist various research directions/studies focusing on student agency. For instance, some revolve around participatory structures and learning relations (e.g., [8,9]) and singular constructs or characteristics of beliefs or self-processes of students (e.g., [10,11]). Several other studies focus on the development of knowledge in micro-level learning interactions (with a focus on epistemic agency) (e.g., [12]). Additionally, several recent literature reviews have addressed student agency (e.g., [13,14]). For example, Stenalt and Lassesen [14] conducted a systematic review of student agency and its connection to student learning in higher education.
One prospective way to support student agency and its foundations is through learning analytics (LA). LA aims to make use of learners’ data to understand, enhance, and optimize learning. According to Wise [15], one of the key features of LA is that it enables students to take an active role in their learning process and supports the development of their self-regulatory skills. For instance, LA methods could predict student performance in courses and feed back information about learners’ competencies, knowledge levels, misconceptions, motivation, engagement status, and more (e.g., [16]). Such support can potentially facilitate learners’ engagement, meta-cognition, and decision making, and it can also provide customizability, which would eventually contribute to student agency and empowerment. To this end, Chen and Zhang [17] indicate that supporting choice-making using LA could promote high-level epistemic agency and design mode thinking competencies. Wise et al. [18] propose that LA could support fostering student agency by providing options for the customization of learning goals. van Leeuwen et al. [19] summarize that LA dashboards for students differ from earlier feedback systems in several ways—instead of providing simple static feedback, dashboards often visualize complex processes, and feedback can be available not only after the learning activity but also on demand.
Despite this potential, some studies argue that LA might restrict student agency. For instance, Ochoa and Wise [20] state that a lack of co-creation in developing and interpreting LA could restrict student agency because students are not provided with opportunities to act as active agents in the interpretation process, which is required to identify actionable insights. Saarela et al. [21] and Prinsloo and Slade [22] argue that a lack of open practices (e.g., algorithm opaqueness) and issues related to data ownership in LA could hinder student ownership of analytics and trust, as well as hinder the empowerment of students as participants in their own learning.
In the history of learning analytics research, there have been several literature reviews investigating different aspects of learning analytics at different levels of education. The review closest to our study was conducted by Matcha et al. [23]. While shedding light on many existing challenges in learning analytics research, that study overlooks agency for learning and deals with learning analytics solely from a self-regulated learning perspective. Even though supporting self-regulated learning could contribute to fostering agency [24], the two differ in that agency for learning emphasizes students’ ability to actively enhance learning environments, contribute to knowledge development, or engage in innovation processes. Given the absence of an existing systematic review of LA in supporting student agency and the lack of consensus among researchers regarding whether or not LA can contribute to student agency, this research aims to explore the potential effects of LA in promoting student agency.

1.1. The Construct of Agency

There has been an increasing interest in agency within research revolving around lifelong learning, the workplace, and learning frameworks such as constructivism and sociocultural theories (e.g., [25,26,27,28]). Nonetheless, the construct of agency originates from its conceptualization in the social sciences (e.g., [29,30]), where it refers to the capability of individuals to engage in self-defined, deliberate, and meaningful actions in situations limited by contextual and structural relations and factors [31].
The social–cognitive psychology standpoint sees agency as a mediating factor from thoughts to actions that are related to an individual’s intentionality, self-processes, and self-reflection, as well as competence beliefs and self-efficacy (e.g., [32]). Constructivist viewpoints on agency mostly deal with the active actions of individuals in knowledge development and reorganization [33]. On the other hand, a sociocultural viewpoint places more emphasis on agency exhibiting itself through decision making and instances of action (e.g., [34]). Moreover, aside from cultural and social structures, a subject-centered sociocultural viewpoint on agency considers the purposes, meanings, and interpretations that individuals attribute to actions that are deemed crucial in promoting agency (e.g., [35]). Generally, while many researchers acknowledge the subjective standpoint of agency, they place more emphasis on the dynamic, contextually situated, and relationally constructed nature of agency (e.g., [26]). For instance, Jääskelä et al. [36] synthesize the existing literature across psychology, educational sciences, and social sciences to define agency as “a student’s experience of having access to or being empowered to act through personal, relational, and participatory resources, which allow him/her to engage in purposeful, intentional, and meaningful action and learning in study contexts”.

1.2. The Potential of Learning Analytics to Promote Student Agency

Several disciplines have studied the concept of agency, which generally refers to an individual’s capacity to take action and effect changes. According to many researchers, in the educational context an increase in student agency and deeper learning leads to more effective pedagogical practices (e.g., [37,38]). For example, Lindgren and McDaniel [39] state that considering agency while designing course guidance and instructions can facilitate engaging students in challenging learning tasks and improve their learning. Vaughn [35] indicates that through cultivating agency, teachers can adopt a flexible approach to teaching that supports students’ social-emotional needs and better directs them toward thinking about their engagement in the learning process. According to Scardamalia and Bereiter [40], creating instructional environments that encourage learners to pose educationally valuable questions can enhance agency and contribute to the development of knowledge structures. Moreover, students’ opportunities to contribute to their educational environments and engage in participatory learning have been highlighted as means of augmenting agency (e.g., [41]).
One potential way to support student agency is through learning analytics. In this regard, LA dashboards are often employed to visualize study pathways and learning processes, thereby enhancing learners’ awareness and providing them with personalized feedback. LA holds the potential to facilitate the promotion of student agency in numerous ways. For example, it can provide options allowing for goal setting and reflection, supporting meta-cognitive activities, facilitating student decision and choice making based on the analytics, supporting productive engagement and discussion in the online discussion environment, enabling students to identify actionable insights by involving them in the design and creation of analytic tools, and many more. Despite its potential, LA methods and dashboards frequently neglect to consider the integration of theoretical knowledge about learning and student agency into their designs (e.g., [15,42]).

1.3. Research Questions

To fill the aforementioned gaps and build upon existing work, we set the following four research questions:
  • RQ1: What are the objectives for the application of learning analytics in the context of student agency?
  • RQ2: What types and methods of learning analytics have been employed to promote student agency?
  • RQ3: How can learning analytics more effectively support student agency?
  • RQ4: What majors and education levels were mostly targeted, and what types of data visualization were mostly employed by the LA dashboards or methods?
This article’s structure is as follows: Section 2 describes our systematic method, Section 3 deals with results, Section 4 illustrates the discussion (including existing challenges), and finally, Section 5 presents the conclusions.

2. Methodology

We followed the PRISMA guidelines proposed by Page et al. [43] to design and carry out this systematic review.

2.1. Database and Keywords

In order to comprehensively and impartially review the literature concerning the role of learning analytics in supporting student agency, we conducted searches across eight major international databases, namely Web of Science, Scopus, ScienceDirect, SpringerLink, IEEE Xplore, Wiley, ERIC, and Taylor & Francis. To ensure coverage of all existing related works, we also conducted backward and forward searches on the final included studies [44]. For our search process, we employed several different keyword groupings, each tailored to a specific database. This was necessary because the named databases have their own specific guidelines: for instance, the ScienceDirect advanced search does not allow the use of wild cards, whereas Web of Science provides this functionality. Table 1 lists the databases and the respective keyword groupings used to search the title, abstract, and keywords of articles.

2.2. Eligibility Criteria

To ensure finding relevant articles for answering our research questions, we set inclusion and exclusion criteria in our research. The inclusion criteria were mostly predefined and used during the search (e.g., the time frame starting from 2010, as learning analytics emerged as a field around that period), while the exclusion criteria evolved during the subsequent screening stages. Table 2 illustrates our inclusion and exclusion criteria.

2.3. Study Selection and Data Analysis

As shown in Figure 1, we considered three main selection stages: identification, screening, and eligibility evaluation. After importing the results of our searches into our reference management software (i.e., Zotero), we employed automatic and manual duplicate searching to find and remove similar studies imported from different databases. During the first screening, two researchers analyzed the suitability of each article’s abstract and title based on our eligibility criteria. If unsure, the articles were sent to the second screening. During the second screening, the full text of the articles was retrieved and carefully analyzed using our eligibility criteria. Our exclusion criteria were then refined and extended whenever new criteria were identified. Both researchers discussed their disagreements regarding the inclusion of specific articles until an agreement was reached. Quality appraisal was also conducted on the studies included in the second screening (for the quality appraisal checklist, see [45,46]). Briefly, the quality appraisal comprised six criteria for qualitative studies and six criteria for quantitative studies. Scores of 0, 0.5, and 1 were assigned if a quality criterion was not satisfied, partially satisfied, or fully satisfied, respectively. Three quality rankings of “low”, “medium”, and “high” were used to evaluate the studies. All 15 studies met the minimum scores (ranked medium or high), and therefore, no studies were excluded during this stage. Once finalized, relevant information was extracted by the researchers from the included articles.
The extracted information includes the article characteristics (e.g., author names, country, year of publication, publication type, and citation), materials and methods (e.g., objectives, data collection methods, fields, LA methods, participants, and more), evaluation, findings, challenges, future works, etc. The information was recorded in an Excel spreadsheet and was further used for analysis to answer the research questions. Specifically, to answer research questions two and four, we grouped the findings of the studies and used a narrative summary to present the results. To answer the first and third research questions, a thematic analysis was employed [47]. The raw text data were exported to the Qualitative Content Analysis program QCAmap as separate projects for the first and third research questions. For the first research question, raw text from the included articles revolving around the objectives of the studies was used as data. For the third research question, raw text describing ways in which LA supports student agency was used as data. Thereafter, inductive coding was carried out for both research questions separately. Words and phrases were used as analytical units and codes were assigned. In order to increase the trustworthiness of the results, two researchers repeatedly read through the data and discussed the codes until an agreement was reached. Next, codes that carried similar meanings were grouped under categories. From here on, the categories of the first research question are referred to as “Objectives”, while the categories of the third research question are referred to as “Proposed LA elements and design principles”.

2.4. Visualization of Results

Mirroring the research carried out by Hooshyar et al. [48], we employed a heatmap matrix as our data visualization technique. We used Python to generate the heatmaps, which facilitated addressing the research questions. In the heatmaps, color saturation is used to encode the values within cells, allowing us to easily see patterns across a quantitative scale.
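As an illustration, the following is a minimal sketch of how such a heatmap matrix could be generated in Python, assuming the pandas, seaborn, and matplotlib libraries; the objective categories and counts shown are placeholders rather than the actual review data.

```python
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

# Illustrative cross-tabulation of study objectives by publication period
# (placeholder counts only, not the review's actual data).
data = pd.DataFrame(
    {"2012-2016": [1, 4, 1], "2017-2023": [5, 4, 4]},
    index=["Investigate", "Propose", "Develop"],
)

# Color saturation encodes the number of studies in each cell of the matrix.
ax = sns.heatmap(data, annot=True, fmt="d", cmap="Blues",
                 cbar_kws={"label": "Number of studies"})
ax.set_xlabel("Publication period")
ax.set_ylabel("Objective")
plt.tight_layout()
plt.show()
```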

3. Results

3.1. Characteristics of the Studies

Table 3 provides an overview of the included articles. As shown, 15 articles met our eligibility criteria and were included in our research, comprising 11 journal articles and four conference articles. Figure 2 illustrates the distribution of included articles by country. Finland and Canada contributed the most, with four and three articles, respectively. The UK and US have also published two articles each, advancing the research revolving around learning analytics in supporting student agency. Moreover, Figure 3 presents the analysis of the included articles’ keywords, highlighting the most frequently used terms such as “analytics”, “learning”, “agency”, “student”, and “participation”.

3.2. Objectives

RQ1: What are the objectives for the application of learning analytics in the context of student agency?
As shown in Table 4, all the included articles fall under three main objectives: investigate (n = 6), propose (n = 8), and develop (n = 5). The six articles under the investigate category either study student experiences of agency or educators’ perceptions of student agency in learning analytics (e.g., [49,53,56]). While some of these studies use their findings to derive design principles or frameworks that contribute to supporting student agency in learning analytics (e.g., [53]), other studies do not exploit their findings to propose new design principles (e.g., [50,56]). None of these studies develops any new LA methods or dashboards for promoting student agency.
The eight articles under the propose category mainly revolve around proposing design principles or frameworks rather than developing and evaluating learning analytics methods and dashboard designs to support student agency (e.g., [18]). For instance, Bennett and Folley [49] first investigated the experiences of students regarding agency in LA and found that many students are uninformed about the utilization of their behavioral data in LA methods and dashboards and are not engaged in the design of these methods and dashboards. Accordingly, they propose four design principles for LA methods and dashboards to support student agency. These include (1) providing students with options to customize their dashboards, for instance, how to order or what to display; (2) supporting students in the interpretation process of the analytics through design features of the dashboards such as the display form (e.g., progress bar and pie chart), granularity and type of data, aggregation level, etc. (for more details on data visualization and its relation to educational conceptions, see Sedrakyan et al. [57]); (3) helping students identify actionable insights through different means, such as assisting with identifying goals or offering displays with various granularity levels; and (4) embedding dashboards into educational processes.
Of the five studies involved in developing LA methods and dashboards for promoting student agency, three developed LA methods called agency analytics (e.g., [21,37]) and two focused on LA methods that foster student agency through network visualizations (e.g., [54]). Of the three agency analytics studies, two aimed to promote students’ agentic awareness and inform pedagogical practices (e.g., [56]), and one targeted supporting teacher reflection and pedagogical awareness of agency (e.g., [21]). Surprisingly, none of these five developmental works had first investigated student agency experiences and then used the findings to develop (evidence-based) LA methods and dashboards to support student agency.
We further generated a heatmap visualizing the connection between the objectives and publication years, showing how the focus of the LA research community has evolved with respect to student agency. As shown in Figure 4, before 2017, the focus of the LA research community mainly revolved around proposing design principles for supporting student agency in LA, with almost no attention to investigating the key driving factors for promoting student agency and/or developing LA methods that consider fostering student agency. In the past five years, however, there has been more focus on investigating students’ and educators’ experiences of agency in LA (the elements contributing to fostering student agency). More developments can also be seen compared to before 2017 when it comes to designing and developing LA methods and dashboards that promote student agency. Overall, not only has more research emerged dealing with LA in supporting agency, but all three objectives of investigating, proposing, and developing have also been covered, to varying degrees, by the LA research community.

3.3. Types and Methods

RQ2: What types and methods of learning analytics have been employed to promote agency?
As illustrated in Table 5, the included studies mainly employ descriptive and diagnostic analytics and, to a lesser extent, predictive analytics. Surprisingly, prescriptive analytics, which could play a major role in promoting student agency, has been completely overlooked. More specifically, eight studies employed statistical methods to cater to descriptive analytics, illustrating the way things are and what has happened (e.g., [54,55]). For instance, the LA dashboard investigated in the study by Bennett and Folley [49] provided students with seven descriptive elements, such as the average student access and activity in the module, the overall course summary for students, and the attendance ratio of students compared to the class.
Eight studies used data mining and statistical methods to provide diagnostic analytics, enabling students to investigate the root causes of possible learning issues and difficulties (e.g., [36,37]). For example, Ouyang et al. [54] used data mining techniques to visually exhibit online discussions to foster student engagement. Two studies used machine learning and statistical modeling to provide predictive analytics. For instance, Saarela et al. [21] employed clustering methods to group students into four agency profiles and then employed supervised machine learning algorithms to predict future students’ agency profiles. Interestingly, they employed explainer models to explain the output of the predictive analytics, which allows students to find out why they were placed in a specific agency category. Overall, despite advances in artificial intelligence and machine learning, such techniques have rarely been used to provide predictive analytics and have been completely overlooked for prescriptive analytics.
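To make this general pattern concrete, the following is a minimal Python sketch of clustering students into agency profiles, predicting the profile of a new student, and explaining an individual prediction with Shapley values. It assumes scikit-learn and the shap library are available; the feature names and synthetic data are illustrative placeholders and do not reproduce the agency analytics of Saarela et al. [21].

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
import shap

rng = np.random.default_rng(0)
# Synthetic self-reported agency resource scores on a 1-5 scale
# (columns: personal, relational, participatory resources; placeholder data).
X = rng.uniform(1, 5, size=(200, 3))

# Step 1: group students into agency profiles (here, four clusters).
profiles = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# Step 2: train a supervised model to predict the profile of a new student.
clf = RandomForestClassifier(random_state=0).fit(X, profiles)

# Step 3: apply a post hoc Shapley-value explainer to show which resource
# scores drive an individual student's assignment to a profile.
explainer = shap.TreeExplainer(clf)
shap_values = explainer.shap_values(X[:1])

print("Predicted profile:", clf.predict(X[:1])[0])
print("Shapley contributions per feature:", shap_values)
```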
Figure 5 illustrates the analysis of the analytics types with respect to the included studies’ objectives. Based on these findings, more studies have considered investigating, proposing, and developing elements contributing to fostering student agency using descriptive analytics, followed by diagnostic analytics. Nonetheless, predictive and prescriptive analytics for promoting student agency have never been proposed and have rarely been investigated or developed by the included studies.

3.4. Learning Analytics in Supporting Agency

RQ3: How can learning analytics support student agency more effectively?
According to Table 6, nine main elements are important to consider in the design principles of LA methods/dashboards for supporting student agency. These include developing agency analytics, offering customization, embedding dashboards, meta-cognition support, decision-making support, engagement and discussion support, considering transparency and privacy, relying on learning sciences, and facilitating co-design.
Transparency and privacy are the most frequently highlighted element for supporting student agency in LA methods and dashboards. Five studies have highlighted transparency in terms of data policies, practices, and algorithms (considering student vulnerability), as well as providing an explanation for the LA method, as the main factor for promoting agency in LA (e.g., [22]). Saarela et al. [21] indicated that visualizing the inter-individual differences of learning experiences in an interpretable form can promote student agency. In their research, they employed the explainable artificial intelligence method of Shapley values to provide both global and local explanations of the LA method. In particular, their LA method (called agency analytics) explains the key features of the agency profiles and the reason students are placed in a specific profile. Such an explanation is not only interesting for educators, who can see why a certain student was assigned to a specific agency group, but also provides students with the right to information about the decisions made by the LA method. Wise et al. [18] underlined several main challenges regarding the use of LA dashboards and methods. Among these are several challenges related to learner interpretation of LA, such as the challenge of trust. The trust challenge refers to students’ trust in the usefulness and fidelity of the presented analytics. Students are often unaware of who can see their data, why, and how they are being monitored. This usually leads to a sense of distrust. To overcome such issues, Wise and colleagues indicate the need to establish transparency between analytics and their development. One prospective way to achieve this is to allow for flexibility in the interpretation of LA, which helps to support student decision making based on the analytics.
Facilitating co-design and developing agency analytics are the second most frequently highlighted elements to be considered in LA methods and dashboards for fostering student agency. Shibani et al. [55] found that co-designing LA methods and dashboards builds trust and eventually leads to better outcomes and agency. In other words, it enables students to identify actionable insights by involving them in the design and creation of analytic tools. Moreover, such involvement enables students to be active in the interpretation process and encourages them to become active and empowered regarding their studies. Ochoa and Wise [20] state that learning analytics dashboards should adopt and adapt participatory design methodologies so that students can be included not only in analyzing information needs but also in revisiting, ideating, and evaluating LA prototypes and concepts.
Offering customizability, meta-cognition support, and decision-making support have also been found to be crucial when it comes to supporting student agency in LA methods and dashboards. Regarding customizability, Bennett and Folley [49] mentioned that offering customizability allows students to tailor the dashboard according to their needs, which would eventually contribute to student agency and empowerment. Concerning meta-cognition support, Wise et al. [18] concluded that providing options for the individualization of learning goals could be a prospective way of fostering student agency in LA. As for decision-making support, Chen and Zhang [17] state that equipping LA with tools to support choice making could promote high-level epistemic agency and design-mode thinking competencies.
Finally, relying on learning sciences, engagement and discussion support, as well as embedding dashboards in educational processes have also been underlined as having potential to support student agency. Regarding the consideration of learning sciences in designing LA, Tsai et al. [53] indicated the need for interventions triggered by LA that are grounded in the learning sciences to support agency (e.g., the feedback principles proposed by Nicol and Macfarlane [58]). Wise et al. [52] highlighted support for productive engagement and discussion in online discussion environments as a pedagogical activity that can potentially foster student agency. Bennett and Folley [49] state that LA dashboards are part of a much wider system of support and feedback and should thus be considered within this context. In other words, LA dashboards should be embedded in educational processes to be effective in fostering student agency and empowerment.

3.5. Education Levels and Data Visualization Types

RQ4: What majors and education levels were mostly targeted and what types of data visualization were mostly employed by the LA dashboards or methods?
Table 7 lists majors, education levels, and visualization types used by the studies in the context of LA in supporting agency. As shown, most studies investigated, proposed, or developed LA methods or dashboards for supporting agency in social sciences and humanities (e.g., [54,55]), and engineering (e.g., [37]). Specifically, regarding social science and humanities, six studies dealt with fields like primary school teacher training, literature, history, geography, educational technology, art, information technologies and education, law, and business. Three articles targeted mathematics, basic computer programming, and basic IT skills fields from engineering majors.
Surprisingly, no studies have considered the use of LA in supporting student agency in K–12 education, and all the included articles revolved around higher education. Bar charts were the most frequently employed visualization type, followed by textual, color coding, or summary tables. The least frequently used visualization type was the pie chart, while boxplots, progress bars, and network visualizations were each employed twice by the included studies.
Figure 6 shows patterns relating visualization types to the studies’ objectives. Overall, while only one article proposed using bar charts for supporting agency in LA, bar charts have mainly been used in developing LA methods or dashboards for supporting student agency. Also, although textual and summary tables were investigated by some studies, no studies had actually employed them in developmental work for promoting agency using LA.

4. Discussion

Concerning the first research question, our findings reveal that the included articles mainly aim to propose design principles or frameworks for LA methods or dashboards in supporting agency. The second and third main objectives were to investigate student experiences of agency or educators’ experiences of student agency in LA and to develop LA methods and dashboards to support student agency, respectively. While some of the studies that investigated student or staff experiences of agency proposed design principles or frameworks that contribute to supporting student agency in learning analytics, none of those studies developed any new LA methods or dashboards. We also found that not only has more research emerged dealing with LA in supporting agency in the past five years, but all three objectives of investigating, proposing, and developing have also been covered by the LA research community, in contrast to before 2017.
Our findings from the second research question show that most of the included articles rely on descriptive and diagnostic analytics, paying less attention to predictive learning analytics and completely ignoring the potential of prescriptive learning analytics in supporting agency. Statistical analysis and data mining-related methods were mostly employed to cater for descriptive and diagnostic analytics. Moreover, we found that despite the potential of artificial intelligence and machine learning, they have been rarely used, and statistical methods are still widely preferred.
Regarding the third research question, we found that nine main elements are important to consider in the design principles of LA methods/dashboards for supporting student agency. These include developing agency analytics, offering customization, embedding dashboards, meta-cognition support, decision-making support, engagement and discussion support, considering transparency and privacy, relying on learning sciences, and facilitating co-design. The most frequently recommended are to consider transparency and privacy, co-design, as well as developing agency analytics in LA methods/dashboards.
Concerning the last research question, our findings show that, surprisingly, no studies have considered the use of LA in supporting student agency in K–12 education (higher education has been the focus of the LA community thus far). Bar charts were the most frequently employed visualization type in the LA dashboards followed by textual, color coding, or summary tables. Also, most studies targeted promoting student agency using LA in social science and humanities majors. In addition, we have identified eight main challenges facing LA in supporting student agency according to our findings.
(1) General lack of LA research for supporting student agency. Despite the importance of student agency and its effects on the quality of education systems, as well as the role of LA in supporting student agency (e.g., [1,59,60]), this topic does not seem to have received enough attention. Our systematic search of the topic returned only 278 articles (including duplicates) from eight major databases dealing with LA and agency. More research is required to investigate the potential of different types and methods of LA in supporting student agency. Moreover, there is still a lack of proper frameworks or design principles for LA methods and dashboards to support student agency.
(2) Complete neglect of student agency in K–12 education. One of the most striking findings of this research is that there have yet to be any research studies revolving around investigating, proposing, or developing LA methods or dashboards to support student agency in K–12 education. Despite the OECD’s call to support student agency and its advantages for society and education [1], the potential of LA has not been explored for fostering student agency at this level. LA methods and dashboards can promote student agency through different means that eventually increase students’ engagement and motivation throughout the teaching and learning process, thus increasing their comprehension of content. According to our findings, some prospective ways that LA can foster student agency include enabling students to identify actionable insights, providing interventions that balance what students want and what is good for them, offering different types of analytics with explanations facilitating student decision and choice making, allowing for goal setting and reflection (supporting meta-cognitive activities in general), and many more.
(3) Open practices and data ownership problems of LA. Among the included studies, only one developed an LA method that addresses ethical issues through explainable predictive LA [21]. There is not enough developmental work in LA supporting agency that addresses privacy challenges such as anonymizing and deidentifying individuals, data transparency, and data ownership. For instance, there are no LA-related studies in supporting student agency that include students in the transparency of the data process and the opaqueness of algorithms (open practices and data ownership). As argued by Prinsloo and Slade [22], to bring about student ownership of analytics and trust, the ethical adoption of LA requires empowering students as participants in their learning and increasing student agency.
(4) Lack of co-creation in developing LA. Even though emphasized by several studies (e.g., [20,49,53]), there seem to be no studies that actually involve students and/or educators in the development and interpretation of LA for supporting agency. Considering students and educators in the development process of LA could promote the identification of actionable insights and even help bridge the gap between theory and practice. More specifically, it enables students to act as active agents in the interpretation process, rather than having the data digested for them, and thereby assists them in identifying what actions they could take next (translating their understandings into practical actions). Also, it can help take into account how students may change their behavior as a result of using the dashboard.
(5) Not enough attention toward tracing student agency using log data. The majority of LA methods relied on self-reported answers of students collected through assessment tools for student agency (e.g., Jääskelä et al. [37]). One key potential of LA is its capability to benefit from students’ log data during the learning process and provide continuous and uninterrupted assessments that can further be used to support students and educators in agency. Unfortunately, of all the included studies, none had identified indicators of student agency during the learning process and benefited from these to provide different means of LA. For instance, the explainable agency analytics developed by Saarela et al. [21] mainly relies on students’ responses to their assessment tool to create agency profiles for students. Future works could take advantage of both log and self-reported data to develop more comprehensive LA methods and dashboards.
(6) Less attention toward predictive analytics and ignoring early predictive modeling. Of all the included articles, only two considered predictive modeling among their learning analytics types. Most studies employed descriptive or diagnostic analytics. Interestingly, neither of the two articles took advantage of early predictive modeling; they merely used students’ answers to learner agency questionnaires to predict the agency profiles of future students after those students responded to the same assessment tools. According to Macfadyen and Dawson [61], the primary objective of predictive modeling is to make early predictions and provide timely intervention. In the context of education, the prediction of students’ performance and agency has the potential to enhance various educational systems (e.g., [62]). However, in practice, these predictive models struggle to make accurate and timely predictions during the initial stages of the learning process. If learners are only informed about their performance or agency profiles after completing a learning session, without considering the element of time, it becomes challenging for educators or the educational system to offer personalized interventions that can prevent failure or enhance their learning agency.
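To illustrate the idea of early predictive modeling, a minimal Python sketch is shown below, assuming scikit-learn and NumPy; the weekly log counts and the “at-risk” label are synthetic placeholders. The point is simply that a model trained only on the first weeks of activity can produce predictions while there is still time to intervene, typically at the cost of some accuracy.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_students, n_weeks = 300, 12

# Synthetic weekly activity counts derived from log data (placeholder values).
weekly_logins = rng.poisson(5, size=(n_students, n_weeks))
# Toy outcome label: students whose overall activity stays low are "at risk".
at_risk = (weekly_logins.mean(axis=1) < 4.5).astype(int)

# Early prediction: train on only the first k weeks so interventions can be
# offered while the course is still running, not after it has ended.
for k in (2, 4, 8):
    X_early = weekly_logins[:, :k]
    score = cross_val_score(
        LogisticRegression(max_iter=1000), X_early, at_risk, cv=5
    ).mean()
    print(f"Using the first {k} weeks: mean CV accuracy = {score:.2f}")
```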
(7) Neglecting the potential of prescriptive learning analytics. Another important finding is the complete neglect of prescriptive analytics and its potential to foster student agency. Prescriptive learning analytics aims to employ data-driven prescriptive analytics to interpret the internals of predictive LA models, explain their predictions for individual students, and accordingly generate evidence-based hints, feedback, etc. Despite its potential (see Susnjak [63]), no works have combined a developed predictive learning analytics model with the desired outcome for an individual student to prescribe an optimized input (course of action) to achieve that outcome. For instance, such an LA type could guide students to engage in discussion with certain peers, revise a concept, attempt extra quizzes, solve specific practice examples, take extra assignments, and so forth. Therefore, translating predictions into actionable insights is missing from the literature.
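As a toy illustration of the prescriptive idea, the sketch below (a minimal Python example assuming scikit-learn, with synthetic placeholder features and a hypothetical prescribe_extra_quizzes helper) performs a simple counterfactual search: it looks for the smallest change to a single student behavior that flips a predictive model’s outcome to “pass”, which can then be phrased as an actionable recommendation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
# Synthetic features: [quizzes attempted, forum posts] (placeholder data).
X = rng.integers(0, 10, size=(200, 2)).astype(float)
# Toy "pass" label loosely driven by both behaviors.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 200) > 6).astype(int)
model = LogisticRegression().fit(X, y)

def prescribe_extra_quizzes(student, model, max_extra=10):
    """Find the smallest number of extra quiz attempts that flips the
    model's prediction to 'pass' (a simple counterfactual search)."""
    for extra in range(max_extra + 1):
        candidate = student.copy()
        candidate[0] += extra
        if model.predict([candidate])[0] == 1:
            return extra
    return None  # no recommendation found within the search budget

student = np.array([2.0, 3.0])
print("Recommended extra quiz attempts:", prescribe_extra_quizzes(student, model))
```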
(8) General absence of explainable AI. Of all 15 included articles, only one developed interpretability for its model using a general explainer model (such as Shapley values) (Saarela et al. [21]). While effective in generating an approximation of a model’s decisions, there is still a long way to go in the field before truly practical LA methods and dashboards can support students. Unfortunately, general explainer algorithms such as Shapley values and LIME suffer from serious challenges: (i) using a particular attribute for prediction without that attribute appearing in the prediction explanation, (ii) producing unrealistic scenarios, and (iii) being unable to explain the reasoning path to their decisions (e.g., [64,65]). According to this and many other related works in LA research (see [63]), most LA methods focus on the development of models that are more accurate and effective in their predictions; there is a general absence of explainable AI in LA usage. In general, LA methods and dashboards will have limited capabilities, and their uptake will remain constrained, if they are not equipped with the ability to provide the reasoning behind their decisions/predictions as well as recommendations on proper pathways for moving forward.

Limitations

We employed eligibility criteria to constrain our references. For instance, we only considered studies that explicitly dealt with the use of LA in supporting student agency. We did not consider studies published in books, theses, or dissertations. Moreover, we excluded articles focusing on the agency of teachers or staff. Finally, studies that only implicitly address student agency (e.g., through the lens of other concepts such as self-regulation) were not considered. More specifically, only articles that specifically used the term “agency” were considered. Had we not specified this limitation, we would probably have been dealing with a larger number of articles.

5. Conclusions

This research examined 15 articles to answer four main research questions related to the use of LA in supporting student agency. Analysis of the included articles showed that most of the articles’ objectives were to investigate students’ or educators’ experiences with respect to agency, propose design principles or frameworks for LA methods/dashboards to support agency more effectively, and develop LA dashboards or methods to promote agency. Of those studies with the aim of investigating or proposing, none developed any new LA methods or dashboards for promoting student agency. Similarly, none of the studies with the development objective had first investigated student agency experiences and then used the findings to develop (evidence-based) LA methods and dashboards to support student agency. Further analysis revealed that not only has more research emerged dealing with LA in supporting agency in the past five years, but all three objectives of investigating, proposing, and developing have also been covered by the LA research community, in contrast to before 2017.
Regarding the second research question, our findings showed that the LA community overlooks the potential of predictive and prescriptive learning analytics in supporting agency and still mainly relies on descriptive and diagnostic analytics. To this end, statistical analysis and data mining-related methods were mostly employed to cater for descriptive and diagnostic analytics, while the potential of artificial intelligence and machine learning techniques has not been exploited. Further analysis revealed that more studies have considered investigating, proposing, and developing elements contributing to fostering student agency using descriptive analytics, followed by diagnostic analytics.
The findings of the third research question revealed nine main LA design elements for more effectively supporting student agency. Some examples include offering customization, meta-cognition support, engagement and discussion support, considering transparency and privacy, relying on learning sciences, and facilitating co-design. Of these nine, the most frequently recommended were to consider transparency and privacy, co-design, and the development of agency analytics in LA methods/dashboards. The fourth research question found that no studies have considered the use of LA in supporting student agency in K–12 education, and that bar charts were the most frequently employed visualization type in the LA dashboards, followed by textual, color coding, and summary tables. Also, the findings showed that most studies targeted promoting student agency using LA in social science and humanities majors. Further analysis showed that while textual and summary tables were investigated by some studies, no studies had actually employed them in developmental work for promoting agency using LA. Finally, our analysis highlighted eight main challenges facing LA in supporting student agency. These include the general lack of LA research for supporting student agency, the complete neglect of student agency in K–12 education, open practices and data ownership problems of LA, a lack of co-creation in developing LA, insufficient attention to tracing student agency using log data, limited attention to predictive analytics and the neglect of early predictive modeling, neglecting the potential of prescriptive learning analytics, and the general absence of explainable AI.
In conclusion, the analysis of the articles has provided valuable insights regarding the role of LA in fostering student agency. A key observation is the importance of researching and implementing ways to support student agency, recognizing the role of LA and AI. To ensure the maintenance of agency in a data-enriched environment, it is imperative to investigate the essential skills that learners and students must possess. This can be achieved by creating and testing models that combine LA with student agency, especially in K–12 education. Engaging in a co-creative approach that involves educators, students, and other stakeholders in the design and evaluation of LA systems promises to yield more effective and widely adopted tools: such collaboration not only supports the effectiveness of the tools developed but also enhances their acceptance. By prioritizing a human-centered methodology in designing LA solutions, a direct link can be established between end-users and the development process. Consequently, this approach not only aligns the tools with pedagogical frameworks but also advances and reinforces student agency.

Author Contributions

Conceptualization, D.H., K.T., T.L., K.A. and K.K.; methodology, D.H. and K.T.; software, D.H.; validation, D.H., K.T. and K.K.; data curation, D.H., K.T. and K.K.; writing—original draft preparation, D.H., K.T., T.L., K.A. and K.K.; writing—review and editing, D.H., K.T., T.L., K.A. and K.K.; visualization, D.H.; supervision, D.H., K.T., T.L., K.A. and K.K.; funding acquisition, T.L. All authors have read and agreed to the published version of the manuscript. The authors equally contributed to the conceptualization and design of the study.

Funding

This work was supported by the Ministry of Education of Estonia under the project title of “Flexible learning paths to support student-centered learning in schools” (No. ÕLHAR1).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are unavailable due to privacy or ethical restrictions.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Organisation for Economic Co-operation and Development (OECD). The Future of Education and Skills: Education 2030; OECD Publishing: Paris, France, 2018. [Google Scholar]
  2. Eteläpelto, A. Emerging conceptualisations on professional agency and learning. In Agency at Work: An Agentic Perspective on Professional Learning and Development; Springer: Cham, Switzerland, 2017; pp. 183–201. [Google Scholar]
  3. Vaughn, M.; Jang, B.G.; Sotirovska, V.; Cooper-Novack, G. Student Agency in Literacy: A Systematic Review of the Literature. Read. Psychol. 2020, 41, 712–734. [Google Scholar] [CrossRef]
  4. Zembylas, M.; Chubbuck, S. Conceptualizing ‘teacher identity’: A political approach. In Research on Teacher Identity: Mapping Challenges and Innovations; Springer: Dordrecht, The Netherlands, 2018; pp. 183–193. [Google Scholar]
  5. Edwards, A. Recognising and realising teachers’ professional agency. Teach. Teach. 2015, 21, 779–784. [Google Scholar] [CrossRef]
  6. Su, Y.-H. The constitution of agency in developing lifelong learning ability: The ‘being’ mode. High. Educ. 2011, 62, 399–412. [Google Scholar] [CrossRef]
  7. Trede, F.; Macklin, R.; Bridges, D. Professional identity development: A review of the higher education literature. Stud. High. Educ. 2012, 37, 365–384. [Google Scholar] [CrossRef]
  8. Admiraal, W.; Akkerman, S.F.; de Graaff, R. How to foster collaborative learning in communities of teachers and student teachers: Introduction to a special issue. Learn. Environ. Res. 2012, 15, 273–278. [Google Scholar] [CrossRef]
  9. Hökkä, P.; Eteläpelto, A.; Rasku-Puttonen, H. The professional agency of teacher educators amid academic discourses. J. Educ. Teach. 2012, 38, 83–102. [Google Scholar] [CrossRef]
  10. Brahm, T.; Jenert, T.; Wagner, D. The crucial first year: A longitudinal study of students’ motivational development at a Swiss Business School. High. Educ. 2017, 73, 459–478. [Google Scholar] [CrossRef]
  11. Usher, E.L.; Pajares, F. Sources of Self-Efficacy in School: Critical Review of the Literature and Future Directions. Rev. Educ. Res. 2008, 78, 751–796. [Google Scholar] [CrossRef]
  12. Damşa, C.I.; Kirschner, P.A.; Andriessen, J.E.B.; Erkens, G.; Sins, P.H.M. Shared Epistemic Agency: An Empirical Study of an Emergent Construct. J. Learn. Sci. 2010, 19, 143–186. [Google Scholar] [CrossRef]
  13. Marín, V.I.; de Benito Crosetti, B.; Darder, A. Technology-Enhanced Learning for Student Agency in Higher Education: A Systematic Literature Review. IxD&A 2020, 45, 15–49. [Google Scholar]
  14. Stenalt, M.H.; Lassesen, B. Does student agency benefit student learning? A systematic review of higher education research. Assess. Evaluation High. Educ. 2022, 47, 653–669. [Google Scholar] [CrossRef]
  15. Wise, A.F. Designing pedagogical interventions to support student use of learning analytics. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge, Indianapolis, IN, USA, 24–28 March 2014; ACM: New York, NY, USA, 2014; pp. 203–211. [Google Scholar]
  16. Namoun, A.; Alshanqiti, A. Predicting Student Performance Using Data Mining and Learning Analytics Techniques: A Systematic Literature Review. Appl. Sci. 2020, 11, 237. [Google Scholar] [CrossRef]
  17. Chen, B.; Zhang, J. Analytics for Knowledge Creation: Towards Epistemic Agency and Design-Mode Thinking. J. Learn. Anal. 2016, 3, 139–163. [Google Scholar] [CrossRef]
  18. Wise, A.F.; Vytasek, J.M.; Hausknecht, S.; Zhao, Y. Developing Learning Analytics Design Knowledge in the “Middle Space”: The Student Tuning Model and Align Design Framework for Learning Analytics Use. Online Learn. 2016, 20, 155–182. [Google Scholar] [CrossRef]
  19. van Leeuwen, A.; Teasley, S.D.; Wise, A.; Lang, C.; Siemens, G.; Gašević, D.; Merceron, A. Teacher and Student Facing Analytics; Society for Learning Analytics Research: Beaumont, AB, Canada, 2022. [Google Scholar]
  20. Ochoa, X.; Wise, A.F. Supporting the shift to digital with student-centered learning analytics. Educ. Technol. Res. Dev. 2021, 69, 357–361. [Google Scholar] [CrossRef]
  21. Saarela, M.; Heilala, V.; Jääskelä, P.; Rantakaulio, A.; Kärkkäinen, T. Explainable Student Agency Analytics. IEEE Access 2021, 9, 137444–137459. [Google Scholar] [CrossRef]
  22. Prinsloo, P.; Slade, S. Student Vulnerability, Agency and Learning Analytics: An Exploration. J. Learn. Anal. 2016, 3, 159–182. [Google Scholar] [CrossRef]
  23. Matcha, W.; Gašević, D.; Pardo, A. A systematic review of empirical studies on learning analytics dashboards: A self-regulated learning perspective. IEEE Trans. Learn. Technol. 2019, 13, 226–245. [Google Scholar] [CrossRef]
  24. Code, J. Agency for learning: Intention, motivation, self-efficacy and self-regulation. Front. Genet. 2020, 5, 19. [Google Scholar] [CrossRef]
  25. Billett, S. Lifelong learning and self: Work, subjectivity and learning. Stud. Contin. Educ. 2010, 32, 1–16. [Google Scholar] [CrossRef]
  26. Eteläpelto, A.; Vähäsantanen, K.; Hökkä, P.; Paloniemi, S. Identity and agency in professional learning. In International Handbook of Research in Professional and Practice-Based Learning; Springer: Dordrecht, The Netherlands, 2014; pp. 645–672. [Google Scholar]
  27. Moore, J.W. What is the sense of agency and why does it matter? Front. Psychol. 2016, 7, 1272. [Google Scholar] [CrossRef] [PubMed]
  28. Usher, E.L.; Schunk, D.H. Social cognitive theoretical perspective of self-regulation. In Handbook of Self-Regulation of Learning and Performance; Routledge/Taylor & Francis: New York, NY, USA, 2017; pp. 19–35. [Google Scholar]
  29. Foucault, M. Discipline and Punish: The Birth of the Prison, Trans. Alan Sheridan. 1977. Available online: https://monoskop.org/images/4/43/Foucault_Michel_Discipline_and_Punish_The_Birth_of_the_Prison_1977_1995.pdf (accessed on 1 January 2023).
  30. Giddens, A. The Constitution of Society: Outline of the Theory of Structuration (Vol. 349); Univ of California Press: Berkeley, CA, USA, 1984. [Google Scholar]
  31. Musolf, G.R. Structure and Agency in Everyday Life: An Introduction to Social Psychology; Rowman & Littlefield: Lanham, MD, USA, 2003. [Google Scholar]
  32. Zimmerman, B.J.; Schunk, D.H. Self-regulated learning and performance: An introduction and an overview. In Handbook of Self-Regulation of Learning and Performance; Routledge: New York, NY, USA, 2011; pp. 15–26. [Google Scholar]
  33. Zimmerman, B.J. Attaining self-regulation: A social cognitive perspective. In Handbook of Self-Regulation; Elsevier: Amsterdam, The Netherlands, 2000; pp. 13–39. [Google Scholar]
  34. Hökkä, P.; Vähäsantanen, K.; Mahlakaarto, S. Teacher educators’ collective professional agency and identity—Transforming marginality to strength. Teach. Teach. Educ. 2017, 63, 36–46. [Google Scholar] [CrossRef]
  35. Vaughn, M. What is student agency and why is it needed now more than ever? Theory Into. Pract. 2020, 59, 109–118. [Google Scholar] [CrossRef]
  36. Jääskelä, P.; Poikkeus, A.-M.; Vasalampi, K.; Valleala, U.M.; Rasku-Puttonen, H. Assessing agency of university students: Validation of the AUS Scale. Stud. High. Educ. 2017, 42, 2061–2079. [Google Scholar] [CrossRef]
  37. Jääskelä, P.; Heilala, V.; Kärkkäinen, T.; Häkkinen, P. Student agency analytics: Learning analytics as a tool for analysing student agency in higher education. Behav. Inf. Technol. 2021, 40, 790–808. [Google Scholar] [CrossRef]
  38. Ruohotie-Lyhty, M.; Moate, J. Proactive and reactive dimensions of life-course agency: Mapping student teachers’ language learning experiences. Lang. Educ. 2015, 29, 46–61. [Google Scholar] [CrossRef]
  39. Lindgren, R.; McDaniel, R. Transforming online learning through narrative and student agency. Educ. Technol. Soc. 2012, 15, 344. [Google Scholar]
  40. Scardamalia, M.; Bereiter, C. Higher Levels of Agency for Children in Knowledge Building: A Challenge for the Design of New Knowledge Media. J. Learn. Sci. 1991, 1, 37–68. [Google Scholar] [CrossRef]
  41. Starkey, L. Three dimensions of student-centred education: A framework for policy and practice. Crit. Stud. Educ. 2019, 60, 375–390. [Google Scholar] [CrossRef]
  42. Wise, A.F.; Shaffer, D.W. Why Theory Matters More than Ever in the Age of Big Data. J. Learn. Anal. 2015, 2, 5–13. [Google Scholar] [CrossRef]
  43. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. Int. J. Surg. 2021, 88, 105906. [Google Scholar] [CrossRef] [PubMed]
  44. Webster, J.; Watson, R.T. Analyzing the past to prepare for the future: Writing a literature review. MIS Q. 2002, 26, xiii–xxiii. [Google Scholar]
  45. Banihashem, S.K.; Noroozi, O.; van Ginkel, S.; Macfadyen, L.P.; Biemans, H.J. A systematic review of the role of learning analytics in enhancing feedback practices in higher education. Educ. Res. Rev. 2022, 37, 100489. [Google Scholar] [CrossRef]
  46. Munn, A.E. JBI Manual for Evidence Synthesis. 2020. Available online: https://synthesismanual.jbi.global (accessed on 1 January 2023).
  47. Braun, V.; Clarke, V. Using thematic analysis in psychology. Qual. Res. Psychol. 2006, 3, 77–101. [Google Scholar] [CrossRef]
  48. Hooshyar, D.; Pedaste, M.; Saks, K.; Leijen, Ä.; Bardone, E.; Wang, M. Open learner models in supporting self-regulated learning in higher education: A systematic literature review. Comput. Educ. 2020, 154, 103878. [Google Scholar] [CrossRef]
  49. Bennett, L.; Folley, S. Four design principles for learner dashboards that support student agency and empowerment. J. Appl. Res. High. Educ. 2019, 12, 15–26. [Google Scholar] [CrossRef]
  50. Algers, A. Open Textbooks: A Balance Between Empowerment and Disruption. Technol. Knowl. Learn. 2020, 25, 569–584. [Google Scholar] [CrossRef]
  51. Heilala, V.; Jääskelä, P.; Kärkkäinen, T.; Saarela, M. Understanding the Study Experiences of Students in Low Agency Profile: Towards a Smart Education Approach; Springer International Publishing: Cham, Switzerland, 2019; pp. 498–508. [Google Scholar]
52. Wise, A.F.; Zhao, Y.; Hausknecht, S.N. Learning Analytics for Online Discussions: A Pedagogical Model for Intervention with Embedded and Extracted Analytics. In Proceedings of the 3rd International Conference on Learning Analytics and Knowledge, Leuven, Belgium, 8–13 April 2013; pp. 48–56. [Google Scholar]
53. Tsai, Y.-S.; Perrotta, C.; Gašević, D. Empowering learners with personalised learning approaches? Agency, equity and transparency in the context of learning analytics. Assess. Eval. High. Educ. 2020, 45, 554–567. [Google Scholar] [CrossRef]
  54. Ouyang, F.; Chen, S.; Li, X. Effect of three network visualizations on students' social-cognitive engagement in online discussions. Br. J. Educ. Technol. 2021, 52, 2242–2262. [Google Scholar] [CrossRef]
  55. Shibani, A.; Knight, S.; Shum, S.B. Educator perspectives on learning analytics in classroom practice. Internet High. Educ. 2020, 46, 100730. [Google Scholar] [CrossRef]
56. Heilala, V.; Saarela, M.; Jääskelä, P.; Kärkkäinen, T. Course Satisfaction in Engineering Education Through the Lens of Student Agency Analytics. In Proceedings of the 2020 IEEE Frontiers in Education Conference (FIE), Uppsala, Sweden, 21–24 October 2020; pp. 1–9. [Google Scholar] [CrossRef]
  57. Sedrakyan, G.; Mannens, E.; Verbert, K. Guiding the choice of learning dashboard visualizations: Linking dashboard design and data visualization concepts. J. Comput. Lang. 2019, 50, 19–38. [Google Scholar] [CrossRef]
  58. Nicol, D.J.; Macfarlane-Dick, D. Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Stud. High. Educ. 2006, 31, 199–218. [Google Scholar] [CrossRef]
59. Klemenčič, M. From student engagement to student agency: Conceptual considerations of European policies on student-centered learning in higher education. High. Educ. Policy 2017, 30, 69–85. [Google Scholar] [CrossRef]
  60. Hooshyar, D.; Huang, Y.M.; Yang, Y. A Three-Layered Student Learning Model for Prediction of Failure Risk in Online Learning. Hum.-Centric Comput. Inf. Sci. 2022, 12, 28. [Google Scholar]
  61. Macfadyen, L.P.; Dawson, S. Mining LMS data to develop an “early warning system” for educators: A proof of concept. Comput. Educ. 2010, 54, 588–599. [Google Scholar] [CrossRef]
  62. Hooshyar, D.; Yang, Y. Predicting Course Grade through Comprehensive Modelling of Students’ Learning Behavioral Pattern. Complexity 2021, 2021, 7463631. [Google Scholar] [CrossRef]
  63. Susnjak, T. A Prescriptive Learning Analytics Framework: Beyond Predictive Modelling and onto Explainable AI with Prescriptive Analytics. arXiv 2022, arXiv:2208.14582. [Google Scholar]
  64. Slack, D.; Hilgard, S.; Jia, E.; Singh, S.; Lakkaraju, H. Fooling lime and shap: Adversarial attacks on post hoc explanation methods. In Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society, New York, NY, USA, 7–8 February 2020; pp. 180–186. [Google Scholar]
  65. Lakkaraju, H.; Bastani, O. “How do I fool you?” Manipulating user trust via misleading black box explanations. In Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society, New York, NY, USA, 7–8 February 2020; pp. 79–85. [Google Scholar]
Figure 1. Study selection procedure.
Figure 2. Distribution of the included articles according to country.
Figure 3. Keywords used in the included studies.
Figure 4. Focus of LA research community concerning student agency during the past decade.
Figure 5. Different LA types in the objective of the included studies.
Figure 6. Objective of studies in relation to different visualization types used in the LA dashboards.
Table 1. Databases and keyword grouping.
Database | Boolean/Phrase
Web of Science | TS = ((“Learning analytics”) AND (agency))
Scopus | TITLE-ABS-KEY ((“Learning analytics” AND agency))
ScienceDirect | Title, abstract, keywords: (‘learning analytics’) AND agency
SpringerLink | “learning analytics” AND agency
IEEE Xplore | (“All Metadata”:“learning analytics”) AND (“All Metadata”:agency)
Wiley | ““learning analytics” AND agency” in Abstract
ERIC (through EBSCO) | “learning analytics” AND agency
Taylor & Francis | “learning analytics” AND agency
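For readers who wish to reproduce the screening logic, the sketch below applies the Boolean condition shared by all of the search strings in Table 1 (“learning analytics” AND agency) to exported bibliographic records. It is a minimal illustration only; the record data, field names, and helper function are hypothetical and are not the scripts used in this review.

```python
# Minimal sketch: applying the shared Boolean filter ("learning analytics" AND agency)
# to exported records during screening. Records and field names are illustrative.
records = [
    {"title": "Student agency analytics", "abstract": "We use learning analytics to model agency."},
    {"title": "MOOC dropout prediction", "abstract": "Predicting dropout from clickstream data."},
]

def matches(record: dict) -> bool:
    """Return True if the record satisfies ("learning analytics" AND agency)."""
    text = (record["title"] + " " + record["abstract"]).lower()
    return "learning analytics" in text and "agency" in text

hits = [r["title"] for r in records if matches(r)]
print(hits)  # ['Student agency analytics']
```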
Table 2. Eligibility criteria.
Inclusion Criteria
IC1 | Publication dated from January 2010 to 2022
IC2 | Conference publications or peer-reviewed journals
IC3 | In English and accessible
IC4 | The study focuses on the use of LA to support agency
Exclusion Criteria
EC1 | The study superficially looks at LA in the context of agency
EC2 | The study revolves around agency of teachers or staff
EC3 | Conference publications that were not part of the main conference (e.g., workshop papers), editorial reports, etc.
EC4 | Studies that merely revolve around data privacy-related issues in LA
Table 3. Overview of the included articles.
ID | Reference | Journal/Conference | Article Title
1 | Saarela et al. [21] | IEEE Access | Explainable student agency analytics
2 | Bennett and Folley [49] | Journal of Applied Research in Higher Education | Four design principles for learner dashboards that support student agency and empowerment
3 | Jääskelä et al. [37] | Behaviour & Information Technology | Student agency analytics: learning analytics as a tool for analysing student agency in higher education
4 | Algers [50] | Technology, Knowledge and Learning | Open textbooks: A balance between empowerment and disruption
5 | Heilala, Jääskelä, et al. [51] | SmartICT 2019 conference | Understanding the study experiences of students in low agency profile: Towards a smart education approach
6 | Wise et al. [18] | Online Learning | Developing Learning Analytics Design Knowledge in the “Middle Space”: The Student Tuning Model and Align Design Framework for Learning Analytics Use
7 | Wise [15] | LAK conference | Designing pedagogical interventions to support student use of learning analytics
8 | Wise et al. [52] | LAK conference | Learning analytics for online discussions: A pedagogical model for intervention with embedded and extracted analytics
9 | Tsai et al. [53] | Assessment & Evaluation in Higher Education | Empowering learners with personalised learning approaches? Agency, equity and transparency in the context of learning analytics
10 | Ochoa and Wise [20] | Educational Technology Research and Development | Supporting the shift to digital with student-centered learning analytics
11 | Ouyang et al. [54] | British Journal of Educational Technology | Effect of three network visualizations on students’ social-cognitive engagement in online discussions
12 | Chen and Zhang [17] | Journal of Learning Analytics | Analytics for Knowledge Creation: Towards Epistemic Agency and Design-Mode Thinking
13 | Shibani et al. [55] | The Internet and Higher Education | Educator perspectives on learning analytics in classroom practice
14 | Prinsloo and Slade [22] | Journal of Learning Analytics | Student Vulnerability, Agency, and Learning Analytics: An Exploration
15 | Heilala, Saarela, et al. [56] | IEEE Frontiers in Education Conference | Course Satisfaction in Engineering Education Through the Lens of Student Agency Analytics
Table 4. Objectives of the included articles.
Objective | Sub-Objective | Number of Articles | Article ID in Table 3
Investigate | Students’ experience of agency | 4 | 2, 5, 9, 15
Investigate | Educators’ experience of student agency | 3 | 4, 9, 13
Propose | Design principles for designing LA methods and dashboards to support student agency | 4 | 2, 9, 11, 12
Propose | Student agency-based framework for more productive use of LA | 4 | 6, 7, 10, 14
Develop | Explainable student agency analytics to support teacher reflection and pedagogical awareness of agency | 1 | 1
Develop | Student agency analytics for promoting students’ agentic awareness and informing pedagogical practices | 2 | 3, 5
Develop | LA approach for online network visualization to support student agency | 2 | 8, 11
Table 5. LA types and methods used in the included articles.
LA Types | LA Methods | Number of Articles | Article ID in Table 3
Descriptive | Statistical methods | 8 | 1, 2, 3, 5, 6, 11, 13, 15
Diagnostic | Data mining and statistical methods | 7 | 1, 3, 5, 6, 11, 13, 15
Predictive | Machine learning and statistical modeling | 2 | 1, 13
Prescriptive | Artificial intelligence | - | -
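To make the distinction between these LA types concrete, the sketch below contrasts descriptive analytics (summarizing what has happened) with predictive analytics (estimating a future outcome from past data). It is a minimal illustration under assumed data; the activity log, feature names, and model choice are hypothetical and are not taken from the reviewed studies.

```python
# Minimal sketch contrasting descriptive and predictive learning analytics.
# The activity log and features are illustrative, not data from the reviewed studies.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical LMS activity log: one row per student from a past course.
logs = pd.DataFrame({
    "student":     ["s1", "s2", "s3", "s4", "s5", "s6"],
    "logins":      [14, 3, 22, 8, 1, 17],
    "forum_posts": [5, 0, 9, 2, 0, 6],
    "passed":      [1, 0, 1, 1, 0, 1],
})

# Descriptive analytics: summarize what happened (mean activity by outcome).
print(logs.groupby("passed")[["logins", "forum_posts"]].mean())

# Predictive analytics: fit a simple model on the past course and estimate
# the pass probability of a new, currently enrolled student.
model = LogisticRegression().fit(logs[["logins", "forum_posts"]], logs["passed"])
new_student = pd.DataFrame({"logins": [4], "forum_posts": [1]})
print("Estimated pass probability:", model.predict_proba(new_student)[0, 1])
```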
Table 6. Proposed LA designs and elements for supporting student agency.
Proposed LA Elements and Design Principles | Explanation | Number of Articles | Article ID in Table 3
Agency analytics | Develop agency profiles and compare students’ agency profiles with the class average | 4 | 1, 3, 5, 15
Customization | Enable students to customize the dashboards | 3 | 2, 6, 10
Embedding dashboards | Embed dashboards into educational processes | 1 | 2
Meta-cognition support | Allows for goal setting and reflection; meta-cognitive activities in general | 3 | 6, 7, 11
Decision-making support | Facilitates and foregrounds students’ sense making (facilitates student decision and choice making based on the analytics) | 3 | 2, 6, 12
Engagement and discussion support | Supports productive engagement and discussion in the online discussion environment | 1 | 8
Transparency and privacy | Be transparent and visible in terms of data policies, practices, and algorithms (provide an explanation for the LA and the aspects of student agency) | 5 | 1, 6, 9, 10, 14
Relying on learning sciences | Offer interventions that are based on the learning sciences to balance what students want and what is good for them | 2 | 9, 10
Co-design | Enable students to identify actionable insights by involving them in the design and creation of analytic tools | 4 | 2, 9, 10, 13
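As a concrete illustration of the first element (agency analytics), the sketch below builds a per-dimension comparison between one student’s agency profile and the class average, which is the kind of comparison such dashboards surface. The dimension names and scores are illustrative assumptions rather than the validated agency instruments used in the reviewed studies (e.g., the AUS scale [36]).

```python
# Minimal sketch of the "agency analytics" element: comparing one student's
# agency profile with the class average. Dimensions and scores are illustrative.
import pandas as pd

# Hypothetical agency questionnaire scores (1-5 Likert scale), one row per student.
scores = pd.DataFrame({
    "student":       ["s1", "s2", "s3", "s4"],
    "self_efficacy": [4.2, 2.8, 3.5, 4.8],
    "participation": [3.9, 2.1, 4.4, 4.0],
    "peer_support":  [4.5, 3.0, 3.8, 4.6],
}).set_index("student")

class_average = scores.mean()   # mean score per agency dimension
profile = scores.loc["s2"]      # one student's agency profile

# A dashboard could visualize this per-dimension gap (e.g., as a bar chart).
comparison = pd.DataFrame({
    "student s2": profile,
    "class average": class_average,
    "gap": profile - class_average,
})
print(comparison.round(2))
```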
Table 7. Majors and visualization types mostly used in the studies.
Category | Number of Articles | Article ID in Table 3
Majors: Engineering | 3 | 1, 3, 15
Majors: Social sciences and humanities | 6 | 3, 4, 6, 9, 11, 13
Education level: K–12 | - | -
Education level: Higher education | 15 | 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15
Visualization types: Bar charts | 5 | 1, 2, 3, 5, 15
Visualization types: Pie chart | 1 | 2
Visualization types: Boxplot | 2 | 1, 2
Visualization types: Progress bar | 2 | 1, 2
Visualization types: Network visualization and tree style | 2 | 6, 11
Visualization types: Textual feedback, color coding, summary table | 3 | 2, 13, 15
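Since bar charts were the most frequently used visualization type across the reviewed dashboards, the sketch below shows one way such a chart could be produced, plotting a student’s agency-dimension scores against the class average. The data, dimension labels, and chart styling are illustrative assumptions, not outputs of the reviewed tools.

```python
# Minimal sketch of a bar chart comparing a student's (hypothetical) agency
# scores with the class average, the most common visualization in Table 7.
import numpy as np
import matplotlib.pyplot as plt

dimensions = ["Self-efficacy", "Participation", "Peer support"]
student = [2.8, 2.1, 3.0]       # illustrative scores on a 1-5 scale
class_avg = [3.8, 3.6, 4.0]

x = np.arange(len(dimensions))
width = 0.35
plt.bar(x - width / 2, student, width, label="Student")
plt.bar(x + width / 2, class_avg, width, label="Class average")
plt.xticks(x, dimensions)
plt.ylabel("Score (1-5)")
plt.title("Agency profile vs. class average")
plt.legend()
plt.show()
```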