Article

Self-Regulated Learning and Active Feedback of MOOC Learners Supported by the Intervention Strategy of a Learning Analytics System

Department of Computer Science Engineering, Universidad Autónoma de Madrid (UAM), 28049 Madrid, Spain
Electronics 2023, 12(15), 3368; https://doi.org/10.3390/electronics12153368
Submission received: 19 June 2023 / Revised: 27 July 2023 / Accepted: 30 July 2023 / Published: 7 August 2023

Abstract

MOOCs offer great learning opportunities, but they also present several challenges for learners that hinder them from successfully completing MOOCs. To address these challenges, edX-LIMS (System for Learning Intervention and its Monitoring for edX MOOCs) was developed. It is a learning analytics system that supports an intervention strategy (based on learners’ interactions with the MOOC) to provide feedback to learners through web-based Learner Dashboards. Additionally, edX-LIMS provides a web-based Instructor Dashboard for instructors to monitor their learners. In this article, an enhanced version of the aforementioned system called edX-LIMS+ is presented. This upgrade introduces new services that enhance both the learners’ and instructors’ dashboards with a particular focus on self-regulated learning. Moreover, the system detects learners’ problems to guide them and assist instructors in better monitoring learners and providing necessary support. The results obtained from the use of this new version (through learners’ interactions and opinions about their dashboards) demonstrate that the feedback provided has been significantly improved, offering more valuable information to learners and enhancing their perception of both the dashboard and the intervention strategy supported by the system. Additionally, the majority of learners agreed with their detected problems, thereby enabling instructors to enhance interventions and support learners’ learning processes.

1. Introduction

Massive Open Online Courses (MOOCs) are becoming an increasingly important and relevant tool in online learning environments, leading to an increase in available courses and enrolled learners [1]. The Universidad Autónoma de Madrid (UAM, Spain) has offered various courses on the edX platform (https://www.edx.org/school/uamx, accessed on 15 July 2023) since 2015. However, this experience has revealed several problems that arise in the context of e-learning. Learners often feel isolated and alone, which directly affects their learning and engagement in the course. This feeling stems mainly from the impossibility for an instructor to provide feedback [2,3,4] to many learners at the same time.
Learning analytics is generally recognised as the main field of study capable of providing solutions to these problems [5,6]. In different higher education institutions, a wide range of learning analytics approaches are employed for prediction and visualisation, among other concerns, in various areas of learning in MOOCs [7]. Furthermore, the author of this work has already developed several learning analytics systems [8,9] to address related challenges.
The most recent example of these learning analytics systems is edX-LIMS, which is a web-based learning analytics system (acronym of System for Learning Intervention and its Monitoring for edX MOOCs). edX-LIMS offers various services, including an intuitive “Web-based Learner Dashboard” that allows MOOC learners to visualise their engagement in the course. Additionally, it provides MOOC instructors with a user-friendly “Web-based Instructor Dashboard” to visualise learner interest in the aforementioned engagement metrics [9].
In this article, a new version of the aforementioned system is presented, called edX-LIMS+. This updated version includes several improvements, including the addition of two new services: (i) a service that generates new visualisations focused on self-regulated learning (SRL), integrated as a new part into each web-based Learner Dashboard, and (ii) a service that supports learner problem detection, also incorporated into the Learner Dashboard. Furthermore, the Instructor Dashboard has been enhanced with new visualisations related to learner interest in these new services available on the Learner Dashboard.
The main objectives of the research study conducted in this article are, firstly, to analyse the impact of the intervention strategy supported by edX-LIMS+ on learners by analysing the aforementioned recorded data and, secondly, to analyse the usefulness of the new services provided in the Instructor Dashboard for helping instructors monitor learners and provide them with assistance.
The structure of this article is as follows: The next section presents the related work in the area of the approach presented in this article. In Section 3, the MOOC used in the research study, along with the previous system (edX-LIMS) and the new one (edX-LIMS+), are detailed. Section 4 presents the intervention strategy, followed by Section 5, which describes the conducted research study, including the research questions and datasets. Section 6 presents the research results, Section 7 provides a discussion of these obtained results, and Section 8 offers conclusions and suggestions for future work.

2. Literature Review

Learning analytics (LA) is the research area that focuses on measuring, collecting, analysing, and reporting information on learners and their various contexts to enhance understanding and improve learning outcomes [5,6]. Education plays a crucial role in societies worldwide, and institutional reports from the European Commission highlight the need for further research in this field, with increasing support from teams and organisations for LA initiatives [7].
As a result, new initiatives and projects are emerging; however, a common mistake is observed in some of these initiatives, as they lack pedagogical support in the process [10]. This is particularly evident in technological developments, where the theoretical aspect and the potential contribution of a pedagogical vision to the project are often overlooked.
The field of learning analytics is expanding rapidly due to the abundance of learning data available today. This abundance enables researchers to draw conclusions and make informed decisions through data analysis.
Furthermore, there are numerous ways to categorise data analytics. Onah et al. [11] presented a well-known categorisation in the field based on three types: descriptive, predictive, and prescriptive.
Firstly, descriptive analytics serves as the foundation of data processing by collecting information from the past and presenting it in a structured format for further analysis. In the context of learning analytics, descriptive analytics is employed to transform historical learner data into organised learner information, which is then presented through various components in dashboards.
Secondly, predictive analytics utilises past and current data to forecast future events or outcomes, employing machine learning and data-mining techniques. In the context of learning analytics, predictive analytics is used to identify learners who are at risk of negative situations, such as dropout or failure to obtain a degree or certification [12,13].
Finally, prescriptive analytics builds on the other two forms and focuses on determining possibilities and recommending actions. In the field of learning analytics, it is valuable for suggesting actions that can have a positive impact through interventions in an individual’s learning process.
As mentioned in the introduction, this work concentrates on the utilisation of dashboards and self-regulated learning, with the objective of providing feedback to the learner through an intervention. These aspects will be elaborated upon in the following sections.

2.1. Dashboards in Learning Analytics Context

There are various research branches within the field of learning analytics, and one of them is dedicated to the development of dashboards. These dashboards consist of panels with different components designed to display data from various perspectives or insights.
There have been several systematic studies conducted on learning analytics dashboards (LADs). In particular, Matcha et al. [14] conducted a review that identified the most relevant works in the field of LADs. They identified the work of Verbert et al. [15] as the starting point, which analysed the target users of dashboards, the data handled by them, and the evaluations performed. Later, Verbert et al. [16] expanded on their work by classifying different articles based on the categories of LADs presented in various environments. They also analysed dashboards that support the four elements of the conceptual model previously developed by Verbert et al. [15]. As mentioned by Matcha et al. [14], both papers provide a comprehensive overview of the context.
Schwendimann et al. [17] conducted a further analysis of the different types of indicators displayed on dashboards and categorised the types of visualisations used. In another study, Bodily and Verbert [18] conducted a systematic review of students’ interaction with various feedback systems, including dashboards. They concluded that more research is needed to focus on improving the process of dashboard design, as it is crucial for achieving a positive impact of the dashboard.
Considering the growing recognition that LADs are a useful tool for providing reference frames [19], Jivet et al. [20,21] conducted a systematic review in which they reviewed articles classified into those that provide social, achievement, and progress reference frames.
In this branch of research, the same problem mentioned above can be observed, as many projects in the field of learning analytics dashboards (LADs) tend to focus primarily on technological development, often neglecting the pedagogical perspective. Additionally, other aspects of dashboard design and structure should also be taken into account, including the complexity of a dashboard’s components, which can vary depending on the intended purpose of the tool. Simple graphics are generally recommended for students, while more complex visualisations are reserved for instructors [22]. Kim et al. [23] discussed several success stories of different dashboards throughout their paper.
Schwendimann et al. [24] presented a study providing a comprehensive overview of the status of dashboards. It indicates that teachers are the primary target users in 75% of cases, followed by learners in 51% of cases. An additional 7% of the cases were related to online working environments. Moreover, the main purposes of the dashboards were predominantly monitoring others (71%) and self-monitoring (51%).
Dashboards primarily consist of graphs, along with other resources such as tables, texts, or videos. The most popular types of graphs are bar charts (60%), line charts (44%), tables (38%), pie charts (27%), and network charts (18%) [24]. It is important to consider not only the data themselves but also the way they are displayed. Learners should be able to comprehend the provided information in a usable and intuitive manner so that it may have an impact on them. Simpler visualisation techniques, such as timeline visualisation, have shown better results compared to more complex techniques like heat maps [25]. Additionally, selecting the appropriate information to display is crucial, with the aim of presenting useful information to the learner. In the study proposed by Sedrakyan et al. [25], students were found to be aware of certain information presented to them, while being unaware of other information such as time commitment, the activity that motivated them the most, or comparisons with their peers.
In [14], it is mentioned that a theoretical basis is often lacking when working with LADs. Therefore, it is important to employ appropriate approaches, from design to data capture, as the dashboard itself does not fulfil its purpose alone [18,21]. An appropriate pedagogical approach aids in all phases of the study to achieve the desired impact.
Another study presented by Smith [26] obtained comments supporting the use of dashboards by learners. In total, 83% of the students found the data easy to interpret, and 94% of them considered the provided information to be very useful.
In terms of other research, Leon et al. [27] presented a paper focused on developing a dashboard for the MOOC environment. Charleer et al. [28] presented another dashboard, in which learners were provided with their overall course progress, the option to compare themselves with their peers, and the expected time to complete the course based on similar historical data from other learners.

2.2. Self-Regulated Learning in Learning Analytics Context

One of the main focuses of dashboards in the context of learning analytics is self-regulated learning (SRL) [29,30,31]. It has been found that students highly value information related to time spent and time remaining. Topali et al. [4] concluded in their study that many learners face challenges with effectively managing their course time.
Matcha et al. [14] mentioned several SRL models proposed by various authors, such as Zimmerman’s models [32] based on socio-cognitive theory, Boekaerts’ model [33] based on the roles of goals and emotion, and Winne and Hadwin’s model [34] developed based on information-processing theory.
In their study, Matcha et al. [14] focused their review on the last mentioned model by Winne and Hadwin, as this model has been adopted in computer-assisted learning [35,36]. The COPES model [34] indicates that SRL consists of four distinct phases: task definition, goal setting and planning, enactment of tactics and strategies, and adaptation. Additionally, the model includes five components: conditions, operations, output, evaluation, and standards.
In one study [14], it is also mentioned that students often have limited awareness of their own performance. Good students tend to underestimate their performance, while poor students tend to overestimate their performance, leading to inaccurate self-assessments [37]. Furthermore, Winne and Jamieson-Noel [38,39] observed that learners’ self-reported performance did not align with their recorded actions [40]. These misconceptions about their performance can result in learners adopting ineffective learning strategies. Therefore, Matcha et al. [14] suggested that external assistance and alternative perspectives can help correct these misjudgements.
Additionally, Rohloff et al. [41] highlighted that dashboards assist learners in applying self-regulation strategies and are perceived as useful by the majority of learners. Dashboards aid them in planning their learning activities and monitoring their progress throughout courses.
It is also important to highlight previous research works that address the generation of e-learning learner profiles and the use of learning analytics to optimise their learning path, as it can assist them in their self-regulated learning [42,43].

2.3. Intervention and Feedback in Learning Analytics Context

The primary objective of using a dashboard is to provide feedback, which is conveyed through the displayed data or information. As stated by the Education Endowment Foundation, “Feedback is information given to the learner about the learner’s performance relative to learning goals or outcomes. It should aim to (and be capable of producing) improvement in students’ learning” (source: https://educationendowmentfoundation.org.uk/, accessed on 15 July 2023). Feedback not only contributes to learner motivation [44] but also enhances learner engagement [45], thereby directly influencing the learner’s academic performance.
To reiterate the points mentioned earlier, it is crucial to design feedback effectively, employing appropriate approaches to achieve the tool’s objective [46]. In online environments, providing impactful feedback becomes more challenging due to the geographical distance between learners and instructors [47]. Automatic feedback is one alternative that can be highly beneficial, as highlighted in a systematic review [47], where it was found to improve student performance in activities (50.79% of the articles) and reduce the instructor’s workload.
Lim et al. [48] conducted another study where they investigated perceptions and emotional responses to personalised feedback based on learning analytics across four courses. The findings revealed that learners had predominantly positive perceptions, and affective responses were mostly associated with increased motivation for learning. The study also emphasised the need for further research on how feedback can influence learner learning, considering contextual factors and other aspects.
In a different article, Pardo et al. [49] highlighted the potential of combining data collected through digital tools in learning environments with the instructor’s course knowledge. This combination enabled the provision of personalised feedback messages to a large number of learners, resulting in improved learner satisfaction with feedback and academic performance.
Regarding the potential of personalised feedback, Lim et al. [50] compared courses where feedback was not provided with those where it was. In an engineering course, it was observed that positive learning strategies increased in courses with feedback, while negative learning strategies decreased. Learners themselves commented that the feedback equipped them with effective learning strategies and perceived a positive impact on their learning.
However, it can be challenging to provide feedback to learners who need it the most, as observed in the study presented by Iraj et al. [51], where the most active learners interacting with tools like the dashboard tend to be those with higher grades or better performance.
Bennett et al. [52] provided evidence that even learners with lower grades felt more motivated when they could view their data on a dashboard. This motivation led to positive decisions and improved outcomes. Similarly, Corrin et al. [53] corroborated that most learners reported increased motivation after receiving feedback and accessing data from the dashboard.
Cobos et al. [8] presented in another study that learners who received feedback had a higher success rate in their course. Furthermore, 80% of students who received help reported a positive impact on their performance, while 90% of those who did not receive help expected a positive impact. Learner reflection and awareness of their performance positively influence the course trajectory, and analytical data visualisation facilitates such reflection [54]. For example, Smith [26] reported that 55% of respondents stated that they would change their study approach after reviewing the provided data.
Broos et al. [55] used a dashboard displaying indicators such as concentration, anxiety, motivation, assessment strategies, and time management. It was found that poorly performing students were motivated by using the dashboard.
Several tools in the context of learning analytics aim to improve student outcomes through intervention strategies. One example is the work presented by Hernández-Leo et al. [56], which focuses on supporting decision making in learning design.
Additionally, some studies such as the one presented by Sønderlund et al. [57] indicated that learners, particularly in distance learning courses, require a sense of instructor presence. These findings were revealed through questionnaires, where students expressed a need for regular feedback on their performance.
Another essential aspect for achieving impact, as mentioned earlier, is ensuring that learners can interpret the data correctly [53]. In another study, the most successful data visualisation among learners included overall course progress, a diagram illustrating accumulated points achieved in comparison to the maximum possible points of the tests, and the time required to complete the course, supporting learner self-regulation.
Finally, many studies, such as the one presented by Heikkinen et al. [58], utilised the visualisation provided by a dashboard as a means to enhance student self-regulated learning through learning analytics intervention strategies.

3. The Context of the Research Study

In this section, the context surrounding the present work is discussed. Firstly, the MOOC in which the study was conducted, including its structure and important concepts, is introduced. Then, the existing edX-LIMS, with its services and functionalities, is described. Finally, a detailed description is provided for the new services implemented, which constitute the enhanced version of edX-LIMS, known as edX-LIMS+.

3.1. WebApp MOOC

The research study presented in this article was conducted with the learners of a MOOC provided by UAM on the edX platform (https://www.edx.org, accessed on 15 July 2023). The MOOC is titled “Introduction to Development of Web Applications” (WebApp MOOC for short). In this course, learners acquire skills in developing web applications and gain knowledge about concepts like client–server architecture. They have the opportunity to learn various technologies for this purpose, including HTML, CSS, JavaScript, Python, JSON, and Ajax. The course has been available in a self-paced mode since April 2019, meaning that there is no end date, and learners can join at any time. The course’s organisation is shown graphically below.
The course is divided into five different units that cover various course contents. Additionally, there is another unit titled “Unit 0”, which provides information on how learners can organise their environment to follow the course effectively. The first unit focuses on introducing the learner to the context by analysing the features of the World Wide Web. The second unit explains HTML, teaching learners how to create forms and web pages with CSS for proper styling and formatting.
Moving on to the third unit, learners are introduced to Flask and Python, which are used to create the server side of the web application. The fourth unit covers the use of sessions and JSON. Finally, in the fifth unit, learners are taught JavaScript and Ajax, which are essential for creating the client side of the application.
Each unit is further divided into subunits, designed to organise the content effectively and facilitate learning. For instance, the second unit consists of three subunits, as illustrated in Figure 1. Subunit 2.1 introduces the concept of HTML, while Subunit 2.2 focuses on explaining forms, and Subunit 2.3 covers the introduction of CSS. These subunits are complemented by interactive pages that learners can engage with, displaying the same content on their screens.
To provide an example, Subunit 2.1 includes seven pages. The first one introduces HTML, followed by an explanation of the language. Subsequently, learners’ knowledge is assessed through various assessments. These pages incorporate different elements, with text boxes containing course content in various formats such as HTML, PDF, and more. Additionally, videos, forums, and assessments are among the essential elements found on these pages.
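To make this hierarchy concrete, the following minimal Python sketch models the course organisation as nested units, subunits, pages, and page elements. The class and field names are illustrative assumptions, not identifiers taken from edX or edX-LIMS+.

```python
from dataclasses import dataclass, field
from typing import List

# Hedged sketch of the course hierarchy described above (units, subunits,
# pages, and the elements on each page). Names are illustrative only.

@dataclass
class Element:
    kind: str   # e.g. "text", "video", "forum", "assessment"
    title: str

@dataclass
class Page:
    title: str
    elements: List[Element] = field(default_factory=list)

@dataclass
class Subunit:
    title: str
    pages: List[Page] = field(default_factory=list)

@dataclass
class Unit:
    title: str
    subunits: List[Subunit] = field(default_factory=list)

# Example: Unit 2 with its three subunits; Subunit 2.1 has seven pages,
# the first introducing HTML and later ones assessing the learner.
unit2 = Unit(
    title="Unit 2: HTML and CSS",
    subunits=[
        Subunit(
            title="2.1 Introduction to HTML",
            pages=[Page(title=f"Page {i + 1}") for i in range(7)],
        ),
        Subunit(title="2.2 Forms"),
        Subunit(title="2.3 Introduction to CSS"),
    ],
)
```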

3.2. edX-LIMS

At UAM, we have developed a learning analytics system called edX-LIMS (acronym for System for Learning Intervention and its Monitoring for edX MOOCs). It has been in use since June 2020 for the WebApp MOOC learners [9].
edX-LIMS is a web-based intervention system that provides feedback to learners. The system analyses learners’ performance in a course and generates feedback in the form of a “Web-based Learner Dashboard”. Each learner can access their dashboard by receiving a weekly email containing instructions on how to access it.
In addition, course instructors have access to a “Web-based Instructor Dashboard” where they can view general course data and access learners’ dashboards. Furthermore, the instructor has the ability to monitor learners’ access to and interactions with their dashboards, as the system records all learner interactions with each part of the dashboard.
The system provides the following services:
  • Course-Data-Processing Service (CDPS): The system uses course data to calculate learner indicators, which serve as input variables representing a summary of learner interactions. Examples of these indicators include the number of events related to assignment resolution, video viewing, and course navigation, as well as the total time spent in the course, the number of sessions in the course, and other relevant metrics (see Figure 2). This service is executed by the system administrator.
  • Learning Intervention Generation Service (LIGS): Based on learner indicators, the system generates individual web-based dashboards for each learner. Subsequently, it sends the necessary information to access these dashboards to the learners via email.
  • Intervention Visualisation Service (IVS): The system enables learners to view their participation and performance in the course through their dashboard. Learners can access a web page containing various graphs that provide information about their grades per course unit and the values of the indicators over time.
  • Course-Data-Monitoring Service (CDMS): Instructors can utilise this service to monitor course summary data provided by the system.
  • Learning-Intervention-Monitoring Service (LIMS): The system enables instructors to monitor learners’ interest in their dashboards, since all learner actions on the dashboards are recorded by the system.
Figure 2 shows an example of the graphs generated by edX-LIMS in a Learner Dashboard, displaying the learner’s interactions in the form of daily values of the calculated indicators. The learner can select which indicators to show in the graphs and compare them with the averages of these metrics for all learners. The left graph shows the selected metrics per day, and the right graph shows their values accumulated over time. The learner can click on the question mark icon to access help, i.e., a pop-up window with descriptions of the values in these graphs.
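As an illustration of how such daily indicator graphs could be produced, the following hedged sketch derives per-day indicators from raw event logs with pandas and adds the all-learner averages and per-learner cumulative values. The column and indicator names are assumptions and do not reflect the exact schema used by edX-LIMS.

```python
import pandas as pd

def daily_indicators(events: pd.DataFrame) -> pd.DataFrame:
    """events: one row per tracked event, with assumed columns
    ['learner_id', 'timestamp', 'event_type', 'session_id', 'duration_s']."""
    events = events.assign(day=pd.to_datetime(events["timestamp"]).dt.date)
    return (
        events.groupby(["learner_id", "day"])
        .agg(
            problem_events=("event_type", lambda s: int((s == "problem").sum())),
            video_events=("event_type", lambda s: int((s == "video").sum())),
            sessions=("session_id", "nunique"),
            time_spent_s=("duration_s", "sum"),
        )
        .reset_index()
    )

def with_cohort_average_and_cumulative(daily: pd.DataFrame) -> pd.DataFrame:
    metrics = ["problem_events", "video_events", "sessions", "time_spent_s"]
    daily = daily.sort_values(["learner_id", "day"]).reset_index(drop=True)
    # Per-learner cumulative values over time (right-hand graph in Figure 2).
    for m in metrics:
        daily[f"cum_{m}"] = daily.groupby("learner_id")[m].cumsum()
    # All-learner daily averages for comparison (left-hand graph).
    cohort_avg = daily.groupby("day")[metrics].mean().add_prefix("avg_").reset_index()
    return daily.merge(cohort_avg, on="day")
```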

3.3. edX-LIMS+

Since July 2021, an updated version of the aforementioned system, called edX-LIMS+, has been implemented. Building upon the previous version, edX-LIMS+ incorporates several additional services aimed at enhancing the data-driven intervention strategy supported by the learning analytics system (more details will be provided in the next section).
Additionally, edX-LIMS+ includes support for the detection of two common problems that MOOC learners often face: the “difficulty problem” and the “SRL problem”. On the one hand, the difficulty problem occurs when a learner requires more time than the average learner to learn the course content, watch videos, or complete assessments. On the other hand, the SRL problem arises when a learner does not allocate enough time for learning the content, watching videos, or completing assessments.
To address these problems, the system analyses learners’ interactions within their course to identify those who are experiencing either of the mentioned difficulties. Based on this analysis, the system generates suggestions to help learners manage and overcome these challenges.
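As a rough illustration of this analysis, the sketch below applies simple rule-based checks for the two problems, comparing a learner’s time and attempts with the cohort averages. The thresholds and field names are assumptions for illustration only; the actual rules used by edX-LIMS+ consider the time spent and attempts per section and element, as described in Section 4.

```python
# Hedged sketch of rule-based problem detection. Field names and thresholds
# are illustrative assumptions, not the real edX-LIMS+ rule set.

DIFFICULTY_FACTOR = 1.5   # assumed: "well above average" means 1.5x the cohort mean
SRL_FACTOR = 0.25         # assumed: "too little time" means under 25% of the cohort mean

def detect_problems(learner: dict, cohort_avg: dict) -> list:
    """learner / cohort_avg: {'time_content': s, 'time_video': s, 'attempts': n}."""
    problems = []
    # Difficulty problem: the learner needs noticeably more time or attempts
    # than the average learner to cover the same material.
    if (learner["time_content"] > DIFFICULTY_FACTOR * cohort_avg["time_content"]
            or learner["attempts"] > DIFFICULTY_FACTOR * cohort_avg["attempts"]):
        problems.append("difficulty")
    # SRL problem: the learner is not allocating enough time to content,
    # videos, or assessments compared with the cohort.
    if (learner["time_content"] < SRL_FACTOR * cohort_avg["time_content"]
            and learner["time_video"] < SRL_FACTOR * cohort_avg["time_video"]):
        problems.append("srl")
    return problems

# Example: a learner spending triple the average time on content is flagged
# with a difficulty problem and shown suggestions on the dashboard.
print(detect_problems(
    {"time_content": 5400, "time_video": 1200, "attempts": 12},
    {"time_content": 1800, "time_video": 900, "attempts": 4},
))  # -> ['difficulty']
```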
The existing services in edX-LIMS have been enhanced in edX-LIMS+. In the Course-Data-Processing Service, the system now calculates additional data for the learners, specifically their time spent and attempts on different sections and elements of the course (as depicted in Figure 1). Moreover, averages, maximums, and minimums are computed for each of the sections and elements, providing valuable insights into learner engagement and performance.
Figure 3 shows an example of the graphs generated by edX-LIMS+ in a Learner Dashboard with data on the learner’s time spent and attempts in the course units. These graphs also include a comparison of the learner’s values with the average and maximum values of all learners in the course.
The Learning Intervention Generation Service introduces a new page in the Learner Dashboard, providing additional data. Consequently, the Intervention Visualisation Service presents learners with the following on the new page: (i) new graphs related to the time spent and attempts on different sections and elements, and (ii) information regarding any identified problems and corresponding suggestions for managing them.
The Course-Data-Monitoring Service enriches the Instructor Dashboard with new graphs, providing information about the average time spent and attempts on different sections and elements by learners.
The Learning-Intervention-Monitoring Service enables instructors to visualise learners’ interests across the two pages of their Dashboards.
As mentioned earlier, the system has been expanded, and the following services have been added to edX-LIMS+:
  • Learner Problem Detection Service (LPDS): The system utilises learner indicators to identify individuals who are experiencing either a “difficulty problem” or a “SRL problem”.
  • Learner Feedback Service (LFS): The system includes a section on the new page of learners’ dashboards where they can provide feedback to instructors regarding their agreement or disagreement with the detected problem. This section consists of a textbox (registered upon clicking) and a text area where learners can explain the reasons behind their feedback.
  • Learner-Problem-Monitoring Service (LPMS): The system enables instructors to monitor all the data related to the problems detected among learners, as well as the data that support those detections.
  • Learner-Feedback-Monitoring Service (LFMS): The system allows instructors to monitor all the feedback received from learners regarding the detected problems. This includes learners’ feedback on their agreement or disagreement with the detected problems and their reasons for providing such feedback.
Figure 4 shows all the edX-LIMS+ services, which are interconnected through a MongoDB database. The figure also indicates which services are intended for each type of user of the learning analytics system. The types of system users are as follows: (i) learner (any learner registered in the verified itinerary), (ii) instructor (any member of the course instructor team), and (iii) admin (any instructor who maintains and manages the course data in the system).
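As a minimal illustration of how the services could share data through the MongoDB database, the pymongo sketch below writes documents that the dashboard-facing and monitoring services would later read. The connection string, collection names, and field names are assumptions, not the actual edX-LIMS+ schema.

```python
from datetime import datetime, timezone
from pymongo import MongoClient

# Hedged sketch of shared collections between edX-LIMS+ services.
client = MongoClient("mongodb://localhost:27017/")
db = client["edx_lims_plus"]

# CDPS writes the weekly indicators computed from the course data.
db.indicators.insert_one({
    "learner_id": "learner_42",
    "week": "2021-07-05",
    "time_spent_s": 5400,
    "attempts": 12,
})

# LPDS stores any detected problem so the Learner Dashboard (IVS) and the
# Instructor Dashboard (LPMS) can both display it.
db.problems.insert_one({
    "learner_id": "learner_42",
    "week": "2021-07-05",
    "problem": "difficulty",
    "suggestion": "Review Subunit 2.1 before retrying the assessments.",
})

# LFS records the learner's agreement or disagreement, read later by LFMS.
db.feedback.insert_one({
    "learner_id": "learner_42",
    "problem": "difficulty",
    "agrees": True,
    "comment": "Yes, I needed extra time for the CSS exercises.",
    "submitted_at": datetime.now(timezone.utc),
})
```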

4. Materials and Method

In this research study, a weekly data-driven intervention strategy supported by edX-LIMS+ is conducted with the WebApp MOOC learners. This strategy is a process that follows a specific sequence of steps on a weekly basis, as illustrated in Figure 5. The process begins with the CDPS analysing learners’ data in the MOOC. Subsequently, the LIGS generates Learner Dashboards, which are enhanced with information on learners’ self-regulated learning approaches. These dashboards display data on the time spent and attempts made in various sections and elements of the course, such as units, subunits, pages, videos, and assessments. Furthermore, the learner’s data are compared to the average data of certified learners, who have successfully completed the course with a final grade higher than 0.5 (maximum value for the final grade is 1). This comparison allows learners to better assess their performance in the course and gauge their level of commitment by comparing themselves to certified learners.
In addition to the services mentioned earlier, the LPDS is responsible for detecting learners’ difficulty problems and SRL problems. These problems are identified based on a set of rules that consider the time spent and attempts made in various sections and elements of the course. The information about the detected problems and corresponding suggestions to manage them is added to a specific section of the Learner Dashboard.
Learners receive an email containing instructions to access their Learner Dashboards. They can then interact with different parts of the dashboard using the IVS. The IVS enables learners to compare their data with that of certified learners, and all these interactions are recorded in the system. Additionally, learners can provide feedback on the detected problems to the instructors, expressing their agreement or disagreement and providing comments. This feedback is managed by the LFS.
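The following sketch outlines this weekly cycle in simplified form, with trivial stand-in functions for the services involved; all names, values, and rules here are illustrative assumptions rather than the actual edX-LIMS+ code.

```python
def course_data_processing_service(course_id):
    # CDPS stand-in: per-learner indicators (time spent, attempts, ...).
    return {"learner_42": {"time_content": 5400, "attempts": 12}}

def learner_problem_detection_service(data, certified_avg):
    # LPDS stand-in: a single rule as a placeholder for the full rule set.
    return ["difficulty"] if data["time_content"] > 1.5 * certified_avg["time_content"] else []

def learning_intervention_generation_service(learner_id, data, certified_avg, problems):
    # LIGS stand-in: builds the Learner Dashboard and returns its access link.
    return f"https://example.org/dashboard/{learner_id}"

def send_access_email(learner_id, url):
    print(f"Email to {learner_id}: access your dashboard at {url}")

def run_weekly_intervention(course_id):
    indicators = course_data_processing_service(course_id)
    certified_avg = {"time_content": 1800, "attempts": 4}  # averages of certified learners (final grade > 0.5)
    for learner_id, data in indicators.items():
        problems = learner_problem_detection_service(data, certified_avg)
        url = learning_intervention_generation_service(learner_id, data, certified_avg, problems)
        send_access_email(learner_id, url)

run_weekly_intervention("WebApp-MOOC")
```

From this point onwards, the IVS records every learner interaction with the dashboards and the LFS stores the learners’ agreement or disagreement with the detected problems.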
Simultaneously, instructors can access the Instructor Dashboard, where they can monitor various aspects. The CDMS provides information about the course itself, while the LIMS provides insights into learners’ interactions with the dashboards. The LPMS provides data on the detected problems, and the LFMS offers information on learners’ feedback regarding the detected problems.
This instructor monitoring provides valuable information and evidence that can assist instructors in implementing intervention actions. For example, if learners were initially making good progress in the course but have become disengaged for a while, instructors can send them motivational messages to re-engage them in the learning process.
Table 1 illustrates how the services comprising the intervention strategy of each version of the learning analytics system generate the Learner Dashboard. The table summarises the elements displayed on each web page that makes up the Learner Dashboards.

5. Research Study

The research study presented in this article had two main objectives. Firstly, it aimed to analyse the impact of the intervention strategy supported by edX-LIMS+ on learners, specifically in terms of their usage of the Learner Dashboard, as well as their engagement and motivation in the MOOC. Secondly, the study aimed to analyse how the intervention strategy affected instructor tasks, particularly in terms of monitoring learners. This research study serves as an extension of a previous study [59].

5.1. Research Questions

Firstly, the following research questions focusing on learners are proposed:
  • RQ1: Does the intervention strategy increase learners’ interest in using the Learner Dashboards?
  • RQ2: Does the intervention strategy enhance learners’ perception of the usefulness of the Learner Dashboards?
  • RQ3: Does the intervention strategy improve learners’ engagement and motivation in the course?
  • RQ4: Do learners agree with the problems detected by the system?
Secondly, the research questions related to instructors are as follows:
  • RQ5: Does the intervention strategy enhance instructors’ perception of the usefulness of the Instructor Dashboard?
  • RQ6: Does the intervention strategy have an impact on instructors’ monitoring of the learners?

5.2. Datasets

For the research questions, data collected from three different sources are analysed, specifically the following:
  • Learner Satisfaction Questionnaire (LSQ): When a learner completed the course and earned a certificate, he or she was given the opportunity to complete a questionnaire to evaluate the Learner Dashboard. The questionnaire consisted of multiple-choice questions and responses on a five-point Likert-type scale: (1) Strongly disagree; (2) Disagree; (3) Neither agree nor disagree; (4) Agree; (5) Strongly agree. These data are relevant to research questions RQ1, RQ2, and RQ3.
  • Learner Interactions with the Learner Dashboard (LILD): The system records all learners’ interactions with their dashboards, including their opinions and feedback about the detected problems. These data are relevant to research questions RQ1 and RQ4.
  • Instructor Satisfaction Questionnaire (ISQ): Course instructors were provided with a questionnaire to evaluate the Instructor Dashboard. The questionnaire used a five-point Likert scale (1–5), similar to the LSQ, and included both answer choices and open-ended textual responses. These data are relevant to research questions RQ5 and RQ6.

6. Results

From June 2020 to June 2021, we used the initial version of the learning analytics system, edX-LIMS, to generate the dashboards for the WebApp MOOC learners and instructors. Since July 2021, the new intervention strategy supported by edX-LIMS+ has been in place. As a result, learners and instructors have had the opportunity to access and view both sets of components offered in the dashboards, provided by edX-LIMS and by edX-LIMS+.
The MOOC used in the research study is in the Spanish language, and some of its details are in Section 3 (https://www.edx.org/course/introduccion-al-desarrollo-de-aplicaciones-web-2, accessed on 15 July 2023). This MOOC began in April 2019 in a self-paced mode. In June 2020, there were 17,110 learners enrolled in the MOOC. Among them, 811 learners opted for the verified itinerary and are referred to as verified learners. Verified learners have paid a fee and will receive an official certificate upon successful completion of the course. Out of these verified learners, 322 have obtained certification by achieving a final grade equal to or greater than 0.5 (with the minimum grade being 0 and the maximum grade being 1). Only learners enrolled in the verified itinerary have access to assessments on edX; thus, only they can be evaluated and obtain a final grade. Learners who are not enrolled in the verified itinerary are considered audit learners.
At the end of June 2021, there were 32,617 learners enrolled in the MOOC. Among them, 1182 learners opted for the verified itinerary. Out of these verified learners, 452 have obtained certification. At the end of July 2022, there were 47,108 learners enrolled in the MOOC. Among them, 1580 learners were verified. Out of those, 675 have obtained certification.
Taking into account that approximately 50% of learners have been observed to drop out over the duration of the course [59], the number of learners actively studying in the MOOC from June 2020 to July 2022 is estimated at around 400–500.
Of the enrolled learners, 72% were men and 28% were women. They represented a total of 143 different countries. However, the majority of learners were from Spain (19%), Mexico (14%), and Colombia (13%). Additionally, most of the learners had completed undergraduate studies (51%), while 30% had completed secondary studies, and 11% had pursued postgraduate studies such as masters or doctorates (source: https://insights.edx.org/, accessed on 15 July 2023).
This research study was conducted exclusively with verified learners in the MOOC. Within this group of learners in the research study, the following distinctions were made:
  • LIMS_learners: learners who received emails with access to their Learner Dashboards supported by edX-LIMS. Their data are presented in the following figures with the label “edX-LIMS” for simplicity.
  • LIMS+_learners: learners who received emails with access to their Learner Dashboards supported by edX-LIMS+. Their data are presented in the following figures with the label “edX-LIMS+” for simplicity.
The research results obtained from the analysis of the different sources are presented in the following subsections.

6.1. The Analysis of Learner Usage of Learner Dashboards

In Figure 6, the percentage of learners who viewed their dashboard weekly is shown (for comparability, the same number of months is shown for the use of the first version of the system as for the combined use of both versions). As depicted, when the new version of the learning analytics system was used, learners accessed their dashboards more frequently and interacted with the parts offered by edX-LIMS+ more often. On average per month, 4.5% of LIMS_learners viewed their dashboards, while 6% of LIMS+_learners viewed them.
Taking into account the high dropout rate of MOOCs and the self-paced nature of learner participation, it is important to acknowledge that obtaining satisfaction questionnaires from MOOC learners is extremely challenging. In this research study, a total of eight responses were obtained from the LSQ for LIMS_learners, and fourteen responses were obtained from the LSQ for LIMS+_learners.
In the following graphs, a comparison of the answers to the questions (in italics) between the LSQ for LIMS_learners and the LSQ for LIMS+_learners is presented.
As we can observe in both Figure 7a,b, LIMS+_learners more strongly agree that the dashboard shows them their progress. Therefore, these learners could be more motivated to visit their dashboards.
As depicted in Figure 8b, the highest percentage of responses for the frequency of receiving messages is “every 3–4 days”. This indicates that LIMS+_learners express a preference for more frequent updates of data in their dashboard compared to LIMS_learners.
Hence, it can be concluded that the intervention strategy supported by edX-LIMS+ has increased learners’ interest in using the Learner Dashboards, thus confirming the findings related to RQ1.

6.2. The Analysis of Learner Perception of Learner Dashboards

As displayed in Figure 9a,b, it is evident that a higher percentage of LIMS+_learners strongly agree that the different parts of the dashboard are adequate, understandable, and usable. This indicates that these learners have a more positive perception of the usefulness of the Learner Dashboards, which aligns with the findings related to RQ2.

6.3. The Analysis of Learner Perception of Engagement and Motivation in the Course

Firstly, in both Figure 10a,b, a higher percentage of LIMS+_learners agree that the visualisations in the dashboard have improved their performance and motivated them to study for the course.
Secondly, in both Figure 11a,b, a higher percentage of LIMS+_learners agree that the visualisations in the dashboard have made them feel that their activity was recognised and have helped them maintain a consistent rhythm for learning in the course.
Finally, in both Figure 12a,b, a higher percentage of LIMS+_learners strongly agree that the visualisations in the dashboard have made them feel supervised, guided, and not alone while they were learning in the course.
Hence, it can be concluded that the intervention strategy supported by edX-LIMS+ has had a positive impact on learners’ engagement and motivation, supporting the findings related to RQ3.

6.4. The Analysis of the Learner Feedback on Their Detected Problems

Figure 13 demonstrates that a majority of learners agree with the problems detected. Specifically, among the LIMS+_learners who provided feedback, 60% agreed that the SRL problem was detected correctly, 100% agreed that the difficulty problem was detected correctly, and 85% agreed that they did not have any problem.
Furthermore, 50% of these LIMS+_learners provided comments in text format. Most of the comments were related to the learners interrupting their learning in the course but expressing their intention to continue after a short period of time.
Therefore, it can be concluded that the system has a high accuracy in detecting problems (see RQ4).

6.5. The Analysis of Instructor Perception of Instructor Dashboard

Both Figure 14a,b indicate that more instructors agree that the different parts of the dashboard are adequate, understandable, and usable. In the free text part of the ISQ, they expressed a very positive sentiment about the new visualisations and tables added by edX-LIMS+ (see RQ5).

6.6. The Analysis of Instructor Tasks to Monitor Learners

The instructors were asked how the new parts added to the Instructor Dashboard had influenced their learner-monitoring tasks, and they provided their responses in the free text part of the ISQ. They acknowledged that the visualisations of the new parts added by edX-LIMS+ helped them better understand learner progress and identify those learners who needed assistance.
Because of the newly added tables, the instructors were able to make decisions to assist learners with problems. Specifically, they implemented two additional interventions to address the needs of these learners. Firstly, the instructors sent motivational messages to learners who were making progress but seemed to lack engagement, in order to motivate them to continue with the course. Secondly, the instructors provided learners who completed the course with a final grade close to but less than 0.5 a second opportunity to retake assessments with low scores, giving them a chance to pass the course and earn certification.
Additionally, instructors noted that learners who actively engaged with their dashboards were more receptive to receiving assistance. Instructors received gratitude from learners for responding to their emails and implementing the additional interventions. This interaction improved the relationship between instructors and learners, establishing a sense of bidirectional communication.
In summary, instructors reported that the new version of the Instructor Dashboard supported by edX-LIMS+ enhanced their ability to monitor MOOC learners and improve their tasks. This supports the findings related to research question RQ6.

7. Discussion

In order to answer the research questions of this study, opinions from learners and instructors regarding the visualisations in their dashboards provided by the learning analytics system were gathered using two questionnaires: the Learner Satisfaction Questionnaire (LSQ) and the Instructor Satisfaction Questionnaire (ISQ). These opinions were analysed. Additionally, all learner interactions with their dashboards and their feedback on the detected problems recorded by the system were analysed.
Regarding the interest of learners in using their dashboards (RQ1), the results of the study indicate that the data-driven intervention strategy supported by edX-LIMS+ significantly increased learners’ interest in accessing and using their dashboards. This is evident from the increased interactions with the new parts of the dashboards provided by edX-LIMS+ and learners’ expressed desire for more frequent updates.
In regard to whether the intervention strategy supported by edX-LIMS+ improved learners’ and instructors’ perceptions of the usefulness of their dashboards (RQ2 and RQ5), learners found the new parts of their dashboards to be more suitable, understandable, and usable compared to the previous version. Instructors reported that the additional parts in the Instructor Dashboard helped them gain a better understanding of learner progress and identify those in need of assistance.
Regarding the analysis of learners’ perceptions of their motivation and engagement in the MOOC (RQ3), learners recognised an increased motivation to learn, improved their performance, felt recognition for their efforts, maintained a consistent learning rhythm, and felt supervised and guided, knowing that they were not alone in their learning journey, thanks to the intervention strategy supported by edX-LIMS+.
In regard to the utility of problem detection for learners (RQ4), most learners agreed that the system accurately detected their problems, and they expressed gratitude for the suggestions received through their dashboards on how to address these problems. Additionally, 50% of learners provided feedback to instructors on these detected problems.
Finally, in response to the feedback on detected problems, instructors were able to provide better assistance to learners in need, resulting in improved learner tracking (RQ6). For instance, instructors encouraged learners who lacked engagement, motivating them to continue in the MOOC. They also provided learners who scored close to but less than 0.5 on assessments with a second opportunity to improve their scores and pass the course for certification. This led to a stronger connection between instructors and MOOC learners, improving communication and support.

8. Conclusions

Great learning opportunities are provided through MOOCs. However, many learners face challenges in successfully completing MOOCs due to various factors, including feelings of isolation, lack of support, and limited feedback. To address these issues, edX-LIMS (System for Learning Intervention and its Monitoring for edX MOOCs) was developed. It is a learning analytics system that supports an intervention strategy approach. This system offers MOOC learners a web-based Learner Dashboard, providing them with an easy way to track their learning progress in the course. Additionally, the system provides MOOC instructors with a web-based Instructor Dashboard, enabling them to monitor learners’ progress and their usage of the dashboards.
Further, this learning analytics system was improved with new services, resulting in a new version called edX-LIMS+. These new services aim to enhance the dashboards for both instructors and learners. This new version of the system focuses on promoting learners’ self-regulated learning and encouraging their active participation by providing feedback to instructors about the problems detected in the MOOC.
In this article, a research study conducted with learners in a MOOC was presented. The results obtained from this study corroborate that the intervention strategy supported by edX-LIMS+ had a positive impact on learners’ engagement, motivation, and use of the Learner Dashboard. Additionally, the intervention strategy improved instructor tasks related to learner monitoring and assistance. Moreover, instructors stated that the communication between learners and instructors improved, and they felt closer to their learners, thanks to the learners’ feedback about the problems detected.
Both the Learner Dashboard and Instructor Dashboard can be improved. For instance, the system could be extended to include detection of other types of learner problems. Furthermore, new data and visualisations depicting the evolution of the learners’ detected problems could assist instructors in better monitoring them. Additionally, in a new version of the system, recording additional interventions by instructors and capturing learners’ reactions to them could help instructors enhance these interventions.
Additionally, as a work in progress, edX-LIMS+ is being extended with a machine learning service. This service utilises learner activity data as input to an artificial intelligence model that statistically predicts the likelihood of a learner dropping out of the course or successfully completing it (obtaining the certificate).
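As an indicative sketch only (the actual model is still under development), such a service could be prototyped with a scikit-learn classifier over aggregated activity indicators. The features, training data, and model choice below are assumptions for illustration, not the model being added to edX-LIMS+.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hedged sketch of a dropout/completion prediction service.
# Each row: [sessions, time_spent_hours, assessment_attempts, days_inactive]
X_train = np.array([
    [12, 20.0, 30, 1],
    [10, 15.5, 25, 2],
    [3, 2.0, 4, 14],
    [1, 0.5, 0, 30],
    [8, 11.0, 18, 3],
    [2, 1.0, 2, 21],
])
# 1 = completed the course (certificate obtained), 0 = dropped out.
y_train = np.array([1, 1, 0, 0, 1, 0])

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)

# Probability that a currently enrolled learner will complete the course;
# a low value would trigger an alert to instructors.
new_learner = np.array([[4, 3.0, 5, 10]])
p_complete = model.predict_proba(new_learner)[0, 1]
print(f"Estimated probability of completion: {p_complete:.2f}")
```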
By leveraging this new approach, the system aims to provide timely alerts to instructors, notifying them about learners who are at risk of losing interest in the course and potentially dropping out. These alerts serve as an invitation for instructors to intervene and provide targeted support to learners who may require additional assistance. Therefore, this new service will enhance the interventions provided by instructors for MOOC learners.
In future work, expanding the application of our approach to additional MOOCs with edX will allow us to gather more data and insights, further providing an opportunity to evaluate its generalisability and adaptability to diverse learning contexts. Each course may have its unique characteristics and learner demographics, and studying the applicability of our system across different courses will help us understand its scalability and potential for wider adoption.
In conclusion, working with other MOOC providers on the edX platform will also enable us to collaborate with a broader community of educators and researchers, fostering knowledge exchange and further advancements in the field of learning analytics. All of these efforts will contribute to the continuous improvement of online learning experiences.

Funding

This research has been co-funded by the Madrid Regional Government through the e-Madrid-CM Project under Grant S2018/TCS-4307, a project which is co-funded by the European Structural Funds (FSE and FEDER). This research has been co-funded by the National Research Agency of the Spanish Ministry of Science, Innovation and Universities under project grant RED2022-134284-T (SNOLA). Furthermore, this research has been co-funded by the National Research Agency of the Spanish Ministry of Science and Innovation under project grants PID2019-105951RB-I00 (IndiGo!) and PID2021-127641OB-I00 (BBforTAI).

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

Special thanks are due to the UAM for facilitating access to the data of the MOOC used in this article. Moreover, the participation in the technological developments by Juan Soberón and Unai Agirre from UAM is acknowledged and appreciated.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Ma, L.; Lee, C.S. Investigating the adoption of MOOCs: A technology-user-environment perspective. J. Comput. Assist. Learn. 2019, 35, 89–98. [Google Scholar] [CrossRef] [Green Version]
2. Davis, D.; Chen, G.; Jivet, I.; Hauff, C.; Houben, G.J. Encouraging Metacognition & Self-Regulation in MOOCs through Increased Learner Feedback. In Proceedings of the LAL 2016 Workshop at LAK’16, Edinburgh, UK, 26 April 2016; pp. 17–22.
3. Hone, K.S.; El Said, G.R. Exploring the factors affecting MOOC retention: A survey study. Comput. Educ. 2016, 98, 157–168.
4. Topali, P.; Ortega-Arranz, A.; Er, E.; Martínez-Monés, A.; Villagrá-Sobrino, S.L.; Dimitriadis, Y. Exploring the problems experienced by learners in a MOOC implementing active learning pedagogies. In Digital Education: At the MOOC Crossroads Where the Interests of Academia and Business Converge, Proceedings of the 6th European MOOCs Stakeholders Summit, EMOOCs 2019, Naples, Italy, 20–22 May 2019; Springer International Publishing: New York, NY, USA, 2019; Volume 11475, pp. 81–90.
5. Lang, C.; Siemens, G.; Wise, A.; Gasevic, D. (Eds.) Handbook of Learning Analytics; Society for Learning Analytics Research (SoLAR), 2017. Available online: https://www.solaresearch.org/publications/hla-17/ (accessed on 15 June 2023).
6. Romero, C.; Ventura, S. Educational data mining and learning analytics: An updated survey. WIREs Data Min. Knowl. Discov. 2020, 10, e1355.
7. Martínez Monés, A.; Dimitriadis Damoulis, I.; Acquila Natale, E.; Álvarez, A.; Caeiro Rodríguez, M.; Cobos Pérez, R.; Conde González, M.Á.; García Peñalvo, F.J.; Hernández Leo, D.; Menchaca Sierra, I.; et al. Achievements and challenges in learning analytics in Spain: The view of SNOLA. RIED Rev. Iberoam. Educ. A Distancia 2020, 23, 187.
8. Cobos, R.; Ruiz-Garcia, J.C. Improving learner engagement in MOOCs using a learning intervention system: A research study in engineering education. Comput. Appl. Eng. Educ. 2021, 29, 733–749.
9. Cobos, R.; Soberón, J. A proposal for monitoring the intervention strategy on the learning of MOOC learners. In CEUR Workshop Proceedings; 2020; Volume 2671, pp. 61–72. Available online: https://ceur-ws.org/Vol-2671/paper07.pdf (accessed on 15 July 2023).
10. Topali, P.; Ortega-Arranz, A.; Martínez-Monés, A.; Villagrá-Sobrino, S.L. “Houston, We Have a Problem”: Revealing MOOC practitioners’ experiences regarding feedback provision to learners facing difficulties. Comput. Appl. Eng. Educ. 2021, 29, 769–785.
11. Onah, D.F.O.; Pang, E.L.L.; Sinclair, J.E.; Uhomoibhi, J. Learning Analytics for Motivating Self-regulated Learning and Fostering the Improvement of Digital MOOC Resources. In Mobile Technologies and Applications for the Internet of Things, Proceedings of the 12th IMCL Conference, Thessaloniki, Greece, 31 October–1 November 2019; Springer International Publishing: New York, NY, USA, 2019; pp. 14–21.
12. Cobos, R.; Olmos, L. A Learning Analytics Tool for Predictive Modeling of Dropout and Certificate Acquisition on MOOCs for Professional Learning. In Proceedings of the 2018 IEEE International Conference on Industrial Engineering and Engineering Management (IEEM), Bangkok, Thailand, 16–19 December 2018; pp. 1533–1537.
13. Moreno-Marcos, P.M.; Alario-Hoyos, C.; Munoz-Merino, P.J.; Kloos, C.D. Prediction in MOOCs: A Review and Future Research Directions. IEEE Trans. Learn. Technol. 2019, 12, 384–401.
14. Matcha, W.; Uzir, N.A.; Gasevic, D.; Pardo, A. A Systematic Review of Empirical Studies on Learning Analytics Dashboards: A Self-Regulated Learning Perspective. IEEE Trans. Learn. Technol. 2020, 13, 226–245.
15. Verbert, K.; Duval, E.; Klerkx, J.; Govaerts, S.; Santos, J.L. Learning Analytics Dashboard Applications. Am. Behav. Sci. 2013, 57, 1500–1509.
16. Verbert, K.; Govaerts, S.; Duval, E.; Santos, J.L.; Van Assche, F.; Parra, G.; Klerkx, J. Learning dashboards: An overview and future research opportunities. Pers. Ubiquitous Comput. 2014, 18, 1499–1514.
17. Schwendimann, B.A.; Rodriguez-Triana, M.J.; Vozniuk, A.; Prieto, L.P.; Boroujeni, M.S.; Holzer, A.; Gillet, D.; Dillenbourg, P. Perceiving Learning at a Glance: A Systematic Literature Review of Learning Dashboard Research. IEEE Trans. Learn. Technol. 2017, 10, 30–41.
18. Bodily, R.; Verbert, K. Review of Research on Student-Facing Learning Analytics Dashboards and Educational Recommender Systems. IEEE Trans. Learn. Technol. 2017, 10, 405–418.
19. Wise, A.F. Designing pedagogical interventions to support student use of learning analytics. In Proceedings of the Fourth International Conference on Learning Analytics And Knowledge—LAK’14, Indianapolis, IN, USA, 24–28 March 2014; ACM Press: New York, NY, USA, 2014; pp. 203–211.
20. Jivet, I.; Scheffel, M.; Drachsler, H.; Specht, M. Awareness Is Not Enough: Pitfalls of Learning Analytics Dashboards in the Educational Practice. In Data Driven Approaches in Digital Education, Proceedings of the 12th European Conference on Technology Enhanced Learning, EC-TEL 2017, Tallinn, Estonia, 12–15 September 2017; Springer International Publishing: New York, NY, USA, 2017; pp. 82–96.
21. Jivet, I.; Scheffel, M.; Specht, M.; Drachsler, H. License to Evaluate: Preparing Learning Analytics Dashboards for Educational Practice. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge, LAK’18, Sydney, Australia, 7–9 March 2018; Association for Computing Machinery: New York, NY, USA, 2018; pp. 31–40.
22. Park, Y.; Jo, I.H. Development of the Learning Analytics Dashboard to Support Students’ Learning Performance. J. Univ. Comput. Sci. 2015, 21, 110–133.
23. Kim, J.; Jo, I.H.; Park, Y. Effects of learning analytics dashboard: Analyzing the relations among dashboard utilization, satisfaction, and learning achievement. Asia Pac. Educ. Rev. 2016, 17, 13–24.
24. Schwendimann, B.A.; Rodríguez-Triana, M.J.; Vozniuk, A.; Prieto, L.P.; Boroujeni, M.S.; Holzer, A.; Gillet, D.; Dillenbourg, P. Understanding learning at a glance: An overview of learning dashboard studies. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge, Edinburgh, UK, 25–29 April 2016; pp. 532–533.
25. Sedrakyan, G.; Leony, D.; Muñoz-Merino, P.J.; Kloos, C.D.; Verbert, K. Evaluating Student-Facing Learning Dashboards of Affective States. In Data Driven Approaches in Digital Education, Proceedings of the 12th European Conference on Technology Enhanced Learning, EC-TEL 2017, Tallinn, Estonia, 12–15 September 2017; Springer International Publishing: New York, NY, USA, 2017; pp. 224–237.
26. Smith, P. Engaging online students through peer-comparison progress dashboards. J. Appl. Res. High. Educ. 2019, 12, 38–56.
27. Leon, M.; Cobos, R.; Dickens, K.; White, S.; Davis, H. Visualising the MOOC experience: A dynamic MOOC dashboard built through institutional collaboration. In Proceedings of the 3rd European MOOCs Stakeholders Summit, EMOOCs 2016, Graz, Austria, 22–24 February 2016; pp. 1–8.
28. Charleer, S.; Moere, A.V.; Klerkx, J.; Verbert, K.; De Laet, T. Learning Analytics Dashboards to Support Adviser-Student Dialogue. IEEE Trans. Learn. Technol. 2018, 11, 389–399.
29. Alonso-Mencía, M.E.; Alario-Hoyos, C.; Maldonado-Mahauad, J.; Estévez-Ayres, I.; Pérez-Sanagustín, M.; Kloos, C.D. Self-regulated learning in MOOCs: Lessons learned from a literature review. Educ. Rev. 2020, 72, 319–345.
30. Azevedo, R.; Gasevic, D. Analyzing Multimodal Multichannel Data about Self-Regulated Learning with Advanced Learning Technologies: Issues and Challenges. Comput. Hum. Behav. 2019, 96, 207–210.
31. Ye, Z.; Jiang, L.; Li, Y.; Wang, Z.; Zhang, G.; Chen, H. Analysis of Differences in Self-Regulated Learning Behavior Patterns of Online Learners. Electronics 2022, 11, 4013.
32. Zimmerman, B.J. Theories of self-regulated learning and academic achievement: An overview and analysis. In Self-Regulated Learning and Academic Achievement: Theoretical Perspectives, 2nd ed.; Lawrence Erlbaum Associates Publishers: Mahwah, NJ, USA, 2001; pp. 1–37.
33. Boekaerts, M. Self-regulated Learning at the Junction of Cognition and Motivation. Eur. Psychol. 1996, 1, 100–112.
34. Winne, P.H.; Hadwin, A.F. Studying as Self-Regulated Learning. In Metacognition in Educational Theory and Practice; Hacker, D.J., Dunlosky, J., Graesser, A.C., Eds.; Routledge: New York, NY, USA, 1998; pp. 277–304.
35. Panadero, E. A Review of Self-regulated Learning: Six Models and Four Directions for Research. Front. Psychol. 2017, 8, 422.
36. Panadero, E.; Klug, J.; Järvelä, S. Third wave of measurement in the self-regulated learning field: When measurement and intervention come hand in hand. Scand. J. Educ. Res. 2016, 60, 723–735.
37. Bjork, R.A.; Dunlosky, J.; Kornell, N. Self-Regulated Learning: Beliefs, Techniques, and Illusions. Annu. Rev. Psychol. 2013, 64, 417–444.
38. Winne, P.H.; Jamieson-Noel, D. Exploring students’ calibration of self reports about study tactics and achievement. Contemp. Educ. Psychol. 2002, 27, 551–572.
39. Winne, P.H.; Jamieson-Noel, D. Self-regulating studying by objectives for learning: Students’ reports compared to a model. Contemp. Educ. Psychol. 2003, 28, 259–276.
40. Malmberg, J.; Järvelä, S.; Kirschner, P.A. Elementary school students’ strategic learning: Does task-type matter? Metacogn. Learn. 2014, 9, 113–136.
41. Rohloff, T.; Sauer, D.; Meinel, C. Student Perception of a Learner Dashboard in MOOCs to Encourage Self-Regulated Learning. In Proceedings of the 2019 IEEE International Conference on Engineering, Technology and Education (TALE), Yogyakarta, Indonesia, 10–13 December 2019; pp. 1–8.
42. Marengo, A.; Pagano, A.; Barbone, A. Data Mining Methods to Assess Student Behavior in Adaptive e-Learning Processes. In Proceedings of the 2013 Fourth International Conference on e-Learning “Best Practices in Management, Design and Development of e-Courses: Standards of Excellence and Creativity”, Manama, Bahrain, 7–9 May 2013; pp. 303–309.
43. Pagano, A.; Marengo, A. Training Time Optimization through Adaptive Learning Strategy. In Proceedings of the 2021 International Conference on Innovation and Intelligence for Informatics, Computing, and Technologies (3ICT), Zallaq, Bahrain, 29–30 September 2021; pp. 563–567.
44. Gan, Z.; An, Z.; Liu, F. Teacher Feedback Practices, Student Feedback Motivation, and Feedback Behavior: How Are They Associated with Learning Outcomes? Front. Psychol. 2021, 12, 697045.
45. Parkin, H.J.; Hepplestone, S.; Holden, G.; Irwin, B.; Thorpe, L. A role for technology in enhancing students’ engagement with feedback. Assess. Eval. High. Educ. 2012, 37, 963–973.
46. Li, J.; Wong, S.C.; Yang, X.; Bell, A. Using feedback to promote student participation in online learning programs: Evidence from a quasi-experimental study. Educ. Technol. Res. Dev. 2020, 68, 485–510.
47. Cavalcanti, A.P.; Barbosa, A.; Carvalho, R.; Freitas, F.; Tsai, Y.S.; Gašević, D.; Mello, R.F. Automatic feedback in online learning environments: A systematic literature review. Comput. Educ. Artif. Intell. 2021, 2, 100027.
48. Lim, L.A.; Dawson, S.; Gašević, D.; Joksimović, S.; Pardo, A.; Fudge, A.; Gentili, S. Students’ perceptions of, and emotional responses to, personalised learning analytics-based feedback: An exploratory study of four courses. Assess. Eval. High. Educ. 2021, 46, 339–359.
49. Pardo, A.; Jovanovic, J.; Dawson, S.; Gašević, D.; Mirriahi, N. Using learning analytics to scale the provision of personalised feedback. Br. J. Educ. Technol. 2019, 50, 128–138.
50. Lim, L.A.; Gasevic, D.; Matcha, W.; Ahmad Uzir, N.; Dawson, S. Impact of learning analytics feedback on self-regulated learning: Triangulating behavioural logs with students’ recall. In Proceedings of the 11th International Learning Analytics and Knowledge Conference (LAK21), Irvine, CA, USA, 12–16 April 2021; ACM: New York, NY, USA, 2021; pp. 364–374.
51. Iraj, H.; Fudge, A.; Faulkner, M.; Pardo, A.; Kovanović, V. Understanding Students’ Engagement with Personalised Feedback Messages. In Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, LAK ’20, Frankfurt, Germany, 23–27 March 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 438–447.
52. Bennett, L. Students’ Learning Responses to Receiving Dashboard Data: Research Report. 2018. Available online: https://www.researchgate.net/publication/326352926_Students%27_learning_responses_to_receiving_dashboard_data_Research_Report?channel=doi&linkId=5b4751b1aca272c60938be0a&showFulltext=true (accessed on 15 July 2023).
53. Corrin, L.; de Barba, P. How do students interpret feedback delivered via dashboards? In Proceedings of the Fifth International Conference on Learning Analytics And Knowledge, Poughkeepsie, NY, USA, 16–20 March 2015; ACM: New York, NY, USA, 2015; pp. 430–431.
54. Ulfa, S.; Fattawi, I.; Surahman, E.; Yusuke, H. Investigating Learners’ Perception of Learning Analytics Dashboard to Improve Learning Interaction in Online Learning System. In Proceedings of the 2019 5th International Conference on Education and Technology (ICET), Kota Batu, Indonesia, 3–5 October 2019; pp. 49–54.
55. Broos, T.; Peeters, L.; Verbert, K.; Van Soom, C.; Langie, G.; De Laet, T. Dashboard for Actionable Feedback on Learning Skills: Scalability and Usefulness. In Learning and Collaboration Technologies. Technology in Education, Proceedings of the 4th International Conference, LCT 2017, Held as Part of HCI International 2017, Vancouver, BC, Canada, 9–14 July 2017; Springer International Publishing: New York, NY, USA, 2017; pp. 229–241.
56. Hernández-Leo, D.; Martinez-Maldonado, R.; Pardo, A.; Muñoz-Cristóbal, J.A.; Rodríguez-Triana, M.J. Analytics for learning design: A layered framework and tools. Br. J. Educ. Technol. 2019, 50, 139–152.
57. Sønderlund, A.L.; Hughes, E.; Smith, J. The efficacy of learning analytics interventions in higher education: A systematic review. Br. J. Educ. Technol. 2019, 50, 2594–2618.
58. Heikkinen, S.; Saqr, M.; Malmberg, J.; Tedre, M. Supporting self-regulated learning with learning analytics interventions—A systematic literature review. Educ. Inf. Technol. 2022, 28, 3059–3088.
59. Cobos, R. The Learning Analytics System that improves the teaching-learning experience of MOOC instructors and students. In Proceedings of the International Conference on Web-Based Learning—ICWL 2022, Tenerife, Spain, 21–23 November 2022; LNCS 13869; pp. 29–40. Available online: https://link.springer.com/chapter/10.1007/978-3-031-33023-0_3 (accessed on 15 July 2023).
Figure 1. WebApp MOOC organisation.
Figure 2. edX-LIMS screenshot with graphs of the learner's daily indicators.
Figure 3. Graphs generated by edX-LIMS+ showing the time spent (left) and attempts made (right) by a learner in the course units, illustrating his/her engagement with those units.
Figure 4. edX-LIMS+ services.
Figure 5. Intervention strategy flow of edX-LIMS+.
Figure 6. Percentage of learners who viewed their dashboards weekly.
Figure 7. LSQ answers related to (a) how learner progress is shown on the dashboard and (b) how much they appreciate the messages received.
Figure 8. LSQ answers related to how often (a) LIMS_learners and (b) LIMS_learners+ would like to have updated data in their dashboards.
Figure 9. LSQ answers related to the (a) adequacy of the visualization of the different parts of the Dashboard and (b) their understandability and usability.
Figure 10. LSQ answers related to (a) learner performance and (b) motivation in the course.
Figure 11. LSQ answers related to (a) learner activity and (b) rhythm of study in the course.
Figure 12. LSQ answers related to the learners' feelings during the course: (a) whether they felt supervised/guided and (b) whether they felt "not alone".
Figure 13. Learners' feedback on their detected problems.
Figure 14. ISQ answers related to the (a) adequacy of the visualization of the different parts of the Dashboard and (b) their understandability and usability.
Figure 14. ISQ answers related to the (a) adequacy of the visualization of the different parts of the Dashboard and (b) their understandability and usability.
Electronics 12 03368 g014
Table 1. List of elements generated by the services of each version of the learning analytics system.

| By edX-LIMS | By edX-LIMS+ |
| --- | --- |
| The current grade of the learner in the course | Message with possible learner problem detected and suggestions to manage it |
| Line graph displaying the learner's daily indicator (per day and accumulated) | Button for learner to indicate agree/disagree about problem detected |
| Comparison of daily indicator with peers | Text area for learner to write feedback to instructors about problem detected |
| Radar view showing learner progress in each course unit | Bar graphs displaying time and attempts (SRL data) in the course parts |
| Comparison of learner progress with peers | Comparison of SRL data with peers |
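To make the comparison in Table 1 more concrete, the sketch below models the dashboard elements of each version as simple data structures. It is only an illustrative sketch: the class and field names (e.g., LimsDashboardData, detected_problem_message) are assumptions introduced for exposition and do not correspond to the actual edX-LIMS/edX-LIMS+ implementation or API.

```python
# Illustrative sketch only: names and types are assumptions based on Table 1,
# not the actual edX-LIMS / edX-LIMS+ data model.
from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class LimsDashboardData:
    """Elements generated by edX-LIMS for a learner's web dashboard."""
    current_grade: float                              # current grade in the course
    daily_indicator: Dict[str, float]                 # activity indicator per day
    daily_indicator_accumulated: Dict[str, float]     # accumulated indicator per day
    daily_indicator_peers: Dict[str, float]           # comparison with peers
    unit_progress: Dict[str, float]                   # radar view: progress per course unit
    unit_progress_peers: Dict[str, float]             # comparison with peers


@dataclass
class LimsPlusDashboardData(LimsDashboardData):
    """Additional elements introduced by edX-LIMS+ (SRL data and active feedback)."""
    detected_problem_message: Optional[str] = None    # detected problem + suggestions
    problem_agreement: Optional[bool] = None           # learner's agree/disagree answer
    problem_feedback_text: Optional[str] = None        # free-text feedback to instructors
    time_per_unit: Dict[str, float] = field(default_factory=dict)    # SRL data: time spent
    attempts_per_unit: Dict[str, int] = field(default_factory=dict)  # SRL data: attempts
    srl_peer_comparison: Dict[str, float] = field(default_factory=dict)
```

Modelling the edX-LIMS+ elements as an extension of the edX-LIMS ones mirrors the table's reading: the new version adds problem detection, active-feedback fields, and per-unit SRL data on top of the original dashboard content.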
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
