Article

Could MOOC-Takers’ Behavior Discuss the Meaning of Success-Dropout Rate? Players, Auditors, and Spectators in a Geographical Analysis Course about Natural Risks

by Sandra Ricart 1,*, Rubén A. Villar-Navascués 1, Salvador Gil-Guirado 2,3, María Hernández-Hernández 1,4, Antonio M. Rico-Amorós 1,4 and Jorge Olcina-Cantos 2,4

1 Water and Territory Research Group, Interuniversity Institute of Geography, University of Alicante, 03690 San Vicente del Raspeig, Spain
2 Laboratory of Climatology, Interuniversity Institute of Geography, University of Alicante, 03690 San Vicente del Raspeig, Spain
3 Department of Geography, University of Murcia, 30001 Murcia, Spain
4 Department of Regional Geographic Analysis and Physical Geography, University of Alicante, 03690 San Vicente del Raspeig, Spain
* Author to whom correspondence should be addressed.
Sustainability 2020, 12(12), 4878; https://doi.org/10.3390/su12124878
Submission received: 19 May 2020 / Revised: 11 June 2020 / Accepted: 12 June 2020 / Published: 15 June 2020
(This article belongs to the Special Issue Opportunities and Challenges for the Future of Open Education)

Abstract: Research interest in massive online and open courses (MOOCs) is growing rapidly, raising questions about who enrolls, why, and how engagement and success rates should be conceived. This study focuses on MOOC-takers' behavior in a seven-week MOOC on natural risks. Data scraping principles were used to collect the data. Demographics, success-dropout rates, engagement periods, achievement and scoring, and behavior were analyzed through descriptive statistics, non-parametric correlation analysis, and statistical hypothesis testing. The results show that students who start the course earlier and those who finish it earlier obtain better grades in some of the modules (motivation and background on natural risks could be the explanation). For 'last-moment students', speed in passing the modules is also related to greater motivation, although in this case it is not related to better grades. Furthermore, students who complete tasks during the weekend take less time to complete the modules and obtain better grades. In addition, a learning strategy is promoted by reconsidering who is learning: players (those who complete the course, earning a certificate), auditors (those who complete a thematic unit or a whole module, earning partial knowledge), and spectators (those who remain enrolled until the end of the course, intending to gain experience in e-learning).

1. Introduction

Massive online and open courses (hereinafter, MOOCs) are an example of the gradual transformation of university education [1]. MOOCs are generally constructed and coordinated by universities and provided through commercial platforms. According to the Class Central portal—an aggregator of the MOOCs available on the main global platforms—the number of students enrolled in MOOCs continues to increase, exceeding 110 million students in 2019, and more than 900 universities offer at least 1 of the 13,500 courses counted worldwide. About 30 platforms offer MOOCs, among which Coursera and EdX stand out for their content diversity and the number of courses offered in English. MiriadaX is the main MOOC platform in Spanish, and Spain has been, since 2013, the leading European country in producing MOOCs, with about 30% of the total offered courses [2].
The rise of MOOCs has also been considered an example of the democratization of access to education in two ways [3]. On the one hand, a debate has started about the validity of the traditional educational model and its ability to attract a greater number of students with diverse academic and professional interests [4]. On the other hand, the design, development, and delivery of a MOOC represent an additional effort for teachers, changing their role [5]. Under this view, teachers lose part of their function as face-to-face transmitters of information to become deferred actors focused on providing student orientation and support rather than content, tools, and evaluation [6]. This affects the instructional quality of MOOCs, which tend to score highly on the organization and presentation of course material while failing at designing supported interaction and feedback, a key principle of effective instructional design [7]. That is, teachers are experts in a specific discipline or skill area rather than in pedagogy [8], so the role of instructional designers is to shorten the distance between student and instructor, ensuring that students remain active throughout the course and that self-regulation works [9]. MOOCs therefore need teachers who are both instructors and assistants, able to automate content and optimize resources, without forgetting that it is the student who must filter, interpret, and enrich their own learning process [10]. Furthermore, students need to self-regulate their learning more than in other modes of education, so MOOCs provide an excellent venue for promoting autonomous learning [11]. Lastly, taking into account situations of confinement or social distancing arising from pandemics such as COVID-19, MOOCs become a safe tool for advancing education [12].
Acceptance can be considered a combination of technological (such as digitalization and e-learning) and pedagogical and social (such as students' motivation and teachers' use) recognition [13]. Several factors explain the success and acceptance of MOOCs: their adaptation to the different educational levels of the audience; their academic and professional profile, but also their ability to generate public interest; their ease and flexibility of use; and their being free of charge [14,15]. However, the spread of MOOCs is not without limitations and criticisms. For students, the lack of interaction with teachers, the intensity of learning—concentrated in a few weeks and subject to certain delivery times—and the required background are latent barriers [16,17,18]. For teachers, preparing course material for diverse and unknown audiences, promoting peer-to-peer evaluation, maintaining a minimum degree of interaction with students through forums, as well as evaluating the dedication of non-face-to-face students and managing the student retention or dropout rate—around 90%—are the main challenges to face [19]. The last of these—the dropout rate—is caused by the ineffectiveness of MOOCs in responding to learners' needs [20], which calls for a new perspective in which completing courses is not the only way of benefitting from participation in a MOOC [21].
What do we know (and not know) about success and dropout rates? Despite their popularity, the number of MOOC-takers who actually complete a course after enrolment has been reported to be very low, with high dropout rates (dropout here meaning not receiving a certificate). The success rate (counting only those students who earned a certificate of completion) in MOOC courses is less than 10% on average (e.g., 5% according to Breslow et al. [22]; 7% according to Rai and Chunrao [23]), although somewhat more positive figures—double this average—are reported by Jordan [24]. However, it is worth noting that failure to complete a course does not mean that MOOCs are ineffective, because interest and motivation to learn can remain intact even if students drop the course [25]. Students may simply audit the course or seek feedback from high-quality universities and reputable instructors [26]. In fact, one of the main reasons for such low completion rates is that many students consider course completion unnecessary, because most just want to access specific knowledge within the whole course content [27]. Another reason courses are not completed is the "minimum learning by doing" issue, whereby online lessons are effective but only suited to providing theoretical education [28]. The lack of personal support and human intervention in favor of a fully machine-supported or auto-graded learning system also explains the low completion rate [29]. In this regard, students tend to be more motivated to learn when the instructor or facilitator is present within their proximity, providing a close and systematic support system [30].
Each person experiences learning and motivation differently [31]. What if we changed the way in which we understand both success and dropout rates? Although research interest in MOOCs is growing rapidly, who enrolls and why [32], and how to analyze engagement and success rates [33], are still under discussion. Although engagement is commonly described and measured as the investment of time, effort, and resources to achieve a goal [34,35], definitions of MOOC achievement or success vary. According to Liu et al. [36], success should not be limited to obtaining a certificate after completing all course lectures and exercises, because this belittles the benefit of MOOCs as an opportunity for students to follow their own learning paths. A broader definition allows learners to adopt their own strategy, such as following the course synchronously with the provided support, completing it at their own rate, joining late and catching up, or accessing only the materials they are interested in [37]. Therefore, failure to finish a MOOC course is not a complete failure at all if students learn a new topic or extend current knowledge, satisfy their curiosity about MOOCs, or meet a personal challenge [38].

2. Background of the Research

Education is a critical driver in the context of disaster risk reduction because it conveys the essential fundamentals for risk-conscious and risk-mitigating actions among the population [39]. Preparing for disasters is a top priority for many educational and government institutions, which need to increase students' and society's motivation to learn about being less vulnerable to the impact of natural disasters [40]. Why a MOOC on natural risk analysis? MOOCs' characteristics facilitate learning processes on dynamic issues across space and time, such as natural risks and hazards [41]. Increasing empirical evidence of climate change related to abnormal weather events, floods, and droughts could trigger an increase in the risk perception of climate change and motivate changes in decision-making processes [42]. According to the latest annual report of the reinsurance multinational Swiss Re, in 2019 economic losses due to natural disasters amounted to USD 155 billion, with more than 20 million people affected and 11,000 victims or missing persons. In this context, assessing the risk that climate change drivers pose to a community is vital to understanding how to be adequately prepared and adapted to climate change [43], as a fundamental action in climate change adaptation plans. Furthermore, teaching about natural risks, and educating and empowering local communities and stakeholders by increasing their sensitivity and adaptive capacity to natural risks, is one of the main challenges to be addressed according to the Sustainable Development Goals of the 2030 Agenda, especially Goals 4 and 11 [44,45,46].
The aim of this paper is to deepen knowledge of how MOOC-takers' behavior affects learning and success-dropout rates by improving understanding of how MOOC-takers manage their learning and how this relates to achievement, success-dropout rates, and behavioral patterns. The study was designed to be exploratory (inductive research), with the purpose of deepening understanding of MOOC-takers' behavior in order to develop specific hypotheses or predictions about students' patterns. Nevertheless, a working hypothesis of this research is that analyzing student behavior can help to revisit and discuss the concepts of success and dropout rates, which could provide new tools for learning designers when addressing students' feedback and motivation. By applying descriptive statistics, non-parametric correlation analysis, and statistical hypothesis testing, the main patterns of how those enrolled in the course organize their time and learning experiences can be used to discuss the meaning of the success-dropout rate and the need to take into account different levels of learning that identify different types of MOOC-takers.

3. Materials and Methods

3.1. Sample Data

The empirical research presented here is supported by data obtained from a MOOC titled Geographical analysis of natural risk: Perceive, plan, and manage uncertainty. This course, with an estimated workload of 30 h, was conceived as an opportunity to introduce the off-campus audience to the main content of the Master in Planning and Management of Natural Risks. Promoted in 2010 by the Interuniversity Institute of Geography in collaboration with the Faculty of Arts at the University of Alicante, this master's degree is the first in Spain devoted specifically to natural risks. The course has a double objective: (1) to conceptualize, from the geographical discipline, the physical, social, and territorial dimensions of natural risk, and (2) to offer tools for analyzing natural risk and its management. The ultimate purpose of the course is to promote a global analysis of the conceptual, methodological, and perceptual aspects of natural risk in order to reduce the vulnerability and promote the resilience of society and territory.
The number of students initially enrolled in the course was 1015, of which 988 (97.3%) remained until the end (the difference is due to the registration system staying open until the end of the course, so that once registered, students could decide to unsubscribe). In total, 215 (21.2%) completed the course (Table 1). The course was divided into an introductory module and six thematic weekly modules. Each weekly module was divided into five units, each of them with a video lecture (of no more than 25 min), optional sources (text readings, case studies and examples, links, and databases), and a module quiz. Students wishing to receive a statement of accomplishment had to pass the module and final quizzes by scoring 50% or better on each quiz. Students could take module quizzes twice and had three attempts to pass the final quiz. The recorded score was the best score obtained on each quiz. There was only one deadline, the final day of the course, by which to complete all quizzes.
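To make these grading rules concrete, the following is a minimal sketch of how they translate into code; it is illustrative only (not the platform's actual implementation), and the 'attempts' data frame and its columns are assumptions. R is used here and in the later sketches because the analyses in this paper were run in R (Section 3.3).

```r
# Illustrative sketch of the grading rules described above (not MiriadaX code).
# Assumed frame 'attempts': student_id, quiz ("M1".."M6", "FINAL"), score (0-100).
library(dplyr)

# The recorded score for each quiz is the best score across attempts
recorded <- attempts %>%
  group_by(student_id, quiz) %>%
  summarise(best = max(score), .groups = "drop")

# A statement of accomplishment requires >= 50 on all six module quizzes
# plus the final quiz (seven quizzes in total)
accomplished <- recorded %>%
  group_by(student_id) %>%
  summarise(certificate = n() == 7 & all(best >= 50), .groups = "drop")
```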

3.2. Data Collection

Investigating students' interaction with a course by way of learning analytics can provide useful information about their learning experience and behavior [47]. Data to explore students' learning were collected from the MiriadaX platform's data analytics services, and web scraping and data extraction with the Google Chrome© Data Miner extension were employed to analyze MOOC-takers' effort via time-stamped logs of student activities such as lecture video views and quiz passes [48].
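As a hedged illustration of this step, the sketch below loads such an export into R and normalizes it for the analyses that follow; the file name and column names (student_id, module, event, timestamp) are assumptions, not the actual MiriadaX or Data Miner export format.

```r
# Minimal sketch: load and normalize exported activity logs (assumed schema).
library(readr)
library(dplyr)

logs <- read_csv("miriadax_activity_export.csv") %>%     # hypothetical file name
  mutate(timestamp = as.POSIXct(timestamp,
                                format = "%Y-%m-%d %H:%M:%S")) %>%
  filter(event %in% c("video_view", "quiz_pass")) %>%    # events analyzed here
  arrange(student_id, timestamp)
```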

3.3. Data Analysis

Overall results related to demographics, success–dropout rates, engagement periods, achievement and scoring, and behavior were analyzed through descriptive statistics, non-parametric correlation analysis (Spearman), and statistical hypothesis testing (Mann–Whitney U test). Although the relationships with completion rates most commonly investigated concern educational background, gender, and geographic location [49], the available information in our case study only includes the score obtained in the quizzes and the time taken to pass each module. Even so, there is little evidence on issues related to student behavior [50]. Consequently, in order to deepen understanding of learners' behavior, the sample of students was divided into several groups.
Firstly, data were analyzed from those students who completed at least one module quiz, by gender (GENDER). Secondly, these data were grouped according to the type of learner, differentiating between students who completed the course and students who achieved partial learning, i.e., did not complete the course but completed at least the first module (TYPE OF LEARNING). Thirdly, we wanted to check whether the day of the week on which a student completed each module influences performance, so the sample was divided between those who finished a module during midweek and those who finished at the weekend (WEEKDAY). In fourth and fifth place, the sample was divided according to the course start period (STARTING PERIOD) and the completion period (ENDING PERIOD): in the first case, differentiating between students who finished the first quiz during the first week and the rest of the students; in the second case, between students who finished the last module before the last week and the rest. Finally, the sample was divided between those students who completed more than half of the modules during the last week and the rest of the students (LAST MOMENT). A sketch of how these groupings could be derived from the activity logs is given below.
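The following hedged sketch shows how the WEEKDAY, STARTING PERIOD, ENDING PERIOD, and LAST MOMENT labels could be derived from quiz-pass timestamps; the 'quiz_passes' frame is an assumption, while the course dates come from Table 1.

```r
# Hedged sketch: derive group labels from quiz-pass timestamps.
# Assumed frame 'quiz_passes': student_id, module, timestamp (POSIXct).
library(dplyr)
library(lubridate)

course_start <- as.Date("2019-09-09")   # dates from Table 1
course_end   <- as.Date("2019-10-27")
last_week    <- course_end - 6

# WEEKDAY: label each module completion as midweek or weekend
quiz_passes <- quiz_passes %>%
  mutate(weekday = if_else(wday(timestamp) %in% c(1, 7),  # Sunday = 1, Saturday = 7
                           "weekend", "midweek"))

# STARTING PERIOD, ENDING PERIOD, and LAST MOMENT per student
per_student <- quiz_passes %>%
  group_by(student_id) %>%
  summarise(
    starting_period = if_else(min(as.Date(timestamp)) <= course_start + 6,
                              "first week", "later"),
    ending_period   = if_else(max(as.Date(timestamp)) < last_week,
                              "before last week", "last week"),
    last_moment     = mean(as.Date(timestamp) >= last_week) > 0.5,
    .groups = "drop"
  )
```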
The characteristics of these groups are summarized in Table 2: specifically, the number of learners (N), the average and standard deviation of the score obtained in the quizzes (SA and SSD), and the average and standard deviation of the time taken to complete each module (TA and TSD), measured in days. The non-parametric Mann–Whitney U test was applied to check whether there were statistically significant differences between groups in the score obtained in the quizzes or the time spent on each module. Because the sample size of some groups was very small in some modules, making differences more difficult to detect, Mann–Whitney U test results were analyzed at a confidence level of 90%, without losing sight of the fact that p-values <0.05 indicate more robust results. The dataset was analyzed with the R software.
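In R, these tests reduce to a few base-R calls; the sketch below shows the form they would take, assuming a data frame 'module_data' with one row per student and module and columns group (two levels), score, and days (names are illustrative).

```r
# Minimal sketch of the reported tests (not the authors' published script).
# wilcox.test with two samples computes the Mann-Whitney U test in base R.
res_score <- wilcox.test(score ~ group, data = module_data)
res_time  <- wilcox.test(days ~ group, data = module_data)

# Differences are flagged at the paper's 90% confidence level
res_score$p.value < 0.10

# Spearman rank correlation between time length and quiz score (Section 4.3.3);
# exact = FALSE avoids exact-p warnings when scores contain ties
cor.test(module_data$days, module_data$score,
         method = "spearman", exact = FALSE)
```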

4. Results

4.1. Demographics

Information at the aggregate level was available on nationality for 97.4% of enrollees, on gender for 94.2%, and on the remaining sociodemographic variables (age and education level) for less than one-third of those enrolled. Regarding the origin of the students, the course had representation from 40 countries, especially from Latin America (62.4%), covering 20 countries across South America, Central America, and the Caribbean, followed by Europe (34%). Spain was the country with the most enrollees (30.4%), followed by Peru (18.8%), Colombia (9.4%), Mexico (8.6%), and Ecuador (6%).
Data related to gender indicate that 64% of those enrolled were men and 36% women (far from the results obtained by Ruiz-Palmero et al. [51], in which the numbers of men and women were practically identical). These results are in line with those of other studies that indicate the masculinization of MOOC courses [52]. Regarding age, the most represented group was students between 25 and 34 years old (30.2% of those who indicated their age), followed by the 35–44 group (25.3%) and, thirdly, students between 18 and 24 years old (21%). As in other studies, the most representative age groups for MOOC courses were those between 25 and 44 years old [53]. With regard to education level, 96% of those who responded had some type of university experience, including those who had completed their university studies (50.7%), those who were pursuing university studies (27.8%), and university teachers (17.5%).

4.2. Success-Dropout Rates

Just over half of those enrolled (55.6%) started the course, while less than half of those who started finished it (21.2% of enrollees), double the average success rate usually obtained in MOOC courses (Figure 1).
The proportion of each type of student fluctuates because (1) the registration system was open throughout the course and (2) students' behavior changed weekly. The figure shows that, while the number of students registered (the sum of the three categories) increased, the number and proportion of spectators decreased as the weeks passed, because students achieved the goals of each module and moved from the spectator category to the auditor and player categories. In addition, the lack of registration deadlines stimulated new enrolments as the course went on. After an increase of 9.3% in the absolute number of enrollees in the first week, the increase stabilized, going from 4% in the second week to 2.6% in the third and between 1.5% and 1% for the rest. In the last week, however, there was a decrease of 2.2% in enrolled students, who decided to leave the course. Regarding the students who started the course, the greatest increases occurred in the first three weeks, subsequently falling to increments of 9.7% in the fourth week and between 6% and 4.8% in the following weeks. It is worth noting that, in relation to the number of students who finished the course, a rebound occurred during the last week, in which half of the total completed the course.
Despite the fact that the course was started by 55.6% of those enrolled, only 34.7% (352 students) took the first module quiz. That is to say, the highest dropout rate, understood as the proportion of students who start a module but do not finish it, occurred during the first module. This proportion decreased sharply in the second module and remained more or less stable as the course progressed (Figure 2).
In total, 21.4% of enrollees dropped out of the course after having started a module, with two-thirds of them dropping out during the first module. Dropouts occurring after a student had passed a module quiz must also be taken into account; these represent another 4.4% of enrollees, with more than half occurring before the third module. A sketch of how this module-level dropout rate can be computed is given below.
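Continuing the hypothetical 'logs' frame from the sketch in Section 3.2, the per-module dropout rate defined above (students who start a module but do not pass its quiz) could be computed as follows.

```r
# Sketch: module-level funnel and dropout rate from the assumed activity logs.
library(dplyr)

module_funnel <- logs %>%
  group_by(module) %>%
  summarise(
    started      = n_distinct(student_id[event == "video_view"]),
    passed       = n_distinct(student_id[event == "quiz_pass"]),
    dropout_rate = 1 - passed / started,   # share who started but did not pass
    .groups = "drop"
  )
```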

4.3. Engagement, Achievement, and Scoring

4.3.1. Engagement Periods

Concerning the temporal distribution of engagement by weeks and days, looking at the moment when participants began the activity and passed at least one module, engagement was concentrated in the first three weeks. Regarding the moment when the course was finished, only 2.8% of those who completed the course did so during the first week. The greatest increase in the course completion rate occurred in the last week of the course (115% compared to Week 6) since, as noted, 50.7% of those who completed the course did so then. The temporal distribution of video views in each unit, as well as of module quiz performance, differed between the first two modules, for which the week of greatest activity was the first, and the rest of the modules, for which, from module 3 onwards, the week of greatest activity was the last (Figure 3).
This indicates that most students did not follow the recommendation of a weekly calendar to pass each module. In fact, only one of the 215 students who passed the course completed every module in its corresponding week. This low rate confirms that students generally self-managed their involvement in the course. Considering modules individually, however, the number of students who kept to the weekly calendar was somewhat higher, especially in the first module (116 students, 32.9%). Likewise, during the first modules (modules 1–3) there was greater activity during midweek, while this activity gradually shifted to the weekend as the course progressed, especially in the last three modules. For the first two modules, the days with the highest involvement were Monday and Wednesday, whereas for the rest of the course it was Sunday, both in the viewing of the unit videos of each module and in quiz performance (Figure 4).

4.3.2. Achievement by Unit Interest and Time Length

The introductory videos of each unit were viewed by 98.3% of the students who started each unit. However, in the first module this rate fell to 92.8%, especially for the video of the first unit, which had the highest dropout rate of the whole course (14.4% of the students who started this unit's video watched less than half of it). In the rest of the modules, practically 100% of those who started a thematic unit watched at least 50% of the video. In this regard, it should be borne in mind that the video of each thematic unit can be viewed in several sessions, which is reflected in the average number of accesses. The average number of accesses for the whole course was 2.03 per video, although this decreased from the first module, which had the highest average (2.8 accesses), to 1.7 in modules 5 and 6. No correlation was found between the length of the videos and the average number of views, but there was one between the percentage of dropouts and the average number of views (Spearman rho = 0.55, p-value = 0.001).
Regarding the time taken to complete each module, measured in days from the start of the first unit until completion of the module quiz, the Mann–Whitney U test showed significant differences at a 95% confidence level in the time spent to pass each module in aggregate terms (Table 3). Specifically, these differences were established between module 1, which on average took the longest to complete (5.1 days), and the rest of the modules, which ranged from 1.1 to 2.8 days. Likewise, statistically significant differences were found between the time taken to complete module 2 and modules 4 and 5, and between modules 3–4 and 5–6, which were those completed in the shortest intervals. In summary, the time taken to complete the modules decreased as the course progressed, even though the duration of the videos in modules 1 and 2 was shorter than in the rest (33 and 31 min for modules 1 and 2, respectively, versus 66, 100, 37, and 53 min for modules 3, 4, 5, and 6, respectively).

4.3.3. Module Quizzes Score

The average score ranged from 76.1/100 points (module 3) to 82.2/100 points (module 6). Mann–Whitney U tests were carried out to assess whether there were statistically significant differences between module quiz scores (Table 4).
Significant differences (p < 0.05) were found between two groups of modules: on the one hand, the initial modules up to the fourth, which presented significantly lower average marks (77.3, 79.4, 76.1, and 77.1, respectively), and on the other, modules 5, 6, and 7, whose mean scores were 81.9, 82.2, and 80.8, respectively.
Finally, we checked whether there was a relationship between the time taken to complete each module and the whole course, and the score obtained in each module quiz. The results indicated a slight negative correlation between the score obtained and the time length in module 4 (Spearman rho = −0.19; p-value = 0.002) and module 2 (Spearman rho = −0.15; p-value = 0.006). Although the correlations were not very strong, their sign is striking: the less time it took to pass these modules, the better the score obtained.

4.4. Behavior

Non-parametric Mann–Whitney U tests were applied to each group to determine whether there were significant differences in quiz scores and the time spent to pass each module (Table 5). Regarding GENDER, there were no significant differences in module quiz scores, but there were gender differences, at a 90% confidence level, in the time taken to complete the whole course: women who completed the course took 25.8 days compared to 21.8 days for men. This may be explained by the fact that 63.5% of the women who finished the course did so in the last week, compared with 44.1% of the men.
In the TYPE OF LEARNING group, significant differences were found at a 90% confidence level in the quiz scores of module 2 (p-value = 0.08) and module 3 (p-value = 0.05). In both cases, average scores were higher for students who completed the course, as occurred in all modules except module 6, where the sample of partial learners was very small (n = 6). During the first three modules, where dropouts were concentrated, it stood out that students who achieved partial learning took longer to complete the modules. In this respect, there were statistically significant differences in the time taken to complete module 2 (p-value = 0.09), which was longer for students who achieved partial learning.
With regard to the work schedule (WEEKDAY), students who habitually worked at the weekend obtained better scores than those who habitually worked during the week. This was confirmed in module 2 at a 95% confidence level (p-value = 0.03), where students who took the quiz at the weekend obtained an average score of 82, while midweek students obtained 78.2. Likewise, significant differences were identified in module 6 at a 90% confidence level (p-value = 0.07), where midweek students obtained an average score of 80.7 and weekend students 84.6. However, the greatest differences concerned the time elapsed between the beginning and the end of each module. These differences were identified in all modules except the first, indicating that weekend students took less time to complete the modules than midweek students did. This pattern did not hold in module 7, since many midweek students took the final quiz on the last days of the course, which fell at the weekend.
Significant differences were also identified in the STARTING PERIOD group. With respect to module quiz scores, there were differences in module 1 (p-value = 0.02) and module 2 (p-value = 0.04), with students who began during the first week obtaining higher scores. Likewise, there were differences in the time length of module 1 (p-value < 0.01) and module 6 (p-value < 0.01), but in opposite directions. For students who started and completed module 1 in the first week, the time taken was much shorter than for the others (1.2 days versus 7 days). At the end of the course, however, the opposite happened: students who started later finished module 6 in less time (1 day versus 2.4 days for students who started in the first week). Even though the average total time length was higher for students who started earlier than for the rest (25.1 days versus 22.3 days), at the beginning of the seventh week 72.4% of early starters had finished the course, compared with only 37.3% of the other group.
The last two groups only include students who completed the course. As could be expected for the ENDING PERIOD group, there were significant differences in the time length of the first four modules and of the whole course. For this group, the most remarkable differences were found in the scores of module 4 (p-value = 0.01), module 5 (p-value = 0.07), and module 7 (p-value = 0.09). In all these cases, students who finished the last module before the last week showed better scores in the module quizzes than the rest, which may be related to a greater interest in the course content. Finally, for the LAST MOMENT group, differences existed only in the time length of modules 1 and 6 and of the whole course. As expected, students who completed more than half of the modules during the last week went through the course content more quickly. It was also expected that this quickness would affect quiz scores, but this hypothesis was not confirmed.

5. Discussion and Conclusions

Success rate could comprise any individual goal achieved during the course; that is, MOOC-takers could intend to complete only some specific lessons or modules, to follow the whole course without doing exercises or taking the quizzes, or to obtain a certificate [54]. According to this approach, any specific skill or useful knowledge achieved during the course should be considered a form of success [55]. Although our results show a success rate of 21.2% (double the average usually obtained in MOOC courses), the learning rate could rise to 34.7% if learners were defined, for example, as those students who passed the quiz of module 1, and not only those who completed the whole course. Who is and who is not considered a learner, and which type of analysis can be applied to deepen this big question, is a challenge for e-learning. This study extends previous work focused only on demographics and success–dropout rates and contributes to the field by interrogating students' behavior through engagement, achievement, and scoring, while considering different types of learning processes and MOOC-takers' attitudes and patterns that go beyond simply looking at who completed or failed a course. This leaves a door open to further research on learners' experiences and how those experiences can motivate a reinterpretation of success–dropout rates. Accordingly, mining the success–dropout rate could be understood, by analogy with the traditional learning system, as measuring the knowledge a student achieves in a partial exam or during an exercise. In fact, the educational research community offers a diversity of interpretations of what constitutes a learning strategy. According to Jovanovic et al. [56], a learning strategy includes any thoughts, behaviors, beliefs, or emotions that facilitate the acquisition, understanding, or later transfer of new knowledge and skills, and these have to be mined/detected using appropriate analytical methods and techniques. Essentially, e-learning analytics leverages digital trace data from online systems, in which feedback is produced whenever a lecture video is watched, a quiz is completed, or a task is posted on the forum discussion board [57].
This study lays the groundwork for future research into behavioral modelling and mapping within MOOC learning environments [58]. Although this study is exploratory and benefits from access to detailed data on those students who completed the course, some limitations on data collection must be considered. For example, only partial information was obtained on students who registered for the course but did not achieve its goals. In addition, data on forum participation and on students' motivation to register were unavailable. This lack of detailed and complete information prevented a greater degree of accuracy on students' behavior patterns [59]. Furthermore, the available data come from a single edition of the course, so additional research incorporating multiple editions of the same course would enhance the generalizability of our conclusions. Notwithstanding these limitations, the results highlight several issues regarding different levels of learning and, accordingly, the diversity of learner profiles. In this sense, a learning strategy could be promoted by reconsidering how, and to what extent, MOOC-takers are learning. For example, players could be those students who complete the course (earning a certificate); auditors those who complete a thematic unit, a whole module, or several of them (earning partial and intermittent knowledge); and spectators those who remain enrolled until the end of the course without taking any action (although they may intend to gain experience in e-learning). Undoubtedly, the auditor category is the most diversified, because it includes different student profiles, but it is also the most interesting, because it allows evaluating and reinforcing the student's interest in a specific issue [60]. For example, a detailed analysis of how success was achieved in this MOOC confirms that the ability to build an argument-based approach focused on the conceptual analysis of natural risks during the first week of the course was key to reducing the dropout rate and increasing auditors and players [61]. However, the difficulty of getting students involved in this first module is reflected in a greater number of attempts to watch the videos of each thematic unit and may be the reason for its higher dropout rate.
Furthermore, self-regulated learning (that is, the weekly rhythm in which thematic units, modules, and quizzes are achieved) could affect the success-dropout rate and, above all, the learning rate [62]. According to the results, only during the first week (in which the first two modules were passed by most of those enrolled) did activity occur mostly during midweek, while the rest of the modules were completed largely during the last week of the course or at the weekend. To address this gap, some authors advocate promoting self-regulated learning strategies that students can use to enhance motivation and persistence, while teachers can overcome the limitation of not being able to provide personalized course delivery and individual feedback [34,63,64]. Accordingly, more attention to the instructional design process and the role of instructors and designers is needed. For example, in our experience, weekly emails reminding students of tasks and contents, and replies to forum comments, proved insufficient to ensure students' self-regulation. This requires learning designers to rethink which strategies should be promoted, taking into account that not only learners' motivation and confidence influence success-dropout rates, but also the structure of the course, the delivery environment, and the perceived value of learning itself [65]. According to the five principles of instruction for learning activities formulated two decades ago by Merrill [66], learning is promoted when students are engaged in solving real-world problems; prior knowledge is activated; new knowledge is demonstrated to the student; the student applies new knowledge; and new knowledge is integrated into the student's perceptions and experiences. More recently, Margaryan [7] has added a set of principles among which collaboration and feedback stand out. These could be addressed, for example, by including a starting quiz in which students share the reasons they enrolled in the course; by promoting group tasks through the forum; or by moving from the current self-assessment (quizzes) to peer assessment, enhancing students' critical thinking, comprehension, and writing capabilities [67].
Regarding the analysis of students' behavior, further in-depth analysis requires that e-learning platforms facilitate access to data on students' characteristics (country of origin, age, and educational level). Furthermore, detailed information on students' involvement and risk of abandonment should be provided to deepen understanding of MOOC-takers' behavior (considering participation in the forum, the number of attempts to view the videos, and use of the supplementary material) [68]. Nevertheless, with the available data, some key issues related to student behavior have been identified:
  • Students who achieve partial learning (auditors) take longer to complete the modules and obtain worse grades. To address this gap, it would be useful to send questionnaires to those who do not finish the course asking why (lack of interest in the course content, unmet expectations, a need for specific knowledge already satisfied, lack of time, the difficulty of the course, etc.).
  • Students who complete tasks during the weekend take less time to complete the modules and obtain better grades. This could be related to many factors, but it would be worth extending communication strategies with students to the weekend as well (beyond the reminders and module-opening communications provided each Monday).
  • Students who start earlier and those who finish earlier obtain better grades in some of the modules (motivation could be the explanation, but also students' background in the subject of natural risks). However, 'last-moment students' (those who complete the course in the last week) show that speed in passing the modules is also related to greater motivation, although in this case it is not related to better grades.
Learning designers could use these behavioral rules to examine which learning strategy is most useful for optimizing the performance of online courses. To this end, the meaning of success-dropout rates and the level of motivation of those enrolled in the course should be reconsidered [69]. Furthermore, in line with other recent studies focused on geography education [70], this study also highlights how the study of natural risks from a geographical perspective can benefit from e-learning, contributing to the Sendai Framework for Disaster Risk Reduction 2015–2030 [71].

Author Contributions

Conceptualization, S.R.; methodology, S.R. and R.A.V.-N.; software, R.A.V.-N.; validation, S.R. and S.G.-G.; formal analysis, S.R. and R.A.V.-N.; investigation, S.R.; resources, S.R. and R.A.V.-N.; data curation, R.A.V.-N.; writing—original draft preparation, S.R. and R.A.V.-N.; writing—review and editing, S.R. and A.M.R.-A.; visualization, S.G.-G., R.A.V.-N., M.H.-H., and J.O.-C.; supervision, A.M.R.-A., M.H.-H., and J.O.-C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Vice-Rector’s Office for Quality and Educational Innovation of the University of Alicante, grant number E8791, BOUA 07/11/2018, of the PENSEM-ONLINE Program.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Baggaley, J. Online learning: A new testament. Distance Educ. 2014, 35, 133–140. [Google Scholar] [CrossRef]
  2. Witthaus, G.; Inamorato dos Santos, A.; Childs, M.; Tannhäuser, A.; Conole, G.; Nkuyubwatsi, B.; Punie, Y. Validation of non-formal MOOC-based learning: An analysis of assessment and recognition practices in Europe (OpenCred). EUR 27660 EN 2016. [Google Scholar] [CrossRef]
  3. Moural, V.F.; Souzal, C.A.; Oliveira-Neto, J.D.; Viana, A.B.N. MOOC’s potential for democratizing education: An analysis from the perspective of access to technology. In EMCIS 2017; Themistocleous, M., Morabito, V., Eds.; University of Coimbra: Coimbra, Portugal, 2017; pp. 139–153. [Google Scholar] [CrossRef]
  4. Thomas, L.; Herbert, J.; Teras, M. A sense of belonging to enhance participation, success and retention in online programs. Int. J. First Year High. Educ. 2014, 5, 69–80. [Google Scholar] [CrossRef]
  5. Bartoletti, R. Learning through design: MOOC development as a method for exploring teaching methods. Curr. Issues Emerg. e-Learn. 2016, 3, 2. [Google Scholar]
  6. Van de Poël, J.F.; Verpoorten, D. Designing a MOOC—A new channel for teacher professional development. In Digital Education: At the MOOC Crossroads Where the Interests of Academia and Business Converge; Calise, M., Delgado-Kloos, C., Reich, J., Ruiperez-Valiente, J., Wirsing, M., Eds.; Springer: Berlin, Germany, 2019; pp. 91–101. [Google Scholar]
  7. Margaryan, A.; Bianco, M.; Littlejohn, A. Instructional quality of Massive Open Online Courses (MOOCs). Comput. Educ. 2015, 80, 77–83. [Google Scholar] [CrossRef] [Green Version]
  8. Oh, E.; Chang, Y.; Park, S. Design review of MOOCs: Application of e-learning design principles. J. Comput. High. Educ. 2019. [Google Scholar] [CrossRef]
  9. Fontana, R.P.; Milligan, C.; Littlejohn, A.; Margaryan, A. Measuring self-regulated learning in the workplace. Int. J. Train. Dev. 2015, 19, 32–52. [Google Scholar] [CrossRef]
  10. Castaño-Muñoz, J.; Kalz, M.; Kreijns, K.; Punie, Y. Who is taking MOOCs for teachers’ professional development on the use of ICT? A cross-sectional study from Spain. Technol. Pedagog. Educ. 2018, 27, 607–624. [Google Scholar] [CrossRef] [Green Version]
  11. Hood, N.; Littlejohn, A.; Milligan, C. Context counts: How learners’ contexts influence learning in a MOOC. Comput. Educ. 2015, 91, 83–91. [Google Scholar] [CrossRef] [Green Version]
  12. Chick, R.C.; Clifton, G.T.; Peace, K.M.; Propper, B.W.; Hale, D.F.; Alseidi, A.A.; Vreeland, T.J. Using technology to maintain the education of residents during the COVID-19 pandemic. J. Surg. Educ. 2020, in press. [Google Scholar] [CrossRef]
  13. Zhou, M. Chinese university students’ acceptance of MOOCs: A self-determination perspective. Comput. Educ. 2016, 92–93, 194–203. [Google Scholar] [CrossRef]
  14. Diver, P.; Martinez, I. MOOCs as a massive research laboratory: Opportunities and challenges. Distance Educ. 2015, 36, 5–25. [Google Scholar] [CrossRef]
  15. Al-Fraihat, D.; Joy, M.; Masa’deh, R.; Sinclair, J. Evaluating e-learning systems success: An empirical study. Comput. Hum. Behav. 2020, 102, 67–86. [Google Scholar] [CrossRef]
  16. Henderikx, M.; Kreijns, K.; Kalz, M. A classification of barriers that influence intention achievement in MOOCs. In Lifelong Technology-Enhanced Learning. Proceedings of the 13th European Conference on Technology Enhanced Learning; Pammer-Schindler, V., Pérez-Sanagustín, M., Drachsler, H., Elferink, R., Scheffel, M., Eds.; Springer: Cham, Switzerland, 2018; pp. 3–15. [Google Scholar]
  17. Henderikx, M.A.; Kreijns, K.; Kalz, M. Refining success and dropout in massive open online courses based on the intention-behavior gap. Distance Educ. 2017, 38, 353–368. [Google Scholar] [CrossRef] [Green Version]
  18. Greene, J.A.; Oswald, C.A.; Pomerantz, J. Predictors of retention and achievement in a massive open online course. Am. Educ. Res. J. 2015, 52, 925–955. [Google Scholar] [CrossRef]
  19. Kaplan, A.M.; Haenlein, M. Higher education and the digital revolution: About MOOCs, SPOCs, social media, and the Cookie Monster. Bus. Horiz. 2016, 59, 441–450. [Google Scholar] [CrossRef]
  20. Al-Rahmi, W.M.; Yahaya, N.; Alamri, M.M.; Alyoussef, I.Y.; Al-Rahmi, A.M.; Kamin, Y.B. Integrating innovation diffusion theory with technology acceptance model: Supporting students’ attitude towards using a massive open online courses (MOOCs) systems. Interact. Learn. Environ. 2019. Latest articles. [Google Scholar] [CrossRef]
  21. Perna, L.W.; Ruby, A.; Boruch, R.F.; Wang, N.; Scull, J.; Ahmad, S.; Evans, C. Moving through MOOCs: Understanding the progression of users in Massive Open Online Courses. Educ. Res. 2014, 43, 421–432. [Google Scholar] [CrossRef]
  22. Breslow, L.; Pritchard, D.E.; de Boer, J.; Stump, G.S.; Ho, A.D.; Seaton, D.T. Studying learning in the worldwide classroom: Research into EdX’s first MOOC. Res. Pract. Assess. 2013, 8, 15–25. [Google Scholar]
  23. Rai, L.; Chunrao, D. Influencing factors of success and failure in MOOC and general analysis of learner behavior. Int. J. Inf. Educ. Technol. 2016, 6, 262–268. [Google Scholar] [CrossRef] [Green Version]
  24. Jordan, K. Massive open online course completion rates revisited: Assessment, length and attrition. Int. Rev. Res. Open Distrib. Learn. 2015, 16, 341–358. [Google Scholar] [CrossRef]
  25. de Barba, P.G.; Kennedy, G.E.; Ainley, M.D. The role of students’ motivation and participation in predicting performance in a MOOC. J. Comput. Assist. Learn. 2016, 32, 218–231. [Google Scholar] [CrossRef]
  26. Alraimi, K.M.; Zo, H.; Ciganek, A.P. Understanding the MOOCs continuance: The role of openness and reputation. Comput. Educ. 2015, 80, 28–38. [Google Scholar] [CrossRef]
  27. Owusu-Agyeman, Y.; Larbi-Shaw, O. Exploring the factors that enhance student–content interaction in a technology-mediated learning environment. Cogent Educ. 2018, 5, 1456780. [Google Scholar] [CrossRef]
  28. Gravani, M.N. Adult learning in a distance education context: Theoretical and methodological challenges. Int. J. Lifelong Educ. 2014, 34, 172–193. [Google Scholar] [CrossRef]
  29. Knox, J. Digital culture clash: “Massive” education in the E-learning and Digital Cultures MOOC. Distance Educ. 2014, 35, 164–177. [Google Scholar] [CrossRef] [Green Version]
  30. Doo, M.Y.; Tang, Y.; Bonk, C.J.; Zhu, M. MOOC instructor motivation and career development. Distance Educ. 2020, 41, 26–47. [Google Scholar] [CrossRef]
  31. Carrera, J.; Ramírez-Hernández, D. Innovative education in MOOC for sustainability: Learnings and motivations. Sustainability 2018, 10, 2290. [Google Scholar] [CrossRef] [Green Version]
  32. Seaton, D.T.; Bergner, Y.; Chuang, I.; Mitros, P.; Pritchard, D.E. Who does what in a massive open online course? Commun. ACM 2014, 57, 58–65. [Google Scholar] [CrossRef] [Green Version]
  33. Lan, M.; Hew, K.F. Examining learning engagement in MOOCs: A self-determination theoretical perspective using mixed method. Int. J. Educ. Technol. High. Educ. 2020, 17, 1–24. [Google Scholar] [CrossRef]
  34. Hew, K.F. Promoting engagement in online courses: What strategies can we learn from three highly rated MOOCS? Br. J. Educ. Technol. 2016, 47, 320–341. [Google Scholar] [CrossRef]
  35. Kahu, E.R. Framing student engagement in higher education. Stud. High. Educ. 2013, 38, 758–773. [Google Scholar] [CrossRef]
  36. Liu, M.; Kang, J.; McKelroy, E. Examining learners’ perspective of taking a MOOC: Reasons, excitement, and perception of usefulness. Educ. Media Int. 2015, 52, 129–146. [Google Scholar] [CrossRef]
  37. Leach, M.; Hadi, S.M. Supporting, categorizing and visualising diverse learner behaviour on MOOCs with modular design and micro-learning. J. Comput. High. Educ. 2017, 29, 147–159. [Google Scholar] [CrossRef] [Green Version]
  38. Hew, K.F.; Cheung, W.S. Students’ and instructors’ use of massive open online courses (MOOCs): Motivations and challenges. Educ. Res. Rev. 2014, 12, 45–58. [Google Scholar] [CrossRef]
  39. Mönter, L.; Otto, K.-H. The concept of disasters in Geography Education. J. Geogr. High. Educ. 2018, 42, 205–219. [Google Scholar] [CrossRef]
  40. Tsai, M.-H.; Chang, Y.-L.; Shiau, J.S.; Wang, S.M. Exploring the effects of a serious game-based learning package for disaster prevention education: The case of the Battle of Flooding Protection. Int. J. Disaster Risk Reduct. 2020, 43, 101393. [Google Scholar] [CrossRef]
  41. King, D.; Gurtner, Y.; Firdaus, A.; Harwood, S.; Cottrell, A. Land use planning for disaster risk reduction and climate change adaptation. Operationalizing policy and legislation at local levels. Int. J. Disaster Res. Built Environ. 2016, 7, 158–172. [Google Scholar] [CrossRef]
  42. Echavarren, J.M.; Balzekiene, A.; Telesiene, A. Multilevel analysis of climate change risk perception in Europe: Natural hazards, political contexts and mediating individual effects. Saf. Sci. 2019, 120, 813–823. [Google Scholar] [CrossRef]
  43. Fakhruddin, B.; Boylan, K.; Wild, A.; Robertson, R. Chapter 12-Assessing vulnerability and risk of climate change. In Climate Extremes and Their Implications for Impact and Risk Assessment; Sillmann, J., Sippel, S., Russo, S., Eds.; Elsevier: London, UK, 2020; pp. 217–241. [Google Scholar]
  44. Paul, J.D.; Hannah, D.M.; Liu, W. Editorial: Citizen Science: Reducing risk and building resilience to natural hazards. Front. Earth Sci. 2019, 7, 320. [Google Scholar] [CrossRef] [Green Version]
  45. Shimizu, M.; Clark, A.L. A modern risk society and resilience-based public policy: Structural views. In Nexus of Resilience and Public Policy in a Modern Risk Society; Shimizu, M., Clark, A.L., Eds.; Springer: Singapore, 2019; pp. 13–31. [Google Scholar]
  46. Tanner, A.; Arvai, J. Perceptions of risk and vulnerability following exposure to a major natural disaster: The Calgary flood of 2013. Risk Anal. 2018, 38, 548–561. [Google Scholar] [CrossRef] [PubMed]
  47. De Barba, P.G.; Malekian, D.; Oliveira, E.A.; Bailey, J.; Ryan, T.; Kennedy, G. The importance and meaning of session behaviour in a MOOC. Comput. Educ. 2020, 146, 103772. [Google Scholar] [CrossRef]
  48. Bernard, R.M.; Borokhovski, E.; Schmid, R.F.; Tamim, R.M.; Abrami, P.C. A meta-analysis of blended learning and technology use in higher education: From the general to the applied. J. Comput. High. Educ. 2014, 33, 87–122. [Google Scholar] [CrossRef]
  49. Deng, R.; Benckendorff, P.; Gannaway, D. Progress and new directions for teaching and learning in MOOCs. Comput. Educ. 2019, 129, 48–60. [Google Scholar] [CrossRef]
  50. Brooker, A.; Corrin, L.; de Barba, P.; Lodge, J.; Kennedy, G. A tale of two MOOCs: How student motivation and participation predict learning outcomes in different MOOCs. Aust. J. Educ. Technol. 2018, 34, 73–87. [Google Scholar] [CrossRef] [Green Version]
  51. Ruiz-Palmero, J.; López-Álvarez, D.; Sánchez-Rivas, E.; Sánchez-Rodríguez, J. An analysis of the profiles and the opinion of students enrolled on xMOOCs at the University of Málaga. Sustainability 2019, 11, 6910. [Google Scholar] [CrossRef] [Green Version]
52. Jiang, S.; Schenke, K.; Eccles, J.S.; Xu, D.; Warschauer, M. Cross-national comparison of gender differences in the enrollment in and completion of science, technology, engineering, and mathematics Massive Open Online Courses. PLoS ONE 2018, 13, e0202463.
53. Watson, S.L.; Watson, W.R.; Yu, J.H.; Alamri, H.; Mueller, C. Learner profiles of attitudinal learning in a MOOC: An explanatory sequential mixed methods study. Comput. Educ. 2017, 114, 274–285.
54. Van den Beemt, A.; Buijs, J.; van der Aalst, W. Analysing structured learning behaviour in massive open online courses (MOOCs): An approach based on process mining and clustering. Int. Rev. Res. Open Distance Learn. 2018, 19, 38–60.
55. Bannert, M.; Reimann, P.; Sonnenberg, C. Process mining techniques for analyzing patterns and strategies in students’ self-regulated learning. Metacognit. Learn. 2014, 9, 161–185.
56. Jovanovic, J.; Gasevic, D.; Dawson, S.; Pardo, A.; Mirriahi, N. Learning analytics to unveil learning strategies in a flipped classroom. Internet High. Educ. 2017, 33, 74–85.
57. Sunar, A.S.; Abbasi, R.A.; Davis, H.C.; White, S.; Aljohani, N.R. Modelling MOOC learners’ social behaviours. Comput. Hum. Behav. 2018, 107, 105835.
58. Rizvi, S.; Rienties, B.; Rogaten, J.; Kizilcec, R.F. Investigating variation in learning processes in a FutureLearn MOOC. J. Comput. High. Educ. 2020, 32, 162–181.
59. Ruthotto, I.; Kreth, Q.; Stevens, J.; Trively, C.; Melkers, J. Lurking and participation in the virtual classroom: The effects of gender, race, and age among graduate students in computer science. Comput. Educ. 2020, 151, 103854.
60. Walji, S.; Deacon, A.; Small, J.; Czerniewicz, L. Learning through engagement: MOOCs as an emergent form of provision. Distance Educ. 2016, 37, 208–223.
61. Douglas, K.A.; Merzdorf, H.E.; Hicks, N.M.; Sarfraz, M.I.; Bermel, P. Challenges to assessing motivation in MOOC learners: An application of an argument-based approach. Comput. Educ. 2020, 150, 103829.
62. Wong, J.; Khalil, M.; Baars, M.; de Koning, B.B.; Paas, F. Exploring sequences of learner activities in relation to self-regulated learning in a massive open online course. Comput. Educ. 2019, 140, 103595.
63. Handoko, E.; Gronseth, S.L.; McNeil, S.G.; Bonk, C.J.; Robin, B.R. Goal setting and MOOC completion: A study on the role of self-regulated learning in student performance in Massive Open Online Courses. Int. Rev. Res. Open Distrib. Learn. 2019, 20, 39–58.
64. Li, H.; Kim, M.K.; Xiong, Y. Individual learning vs. interactive learning: A cognitive diagnostic analysis of MOOC students’ learning behaviors. Am. J. Distance Educ. 2020, advance online publication.
65. Cho, M.-H.; Kim, B.J. Students’ self-regulation for interaction with others in online learning environments. Internet High. Educ. 2013, 17, 69–75.
66. Merrill, M.D. First principles of instruction. Educ. Technol. Res. Dev. 2002, 50, 43–59.
67. Garcia-Loro, F.; Martin, S.; Ruipérez-Valiente, J.A.; Sancristobal, E.; Castro, M. Reviewing and analyzing peer review Inter-Rater Reliability in a MOOC platform. Comput. Educ. 2020, 154, 103894.
68. Olivé, D.M.; Huynh, D.Q.; Reynolds, M.; Dougiamas, M.; Wiese, D. A supervised learning framework: Using assessment to identify students at risk of dropping out of a MOOC. J. Comput. High. Educ. 2020, 32, 9–26.
69. Poellhuber, B.; Roy, N.; Bouchoucha, I. Understanding participant’s behaviour in massively open online courses. Int. Rev. Res. Open Distance Learn. 2019, 20, 222–242.
70. Robinson, A.C.; Kerski, J.; Long, E.C.; Luo, H.; DiBiase, D.; Lee, A. Maps and the geospatial revolution: Teaching a massive open online course (MOOC) in geography. J. Geog. High. Educ. 2015, 39, 65–82.
71. Mizutori, M. Reflections on the Sendai Framework for Disaster Risk Reduction: Five years since its adoption. Int. J. Disaster Risk Sci. 2020, 11, 147–151.
Figure 1. Evolution of the number of enrolled students who start the course and of students who finish it.
Figure 2. Percentage of enrollees who start each module based on their performance.
Figure 3. Percentage of students who complete the module quiz and view the video of each unit, by week number. Note: ‘M’ refers to ‘module’ and ‘U’ refers to ‘unit’, so ‘M1U1’ refers to unit 1 of module 1, and so on. Likewise, ‘Q’ refers to ‘quiz’.
Figure 4. Percentage of students who complete the module quiz and view the video of each unit, by day of the week. Note: ‘M’ refers to ‘module’ and ‘U’ refers to ‘unit’, so ‘M1U1’ refers to unit 1 of module 1, and so on. Likewise, ‘Q’ refers to ‘quiz’.
Table 1. General information on the massive online and open course (MOOC) “Geographical analysis of natural risk: Perceive, plan, and manage uncertainty”.
Title of the course: Geographical analysis of natural risk: Perceive, plan, and manage uncertainty
E-learning platform and social network profile: MiriadaX (www.miriadax.net); @MoocRiesgosUA
Institution: University of Alicante, Spain
Organization and production: Interuniversity Institute of Geography and Faculty of Arts
Date (1st edition): 9 September–27 October 2019
Length: 7 weeks
Structure: 6 content modules (6 quizzes, 30 thematic units in total) + presentation module + evaluation module (final quiz)
Estimated workload: 4–5 h per week (30 h in total)
Scientific area: Geography (Social Sciences), Environmental Sciences
Level: Introductory
Prerequisites: None
Teachers: 10 (2–3 per module)
Language of exposition, video subtitles, and transcriptions: Spanish
Supplementary material: Spanish, English
Assessment: Module quizzes and forum discussions
Table 2. Descriptive statistics of the students who completed at least one module quiz, by learners’ group. Each cell reports N / AS / SSD / TAT / TSD, where N = number of students, AS = average quiz score, SSD = score standard deviation, TAT = average time length to complete the module, and TSD = time length standard deviation.
| Learners’ group | M1 | M2 | M3 | M4 | M5 | M6 | M7 ¹ |
| COURSE | 346 / 77.3 / 16 / 5.1 / 8.4 | 306 / 79.4 / 15.9 / 2.6 / 5.1 | 281 / 76.1 / 16.4 / 2.3 / 4.5 | 260 / 77.1 / 16.8 / 2.8 / 5.3 | 247 / 81.9 / 15.8 / 1.1 / 2.7 | 226 / 82.2 / 17.4 / 1.4 / 3.7 | 221 / 80.8 / 15.4 / 23.3 / 15.9 |
| GENDER: Men | 215 / 78 / 15.1 / 4.6 / 7.6 | 194 / 79.4 / 15.4 / 2.3 / 4.8 | 181 / 76.3 / 16.1 / 2.3 / 4.5 | 172 / 77.3 / 16.2 / 2.9 / 5 | 163 / 81.5 / 15.8 / 1.3 / 3.1 | 148 / 82.9 / 17.2 / 1.5 / 4.2 | 145 / 81 / 14.3 / 21.8 / 15.8 |
| GENDER: Women | 120 / 76.1 / 16.8 / 5.9 / 9.4 | 104 / 79.5 / 16.5 / 3.2 / 5.6 | 95 / 75.4 / 16.5 / 2.2 / 4 | 84 / 76.4 / 17.7 / 2.7 / 6 | 80 / 82 / 16 / 0.9 / 1.8 | 76 / 81.3 / 17.7 / 1.2 / 2.5 | 74 / 80.2 / 17.4 / 25.8 / 16.1 |
| TYPE OF LEARNING: Total | 221 / 77.8 / 15.7 / 4.8 / 7.9 | 221 / 80.4 / 16.0 / 2.5 / 5.2 | 221 / 77.1 / 16.0 / 2.0 / 4.0 | 221 / 77.8 / 16.3 / 2.9 / 5.5 | 221 / 81.9 / 15.9 / 1.1 / 2.7 | 221 / 82.0 / 17.6 / 1.4 / 3.7 | 221 / 80.8 / 15.4 / 23.3 / 15.9 |
| TYPE OF LEARNING: Partial | 125 / 76.5 / 16.5 / 5.8 / 9.4 | 85 / 77.1 / 15.3 / 3.0 / 5.0 | 60 / 72.5 / 17.3 / 3.4 / 6.0 | 39 / 72.3 / 19.5 / 2.7 / 4.0 | 26 / 81.2 / 15.6 / 1.2 / 2.8 | 6 / 88.3 / 11.7 / 1.8 / 3.6 | – |
| WEEKDAY: Midweek | 268 / 77.4 / 15.8 / 4.7 / 7.8 | 222 / 78.2 / 15.9 / 2.9 / 5.4 | 206 / 75.8 / 15.8 / 2.6 / 4.9 | 174 / 76.8 / 16.9 / 3.1 / 5.5 | 151 / 82.1 / 15.3 / 1.5 / 3.3 | 137 / 80.7 / 17.7 / 1.8 / 4.3 | 121 / 80.0 / 14.8 / 21.6 / 14.3 |
| WEEKDAY: Weekend | 78 / 77.3 / 16.7 / 6.7 / 10.3 | 84 / 82.6 / 15.4 / 2.1 / 4.5 | 75 / 77.2 / 17.9 / 1.6 / 3.0 | 86 / 77.7 / 16.5 / 2.4 / 5.0 | 96 / 81.5 / 16.7 / 0.7 / 1.4 | 89 / 84.6 / 16.9 / 0.9 / 2.4 | 100 / 81.7 / 16.1 / 25.4 / 17.5 |
| STARTING PERIOD: 1st week | 114 / 80.0 / 15.5 / 1.2 / 1.5 | 105 / 81.9 / 15.8 / 2.4 / 4.7 | 99 / 76.6 / 16.9 / 2.4 / 5.0 | 89 / 79.4 / 15.3 / 3.3 / 6.8 | 87 / 83.2 / 16.1 / 1.4 / 3.6 | 78 / 83.5 / 16.9 / 2.3 / 5.3 | 76 / 82.7 / 13.5 / 25.1 / 16.6 |
| STARTING PERIOD: >1st week | 232 / 76.0 / 16.1 / 7.0 / 9.7 | 201 / 78.2 / 15.8 / 2.8 / 5.4 | 182 / 75.9 / 16.1 / 2.2 / 4.2 | 171 / 75.8 / 17.4 / 2.6 / 4.3 | 160 / 81.1 / 15.7 / 1.0 / 2.1 | 148 / 81.6 / 17.8 / 1.0 / 2.4 | 145 / 79.8 / 16.3 / 22.3 / 15.5 |
| ENDING PERIOD: <Last week | 109 / 77.7 / 16.6 / 2.6 / 5.0 | 109 / 81.7 / 16.1 / 1.3 / 3.4 | 109 / 78.0 / 16.0 / 1.3 / 3.4 | 109 / 80.6 / 15.3 / 1.5 / 2.9 | 109 / 83.8 / 15.7 / 1.0 / 2.1 | 109 / 82.2 / 17.6 / 1.1 / 2.4 | 109 / 82.6 / 14.8 / 15.7 / 12.4 |
| ENDING PERIOD: Last week | 112 / 77.9 / 14.9 / 6.9 / 9.5 | 112 / 79.0 / 15.8 / 3.7 / 6.3 | 112 / 76.3 / 16.0 / 2.6 / 4.4 | 112 / 75.1 / 16.8 / 4.2 / 7.0 | 112 / 80.2 / 15.9 / 1.3 / 3.3 | 112 / 81.9 / 17.7 / 1.8 / 4.6 | 112 / 79.1 / 15.6 / 30.7 / 15.5 |
| LAST MOMENT: ≥4 modules | 151 / 77.5 / 15.5 / 3.5 / 5.3 | 151 / 80.9 / 15.6 / 1.8 / 3.8 | 151 / 77.7 / 15.4 / 1.9 / 3.6 | 151 / 77.4 / 15.9 / 2.7 / 4.7 | 151 / 82.6 / 15.7 / 1.4 / 3.2 | 151 / 81.7 / 17.4 / 1.9 / 4.4 | 151 / 81.4 / 15.4 / 22.1 / 15.5 |
| LAST MOMENT: Rest | 70 / 78.6 / 16.2 / 7.6 / 11.2 | 70 / 79.3 / 16.9 / 4.0 / 7.2 | 70 / 76.0 / 17.2 / 2.2 / 4.7 | 70 / 78.6 / 17.1 / 3.3 / 7.0 | 70 / 80.6 / 16.4 / 0.6 / 0.9 | 70 / 82.7 / 18.0 / 0.4 / 0.7 | 70 / 79.6 / 15.4 / 25.8 / 16.7 |
¹ Module 7 specifies the total time spent to complete the course.
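For readers who want to reproduce this kind of summary from their own course logs, the per-group descriptive statistics in Table 2 can be assembled with a standard groupby aggregation. The sketch below is illustrative only: the DataFrame, its column names, and all values are hypothetical stand-ins for the scraped MiriadaX records, not the actual export format.

```python
import pandas as pd

# Hypothetical per-student activity records; column names and values
# are invented stand-ins for the scraped platform data.
records = pd.DataFrame({
    "student": [1, 1, 2, 2, 3, 3, 4, 4],
    "module": ["M1", "M2", "M1", "M2", "M1", "M2", "M1", "M2"],
    "group": ["Men", "Men", "Men", "Men", "Women", "Women", "Women", "Women"],
    "quiz_score": [82.0, 75.0, 77.0, 80.0, 68.0, 91.0, 73.0, 85.0],
    "days_to_complete": [4.0, 2.5, 5.5, 2.0, 6.0, 3.0, 4.5, 3.5],
})

# Per group and module: number of students plus mean and standard deviation
# of quiz scores and completion times (the N/AS/SSD/TAT/TSD layout of Table 2).
summary = records.groupby(["group", "module"]).agg(
    N=("student", "size"),
    AS=("quiz_score", "mean"),
    SSD=("quiz_score", "std"),
    TAT=("days_to_complete", "mean"),
    TSD=("days_to_complete", "std"),
).round(1)
print(summary)
```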
Table 3. Mann-Whitney U test results between the time lengths to complete each module.
| Module | M2 | M3 | M4 | M5 | M6 | Average time |
| M1 | W = 67,234; p = 1 × 10−8 | W = 60,912; p = 4 × 10−7 | W = 53,138; p = 0.0005 | W = 59,663; p = 2 × 10−15 | W = 53,674; p = 1 × 10−12 | 5.1 |
| M2 | – | W = 41,300; p = 0.36 | W = 35,560; p = 0.02 | W = 41,614; p = 0.04 | W = 37,023; p = 0.19 | 2.62 |
| M3 | – | – | W = 33,829; p = 0.10 | W = 40,031; p = 0.003 | W = 35,859; p = 0.01 | 2.27 |
| M4 | – | – | – | W = 39,578; p = 8 × 10−6 | W = 35,668; p = 0.0001 | 2.82 |
| M5 | – | – | – | – | W = 27,281; p = 0.61 | 1.15 |
| M6 | – | – | – | – | – | 1.43 |
Note: ‘M’ refers to ‘module’. Module 7 is not included because it consists only of the final quiz and has no content units.
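As a pointer for replication, the pairwise comparisons in Tables 3 and 4 are two-sided Mann-Whitney U tests, which SciPy provides out of the box. The following is a minimal sketch under the assumption that per-student time lengths are available as plain arrays; the values below are invented for the example and do not come from the course data.

```python
from scipy.stats import mannwhitneyu

# Invented per-student completion times for two modules; the real
# analysis would use the scraped per-student time lengths.
m1_days = [5.0, 7.5, 3.0, 9.0, 4.5, 6.0, 2.5, 8.0]
m2_days = [2.0, 3.5, 1.5, 4.0, 2.5, 3.0, 1.0, 2.0]

# Two-sided test of whether the two distributions differ, as in Table 3.
result = mannwhitneyu(m1_days, m2_days, alternative="two-sided")
print(f"W = {result.statistic:.0f}, p-value = {result.pvalue:.4f}")
```

The same call applied to per-student quiz-score arrays yields the comparisons reported in Table 4.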
Table 4. Mann-Whitney U test results between module quiz scores.
| Module | M2 | M3 | M4 | M5 | M6 | M7 | Average score |
| M1 | W = 49,445; p = 0.10 | W = 51,028; p = 0.34 | W = 45,739; p = 0.82 | W = 35,929; p = 0.0004 | W = 32,241; p = 0.0001 | W = 32,558; p = 0.001 | 77.3 |
| M2 | – | W = 48,113; p = 0.01 | W = 43,286; p = 0.09 | W = 34,618; p = 0.06 | W = 30,896; p = 0.02 | W = 31,408; p = 0.12 | 79.4 |
| M3 | – | – | W = 35,729; p = 0.55 | W = 27,963; p = 5 × 10−5 | W = 25,278; p = 3 × 10−5 | W = 25,357; p = 0.0002 | 76.1 |
| M4 | – | – | – | W = 27,074; p = 0.001 | W = 24,351; p = 0.0005 | W = 24,696; p = 0.004 | 77.1 |
| M5 | – | – | – | – | W = 27,131; p = 0.48 | W = 27,881; p = 0.80 | 81.9 |
| M6 | – | – | – | – | – | W = 26,482; p = 0.34 | 82.2 |
| M7 | – | – | – | – | – | – | 80.8 |
Note: ‘M’ refers to ‘module’.
Table 5. Mann-Whitney U test results between learners’ groups for each module (p-values).
| Learners’ group | Metric | M1 | M2 | M3 | M4 | M5 | M6 | M7 |
| GENDER | Quiz score | 0.24 | 0.91 | 0.53 | 0.71 | 0.77 | 0.57 | 0.99 |
| GENDER | Time length | 0.47 | 0.46 | 0.9 | 0.52 | 0.99 | 0.76 | 0.08 * |
| TYPE OF LEARNING | Quiz score | 0.45 | 0.08 * | 0.05 * | 0.12 | 0.74 | 0.51 | – |
| TYPE OF LEARNING | Time length | 0.79 | 0.09 * | 0.29 | 0.64 | 0.32 | 0.65 | – |
| WEEKDAY | Quiz score | 0.93 | 0.03 ** | 0.47 | 0.7 | 0.92 | 0.07 * | 0.3 |
| WEEKDAY | Time length | 0.42 | 0.09 * | 0.02 ** | 0.03 ** | 0.02 ** | <0.01 *** | 0.06 * |
| STARTING PERIOD | Quiz score | 0.02 ** | 0.04 ** | 0.89 | 0.11 | 0.26 | 0.5 | 0.27 |
| STARTING PERIOD | Time length | <0.01 *** | 0.59 | 0.55 | 0.5 | 0.39 | <0.01 *** | 0.16 |
| ENDING PERIOD | Quiz score | 0.97 | 0.18 | 0.45 | 0.01 ** | 0.07 * | 0.9 | 0.09 * |
| ENDING PERIOD | Time length | <0.01 *** | <0.01 *** | 0.05 * | <0.01 *** | 0.1 | 0.44 | <0.01 *** |
| LAST MOMENT | Quiz score | 0.64 | 0.55 | 0.56 | 0.61 | 0.42 | 0.58 | 0.39 |
| LAST MOMENT | Time length | 0.06 * | 0.29 | 0.44 | 0.54 | 0.17 | <0.01 *** | 0.09 * |
Note: * p < 0.1; ** p < 0.05; *** p < 0.01.
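Table 5 applies the same test between learner groups within each module and flags significance with stars. A hedged sketch of that final step, with invented scores for one group pair, could look as follows; the helper function simply mirrors the thresholds in the note above.

```python
from scipy.stats import mannwhitneyu

def stars(p: float) -> str:
    """Significance markers matching the thresholds in the note of Table 5."""
    if p < 0.01:
        return "***"
    if p < 0.05:
        return "**"
    if p < 0.1:
        return "*"
    return ""

# Invented quiz scores for two learner groups within a single module.
midweek_scores = [72, 85, 90, 66, 78, 81, 74, 69]
weekend_scores = [80, 88, 92, 79, 85, 83, 90, 77]

p = mannwhitneyu(midweek_scores, weekend_scores, alternative="two-sided").pvalue
print(f"Midweek vs. weekend quiz scores: p = {p:.2f} {stars(p)}")
```

Looping this comparison over every group pair and module would regenerate the full grid of p-values shown above.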
