Article

Nudging Autonomous Learning Behavior: Three Field Experiments

1 Department of Psychology, Education, and Child Studies, Erasmus University Rotterdam, 3062 PA Rotterdam, The Netherlands
2 Breda University of Applied Sciences, Monseigneur Hopmansstraat 2, 4817 JS Breda, The Netherlands
3 School of Education/Early Start, University of Wollongong, Keiraville, NSW 2522, Australia
* Author to whom correspondence should be addressed.
Educ. Sci. 2023, 13(1), 49; https://doi.org/10.3390/educsci13010049
Submission received: 18 November 2022 / Revised: 23 December 2022 / Accepted: 30 December 2022 / Published: 1 January 2023

Abstract
Autonomous learning behavior is an important skill for students, but they often do not master it sufficiently. We investigated the potential of nudging as a teaching strategy in tertiary education to support three important autonomous learning behaviors: planning, preparing for class, and asking questions. Nudging, a strategy originating from behavioral economics, steers human behavior by altering the choice environment. In this study, three nudges were designed by researchers in co-creation with teachers: a video booth to support planning behavior (n = 95), a checklist to support class preparation (n = 148), and a goal-setting nudge to encourage students to ask questions during class (n = 162). These were tested in three field experiments in teachers' classrooms with students in tertiary education in the Netherlands. A mixed-effects model approach revealed a positive effect of the goal-setting nudge on students' grades and a marginal positive effect on the number of questions students asked. Additionally, evidence for increased self-reported planning behavior was found in the video booth group, but no decrease in missed deadlines. No significant effects were found for the checklist. We conclude that, for some autonomous learning behaviors, primarily asking questions, nudging has potential as an easy, effective teaching strategy.

1. Introduction

In tertiary education, there is an increasing focus on students becoming autonomous learners (e.g., [1,2]). Autonomous learners, also described as self-regulated learners (see [3,4,5]), understand the goals of their learning programme, set learning goals, take initiative and responsibility for their learning activities, and regularly review the effectiveness of those activities [6]. Becoming an autonomous learner is critical, as autonomous learners are more effective learners [7,8]. A considerable amount of research has shown that autonomous learning behavior is positively related to higher academic achievement and learning motivation [3,9,10,11]. Demonstrating autonomous learning behavior not only has positive effects on students, it also benefits their teachers and the classroom climate [12], and it remains important after graduation: employers increasingly demand autonomous learning behavior from graduates who must deal with increasingly complex societal problems [13,14]. Given these many benefits, an important task of education is to support students in becoming autonomous learners.

1.1. Supporting Autonomous Learning Behavior

It appears, however, that students often experience difficulties with becoming autonomous learners and with demonstrating autonomous learning behaviors, like planning and preparing for class [15]. At the same time, teachers often have difficulty promoting autonomous learning behavior [16], because they attempt to improve it using external reinforcements, like punishments and rewards [15]. However, these forms of external regulation diminish rather than support students' autonomous learning behavior by negatively affecting perceived autonomy and self-regulation [17,18]. Instead, teachers should ideally transfer this external regulation to students gradually [19] (see also scaffolding [20]) when trying to improve autonomous learning behaviors. Teachers, however, indicate that they lack the tools to do this successfully, as they are unsure how to balance guidance with letting students find their own way [16,21].
So far, several attempts have been made to achieve a gradual transfer towards autonomy using alternatives to external regulation. Workplace simulations have been implemented, intended to improve autonomous learning behavior in an environment reflecting the future workplace [21]. It became clear, however, that simply granting autonomy in these workplace simulations was not enough to promote students' autonomous learning behavior. A review [12] showed positive results of autonomy-supportive teaching on autonomous learning behavior (e.g., restructuring learning activities to satisfy individual students' needs and interests). Despite being effective, these explicit support techniques are often not used in practice, as they require significant preparation and effort from teachers [16]. In the present study, we therefore focused on a method that gives students more freedom by reducing external regulation while still providing guidance and behavioral support, and that is not taxing on the teacher, so it can be easily implemented in practice. To find this middle ground, we looked towards nudging as a promising tool.

1.2. Nudging

Nudging is a psychological intervention technique that influences behavior by changing the environment, aiming to guide people towards more desirable behavior [22]. A nudge is defined as "any aspect of the choice architecture that alters people's behavior in a predictable way without forbidding any options or significantly changing their economic incentives. To count as a mere nudge, the intervention must be easy and cheap to avoid" [22] (p. 6). A well-known example is automatic enrolment in a pension plan [23]: when employees had to opt out of the plan to stop participating, more of them stayed in the program than when they had to actively sign up to participate. A recent meta-review [24] of more than 200 studies revealed that nudging has been a successful tool for changing behavior across different fields.
Thaler and Sunstein used dual-process theory (introduced by Stanovich [25] and popularized by Kahneman [26]) to explain the effect of nudging [22]. This theory states that human behavior is driven by two systems of information processing. System 1, the automatic system, consists of uncontrolled, fast, associative thinking, and steers decision making with heuristics and biases. System 2, the reflective system, represents careful, effortful deliberation, and makes choices based on rational and conscious decision making. System 1 often determines our behavior because it is faster and requires less effort. Because System 1 can make irrational decisions, a person may behave in ways that are inconsistent with their long-term goals, which are set by System 2. An example is a student watching a show while needing to study for a test the next day. By recognizing the heuristics and biases of System 1 and using them to "nudge" people towards their own long-term goals, nudging may help people align their choices more closely with System 2. In the pension example, by having to opt out of the program to quit, instead of having to opt in to join, many people chose to remain in the plan [22]. Even though nothing changed for System 2, System 1 was swayed because the desirable choice became the easy choice: people tend to opt for whatever requires the least effort. This successfully nudged people into joining the plan.

1.3. Nudging in Education

While only 4% of nudging research focuses on education [27], there have been multiple promising findings for educational nudges [28]. Nudges in education have primarily been focused on "end result" achievements, or milestones [29], like college enrolment (e.g., [30,31]), and many have been successful in this regard. Less research has focused on nudging for continuous behavior change in students [28], and interventions targeting continuous behavior change have produced smaller effects than interventions nudging milestone behavior [32]. However, we believe nudging autonomous learning behavior has potential in education, as changing students' behavior has long been a focus of the field (e.g., [18,33]). In a theoretical paper, Weijers et al. [29] argue that nudging direct behavior in an educational context holds promise. Additionally, teachers are often unaware of nudging as a strategy to support autonomous learning behavior [34]. When improving autonomous learning behavior, nudges could be well suited as they neither coerce nor forbid options. This makes them different from external regulation: nudges do not reduce perceived autonomy [35]. At the same time, a nudge still provides students with guidance towards the desired behavior. As such, nudges can be a tool to help transfer regulation from the teacher to the student, fitting the gradual transfer of regulation necessary to achieve self-regulation, as described by Collins et al. [19] and Vermunt and Vermetten [36]. This is also similar to strategies that teachers already use, like prompting and cueing to guide students through learning tasks (e.g., [37]; see also [38]). Additionally, nudges are often small changes or interventions that are easily implemented, making them less taxing on teachers than the autonomy-supportive teaching methods examined thus far [16]. For example, an autonomy-supportive teaching style like "flipping the classroom" (see, e.g., [39]) can provide students with an environment in which to demonstrate autonomous learning behaviors and teachers with an environment in which to support them more successfully. However, this requires teachers to invest significant time and resources in educational redesign, which may not be available to them. Furthermore, larger-scale interventions like this can readily coexist with nudging, as nudging techniques can easily be integrated into different educational settings due to their fast, cheap, and efficient nature. Lastly, students are often hampered by cognitive barriers (caused by System 1) when demonstrating autonomous learning behaviors [40]. These barriers can consist of, for example, a strong present bias (i.e., valuing future rewards less than rewards in the present) or a lack of self-control due to a still-developing adolescent brain, preventing students from performing the desired behavior [41]. Nudges are specifically designed to circumvent these kinds of cognitive barriers [42].
The effectiveness of nudging as a tool to improve autonomous learning behavior has already been demonstrated on a small scale. For example, nudging has successfully reduced procrastination [43] and increased homework completion [44]. However, most nudging research in education has targeted learning outcomes instead of behavioral outcomes, and has done so in university settings [28,29]. Therefore, in the present study we investigated the use of nudging to improve both autonomous learning behavior and learning outcomes in students. This led us to the following research question: can nudging as a teaching strategy support students' autonomous learning behavior? In line with previous studies that measured only learning outcomes, as described by Damgaard and Nielsen [28], we expected that nudging could support students in their autonomous learning behavior, thereby increasing their learning outcomes. We formulated the following general hypothesis: students who are nudged to display autonomous learning behavior display that behavior more often and achieve higher learning outcomes than students who are not nudged. We add to the literature by applying nudging in education with a focus on behavioral outcomes, specifically autonomous learning behavior.
We conducted three experiments to investigate the effect of nudging on students' autonomous learning behaviors and learning outcomes. We used a novel design approach focused on collaboration and co-creation with the participating teachers to create the most effective nudge interventions (for details, see the Method section). After the description of the nudge designs, the specific hypotheses for each experiment are presented.
In each of the experiments, we aimed to improve a different behavior: planning, preparing for class, or asking questions. These behaviors represent different key aspects of autonomous learning behavior, as defined by Zimmerman [45]: forethought, performance, and reflection. First, planning is a key element of autonomous learning behavior [46] and is associated with higher academic achievement [47]. This behavior represents the forethought phase of self-regulation, both in-class and out-of-class [48], in which the student analyses the task, activates prior knowledge, and plans accordingly. Second, class preparation is an important out-of-classroom factor in students' educational success: a prepared student is better able to comprehend new material, assess their own learning trajectory, and ultimately attain higher grades [49]. This behavior represents the performance phase of self-regulation [45,48], taking place outside the classroom. Third, asking questions is a vital form of classroom behavior that steers the flow of knowledge transfer and ensures that students can keep up with instruction [50,51,52]; it is also an important skill for students during an internship [53]. It represents the self-reflection phase of self-regulated learning [48], taking place inside the classroom.
This distinction also allows us to examine the appropriate timing for a nudge to successfully change behavior: in-class behavior (asking questions), out-of-class behavior (preparing for class), and behavior that is both in-class and out-of-class (planning). Additionally, Weijers [54] uncovered categories of autonomous behaviors that students consider important and in need of improvement. Planning, preparing for class, and asking questions were among the most frequent and prominent of these categories, indicating the practical relevance of developing successful nudges for these behaviors. Figure 1 presents a summary of the theory of change, highlighting the theorized effect of the nudging interventions tested in this study to promote autonomous learning behavior in education.

2. Materials, Methods, and Results

The experimental method consisted of two phases. First, teachers and researchers collaborated to create a suitable and potent nudge intervention that the teachers could use in their teaching practice (the design phase). This resulted in three nudges. Second, the nudges were tested in a quasi-experimental setting among students from the teachers' own classroom(s) (e.g., the nudge developed by Teacher A was tested in their classroom) (the experimental phase). A separate experiment was planned for each of the three nudges. We first describe the general method of all three studies, and then describe the specifics of each study, as well as the results.

2.1. First Step: Design

Nudges have proven effective in changing different kinds of behavior in various situations, but the specific context has a large impact on a nudge's effectiveness [55]. Therefore, to create an effective nudge, it is important to tailor it to the specific context and target group. To achieve this, the concepts of design patterns [56] and stakeholder involvement in intervention design (e.g., [57]) were applied to the nudge design procedure. This was combined with Hansen's framework for nudge design [55] and the theoretical framework EAST [58] to design nudges that would potentially be the most effective in changing the target behavior. All nudges were designed to be applicable in an online or hybrid classroom due to possible COVID-19 regulations, but the research was ultimately carried out in physical classrooms.
Design procedure. We recruited 30 teachers in vocational education and training (VET) and in higher vocational education (HVE) who expressed interest in using nudges to promote autonomous learning behavior. They were split equally, resulting in three parallel design groups of ten teachers, one for each target behavior (planning, preparing for class, and asking questions). Each design group held three design sessions, each set up by two researchers: the first author and a researcher familiar with the specific school. In the first session, teachers were introduced to nudging and received examples of nudges used in practice. Afterwards, they participated in a brainstorming session. First, all teachers collectively listed factors they felt prevented students from performing the target behavior. The teachers were then distributed across four tables, corresponding to the four themes of EAST: Easy, Attractive, Social, and Timely [58], where they used brainstorm playing cards [59] (for an example, see Figure 2) to generate ideas. Teachers grouped the ideas on their table and presented the best ones, prompting a group discussion.
After the first session, the involved researchers collaborated to make a selection based on the teachers' input and scientific insights, and worked out specific nudge proposals for each design group. These proposals and the theoretical justification for the final nudge designs can be found in the additional materials. In the second design session, the two researchers of the design group presented these options. The teachers decided which option they preferred and proposed changes to make it fit better in their environment. Individual teachers then piloted the nudges for practical applicability by implementing them in their classes for a short period. In the third design session, these teachers reported back to the design group, and the nudges were refined based on their feedback. This resulted in the final nudge, which was subsequently tested experimentally in the teachers' next semester. A visual overview of the process can be found in Figure 3.

2.2. Second Step: Experimental Procedure

Participating teachers submitted a list of their available and suitable classes to the researcher. These classes were randomly assigned to either the control or the nudge condition, aiming to keep group sizes similar and to balance educational track and level as well as possible between conditions. Due to the nature of co-creation, it was not possible to blind teachers to the experiment, nor to randomize the nudge they would be implementing. We took several precautions to avoid contamination. Teachers were specifically instructed, during both the design and the execution of the experiment, not to change the way they taught their classes other than implementing the nudge. This, as well as whether the teachers were able to implement the nudge successfully in the nudge classes, was verified after the experiment by a researcher interviewing all participating teachers. Specific student demographics were not collected.
Students were informed about the experiment by their teacher. As opting out of the intervention itself was difficult for the class-wide interventions, all students were reminded of the possibility to opt out of data collection for the experiment. Informed consent was obtained during the first lesson of the semester in which the experiment took place. The semester lasted between 6 and 10 weeks, depending on the school. For all participants, final course grades were collected, and specific behavioral data were gathered for each experiment (for details, see the Methods of each experiment). Grades in the Netherlands range from 1 (very bad) to 10 (excellent), with 5.5 being a barely passing grade. To contextualize the findings, complementary interviews were held with all teachers to gain insight into their perceptions and the effects of the nudges. The interview structure can be found in the additional materials.

2.2.1. Analysis

Due to the extraordinary circumstances of the COVID-19 pandemic, which impacted the participating teachers’ and students’ personal and professional lives, participation in the experiment and data collection across the project proved difficult and at times impossible (see Appendix A). This means that, for all studies, the final sample size was less than expected.
The main hypotheses were investigated using a mixed-effects model approach with condition (control vs. nudge) as the independent variable. Depending on the hypothesis, the behavioral outcome or the final grades were taken as the dependent variable. The variable teacher was included as a fixed intercept, and the variable student, nested in the variable class, as a random intercept. To optimize the random structure, the suggestions of Barr et al. [60] were used as a guideline. The complete models are described in the preregistration on the Open Science Framework (see below). The reported models were run as preregistered.

2.2.2. Transparency and Openness

All participants in all experiments were asked for informed consent ahead of the experiment. This study obtained ethical approval from the Ethical Review Committee of the Department of Psychology, Education, and Child Studies; Erasmus University Rotterdam (application 19-052).
Data were analyzed in the statistical program R, version 3.6.3 [61], using the lme4 package [62]. Using a mixed-effects model approach, the cell means were compared to the grand mean to determine estimates of the effect. To determine p-values, the parametric bootstrap function of the afex package [63] was used. This study's design and its analysis were preregistered. All data, analysis code, research materials, and preregistered analyses are available at the Open Science Framework (https://osf.io/ye9av/?view_only=03fef7e33df04058983ad331b214b5a7, accessed on 23 December 2022).
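For illustration, the model described in Section 2.2.1 could be specified as follows. This is a minimal sketch with hypothetical variable and data frame names (outcome, condition, teacher, class, student, d); the preregistered analysis code on the Open Science Framework is authoritative.

library(afex)  # wraps lme4 and provides bootstrap tests; by default it uses
               # sum-to-zero contrasts, so estimates compare cell means to
               # the grand mean, as described above

# Condition as independent variable, teacher as fixed intercept, and
# students nested within classes as random intercepts. The random structure
# can be simplified following Barr et al. [60] if the model does not converge.
m <- mixed(
  outcome ~ condition + teacher + (1 | class/student),
  data = d,                       # one row per measurement occasion
  method = "PB",                  # parametric bootstrap p-values
  args_test = list(nsim = 1000)   # number of bootstrap simulations (assumed)
)
m  # prints the bootstrapped test per fixed effect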

2.2.3. Specific Setup

After the design phase, the three design groups carried out an experiment with their nudge for their target behavior. During the second design session of the group nudging planning behavior, the group split between two nudge options: a video booth and reminders (described below). The teachers divided themselves into two equally sized groups based on their interest. These groups each designed a nudge separately, resulting in two nudges and two experiments aimed at planning: a video booth nudge and a reminder nudge. As a strong ceiling effect (i.e., nearly all participating students met the deadline targeted by the reminder nudge) prevented us from drawing any meaningful conclusions about the effects of the reminder nudge, the setup and results of that experiment are reported in Appendix B for brevity. The setup and results of the remaining three nudges are discussed below.

2.3. Experiment 1: Planning-Video Booth

In this experiment, we aimed to improve individual students' adherence to deadlines through a video booth intervention. The hypotheses for this experiment were:
Hypothesis 1a.
Students who received the video booth nudge would report more planning behavior than students who did not receive the video booth nudge.
Hypothesis 1b.
Students who received the video booth nudge would miss fewer deadlines than students who did not receive the video booth nudge.
Hypothesis 1c.
Students who received the video booth nudge would obtain higher course grades than students who did not receive the video booth nudge.
Participants. In total, 98 first-year and 32 third-year VET students (130 students in total) participated in this experiment. Due to the nature of the intervention, randomization was possible at the individual level instead of per class. Sixty-five students were assigned to the nudge condition and sixty-five to the control condition. All participants were VET students in the marketing and events sector. Individual demographics were not collected, but the students' age range in this educational track is 18 to 21 years.
Design. Students were individually asked to take part in the experiment, in which they answered questions on cue cards while filming themselves in an empty classroom, the "video booth". In the nudge condition, the cue cards were designed to have students persuade themselves of the importance of planning, identify obstacles to planning, and form an implementation intention. First, students were asked why they found planning important, thereby engaging in self-persuasion [64], which is more effective in creating behavioral change than direct persuasion [65,66]. Next, they were asked to identify their obstacle to demonstrating planning behavior (mental contrasting [67]) and then to find a specific solution for circumventing this obstacle (implementation intention [68]). Combining mental contrasting and implementation intentions has been successful in earlier educational research, where it increased attendance, preparation time, and grades [69,70], but it has not yet been applied to increase planning behavior. Similar interventions, called "wise interventions", have been successful in creating positive behavior change in several fields (for an overview, see [71]). No explicit instruction on how to plan was provided on the cue cards. In the control condition, students followed the same procedure, but the instructions on the cue cards were about healthy eating.
Materials. In the booth, students sat at a desk in front of a handheld camera on a tripod, pointed at the chair in which the student sat. Cue cards with instructions, including when to switch the camera on and off, were numbered in order and placed on the desk. The cue cards used can be found in the additional materials.
Measures. Three outcome measures were used in this experiment: (1) self-reported planning behavior, (2) missed deadlines, and (3) final course grades. To measure adherence to planning, a short questionnaire was designed to accompany the final test or deadline. This questionnaire consisted of four questions about a student's planning adherence for that test or deadline, answered on a seven-point scale. A sample question is "I was able to do all preparation I wanted to do for this [deadline/test]", with answers ranging from "completely disagree" (1) to "completely agree" (7). The questionnaire can be found in the additional materials. Construct reliability of the planning questionnaire was acceptable (Cronbach's alpha = 0.74). As an additional measure of successful planning behavior, we also recorded how many deadlines students missed during the semester. The first-year students had 45 deadlines throughout their period (one semester: September to November), and teachers recorded all missed deadlines. The third-year students had only a few deadlines, because they were in their exam period, which focused on exams rather than assignment deadlines. Additionally, the final grades of their courses were collected.
Procedure. The procedure took place over three consecutive school days. During a class, all students were asked to participate in the experiment. Two video booths, in separate rooms, were simultaneously available, so students were taken from their classes in pairs by an experimenter. The teacher remained blind to the condition each student was in. Students were informed about the experiment in pairs in a classroom set up as a waiting room, where the experimenter outlined the procedure and told students that the goal of the experiment was to find out how students dealt with everyday problems: planning properly (experimental condition) or eating healthily (control condition). Here, students signed a consent form specific to the video booth, which can be found in the additional materials. This instruction took approximately five minutes. The students were then taken to the video booth, an empty classroom in the school where a handheld camera on a tripod was set up facing a desk. Students sat down in front of the camera, read the cue cards out loud, followed the instructions on the cards, and answered the questions while thinking out loud, filming themselves to strengthen their commitment to their answers [72]. They were left alone to ensure their privacy and honesty during the procedure. When finished, students informed the experimenter in the waiting room. The video booth took approximately five minutes per student. The experimenter thanked the students for participating and asked them not to talk about the project with their classmates until the end of the day. Afterwards, students went back to their classes. The video was sent to the student if they had indicated on their consent form that they wanted it, and was then deleted. Deadline adherence was recorded throughout the period, which lasted approximately ten weeks. Figure 4 contains an overview of the timepoints of intervention and data collection.
Results. The third-year students missed no deadlines. Because of this, all third-year students (n = 32) were excluded from the experiment. Three first-year students were excluded because their number of missed deadlines was more than three standard deviations (9.66) from the mean. Results did not change significantly when the hypotheses were tested with these students included. This left 95 students in the final sample, 49 in the control condition and 46 in the nudge condition.
For Hypothesis 1a, the planning questionnaire was filled in by 30 students who were also present in the final sample. Of these students, 16 were in the control condition and 14 in the nudge condition. Bootstrapped multilevel analysis revealed a significant effect (Estimate = −0.44, SE = 0.15, t(30) = 2.94, p < 0.01): nudged students reported more planning behavior (M = 4.70, SD = 0.71) than students in the control condition (M = 3.81, SD = 0.97).
For Hypothesis 1b, the numbers of missed deadlines per condition were compared using multilevel analysis. No significant effect of the nudge was found on missed deadlines, Estimate = −0.09, SE = 0.19, t(89.90) = 0.51, p = 0.61. Students in the nudge condition (M = 1.15, SD = 2.00) did not differ significantly from those in the control condition (M = 0.96, SD = 1.77). The removal of the outlying students did not significantly change this outcome.
In an exploratory analysis, we correlated the number of missed deadlines with the self-reported planning score for the subgroup of 30 students who had filled in the planning questionnaire. No significant correlation was found between the two measures (ρ = −0.06, p = 0.75), suggesting no relation between the planning behavior a student reports and the deadlines they actually meet.
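This exploratory check amounts to a single rank-order correlation; a sketch with hypothetical column names (missed_deadlines, planning_score) in the questionnaire subsample d30:

# Spearman rank correlation between missed deadlines and self-reported
# planning, for the 30 students with questionnaire data
cor.test(d30$missed_deadlines, d30$planning_score, method = "spearman")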
During the research, it became apparent that it was difficult, if not impossible, to combine all grades into a meaningful number, given that students followed different courses and educational tracks and given the limited availability of the data. We therefore did not pursue the effect of the nudge on grades (Hypothesis 1c).
Discussion. When using the video booth as a nudge to promote planning behavior, we found a positive effect of the nudge on reported planning behavior. However, no effect was found on actually meeting deadlines. A possible explanation is that the behavioral measurement (deadlines) and the intended behavioral change (planning behavior) are not directly comparable, as suggested by the lack of correlation between the two measures. This corresponds with the findings of Ariely and Wertenbroch [73] and Levy and Ramim [43]. In these studies, students were nudged by being allowed to set their own deadlines. Although this did not lead to improved outcomes [43,73], this was because students tried to improve their planning but did so suboptimally [73]. These findings correspond with Levy and Ramim's observation [43] that such students did report less procrastination. In summary, these students attempted to improve their planning behavior, but this was not (yet) reflected in their missed deadlines. A similar process may have taken place in our experiment: students attempted to engage in planning behavior (Hypothesis 1a accepted), but this did not (yet) show in their realized deadlines (Hypothesis 1b rejected). This interpretation should be made with caution, however, as the small sample size for the questionnaire makes small effects difficult to detect. This sub-sample is also likely not a random sample of the participants, but probably consists of students who are more actively engaged with school. Additionally, because planning was measured only by self-report, students may have overestimated their own planning behavior or given socially desirable answers because they made the connection with the intervention at the start of the semester. Further research is necessary to verify whether the increase in self-reported planning behavior corresponds with an actual increase in planning behavior.

2.4. Experiment 2: Preparation for Class

In this experiment, we aimed to improve students’ preparation for class by creating a checklist intervention. The hypotheses for this experiment were:
Hypothesis 2a.
Students who received the checklist nudge would prepare for class more often than students who did not receive the checklist nudge.
Hypothesis 2b.
Students who received the checklist nudge would obtain higher course grades than students in the control condition.
Participants. Of the teachers in the design group that created the nudge for class preparation, three were ultimately able to participate in the experiment. Every teacher participated with two parallel classes (classes in the same year receiving the same course). Per teacher, these classes were divided between the control and nudge conditions so that type of course and year of study were distributed evenly across conditions. In total, 3 classes totaling 76 students participated in the nudge condition, and 3 classes totaling 72 students in the control condition. All students were HVE students following the educational track to become English teachers; 98 were first-year students and 50 were third-year students. Examples of themes of participating courses are didactic skills and English literature. Individual demographics were not collected, but the students' age range in this educational track is 18 to 23 years.
Design. To improve preparation for class, teachers designed and provided a checklist for their students, outlining per class the specific preparation they needed to perform. Students could then tick the box for the work they had completed. Students in the control condition were provided with the regular course material, generally a course syllabus that outlined the preparatory work but was not presented in a checklist format.
Measures. Class preparation was registered per lesson for each individual student by self-report (for reading/learning) or by teacher checks (for exercises). Preparation was registered as complete, not complete, or partly complete (in the case of multiple exercises). To account for different courses and schedules, teachers converted these registrations to a four-point score: 1 (no preparatory work), 2 (some preparatory work), 3 (most preparatory work), or 4 (all preparatory work). Not all courses had the same number of classes requiring preparation, which was taken into account in the analysis.
Procedure. For all classes in the nudge condition, the teacher prepared a checklist of the relevant course preparation per lesson, based on the regular course materials and preparatory instructions, and distributed it at the start of the course. The control classes received the regular materials, usually a syllabus with the same information but not in a checklist format. Because checklists contained class-specific dates, deadlines, and exercises, they were not sharable between classes, reducing possible contamination effects. An example checklist can be found in the additional materials. Each lesson, the teacher registered each student's preparation. Individual grades were intended to be collected at the end of the course, but exams were postponed due to intensifying COVID-19 measures. Figure 5 contains an overview of the timepoints of intervention and data collection.
Results. Data were collected per lesson, resulting in a total of 662 data points. The average class preparation score was 2.08 (SD = 1.21). A frequency table of scores can be found in Table 1.
Hypothesis 2a was rejected: students in the nudge condition (M = 2.15, SD = 1.24) did not score significantly higher on class preparation (Estimate = −0.03, SE = 0.06, t(104.61) = −0.53, p = 0.60) than students in the control condition (M = 2.00, SD = 1.18). Because the preregistration assumed class preparation to be a binary variable, the scores were also converted to binary form and the same tests were run. The results remained stable when converting scores to either a strict binary (1 = all preparatory work completed, 0 = not all preparatory work completed) or a lenient binary (1 = any preparatory work completed, 0 = no preparatory work completed).
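The two binary conversions can be sketched as follows, assuming a hypothetical column prep_score holding the four-point preparation score:

# Strict binary: 1 only if all preparatory work was completed (score 4)
d$strict <- as.integer(d$prep_score == 4)
# Lenient binary: 1 if any preparatory work was completed (score 2 or higher)
d$lenient <- as.integer(d$prep_score >= 2)
# The preregistered model is then rerun with each binary as the outcome,
# e.g., via afex::mixed(..., family = binomial, method = "PB")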
Unfortunately, the tests for the participating classes were postponed due to COVID-19, meaning that the grades needed to test Hypothesis 2b could not be collected.
Discussion. We did not find a significant effect of the nudge on class preparation. A possible explanation for this non-significant finding is that uncertainty about what to prepare is not the main behavioral barrier encountered by students. For example, a present bias, overconfidence in one's own ability, or simply other ongoing affairs can cause a student to consciously decide not to prepare for class. Nudging is ineffective when the nudgee's attitude does not support the nudged behavior [74]. Nudging in a transparent way, meaning that both the intervention and its purpose are apparent to those subjected to it [75], as in this experiment, allows students to easily resist the attempted behavioral change [75,76]. Future research should investigate which behavioral barriers prevent students from preparing for class, in order to design more effective nudges.

2.5. Experiment 3: Asking Questions

In this experiment, we aimed to improve students' question-asking in class through a class-wide goal-setting nudge intervention. The hypotheses for this experiment were:
Hypothesis 3a.
Students who received the goal-setting nudge would ask more questions during class than students who did not receive the goal-setting nudge.
Hypothesis 3b.
Students who received the goal-setting nudge would obtain higher course grades than students who did not receive the goal-setting nudge.
Participants. Of the 7 recruited teachers, 3 ultimately participated in the experiment; the other 4 dropped out due to the strain of data collection during their classes. Classes of the participating teachers were divided randomly between the control and nudge conditions. Five classes (2 in VET and 3 in HVE) totaling 80 students participated in the nudge condition, and four classes (2 in VET and 2 in HVE) totaling 92 students in the control condition. The average class contained 19 students. Students were excluded if they missed more than half of the lessons. Of the participants, 88 were VET students and 84 were HVE students. The participating VET students followed an educational track focused on ICT, and the experiment was held within their Dutch courses. The HVE students were teachers-in-training and followed courses on professional development and didactic skills. Individual demographics were not collected, but the students' age range in these educational tracks is 18 to 21 years.
Design. To promote asking questions, a class-wide nudge was designed. Every lesson, the teacher set a goal: all students should aim to ask at least one question. This goal was explicitly mentioned at the start of class, along with the fact that there was no negative consequence for not achieving it. This type of goal-setting fits the criteria for a nudge [37], as its proposed effectiveness is due to mechanisms of goal commitment, not external rewards or punishments.
Measures. To measure asking questions, the teacher unobtrusively marked the number of questions an individual student asked in each lesson on the attendance sheet. All questions were counted, apart from necessary procedural questions (e.g., “Can I go to the bathroom?” or “Is my microphone unmuted?”).
Procedure. At the start of each lesson, the nudge classes were reminded of the goal to ask at least one question per lesson. In both conditions, during the lesson the teacher registered the number of questions asked by each student in the class. Figure 6 contains an overview of the timepoints of intervention and data collection.
Results. Ten students were dropped for missing more than half of the sessions, leaving 162 students. Of these students, 88 were in the control condition (of which 38 VET-students and 50 HVE-students) and 74 in the nudge condition (of which 43 VET-students and 31 HVE-students).
Data were collected per lesson, resulting in a total of 1296 data points. Students in the nudge condition (M = 1.49, SD = 1.23) asked 0.25 more questions per lesson per student than students in the control condition (M = 0.99, SD = 0.79). In other words, each student asked one more question every four lessons. Statistically, this was a positive trend (Estimate = 0.25, SE = 0.14, t(36.43) = 1.80, p = 0.08).
Additionally, we investigated the effect of the goal-setting nudge on students' grades. The final average grades of a subset of five classes were available for this analysis. All students in this analysis followed the same course with the same test. Students in the nudge condition (M = 6.98, SD = 0.98) scored higher (Estimate = 0.69, SE = 0.24, t(57) = 2.79, p < 0.01) than those in the control condition (M = 6.29, SD = 0.92). No direct effect of the number of questions on the final grade was found (Estimate = −0.03, SE = 0.11, t(55.74) = 0.30, p = 0.76).
In an exploratory fashion, we investigated whether the increase in questions was due to already active students asking more questions ("the usual suspects") or to the nudge helping students ask questions where they previously would have remained quiet. In the average lesson, 60.5% of students in the control condition (SD = 24.3%) asked a question, compared to 70.7% in the nudge condition (SD = 19.8%). In an exploratory generalized linear mixed-effects model analysis, we reduced the dependent variable to a binary factor (0 = no questions, 1 = one or more questions). We found no significant effect of condition (Estimate = −0.10, SE = 0.15, z(779) = 0.62, p = 0.54), indicating no evidence that the intervention increased the number of students asking questions.
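This exploratory analysis corresponds to a logistic mixed model; a minimal sketch, assuming hypothetical names (n_questions for the per-lesson question count):

library(lme4)

# Binary outcome: did the student ask at least one question this lesson?
d$asked_any <- as.integer(d$n_questions >= 1)

# Logistic mixed model with students nested within classes
m_bin <- glmer(asked_any ~ condition + (1 | class/student),
               data = d, family = binomial)
summary(m_bin)  # Wald z-test for the condition effect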
Discussion. Regarding the number of questions students asked, we found a marginal positive effect of the nudge. In practice, this meant students went on average from 1 to 1.25 questions per lesson in the nudge condition, so teachers received about 25% more questions in their classes. In a course of 8 lessons with 20 students, this amounts to 40 extra questions. This corresponds with the teachers' evaluation of the nudge in the complementary interviews, and with the higher grades of nudged students compared with those in the control condition. Teachers said the nudge resulted in an atmosphere where asking questions was normal and desirable. Given the size of the effect and its correspondence with the other findings, we are inclined to see this as a promising result of the nudge. It should be noted that the increase in the number of questions asked was not paired with an increase in the number of students asking questions: in an exploratory analysis, we found no increase in the number of students who asked at least one question each lesson. So, although we find evidence that the nudge helped students ask more questions, our findings suggest that it did not prompt students who normally would not ask a question to start doing so.
Although no direct effect of the number of questions on the final grade was found, the nudge did increase students' grades. A possible explanation is that the extra questions led to extra answers from the teachers, which could have benefited all students in class, irrespective of who asked the questions. Alternatively, the nudge may have contributed to a classroom climate that is productive for learning (and achievement), also for students who do not ask questions. A final explanation is that the teacher was not blind to the condition the students were in, and that this, consciously or subconsciously, positively influenced their attitude towards the nudged classes. This possible bias is difficult to avoid in teacher-based interventions, where teachers need to be aware of the different conditions. We believe the risk of this bias is small, because we did not find an effect of the other nudges on teacher-measured outcomes.
Due to the nested nature of the data, it is necessary to repeat this research with a larger and more diverse population to provide a more definitive answer about the effect of the nudge. Additionally, although the nudge increased the number of questions asked, we found no evidence of more students asking questions. Future research could identify whether different nudges could also help these students ask questions.
An alternative explanation to consider is class size: the average class size in the nudge condition was 16 students, compared with an average of 22 in the control condition. Class size is an important factor in class interaction (e.g., [77,78]), with larger groups experiencing lower student participation and interaction. As the nudged classes were on average smaller, this could partly explain the effect of the nudge. However, all but one of the participating classes fall within the established "medium" class size of 15 to 34 students [79] (p. 10), for which class interaction shows similar patterns. This leaves the possible explanatory power of class size limited.
Lastly, this experiment is limited by its quantitative approach to asking questions. Students' questions are reduced to a count per student per class, while no two questions are the same. However, assessing how each question fits in a student's learning process is very subjective and was simply not feasible in our experiment. Future research could repeat this experimental setup with a more qualitative approach and investigate the type and quality of the questions asked, or interview teachers thoroughly about this aspect.

3. Discussion

Autonomous learning behavior is important, but students often do not show this behavior. In this study, we investigated whether nudging is an effective teaching strategy to increase autonomous learning behavior in students. We developed an intervention design method and used it to design four nudges for three autonomous learning behaviors: planning, preparing for class, and asking questions. These behaviors represent the three phases of Zimmerman's self-regulation model [45] and are also behaviors that teachers find important and seek support in improving. The nudges were tested in four field experiments. We designed two nudges (reminders and a video booth) to increase planning, a checklist to increase preparation for class, and a goal-setting nudge to increase asking questions. Given the mixed results, it is difficult to provide a clear-cut answer to the research question of whether nudging as a teaching strategy can support students' autonomous learning behavior. A similar range of effectiveness of nudge interventions can be found in Damgaard and Nielsen's general overview [28] of nudging in education. Based on our findings, we can provide more insight into which nudges are effective for promoting autonomous learning behavior, and which kinds of autonomous learning behavior are susceptible to being nudged. A limitation should be stressed beforehand: all studies experienced teacher attrition due to the added work pressure caused by the pandemic. Although attrition was not related to educational track or level, it is possible that the remaining teachers in our sample were less busy or more motivated than their colleagues. We believe the impact of this attrition to be limited, as the measurements used throughout the experiments are not centered on teachers or teacher characteristics. The experiments using randomization at the class level (the checklist and goal-setting nudges) used multiple measurements per individual student. The video booth nudge experiment, which used one cumulative measurement of the dependent variable per student, was randomized at the individual level. All reported dependent variables are therefore individual-based rather than class-based.
Of the three behaviors investigated, nudging students to ask questions seems the most promising. Despite the small sample size [80], the results indicated that the nudge helped students ask marginally more questions and resulted in higher grades. A nudge that promotes students generating and asking questions can be a useful supporting tool for teachers and students, as this behavior is essential for successful self-regulated learning [48,81], and students often do not show it [54,82,83]. Generating questions benefits students academically, and has also been found to motivate them [84] and reduce their test anxiety [85]. Especially higher-order generated questions are of great importance for students to achieve their learning goals [85,86]. For an overview of the importance of student-generated questions, see [83]. In addition, a question that is asked enriches the learning of not only the help-seeking student, but potentially also that of other students in the (online) classroom. This was reflected by the modest increase in average grade found for all students after the intervention, reaffirming the nudge's positive effect despite its marginal significance. Moreover, the positive feedback loop and possible individual "spillover behaviors" can strengthen the behavior [87]. A student who receives a satisfying answer when asking a question is likely to ask more questions, which can "spill over" into asking questions out of class or demonstrating other autonomous learning behaviors. This could make the initial increase in asking questions "snowball" into more help-seeking behavior or even more autonomous learning behavior in general. Additionally, as grades also went up for students who did not ask more questions, the nudge helps students indirectly as well. Ultimately, while promising, the effect of the nudge and its possible spillover remain relatively small when assessing its practical usefulness. Neither the nudge nor its possible spillover is sufficient to improve autonomous learning behavior in students on a large scale, but the nudge can be a helpful tool in achieving small steps towards the desired behavior. This means that the theory of change described in Figure 1 should be interpreted as having a small effect size, making it unlikely that the nudge intervention will single-handedly achieve larger long-term effects, like increased graduation.
Unfortunately, not all nudges were equally successful. We found no effect of the checklist nudge on preparing for class, and we could not determine an effect of the reminders. Although the video booth nudge did not show an effect on missed deadlines, exploratory analysis revealed that nudged students (unsuccessfully) tried to improve their planning, as they reported more planning behavior. Such a first step matters for a complex skill like planning, and for self-regulation in general, and is why Ruggeri et al. [40] and Weijers et al. [29] warn against overreliance on end results alone when discussing the success of behavioral interventions in education. This nudge could be helpful for improving planning behavior, perhaps combined with explicit support (as in [12]). When students are nudged into improving their planning behavior, they also need explicit support and instruction on how to do so successfully, because nudging students towards behavior they do not know how to perform will ultimately not work.
An explanation for the difference in effectiveness of the nudges can be sought in the type of targeted behavior. The more successful nudge targeted in-class behavior (asking questions), while the less successful nudge targeted out-of-class behavior (preparing for class). The nudge targeting planning behavior, which takes place both in-class and out-of-class, was a partial success. The extra autonomy that a nudge gives compared to external regulation could be a step too far for students in the case of out-of-class behaviors, as they are still developing their autonomous learning skills and need more support. Indeed, Oreopoulos and Petronijevic [32] found in a large sample of students that nudges were not successful in changing out-of-class behavior, like spending extra time studying. On the flip side, in-class behavior has the benefit that teachers can support it more directly than out-of-class behavior: the longer the time between the intervention and the targeted behavior, the less effective the nudge [88]. Future research should aim to improve in-class behaviors and find out how nudges, perhaps combined with other techniques, can be used to improve out-of-class behaviors that are important for autonomous learning.
A different explanation for the difference in effectiveness can be sought in the characteristics of individual students, in accordance with the concept of "nudgeability" [89], which describes how individual factors can determine susceptibility to a nudge. In our research, the goal-setting nudge seemed to work only for students who were already active during class. A possible driver of this difference between students could be academic motivation or engagement: a different educational study, in which class attendance was nudged, had similar findings, with the nudge having a different effect on students depending on the degree to which they already attended class [90]. Other predictors, like gender (see, e.g., [91]), age, or educational level, should also be considered, which could be an important line for future research.
Taking the three nudged behaviors as representations of the three phases of self-regulated learning [45], we find suggestive evidence for a possible effect of nudging behavior in the forethought phase. We should be careful drawing conclusions here, as we investigated only one possible representative behavior for each phase. Autonomous learning behavior is a broad concept, and what it looks like can differ between ages, developmental stages, learning needs, and even between students [7]. Of course, the three researched behaviors (asking questions, preparing for class, and planning) are not fully representative of the entire spectrum of possible autonomous learning behaviors, but we believe they cover a large part of the relevant behaviors for students and represent the different aspects of autonomous learning behavior [45] well. To obtain a fuller picture of nudging on self-regulated learning, future studies could investigate other behaviors associated with self-regulated learning. For example, further research could investigate nudging to promote goal-setting (as in [92]), linked to the forethought phase [45]. Similarly, researchers could design a nudge to support students in paying attention or removing distractions, behaviors linked to the performance phase [45]. Alternatively, future research could dive deeper into nudging self-reflective behavior, targeting behaviors like incorporating feedback or performing self-evaluation [48]. All alternative behaviors mentioned in this paragraph have been indicated by VET and HVE teachers as behaviors they lack the tools to support successfully [34].
Lastly, a possible limitation of the study is spillover between conditions, as all participants were students at the same school and often in the same educational track. Nevertheless, we think there is little room for spillover to have affected the results. The goal-setting nudge was delivered by the teacher in class, making spillover to other classes very unlikely. The effect of the video booth nudge is achieved by answering the questions on the cue cards, and moreover, the students were asked to keep the content to themselves. Finally, students could potentially share their checklist with students in the control condition, but as different classes had different due dates and assignments, the checklist would be of little use to students from other classes.

4. Implications

The current study holds several important implications. In multiple ways, this study can be considered a novel approach. It outlines a new method to design behavioral interventions based on design patterns [56] and stakeholder involvement [57]. Using rapid redesign cycles, researchers collaborated with stakeholders to create nudge interventions that are grounded in scientific theory and also relevant and applicable in practice. Despite the mixed results, we believe these design principles remain important when designing an impactful nudge for educational practice. Future research could investigate the effectiveness of this method as a behavioral intervention method, as much theory has been developed on which aspects are important during nudge design [55,93] but little on what makes an effective intervention design session [94].
Furthermore, our experiments indicate that some types of autonomous learning behavior are more likely to be influenced by nudging than others, although more research is needed to substantiate this. Specifically, our results suggest that teachers designing a nudge may find more potential in targeting in-class behaviors than out-of-class behaviors. This corresponds with earlier findings (e.g., [32]) that nudges focused on continuous behavior outside the classroom are not very successful. Additionally, teachers should target behaviors that students are capable of performing, or otherwise provide the support students need to perform the behavior. First among these behaviors is asking questions, which has already been shown, on a small scale, to be susceptible to a nudging intervention.
Additionally, our study is one of the first to investigate nudging in education by measuring actual student behavior in the classroom rather than behavioral proxies such as grades or enrollment (see [28]). Although often more difficult to obtain, behavioral measurements combined with learning outcomes should receive more attention in nudging research, especially in education [29,40]. Likewise, a focus of this study was to further investigate the effect of nudges on students, as it is especially important in education to investigate the possible underlying processes that nudges affect [29]. We included a questionnaire (see Appendix A) measuring self-regulatory learning behavior [95], perceived student autonomy [5], behavioral and agentic engagement [96,97], and academic motivation [98]. Unfortunately, low response rates rendered these data unusable for the research. Future studies should attempt to include these, or other, relevant measures when investigating nudging in education. To increase the response rate, these variables could be investigated separately to avoid survey fatigue, or the surveys could be structurally administered during class time. Additionally, such surveys could reveal how student characteristics may determine the effectiveness of a nudge.
Our study also demonstrates how nudges can be implemented as a teaching strategy even under challenging circumstances. Specifically, this study produced behavior changes in students under extraordinary circumstances for both teachers and students. Teachers, facing the extra pressure and difficulties of online teaching during the COVID-19 pandemic, were able to use nudges successfully to change autonomous learning behavior. This stands in sharp contrast with previous successful interventions, which often were not applied due to time constraints [16], even under regular teaching circumstances without the additional pressure caused by the pandemic.

5. Conclusions

In this study, we investigated nudging as a possible tool to promote autonomous learning behavior in students, targeting three important behaviors: planning, preparing for class, and asking questions. We found mixed results: nudging increased question asking and learning outcomes, and we found evidence for an increase in planning behavior among nudged students, but no effect of nudging on preparing for class or deadline adherence. Our findings suggest that nudging has potential as a tool for improving some forms of autonomous learning behavior. Additionally, the nudges proved quick and easy to implement, even given the time constraints and pressure teachers faced during the experiment due to COVID-19. Our mixed findings across behaviors allow us to speculate about the characteristics of autonomous learning behaviors that are more likely to be affected by nudging: in-class behaviors that are within reach for a student. Future research should focus on designing new nudges for different types of autonomous learning behaviors. When this is done successfully, teachers can use nudges as an additional teaching strategy to support students in becoming more autonomous learners, helping them both during their academic career and after graduation.

Author Contributions

Conceptualization, R.W., B.d.K., Y.V. and F.P.; Methodology, R.W., B.d.K. and Y.V.; Validation, R.W.; Formal Analysis, R.W.; Investigation, R.W. and Y.V.; Data Curation, R.W.; Writing—Original Draft Preparation, R.W.; Writing—Review & Editing, B.d.K., Y.V. and F.P.; Visualization, R.W.; Supervision, B.d.K., Y.V. and F.P.; Project Administration, R.W., B.d.K. and Y.V.; Funding Acquisition, B.d.K. and Y.V. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Dutch Organization for Scientific Research (NWO) [grant number 40.5.18540.132].

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethical Review Committee of the Department of Psychology, Education, and Child Studies, Erasmus University Rotterdam (application 19-052), on 10 February 2020.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are openly available in the Open Science Framework.

Conflicts of Interest

The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Appendix A

This study was conducted in 2020 and was affected by the COVID-19 pandemic and the subsequent measures taken by the Dutch government. The four experiments in this study were initially scheduled for February–May 2020. Unfortunately, all participating schools were forced to close during this period, and the experiments were terminated early. The experiments included a survey measuring several constructs that being nudged could affect, to investigate whether the nudges influenced these factors. The measures included in the survey were self-regulatory learning behavior (Self-Regulation of Learning Self-Report Scale [99]), student autonomy (Index of Autonomous Functioning [5]), behavioral engagement (behavioral engagement measure [96]), agentic engagement (agentic engagement measure [97]), and motivation (Academic Self-Regulation Questionnaire [98]). All teachers distributed a pre-intervention survey among the participating students. After the experiments were terminated, the collected pre-intervention surveys were used as a pilot, and the Self-Regulation of Learning Self-Report Scale [99] was replaced with the Motivated Strategies for Learning Questionnaire [100] due to a lack of variance in the answers. The analysis of this pilot can be found in the additional materials.
The experiments resumed in September–November 2020. Preregistration was finalized in October 2020, prior to accessing the data. Because classes could be taught online, the nudges had to be redesigned for use in a fully online or hybrid classroom; ultimately, the interventions took place in a physical classroom. A short additional design round was held online for each group. The nudge created for asking for help was redesigned by repeating the second design session; the nudges created for preparing for class and for planning remained the same, resulting in the final experimental setup. The improved survey was again implemented: students filled in a pre-measure at the start of the course and a post-measure at the end, with teachers distributing the link to the online survey. However, the length of the survey proved an obstacle to the response rate. Across conditions, the response rate for the pre- and especially the post-measurement was so low (between 0% and 15% per experiment) that any conclusions drawn from these data would not be representative of the student sample. These questionnaires were therefore not analyzed and ultimately not included in the study.

Appendix B

In this experiment, we aimed to improve both deadline completion and self-reported planning behavior through a reminder nudge. The specific hypotheses were as follows:
Hypothesis A1a.
Students who received the reminder nudge would report more planning behavior than students in the control condition.
Hypothesis A1b.
Students who received the reminder nudge would meet their deadlines more often than those in the control condition.
Participants. Of the 5 recruited teachers, 4 ultimately participated in the experiment. Classes of the participating teachers were randomly divided between the control and nudge conditions. Four classes totaling 89 students participated in the nudge condition, and four classes totaling 98 students participated in the control condition. The average class contained 23 students. All participants were VET students in the tourism and recreation sector.
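As an illustration, the sketch below shows what this cluster randomization could look like in R, the language used for the study's analyses [61]; the class labels and the seed are invented for the example and do not reflect the study's actual assignment procedure.

```r
# Hypothetical illustration of randomly dividing eight classes (the clusters)
# between the control and nudge conditions; labels and seed are invented.
set.seed(42)  # arbitrary seed, for reproducibility of the example
classes <- paste0("class_", 1:8)

# Randomly assign four classes to the nudge condition and four to control.
assignment <- sample(rep(c("nudge", "control"), each = 4))
data.frame(class = classes, condition = assignment)
```

Randomizing whole classes rather than individual students keeps the intervention practical for teachers, but it means students within a class are not independent observations, which is why the analysis below uses a mixed-effects model.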
Design. In this experiment, reminders were used as a nudge to improve planning behavior and academic results. Based on Graham et al.'s suggestion for "educators to think carefully not just about the message they are trying to convey, but also the mode of communication" [37] (p. 43) when using reminders, the content, frequency, and mode of communication of the messages were determined by the teachers and approved by the researchers. Given the online context of the experiment, students were reminded weekly by their teacher of the upcoming deadlines for the course in a separately sent email.
Measures. Two outcome measures were used in this experiment. To measure adherence to planning, a short questionnaire was designed to accompany the final test or deadline. It consisted of four questions about a student's planning adherence for that test or deadline, answered on a seven-point scale; a sample question is "I was able to do all preparation I wanted to do for this [deadline/test]". This questionnaire can be found in the additional materials. As an additional measure, the final grade of the course was collected.
Results. Because the pandemic and government measures intensified near the end of the experimental period, the test had to be moved online at short notice, and the participating teachers saw no possibility to present their students with the planning questionnaire intended to accompany it. Hypothesis A1a therefore could not be tested.
For Hypothesis A1b: of the 187 students, only 7 (3.7%) missed their deadline, 6 of whom were in the same class. Complementary teacher interviews revealed that this class experienced mentor difficulties during the study period. This makes a comparison between the experimental groups based on the number of missed deadlines uninformative.
Lastly, the groups were compared based on their course grade. As the average grade in the control condition (M = 6.35, SD = 1.32) was identical to that in the nudge condition (M = 6.35, SD = 1.68), the analysis unsurprisingly revealed no significant effect of the reminders on course grade: Estimate = 0.00, SE = 0.27, t(7.89) = 0.02, p = 0.99.
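For readers who wish to reproduce this type of analysis, the following is a minimal R sketch of such a mixed-effects comparison using the afex package cited for the analyses [63]; the data frame `grades` and its columns (`grade`, `condition`, `class`) are hypothetical names for illustration, not the study's actual variables.

```r
# Minimal sketch of the grade comparison: a fixed effect of condition
# (control vs. nudge) with a random intercept per class, because students
# are nested within classes. method = "S" requests Satterthwaite degrees
# of freedom, which yields fractional df of the kind reported above
# (t(7.89)).
library(afex)

model <- mixed(grade ~ condition + (1 | class), data = grades, method = "S")
summary(model)  # fixed-effect estimate, SE, t, and p for condition
```

The random intercept for class accounts for the cluster randomization described above, in which whole classes rather than individual students were assigned to conditions.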
Discussion. Based on the available data, little can be concluded about the effect of the reminder nudge: both students who did and did not receive the nudge were generally very capable of meeting their deadlines, and the lack of completed questionnaires makes it impossible to assess self-reported planning behavior. In short, this experiment is inconclusive about the use of reminders to promote planning behavior. Future research could repeat this experiment with a larger sample and a target student population prone to missing deadlines.

References

  1. Brockmann, M.; Clarke, L.; Winch, C. Knowledge, skills, competence: European divergences in vocational education and training (VET)—The English, German and Dutch cases. Oxf. Rev. Educ. 2008, 34, 547–567. [Google Scholar] [CrossRef]
  2. Vancouver, J.B.; Halper, L.R.; Bayes, K.A. Regulating our own learning. In Autonomous Learning in the Workplace; Ellingson, J.E., Noe, R.A., Eds.; Routledge: New York, NY, USA, 2017; pp. 95–116. [Google Scholar] [CrossRef]
  3. Kormos, J.; Csizér, K. The Interaction of Motivation, Self-Regulatory Strategies, and Autonomous Learning Behavior in Different Learner Groups. Tesol Q. 2014, 48, 275–299. [Google Scholar] [CrossRef]
  4. Reeve, J.; Ryan, R.; Deci, E.L.; Jang, H. Understanding and promoting autonomous self-regulation: A self-determination theory perspective. In Motivation and Self-Regulated Learning: Theory, Research and Applica-Tions; Schunk, D.H., Zimmerman, B.J., Eds.; Lawrence Erlbaum: New York, NY, USA, 2008; pp. 223–244. [Google Scholar]
  5. Weinstein, N.; Przybylski, A.K.; Ryan, R.M. The index of autonomous functioning: Development of a scale of human autonomy. J. Res. Pers. 2012, 46, 397–413. [Google Scholar] [CrossRef]
  6. Little, D. Learner Autonomy and Second/Foreign Language Learning. In The Guide to Good Practice for Learning and Teaching in Languages, Linguistics and Area Studies. 2002. Available online: https://www.researchgate.net/publication/259874624_Learner_autonomy_and_second_foreign_language_learning (accessed on 23 December 2022).
  7. Little, D.G. Learner autonomy: Definitions, issues and problems. In Authentik Language Learning Resources; Authentik: Dublin, Ireland, 1991. [Google Scholar]
  8. Wijnia, L.; Loyens, S.M.M.; Derous, E.; Schmidt, H.G. Do students’ topic interest and tutors’ instructional style matter in problem-based learning? J. Educ. Psychol. 2014, 106, 919–933. [Google Scholar] [CrossRef]
  9. Dignath, C.; Büttner, G. Components of fostering self-regulated learning among students. A meta-analysis on intervention studies at primary and secondary school level. Metacognition Learn. 2008, 3, 231–264. [Google Scholar] [CrossRef]
  10. Harris, K.R.; Lane, K.L.; Graham, S.; Driscoll, S.A.; Sandmel, K.; Brindle, M.; Schatschneider, C. Practice-Based Professional Development for Self-Regulated Strategies Development in Writing: A randomized controlled study. J. Teach. Educ. 2012, 63, 103–119. [Google Scholar] [CrossRef]
  11. Zimmerman, B.J. Self-Regulated Learning and Academic Achievement: An Overview. Educ. Psychol. 1990, 25, 3–17. [Google Scholar] [CrossRef]
  12. Reeve, J.; Cheon, S.H. Autonomy-supportive teaching: Its malleability, benefits, and potential to improve educational practice. Educ. Psychol. 2021, 56, 54–77. [Google Scholar] [CrossRef]
  13. Dochy, F.; Berghmans, I.; Koenen, A.-K.; Segers, M. Bouwstenen voor High Impact Learning: Het Leren van de Toekomst in Onderwijs en Organisaties [Building Blocks for High Impact Learning: The Learning of the Future in Education and Organisations]; Boom: Amsterdam, The Netherlands, 2016. [Google Scholar]
  14. Social and Economic Council. Toekomstgericht Beroepsonderwijs. Deel 2 Voorstellen Voor een Sterk en Innovatief Beroepsonderwijs Advies 15/09 [Future-Oriented Vocational Education. Part 2 Proposals for Strong and Innovative Vocational Education]; Social Economic Council: Den Haag, The Netherlands, 2017.
  15. de Bruijn, E.; Leeman, Y. Authentic and self-directed learning in vocational education: Challenges to vocational educators. Teach. Teach. Educ. 2011, 27, 694–702. [Google Scholar] [CrossRef]
  16. Dignath, C.; Veenman, M.V.J. The Role of Direct Strategy Instruction and Indirect Activation of Self-Regulated Learning—Evidence from Classroom Observation Studies. Educ. Psychol. Rev. 2021, 33, 489–533. [Google Scholar] [CrossRef]
  17. Lewthwaite, R.; Chiviacowsky, S.; Drews, R.; Wulf, G. Choose to move: The motivational impact of autonomy support on motor learning. Psychon. Bull. Rev. 2015, 22, 1383–1388. [Google Scholar] [CrossRef] [PubMed]
  18. Ryan, R.M.; Deci, E.L. Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. Am. Psychol. 2000, 55, 68–78. [Google Scholar] [CrossRef] [PubMed]
  19. Collins, A.; Brown, J.S.; Newman, S.E. Cognitive apprenticeship: Teaching the crafts of reading, writing, and mathematics. In Knowing, Learning and Instruction; Resnick, L.B., Ed.; Essays in honor of Robert Glaser; Lawrence Erlbaum Associates: Hillsdale, NJ, USA, 1989; pp. 453–495. [Google Scholar] [CrossRef]
  20. Lock, J.; Eaton, S.E.; Kessy, E. Fostering self-regulation in online learning in K-12 education. Northwest J. Teach. Educ. 2017, 12, 2–13. [Google Scholar] [CrossRef]
  21. Jossberger, H.; Brand-Gruwel, S.; Boshuizen, H.; van de Wiel, M. The challenge of self-directed and self-regulated learning in vocational education: A theoretical analysis and synthesis of requirements. J. Vocat. Educ. Train. 2010, 62, 415–440. [Google Scholar] [CrossRef]
  22. Thaler, R.H.; Sunstein, C.R. Nudge: Improving Decisions about Health, Wealth, and Happiness; Penguin: London, UK, 2008. [Google Scholar]
  23. Thaler, R.H.; Benartzi, S. Save More Tomorrow™: Using Behavioral Economics to Increase Employee Saving. J. Politi-Econ. 2004, 112, S164–S187. [Google Scholar] [CrossRef]
  24. Mertens, S.; Herberz, M.; Hahnel, U.J.J.; Brosch, T. The effectiveness of nudging: A meta-analysis of choice architecture interventions across behavioral domains. Proc. Natl. Acad. Sci. USA 2022, 119. [Google Scholar] [CrossRef]
  25. Stanovich, K.E. Who is Rational? Studies of Individual Differences in Reasoning; Psychology Press: New York, NY, USA, 1999. [Google Scholar]
  26. Kahneman, D. Thinking, Fast and Slow; Penguin: London, UK, 2011. [Google Scholar]
  27. Szaszi, B.; Palinkas, A.; Palfi, B.; Szollosi, A.; Aczel, B. A Systematic Scoping Review of the Choice Architecture Movement: Toward Understanding When and Why Nudges Work. J. Behav. Decis. Mak. 2018, 31, 355–366. [Google Scholar] [CrossRef] [Green Version]
  28. Damgaard, M.T.; Nielsen, H.S. Nudging in education. Econ. Educ. Rev. 2018, 64, 313–342. [Google Scholar] [CrossRef]
  29. Weijers, R.J.; de Koning, B.B.; Paas, F. Nudging in education: From theory towards guidelines for successful implementation. Eur. J. Psychol. Educ. 2021, 36, 883–902. [Google Scholar] [CrossRef]
  30. Castleman, B.L.; Page, L.C. Summer nudging: Can personalized text messages and peer mentor outreach increase college going among low-income high school graduates? J. Econ. Behav. Organ. 2015, 115, 144–160. [Google Scholar] [CrossRef]
  31. Oreopoulos, P.; Ford, R. Keeping College Options Open: A Field Experiment to Help all High School Seniors Through the College Application Process. J. Policy Anal. Manag. 2019, 38, 426–454. [Google Scholar] [CrossRef] [Green Version]
  32. Oreopoulos, P.; Petronijevic, U. The Remarkable Unresponsiveness of College Students to Nudging and What We Can Learn from It (No. w26059). Natl. Bur. Econ. Res. 2019. [Google Scholar] [CrossRef]
  33. Heckhausen, J.E.; Heckhausen, H.E. Motivation and Action; Cambridge University Press: Cambridge, UK, 2008. [Google Scholar]
  34. Weijers, R.J.; de Koning, B.B.; Klatter, E.B.; Paas, F. How Do Teachers in Vocational and Higher Education Nudge Their Students? A Qualitative Study. Forthcoming. Manuscript submitted for publication.
  35. Wachner, J.; Adriaanse, M.; De Ridder, D. The influence of nudge transparency on the experience of autonomy. Compr. Results Soc. Psychol. 2020. Advance online publication. [Google Scholar] [CrossRef]
  36. Vermunt, J.D.; Vermetten, Y.J. Patterns in Student Learning: Relationships Between Learning Strategies, Conceptions of Learning, and Learning Orientations. Educ. Psychol. Rev. 2004, 16, 359–384. [Google Scholar] [CrossRef]
  37. Graham, A.; Toon, I.; Wynn-Williams, K.; Beatson, N. Using ‘nudges’ to encourage student engagement: An exploratory study from the UK and New Zealand. Int. J. Manag. Educ. 2017, 15, 36–46. [Google Scholar] [CrossRef] [Green Version]
  38. Texas Education Agency. Clarification of Support vs. Cueing and Prompting Terms. 2011. Available online: https://tea.texas.gov/sites/default/files/staaralt-ClarSupportCuePrompt.pdf#:~:text=The%20difference%20between%20a%20cue,leading%20to%20a%20direct%20answer (accessed on 23 December 2022).
  39. Han, Y.J. Successfully flipping the ESL classroom for learner autonomy. NYS Tesol J. 2015, 2, 98–109. Available online: http://journal.nystesol.org/jan2015/Han_98-109_NYSTJ_Vol2Iss1_Jan2015.pdf (accessed on 23 December 2022).
  40. Ruggeri, K. (Ed.) Behavioral Insights for Public Policy: Concepts and Cases; Routledge: London, UK, 2019. [Google Scholar]
  41. Lavecchia, A.; Liu, H.; Oreopoulos, P. Behavioral Economics of Education: Progress and Possibilities. In Handbook of the Economics of Education; Elsevier: Amsterdam, The Netherlands, 2016; Volume 5, pp. 1–74. [Google Scholar] [CrossRef]
  42. Hansen, P.G. The Definition of Nudge and Libertarian Paternalism: Does the Hand Fit the Glove? Eur. J. Risk Regul. 2016, 7, 155–174. [Google Scholar] [CrossRef] [Green Version]
  43. Levy, Y.; Ramim, M.M. An experimental study of habit and time incentive in online-exam procrastination. In Proceedings of the Chais Conference on Instructional Technologies Research 2013: Learning in the Technological Era, Raanana, Israel, 19–20 February 2013; pp. 53–61. [Google Scholar]
  44. Patterson, R.W. Can behavioral tools improve online student outcomes? Experimental evidence from a massive open online course. J. Econ. Behav. Organ. 2018, 153, 293–321. [Google Scholar] [CrossRef]
  45. Zimmerman, B.J. Becoming a Self-Regulated Learner: An Overview. Theory Into Pract. 2002, 41, 64–70. [Google Scholar] [CrossRef]
  46. Wong, T.-L.; Xie, H.; Zou, D.; Wang, F.L.; Tang, J.K.T.; Kong, A.; Kwan, R. How to facilitate self-regulated learning? A case study on open educational resources. J. Comput. Educ. 2020, 7, 51–77. [Google Scholar] [CrossRef]
  47. Eilam, B.; Aharon, I. Students’ planning in the process of self-regulated learning. Contemp. Educ. Psychol. 2003, 28, 304–334. [Google Scholar] [CrossRef]
  48. Pintrich, P.R. A Conceptual Framework for Assessing Motivation and Self-Regulated Learning in College Students. Educ. Psychol. Rev. 2004, 16, 385–407. [Google Scholar] [CrossRef] [Green Version]
  49. Dobson, J.L. The use of formative online quizzes to enhance class preparation and scores on summative exams. Adv. Physiol. Educ. 2008, 32, 297–302. [Google Scholar] [CrossRef] [Green Version]
  50. Jones, R.C. The “Why” of Class Participation: A Question Worth Asking. Coll. Teach. 2008, 56, 59–63. [Google Scholar] [CrossRef]
  51. Jansen, R.S.; van Leeuwen, A.; Janssen, J.; Conijn, R.; Kester, L. Supporting learners’ self-regulated learning in Massive Open Online Courses. Comput. Educ. 2020, 146, 103771. [Google Scholar] [CrossRef]
  52. Pearson, J.C.; West, R. An initial investigation of the effects of gender on student questions in the classroom: Developing a descriptive base. Commun. Educ. 1991, 40, 22–32. [Google Scholar] [CrossRef]
  53. Gurtner, J.-L.; Cattaneo, A.; Motta, E.; Mauroux, L. How Often and for What Purposes Apprentices Seek Help in Workplaces: A Mobile Technology-Assisted Study. Vocat. Learn. 2011, 4, 113–131. [Google Scholar] [CrossRef] [Green Version]
  54. Weijers, R.J. Nudging towards Autonomy: The Effect of Nudging on Autonomous Learning Behavior in Tertiary Education. Doctoral Dissertation, Erasmus University Rotterdam, Rotterdam, The Netherlands, 2022. [Google Scholar]
  55. Hansen, P.G. BASIC: Behavioural Insights Toolkit and Ethical Guidelines for Policy Makers; OECD Publishing: Berlin, Germany, 2018. [Google Scholar] [CrossRef]
  56. Anderson, T.; Shattuck, J. Design-based research: A decade of progress in education research? Educ. Res. 2012, 41, 16–25. [Google Scholar] [CrossRef] [Green Version]
  57. Morton, K.L.; Atkin, A.J.; Corder, K.; Suhrcke, M.; Turner, D.; van Sluijs, E.M.F. Engaging stakeholders and target groups in prioritising a public health intervention: The Creating Active School Environments (CASE) online Delphi study. BMJ Open 2017, 7, e013340. [Google Scholar] [CrossRef] [Green Version]
  58. Behavioural Insights Team. EAST: Four Simple Ways to Apply Behavioural Insights; Cabinet Office: London, UK, 2014. [Google Scholar]
  59. Behavioural Insights Team. EAST Cards; Cabinet Office: London, UK, 2017. [Google Scholar]
  60. Barr, D.J.; Levy, R.; Scheepers, C.; Tily, H.J. Random effects structure for confirmatory hypothesis testing: Keep it maximal. J. Mem. Lang. 2013, 68, 255–278. [Google Scholar] [CrossRef] [Green Version]
  61. R Core Team. R: A Language and Environment for Statistical Computing; R Foundation for Statistical Computing: Vienna, Austria, 2020; Available online: https://www.R-project.org/ (accessed on 23 December 2022).
  62. Bates, D.; Mächler, M.; Bolker, B.; Walker, S. Fitting Linear Mixed-Effects Models Using lme4. J. Stat. Softw. 2015, 67, 1–48. [Google Scholar] [CrossRef]
  63. Singmann, H.; Bolker, B.; Westfall, J.; Aust, F.; Ben-Shachar, M.S. afex: Analysis of Factorial Experiments (R Package, Version 0.27–2) [Computer Software]. Available online: https://CRAN.R-project.org/package=afex (accessed on 23 December 2022).
  64. Aronson, E. The power of self-persuasion. Am. Psychol. 1999, 54, 875–884. [Google Scholar] [CrossRef]
  65. Glock, S.; Müller, B.C.N.; Ritter, S.M. Warning labels formulated as questions positively influence smoking-related risk perception. J. Health Psychol. 2013, 18, 252–262. [Google Scholar] [CrossRef] [PubMed]
  66. Mussweiler, T.; Neumann, R. Sources of Mental Contamination: Comparing the Effects of Self-Generated versus Externally Provided Primes. J. Exp. Soc. Psychol. 2000, 36, 194–206. [Google Scholar] [CrossRef] [Green Version]
  67. Oettingen, G.; Mayer, R.; Sevincer, A.T.; Stephens, E.J.; Pak, H.-J.; Hagenah, M. Mental Contrasting and Goal Commitment: The Mediating Role of Energization. Pers. Soc. Psychol. Bull. 2009, 35, 608–622. [Google Scholar] [CrossRef] [Green Version]
  68. Gollwitzer, P.M. Implementation intentions: Strong effects of simple plans. Am. Psychol. 1999, 54, 493–503. [Google Scholar] [CrossRef]
  69. Duckworth, A.L.; Grant, H.; Loew, B.; Oettingen, G.; Gollwitzer, P.M. Self-regulation strategies improve self-discipline in adolescents: Benefits of mental contrasting and implementation intentions. Educ. Psychol. 2011, 31, 17–26. [Google Scholar] [CrossRef]
  70. Duckworth, A.L.; Kirby, T.A.; Gollwitzer, A.; Oettingen, G. From Fantasy to Action. Soc. Psychol. Pers. Sci. 2013, 4, 745–753. [Google Scholar] [CrossRef] [Green Version]
  71. Walton, G.M.; Wilson, T.D. Wise interventions: Psychological remedies for social and personal problems. Psychol. Rev. 2018, 125, 617–655. [Google Scholar] [CrossRef]
  72. Cialdini, R.B. Influence: The Psychology of Persuasion; Pearson Education, Inc.: Boston, MA, USA, 1993. [Google Scholar]
  73. Ariely, D.; Wertenbroch, K. Procrastination, Deadlines, and Performance: Self-Control by Precommitment. Psychol. Sci. 2002, 13, 219–224. [Google Scholar] [CrossRef]
  74. Dewies, M.; Schop-Etman, A.; Rohde, K.I.M.; Denktaş, S. Nudging is Ineffective When Attitudes Are Unsupportive: An Example from a Natural Field Experiment. Basic Appl. Soc. Psychol. 2021, 43, 213–225. [Google Scholar] [CrossRef]
  75. Hansen, P.G.; Jespersen, A.M. Nudge and the Manipulation of Choice: A framework for the responsible use of the nudge approach to behaviour change in public policy. Eur. J. Risk Regul. 2013, 4, 3–28. [Google Scholar] [CrossRef] [Green Version]
  76. Hansen, P.G.; Skov, L.R.; Skov, K.L. Making Healthy Choices Easier: Regulation versus Nudging. Annu. Rev. Public Health 2016, 37, 237–251. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  77. Kim, J. Influence of group size on students’ participation in online discussion forums. Comput. Educ. 2013, 62, 123–129. [Google Scholar] [CrossRef]
  78. Parks-Stamm, E.J.; Zafonte, M.; Palenque, S.M. The effects of instructor participation and class size on student participation in an online class discussion forum. Br. J. Educ. Technol. 2017, 48, 1250–1259. [Google Scholar] [CrossRef]
  79. Hoyt, D.P.; Lee, E.J. Technical Report No. 12: Basic Data for the Revised IDEA System; The IDEA Center: Manhattan, KS, USA, 2002. [Google Scholar]
  80. Brown, A.W.; Li, P.; Brown, M.M.B.; Kaiser, K.A.; Keith, S.W.; Oakes, J.M.; Allison, D.B. Best (but oft-forgotten) practices: Designing, analyzing, and reporting cluster randomized controlled trials. Am. J. Clin. Nutr. 2015, 102, 241–248. [Google Scholar] [CrossRef] [Green Version]
  81. Chi, M.T.H.; Wylie, R. The ICAP Framework: Linking Cognitive Engagement to Active Learning Outcomes. Educ. Psychol. 2014, 49, 219–243. [Google Scholar] [CrossRef]
  82. Aleven, V.; Stahl, E.; Schworm, S.; Fischer, F.; Wallace, R. Help Seeking and Help Design in Interactive Learning Environments. Rev. Educ. Res. 2003, 73, 277–320. [Google Scholar] [CrossRef] [Green Version]
  83. Chin, C.; Osborne, J. Students’ questions: A potential resource for teaching and learning science. Stud. Sci. Educ. 2008, 44, 1–39. [Google Scholar] [CrossRef]
  84. Yu, F.-Y.; Chen, Y.-J. Effects of student-generated questions as the source of online drill-and-practice activities on learning. Br. J. Educ. Technol. 2014, 45, 316–329. [Google Scholar] [CrossRef]
  85. Aflalo, E. Students generating questions as a way of learning. Act. Learn. High. Educ. 2018, 22, 63–75. [Google Scholar] [CrossRef]
  86. Hay, I.; Elias, G.; Fielding-Barnsley, R.; Homel, R.; Freiberg, K. Language delays, reading delays, and learning difficulties: Interactive elements requiring multidimensional programming. J. Learn. Disabil. 2007, 40, 400–409. [Google Scholar] [CrossRef] [PubMed]
  87. Dolan, P.; Galizzi, M.M. Like ripples on a pond: Behavioral spillovers and their implications for research and policy. J. Econ. Psychol. 2015, 47, 1–16. [Google Scholar] [CrossRef] [Green Version]
  88. Marchiori, D.R.; Adriaanse, M.A.; De Ridder, D.T. Unresolved questions in nudging research: Putting the psychology back in nudging. Soc. Pers. Psychol. Compass 2017, 11, e12297. [Google Scholar] [CrossRef]
  89. de Ridder, D.; Kroese, F.; van Gestel, L. Nudgeability: Mapping Conditions of Susceptibility to Nudge Influence. Perspect. Psychol. Sci. 2022, 17, 346–359. [Google Scholar] [CrossRef]
  90. Weijers, R.J.; Ganushchak, L.; Ouwehand, K.; de Koning, B.B. “I’ll Be There”: Improving Online Class Attendance with a Commitment Nudge during COVID-19. Basic Appl. Soc. Psychol. 2022, 44, 12–24. [Google Scholar] [CrossRef]
  91. van Oldenbeek, M.; Winkler, T.J.; Buhl-Wiggers, J.; Hardt, D. Nudging in blended learning: Evaluation of email-based progress feedback in a flipped-classroom information systems course. In Proceedings of the 27th European Conference on Information Systems (ECIS), Stockholm-Uppsala, Sweden, 8–14 June 2019. Research Papers. ISBN 978-1-7336325-0-8. Available online: https://aisel.aisnet.org/ecis2019_rp/186 (accessed on 23 December 2022).
  92. Clark, D.; Gill, D.; Prowse, V.; Rush, M. Using Goals to Motivate College Students: Theory and Evidence from Field Experiments. Rev. Econ. Stat. 2020, 102, 648–663. [Google Scholar] [CrossRef]
  93. Michie, S.; Van Stralen, M.M.; West, R. The behaviour change wheel: A new method for characterising and designing behaviour change interventions. Implement. Sci. 2011, 6, 42. [Google Scholar] [CrossRef] [Green Version]
  94. Olejniczak, K.; Śliwowski, P.; Leeuw, F. Comparing Behavioral Assumptions of Policy Tools: Framework for Policy Designers. J. Comp. Policy Anal. Res. Prac. 2020, 22, 498–520. [Google Scholar] [CrossRef]
  95. Pintrich, P.R.; Smith, D.A.F.; Garcia, T.; McKeachie, W.J. Reliability and Predictive Validity of the Motivated Strategies for Learning Questionnaire (MSLQ). Educ. Psychol. Meas. 1993, 53, 801–813. [Google Scholar] [CrossRef]
  96. Elffers, L. Staying on track: Behavioral engagement of at-risk and non-at-risk students in post-secondary vocational education. Eur. J. Psychol. Educ. 2013, 28, 545–562. [Google Scholar] [CrossRef] [Green Version]
  97. Reeve, J. How students create motivationally supportive learning environments for themselves: The concept of agentic engagement. J. Educ. Psychol. 2013, 105, 579–595. [Google Scholar] [CrossRef] [Green Version]
  98. Ryan, R.M.; Connell, J.P. Perceived locus of causality and internalization: Examining reasons for acting in two domains. J. Pers. Soc. Psychol. 1989, 57, 749–761. [Google Scholar] [CrossRef]
  99. Toering, T.; Elferink-Gemser, M.T.; Jonker, L.; Van Heuvelen, M.J.; Visscher, C. Measuring self-regulation in a learning context: Reliability and validity of the Self-Regulation of Learning Self-Report Scale (SRL-SRS). Int. J. Sport Exerc. Psychol. 2012, 10, 24–38. [Google Scholar] [CrossRef]
  100. Pintrich, P.R.; De Groot, E.V. Motivational and self-regulated learning components of classroom academic performance. J. Educ. Psychol. 1990, 82, 33–40. [Google Scholar] [CrossRef]
Figure 1. Proposed theory of change of nudging relative to autonomous learning.
Figure 2. Example of a brainstorm playing card.
Figure 3. A visual overview of the design process.
Figure 4. Overview of the timeline of Experiment 1.
Figure 5. Overview of the timeline of Experiment 2.
Figure 6. Overview of the timeline of Experiment 3.
Table 1. Frequency table of class preparation scores.

| Score                | Total (n = 652) | Control (n = 320) | Nudge (n = 332) |
|----------------------|-----------------|-------------------|-----------------|
| 1 (no preparation)   | 310 (46.8%)     | 162 (50.6%)       | 148 (44.6%)     |
| 2 (some preparation) | 124 (18.7%)     | 55 (17.2%)        | 69 (20.8%)      |
| 3 (most preparation) | 74 (11.2%)      | 43 (13.4%)        | 31 (9.3%)       |
| 4 (all preparation)  | 144 (21.8%)     | 60 (18.8%)        | 84 (25.3%)      |