Article

A Multi-Case Empirical Study on the Impact of Virtual Currency on Student Engagement and Motivation

1 Department of Computer Science, Winston-Salem State University, Winston-Salem, NC 27110, USA
2 Department of Psychological Sciences, Winston-Salem State University, Winston-Salem, NC 27110, USA
3 Department of Computing Sciences, Villanova University, Villanova, PA 19085, USA
* Author to whom correspondence should be addressed.
Trends High. Educ. 2023, 2(3), 462-476; https://doi.org/10.3390/higheredu2030027
Submission received: 6 June 2023 / Revised: 5 July 2023 / Accepted: 10 July 2023 / Published: 17 July 2023

Abstract
While the motivational effect of educational gamification is widely recognized, the impact of the gamification element virtual currency (VC) is underexplored, especially in educational settings. To address this gap, the goal of the presented multi-case empirical study was to systematically explore the impact of virtual currency on learners’ engagement, motivation, and academic performance across different contexts and to uncover potentially generalizable results. Accordingly, this paper presents the outcomes of a multi-perspective analysis of students’ experiences with out-of-class practicing in a learning environment gamified with VC and of the effect of this game element. The work builds on previous case studies with analogous goals, which the authors conducted in different contexts, varying the university type, student population, subject area, etc. The provided comprehensive cross-case analysis integrates and extends the previous results, tracing a path to generalizable knowledge about the potential of VC. While the results of this multi-case study demonstrate a significant increase in student engagement in out-of-class practicing gamified with virtual currency, they fail to show a significant increase in students’ intrinsic motivation and final course grades. This study is a step forward in enhancing our understanding of the multifaceted effect of virtual currency on learners’ experience.

1. Introduction

Lack of motivation and inability to engage learners to achieve desired learning objectives are among the top barriers to learning [1]. Improving and maintaining learner motivation has consequently emerged as a key challenge in education. Gamified learning refers to technologies that attempt to engender motivation toward learning activities, typically by employing design strategies found in games [2,3]. Usually, the reason for gamifying an activity is that learners’ motivation to engage in it is low, and gamification is seen as a way to strengthen it. Many empirical studies have demonstrated that gamification can inspire motivation for engagement [4]. However, it is difficult to draw definite conclusions about gamification’s motivating potential in education [5], as existing research yields contradictory findings [4,5,6]. Since the existing literature presents an ambivalent picture of the effect of gamifying learning environments, several researchers (e.g., [5,7,8]) argue that gamification practice would benefit from a better understanding of the effect of individual game design elements on learners’ experience and motivational outcomes.
Many learning activities are centered on skill development, where students are not merely memorizing concepts and principles but applying them to solve problems. Pivotal to the notion of skill-based learning is the notion of practice. To develop needed skills, learners need to spend considerable time engaged in practicing problem-solving and developing subject-related skills in a variety of hands-on scenarios. Many available educational tools for practicing are crafted to support skill development through self-study [9], which makes it hard to mandate and control their use. Self-study tools typically support recommended activities, and as such, their use rarely counts towards the student’s course grade, which results in low usage [9]. Thus, although freely available, practice problems that accompany self-assessment tools are rarely fully utilized. This requires employing strategies for boosting student interest and engagement with such tools.
Gamification techniques are particularly popular in contexts where rewards are delayed, such as education; they attempt to provide rewards in the short term to motivate users to stay engaged in the activity [10]. The use of virtual currency (VC) to promote engagement in practicing is an example of reward-based gamified learning. Although VC falls into the reward category, it offers a more complex motivational mechanism, driven by the possibility of earning rewards that enable obtaining other desirable objects. It has the potential to appeal to both intrinsically and extrinsically motivated learners, as it can serve several functions. By scoring points when successfully performing tasks, students are rewarded with VC. This creates a simple economy: work is rewarded with VC, which in turn is spent to acquire desirable benefits. In a learning–practicing environment gamified with VC, students can earn VC based on the number, level of difficulty, and correctness of completed practice quizzes. They can spend it on benefits, possibly course-related, such as deadline extensions, homework re-submission, and others, as decided by the instructors. Thus, in such an environment, there are two separate but related numerical values: a score, which is a permanent indication of progress, and VC, which is earned and spent on (course) benefits. In this form, earning virtual currency evokes a perception of gaining benefits with a positive impact on course outcomes and is thus more extrinsic in nature. Still, extrinsic motivation can be beneficial in situations where it can serve as a basis for the gradual development of intrinsic motivation [11]. On the other hand, based on Ryan and Deci [12], it can be assumed that VC can enhance intrinsic motivation when it is awarded for the accomplishment of specific challenges.
Depending on learners’ motivational drivers, VC can be perceived as feedback, as an immediate reward, as a progress indicator, as an accomplishment, or as an incentive for practicing.
Virtual currency is an underexplored game element, especially in educational settings, so our goal was to systematically study it across different contexts to demonstrate potentially generalizable results. We chose to perform a multiple-case, exploratory study since a multi-case study design allows comparisons across several settings [13]. Many authors (e.g., [14,15]) emphasize that multiple case designs are needed for creating a generalizable theory under the replication logic of positivist case research.
The multi-case study reported here was designed with the goal of deepening the knowledge of the individual effect of virtual currency on learners practicing in a gamified learning environment. It was designed as a sequence of three case studies utilizing a single gamification element, VC, and guided by the same research questions but conducted in different contexts regarding the subject, type of university, student population, and academic background, as well as using different schemes for earning and spending VC. The following questions provided a framework for our exploration of evidence present in all case studies and guided the research focus.
RQ1: Does virtual currency encourage more active engagement in voluntary out-of-class practice?
RQ2: Does virtual currency improve students’ academic performance?
RQ3: Do gamified activities using virtual currency improve intrinsic motivation?
The case studies were conducted between 2019 and 2021. While the results of each study were published after completing it (see [16,17,18]), the goal of this paper was to cross-examine and aggregate all results and to perform some additional analyses. Thus, it consolidates the acquired knowledge on the individual effect of the game element virtual currency on learners, which has not been studied before in such scope and depth. This also improves our understanding of how the context influences the impact of VC on learners and how better to tailor its implementation to the specific context.

2. Related Work

A growing number of software systems supporting teaching and learning are making use of game design elements to increase learner engagement. This practice is commonly known as gamified learning [5,19]. Many game elements, including points, badges, leaderboards, competitions, and avatars, are used to engage and motivate learners in various learning activities [20]. One element popular in games but less common in gamified learning is virtual currency (VC) [21]. It is used to reward players and create an in-game economy. Virtual currencies are most powerful when complemented by a virtual marketplace where learners can spend their earned virtual bucks. Learners can earn VC by exhibiting robust performance in gamified activities and challenges. The stronger the performance, the more VC a learner earns [22]. Many games incorporate game design elements that can be redeemed for unlocking or buying objects (e.g., new characters, tools, weapons, stages, etc.). Utilizing rewards in such a way can enhance players’ motivation and engagement due to the possibility of achieving useful objects and tools and using them to progress and perform better in the game [23]. While this idea has been transferred to gamification in educational contexts, typically in the form of virtual currency [5,23], its effect on learners’ engagement and academic outcomes is underexplored.
O’Donovan’s gamified course [24] is among the first examples demonstrating the use of VC, together with a storyline, badges, and a leaderboard in a university-level gamified course. Although the study concluded that in-game currency was very well received, its effect was not statistically confirmed. Another early attempt at using VC was Vassileva et al.’s study of the effects of adding VC along with some social motivators to a university peer help system to incentivize students to help their peers [25]. Reports on the use of VC were favorable, but in general, when gamification is driven by several game elements, the isolation of the effect of individual elements is problematic.
Gamifying a Computer Science course with virtual currency (BitPoints) used together with levels and stars was proposed by Lopes [26]. BitPoints could be earned by overcoming obstacles associated with challenges (practical assignment exercises), where the amount of VC earned is proportional to the level of difficulty of the challenges. The available BitPoints could be used for purchasing information or tools (for use in solving other tasks). An explicit evaluation of VC’s impact on student learning has not been performed. An alternative kind of VC, in the form of coins, used for gamifying a Software Testing course, was studied by de Jesus et al. [27] but with inconclusive results. Outside of computing subjects, Duolingo [28] is a successful example of an app with an effective virtual currency implementation. Users are rewarded for each task they successfully complete in this language-learning app and can purchase bonus lessons, health points, and more with the in-app currency (lingots). Specifically, Munday [29] describes the use of Duolingo in college-level second language courses, where the Duolingo VC was used together with points, streaks, and crowns. Lingots were awarded for learning skills, going up levels, and long streaks (playing many days in a row) and could be used to unlock a bonus skill, a timed-practice option, a progress quiz, or a power-up. Another version of virtual currency, eCoins, was used in a Statistics course [30] in combination with levels, progress feedback, time pressure and pathways. The number of eCoins awarded for a successful attempt was a function of the experience points and the difficulty level of the attempted task. The earned eCoins could be used to remove parts of a question or an entire question from an activity test set. Virtual currency, as a feature for enhancing engagement, has also been studied in a MOOC environment [31], where redeemable points were reported as the second most engaging gamification mechanism. 
The use of a similar version of VC, called in-course redeemable rewards, along with badges, was reported in [32]. Redeemable rewards were issued to students for completing predefined tasks and could be exchanged for various privileges (e.g., unlock exclusive learning contents, extra attempts and/or more time to perform quizzes, and extended due dates of assignments). Nonetheless, subsequent studies [23] did not demonstrate a significant increase in student engagement.
A major limitation of the works above is that they report either preliminary studies with inconclusive results or informal observations without an explicit evaluation of the impact of VC on student engagement or specific learning outcomes. Therefore, they provide limited data for more diverse and fine-grained analysis.
With a more methodical approach, Snow et al. [33] studied how VC impacts in-system performance and learning outcomes in the context of an intelligent tutoring system (ITS). Students earn iBucks through their interactions with the ITS and use them to unlock game-based features. The study revealed that students who were more interested in spending their earned currency did not perform well and had lower scores on the learned skills. Still, gamifying an ITS is quite different from gamifying an academic course.
A more systematic exploration of the effect of VC on learners’ behavioral and psychological outcomes began with the work of Dicheva et al. [34]. In a Data Structures course gamified with badges, a leaderboard, and VC, students could earn and spend VC based on rules specified by the instructor. The earning rules were based on the amount, the level of difficulty, and the correctness of the solutions of completed problem-solving exercises. Students could spend their VC on purchases of deadline extensions, re-submission of homework, etc. The idea behind this form of gamification economy was to stimulate students to practice more (by incentivizing them with purchasable course-related benefits) in order to attain the intended learning outcomes. The reported results of the study confirmed that the targeted motivational effect was achieved but again without isolating the motivational effect of VC from the other elements used to gamify the course. This early work was followed by three consecutive studies with a focus on examining the effect of VC on learners enrolled in a Discrete Math course [16], in a Computer Networking course [17], and in a Discrete Structures course [18]. Unlike the previous studies, the authors empirically examined the individual effect of VC (which was the single gamification element used) in three different contexts (subject, student population, academic background, and earning and spending rules) as a step towards gaining more generalizable results. These three studies showed that using VC to gamify practicing increased student engagement. The idea of this work was to integrate the three studies in a framework providing ground for further multi-perspective analysis leading to more generalizable knowledge about using VC for gamifying practicing. 
As the three cases were chosen to differ in their gamified environments, they contain unexplored data that can help improve the understanding of how the context influences the impact of VC on learners’ engagement in practicing and how to better tailor its implementation to the specific context to achieve the intended outcomes.

3. Methods

3.1. Case Studies

Due to the specific nature of the intervention, we conducted quasi-experiments, where the applications of the interventions in the experiments were not randomized. Thus, different class sections played the roles of experimental and comparison groups. However, for both groups of each case study, the course had the same structure and content: the exact same syllabus, textbook, lectures, labs, assignments, tests, etc.
All case studies were semester-long studies. They were conducted between Fall 2019 and Spring 2021. A total of 171 students took part in the study: 91 in the experimental groups and 80 in the comparison groups.

3.1.1. Case Study A

The first case study was conducted in a Discrete Mathematics course at a public HBCU (Historically Black College or University) in North Carolina. The class in Fall 2019 (19 students) served as a comparison group, and the class in Spring 2020 (21 students) served as an experimental group.

3.1.2. Case Study B

This case study was conducted in a Discrete Structures course at a private research university in Pennsylvania. Both groups, the experimental one with 49 students and the comparison one with 33 students, took the course in Spring 2021.

3.1.3. Case Study C

This case study was conducted in a Computer Networking course at a private university in Missouri. Both groups, the experimental one with 21 students and the comparison one with 28 students, took the course online in Fall 2020.
Table 1 below summarizes the demographic information for the studies. The gender, race, age, and major information for the students in the experimental groups was collected with the pre-survey that they completed. The universities were selected to provide different contexts for the case studies regarding the participating student population. Thus, we have a university with a predominantly African American student body, as well as others with predominantly European American students; a university where the students are predominantly in the 18–25 age range, and one where the students are predominantly above that range; private and public universities; and teaching and research universities. We wanted to see whether these factors impact the results of our research.

3.2. Course Gamification Used in the Studies

All courses participating in the multi-case study were gamified using OneUp, a highly configurable course gamification platform [34]. Experimental studies typically measure the effect of some intervention; in our case, the intervention was the use of gamification, more specifically of virtual currency (VC), in a learning environment. The students in both the experimental and comparison groups had access to OneUp and were encouraged to use it as a practice system on their own. However, the gamification feature (virtual currency) was enabled only for the experimental group.
The instructors were free to define their own VC earning and spending rules based on their individual teaching practice and preferences. Although OneUp does not restrict how students can earn VC or what they can purchase with their accumulated VC, all instructors participating in this multi-case study chose to set earning rules that depend on the number and score of taken practice quizzes (also known as warm-up challenges in OneUp) and to offer course-related benefits for purchase. The rules created by the instructors bear some similarities. Examples of common VC earning rules are completing a practice quiz or home assignment with a high score, e.g., 85 or 90, or with a ‘passing score’, e.g., 70 (intended to be engaging and achievable for low-performing students). Another common rule rewards participation in class activities. Common VC spending rules include extending deadlines for turning in home or lab assignments, dropping the lowest lab or quiz grade, etc. It should be noted that most of these actions were part of the instructors’ previous practice without the need for the students ‘to pay’ for them.
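To make the mechanism concrete, the earning and spending logic described above can be sketched as follows. This is a minimal illustration only; the thresholds, VC amounts, and shop prices are hypothetical, not the values actually configured by the instructors in OneUp.

```python
# Hypothetical sketch of VC earning and spending rules, modeled on the
# kinds of rules described above. All numbers are illustrative assumptions.

def vc_earned(quiz_score, difficulty=1):
    """Award VC for a completed practice quiz based on its score."""
    if quiz_score >= 90:       # high-score rule
        return 10 * difficulty
    if quiz_score >= 70:       # 'passing score' rule
        return 5 * difficulty
    return 0

SHOP = {                        # hypothetical course-shop prices
    "deadline_extension": 40,
    "drop_lowest_quiz": 60,
}

def buy(balance, benefit):
    """Spend VC on a course benefit if the balance allows it."""
    price = SHOP[benefit]
    if balance < price:
        raise ValueError("insufficient VC")
    return balance - price

# Five practice-quiz scores earn 10 + 10 + 5 + 5 + 10 = 40 VC in total
balance = sum(vc_earned(s) for s in [95, 92, 78, 88, 91])
balance = buy(balance, "deadline_extension")   # 40 - 40 = 0 VC left
```

A real configuration would also track transactions per student, which is what the OneUp transaction log analyzed in Section 4.1 records.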

3.3. Research Instruments and Data Collection

For all case studies, we used three complementary methods to collect data for answering the research questions. To answer the first research question (RQ1), data from the OneUp system log were extracted. They provided information concerning students’ visits to gamification-related pages, the number of unique warm-up challenges and multiple challenge attempts students completed, etc. The final course grades of the experimental groups were compared with the grades of the corresponding comparison groups to assess the effect of gamifying the course on students’ academic performance (research question RQ2). As a theoretical basis for collecting information for assessing student motivation (RQ3), we selected the self-determination theory (SDT) [12,35,36].
SDT is a well-known psychological theory of motivation and behavior change. According to it, humans have three fundamental psychological needs: autonomy, competence, and relatedness [12,37]. People’s intrinsic motivation is related to the satisfaction of these needs. Thus, an individual is intrinsically motivated when the activities in which they participate make them feel that they have autonomy (can make their own choices), competence (can effectively perform the behavior), and relatedness (have reliable social connections with others). Intrinsic and extrinsic motivation are frequently discussed but rarely empirically studied constructs in gamification research, as several literature reviews report (e.g., [5,38]).
In particular, we were interested in whether gamifying learning activities would have an impact on the intrinsic motivation of the students because of the wide expectation that applying game design elements in a learning context would transfer some of the remarkable motivational powers of the games to it. Thus, we selected as an instrument for our motivational survey the Basic Psychological Needs Satisfaction Scale—Work Domain [39]. We chose this scale since there is considerable research that links elements of SDT to basic psychological needs, i.e., autonomy, competence, and relatedness [39,40,41]. We slightly modified the items of this Likert-type scale, ranging from 1 (not at all true) to 7 (very true), to represent work in the classroom instead of work on the job, e.g., “I feel like I can make a lot of inputs regarding how my classwork gets done”.
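Scoring such Likert-type scales typically involves reverse-coding negatively worded items and averaging the items of each subscale. A minimal sketch of that procedure follows; the item groupings and responses below are hypothetical illustrations, not the actual scale key.

```python
def score_subscale(responses, items, reversed_items=(), scale_max=7):
    """Average a subscale of 1..scale_max Likert items,
    reverse-coding negatively worded items (x -> scale_max + 1 - x)."""
    vals = []
    for i in items:
        x = responses[i]
        if i in reversed_items:
            x = scale_max + 1 - x
        vals.append(x)
    return sum(vals) / len(vals)

# Hypothetical responses keyed by item number, on a 1-7 scale
responses = {1: 6, 2: 2, 3: 7}
# Suppose items 1-3 form an 'autonomy' subscale, with item 2 negatively worded
autonomy = score_subscale(responses, items=[1, 2, 3], reversed_items={2})
```

Subscale means computed this way are the dependent variables compared pre-test to post-test in Section 4.3.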
We were also interested in clarifying the relationship between the academic performance of the students measured by their course grades and their perceptions of gamified practicing measured by the intrinsic motivation inventory (IMI) [42]. Factors of the IMI were extracted for the current study as they are straightforwardly related to intrinsic motivation (i.e., interest/enjoyment and perceived choice) or internalization of motivation (i.e., value/usefulness). This would clarify whether any of these three factors could significantly predict students’ grades in the gamified environment.
In addition, in the motivational survey for the study, we included short versions of the following scales: the Performance Domain of Engagement: Student Course Engagement Questionnaire (SCEQ) [43], the Big Five Inventory-2-Extra Short Form on personality [44], and a 3-item Growth Mindset Scale [45]. Since these factors are relatively stable over time, they were included only in the pre-test survey.
We analyzed the collected data using a range of statistical methods, including a paired-sample t-test and regression analysis.
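As an illustration of the first of these methods, the paired-sample t statistic is the mean pre/post difference divided by its standard error. A self-contained sketch follows; the pre/post scores are made-up illustration data, not values from the study.

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-sample t statistic: mean of the paired differences
    divided by its standard error (sample sd / sqrt(n))."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Made-up pre/post subscale scores for five students
pre = [4, 5, 3, 6, 5]
post = [5, 6, 4, 6, 5]
t = paired_t(pre, post)   # t ≈ 2.449 with df = n - 1 = 4
```

The p-value is then obtained from the t distribution with n − 1 degrees of freedom (e.g., via a statistics package).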

4. Results

Our main purpose here is not to present the complete original findings from each case study but rather to focus on a cross-case analysis. In this section, we compare how VC impacts student engagement, motivation, and academic performance in the three case studies. For student engagement, we look at the use of VC by the students in the experimental group and compare the number of warm-up challenges taken by the students in the experimental and comparison groups. We used these numbers as a measure of engagement, while students’ course grades were used as a measure of student academic performance. We used well-established motivational scales to measure student motivation (see Section 3.3).

4.1. The Use of Virtual Currency

We first looked at the use of VC in the three case studies. Table 2 presents information on the earning transactions (how students earned their VC) and spending transactions (how they spent their accumulated VC).
This information was extracted from the OneUp transaction log. Each VC earning transaction is recorded there when a student satisfies a VC earning rule defined by the instructor. Considering the numbers in the table, note that some instructors opted to give generous quantities of VC for completing learning activities, but the “prices” in their course shops were correspondingly high. As the table shows, in each of the cases there is a group of students who have either no earning transactions at all or only one or two. For Case Studies A and B, the percentage of students in this group is very similar (16% and 17%), while it is higher for Case Study C, where the group also includes students who had two transactions.
The high percentages of students who earned VC for taking a warm-up challenge with a result of at least 90% correct (55% in Study A) and for taking at least five warm-up challenges in one topic with results of 85% correct (37% in Study B) and 90% correct (40% in Study C) show the students’ persistence in improving their challenge scores by re-taking challenges.
It is interesting that a substantial number of the students who earned virtual currency, approximately one-third of the students in Case Study A (33%) and Case Study C (37%) and 16% in Case Study B, did not make any purchase in the course shop. The distribution of the spending transactions by category shows that students favored buying extra credit points for a test or homework, dropping the lowest lab or quiz score, and, in Study A, an extension of the homework deadline. To obtain insight into the reasons for a particular student to make a particular purchase, we directly asked them for a reason via a pop-up question in OneUp at the time of purchase (see Section 5).

4.2. Effect of VC on Student Engagement

We compared the number of warm-up challenges taken in both the control and experimental groups for each individual study to assess how virtual currency impacted students’ engagement in out-of-class practicing.
In Table 3 below, we present the total number of unique warm-up challenges completed, as well as the total number of attempts, which includes taking the same warm-up challenge multiple times by a student trying to improve their score. The average number of taken challenges per student for the experimental and control groups are also included in the table. We have presented details about the distribution of the students over different ranges of taken challenges for the individual cases in [16,17,18]. These distributions provide a better insight than presenting the average number of taken challenges per student.
It is difficult to compare the numbers of completed challenges across the different courses since the challenges created by the instructors consist of different numbers of problems; in Study B, a challenge typically included 5 problems, while in Study A, 10 problems were included. Moreover, instructors in Study A and Study C used dynamic problems, where one problem generates a number of similar problems. However, the difference between the experimental and comparison groups in the same case study clearly shows that in all three studies, the students from the experimental groups completed more unique warm-up challenges, as well as more challenge attempts.
While the numbers of warm-up challenges and challenge attempts taken by the experimental group in Study A are close to 50% higher than those of the comparison group, the increase in student engagement with OneUp in Study B and Study C is striking. For Study B, the number of warm-up challenges taken by the experimental group is close to four times (373%), and the number of challenge attempts close to five times (470%), as many as those of the comparison group. For Study C, the number of warm-up challenges taken by the experimental group is close to three times (274%), and the number of challenge attempts three times (300%), as many as those of the comparison group. Thus, all three case studies answered RQ1, “Does virtual currency encourage more active engagement in voluntary out-of-class practice?”, positively.
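For clarity, the percentages above express the experimental group’s counts relative to the comparison group’s counts. A small example of the calculation follows; the counts used are illustrative, not the studies’ actual totals.

```python
def pct_of_comparison(exp_count, comp_count):
    """Experimental group's count as a percentage of the comparison group's."""
    return 100 * exp_count / comp_count

# Hypothetical counts: 1865 challenge attempts (experimental) vs 500 (comparison)
ratio = pct_of_comparison(1865, 500)   # 373.0, i.e., close to four times as many
```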

4.3. Effect of VC on Student Motivation

Pre-test and post-test surveys, including the Basic Psychological Needs Satisfaction Scale, were conducted with the experimental groups in all case studies at the beginning and at the end of the corresponding semester. A paired-sample t-test was conducted to explore potential pre-test to post-test differences in the autonomy, competence, and relatedness of the students to answer RQ3, “Do gamified activities using virtual currency improve intrinsic motivation?”. Table 4 presents the results of the t-test.
As the table shows, the pre-test to post-test effects for all dependent variables (autonomy, competence, and relatedness) for the participants in both Study A and Study C are not significant. However, the factor-level mean scores for the three factors imply that participants came to the study with strong ratings for them, and those strong ratings stayed stable from the pre-test to the post-test. These durable rather than malleable intrinsic motivation indicators demonstrate that using virtual currency as a gamification element did not significantly alter students’ basic psychological needs.
In contrast, the t-test results for Study B indicate that the pre-test to post-test difference on the relatedness factor of the Basic Need Satisfaction scale is significant (p = 0.02). While this means that participants who took both the pre-test and post-test felt more positively about how they related to other students in class, due to the nature of our intervention (gamifying out-of-class practice with only virtual currency), we could not attribute the change in the relatedness factor to it.
Thus, the results of our multi-case study consistently showed that, contrary to some expectations, using virtual currency did not have significant effects on the students’ intrinsic motivation (RQ3).

4.4. Effect of VC on Student Performance

Our last research question (RQ2) was meant to explore whether the gamification intervention had any impact on student academic performance, measured by the students’ final course grades.
In Study A, in addition to the final course grades of the experimental and comparison groups, students’ grades on Test 2 were also compared. This was carried out since Test 2 was conducted under the same conditions, in person, for both groups, while due to the COVID-19 pandemic, the course for the experimental group was later transferred to an online mode. The results showed that the mean score of the Test 2 grades was 78.68 for the comparison group and 81.67 for the experimental group (a difference of 2.99 points), while the mean score of the final grades was 82.37 for the comparison and 85.53 for the experimental group (a difference of 3.16 points). While the improvement is not significant, for the experimental group, the number of higher final grades increased, and the number of Ds and Fs decreased considerably in comparison to the Test 2 grades.
Study B compared the course quiz grade scores of the groups as well as the final course grades. The results showed that the mean score of the quizzes was 79.40 for the comparison group and 84.32 for the experimental group (t-test p-value of 0.13), while the mean score of the final grades was 85.85 for the comparison and 87.73 for the experimental group (t-test p-value of 0.48). Thus, the results show that the improvement in academic performance is not significant.
In Study C, although the comparison of course grades did not show a significant difference, it revealed some improvement for the experimental group. The total percentage of As and Bs combined was 86% in the experimental group vs. 79% in the comparison group, while the total percentage of Ds and Fs combined was 5% in the experimental group vs. 11% in the comparison group. In the experimental group, there were no Ds and fewer Fs than in the comparison group.
The results of the three case studies show some, but not significant, improvement in the students’ academic performance. However, it is noticeable that the number of Ds and Fs decreased across the studies. This is important since one of the instructors’ major goals is to reduce course dropouts and failures.
After answering the initial research questions, we were interested in exploring relationships between students’ final course grades and their intrinsic motivation as measured by three factors of the intrinsic motivation inventory: value/usefulness, interest/enjoyment, and perceived choice [42]. For Case Study A, we conducted an exploratory stepwise regression analysis to determine which of these factors most strongly predicted participants’ final course grades. Thus, for this regression model, value/usefulness, interest/enjoyment, and perceived choice were the predictor variables, and the participant’s final grade was the outcome variable. Considering the results of our exploratory analysis, which illuminated value/usefulness as a significant predictor of final course grades above and beyond interest/enjoyment and perceived choice, and considering issues with multicollinearity between the three predictor variables in our datasets, we ran a series of simple regressions for Case Studies B and C using the same predictor and outcome variables to determine if value/usefulness would hold as a strong predictor of final course grades. The results are shown in Table 5.
As Table 5 shows, the only significant correlation found was for the factor value/usefulness in Study A. It emerged as a significant predictor of student course grades, explaining 32% of the course grades’ variance. The other two variables, interest/enjoyment and perceived choice, were excluded from the final stepwise regression model in Study A. For the participants in Study B and Study C, neither interest/enjoyment, perceived choice, nor value/usefulness appeared as a significant predictor of their final course grades.
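For readers who want to replicate this kind of analysis, the simple-regression step can be sketched as follows. This is a minimal sketch with made-up data: `value` stands in for the value/usefulness scores, `grades` for the final course grades, and the `simple_regression` helper is ours (a real analysis would use a statistics package that also reports significance):

```python
def simple_regression(x, y):
    """Ordinary least squares with one predictor: returns slope, intercept, R^2."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    # R^2 = 1 - residual sum of squares / total sum of squares
    ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

# Hypothetical value/usefulness scores (1-7 scale) and final course grades (0-100)
value = [5.5, 4.0, 6.2, 3.1, 5.0, 6.8, 2.9, 4.8]
grades = [88, 75, 92, 70, 84, 95, 68, 80]
slope, intercept, r2 = simple_regression(value, grades)
print(f"slope = {slope:.2f}, intercept = {intercept:.2f}, R^2 = {r2:.2f}")
```

An R² of, say, 0.32 would mean the predictor explains 32% of the variance in the grades, which is how the figure reported for Study A should be read.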

5. Discussion

Comparing the results associated with the research questions in the three case studies, we found many commonalities but also some differences. This section elaborates on them.
Concerning students’ interaction with OneUp, in all cases there was a group of students who either did not log in to OneUp even once or only took one or two warm-ups at the beginning of the semester (possibly out of curiosity). Some of these students were even awarded VC by the instructor for class activities, but none of them spent the awarded VC. Interestingly, some of them had final course grades either between 85 and 89 or between 65 and 69 and might have benefited from using the awarded virtual currency. However, they never logged in to OneUp to take advantage of this opportunity. A possible explanation for the students in the first group is that they did not feel they needed additional practicing, as they were confident in their knowledge; moreover, having never logged in to OneUp, they were not aware that they had virtual currency to spend.
A possible explanation for the second group can be rooted in expectancy-value theory [46]. The related expectancy–value–cost model postulates that achievement behavior is mostly affected by the expectancy of success (a learner’s task-specific expectations for success), subjective task values (the enjoyment obtained from a task, its usefulness, and its importance), and the cost of participating in the task (required effort and time, opportunities lost because of the participation, and emotional costs) [47]. In a follow-up study, we used the EVC scale [48] to examine whether expectancy-value beliefs predict engagement in gamified activities [11]. The corresponding findings indicate that the second group consists mostly of students who perceived the gamified activity as being of low value, had low expectancy of success, or deemed the activity too difficult, resulting in unwillingness or inability to engage in it [11].
Another interesting result was that in all case studies, a sizeable number of students did not spend any of their earned virtual currency. In addition, spending behavior varied across the studies among those who did make purchases: although the majority of students from Study B spent most or even all of their earned VC, in Study A and Study C almost half of the earned VC remained unspent. Possible reasons for not spending VC include:
  • Students did not earn sufficient VC to buy desired items.
  • Desired items were time-sensitive, and students missed the deadline for buying them (for example, they might have been allowed to buy a 2-day lab assignment extension no later than one day before the assignment deadline).
  • Students collected VC with the intention of making purchases at the end of the course but then realized that they did not need any of the course benefits offered.
  • Students withdrew in the middle of the course when the course was converted to an online mode because of the COVID-19 pandemic (Study A).
  • Students were told by the instructor that all unspent VC would be applied to their lowest lab, assignment, or project grade, so they knew that they were not actually losing it (Study C).
Concerning the reasons for purchases, the students from Study B (50%) and Study C (60%) gave “worry about performance” as the main reason. These percentages are much higher than the 18% reported for the same reason in Study A. In contrast, Study A reported the highest percentage for “need extra time” (27%), compared to only 4% for the same reason in both Study B and Study C. In a search for possible moderators of the observed differences, we looked at the demographics of the student populations that participated in the studies: in Study B and Study C, European American students were predominant and the universities were private, while in Study A, African American students were predominant and the university was public. Certain cultural and economic specifics and differences could explain some behavioral tendencies of the students. The students in Study B and Study C might have planned more carefully, so they did not need an extension for homework. While most of the students in both Study A and Study C worked full time (for varied reasons), those in Study C were more mature (~67% were in the age group 26–45, and ~17% were older than 46). This may explain why the students from Studies B and C did not favor buying deadline extensions for submitting homework and labs, while those from Study A did.
A good indicator of student engagement in a learning activity that is not required or graded, such as practicing (in our case, taking warm-up challenges), is how regularly and persistently students do it. Our analyses show that in all three case studies, the gamification intervention increased student engagement. The results of Study B and Study C confirmed our expectations by reporting significant increases. A likely explanation for the non-significant increase in student engagement in Study A is the disruption of the normal instructional process by the COVID-19 pandemic: for the experimental group in Study A, the course was moved online in the middle of the semester, which negatively impacted the work of both students and instructors.
While the results of this multi-case study show a significant increase in student engagement in practicing in a learning environment gamified with virtual currency, they do not demonstrate a significant change either in the intrinsic motivation of the students or in their final course grades. We hypothesized that by spending more time practicing and studying, encouraged by the use of virtual currency, students would significantly improve their performance [49]. However, the analysis does not support this hypothesis. There could be different reasons for this result, such as the timing of practicing, knowledge retention, and the degree of similarity between the practice problems offered in OneUp and the graded course quizzes and tests, but one possibility relates directly to the nature of the game element used in this study, virtual currency. Some students may have engaged in practicing just to earn VC: instead of studying before practicing and taking time to analyze and understand their errors when a solution was incorrect, they looked at the correct solution and immediately re-took the challenge to obtain a higher score. Such behavior suggests that for these students, practicing was driven by the desire to gain course benefits rather than by learning. Gaming the system is a well-known problem in educational environments and is not an easy one to prevent. Our future agenda includes work on approaches to preventing such behaviors.
In addition to looking into the relationships between intrinsic motivation and students’ final course grades when using a practicing environment gamified with virtual currency, we were interested in how factors such as school type (HBCU vs. PWI), gender (male, female), or race (AA, EA, etc.) impact the relationship between distinct warm-ups/warm-up attempts taken and final course grades. We collected data related to these factors (see Section 3.3) in order to conduct a cross-institution analysis of their impact. Since the results of the multi-case study did not indicate a strong correlation between students’ intrinsic motivation and their course grades, we did not analyze the impact of those factors on that relationship.
A series of bivariate correlations was run to determine whether there was a significant relationship between grades and warm-up attempts, and between grades and distinct warm-ups, for each of the schools involved in Study A, Study B, and Study C. There was a moderate, positive relationship between the number of warm-up challenges students took and their final course grades for students from Study A, r(28) = 0.55, p = 0.00: participants in Study A who completed more warm-up challenges also earned better end-of-semester grades. The bivariate correlation between grades and the number of warm-up attempts was also significant, r(28) = 0.419; again, the positive valence and moderate magnitude indicated a moderate, positive relationship between participants’ grades and the number of warm-up attempts completed. However, the same relationship was not apparent for Study B or Study C. Next, a series of linear regressions (simple and multiple) was run to determine the strength of the relationships between the predictor variables and the outcome variable (final course grades). The regression indicated a significant effect of warm-ups on grades, such that more warm-up attempts resulted in higher grades, R2 = 0.07, F(26) = 6.65, p = 0.01. We deduced that this relationship might be related to school type, considering the significant bivariate correlations between warm-up attempts, distinct warm-ups, and course grades for Study A’s participants, and thus split the data by institution type (HBCU vs. PWI) and re-ran the analyses. We found that the relationship between warm-ups and grades held for PWI students (b = 0.00, SE = 0.00, p = 0.03 *) but not for HBCU students (b = 0.01, SE = 0.00, p = 0.07).
There was also a significant effect of distinct warm-ups and final course grades such that taking more distinct warm-ups was related to higher end-of-course grades for both HBCU and PWI students, R2 = 0.07, F(26) = 6.47, and p = 0.01.
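The bivariate correlations reported above are plain Pearson correlations between activity counts and grades. The sketch below shows the computation on invented data (the `pearson_r` helper and both lists are illustrative, not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-student warm-up attempt counts and final course grades
attempts = [5, 12, 20, 8, 30, 15, 25, 3]
grades = [70, 78, 85, 74, 92, 80, 88, 65]
print(f"r = {pearson_r(attempts, grades):.2f}")
```

By the usual rule of thumb, |r| values in the 0.4–0.6 range, like those reported for Study A, are conventionally read as moderate positive relationships.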
Next, a multiple regression on the basic psychological needs (autonomy, competence, and relatedness) was run to determine the strength of their relationship to final course grades. Only competence emerged as a significant predictor of final course grades (b = 0.48, SE = 0.19, p = 0.01 *). We then split the data by school type and found that school type (HBCU vs. PWI) did not impact the relationship between BSN and final course grades. However, when the data were split by gender, we found that for men, competence significantly predicted final course grades (b = 0.70, SE = 0.23, p = 0.00).
When the data were split by race, an interesting finding emerged. The number of warm-up challenges predicted final course grades for European Americans (R2 = 0.08, F(26) = 3.79, p = 0.05). However, this was not the case for the number of distinct warm-ups. Logically, engagement in the performance domain is a significant predictor of grades (R2 = 0.14, F(26) = 9.07, p = 0.00); this was significant for PWI students (b = 0.44, SE = 0.14, p = 0.00 *) but not for HBCU students (b = 0.41, SE = 0.40, p = 0.32).
Essentially, the results of this study shed further light on the situational factors that may impact the expected engagement and motivational effect of gamified learning activities.

6. Conclusions

Virtual currency is frequently included in games as a mechanism to increase player engagement and promote enjoyment. However, it is underused for gamifying learning environments and insufficiently studied. The goal of this paper was to investigate learners’ experiences with gamified practicing based on VC and to empirically examine its effect on learners’ engagement and motivational outcomes using cross-case analysis. While this work builds on findings from our previous case studies, it aggregates and synthesizes those findings and provides rich insights into the impact of VC on various behavioral and psychological factors. The cross-evaluation of the prior results enabled the generalization of some effects of VC. The results demonstrated a significant improvement in students’ engagement in gamified practicing, while the impact of VC on students’ intrinsic motivation and academic achievement (measured by final course grades) was found to be insignificant. The latter inconclusive result warrants further empirical studies. The cross-institution analysis also sheds light on the impact of factors such as school type, gender, and race on the relationship between students’ engagement in gamified practicing and their course grades. In addition, the paper discusses the earning and spending behavior of the involved learners in an attempt to deepen our knowledge of this more complex game element. The empirical observations suggest that further research is needed to connect the dots between earning and spending virtual currency in a gamified learning environment and to link them to relevant learning outcomes.
We believe that this study made a significant step forward in enhancing our understanding of the multifaceted effect of virtual currency on learners’ experience in gamified learning environments.

Author Contributions

Conceptualization, D.D., K.I., C.D. and L.C.; formal analysis, D.D. and B.G.; investigation, D.D., K.I., C.D. and L.C.; methodology, D.D.; project administration, D.D.; software, K.I.; validation, B.G.; writing—original draft, D.D., C.D. and B.G.; writing—review and editing, L.C. and K.I.; funding acquisition, D.D. and L.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the US National Science Foundation, grant number #1821189. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the NSF’s views.

Institutional Review Board Statement

All subjects gave their informed consent for inclusion before they participated in the study. The study was conducted in accordance with the Declaration of Helsinki, and the protocol was approved by the Ethics Committee (Study #19-0004, approved on 12 August 2019).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the individual case studies, the results of which are used in the present cross-case review.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy restrictions.

Acknowledgments

We thank the instructors Vassil Yorgov, Wen-Jung Hsin, and Robert Styer, who conducted the three empirical studies in their classes.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Priego, R.G.; Peralta, A.G. Engagement Factors and Motivation in E-Learning and Blended-Learning Projects. In ACM International Conference Proceeding Series; ACM: New York, NY, USA, 2013. [Google Scholar]
  2. Dicheva, D.; Dichev, C.; Agre, G.; Angelova, G. Gamification in education: A systematic mapping study. J. Educ. Technol. Soc. 2015, 18, 75–88. [Google Scholar]
  3. Hamari, J.; Koivisto, J. Why do people use gamification services? Int. J. Inf. Manag. 2015, 35, 419–431. [Google Scholar] [CrossRef]
  4. van Roy, R.; Zaman, B. Unravelling the ambivalent motivational power of gamification: A basic psychological needs perspective. Int. J. Hum.-Comput. Stud. 2018, 127, 38–50. [Google Scholar] [CrossRef]
  5. Dichev, C.; Dicheva, D. Gamifying education: What is known, what is believed and what remains uncertain: A critical review. Int. J. Educ. Technol. High. Educ. 2017, 14, 1–36. [Google Scholar] [CrossRef] [Green Version]
  6. Hamari, J.; Koivisto, J.; Sarsa, H. Does gamification work?—A literature review of empirical studies on gamification. In Proceedings of the 47th Hawaii International Conference on System Sciences, Waikoloa, HI, USA, 6–9 January 2014; p. 2914. [Google Scholar]
  7. Nacke, L.E.; Deterding, S. The maturing of gamification research. Comput. Hum. Behav. 2017, 71, 450–454. [Google Scholar] [CrossRef] [Green Version]
  8. Van Roy, R.; Deterding, S.; Zaman, B. Uses and Gratifications of Initiating Use of Gamified Learning Platforms. In Proceedings of the CHI’18 Extended Abstracts, ACM CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018. [Google Scholar]
  9. Loboda, T.D.; Guerra, J.; Hosseini, R.; Brusilovsky, P. Mastery grids: An open-source social educational progress visualization. In Proceedings of the 2014 Conference on Innovation & Technology in Computer Science Education, Uppsala, Sweden, 21–25 June 2014. [Google Scholar]
  10. Garaialde, D.; Cox, A.L.; Cowan, B.R. Designing gamified rewards to encourage repeated app selection: Effect of reward placement. Int. J. Hum.-Comput. Stud. 2021, 153, 102661. [Google Scholar] [CrossRef]
  11. Dichev, C.; Dicheva, D.; Ismailova, R. Motivators Matter When Gamifying Learning Activities. In Proceedings of the 25th International Conference on Interactive Collaborative Learning (ICL), Wien, Austria, 27–30 September 2022. [Google Scholar]
  12. Ryan, R.M.; Deci, E.L. Intrinsic and extrinsic motivations: Classic definitions and new directions. Contemp. Educ. Psychol. 2000, 25, 54–67. [Google Scholar] [CrossRef]
  13. Doolin, B. Alternative Views of Case Research in Information Systems. Australas. J. Inf. Syst. 1996, 3, 383. [Google Scholar] [CrossRef] [Green Version]
  14. Eisenhardt, K.M. Building Theories from Case Study Research. Acad. Manag. Rev. 1989, 14, 532–550. [Google Scholar] [CrossRef]
  15. Bass, J.M.; Beecham, S.; Noll, N. Experience of Industry Case Studies: A Comparison of Multi-Case and Embedded Case Study Methods (CESI’18). In Proceedings of the 6th International Workshop on Conducting Empirical Studies in Industry, Gothenburg, Sweden, 27 May–3 June 2018. [Google Scholar]
  16. Dicheva, D.; Guy, B.; Yorgov, V.; Dichev, C.; Irwin, K.; Mickle, C. A study of using virtual currency in a discrete mathematics course. In Proceedings of the 2021 IEEE Global Engineering Education Conference (EDUCON, 2021), Online, 21–23 April 2021. [Google Scholar]
  17. Dicheva, D.; Hsin, W.J.; Dichev, C.; Guy, B.; Cassel, L.; Irwin, K. Exploring the effect of virtual currency on learners engagement. In Proceedings of the International Conference on Advanced Learning Technologies (ICALT), Online, 12–15 July 2021; pp. 83–87. [Google Scholar]
  18. Dicheva, D.; Cassel, L.; Styer, R.; Dichev, C.; Guy, B.; Irwin, K. An Empirical Study of the Effects of Virtual Currency on Learners in Out of Class Practicing. In Proceedings of the 17th European Conference on Technology Enhanced Learning (EC-TEL), Toulouse, France, 12–16 September 2022. [Google Scholar]
  19. Sailer, M.; Homner, L. The gamification of learning: A meta-analysis. Educ. Psychol. Rev. 2020, 32, 77–112. [Google Scholar] [CrossRef] [Green Version]
  20. Albertazzi, D.; Gomes-Ferreira, M.G.; Forcellini, F.A. A wide view on gamification. Technol. Knowl. Learn. 2019, 24, 191–202. [Google Scholar] [CrossRef]
  21. Dicheva, D.; Irwin, K.; Dichev, C. OneUp: Engaging students in a gamified data structures course. In Proceedings of the 50th ACM SIGCSE Conference, Minneapolis, MN, USA, 27 February–2 March 2019. [Google Scholar]
  22. Virtual Currency. Available online: https://uk.nice.com/glossary/virtual-currency (accessed on 15 January 2023).
  23. Ortega-Arranz, A. Supporting Practitioners in the Gamification of MOOCs through Reward-Based Strategies. Ph.D. Thesis, Universidad de Valladolid, Valladolid, Spain, 2021. [Google Scholar]
  24. O’Donovan, S.; Gain, J.; Marais, P. A case study in the gamification of a university-level games development course. In Proceedings of the South African Institute for Computer Scientists and Information Technologists Conference, East London, South Africa, 7–9 October 2013; pp. 242–251. [Google Scholar]
  25. Vassileva, J.; McCalla, G.I.; Greer, J.E. From small seeds grow fruitful trees: How the PHelpS peer help system stimulated a diverse and innovative research agenda over 15 years. Int. J. Artif. Intell. Educ. 2016, 26, 431–447. [Google Scholar] [CrossRef] [Green Version]
  26. Lopes, R.P. An award system for gamification in higher education. In Proceedings of the 7th International Conference of Education, Research and Innovation (ICERI 2014), Seville, Spain, 17–19 November 2014. [Google Scholar]
  27. de Jesus, G.M.; Ferrari, F.C.; Paschoal, L.N.; Souza, S. Is It Worth Using Gamification on Software Testing Education? In Proceedings of the XVIII Brazilian Symposium on Software Quality (SBQS’ 19), Fortaleza, Brazil, 28 October–1 November 2019. [Google Scholar]
  28. Duolingo. Available online: https://www.duolingo.com/ (accessed on 15 January 2023).
  29. Munday, P. The case for using Duolingo as part of the language classroom experience. Rev. Iberoam. Educ. Distancia 2016, 19, 83–101. [Google Scholar] [CrossRef] [Green Version]
  30. Tenório, M.M.; Lopes, R.P.; Góis, L.A.; Junior, G. Design and Evaluation of a Gamified e-Learning System for Statistics Learning Activities. Lit. Inf. Comput. Educ. J. 2019, 10, 3078–3085. [Google Scholar] [CrossRef]
  31. Chan, J.W.; Wei, H.Y. Exploring engaging gamification mechanics in massive online open courses. Educ. Technol. Soc. 2016, 19, 177–203. [Google Scholar]
  32. Ortega-Arranz, A.; Kalz, M.; Martınez-Mones, M. Creating engaging experiences in MOOC through in-course redeemable rewards. In Proceedings of the IEEE Global Engineering Education Conference, Santa Cruz de Tenerife, Spain, 17–20 April 2018. [Google Scholar]
  33. Snow, E.L.; Allen, L.K.; Jackson, G.T.; McNamara, D.S. Spendency: Students’ Propensity to Use System Currency. Int. J. Artif. Intell. Educ. 2015, 25, 407–427. [Google Scholar] [CrossRef] [Green Version]
  34. Dicheva, D.; Irwin, K.; Dichev, C. OneUp: Supporting Practical and Experimental Gamification of Learning. Int. J. Serious Games 2018, 5, 5–21. [Google Scholar] [CrossRef]
  35. Deci, E.L.; Ryan, R.M. Self-Determination Theory. In International Encyclopedia of the Social & Behavioral Sciences; Wright, J., Ed.; Elsevier: Amsterdam, The Netherlands, 2015; pp. 486–491. [Google Scholar]
  36. Cerasoli, C.P.; Nicklin, J.M.; Ford, M.T. Intrinsic motivation and extrinsic incentives jointly predict performance: A 40-year meta-analysis. Psychol. Bull. 2014, 140, 980–1008. [Google Scholar] [CrossRef] [Green Version]
  37. Vansteenkiste, M.; Niemiec, C.P.; Soenens, B. The development of the five mini-theories of self-determination theory: An historical overview, emerging trends, and future directions. Adv. Motiv. Achiev. 2010, 16, 105–165. [Google Scholar]
  38. Seaborn, K.; Fels, D.I. Gamification in theory and action: A survey. Int. J. Hum.-Comput. Stud. 2015, 74, 14–31. [Google Scholar] [CrossRef]
  39. Deci, E.; Ryan, R.; Gagné, M.; Leone, D.; Usunov, J.; Kornazheva, B. Need satisfaction, motivation, and well-being in the work organizations of a former Eastern bloc country: A cross-cultural study of self-determination. Personal. Soc. Psychol. Bull. 2001, 27, 930–942. [Google Scholar] [CrossRef] [Green Version]
  40. Ilardi, B.C.; Leone, D.; Kasser, T.; Ryan, R.M. Employee and supervisor ratings of motivation: Main effects and discrepancies associated with job satisfaction and adjustment in a factory setting. J. Appl. Soc. Psychol. 1993, 23, 1789–1805. [Google Scholar] [CrossRef]
  41. Kasser, T.; Davey, J.; Ryan, R. Motivation and employee-supervisor discrepancies in a psychiatric vocational rehabilitation setting. Rehabil. Psychol. 1992, 37, 175–188. [Google Scholar] [CrossRef]
  42. Deci, E.L.; Eghrari, H.; Patrick, B.C.; Leone, D. Facilitating internalization: The self-determination theory perspective. J. Personal. 1994, 62, 119–142. [Google Scholar] [CrossRef]
  43. Handelsman, M.M.; Briggs, W.L.; Sullivan, N.; Towler, A. A Measure of College Student Course Engagement. J. Educ. Res. 2005, 98, 184–192. [Google Scholar] [CrossRef]
  44. Soto, C.J.; John, O.P. Short and extra-short forms of the Big Five Inventory–2: The BFI-2-S and BFI-2-XS. J. Res. Personal. 2017, 68, 69–81. [Google Scholar] [CrossRef]
  45. Growth Mindset Scale. The Stanford University Project for Education Research That Scales. Available online: http://sparqtools.org/mobility-measure/growth-mindset-scale/#all-survey-questions (accessed on 15 January 2023).
  46. Eccles, J.S.; Wigfield, A. Motivational beliefs, values, and goals. Annu. Rev. Psychol. 2002, 53, 109–132. [Google Scholar] [CrossRef] [Green Version]
  47. Hulleman, C.S.; Durik, A.M.; Schweigert, S.A.; Harackiewicz, J.M. Task value, achievement goals, and interest: An integrative analysis. J. Educ. Psychol. 2008, 100, 398–416. [Google Scholar] [CrossRef]
  48. Meyer, J.; Fleckenstein, J.; Köller, O. Expectancy value interactions and academic achievement: Differential relationships with achievement measures. Contemp. Educ. Psychol. 2019, 58, 58–74. [Google Scholar] [CrossRef]
  49. Louw, J.; Muller, J.; Tredoux, C. Time-on-task, technology and mathematics achievement. Eval. Program Plan. 2008, 31, 41–50. [Google Scholar] [CrossRef]
Table 1. Demographic information.

| Indicator | | Case Study A | Case Study B | Case Study C |
|---|---|---|---|---|
| # Students | Experimental group | 21 | 49 | 21 |
| | Comparison group | 19 | 33 | 28 |
| Gender | Male | 70.8% | 62.5% | 86% |
| | Female | 29.2% | 37.5% | 14% |
| Race | European American/White | 20.8% | 75.1% | 61.1% |
| | African American/Black | 54.2% | 4.5% | 11.1% |
| | Mexican/Hispanic/Latin | 4.2% | 15.9% | 11.1% |
| | Asian American | | 4.5% | 5.6% |
| | Other | 20.8% | | 11.1% |
| Age | 18–25 age range | 80% | 100% | 16.7% |
| | 26–35 age range | 20% | | 38.9% |
| | 36–45 age range | | | 27.7% |
| | 46 years of age or older | | | 16.7% |
| Major | Computer Science/IT | 58.3% | 34.1% | 85.7% |
| | Mathematics | 33.3% | 15.9% | |
| | Other | 8.4% | 50% | 14.3% |
Table 2. Summary of the VC earning and spending transactions for the case studies.

| | Case Study A | Case Study B | Case Study C |
|---|---|---|---|
| Total # earning transactions (ET) | 554 | 774 | 693 |
| % Students with 0, 1, or 2 ET | 17% | 16% | 30% |
| Largest groups by # ET taken | 20–40 ET (31%), 41–60 ET (26%) | 11–20 ET (27%), 21–30 ET (30%) | 21–50 ET (22%), 51–100 ET (37%) |
| How the majority earned VC | Solving a challenge with score ≥ 90 (55%); taking a new challenge with score > 70 (34%) | 5 challenges in 1 topic > 70% correct (51%); 5 challenges in 1 topic > 85% correct (37%) | 5 challenges in 1 topic ≥ 90% correct (40%); new challenge > 70% correct (35%) |
| % Students with earned VC but no spending transactions (ST) | 33% | 16% | 37% |
| Favorite purchases | Extra point for a test; extension to HW deadline; retake of a test problem | Extra point on an exam; 5 points on the final exam; add 10% to a HW grade | Drop the lowest lab grade; drop the lowest quiz grade; skip a post or a peer response in a discussion |
| % Students per reason to spend VC: need extra time | 27% | 4% | 4% |
| % Students per reason: worry about performance | 18% | 50% | 60% |
| % Students per reason: has VC to spend | 22% | 46% | 18% |
| % Students per reason: prefer not to say | 33% | 20% | 18% |
Table 3. Summary of the warm-up challenges taken by the experimental and comparison groups.

| | A: Comparison (19) | A: Experimental (21) | B: Comparison (33) | B: Experimental (49) | C: Comparison (28) | C: Experimental (21) |
|---|---|---|---|---|---|---|
| # Unique warm-ups taken | 242 | 343 | 985 | 3674 | 198 | 544 |
| Average # unique warm-ups taken | 12.73 | 16.33 | 29.85 | 74.98 | 7.07 | 25.90 |
| Total # attempts | 507 | 746 | 1384 | 6485 | 369 | 1108 |
| Average # attempts taken | 26.68 | 35.52 | 41.93 | 132.35 | 13.17 | 52.76 |
| Average # attempts per student | 28.16 | 39.26 | 65.90 | 162.12 | 11.90 | 46.17 |
| Largest groups (% students) with # attempts in specified intervals | 44% in (11–30), 50% in (31–50) | 26% in (31–50), 27% in (51–170) | 44% in (1–50) | 23% in (51–100), 23% in (101–150) | 18% in (11–30) | 62% in (30–70), 24% > 100 |
Table 4. Results of the performed paired-samples t-test.

| Dependent Variable | Case Study | Pre-Test M (SD) | Post-Test M (SD) | t | p |
|---|---|---|---|---|---|
| Autonomy | A | 4.70 (0.53) | 4.96 (0.73) | −1.18 | 0.26 |
| | B | 4.52 (0.64) | 4.68 (0.79) | −1.10 | 0.28 |
| | C | 4.60 (0.79) | 4.85 (0.87) | −1.67 | 0.11 |
| Competence | A | 5.04 (0.76) | 4.83 (0.73) | 1.25 | 0.23 |
| | B | 4.91 (0.67) | 5.00 (0.76) | −0.85 | 0.40 |
| | C | 4.99 (0.77) | 5.14 (1.19) | 1.25 | 0.48 |
| Relatedness | A | 4.22 (0.72) | 4.26 (0.88) | −0.40 | 0.69 |
| | B | 4.25 (0.74) | 4.56 (1.00) | −2.29 | 0.02 |
| | C | 4.22 (0.72) | 4.30 (1.20) | −0.36 | 0.72 |
Table 5. Results of the performed regression analyses.

| Predictor Variable | Case Study (Model) | Zero-Order Correlation w/ Grades | Beta | t | p |
|---|---|---|---|---|---|
| Value/Usefulness | A (stepwise) | −0.56 | −0.33 | 5.08 | 0.00 * |
| | B (simple) | 0.12 | −0.08 | −0.68 | 0.50 |
| | C (simple) | 0.38 | 0.36 | 1.12 | 0.28 |
| Interest/Enjoyment | A (stepwise) | −0.40 | 0.08 | 0.25 | 0.81 |
| | B (simple) | 0.04 | −0.02 | −0.21 | 0.83 |
| | C (simple) | 0.41 | 0.21 | 1.78 | 0.09 |
| Perceived Choice | A (stepwise) | −0.49 | −0.32 | −1.41 | 0.18 |
| | B (simple) | 0.27 | −0.15 | −1.50 | 0.13 |
| | C (simple) | −0.30 | −0.15 | −1.24 | 0.23 |

* p < 0.05