
Multimodal Technologies Interact. 2019, 3(3), 49; https://doi.org/10.3390/mti3030049

Communication
Socrative in Higher Education: Game vs. Other Uses
Faculty of Education of Toledo, University of Castilla-La Mancha, 13071 Ciudad Real, Spain
* Author to whom correspondence should be addressed.
Received: 15 May 2019 / Accepted: 4 July 2019 / Published: 6 July 2019

Abstract:
The integration of clickers in Higher Education settings has proved to be particularly useful for enhancing motivation, engagement and performance; for developing cooperative or collaborative tasks; for checking understanding during the lesson; or even for assessment purposes. This paper explores and exemplifies three uses of Socrative, a mobile application specifically designed as a clicker for the classroom. Socrative was used during three sessions with the same group of first-year University students at a Faculty of Education. One of these sessions—a review lesson—was gamified, whereas the other two—a collaborative reading activity seminar, and a lecture—were not. Ad-hoc questionnaires were distributed after each of them. Results suggest that students welcome the use of clickers and that combining them with gamification strategies may increase students’ perceived satisfaction. The experiences described in this paper show how Socrative is an effective means of providing formative feedback and may actually save time during lessons.
Keywords:
Socrative; gamification; clickers; English language teaching; Higher Education; formative feedback

1. Introduction

Clickers have become a very common tool in educational contexts, since they offer several advantages while increasing students’ interest. They are Personal Response Systems (PRS) that provide instant feedback and can be used in the classroom to assess students’ opinions, to encourage interaction or simply to keep attention during long lectures. Clickers are especially useful with large groups, as they make teachers aware of students’ pace in content acquisition and comprehension. Apart from PRS hardware such as remote devices, different applications have been designed to function as clickers following a BYOD (Bring Your Own Device) practice [1], which makes the experience easier to carry out in most educational contexts, where tablets, PCs or smartphones are not scarce among students.
Many scholars highlight the increase in motivation and interactivity in the classroom with the implementation of PRSs [2]. Technology use usually correlates with an increase in student engagement [3] and, in fact, these tools can be a successful way to increase students’ performance while engaging them in the learning process [4,5,6].
Real-time responses ensure an interactive atmosphere between peers and with the teacher. Consequently, students can benefit from the advantages of immediate “formative feedback” [7], which is one of the main assets of these tools.
The clicker used for the experiences described in this paper was Socrative, a mobile application created to provide educational support through a real-time question/answer system. Our main aim was to design different sessions where Socrative could be implemented to enhance University students’ learning outcomes in a course on English Language and its Didactics.
Some weaknesses, regarding both linguistic and methodological contents, were detected among our students. This application was suitable for designing specific lesson plans that offered our students different learning opportunities to overcome these shortcomings. Furthermore, our study aimed to address the relevance of clickers for each of the sessions and to check whether students noticed an improvement in the teaching-learning process, especially regarding the problems tackled in each session. One of the sessions incorporated Socrative into a cooperative review game, while the other two (a lecture and a collaborative reading task) did not include gamified elements, which enabled further comparison between sessions with game-based and non-game-based support. The three uses were examined and analyzed separately with the assistance of independent anonymous questionnaires distributed after each session.
The exploration of the uses of Socrative described here may acquaint teachers with some of the functions this tool can fulfil in the classroom. The methodological proposals described might be of special interest to other language teachers although, regardless of the discipline, they could be replicated in any Higher Education setting. From a research perspective, the main question addressed in this study arose from the fact that clickers, as discussed in the next section, have sometimes been regarded as a form of gamification per se. We aimed to check whether their sheer use, without either adding any proper gamification elements or turning to those many of them offer, was enough for perceived satisfaction to match that of a gamified experience.
This paper starts with an up-to-date state-of-the-art review on clickers and gamification (Section 2), and then moves on to a description of Socrative itself. Section 4 offers three replicable examples of how this clicker may be used in Higher Education settings to promote interactivity and multidirectional formative feedback. Our quantitative research methodology is explained in Section 5; its results are presented in Section 6 and further discussed in Section 7. Finally, the conclusion closes the article by summarizing the main findings, contributing to answering our research question.

2. Related Work

This section explores previous work describing different experiences with clickers and how they are related to gamification theories.

2.1. Clickers

Clickers have become very common tools in educational contexts. PRSs were already available as remote control devices at the turn of the century [8], offering students the chance to play a more active role in the classroom. This wireless technology is still productive in classrooms and research [9,10]. In the last decade, several mobile applications that can be used as clickers have proliferated, such as Kahoot, Socrative, Quizlet or Quizizz, to name just a few. Their availability as (in most cases) free tools for both teachers and students facilitates their incorporation into everyday classroom practice, especially in Higher Education contexts. Comparisons between remote devices and BYODs reveal no major differences when either is used for the same purpose [11].
Several review articles focusing on the existing literature on clicker use in different disciplines highlight the main areas of concern in the studies under observation. Thus, Keough [12] identifies eight main aspects, all of them with positive results: actual performance, satisfaction, perceived performance, attention span, attendance, participation, feedback and ease of use. Similarly, Boscardin and Penuel [13] summarize the types of outcomes reported in previous studies in four major areas, two of them with contrasted positive outcomes, namely learner engagement and peer instruction/interaction and discourse opportunity, while the remaining two, knowledge gain and formative assessment, would still pose some challenges for scholars. A more recent review article [14] observes the following themes as the most relevant ones in the majority of the studies it analyzes: engagement, participation, learning experience, performance, satisfaction, feedback, attendance and attitude. Although most studies report these factors as advantages of clickers, a few regard some of them as disadvantages in their particular contexts.
The literature records a wide catalogue of uses of clickers in the classroom, which can be easily implemented for different purposes. Some methodological approaches include the development of cooperative or collaborative learning strategies, for which clickers can enhance the learning environment [5,6,15]. Success in the application of this approach could be related to some of the advantages mentioned, since, as noted by some scholars, students’ enjoyment of activities using PRSs contributes to their active participation in collaborative learning [16]. On the other hand, the use of mobile applications in collaborative learning may bring about potential handicaps such as distraction and consequent disengagement [17]. Another caveat when using clickers for multiple-choice tasks in collaborative learning concerns the avoidance of pure guessing, a problem not easily solved even with penalties for guessing [18].
Different types of assessment can resort to clickers in order to make the process less time-consuming for both teachers and students. One of them is formative assessment, which usually concerns how the feedback provided to students is used to improve the teaching-learning process and the acquisition of contents. The benefits of feedback for formative assessment are well attested [19]; however, this type of assessment is often challenging when teachers need to check understanding during the lesson, especially in large groups [20], and it can also be difficult to involve students in the feedback process [21]. Formative feedback through clickers increases students’ awareness of content understanding while providing them with a sense of active participation [20,22]. Conversely, many of the benefits observed in large groups might not be generalizable to smaller groups [23].
Another related function of clickers can be live peer-assessment in the classroom, after oral presentations for instance [24]. Students learn to give more objective marks when they use clickers for this purpose repeatedly, and in large groups there are no significant differences between marks given by peers and teachers over time [25].
Some simpler tools and materials, such as cards with Plickers, could be used for similar assessment purposes; however, applications like Socrative simplify formative assessment to a greater extent, since immediate feedback can be included for each question, or the teacher can decide how to provide it depending on real-time results. Additionally, they enable the recording of data from students’ answers (named or anonymous), so teachers can still go back to the questions that posed most difficulties.
Clickers have also been suggested as a support for innovative classroom approaches, as in the case of flipped or blended learning environments [26,27,28,29]. These tools can be used in different ways in those contexts, as individual or group assessment tools before, during or after class [28]. Students favor the introduction of clickers in flipped learning, and positive effects include an increase in both engagement and performance [29].
Although clickers have been more commonly used in so-called “hard sciences” [12], they have also been tested in English language teaching in different contexts and for varied purposes. These tools increase engagement and content comprehension [30,31], can contribute to enhancing the learning experience in writing courses [32,33,34], help increase peer discussion and metacognitive awareness of English language learning [35] and support the above-mentioned flipped classroom methodology [26,29].
Previous studies have also made use of Socrative with future Pre-School or Primary School teachers [36,37,38,39], who can thus benefit from experiencing as learners some methodological approaches they will be able to apply in their professional practice.

2.2. Gamification

The gamified strategies that can be introduced when using clickers represent one of their most engaging features. Several experts consider the use of these devices in the classroom as gamified activities [29,40,41] or even define them as gamification tools [42,43], as their methodological use may turn a lecture into a game.
One of the earliest definitions of gamification applied to education regards it as “a careful and considered application of game thinking to solving problems and encouraging learning using all the elements of games that are appropriate” [44] (pp. 15–16). Motivation and engagement are commonly mentioned as the main advantages of applying gamification features to classroom practice [45,46,47]. These benefits are usually associated with the novelty of the tools and techniques used, but positive effects, although to a lesser extent, are still present after the novelty wears off [48].
Gaming in education and gamification have often been associated with mobile learning [39,49,50,51] and online environments [52,53,54], although simpler arrangements can be applied with or without technology. The most frequent gamification strategies are points, badges and leaderboards, which are easy to adapt to most classroom activities [44,47,55]. Points are given for students’ correct answers and can be made public or not, depending on the learning aims. Similarly, badges can be awarded for a special achievement. When any of these is made public, leaderboards can be used to show students’ performance compared to that of others. Gamified activities with clickers usually resort to points and leaderboards, which are provided by applications like Socrative or Kahoot [29,48].
Other techniques that can be implemented in the classroom include time restrictions, such as countdowns, which can also be introduced in competitive games and can be useful for classroom management purposes. Countdown timers are available in Kahoot quizzes and in Socrative’s paid versions. More complex features, such as avatars, progress bars or levels, are linked to the use of online platforms. Teachers should be cautious when using strategies that can affect students’ performance, such as penalties or countdowns [55].
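The points-and-leaderboard mechanic described above is simple enough to sketch in a few lines. The following Python snippet is purely illustrative (the class and team names are our own invention, not part of Socrative or Kahoot, which implement this logic internally):

```python
# Minimal sketch of the points-and-leaderboard mechanic.
# All names are illustrative, not an actual clicker API.

from collections import defaultdict

class Leaderboard:
    def __init__(self):
        # Each team starts implicitly at zero points.
        self.points = defaultdict(int)

    def award(self, team, correct, value=1):
        """Add `value` points when a team answers correctly."""
        if correct:
            self.points[team] += value

    def ranking(self):
        """Teams sorted by points, highest first (what a wall chart shows)."""
        return sorted(self.points.items(), key=lambda kv: -kv[1])

board = Leaderboard()
board.award("Team A", correct=True)
board.award("Team B", correct=False)  # wrong answer: no points
board.award("Team A", correct=True)
board.award("Team B", correct=True)
print(board.ranking())  # → [('Team A', 2), ('Team B', 1)]
```

Making `ranking()` public or keeping individual scores private corresponds to the pedagogical choice discussed above of whether points and badges are displayed to the whole class.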
Gamified experiences using clickers in Higher Education are abundant; many of them concern the use of the tool during lectures [56], either at specific times of the year when given contents need to be reinforced [42] or for longer periods during which clickers are used at every lecture [48]. Given their quiz format, clickers can support different kinds of games in the classroom: they may become a platform for playing board games, such as content revision games, or they can assist more specific activities, such as an “ask the audience” lifeline or polling and voting for different options, for instance to choose the best answer. Teachers can obtain both individual and whole-class reports containing students’ answers.
The rules of the games should be clearly stated at the beginning of each activity, providing students with some examples of the tasks or questions when the dynamics are complex. The design of games for educational purposes should take several elements into account, such as the methodological framework and how the game should be adapted to the learning needs and the context. Teachers are usually aware of whether competitive or collaborative strategies work in a given group, and activities should therefore include gamified elements that create a positive atmosphere and contribute to making students feel comfortable while playing an active role in their learning.

3. Socrative

Socrative is a mobile app that offers different Teacher and Student versions. In this section we will focus on how teachers can use it. In order to start using it with our students, we must register on www.socrative.com [57]. We will have to choose between signing up for the free plan or paying a fee for one of the two so-called “pro” plans which Socrative has offered since 2017—one for K-12 teachers and one for Higher Education. The free version allows us to work with 1 room, 50 students and 1 activity at a time, whereas the pro plans allow us to use 10 rooms and up to 10 activities at once. The Higher Education version admits up to 150 students per session.
Once we have registered, we can download the application onto our mobile device or use it from a computer. A classroom code will always appear at the top of the screen. We will have to give this code to our students so they can “attend” our session. The free version only offers one classroom per account, whose default code we can modify or simplify before giving it to our students.
Socrative offers a wide range of possibilities to create tasks: Different ways of managing the activity (teacher paced or students working at their own pace, questions appearing in a pre-established or random order), combining different types of questions (true/false, multiple choice, short answer) and inserting an image for each question. There is no limit to the number of questions per activity or quiz. It also allows us to store our quizzes and share them with other teachers, a very useful option when we want to carry out a simultaneous session in different rooms.
Combining the Teacher-Paced mode (the teacher launches questions and controls from his/her screen what students see) and the “How’d we do?” option allows the teacher to easily visualize students’ performance from his/her mobile device or computer—how many students have answered, what they have answered, the percentages of right and wrong answers, etc. As wrong answers appear in red and right ones in green, we are able to spot difficulties at a single glance and provide students with immediate formative feedback, which is particularly useful when working with numerous groups, where participation may otherwise be limited.
Other Socrative options may also be very interesting: “Space Race” to organize competitive group quizzes; “Quick Question” to launch questions during the lesson and receive every student’s immediate answer; “Create Quiz” to create our own Quiz; “My Quizzes” to reuse already created ones; “Import Quiz” to use quizzes previously uploaded by other teachers. Finally, the option “Exit Ticket” allows teachers to get feedback from students at the end of a lesson.
As mentioned above, Socrative can also be used for assessment purposes. After a session, we can receive an Excel report on our email account. This shows each student or group’s answers for each question. In smaller groups, teachers can send each of their students individual pdfs including questions and their own answers, which can be used for reviewing or recording purposes. Another possibility would be to print our Socrative activity and hand it out as a traditional test in the classroom.

4. Socrative for the University Classroom: Three Examples

4.1. Participants

We aim to compare three experiences where Socrative was used during the same academic year and with the same participants: 68 first-year University students from the Degree in Pre-School Teaching, enrolled in an English Language and its Didactics course at the Faculty of Education in Toledo, University of Castilla-La Mancha, Spain. The number of those who completed questionnaires varied for each activity: n = 52 (Activity 1); n = 46 (Activity 2); n = 44 (Activity 3). Our objective with this course is twofold. On the one hand, students should reach an intermediate (B1) level of English [58]. On the other hand, they learn how to teach this language to Pre-School pupils. The way students are grouped varies depending on the type of session: the 68 students are together in the classroom for two-hour plenary sessions, whereas they are subdivided into smaller groups for one-hour seminars.

4.2. Experiences

Socrative teacher-paced quizzes were designed and implemented as formative assessment tools for different methodological purposes. Two of them did not include any gamified element other than the use of the clicker (a collaborative reading task in the first experience, and checking understanding after theoretical explanations during a lecture in the second), while the third (a content review cooperative game purposely designed for a whole session) was game-based. In all cases, activities were designed to ensure the simultaneous participation of all the students in the classroom, which was possible with the aid of Socrative. Table 1 summarizes the main features of the three experiences.

4.2.1. Activity 1: Collaborative Reading

Current pedagogic trends, like the communicative approach or even CLIL, tend to focus on other skills (namely oral ones) to the detriment of reading, which may moreover be perceived as time-consuming and difficult to fit in by teachers under syllabus pressure.
Students’ capacity to concentrate on horizontal reading has been limited by vertical reading strategies resulting from screen reading habits. We fully agree that “Internet technology has had a significant impact upon reading strategies, resulting in a need to reshape our thinking about classroom reading practice” [59] (p. 662).
Having noticed (in previous years) not only that our students were apathetic about the reading skill, but also that their reading exam results were worryingly poor, we decided to look for original ways to encourage their interest in a skill they found difficult.
Profiting from the 400th anniversaries of Shakespeare and Cervantes in 2016, we designed a collaborative reading task regarding both authors: we selected a compilation of online texts dealing with the Cervantes and Shakespeare commemorations. All texts were real, unadapted materials, carefully chosen to guarantee that their complexity was not excessive for B1-level students. As noted by other scholars [60], real texts can be used with different proficiency levels for tasks that require different reading skills, such as scanning or skimming [61]. Based on these texts, and bearing in mind the most common formats of reading exam questions, both true/false and multiple-choice questions were produced, designed in such a way that students could practice the reading techniques (scanning, skimming or intensive reading) they usually need to solve reading exam questions.
Although the reading skill is normally practiced individually (collaborative methodology is more often linked to writing [62]), we made students work in groups of three to allow them to share their reading techniques with their peers and to notice the different reading strategies they should use depending on text and question types.
During a one-hour seminar, students had to answer 14 questions regarding 9 different texts, which varied in length depending on the reading type needed to solve each question. Each group of three students received a printed copy of the 9 texts. Questions were projected one at a time, together with a time limit, using a PowerPoint presentation. The same questions, together with their possible answers, were only available to students on the Socrative session (only one per group), which they accessed from their mobile devices. After having read the text, and while it was still in front of them, students had time to debate and choose an answer, which made them explain and justify their choice to their group mates, activating their metacognitive strategies: it was not just a matter of doing the task; students were also reflecting on how they were doing it.
Data provided by Socrative enabled us to check, more reliably than classroom and exam observation, which strategies were more difficult for our students. We were also able to tackle these problems immediately, insisting on how different text types should be approached by means of different reading strategies.

4.2.2. Activity 2: Lecture

The syllabus for this course includes a specific section on teaching English to very young learners, which comprises legislation, teaching materials and methodologies, and language learning theoretical approaches. These contents are usually taught during two-hour lectures. In our experience, these long theoretical sessions pose challenges for both lecturers and students. The former may find it difficult to detect whether explanations are sufficient or whether students have understood all the topics and concepts explained, and with large groups it is not easy to know when students are paying attention. For the latter, problems include keeping attention over long periods of time or distinguishing the level of importance of the different contents seen during the session.
We decided to use Socrative during a plenary lecture to reinforce certain contents which had obtained poor exam results in previous years. In order to do so, contents were divided into blocks of similar length. Questions were posed after each block with the “Quick Question” option in Socrative. Students were asked to work in pairs or groups of three so that they could discuss the questions and the possible answers. They were given some time to decide and enter their definitive answer. In this way, attention was promoted and the lecturer could easily verify whether students had grasped the main ideas of the session. For questions that received some incorrect replies, immediate feedback was provided with new explanations and examples, thus guaranteeing a second chance for students with difficulties.
Using the “Quick Question” option during a plenary lecture was not time-consuming. Other than carefully selecting the contents we wanted to emphasize and thinking of the questions we would ask our group (which we would have had to do anyway when planning a session without Socrative), there was nothing to prepare beforehand. The actual questions were written on the board during the lesson and students answered them from their mobile devices, which turned out to be much more effective than the lecturer simply asking the whole group if they had understood.

4.2.3. Activity 3: Cooperative Review Game

With the aim of helping students to keep up with contents and to discriminate the most relevant ones, review lessons are planned several times a year, usually at the end of units or, as in this case, at the end of the semester. They are announced in advance and students are asked to revise for them. This session took place just before the Christmas break. It combined Socrative with a cooperative learning gamified system already described in a previous work [63].
A Socrative teacher-paced quiz consisting of 35 multiple-choice questions was used to review main language contents seen during the first semester (grammar, vocabulary, pronunciation, etc.). Multiple-choice questions are included in the Use of English part of the final exam for this subject, so these review games are an enjoyable way for our students to become familiar with a format they will have to face at the end of the year.
All questions were linked to The Jolly Christmas Postman [64], a classic and interactive picture book which is very rich in references to children’s literature (fairy tales and nursery rhymes) as well as cultural content regarding Christmas celebrations in the English-speaking world.
Students worked cooperatively in teams of 4 members. They used one mobile device per team. At the beginning of the lesson, to ensure the participation of every member, each team received an identical set of four color stickers to distribute so that each member had a different color. We started reading the book together, stopping to ask quiz questions (for example, how a certain word is pronounced). When a question was launched, teams were given a set time for in-group discussion, but a certain member (with the color indicated by the lecturer) was in charge of entering the answer on Socrative each time. Answers were checked, awarding teams a point for each correct answer, updating a simple leaderboard on the wall and providing immediate formative feedback if mistakes had arisen.
This game thus combined English children’s literature with purely linguistic aims. These future Pre-School teachers discovered and manipulated an interactive picture book and competed in a gamified way, collaborating with their team to gain points towards a symbolic prize.

5. Methodology

Ad-hoc quantitative questionnaires were designed considering the relevance, logical sequence and distinctness of each item, in order to gather students’ feedback on each session for our teaching purposes. They were discussed with two experienced lecturers and researchers in the Department. They were then piloted with another group of students in the same subject during the previous academic year, whose understanding of each question was checked orally. A few questions were removed and others modified accordingly, taking into account feedback from both students and lecturers, thus clarifying questions which might have been confusing for respondents, or not specific or simple enough.
In total, participants completed three anonymous paper questionnaires, one distributed immediately after each session. In order to make them comparable, and although each questionnaire focused on different aspects so as to gather information on the issues most relevant to each activity, some 4-point Likert-scale questions (0 = not at all; 3 = very much) on the usefulness of the application remained identical across sessions.
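The per-question mean scores reported in the Results section are simple arithmetic over these Likert responses. A minimal Python sketch, with invented response data for illustration only:

```python
# Hedged sketch: per-question mean scores from 4-point Likert responses
# (0 = not at all, 3 = very much). The sample data is invented, not the
# study's actual responses.

def question_means(responses):
    """responses: list of dicts mapping question id -> score (0-3)."""
    totals, counts = {}, {}
    for r in responses:
        for q, score in r.items():
            totals[q] = totals.get(q, 0) + score
            counts[q] = counts.get(q, 0) + 1
    # Round to two decimals, as in the tables.
    return {q: round(totals[q] / counts[q], 2) for q in totals}

sample = [{"Q5": 3, "Q6": 3}, {"Q5": 2, "Q6": 3}, {"Q5": 3, "Q6": 2}]
print(question_means(sample))  # → {'Q5': 2.67, 'Q6': 2.67}
```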

6. Results

6.1. Overall Results

This section offers an account of the responses to the different ad-hoc questionnaires, with tables detailing the questions and the mean scores of the responses. The distributions of responses to each question are further provided in boxplot figures generated with the R statistical software. Questions with the highest and lowest means are discussed in more detail.
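The boxplots were drawn in R; each one encodes a five-number summary of the responses to a question, which can be computed as follows (the scores below are invented for illustration, not the study’s data):

```python
# Five-number summary underlying a boxplot: min, Q1, median, Q3, max.
# Illustrative Likert scores (0-3), not the study's actual responses.

import statistics

def five_number_summary(scores):
    # statistics.quantiles with n=4 returns the three quartile cut points
    # (default "exclusive" method).
    q1, med, q3 = statistics.quantiles(scores, n=4)
    return {"min": min(scores), "q1": q1, "median": med,
            "q3": q3, "max": max(scores)}

scores = [1, 2, 2, 3, 3, 3, 3, 2, 3, 3]
summary = five_number_summary(scores)
print(summary["median"])  # → 3.0
```

A distribution “placed exclusively on the top end of the scale”, as described below for several questions, corresponds to a summary whose quartiles all sit at 2 or 3.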
Table 2 shows the questions included in the questionnaire for the collaborative reading task together with the mean score of the responses.
Means broadly suggest a good acceptance of the activity and the use of the tool. As for the highest scores, as shown in Table 2, Question 6, which concerns the appropriateness of Socrative for the type of task, obtains the highest mean score, and, as observed in the boxplot in Figure 1, the distribution of the responses sits exclusively at the top end of the scale. Similarly, Questions 5 and 7, which obtain very high means, show similarly homogeneous high distributions except for some outliers; they concern the perceived enjoyment of the task compared to a traditional one and the appropriateness of the tool for working in small groups.
By contrast, Question 4 receives the lowest average score and also shows greater variability in its distribution. This question concerned the duration of the task, since a time limit had been established for reading each text, which was a constraint for students who needed more time to process the information.
The remaining questions (1, 2, 3 and 8), which involved reading strategies, show similar distributions and also obtained high mean scores. In increasing order, these were Question 1 (2.17), on the usefulness of the task for differentiating reading strategies; Question 8 (2.21), on metacognitive awareness of reading strategies; Question 3 (2.37), on awareness of reading task completion; and Question 2 (2.63), on the usefulness of the activity as reading practice.
Regarding the second session, Table 3 shows the questions included in the questionnaire for the lecture together with the mean scores of the responses:
As can be gathered from Table 3, means are remarkably high for this session, and, as shown in Figure 2, most of the observations are again placed at the top end of the scale. The highest mean scores are found in Questions 5 and 8. Question 5, which asks respondents about the perceived enjoyment of the task, is further commented on in Section 6.2. Question 8 concerns the desirable frequency of using Socrative in lectures, and its homogeneous distribution in Figure 2 highlights students’ agreement in favor of a more regular use of the tool.
The lowest mean score corresponds to Question 2, on the use of Socrative as an asset for keeping attention. Although this was one of the features we aimed to tackle during the session, it was not felt to be among the most salient advantages of the app. Similar distributions are found in Questions 3, on ranking content relevance (further discussed in Section 6.2), and 4, on the recall gains from using Socrative.
The remaining Questions (1, 6 and 7) obtain high mean scores: Question 1 (2.50), on the usefulness of checking one's own understanding, shows a slightly different distribution with a lower median, while Questions 6 (2.62) and 7 (2.70), on the appropriateness of Socrative for the type of task and for encouraging dynamic participation, show similarly high distributions.
Table 4 shows the questions included in the questionnaire for the cooperative review game together with the mean score of the responses.
Responses to the third session yielded the highest overall means of the three activities. As shown in Figure 3, the distribution of the responses is concentrated even more clearly on the top end of the scale. The highest mean scores correspond to Question 6, on the appropriateness of Socrative for the task; Question 7, on the appropriateness of the tool for working in small groups; and Question 5, on the perceived enjoyment of using Socrative compared to more traditional review lessons. These three Questions, further discussed in Section 6.2, show similarly uniform distributions, as reflected in Figure 3. In contrast, Question 4, on the usefulness of time limits, received the lowest mean score, as was the case with the equivalent Question in the collaborative reading task. Its distribution is similar to that of Question 3, on the usefulness of the tool for ranking contents according to relevance, further discussed in Section 6.2.
The remaining Questions (1, 2 and 8) concerned the purposes of the session and also obtained high mean scores: Question 1 (2.43), on the usefulness of checking one's own level, shows a lower median than Questions 2 (2.61), on the usefulness of the tool for reviewing contents, and 8 (2.68), on the cooperative learning methodology, which show similar distributions.
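The per-question summaries above (a mean score plus the boxplot distribution) can be reproduced with standard descriptive statistics. The following is a minimal sketch using hypothetical responses, not the authors' dataset; the 0–3 agreement scale and the variable names are illustrative assumptions.

```python
# Illustrative sketch (hypothetical data, not the study's responses):
# summarizing one questionnaire item with a mean and the five-number
# summary that underlies a boxplot, as in Tables 2-4 and Figures 1-3.
from statistics import mean, quantiles

def summarize(responses):
    """Return the mean and the five-number summary used for a boxplot."""
    q1, q2, q3 = quantiles(responses, n=4)  # quartile cut points
    return {
        "mean": round(mean(responses), 2),
        "min": min(responses),
        "q1": q1,
        "median": q2,
        "q3": q3,
        "max": max(responses),
    }

# Hypothetical responses to one question on an assumed 0-3 agreement scale
question_6 = [3, 3, 2, 3, 3, 2, 3, 3, 3, 2]
summary = summarize(question_6)
```

A distribution concentrated "on the top end of the scale", as reported for Question 6, would show up here as quartiles and median all at or near the scale maximum.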

6.2. Comparisons between Socrative Uses

After each session where Socrative was used, participants were asked whether the clicker had made the task enjoyable. As shown for Question 5 in Table 2, Table 3 and Table 4, the software was similarly appreciated across all three activities. The game-based session obtained the highest average score (2.84), followed by the lecture (2.83) and the collaborative reading task (2.69), though a one-way ANOVA revealed no statistically significant differences between group means (F(2,139) = 1.37, p = 0.26), and the distributions shown in Figure 4 are alike.
The other question common to all three questionnaires focused on the appropriateness of Socrative for each of the tasks. As shown for Question 6 in Table 2, Table 3 and Table 4, the highest mean score corresponds to the cooperative review game (2.89), though the use of Socrative was also deemed appropriate for the collaborative reading task (2.71) and the lecture (2.62). There were statistically significant differences between group means, as determined by a one-way ANOVA (F(2,138) = 3.46, p = 0.03). The differences concerned the game-based activity: pairwise t-tests showed significant differences between this activity and both the reading session (p = 0.03) and the lecture (p = 0.004), while there was no significant difference between the two non-gamified sessions (p = 0.18), which show similar distributions in Figure 5.
As explained above, two of the sessions were used to review contents: in the lecture, students were asked about the topics just explained during the class, while in the cooperative review game the revision covered all the contents seen during the semester. In both cases the questionnaires included a Question (3) regarding the usefulness of the activity for noticing which contents were most relevant. The two activities show very similar means (2.33 for the lecture and 2.34 for the cooperative review game) and similar distributions, as observed in Figure 6, with no statistically significant differences between group means according to a one-way ANOVA (F(1,88) = 0.01, p = 0.91). There is almost no variation in the extent to which students perceived the task to be a useful means of discriminating the most relevant contents in the subject.
Finally, after the sessions where participants had worked in small collaborative groups, they were asked about the appropriateness of Socrative for working in such groups. As evidenced by Question 7 in Table 2 and Table 4, the mean is higher for the game-based activity (2.86) than for the non-game-based one (2.65), but a one-way ANOVA revealed no statistically significant difference (F(1,94) = 3.19, p = 0.08), and, in fact, the distributions are similar, as Figure 7 shows.
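The one-way ANOVA used throughout these comparisons reduces to computing the ratio of between-group to within-group mean squares. The sketch below, with hypothetical data rather than the study's responses, shows only the F statistic itself; in practice the p-values reported above would come from an F distribution (e.g. via `scipy.stats.f_oneway`).

```python
# Illustrative sketch (hypothetical data): the F statistic behind the
# one-way ANOVA comparisons in Section 6.2.
from statistics import mean

def f_statistic(*groups):
    """One-way ANOVA F = between-group mean square / within-group mean square."""
    k = len(groups)                      # number of groups (sessions)
    n = sum(len(g) for g in groups)      # total number of observations
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical Question 5 responses for the three sessions
reading = [3, 2, 3, 3, 2]
lecture = [3, 3, 2, 3, 3]
game = [3, 3, 3, 3, 3]
F = f_statistic(reading, lecture, game)
```

With F(2,139) = 1.37 as reported for Question 5, the critical value of the F distribution is not exceeded, hence the non-significant p = 0.26.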

7. Discussion

The three experiences described in this paper confirm that Socrative can be used in very varied classroom situations, as noted by other scholars, including collaborative or cooperative learning environments [5,6,15], lectures [20,22,40] and gamified settings [41,42,43]. The mere introduction of BYOD practice undoubtedly attracts students' curiosity and attention [48,51], but the tool's immediate, time-saving nature is probably its biggest asset. Our experience showed that using Socrative does not necessarily increase lesson preparation time. Even when it does, we consider it worth the investment, because the clicker makes lessons more efficient, as we saw with the "Quick Question" option. Socrative guarantees immediate and multidirectional feedback, which helps lecturers check content acquisition and helps students achieve objectives and clear up doubts.
All three activities were well received by students. Regardless of their content, Socrative was perceived as appropriate and as a means of making all three sessions more enjoyable than they would have been without the application. It was also considered appropriate for working in small groups, as it promotes active learning strategies. The use of Socrative in both gamified and non-gamified activities seems to be perceived as equally relevant for reviewing contents. Gamification elements do not seem to affect content prioritization; rather, they impact other aspects such as motivation.
There were, however, differences in favor of the game-based session, which also involved the longest use of the tool and where Socrative obtained the highest mean scores. Comparatively, Socrative was perceived as more appropriate for the task, more appropriate for working in small groups, and as making the task more enjoyable in the gamified session than in the non-gamified ones.
Due to their collaborative and cooperative nature, these three experiences were only used as formative assessment [19], helping each student become aware of their own progress. Nevertheless, clickers may well function as a tool for summative assessment. Quiz results, for instance, could count toward students’ final mark.
Previous research on clickers has noted an increase in peer discussion and metacognitive awareness in L2 contexts [35], which our students' responses seem to confirm. Indeed, students were able to check their own language level and to develop their ability to understand and recall contents while ranking them by relevance. Peer discussion promoted metalinguistic abilities, as students had to reflect on different reading strategies and on the reasons for selecting the most suitable one for each kind of reading question.
As already noted [55], teachers should be careful when introducing time limits and countdowns into activities. Questions dealing with time limits received the lowest mean scores in the two sessions where they were used. Time pressure might affect students' performance during the activity and, consequently, reduce their perceived satisfaction and learning outcomes.
A plethora of experiences using clickers has been published in recent years. Yet the literature still lacks data on which gamification strategies have an impact on students' engagement, and to what extent. As with any other resource used to reinforce content areas that present difficulties for students, planning must be tailored to the needs of the classroom, and practitioners from related disciplines could design and implement similar activities. As regards the present study, and in order to further confirm that gamification was the key to the application's greater acceptance among participants, a future study could gamify the non-gamified activities analyzed herein, i.e., the collaborative reading task and the lecture.

8. Conclusions

Drawing from the three experiences here described, we fully agree that clickers increase students’ motivation, interactivity [2], participation, satisfaction [12,14], interaction [13], performance and engagement [4,5,6]. They are undoubtedly helpful for feedback and formative assessment [14,19] and encourage collaborative learning [16]. Therefore, the range of uses of clickers, and Socrative in particular, should not be limited to lectures, as other session types and activities can benefit from the incorporation of this tool.
However, we differ with those who define clickers as gamification tools [42,43] or consider their mere use as a means of gamification [29,40,41]. Admittedly, Socrative and other clickers provide gamification elements such as points or leaderboards, but deliberately including them in the activity design is a basic requirement in order for an activity to qualify as gamified.
Although our results show that clickers increase students' engagement both with and without gamification elements, there is a significant increase in the extent to which the use of Socrative is perceived as enjoyable and appropriate when sessions include points and leaderboards. This confirms what most studies analyzing the use of clickers in Higher Education have noted: the inclusion of gamified elements can further increase engagement. We thus maintain that clickers do not, by themselves, equal gamification.

Author Contributions

Conceptualization, F.F.C. and A.M.-M.H.; methodology, F.F.C. and A.M.-M.H.; formal analysis, F.F.C. and A.M.-M.H.; investigation, F.F.C.; resources, A.M.-M.H.; data curation, F.F.C. and A.M.-M.H.; writing—original draft preparation, F.F.C. and A.M.-M.H.; writing—review and editing, A.M.-M.H.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Chou, P.N.; Chang, C.C.; Lin, C.H. BYOD or not: A comparison of two assessment strategies for student learning. Comput. Hum. Behav. 2017, 74, 63–71.
2. Trindade, J. Promoção da interatividade na sala de aula com Socrative: estudo de caso. Indagatio Didact. 2014, 6, 1.
3. Rashid, T.; Asghar, H.M. Technology use, self-directed learning, student engagement and academic performance: Examining the interrelations. Comput. Hum. Behav. 2016, 63, 604–612.
4. Blasco-Arcas, L.; Buil, I.; Hernández-Ortega, B.; Sesel, F.J. Using clickers in class. The role of interactivity, active collaborative learning and engagement in learning performance. Comput. Educ. 2013, 62, 102–110.
5. Dakka, S.M. Using Socrative to enhance in-class student engagement and collaboration. Int. J. Integr. Technol. Educ. 2015, 4, 13–19.
6. McDonough, K.; Foote, J.A. The impact of individual and shared clicker use on students’ collaborative learning. Comput. Educ. 2015, 86, 236–249.
7. Shute, V.J. Focus on formative feedback. Rev. Educ. Res. 2008, 78, 153–189.
8. Paschal, C.B. Formative assessment in physiology teaching using a wireless classroom communication system. Adv. Physiol. Educ. 2002, 26, 299–308.
9. Kuriakose, R.B.; Luwes, N. Student perceptions to the use of paperless technology in assessments–a case study using clickers. Proc. Soc. Behav. Sci. 2016, 228, 78–85.
10. Stringer, M.; Stringer, K.; Hunter, J.A.; Finlay, C. Assuring quality through student evaluation. In Handbook of Quality Assurance for University Teaching; Roger, E., Elaine, H., Eds.; Routledge: New York, NY, USA, 2019.
11. Stowell, J.R. Use of clickers vs. mobile devices for classroom polling. Comput. Educ. 2015, 82, 329–334.
12. Keough, S.M. Clickers in the classroom: A Review and a Replication. J. Manag. Educ. 2012, 36, 822–847.
13. Boscardin, C.; Penuel, W. Exploring benefits of audience-response systems on learning: A review of the literature. Acad. Psychiatry 2012, 36, 401–407.
14. Rana, N.P.; Dwivedi, Y.K.; Al-Khowaiter, W.A. A review of literature on the use of clickers in the business and management discipline. Int. J. Manag. Educ. 2016, 14, 74–91.
15. Awedh, M.; Mueen, A.; Zafar, B.; Manzoor, U. Using Socrative and smartphones for the support of collaborative learning. Int. J. Integr. Technol. Educ. 2014, 3, 17–24.
16. Chan, S.C.; Wan, J.C.; Ko, S. Interactivity, active collaborative learning, and learning performance: The moderating role of perceived fun by using personal response systems. Int. J. Manag. Educ. 2019, 17, 94–102.
17. Heflin, H.; Shewmaker, J.; Nguyen, J. Impact of mobile technology on student attitudes, engagement, and learning. Comput. Educ. 2017, 107, 91–99.
18. Kulikovskikh, I.M.; Prokhorov, S.A.; Suchkova, S.A. Promoting collaborative learning through regulation of guessing in clickers. Comput. Hum. Behav. 2017, 75, 81–91.
19. Black, P.; Wiliam, D. Assessment and classroom learning. Assess. Educ. Princ. Policy Pract. 1998, 5, 7–74.
20. Ludvigsen, K.; Krumsvik, R.; Furnes, B. Creating formative feedback spaces in large lectures. Comput. Educ. 2015, 88, 48–63.
21. Hatziapostolou, T.; Paraskakis, I. Enhancing the impact of formative feedback on student learning through an online feedback system. Electron. J. e-Learn. 2010, 8, 111–122.
22. Egelandsdal, K.; Krumsvik, R.J. Clickers and formative feedback at university lectures. Educ. Inf. Technol. 2017, 22, 55–74.
23. Dunnett, A.J.; Shannahan, K.L.; Shannahan, R.J.; Treholm, B. Exploring the impact of clicker technology in a small classroom setting on student class attendance and course performance. J. Acad. Bus. Educ. 2011, 12, 43–56.
24. Barwell, G.; Walker, R. Peer assessment of oral presentations using clickers: the student experience. In Proceedings of the 32nd HERDSA Annual Conference: The Student Experience, Hammondville, Australia, 6–9 July 2009; Wozniak, H., Bartoluzzi, S., Eds.; Milperra: Higher Education Research and Development Society of Australasia: Hammondville, Australia, 2009; pp. 23–32.
25. Nájera, A.; Villalba, J.M.; Arribas, E.; Gallego, S.; Beléndez, A.; Francés, J.; de Pablo, F. Evaluación entre pares (peer evaluation) usando clickers. In Más Experiencias de Innovación Docente en la Enseñanza de la Física Universitaria; Nájera, A., Arribas, E., Eds.; Lulu Enterprises: Albacete, Spain, 2011; pp. 73–84.
26. Mehring, J. Present research on the flipped classroom and potential tools for the EFL classroom. Comput. Sch. 2016, 33, 1–10.
27. Nouri, J. The flipped classroom: for active, effective and increased learning–especially for low achievers. Int. J. Educ. Technol. High. Educ. 2016, 13, 33.
28. Heinerichs, S.; Pazzaglia, G.; Gilboy, M.B. Using flipped classroom components in blended courses to maximize student learning. Athl. Train. Educ. J. 2016, 11, 54–57.
29. Hung, H.T. Clickers in the flipped classroom: bring your own device (BYOD) to promote student learning. Interact. Learn. Environ. 2017, 25, 983–995.
30. Kaya, A.; Balta, N. Taking advantages of technologies: Using the Socrative in English language teaching classes. Int. J. Soc. Sci. Educ. Stud. 2016, 2, 4–12.
31. El Shaban, A. The use of Socrative in ESL classrooms: Towards active learning. Teach. Engl. Technol. 2017, 17, 64–77.
32. Mork, C.M. Benefits of using online student response systems in Japanese EFL classrooms. JALT Call J. 2014, 10, 127–137.
33. Ohashi, L. Enhancing EFL writing courses with the online student response system Socrative. Kokusaikeiei Bunkakenkyu 2015, 19, 135–145.
34. Sprague, A. Improving the ESL graduate writing classroom using Socrative: (Re) considering exit tickets. TESOL J. 2016, 7, 989–998.
35. Zhonggen, Y. The influence of clickers use on metacognition and learning outcomes in college English classroom. In Exploring the New Era of Technology-Infused Education; Tomei, L., Ed.; IGI Global: Hershey, PA, USA, 2017; pp. 158–171.
36. Paz-Albo, J. The impact of using smartphones as student response systems on prospective teacher education training: a case study. El Guiniguada. Rev. Investig. Exp. Cienc. Educ. 2014, 23, 125–133.
37. Benítez-Porres, J. Socrative como herramienta para la integración de contenidos en la asignatura “Didáctica de los Deportes”. In XII Jornadas Internacionales de Innovación Universitaria Educar para Transformar: Aprendizaje Experiencial; Ruiz Rosillo, M.A., Ed.; Universidad Europea: Madrid, Spain, 2015; pp. 824–831.
38. Aslan, B.; Seker, H. Interactive response systems (IRS) SOCRATIVE application sample. J. Educ. Learn. 2017, 6, 167–174.
39. Bicen, H.; Kocakoyun, S. Determination of university students’ most preferred mobile application for gamification. World J. Educ. Technol. Curr. Issues 2017, 9, 18–23.
40. Pettit, R.K.; McCoy, L.; Kinney, M.; Schwartz, F.N. Student perceptions of gamified audience response system interactions in large group lectures and via lecture capture technology. BMC Med. Educ. 2015, 15, 92.
41. Solmaz, E.; Çetin, E. Ask-response-play-learn: Students’ views on gamification based interactive response systems. WJEIS 2017, 7, 28–40.
42. Bullón, J.J.; Hernández Encinas, A.; Santos Sánchez, M.J.; Gayoso Martínez, V. Analysis of student feedback when using gamification tools in Math subjects. In Proceedings of the 2018 IEEE Global Engineering Education Conference (EDUCON), Tenerife, Spain, 17–20 April 2018; pp. 1818–1823.
43. De Soto García, I.S. Herramientas de gamificación para el aprendizaje de ciencias de la tierra. Edutec Rev. Electrónica Tecnol. Educ. 2018, 65, 29–39.
44. Kapp, K.M. The Gamification of Learning and Instruction: Case-Based Methods and Strategies for Training and Education; Pfeiffer: An Imprint of John Wiley & Sons: New York, NY, USA, 2012.
45. Lee, J.J.; Hammer, J. Gamification in education: What, How, Why Bother? Acad. Exch. Q. 2011, 15, 1–5.
46. Serrano Lara, J.J.; Fajardo Magraner, F. The ICT and gamification: tools for improving motivation and learning at universities. In Proceedings of the 3rd International Conference on Higher Education Advances, Valencia, Spain, 21–23 June 2017; Editorial Universitat Politècnica de València: Valencia, Spain, 2017; pp. 540–548.
47. Subhash, S.; Cudney, E.A. Gamified learning in higher education: A systematic review of the literature. Comput. Hum. Behav. 2018, 87, 192–206.
48. Wang, A.I. The wear out effect of a game-based student response system. Comput. Educ. 2015, 82, 217–227.
49. Schwabe, G.; Göth, C. Mobile learning with a mobile game: Design and motivational effects. J. Comput. Assist. Learn. 2005, 21, 204–216.
50. Kittl, C.; Edegger, F.; Petrovic, O. Learning by pervasive gaming: An empirical study. In Innovative Mobile Learning: Techniques and Technologies; IGI Global: Hershey, PA, USA, 2009; pp. 60–82.
51. Bartel, A.; Hagel, G. Engaging students with a mobile game-based learning system in university education. In Proceedings of the 2014 IEEE Global Engineering Education Conference (EDUCON), Istanbul, Turkey, 3–5 April 2014; pp. 957–960.
52. Hakulinen, L.; Auvinen, T.; Korhonen, A. Empirical study on the effect of achievement badges in TRAKLA2 online learning environment. In Proceedings of the 2013 Learning and Teaching in Computing and Engineering, Macau, China, 21–24 March 2013; pp. 47–54.
53. McGrath, N.; Bayerlein, L. Engaging online students through the gamification of learning materials: The present and the future. In ASCILITE-Australian Society for Computers in Learning in Tertiary Education Annual Conference; Australasian Society for Computers in Learning in Tertiary Education: Tugun, Australia, 2013; pp. 573–577.
54. Pirker, J.; Gutl, C.; Astatke, Y. Enhancing online and mobile experimentations using gamification strategies. In Proceedings of the 2015 3rd Experiment International Conference (exp. at’15), Ponta Delgada, Portugal, 2–4 June 2015; pp. 224–229.
55. Dicheva, D.; Dichev, C.; Agre, G.; Angelova, G. Gamification in education: A systematic mapping study. Educ. Technol. Soc. 2015, 18, 75–88.
56. Ferrándiz, E.; Puentes, C.; Moreno, P.J.; Flores, E. Engaging and assessing students through their electronic devices and real time quizzes. Multidiscip. J. Educ. Soc. Technol. Sci. 2016, 3, 173–184.
57. Socrative. Available online: www.socrative.com (accessed on 10 May 2019).
58. Council of Europe, Council for Cultural Co-operation. Education Committee, Modern Languages Division. Common European Framework of Reference for Languages: Learning, Teaching, Assessment; Cambridge University Press: Cambridge, UK, 2001.
59. Sutherland-Smith, W. Weaving the literacy Web: Changes in reading from page to screen. Read. Teach. 2002, 55, 662–669.
60. Lee, J.F.; Musumeci, D. On hierarchies of reading skills and text types. Mod. Lang. J. 1988, 72, 173–187.
61. Berardo, S.A. The use of authentic materials in the teaching of reading. Read. Matrix 2006, 6, 60–69.
62. Storch, N.; Wigglesworth, G. Writing tasks: The effects of collaboration. In Second Language Acquisition: Investigating Tasks in Formal Language Learning; Multilingual Matters: Clevedon, UK, 2006; pp. 157–177.
63. Martín-Macho Harrison, A.; Faya Cerqueiro, F. El juego en el aula de lengua inglesa para consolidar contenidos: experiencia con futuros docentes de educación infantil. In Aprendizajes Plurilingües y Literarios: Nuevos Enfoques Didácticos; Díez Mediavilla, A.E., Brotons Rico, V., Escandell, D., Rovira Collado, J., Eds.; Universitat d´Alacant, Servicio de Publicaciones: Alicante, Spain, 2016; pp. 873–878.
64. Ahlberg, J.; Ahlberg, A. The Jolly Christmas Postman; Little, Brown & Co.: Boston, MA, USA, 1991.
Figure 1. Collaborative reading: Distribution of responses.
Figure 2. Lecture: Distribution of responses.
Figure 3. Cooperative review game: Distribution of responses.
Figure 4. Responses to the question “Socrative made the task more enjoyable than (i) traditional reading tasks (ii) a lesson consisting only of theoretical explanations (iii) traditional review tasks/lessons”.
Figure 5. Responses to the question “Socrative seemed appropriate for this type of task: (i) brief readings, battery of short questions, multiple choice, etc. (ii) quick answer; (iii) battery of short questions, multiple choice, etc.”.
Figure 6. Responses to the question “The activity helped me to see which contents are/have been the most relevant”.
Figure 7. Responses to the question “Socrative seemed appropriate to work in small groups”.
Table 1. Main features of the three Socrative sessions.
| Description | Collaborative Reading Task | Lecture | Cooperative Review Game |
| --- | --- | --- | --- |
| Gamification strategies | No | No | Yes |
| Type of session | Seminar | Plenary | Seminar |
| Respondents | 52 | 46 | 44 |
| Main methodology | Collaborative learning | Theoretical explanations | Cooperative learning |
| Group distribution | Small groups (3 members) | Pairs or small groups (3 members) | Small groups (4–5 members) |
| Teaching objectives | To promote reading skills through practice; to raise awareness on reading strategies | To check understanding after explanations | To review contents at the end of a unit/semester |
| Socrative option | Teacher-paced quiz | Quick Question | Teacher-paced quiz |
Table 2. Collaborative reading: questions and mean scores.
| Questions | Mean Score |
| --- | --- |
| 1. This task was useful to understand that each type of question implies a different type of reading. | 2.17 |
| 2. This task was a useful training to find answers in a text more quickly. | 2.63 |
| 3. This task helped me to realize that I can answer questions about a text without understanding everything it says. | 2.37 |
| 4. The time limit for each question helped me to find the information I was looking for more quickly. | 1.85 |
| 5. Socrative made the task more enjoyable than traditional reading tasks. | 2.69 |
| 6. The use of Socrative seemed appropriate for this type of task (brief readings, battery of short questions, multiple choice, etc.). | 2.71 |
| 7. The use of Socrative seemed appropriate to work in small groups. | 2.65 |
| 8. Working in a group during this reading task enabled me to notice the strategies used by my classmates. | 2.21 |
Table 3. Lecture: Questions and mean scores.
| Questions | Mean Score |
| --- | --- |
| 1. This task helped me to check what I had understood after each explanation. | 2.50 |
| 2. This task helped me to keep attention during the lesson. | 2.17 |
| 3. This task helped me to see which contents are the most relevant in the syllabus. | 2.33 |
| 4. This task helped me to better remember the aspects we have been asked about. | 2.43 |
| 5. Socrative made the task more enjoyable than a lesson consisting only of theoretical explanations. | 2.83 |
| 6. The use of Socrative seemed appropriate for this type of task (quick answer). | 2.62 |
| 7. The use of Socrative seemed appropriate to make all students’ participation more dynamic. | 2.70 |
| 8. I would like to use this type of tool more often in theoretical lessons. | 2.83 |
Table 4. Cooperative review game: Questions and mean scores.
| Questions | Mean Score |
| --- | --- |
| 1. The task helped me to be aware of my own level. | 2.43 |
| 2. This task helped me to consolidate the contents (grammar, vocabulary, pronunciation) seen during the semester. | 2.61 |
| 3. The activity helped me to see which contents are the most relevant in the syllabus. | 2.34 |
| 4. The time limit for each question helped me to find the information I was looking for more quickly. | 2.25 |
| 5. Socrative made the task more enjoyable than traditional review tasks/lessons. | 2.84 |
| 6. The use of Socrative seemed appropriate for this type of task (battery of short questions, multiple choice, etc.). | 2.89 |
| 7. The use of Socrative seemed appropriate to work in small groups. | 2.86 |
| 8. When working in a group during this review activity, it was useful to agree the answers with my classmates. | 2.68 |

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).