
Interactive Learning with Student Response System to Encourage Students to Provide Peer Feedback

School of Industrial Education and Technology, King Mongkut’s Institute of Technology Ladkrabang, Bangkok 10520, Thailand
Department of Computer Science, University of Warwick, Coventry CV4 7AL, UK
Department of Accounting and Auditing, University of Santiago of Chile, Av. L. B. O’Higgins 3363, Santiago 9160000, Chile
Author to whom correspondence should be addressed.
Educ. Sci. 2023, 13(3), 310;
Submission received: 5 February 2023 / Revised: 9 March 2023 / Accepted: 12 March 2023 / Published: 15 March 2023
(This article belongs to the Section Higher Education)


This study analyzed anonymous peer feedback in two groups of university students: a lower-performing class and a higher-performing class. Students used an audience response system to comment anonymously on each other’s work. Each peer feedback comment was categorized into one of seven types: Praise+, Praise−, Criticism+, Criticism−, Combined Praise and Criticism, Opinion, and Irrelevant, with the plus (+) and minus (−) signs denoting the quality of the feedback. The learning performance of the two groups was also analyzed. The main result was that the lower-performing class (based on average midterm scores) provided more substantial Criticism+ and Opinion-type comments than the higher-performing class. Contrary to expectation, no significant difference was found between the two classes on the final exam, suggesting that anonymity allowed lower-performing students to express themselves more effectively than higher-performing students, leading them to improve their learning outcomes.

1. Introduction

1.1. Peer Review Approach

Student-centered learning encourages interaction in the classroom, whereas teacher-centered delivery may discourage students from expressing their opinions and engaging in other class activities [1,2]. Carless and Boud [3] noted that “it is unrealistic and ineffective to expect teachers to provide more and more comments to large numbers of learners”. Giving feedback is likely to be more beneficial to students than simply receiving it, since they must engage in critical thinking to point out faults and suggest improvements to their peers’ work [4]. Therefore, the main activity of this research was encouraging students to provide peer feedback.
Peer review is a learning approach that encourages students to reflect on their understanding and enhance their critical thinking by evaluating each other against marking criteria set by the teacher or the students themselves [5,6,7]. To provide feedback to their classmates, students must think (active learning) rather than merely receive feedback (passive learning). Moreover, students gain a better understanding of the marking criteria and learn how to judge the quality of work [8]. This approach also increases the transparency of assessment. However, if students are required to give immediate feedback in front of the class, they may feel uncomfortable because their identities are revealed [9]. Technology can help during this process by offering the option of anonymous feedback.

1.2. Peer Feedback Type

A student participates in the peer review process in two ways: as an assessor who provides feedback on peers’ work or as an assessee who receives feedback. The peer feedback types used here were a modification of the feedback coding scheme developed by Chien, Hwang, and Jong [5] and Stracke and Kumar [10], and include Praise, Criticism, Opinion, and Irrelevant. When grouping the peer feedback in this study, it was found that some feedback could be classified as both Praise and Criticism, so a further type was added, as shown in Table 1. Some feedback is simple, without a detailed explanation; therefore, “+” and “−” are used to classify the quality of feedback, where “+” denotes feedback with a detailed explanation and “−” denotes simple feedback.
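Once each comment has been assigned one of the seven labels from Table 1, the per-category distributions reported later in the paper are simple tallies. The following minimal sketch, with purely hypothetical codings, illustrates how coded comments could be turned into percentages:

```python
from collections import Counter

# The seven feedback categories from Table 1.
CATEGORIES = [
    "Praise+", "Praise-", "Criticism+", "Criticism-",
    "Combined Praise and Criticism", "Opinion", "Irrelevant",
]

def feedback_percentages(coded_comments):
    """Tally coded comments into a percentage per category.

    `coded_comments` is a list of category labels, one per comment,
    as assigned by the human coders.
    """
    counts = Counter(coded_comments)
    total = len(coded_comments)
    return {c: 100 * counts.get(c, 0) / total for c in CATEGORIES}

# Hypothetical coding of five comments (illustration only).
sample = ["Criticism+", "Praise+", "Praise-", "Criticism+", "Opinion"]
pct = feedback_percentages(sample)
print(pct["Criticism+"])  # → 40.0
```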
According to Chen et al. [11], there are two levels of feedback, namely surface level (simple compliments or simple criticism) and critical level (accurate and practical suggestions). Pointing out strengths and weaknesses and giving suggestions for improvement requires higher-order thinking or critical thinking [3]. Tai et al. [12] also reported that to provide effective feedback, students need to develop critical thinking skills by judging the quality of their own work and other people’s work.

1.3. Student Response Systems Support Interactive Learning

Using technology integrated into learning can enhance students’ learning experiences and increase student participation and interaction in the classroom [13]. A student response system (SRS), also known as a classroom response system, is a system that allows students to respond to teacher questions [14,15]. An SRS is also a tool for collecting students’ instant responses and can be used in collaborative learning, as students can share knowledge and exchange ideas with each other [16,17]. Therefore, an SRS can be used to change students from passive learners into active learners, with an anonymous environment allowing shy students to be more comfortable answering questions and expressing their opinions [18].
SRSs include Poll Everywhere, Piazza, Socrative, and GoSoapbox. Poll Everywhere was chosen for this study because it offers a free plan, enables students to provide immediate anonymous feedback in a face-to-face classroom, and allows students to express their ideas from their phones and laptops, with responses exportable as an Excel spreadsheet.

2. Research Questions

The main purpose of this study was to encourage students to improve their critical thinking skills by providing peer feedback in less pressurized (anonymous) conditions through the SRS. The following four research questions shaped this investigation.
  • How do the Architecture Education Student (AES) and Engineering Education Student (EES) groups compare as regards the type of feedback students provide to their peers?
  • Is there any relationship between the percentage of Opinion-type feedback (critical thinking) and the subsequent assignment scores when compared between the AES and EES groups?
  • How do the AES and EES groups compare based on student opinions as to what the good points of being an anonymous assessor are?
  • How do the AES and EES groups compare based on the impact of peer feedback activities on the students’ performances as measured by final exam scores?

3. Method

3.1. Context of the Study and Participants

Undergraduate students at the Faculty of Industrial Education and Technology at the first author’s university must complete a course called “Innovation and Information Technology in Education”. The course objectives are to (1) explore ways that innovation and information technology can improve learning quality; (2) analyze the challenges that innovation and information technology pose in education; and (3) evaluate innovation and information technology. This fifteen-week course includes two hours of lectures and two hours of computer labs per week. As a result, it is expected that students will obtain the necessary knowledge and skills through the use of educational technology.
Two groups totaling 62 second-year undergraduate students, aged 19 to 21 and enrolled in the “Innovation and Information Technology in Education” course, participated in this study. None of them had any prior experience with the Poll Everywhere tool. The first group of 36 Engineering Education students consisted of 16 males and 20 females. The second group of 26 Architecture Education students consisted of 7 males and 19 females. These two groups were taught by the same lecturer. Based on the Engineering Education curriculum, students had some background in technology before attending this module, such as principles of computer programming and telecommunications technology. The Architecture Education curriculum, however, did not include any technology-related modules in the first year.
Based on the average midterm scores, the Engineering Education students (mean = 14.32 out of 20; SD = 1.92) were assessed to be higher-ability students than the Architecture Education students (mean = 13.09 out of 20; SD = 2.01), with an effect size of d = 0.63, 95% CI [0.11, 1.14]. The first author’s university provided funding for this research. Students’ rights were protected, and they were informed that their opinions would not be linked to their identities. Participants were informed that they could refuse to participate or withdraw from the study at any time. The study was approved by the Research Ethics Committee of the first author’s university.
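The reported effect size can be reproduced from the summary statistics given above. The sketch below computes Cohen’s d using a pooled standard deviation and a normal-approximation 95% confidence interval (the standard large-sample formula; the paper does not state which interval method it used):

```python
import math

def cohens_d_with_ci(m1, s1, n1, m2, s2, n2):
    """Cohen's d from summary statistics, with an approximate 95% CI."""
    # Pooled standard deviation across the two groups.
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    # Large-sample standard error of d (normal approximation).
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, (d - 1.96 * se, d + 1.96 * se)

# Summary statistics reported in the text: EES vs. AES midterm scores.
d, (lo, hi) = cohens_d_with_ci(14.32, 1.92, 36, 13.09, 2.01, 26)
print(round(d, 2), round(lo, 2), round(hi, 2))  # → 0.63 0.11 1.14
```

This reproduces the values reported in the text (d = 0.63, 95% CI [0.11, 1.14]).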

3.2. Experimental Procedure

Students in each class were asked to form small groups of 3–4 participants. Each group was assigned the same assignments (design brochures, create surveys, and make video clips) and delivered a 15 min presentation. The lecturer in charge of the class chose which groups would present for each assignment. Due to time restrictions, only 3 groups presented per assignment, although all groups had the opportunity to present over the course of the three assignments. The order of the groups’ presentations was randomized. Marking criteria were discussed in the classroom. Students were asked to assess their peers’ work (3 assignments chosen randomly by the lecturer) by providing anonymous comments on their peers’ presentations using the application. The expectation was that this tool would encourage immediate feedback more effectively, with less class pressure and anxiety than verbal feedback in a face-to-face classroom. Students gave individual feedback for the first 2 assignments, but discussed with their group partners before giving feedback for assignment 3 (see Figure 1). In the first assignment, students practiced providing feedback without being rewarded with scores. In assignments 2 and 3, peer assessors gained scores for the feedback-giver role. The whole class could immediately see all anonymous peer feedback on the screen at the front of the classroom, and no one knew which feedback was given by whom. Before the final exam week, students spent 15–20 min answering the questionnaire.

3.3. Instruments

Poll Everywhere was used to collect anonymous peer feedback on the three assignments. Based on the peer feedback types in Table 1, two researchers coded all of the peer feedback (double review). The questionnaire gathered students’ perspectives on their learning experience with the peer feedback approach. It comprised three parts: (1) student background, such as age, gender, technology background, and prior peer feedback experience; (2) student perceptions of the impact of the peer assessor role on their performance on subsequent assignments, measured on a Likert scale ranging from 1 (strongly disagree) to 10 (strongly agree); (3) student perceptions of the benefits of being an anonymous assessor and their learning experiences with the peer feedback approach, collected through open-ended questions. Students’ knowledge and critical thinking skills were assessed in the final exam.
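The double review of the comment coding invites an agreement check. The study does not report an inter-rater statistic, but a common choice for two coders with nominal categories is Cohen’s kappa. The sketch below, with purely hypothetical codings, shows how it could be computed:

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for agreement between two coders on nominal labels."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    # Observed agreement: fraction of comments coded identically.
    po = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Chance agreement, from each coder's marginal category rates.
    ca, cb = Counter(codes_a), Counter(codes_b)
    pe = sum((ca[c] / n) * (cb[c] / n) for c in set(codes_a) | set(codes_b))
    return (po - pe) / (1 - pe)

# Hypothetical codings of four comments by the two researchers.
coder1 = ["Praise+", "Praise+", "Criticism+", "Criticism+"]
coder2 = ["Praise+", "Praise+", "Criticism+", "Opinion"]
kappa = cohens_kappa(coder1, coder2)
print(kappa)  # → 0.6
```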

4. Results and Discussion

This section presents the results of the intervention comparing data from peer feedback, students’ perceptions, and scores on the final exam between the Architecture Education Students and Engineering Education Students. Table 2 shows the relationship between the research questions, data used to respond to each question, and method employed to analyze the data.

4.1. Research Question 1: How Do the AES and EES Groups Compare as Regards the Type of Feedback Students Provide to Their Peers?

The analysis of peer feedback type, based on feedback on the three assignments from the two groups of students, elicited 163 responses from EES students and 165 from AES students.

4.1.1. The Analysis of Peer Feedback from Engineering Education Students

From Figure 2, it can be observed that the most frequent feedback type for assignment 1, by a wide margin, was Criticism+, followed by almost equal percentages of Praise+ and Praise−. The percentages of other feedback types were low, although non-zero. That Criticism+ was the most frequent type suggests that students already had some conception of giving constructive feedback, but the largely negative tone may suggest they mistakenly thought that the purpose of feedback was only to point out mistakes, rather than also to give positive comments and suggestions. One student reported that:
There is no perfect work. Most work always gets a negative comment. I can point out the mistakes of my friend’s work anyway.
In assignment 2, Praise+ and Praise− were again almost equal, though this time they were the most frequent types of feedback. All other types occurred at very low frequency and, in the case of Criticism−, not at all. This could be due to overcompensation, as students had become aware that praise was highly valued by their peers (see the student quote below).
After the first round of peer feedback, I saw many negative feedbacks, and some of them are rude and irrelevant. So for the next round, I tried to point out the good things in order to encourage my classmates and not upset them. I guess most people prefer compliments and polite feedback.
In assignment 3, the most frequent type was Combined Praise and Criticism, with almost equal but lower percentages of Praise+ and Criticism+, a small increase in Opinion, and no instances of Praise−, Criticism−, or Irrelevant. This might indicate that students had assimilated the critical thinking they had been inducted into through their teacher’s guidance, their evaluation experience in assignments 1 and 2, and discussion with their partners. This is observable in the absence of unexplained and Irrelevant feedback; students were now giving feedback constructively with explanations, balancing it through Combined Praise and Criticism, and offering Opinion-type feedback. This result is in line with Jones, Antonenko, and Greenwood’s [16] findings that combining the SRS with peer discussion is more beneficial than having students work independently.

4.1.2. The Analysis of Peer Feedback from Architecture Education Students

Figure 3 illustrates the percentages of the Architecture Education students’ feedback types across the three assignments. In contrast to the previous group, Praise+ and Criticism+ were much more evenly distributed from the start. On the other hand, Praise− and Criticism− appeared only in the first assignment. This is probably because no scores were awarded for the peer assessor role in the first assignment, so students did not take the assessor role seriously. Opinion, though starting small in assignment 1, more than doubled with each assignment. This suggests students participated more wholeheartedly in feedback and were quick to assimilate the reasons for providing it. Boud [19] and Vanderhoven et al. [20] reported that the more students participate in peer feedback activity, the more they improve their judgment skills and critical thinking. This is highlighted by the following student comment:
For assignment 3, we have a chance to discuss before giving a paired comment. So we were brainstorming. My partner had an opinion that I didn’t think of. Then we discussed about our different opinions. Finally, we came up with agreement, conclusion. So it’s a high quality of feedback.
However, one student reported that: “Paired discussion is a waste of time. We will have different opinions anyway.”
Combined Praise and Criticism increased substantially in assignment 2, though it fell in assignment 3, perhaps as students became more able to provide constructive suggestions (the Opinion type). Though the Irrelevant type appeared more often in assignment 1 than for the other group, it was not seen after assignment 1, supporting the idea that these students assimilated the purpose of feedback more quickly.
These results imply the AES group had a better understanding of the purposes of feedback in terms of giving explanations compared to the EES group. This result is in line with Ge’s [21] study, which found that students of lower ability benefit more from the peer feedback process than students of higher ability. In addition, this is linked to the notion of Thai culture [22]—this group could be seen as being more traditional than the other group and more restrained in giving verbal or public criticism where they would be identifiable. However, in this feedback process, they were anonymous, and so perhaps less restrained in giving feedback and opinion. Additionally, the AES group appeared to pay more attention to the teacher due to the hierarchical nature of the relationship (students pay respect to teachers) and wanted to participate more fully in order to satisfy the teacher. Conversely, the EES group did not hold the same perspective, and were more dismissive of the process, as they were more comfortable in using verbal and public channels of providing feedback (although they did assimilate and participate more fully in later assignments once they had been inducted into ways to provide feedback more productively).
Additionally, the EES group frequently raised their hands to answer questions, treating it as a classroom competition. We observed that this contrasts with the Architecture Education students, who were reluctant to raise their hands in the classroom. The EES students were more likely to participate in verbal public feedback as they were rewarded for doing so in class, receiving higher participation scores for good opinions. The AES group, though perhaps no less eager to participate, appeared not to have the confidence to overcome the barrier of restraint, even with the same reward motivation on offer. However, at the start of the feedback process examined here, teaching staff observed that the positions were reversed: the EES group took it less seriously, whereas the AES group participated fully from the start because they had a new avenue of communication that did not require them to identify themselves.
I prefer to give feedback anonymously to my classmates because I feel braver to express my opinions without embarrassment and criticize my friends’ works without hesitating.

4.2. Research Question 2: Is There Any Relationship between the Percentage of Opinion-Type Feedback (Critical Thinking) and the Subsequent Assignment Scores When Compared between the AES and EES Groups?

Good feedback should contain strengths, weaknesses, and suggestions for improvement so that students can use it to enhance their future work [3]. Figure 4 compares assignment grades with Opinion-type feedback. Figure 4a shows that the percentage of Opinion-type feedback from the AES students was consistently higher than that from the EES students. Figure 4b shows that performance on each assignment was very similar between the AES and EES students. This suggests that the AES students took more advantage of the peer feedback than the EES students, since the expectation was that the EES students would perform better.
According to Camacho-Miñano and del Campo [23] and Lantz and Stawiski [24], an SRS supports student learning processes and can lead to improved learning results. EES grades improved only in assignment 3; the grades for assignment 2 showed a slight decrease, indicating that the feedback from assignment 1 may not have been useful. Since the grades for assignment 3 increased, it can be suggested that the students, while performing the assignment, took on board the feedback from assignment 2, which was noted above to be of a higher standard than that provided for assignment 1. The unhelpful feedback from assignment 1, followed by more helpful feedback in assignment 2, thus resulted in a delayed improvement in grades. The AES grades, on the other hand, showed a continuous increase. This might be because these students provided more constructive feedback throughout, allowing them to use the ideas generated by this feedback in their own work. According to Lu and Law [25] and Lundstrom and Baker [26], students who provide more feedback, especially high-quality feedback, are more likely to be critical of their own work.

4.3. Research Question 3: How Do the AES and EES Groups Compare Based on Student Opinions as to What the Good Points of Being an Anonymous Assessor Are?

Providing immediate feedback in a face-to-face classroom can increase students’ anxiety because their identity may be revealed, and some shy students may be reluctant to join the activity, remaining passive recipients of teacher feedback. Therefore, Poll Everywhere was used to enable anonymous peer feedback, addressing these problems and increasing student participation [15,27]. Figure 5 shows students’ perceptions of the good points of being an anonymous assessor. The largest percentage of students stated they liked being able to criticize frankly and without hesitation. This suggests that they may have felt able to do something that was not an option for them in everyday interaction, perhaps due to the social consequences of doing so. Li, Steckelberg, and Srinivasan [28] pointed out that students experienced stress when providing frank feedback in a face-to-face class where their identity was revealed. One student from our study mentioned that:
It’s good that I can provide anonymous feedback or else I will be reluctant to criticize my friends’ work. I couldn’t say what I think exactly. So it’s not my real feedback.
However, although feeling braver or less shy was a popular statement, and personal comfort level was also high (especially with the AES group), the reasons given were directed towards the prevention of negative reactions in other students—the elimination of argument, not wanting their friends to be sad, or not wanting them to dislike the commenter. This suggests a desire to keep their environment a positive one, which could be related to aspects of Thai culture, in particular conflict avoidance. It should be noted that some EES students preferred being anonymous assessors to avoid argument and gossip, but none of the AES students mentioned this issue. One EES student mentioned that:
I prefer to provide feedback anonymously to avoid argument because my friend might be angry and disagree with my comment as he knew it’s my comment.
Van den Bos and Tan [29] reported that students can think critically when creating feedback in a less controlled environment (one that does not reveal their identity). From the quantitative data, we found that most students had positive feelings about being anonymous assessors: 63% of the EES group and 72% of the AES group were satisfied with using Poll Everywhere to provide anonymous peer feedback. Interestingly, 59% of EES students and 78% of AES students would like to use Poll Everywhere to support anonymous peer feedback activities in their future classes when they become teachers.
However, some students expressed concern about the disadvantages of anonymous peer feedback, which included the following.
  • Using rude words in feedback: Some students were concerned that, because identities are not revealed, feedback might become impolite or too frank, with strong words that might upset the receiver. Further study is required to understand how to reduce rude words in feedback.
  • Unreliable feedback: Many studies have analyzed the quality of feedback comparing peer feedback and teacher feedback, based on the level of their knowledge [30,31]. In this study, several students mentioned that friends should help friends. Although this is an anonymous peer feedback process, some students do not want to upset their friends; therefore, the quality of feedback is not always based on the student’s ability but may be based on friendly evaluation. Double marking of the quality of peer feedback is required to make the students take assessor roles more seriously [31]. One of the students made the following comment during our study:
We are close friends. So I couldn’t upset her with my frank feedback.

4.4. Research Question 4: How Do the AES and EES Groups Compare Based on the Impact of Peer Feedback Activities on the Students’ Performances as Measured by Final Exam Scores?

The final exam was divided into two categories: multiple-choice questions and essay writing. Table 3 summarizes the AES and EES group statistics for the essay-writing scores. An independent samples t-test showed that the mean difference between the AES and EES groups was not statistically significant, t(60) = 0.798, p = 0.610, with an effect size of d = 0.21. Despite this result, students reported that Poll Everywhere helped them learn better and that giving feedback aided their own learning process, as also reported in Sheng et al.’s [15] and Gielen et al.’s [32] studies. An important aspect to highlight is that the EES students had better average midterm grades than the AES students before the study, so the expectation was that the EES students would perform better on this study’s final exam; however, no significant differences were found between the AES and EES students. Consequently, these results suggest that anonymity allowed lower-performing students to express themselves more effectively than higher-performing students, leading them to improve their learning outcomes.
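As a consistency check on the reported statistics, the effect size can be recovered from the t statistic and the two group sizes. This is a standard conversion, not a step described by the authors:

```python
import math

def d_from_t(t, n1, n2):
    """Cohen's d recovered from an independent-samples t statistic."""
    return t * math.sqrt(1 / n1 + 1 / n2)

# Reported t statistic and the two group sizes (EES n = 36, AES n = 26).
d = d_from_t(0.798, 36, 26)
print(round(d, 2))  # → 0.21
```

This reproduces the reported d = 0.21, confirming that the t statistic and effect size are mutually consistent.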
Finally, the results from the students’ survey regarding their participation in this peer feedback learning activity are shown in Table 4. A total of 54 out of 62 students responded to the survey about their perceptions, which consisted of four questions on a 10-point scale (from 1, strongly disagree, to 10, strongly agree). The scale values were grouped as follows: 8–10 positive, 5–7 neutral, and 1–4 negative. The following percentages were calculated from the group of positive values. Seventy-eight percent of students agreed that evaluating their peers’ work made them compare it with their own work and thus want to improve it. Sixty-seven percent of students reported that seeing peer feedback encouraged them to improve their own work and made them want to improve their ability to provide feedback. Seventy-eight percent of students agreed that the evaluation of peer work inspired them in creative thinking. This result aligns with results from Gielen et al. [33] and Lu and Law [25], who concluded that peer feedback encourages students to think, analyze, compare, judge, and communicate more.
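The banding of the 10-point scale described above can be sketched as follows; the response values are hypothetical:

```python
def likert_band(score):
    """Map a 1-10 Likert response to the bands used in the analysis."""
    if 8 <= score <= 10:
        return "positive"
    if 5 <= score <= 7:
        return "neutral"
    return "negative"  # scores 1-4

def percent_positive(responses):
    """Percentage of responses falling in the positive band."""
    positives = sum(likert_band(r) == "positive" for r in responses)
    return 100 * positives / len(responses)

# Hypothetical responses to one survey item (illustration only).
responses = [9, 8, 10, 6, 3, 8, 7, 9]
print(percent_positive(responses))  # → 62.5
```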
From the information above, most students regarded the peer feedback process positively. We highlight five points from the students’ views on the peer feedback process from the open-ended question.
  • Develop new ideas: “I get new ideas from evaluating peer work and from seeing a variety of feedback. Many students point out different issues with some good suggestions that I’ve never thought of. I like to listen to the different opinions.” [AES12]
  • Encourage to work better: “Many positive and negative feedbacks from my classmates encourage me to work better. I like my work to be accepted. It’s good to see many feedbacks and this evaluation process makes me pay more attention to marking criteria. So I understand how to gain good scores.” [EES20]
  • Realize my own mistakes: “When I evaluate my peer work, I compare it with my own work. So I realize my own mistakes. This peer feedback process makes me think more and learn more from other people’s mistakes and my own mistakes.” [AES15]
  • Share knowledge and exchange ideas: “Peer feedback process provides an opportunity to share knowledge and exchange ideas between classmates. So we can learn from each other. It’s good. We don’t learn only how to do the assignment, we also learn how to provide the feedback and see how to improve it from many classmates’ ideas.” [AES09]
  • Compare evaluation ability: “Some of my classmates’ comments are similar to my comments. Seeing many peer feedbacks make me aware of my and my classmates’ evaluation abilities. We learn in the same class, same lesson, sometimes we have the same comment, sometime we have different opinions of how to criticize work and how to improve work.” [EES25]
In addition, the peer feedback process increases the transparency of assessment and enables students to develop confidence as they take on the assessor role more often. This is confirmed by the 66% of EES students and 67% of AES students who responded positively to participating in this process, as shown in a comment by one of the students: “I don’t like being an assessor from the start. I don’t think I have enough knowledge to provide feedback. However, my confidence is increased by how many times I assess my friends’ work. For assignment 3, I discussed with my partner, and she agreed with my comment.” [AES03]

5. Limitations

Students liked giving feedback via the anonymous Poll Everywhere tool because it spared them the arguments a frank comment might provoke, but one student reported:
My friend who sat next to me read my comment. So this is not a real anonymous tool. We better use this tool in our free time (not in face-to-face classroom) and without the time limit.
Another problem was the Internet connection, which was sometimes slow or dropped entirely. Some students mentioned that Poll Everywhere should be redesigned to make it more attractive and more exciting. However, this was the first time the students had used Poll Everywhere, and they suggested using it in other modules as well (see the student quote below). This idea is also supported by Cartney [34], who suggested that peer feedback should be used at the program level, not the module level, to reduce student anxiety and involve students in active learning.
It’s a good website to express anonymous feedback. So I can criticize frankly. I recommend using it in other subjects as well and it will be great if it became more well known for peer feedback activities.
Although the peer feedback analysis shows positive results, the peer feedback activity is time-consuming and increases student workload. Three rounds of peer feedback might be a maximum, as students’ attention may decrease over time. In addition, teaching in a computer laboratory is not easy, as some students constantly check their online messages or surf the Internet, and during classmates’ presentations some students do not pay attention at all. As peer assessors, however, students paid more attention to their classmates’ presentations.
Methodologically, we have taken steps to minimize any subjectivity in the coding, but it is impossible for it to be completely eliminated.

6. Conclusions

Based on the quantitative and qualitative analyses, the results confirm that the lower-ability Architecture Education students provided more substantial Criticism+ and Opinion-type comments than the higher-ability Engineering Education students, which led to better performances in the subsequent assignments. Students demonstrated positive attitudes toward the peer feedback process, including getting new ideas, being encouraged to work better, realizing their own mistakes, sharing knowledge and exchanging ideas, and comparing evaluation abilities. Although critical thinking abilities developed more in the Architecture Education students than in the Engineering Education students, the difference between the final exam scores of the two groups was not statistically significant. The expectation was that the Engineering Education students would perform better than the Architecture Education students, given their higher midterm scores. Therefore, participation in this study appears to have benefited the Architecture Education students more than the Engineering Education students. Students reported that the Poll Everywhere tool helped them learn better and that giving feedback aided their own learning process, and they were satisfied with using Poll Everywhere to provide anonymous peer feedback. As a result, it is possible that anonymous peer feedback can be used to turn a passive learner into an active one.

Author Contributions

Conceptualization, J.S.; methodology, J.S.; validation, J.S.; formal analysis, J.S.; investigation, J.S.; resources, J.S.; writing—original draft preparation, J.S.; writing—review and editing, J.S., M.J. and H.R.P.; supervision, M.J. and H.R.P. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by King Mongkut’s Institute of Technology Ladkrabang, Thailand [grant number KREF206218].

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, the Belmont Report, the CIOMS guidelines, the International Conference on Harmonisation Good Clinical Practice (ICH-GCP) guideline, and 45 CFR 46.101(b), and was approved by the Research Ethics Committee of King Mongkut’s Institute of Technology Ladkrabang (EC-KMITL_65_041, 18 March 2022).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.


References
1. Anderson, L.W.; Krathwohl, D.R. A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives (Abridged ed.); Addison Wesley Longman: New York, NY, USA, 2001.
2. Kay, R.; MacDonald, T.; DiGiuseppe, M. A comparison of lecture-based, active, and flipped classroom teaching approaches in higher education. J. Comput. High. Educ. 2019, 31, 449–471.
3. Carless, D.; Boud, D. The development of student feedback literacy: Enabling uptake of feedback. Assess. Eval. High. Educ. 2018, 43, 1315–1325.
4. Nicol, D.; Thomson, A.; Breslin, C. Rethinking feedback practices in higher education: A peer review perspective. Assess. Eval. High. Educ. 2014, 39, 102–122.
5. Chien, S.Y.; Hwang, G.J.; Jong, M.S.Y. Effects of peer assessment within the context of spherical video-based virtual reality on EFL students’ English-speaking performance and learning perceptions. Comput. Educ. 2020, 146, 103751.
6. Rotsaert, T.; Panadero, E.; Schellens, T.; Raes, A. “Now you know what you’re doing right and wrong!” Peer feedback quality in synchronous peer assessment in secondary education. Eur. J. Psychol. Educ. 2017, 33, 1–21.
7. Topping, K. Peer assessment between students in colleges and universities. Rev. Educ. Res. 1998, 68, 249–276.
8. Falchikov, N.; Goldfinch, J. Student peer assessment in higher education: A meta-analysis comparing peer and teacher marks. Rev. Educ. Res. 2000, 70, 287–322.
9. Li, L. The role of anonymity in peer assessment. Assess. Eval. High. Educ. 2017, 42, 645–656.
10. Stracke, E.; Kumar, V. Feedback and self-regulated learning: Insights from supervisors’ and PhD examiners’ reports. Reflective Pract. 2010, 11, 19–32.
11. Chen, I.-C.; Hwang, G.-J.; Lai, C.-L.; Wang, W.-C. From design to reflection: Effects of peer-scoring and comments on students’ behavioral patterns and learning outcomes in musical theater performance. Comput. Educ. 2020, 150, 103856.
12. Tai, J.; Ajjawi, R.; Boud, D.; Dawson, P.; Panadero, E. Developing evaluative judgement: Enabling students to make decisions about the quality of work. High. Educ. 2018, 76, 467–481.
13. Masikunis, G.; Panayiotidis, A.; Burke, L. Changing the nature of lectures using a personal response system. Innov. Educ. Teach. Int. 2009, 46, 199–212.
14. Dunn, P.K.; Richardson, A.; Oprescu, F.; McDonald, C. Mobile-phone-based classroom response systems: Students’ perceptions of engagement and learning in a large undergraduate course. Int. J. Math. Educ. Sci. Technol. 2013, 44, 1160–1174.
15. Sheng, R.; Goldie, C.L.; Pulling, C.; Lucktar-Flude, M. Evaluating student perceptions of a multi-platform classroom response system in undergraduate nursing. Nurse Educ. Today 2019, 78, 25–31.
16. Jones, M.E.; Antonenko, P.D.; Greenwood, C.M. The impact of collaborative and individualized student response system strategies on learner motivation, metacognition, and knowledge transfer. J. Comput. Assist. Learn. 2012, 28, 477–487.
17. Wang, Y.-H. Interactive response system (IRS) for college students: Individual versus cooperative learning. Interact. Learn. Environ. 2018, 26, 943–957.
18. De Gagne, J.C. The impact of clickers in nursing education: A review of literature. Nurse Educ. Today 2011, 31, e34–e40.
19. Boud, D. Sustainable assessment: Rethinking assessment for the learning society. Stud. Contin. Educ. 2000, 22, 151–167.
20. Vanderhoven, E.; Raes, A.; Montrieux, H.; Rotsaert, T. What if pupils can assess their peers anonymously? A quasi-experiment study. Comput. Educ. 2015, 81, 123–132.
21. Ge, Z.G. Exploring e-learners’ perceptions of net-based peer-reviewed English writing. Int. J. Comput. Support. Collab. Learn. 2011, 6, 75–91.
22. Richard, B. Life in a Thai School. 2020. Available online: thaischoollife.comin-a-thai-school/ (accessed on 1 November 2022).
23. Camacho-Miñano, M.-d.-M.; del Campo, C. Useful interactive teaching tool for learning: Clickers in higher education. Interact. Learn. Environ. 2016, 24, 706–723.
24. Lantz, M.E.; Stawiski, A. Effectiveness of clickers: Effect of feedback and the timing of questions on learning. Comput. Hum. Behav. 2014, 31, 280–286.
25. Lu, J.; Law, N. Online peer assessment: Effects of cognitive and affective feedback. Instr. Sci. 2012, 40, 257–275.
26. Lundstrom, K.; Baker, W. To give is better than to receive: The benefits of peer review to the reviewer’s own writing. J. Second Lang. Writ. 2009, 18, 30–43.
27. Jones, A.G. Audience response systems in a Korean cultural context: Poll Everywhere’s effects on student engagement in English courses. J. Asia TEFL 2019, 16, 624–643.
28. Li, L.; Steckelberg, A.L.; Srinivasan, S. Utilizing peer interactions to promote learning through a web-based peer assessment system. Can. J. Learn. Technol. 2009, 34.
29. van den Bos, A.H.; Tan, E. Effects of anonymity on online peer review in second-language writing. Comput. Educ. 2019, 142, 103638.
30. Sridharan, B.; Tai, J.; Boud, D. Does the use of summative peer assessment in collaborative group work inhibit good judgement? High. Educ. 2019, 77, 853–870.
31. Sitthiworachart, J.; Joy, M. Computer support of effective peer assessment in an undergraduate programming class. J. Comput. Assist. Learn. 2008, 24, 217–231.
32. Gielen, S.; Tops, L.; Dochy, F.; Onghena, P.; Smeets, S. A comparative study of peer and teacher feedback and of various peer feedback forms in a secondary school writing curriculum. Br. Educ. Res. J. 2010, 36, 143–162.
33. Gielen, S.; Peeters, E.; Dochy, F.; Onghena, P.; Struyven, K. Improving the effectiveness of peer feedback for learning. Learn. Instr. 2010, 20, 304–315.
34. Cartney, P. Exploring the use of peer assessment as a vehicle for closing the gap between feedback given and feedback used. Assess. Eval. High. Educ. 2010, 35, 551–564.
Figure 1. Main activities in the experiment.
Figure 2. Engineering Education students’ feedback types.
Figure 3. Architecture Education students’ feedback types.
Figure 4. Comparison of percentages of Opinion-type feedback (a) with average assignment score (b).
Figure 5. Good points of being an anonymous assessor.
Table 1. Peer feedback types with examples (adapted from Chien, Hwang, and Jong [5] and Stracke and Kumar [10]).

Type | Definition | Example comment on students’ brochure designs
Praise+ | Positive or supportive feedback with a detailed explanation. | The brochure is excellent. The design is good, with the frame and different font sizes making the content easier for the reader to understand.
Praise− | Simple positive or supportive feedback. | Good job!
Combined Praise and Criticism | Feedback that is both positive and negative, with a detailed explanation. | Text color and background color contrast well, but it is not a good choice of color, as it is too dark.
Criticism+ | Negative or unfavorable feedback with a detailed explanation. | Font and color are not attractive. The fancy font is too small and difficult to read. There is not enough information for the reader.
Criticism− | Simple negative or unfavorable feedback. | It’s not great!
Opinion | Constructive feedback or suggestion. | This brochure uses white space well. The title font should be bigger than this; I could not read it. Bullets or arrows should be applied to important information to attract the reader’s attention.
Irrelevant | Feedback that is unrelated to the content, or nonsense. | I don’t want to go. Goodbye!
Table 2. Research questions, data sources, and data analysis.

Research Question | Data Source | Data Analysis
1. How do the Architecture Education Student (AES) and Engineering Education Student (EES) groups compare with regard to the type of feedback students provide to their peers? | Peer feedback on the three assignments from the two groups of students | Categorizing the peer feedback into seven types
2. Is there any relationship between the percentage of Opinion-type feedback (critical thinking) and the subsequent assignment scores when compared between the AES and EES groups? | Peer Opinion-type feedback and assignment scores from the three assignments | Descriptive analysis
3. How do the AES and EES groups compare based on student opinions as to the good points of being an anonymous assessor? | Students’ perceptions of the good points of being an anonymous assessor | Descriptive and content analysis
4. How do the AES and EES groups compare based on the impact of peer feedback activities on students’ performance as measured by final exam scores? | Scores on essay writing (final exam) | Independent samples t-test
   | Students’ agreement (Likert scale) with statements about the peer feedback process motivating them to improve their subsequent assignments and feedback | 10-point scale analysis summing the values of the selected options
   | Students’ views on the peer feedback process from the open-ended question (e.g., encouraged to work better, realized their own mistakes) | Content analysis
Table 3. Scores on essay writing.

Final exam scores | AES | n = 26 | M = 7.345 | SD = 1.266
Table 4. Students’ perceptions.

Statement | Responses (Strongly Disagree → Strongly Agree)
1. Evaluating my peers’ work makes me compare it with my own work and thus want to improve it | 3 9 21 9 12
2. Seeing peer feedback encourages me to improve my own work | 9 9 12 12 12
3. Seeing peer feedback encourages me to improve my ability to provide feedback | 3 6 9 12 12 12
4. Evaluating my peers’ work inspires my creative thinking | 3 9 18 12 12