Interactive Learning with Student Response System to Encourage Students to Provide Peer Feedback

This study analyzed anonymous peer feedback in two groups of university students: a lower-performing class and a higher-performing class. Students used an audience response system to comment anonymously on each other's work. Each piece of peer feedback was categorized into one of seven types: Praise+, Praise−, Criticism+, Criticism−, Combined Praise and Criticism, Opinion, and Irrelevant, where the plus (+) and minus (−) signs denote the quality of the feedback. The learning performance of the two groups was also analyzed. The main result showed that the lower-performing class (based on average midterm scores) provided more substantial Criticism+ and Opinion-type comments than the higher-performing class. Contrary to expectation, no significant difference was found between the two classes on the final exam, suggesting that anonymity allowed lower-performing students to express themselves more effectively than higher-performing students, leading them to improve their learning outcomes.


Introduction

Peer Review Approach
Student-centered learning encourages interaction in the classroom, whereas teacher-centered delivery may discourage students from expressing their opinions and from engaging in other class activities [1,2]. Carless and Boud [3] noted that "it is unrealistic and ineffective to expect teachers to provide more and more comments to large numbers of learners". Giving feedback is likely to be more beneficial to students than simply receiving it, since givers must engage in critical thinking to point out faults and suggest improvements to their peers' work [4]. Therefore, the main activity of this research is encouraging students to provide peer feedback.
Peer review is a learning approach that encourages students to reflect on their understanding and enhance their critical thinking by evaluating each other against marking criteria set by the teacher or by the students themselves [5][6][7]. In order to provide feedback to their classmates, students are encouraged to think (active learning) rather than merely receive feedback (passive learning). Moreover, students gain a better understanding of the marking criteria and learn how to judge the quality of work [8]. This approach also increases the transparency of assessment. However, if students are required to give immediate feedback in front of the class, they may feel uncomfortable because their identities are revealed [9]. Technology can help during this process by offering the option of anonymous feedback.

Peer Feedback Type
A student participates in the peer review process in two ways: as an assessor who provides feedback on peers' work or as an assessee who receives feedback. The peer feedback types used here are a modification of the feedback coding schemes developed by Chien, Hwang, and Jong [5] and Stracke and Kumar [10], which include Praise, Criticism, Opinion, and Irrelevant. When grouping the peer feedback in this study, it was found that some feedback could be classified as both Praise and Criticism, so a further type was added, as shown in Table 1. Some feedback is simple, without detailed explanation; therefore, "+" and "−" are used to classify the quality of feedback, where "+" denotes feedback with a detailed explanation and "−" denotes simple feedback.

Table 1. Peer feedback types, with examples of comments on students' brochure designs (adapted from Chien, Hwang, and Jong [5] and Stracke and Kumar [10]).

Type | Definition | Example Comment
Praise+ | Positive or supporting feedback with detailed explanation. | "The brochure is excellent. The design is good, with the frame and different font sizes making the content easier for the reader to understand."
Praise− | Simple positive or supporting feedback. | "Good job!"
Combined Praise and Criticism | Feedback that is both positive and negative, with detailed explanation. | "Text color and background color contrast well, but it is not a good choice of color, as it is too dark."
Criticism+ | Negative or unfavorable feedback with detailed explanation. | "Font and color are not attractive. The fancy font is too small and difficult to read. There is not enough information for the reader."
Criticism− | Simple negative or unfavorable feedback. | "It's not great!"
Opinion | Constructive feedback or suggestion. | "This brochure uses white space well. The title font should be bigger than this. I could not read it. Bullets or arrows should be applied for important information and to attract reader attention."
Irrelevant | Feedback that is not related to the content, or nonsense. | "I don't want to go. Good bye!"

According to Chen et al. [11], there are two levels of feedback: surface level (simple compliments or simple criticism) and critical level (accurate and practical suggestions). Pointing out strengths and weaknesses and giving suggestions for improvement requires higher-order or critical thinking [3]. Tai et al. [12] also reported that to provide effective feedback, students need to develop critical thinking skills by judging the quality of their own and other people's work.
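The category assignments in Table 1 follow a simple decision logic. As an illustration only (the study's two researchers coded comments manually via double review), the scheme can be restated as a small function whose inputs are a coder's judgments about a single comment; the function name and precedence of checks are our own assumptions:

```python
def code_feedback(praise: bool, criticism: bool, suggestion: bool,
                  detailed: bool, relevant: bool = True) -> str:
    """Map a coder's judgments about one comment to a Table 1 category.

    Illustrative sketch of the coding scheme, not the authors' procedure.
    The precedence (Opinion before Combined, etc.) is an assumption.
    """
    if not relevant:
        return "Irrelevant"
    if suggestion:
        # Constructive suggestions are coded as Opinion.
        return "Opinion"
    if praise and criticism:
        # Table 1 lists Combined only with detailed explanation.
        return "Combined Praise and Criticism"
    if praise:
        # "+" marks feedback with a detailed explanation, "-" simple feedback.
        return "Praise+" if detailed else "Praise-"
    if criticism:
        return "Criticism+" if detailed else "Criticism-"
    return "Irrelevant"
```

For example, "Good job!" carries praise but no explanation, so it codes as Praise−, while a comment pointing out both a strength and a weakness codes as Combined Praise and Criticism.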

Student Response Systems Support Interactive Learning
Integrating technology into learning can enhance students' learning experiences and increase participation and interaction in the classroom [13]. A student response system (SRS), also known as a classroom response system, allows students to respond to teacher questions [14,15]. An SRS is also a tool for collecting students' instant responses and can be used in collaborative learning, as students can share knowledge and exchange ideas with each other [16,17]. An SRS can therefore help turn students from passive learners into active learners, with the anonymous environment making it possible for shyer students to feel more comfortable answering and expressing their opinions [18]. SRSs include Poll Everywhere (polleverywhere.com), Piazza (piazza.com), Socrative (socrative.com), and GoSoapbox (gosoapbox.com). Poll Everywhere (PollEV.com) was chosen for this study because it offers a free plan, enables students to provide immediate anonymous feedback in a face-to-face classroom, allows students to express their ideas from their phones and laptops, and supports exporting responses as an Excel spreadsheet.

Research Questions
The main purpose of this study was to encourage students to improve their critical thinking skills by providing peer feedback under less pressurized (anonymous) conditions through an SRS. The following four research questions shaped this investigation.

1. How do the Architecture Education Student (AES) and Engineering Education Student (EES) groups compare as regards the type of feedback students provide to their peers?
2. Is there any relationship between the percentage of Opinion-type feedback (critical thinking) and the subsequent assignment scores when compared between the AES and EES groups?
3. How do the AES and EES groups compare based on student opinions as to what the good points of being an anonymous assessor are?
4. How do the AES and EES groups compare based on the impact of peer feedback activities on the students' performances as measured by final exam scores?

Context of the Study and Participants
Undergraduate students at the Faculty of Industrial Education and Technology at the first author's university must complete a course called "Innovation and Information Technology in Education". The course objectives are to (1) explore ways that innovation and information technology can improve learning quality; (2) analyze the challenges that innovation and information technology pose in education; and (3) evaluate innovation and information technology. This fifteen-week course includes two hours of lectures and two hours of computer labs per week. Students are expected to obtain the necessary knowledge and skills through the use of educational technology.
Two groups totaling 62 second-year undergraduate students, aged 19 to 21 and enrolled in the "Innovation and Information Technology in Education" course, participated in this study. None of them had any prior experience with the Poll Everywhere tool. The first group of 36 Engineering Education students consisted of 16 males and 20 females. The second group of 26 Architecture Education students consisted of 7 males and 19 females. Both groups were taught by the same lecturer. Under the Engineering Education curriculum, students had some background in technology before attending this module, such as principles of computer programming and telecommunications technology. The Architecture Education curriculum, however, did not include any technology-related modules in the first year.
Based on the average midterm scores, Engineering Education students (mean = 14.32 out of 20; SD = 1.92) were assessed to be higher-ability students than Architecture Education students (mean = 13.09 out of 20; SD = 2.01), with an effect size of d = 0.63, 95% CI [0.11, 1.14]. The first author's university provided funding for this research. Students' rights were protected, and they were informed that their opinions would not be linked to their identities. Participants were informed that they could refuse to participate or withdraw from the study at any time. The study was approved by the Research Ethics Committee of the first author's university.
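The reported effect size can be reproduced from the summary statistics above. A minimal check, assuming the conventional pooled-SD Cohen's d and a normal-approximation confidence interval for it:

```python
import math

# Summary statistics reported in the text (midterm scores out of 20).
m_ees, sd_ees, n_ees = 14.32, 1.92, 36   # Engineering Education students
m_aes, sd_aes, n_aes = 13.09, 2.01, 26   # Architecture Education students

# Pooled standard deviation across the two groups.
sp = math.sqrt(((n_ees - 1) * sd_ees**2 + (n_aes - 1) * sd_aes**2)
               / (n_ees + n_aes - 2))

# Cohen's d: standardized mean difference.
d = (m_ees - m_aes) / sp

# Approximate standard error of d and a 95% CI (normal approximation).
se_d = math.sqrt((n_ees + n_aes) / (n_ees * n_aes)
                 + d**2 / (2 * (n_ees + n_aes)))
lo, hi = d - 1.96 * se_d, d + 1.96 * se_d

print(round(d, 2), round(lo, 2), round(hi, 2))  # -> 0.63 0.11 1.14
```

The result matches the reported d = 0.63, 95% CI [0.11, 1.14].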

Experimental Procedure
Students in each class were asked to form small groups of 3-4 participants. Each group was given the same assignments (design brochures, create surveys, and make video clips) and delivered a 15 min presentation. The lecturer in charge of the class chose which groups would present for each assignment. Due to time restrictions, only 3 groups presented per assignment, although all groups had the opportunity to present over the course of the three assignments. The order of the groups' presentations was randomized. Marking criteria were discussed in the classroom. Students were asked to assess their peers' work (3 assignments chosen randomly by the lecturer) by providing anonymous comments on their peers' presentations using the PollEv.com application. The expectation was that this tool would encourage immediate feedback more effectively, without class pressure and with less anxiety than in a face-to-face classroom. Students gave individual feedback for the first 2 assignments, but discussed with their group partners before giving feedback for assignment 3 (see Figure 1). In the first assignment, students practiced providing feedback without being rewarded with scores. In assignments 2 and 3, peer assessors gained scores for the role of feedback giver. The whole class could immediately see all anonymous peer feedback through PollEV.com on the screen at the front of the classroom, and no one knew which feedback was given by whom. Before the final exam week, students spent 15-20 min answering the questionnaire.


Instruments
PollEv.com was used to collect anonymous peer feedback on the three assignments. Based on the peer feedback types in Table 1, two researchers coded all of the peer feedback (double review). The questionnaire gathered students' perspectives on their learning experience with the peer feedback approach. It comprised three parts: (1) student background, such as age, gender, technology background, and prior peer feedback experience; (2) student perceptions of the impact of peer assessors on their performance on subsequent assignments, measured on a Likert scale from 1 (strongly disagree) to 10 (strongly agree); and (3) student perceptions of the benefits of being an anonymous assessor and their learning experiences with the peer feedback approach, collected through open-ended questions. Students' knowledge and critical thinking skills were assessed in the final test.

Results and Discussion
This section presents the results of the intervention, comparing peer feedback data, students' perceptions, and final exam scores between the Architecture Education Students and Engineering Education Students. Table 2 shows the relationship between the research questions, the data used to answer each question, and the method employed to analyze the data. The analysis of peer feedback types, based on feedback on the three assignments from the two groups, drew on 163 responses from EES students and 165 from AES students.
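The per-type percentages behind Figures 2 and 3 reduce to a frequency count over the coded labels for each assignment. A minimal sketch of that tallying step, using made-up labels rather than the study's data:

```python
from collections import Counter

def type_percentages(labels):
    """Return the percentage of each feedback type in a list of coded labels."""
    counts = Counter(labels)
    total = len(labels)
    return {t: round(100 * c / total, 1) for t, c in counts.items()}

# Hypothetical coded labels for one assignment (not the study's data).
coded = ["Criticism+"] * 5 + ["Praise+"] * 3 + ["Praise-"] * 2
print(type_percentages(coded))  # -> {'Criticism+': 50.0, 'Praise+': 30.0, 'Praise-': 20.0}
```

Running this per group and per assignment yields the percentage distributions plotted in the figures.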

The Analysis of Peer Feedback from Engineering Education Students
From Figure 2, it can be observed that the most frequent feedback type in assignment 1, by a wide margin, was Criticism+, followed by almost equal percentages of Praise+ and Praise−. Percentages of the other feedback types were low, although non-zero. Criticism+ being the most common type suggests that students already had some conception of giving constructive feedback, but the largely negative tone may suggest they mistakenly thought that the purpose of feedback was simply to point out mistakes, rather than also to give positive comments and suggestions.
In assignment 2, Praise+ and Praise− were, again, almost equal, though this time they were the most frequent types of feedback. All other types were very low in frequency and, in the case of Criticism−, non-existent. This could be due to overcompensation, because students had become aware that praise was highly valued by their peers (see the student quote below).

"After the first round of peer feedback, I saw many negative feedbacks, and some of them are rude and irrelevant. So for the next round, I tried to point out the good things in order to encourage my classmates and not upset them. I guess most people prefer compliments and polite feedback." [EES05]
In assignment 3, the most frequent type was Combined Praise and Criticism, with almost equal but lower percentages for Praise+ and Criticism+, a small increase for Opinion, and no Praise−, Criticism−, or Irrelevant feedback. This may indicate that students had assimilated the critical thinking they had been inducted into by their teacher, their evaluation experience in assignments 1 and 2, and the discussion with their partner. This is observable in the absence of feedback without explanation and of Irrelevant feedback; students were now giving feedback constructively with explanations, balanced feedback combining Praise and Criticism, and Opinion-type feedback. This result is in line with Jones, Antonenko, and Greenwood's [16] finding that combining an SRS with peer discussion is more beneficial than having students work independently.

The Analysis of Peer Feedback from Architecture Education Students
Figure 3 illustrates the percentages of Architecture Education students' feedback types across the three assignments. In contrast to the previous group, Praise+ and Criticism+ were much more evenly distributed from the start. On the other hand, Praise− and Criticism− appeared only in the first assignment. This is probably because no scores were awarded for the peer assessor role in the first assignment, so students did not take the assessor role seriously. Opinion, though starting small in assignment 1, more than doubled with each assignment. This suggests students participated more wholeheartedly in feedback and were quick to assimilate the reason for providing it. Boud [19] and Vanderhoven et al. [20] reported that the more students participate in peer feedback activity, the more they improve their judgment skills and critical thinking. This is highlighted by the following student comment:

"For assignment 3, we have a chance to discuss before giving a paired comment. So we were brainstorming. My partner had an opinion that I didn't think of. Then we discussed about our different opinions. Finally, we came up with agreement, conclusion. So it's a high quality of feedback." [AES18]
However, one student reported that: "Paired discussion is a waste of time. We will have different opinions anyway." [AES06]
Combined Praise and Criticism increased substantially in assignment 2, though it fell in assignment 3, perhaps as students became more able to provide constructive suggestions (the Opinion type). Though the Irrelevant type appeared more often in assignment 1 than for the other group, it was not seen after assignment 1, supporting the idea that these students assimilated the purpose of feedback more quickly. These results imply the AES group had a better understanding of the purposes of feedback, in terms of giving explanations, than the EES group. This result is in line with Ge's [21] study, which found that students of lower ability benefit more from the peer feedback process than students of higher ability. In addition, this can be linked to Thai culture [22]: this group could be seen as more traditional than the other and more restrained in giving verbal or public criticism where they would be identifiable. In this feedback process, however, they were anonymous, and so perhaps less restrained in giving feedback and opinions. Additionally, the AES group appeared to pay more attention to the teacher due to the hierarchical nature of the relationship (students pay respect to teachers) and wanted to participate more fully in order to satisfy the teacher. Conversely, the EES group did not hold the same perspective and were more dismissive of the process, as they were more comfortable using verbal and public channels of providing feedback (although they did assimilate and participate more fully in later assignments once they had been inducted into more productive ways to provide feedback).
Additionally, the EES group routinely raised their hands to answer questions, treating it as a classroom competition. As we observed, this contrasts with the Architecture Education students, who were always reluctant to raise their hands in the classroom. The EES students were more likely to participate in verbal public feedback, as they were rewarded for doing so in class by receiving higher participation scores for good opinions. The AES group, though perhaps no less eager to participate, appeared not to have the confidence to overcome the barrier of restraint, even with the same reward on offer. However, at the start of the feedback process examined here, teaching staff observed that the positions were reversed: the EES group took it less seriously, whereas the AES group participated fully from the start because they had a new avenue of communication that did not require them to identify themselves.

Research Question 2: Is There Any Relationship between the Percentage of Opinion-Type Feedback (Critical Thinking) and the Subsequent Assignment Scores When Compared between the AES and EES Groups?
Good feedback should contain strengths, weaknesses, and suggestions for improvement so that students can utilize the feedback to enhance their work in the future [3]. Figure 4 compares assignment grades with Opinion-type feedback. Figure 4a shows that the percentage of Opinion-type feedback from the AES students was progressively higher than that from the EES students. Figure 4b shows that performance on each assignment was very similar between the AES and EES students. This suggests that the AES students were taking more advantage of the peer feedback than the EES students, since the expectation was that the EES students would perform better. According to Camacho-Miñano and del Campo [23] and Lantz and Stawiski [24], an SRS is effective for student learning processes and can lead to improved learning results. EES grades improved only in assignment 3, and the grades for assignment 2 show a slight decrease, indicating that the feedback from assignment 1 may not have been useful. The increase in assignment 3 grades suggests that the students, while performing the assignment, might have taken on board the feedback from assignment 2, which was noted above to be of a higher standard than that provided for assignment 1. The unhelpful feedback from assignment 1, followed by more helpful feedback in assignment 2, resulted in a delayed improvement in grades. On the other hand, the AES grades showed a continuous increase. This might have been because these students provided more constructive feedback throughout, allowing them to use the ideas generated by this feedback in their work. According to Lu and Law [25] and Lundstrom and Baker [26], students who provide more feedback, especially high-quality feedback, are more likely to be critical of their own work.

Research Question 3: How Do the AES and EES Groups Compare Based on Student Opinions as to What the Good Points of Being an Anonymous Assessor Are?
Instead of being passive recipients of teacher feedback, students providing immediate feedback in a face-to-face classroom can experience increased anxiety because their identity may be revealed, and some shy students may be reluctant to join this activity. Therefore, Poll Everywhere was used to provide anonymous peer feedback to address these problems and increase student participation [15,27]. Figure 5 shows students' perceptions of the good points of being an anonymous assessor. The largest percentage of students stated they liked being able to criticize frankly and without hesitation. This suggests that they may have felt able to do something that was not an option for them in real life, perhaps due to the restrictive consequences of doing so. Li, Steckelberg, and Srinivasan [28] pointed out that students experienced stress when providing frank feedback in a face-to-face class when their identity was revealed. One student from our study mentioned that:

"It's good that I can provide anonymous feedback or else I will be reluctant to criticize my friends' work. I couldn't say what I think exactly. So it's not my real feedback." [AES21]
However, although feeling braver or less shy was a popular statement, and personal comfort was also rated highly (especially in the AES group), the reasons given were directed towards preventing negative reactions in other students: eliminating argument, not wanting their friends to be sad, or not wanting to be disliked. This suggests a desire to keep their environment a positive one, which could be related to aspects of Thai culture, in particular conflict avoidance. It should be noted that some EES students preferred being anonymous assessors to avoid argument and gossip, but none of the AES students mentioned this issue. One EES student mentioned that:

"I prefer to provide feedback anonymously to avoid argument because my friend might be angry and disagree with my comment as he knew it's my comment." [EES04]
Van den Bos and Tan [29] reported that students can think critically in the creation of feedback within a less controlled environment (i.e., without revealing their identity). From the quantitative data, we found that most students had positive feelings about being anonymous assessors: 63% of the EES group and 72% of the AES group were satisfied with using Poll Everywhere to provide anonymous peer feedback. Interestingly, 59% of EES students and 78% of AES students would like to use Poll Everywhere to support anonymous peer feedback activities in their future classes when they become teachers.
However, some students expressed concern about the disadvantages of anonymous peer feedback, which included the following.

• Using rude words in feedback: Some students were concerned that, because identities were not revealed, feedback might include rude or inappropriate words. Anonymity might therefore encourage more impolite or overly frank feedback with strong words that could upset the receiver. Further study is required to understand how to reduce rude words in feedback.

•
Unreliable feedback: Many studies have analyzed the quality of feedback comparing peer feedback and teacher feedback, based on the level of their knowledge [30,31].In this study, several students mentioned that friends should help friends.Although this is an anonymous peer feedback process, some students do not want to upset their However, although feeling braver or less shy was a popular statement, and personal comfort level was also high (especially with the AES group), the reasons given were directed towards the prevention of negative reactions in other students-the elimination of argument, not wanting their friends to be sad, or not wanting them to dislike the commenter.This suggests a desire to keep their environment a positive one, which could be related to aspects of Thai culture, in particular conflict avoidance.It should be noted that some EES students preferred being anonymous assessors to avoid argument and gossip, but none of the AES students mentioned this issue.One EES student mentioned that: "I prefer to provide feedback anonymously to avoid argument because my friend might be angry and disagree with my comment as he knew it's my comment.

" [EES04]
Van den Bos and Tan [29] reported that students can think critically when creating feedback in a less controlled environment (i.e., without revealing their identity). From the quantitative data, we found that most students had positive feelings about being anonymous assessors: 63% of the EES group and 72% of the AES group were satisfied with using Poll Everywhere to provide anonymous peer feedback. Interestingly, 59% of EES students and 78% of AES students would like to use Poll Everywhere to support anonymous peer feedback activities in their own future classes when they become teachers. However, some students expressed concern about the disadvantages of anonymous peer feedback, which included the following.

• Using rude words in feedback: Some students were concerned that rude or inappropriate words might appear in feedback because identities are not revealed. Anonymity might therefore encourage impolite or overly frank feedback with strong words that could upset the receiver. Further study is required to understand how to reduce rude language in feedback.

• Unreliable feedback: Many studies have compared the quality of peer feedback and teacher feedback based on the assessors' level of knowledge [30,31]. In this study, several students mentioned that friends should help friends. Although the peer feedback process was anonymous, some students did not want to upset their friends; therefore, the quality of feedback was not always based on a student's ability but may instead have reflected a friendly evaluation. Double marking of the quality of peer feedback is required to make students take the assessor role more seriously [31].

The final exam was divided into two categories: multiple-choice questions and essay writing. Table 3 summarizes the essay-writing statistics for the AES and EES groups. An independent samples t-test showed that the mean difference between the AES and EES groups was not statistically significant, t(60) = 0.798, p = 0.610, with an effect size of d = 0.21. Despite this result, students reported that Poll Everywhere helped them learn better and that giving feedback aided their own learning process, as also reported by Sheng et al. [15] and Gielen et al. [32]. An important aspect to highlight is that the EES students had better average midterm grades than the AES students before the study, so the expectation was that the EES students would also perform better on this study's final exam; however, no significant difference was found between the AES and EES students. Consequently, these results suggest that anonymity allowed lower-performing students to express themselves more effectively than higher-performing students, leading them to improve their learning outcomes.

Finally, the results from the students' survey regarding their participation in this peer feedback learning activity are shown in Table 4. A total of 54 out of 62 students responded to the survey about their perceptions, which consisted of four questions on a 10-point scale (from 1 to 10, representing strongly disagree to strongly agree). The scale values were grouped as follows: 8-10 positive, 5-7 neutral, and 1-4 negative. The following percentages were calculated from the group of positive values. Seventy-eight percent of students agreed that evaluating their peers' work made them compare it with their own work and want to improve it. Sixty-seven percent of students reported that seeing peer feedback encouraged them to improve their own work and made them want to improve their ability to provide feedback. Seventy-eight percent of students agreed that evaluating peer work inspired creative thinking. This result aligns with Gielen et al. [33] and Lu and Law [25], who concluded that peer feedback encourages students to think, analyze, compare, judge, and communicate more.
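The scale grouping described above (8-10 positive, 5-7 neutral, 1-4 negative, with percentages taken over all responses to a question) can be sketched as follows. The response values here are hypothetical; the study's actual 54 responses per question are not reproduced.

```python
from collections import Counter

# Hypothetical 10-point scale responses for one survey question
# (1 = strongly disagree, 10 = strongly agree).
responses = [9, 8, 10, 7, 6, 8, 3, 9, 5, 10, 8, 4]

def group(score: int) -> str:
    """Grouping used in the study: 8-10 positive, 5-7 neutral, 1-4 negative."""
    if 8 <= score <= 10:
        return "positive"
    if 5 <= score <= 7:
        return "neutral"
    return "negative"

counts = Counter(group(s) for s in responses)
percentages = {k: round(100 * v / len(responses)) for k, v in counts.items()}
print(percentages)
```

The reported figures (e.g., "seventy-eight percent of students agreed") correspond to the "positive" entry of such a tally.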
From the information above, most students regarded the peer feedback process positively. We highlight five points from the students' views on the peer feedback process, drawn from the open-ended question. In addition, the peer feedback process increases the transparency of assessment and enables students to develop confidence as the number of times they act as an assessor grows. This is confirmed by the 66% of EES students and 67% of AES students who responded positively to participating in this process, as shown in a comment by one of the students:

"I don't like being an assessor from the start. I don't think I have enough knowledge to provide feedback. However, my confidence is increased by how many times I assess my friends' work. For assignment 3, I discussed with my partner, and she agreed with my comment." [AES03]

Limitations
Students liked giving feedback via the Poll Everywhere anonymous tool because it spared them the arguments that a frank comment can provoke, but one student reported:

"My friend who sat next to me read my comment. So this is not a real anonymous tool. We better use this tool in our free time (not in face-to-face classroom) and without the time limit." [EES14]
Another problem was the Internet connection, which was sometimes slow or dropped entirely. Some students mentioned that Poll Everywhere should be redesigned to make it more attractive and more exciting. However, this was the first time the students had used Poll Everywhere, and they suggested using it in other modules as well (see the student's quote below). This idea is also supported by Cartney [34], who suggested that peer feedback should be used at the program level, not the module level, to reduce student anxiety and involve them in active learning.

"It's a good website to express anonymous feedback. So I can criticize frankly. I recommend using it in other subjects as well and it will be great if PollEv.com became more well known for peer feedback activities." [AES07]
Although the peer feedback analysis shows positive results, the peer feedback activity is time-consuming and increases student workload. Three rounds of peer feedback might be the maximum, as students' attention might decrease over time. In addition, teaching in a computer laboratory is not easy, as some students constantly check their online messages or surf the Internet; during classmates' presentations in particular, some students do not pay attention at all. When acting as peer assessors, however, students paid more attention to their classmates' presentations.
Methodologically, we took steps to minimize subjectivity in the coding, but it cannot be completely eliminated.

Conclusions
Based on the quantitative and qualitative analyses, the results confirm that the lower-ability Architecture Education students provided more substantial Criticism+ and Opinion-type comments than the higher-ability Engineering Education students, which led to better performance in the subsequent assignments. Students demonstrated positive thinking about the peer feedback process, including getting new ideas, being encouraged to work better, realizing their own mistakes, sharing knowledge and exchanging ideas, and comparing evaluation abilities. Although critical thinking abilities developed more in the Architecture Education students than in the Engineering Education students, the difference between the final exam scores of the two groups was not statistically significant. Given their higher midterm scores, the Engineering Education students had been expected to outperform the Architecture Education students; therefore, participation in this study appeared to benefit the Architecture Education students more. Students reported that the Poll Everywhere tool helped them learn better and that giving feedback aided their own learning process, and they were satisfied with using Poll Everywhere to provide anonymous peer feedback. As a result, anonymous peer feedback may help turn a passive learner into an active one.

Figure 1. Main activities in the experiment.

Figure 4. Comparison of percentages of Opinion-type feedback (a) with average assignment scores (b).

4.3. Research Question 3: How Do the AES and EES Groups Compare Based on Student Opinions as to What the Good Points of Being an Anonymous Assessor Are?

Figure 5. Good points of being an anonymous assessor.

"It's good that I can provide anonymous feedback or else I will be reluctant to criticize my friends' work. I couldn't say what I think exactly. So it's not my real feedback." [AES21]

Table 1. Peer feedback types with examples (adapted from Chien, Hwang, and Jong).

Table 2. Research questions, data sources, and data analysis.

2. Is there any relationship between the percentage of Opinion-type feedback (critical thinking) and the subsequent assignment scores when compared between the AES and EES groups? Data: peer Opinion-type feedback and assignment scores from the three assignments. Analysis: descriptive analysis.

3. How do the AES and EES groups compare based on student opinions as to what the good points of being an anonymous assessor are? Data: students' perceptions of the good points of being an anonymous assessor. Analysis: descriptive and content analysis.

4. How do the AES and EES groups compare based on the impact of peer feedback activities on the students' performances as measured by final exam scores? Data: scores on essay writing (final exam); students' agreement (10-point Likert scale) with statements that the peer feedback process motivated them to improve their subsequent assignments and feedback; students' views on the peer feedback process from the open-ended question (e.g., encouraged to work better, realized their own mistakes). Analysis: independent samples t-test; 10-point scale analysis summing the value of each selected option; content analysis.

4.1. Research Question 1: How Do the AES and EES Groups Compare as Regards the Type of Feedback Students Provide to Their Peers?

"There is no perfect work. Most work always gets a negative comment. I can point out the mistakes of my friend's work anyway." [EES30]

Figure 2. Engineering Education students' feedback types.

"We are close friends. So I couldn't upset her with my frank feedback." [AES01]

4.4. Research Question 4: How Do the AES and EES Groups Compare Based on the Impact of Peer Feedback Activities on the Students' Performances as Measured by Final Exam Scores?

Table 3. Scores on essay writing.
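The comparison behind Table 3 is an independent samples t-test with Cohen's d as the effect size. A minimal sketch of that computation, assuming equal variances and using hypothetical score lists (the study's actual per-student scores are not reproduced here):

```python
import math
from statistics import mean, stdev

# Hypothetical essay scores for two groups of students.
aes = [14, 16, 15, 17, 13, 18, 16, 15]
ees = [15, 14, 16, 13, 15, 17, 14, 16]

def pooled_t_and_d(a, b):
    """Independent samples t-test (equal variances assumed) and Cohen's d."""
    na, nb = len(a), len(b)
    va, vb = stdev(a) ** 2, stdev(b) ** 2          # sample variances
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    t = (mean(a) - mean(b)) / math.sqrt(sp2 * (1 / na + 1 / nb))
    d = (mean(a) - mean(b)) / math.sqrt(sp2)       # Cohen's d uses pooled SD
    df = na + nb - 2
    return t, d, df

t, d, df = pooled_t_and_d(aes, ees)
print(f"t({df}) = {t:.3f}, d = {d:.2f}")
```

With the study's 62 students split across the two classes, the degrees of freedom follow as n1 + n2 - 2 = 60, matching the reported t(60).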