Article

Supportive Peer Feedback in Tertiary Education: Analysis of Pre-Service Teachers’ Perceptions

1 Department of Specific Didactics, University of Girona, 17004 Girona, Spain
2 Teaching Innovation Networks on Reflective and Cooperative Learning, Institute of Sciences Education, University of Girona, 17003 Girona, Spain
3 Department of Physics, University of Girona, 17003 Girona, Spain
* Author to whom correspondence should be addressed.
Educ. Sci. 2019, 9(4), 280; https://doi.org/10.3390/educsci9040280
Submission received: 26 October 2019 / Revised: 22 November 2019 / Accepted: 24 November 2019 / Published: 26 November 2019

Abstract: To acquire knowledge about student-mediated peer-to-peer collaborative activities, pre-service teachers’ perceptions of peer feedback are analyzed and categorized as receiver, provider, or cognitive feedback. A questionnaire of 15 survey questions concerning supportive feedback from peers was designed and validated using assessments from more than 200 pre-service teachers. The questionnaire was aligned with the activities promoting supportive feedback between pre-service teachers from three bachelor’s degrees at a tertiary education institution. Their perceptions were then quantified in terms of the peer feedback categories. While there were significant correlations between the scores for all 15 questions, real insights were produced when the highest correlations were analyzed. As such, being involved as both feedback providers and receivers was highly rated. The self-efficacy of pre-service teachers receiving feedback (i.e., the extent to which peer instructional strategies and the selected learning tasks were cognitively challenging so as to improve receiver feedback) proved to be correlated with their perceptions of involvement, autonomy, and structure. Likewise, motivation for providing or receiving feedback was also closely correlated with the self-efficacy of pre-service teachers providing feedback. Finally, all three questions in the cognitive feedback category were highly correlated. The pre-service teachers were, thus, motivated to improve their learning and considered feedback a useful task and a way to strengthen their relationships with their peers.

1. Introduction

Education in tertiary institutions is increasingly directed towards promoting initiative skills such as proactivity, autonomy, and criticism [1]. In tertiary education, once the active role of students is acknowledged, education relies not only on learners to process quality information from various sources and use it to increase learning outcomes [2], but also on the ability of teachers to promote student interaction and relationships in socialization and learning [3,4,5]. Education is founded on proactive processes that ensure educational problems are eliminated before they occur [6]. Here, learner-driven strategies are essential and require students to engage their capabilities and skills through action [7]. Within the teaching–learning context, metacognition emphasizes that students are self-regulating and responsible for their own learning, and that teachers should guide them in that very process. Feedback begins with students and teachers establishing a continuous and cyclical process of active dialogic interaction. Social constructivism, on the other hand, focuses on knowing how students actively participate in the construction of their knowledge [8,9] through dialog and sense-making [2,10]. The starting point for feedback is a student’s pre-existing knowledge, especially when interacting with their peers. Feedback, then, is no longer controlled by the teacher but by the student. Within this educational paradigm, feedback is not an evaluative process, nor does it judge; instead, it is a balanced, constructive, and stimulating activity that is of inherent interest for the students themselves and is based on social constructivist approaches [11]. First, the quality of a student’s performance is fostered when students are involved in dual peer-to-peer dialogs using activities designed to promote such dialog or when talking about learning.
Second, students are guided on how to monitor and evaluate their own learning capacities, their capacity for lifelong learning, for goal setting and for planning their learning outcomes. Third, specific disciplines, curricula, and contextual assessment tasks are designed to facilitate student engagement in developing complex tasks, and to generate continuous critical feedback derived from multiple assignments. By fostering feedback, students become competent in decision-making and initiative skills that will later promote autonomy and criticism [12,13,14]. However, students must also be aware of the value feedback has and understand the importance of their own active role (as a provider or receiver) in its processes [2].
Feedback can be observed from the perspectives of the provider or receiver and benefit both parties involved in the learning process [15,16,17]. For instance, when analyzing the work of their peers, the feedback providers must first reflect on their own work to improve the quality of what they themselves produce [18,19]. This also contributes to creating reflective knowledge because students have to evaluate their own work as well as that of others in relation to the group [20,21,22]. Finally, reflection is reinforced because, when articulating an evaluation judgment, the feedback provider must offer a coherent explanation. Although peer feedback is also mediated by students’ perceptions of the critique they have received [17,23], typically students perceive the benefits of providing feedback rather than receiving it [17,24,25]. Generally, in mediating peer feedback, students who provide and receive feedback are more motivated when seeing themselves totally involved in the activities undertaken in class [21]. According to the theory of self-determination, a learning environment should support the basic psychological needs for autonomy, competence, and involvement [26]. Indeed, an optimal process of needs-supportive feedback should be facilitated by the psychological need for autonomy in which feedback providers and receivers feel that they are at the origin of their actions and that their actions are concordant with their values, which include responsibility, commitment, criticism, and perseverance [27]. As noted in [28], a high-quality feedback process must support autonomy, and providing and receiving feedback may be conceptualized as a specific aspect of the structure and be related to, in our case, the pre-service teachers’ experiences of effectiveness. The third category, involvement, promotes feelings of relatedness, i.e., experiencing close emotional bonds with significant others [26]. 
Feedback providers and receivers may show understanding and/or the ability to offer support during the feedback process. Finally, self-efficacy means that both providers and receivers have individual capabilities to bring about desired outcomes, especially in terms of engagement and learning.
Although research usually focuses on understanding the skills needed to make sense of complex, quality information and use it to produce significant learning outcomes [2], this paper targets peer feedback and specifically analyzes the perceptions pre-service teachers have of their roles as feedback providers and receivers. We are interested in defining the categories that characterize the process in terms of how developing peer feedback provides structure, autonomy, and involvement, as well as self-efficacy. We base our analysis on a previously undescribed supportive peer feedback approach aimed at analyzing pre-service teachers’ basic needs. Supportive peer feedback distinguishes three basic needs: provider and receiver autonomy, involvement, and the feedback structure, which, together with self-efficacy, promote self-regulated learning [29]. For both provider and receiver feedback, cognitive feedback relates to how cognitively challenging the selected learning tasks are [30]. Self-efficacy, in the context of peer feedback, may be defined as judgments about individual capabilities to bring about desired student engagement and learning outcomes [31]. In our proposed model, peer feedback self-efficacy has been operationalized as a construct comprising efficacy for both receivers and providers, self-regulation, and instructional strategies.
As such, this paper focuses on engaging pre-service teachers from three bachelor’s degrees, through collaborative activities and peer feedback. Furthermore, by using an analytical model of categories concerning their basic needs, the paper also examines peer feedback evaluation. We quantitatively analyzed the categories that are supported by 15 questions, all of which are directed towards determining the pre-service teachers’ perceptions of the feedback processes they were involved in. Through the statistical analysis of the correlated questions, we have been able to determine the most significant interactions that define pre-service teachers’ co-construction of knowledge through supportive feedback. Although there is some research on teachers’ and pre-service teachers’ perceptions about peer feedback [7,32,33,34], studies into pre-service teachers’ perceptions about learner-driven supportive feedback are lacking.

2. Methods

2.1. Context

The experiment was carried out with three groups of undergraduates from three bachelor’s degrees at the University of Girona: the Bachelor’s degree in Pre-school Education (BECE), the Bachelor’s degree in Primary Education (BEP), and the Double Degree in Pre-school Education and Primary Education (DD). The Spanish curriculum for undergraduate teaching degrees requires four years of study. This experimental study was carried out during a 75-hour module required in all three degrees at the Josep Pallach Institute of Education Sciences, University of Girona (ICE-UdG), Spain.

2.2. Participants

There were 214 pre-service teachers in total, with ages ranging between 18 and 37 years old; though the majority (84%) were between 18 and 25 years old. The sample had a higher percentage of female (69%) than male pre-service teachers (31%) and 30% were enrolled in the BECE, 56% in the BEP, and 14% in the DD.

2.3. Peer Feedback Activities

All the peer activities proposed for the pre-service teachers were designed based on collaborative learning. First, the pre-service teachers from the BEP were asked to produce a graphical scientific abstract based on an experimental scientific experience. Feedback was initiated through peer interaction in groups of three. Dialog between two feedback providers and one receiver was based on the changes the feedback providers proposed to improve the quality of the receiver’s initial abstract. All three members of the group took turns at being a feedback receiver (one interaction) and a feedback provider (two interactions). Next, by taking into consideration the feedback they had received, each group member replotted their graphical abstract to create a second version. The activity was repeated once a week over six weeks, totaling six peer feedback interactions. The interaction between the pre-service teachers was based on true collaborative peer learning [35], which maximizes the supportive feedback between the provider and the receiver, because the pre-service teachers in each group found themselves in the same situation of lacking knowledge about the new material. To develop the activity and to improve their processing and retention of the learning tasks, the pre-service teachers followed a fixed and controlled script provided by the teacher. Although the same script can be applied to different types of tasks, such as reading or writing, here it was applied to a task that required the group members to summarize their acquired knowledge [36].
Meanwhile, each pre-service student from the Double Degree (DD) was given an incomplete piece of information from a jigsaw activity [37]. (NB: The jigsaw technique considers each member in the group to be a unique piece of the puzzle; therefore, to complete the final artifact, the participation and contribution of each one is essential). Thus, the need for reciprocal communication from all three members in each of the cooperative groups was generated, and the pre-service teachers had to integrate their piece of the puzzle with the other group members’ pieces to complete the final item. The activity evolved over four phases. In Phase 0, the base teams were set up. In the jigsaw activity in question, a text was divided into three sections (A, B, and C), and each student was responsible for controlling (reading and understanding) their part (A, B, or C) of the text. In Phase 1, the groups of three met to discuss and share the information they had from the text and then, in Phase 2, elaborated the final version. Once each group had finalized their text, two base teams were brought together and one team was asked to be the feedback providers and the other the feedback receivers. As in the case of the BEP pre-service teachers, the activity was repeated once a week over six weeks, totaling six peer feedback interactions. The groups alternated their roles as the group providing or receiving feedback, i.e., reciprocal peer feedback [38].
Finally, the pre-service teachers from BECE carried out individual autonomous research into a subject proposed by their teacher, which was then presented to the rest of their base team members. This is a complex task as its main purpose is to create debate between the group members [39]. In this case, the pre-service teachers were left to organize themselves into groups of four or six. As with the BEP and DD pre-service teachers, the BECE pre-service teachers were also involved in six activities, one per week, over six weeks. As each member of the team presented their findings, the other members provided feedback. The presenters alternated each week. This activity is an example of reciprocal plural peer feedback [4].

2.4. Peer Feedback Questionnaire and Conceptual Framework

After the activities had been completed, the pre-service teachers answered the 15 PeerFQuest questions (Table 1). PeerFQuest was designed, constructed, implemented, and tested by analyzing prospective models dealing with introducing peer feedback processes into tertiary education. Questionnaires are typically used to optimize effective learning strategies, in terms of multi-method research [40], metacognitive aspects [41], and methodological skills [42,43,44]. According to [45], using questionnaires to complement tasks provides an accurate reflection of cognitive processing. Therefore, PeerFQuest is not only designed to evaluate peer interaction in the feedback process and the roles of providers or receivers, but also the cognitive processes inherent to supportive feedback.
As such, the self-reporting PeerFQuest was elaborated to gain insight into the pre-service teachers’ assessment of various aspects of supportive feedback. The questionnaire was constructed around three general feedback categories: receiver, provider, and cognitive. The two categories on providing and receiving feedback are aligned with teachers playing an active role in motivating the pre-service teachers through high levels of support for autonomy, structure (support of competence), and involvement (support of relatedness) [26,46]. Pre-service teachers who are motivated to learn are more likely to be actively engaged in activities and to ask for feedback [46]. Therefore, questions Q1, Q2, and Q7 could be grouped into the category of peer feedback involvement; Q3 and Q8 into peer feedback autonomy support; and Q4, Q9, and Q10 into peer feedback structure. Questions Q5, Q6, Q11, and Q12 referred to student self-efficacy, and Q13, Q14, and Q15 to cognitive feedback [26,27,28].
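The question-to-category grouping described above can be expressed as a small mapping, a convenient starting point for scoring the questionnaire by category. This is a sketch only; the lowercase label names are our illustrative choices, not identifiers from the paper:

```python
# Mapping of the 15 PeerFQuest questions to the subcategories described
# in Section 2.4; the label spellings are illustrative, not from the paper.
CATEGORIES = {
    "involvement": ["Q1", "Q2", "Q7"],
    "autonomy_support": ["Q3", "Q8"],
    "structure": ["Q4", "Q9", "Q10"],
    "self_efficacy": ["Q5", "Q6", "Q11", "Q12"],
    "cognitive": ["Q13", "Q14", "Q15"],
}

# Sanity check: the subcategories partition all 15 questions exactly once.
all_questions = sorted(
    (q for qs in CATEGORIES.values() for q in qs),
    key=lambda q: int(q[1:]),
)
assert all_questions == [f"Q{i}" for i in range(1, 16)]
```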

2.5. Statistical Analysis

The pre-service teachers’ answers to the PeerFQuest (Table 1) were scaled on a Likert scale (1 = disagree, 5 = strongly agree). A reliability analysis was conducted to ensure the dependability of the answers, as good development procedures may result in a reasonably reliable survey instrument [47,48,49]. For each of the three categories (Table 1), Cronbach’s coefficient alpha was 0.87 (receiver feedback), 0.81 (provider feedback), and 0.81 (cognitive feedback), thereby indicating that the PeerFQuest has excellent internal-consistency reliability [50,51]. The analyses of the mean, standard deviation, and correlation (Table 2) were carried out with SPSS Statistics 19.0.
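For reference, Cronbach’s coefficient alpha can be computed directly from an item-response matrix. The following is a minimal sketch in Python with NumPy, using made-up Likert data rather than the study’s responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of Likert scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Illustrative data only: 6 respondents x 4 items on a 1-5 Likert scale.
scores = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 4, 3, 3],
])
print(round(cronbach_alpha(scores), 2))  # prints 0.92
```

Values above roughly 0.8, as obtained for the three PeerFQuest categories, are conventionally read as good internal consistency.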

3. Results

Table 2 shows the main descriptive statistics (mean and standard deviation) for the 15 PeerFQuest questions, as well as the linear correlation between the scores the pre-service teachers gave for the questions. In terms of descriptive statistics, all the questions were completed in their entirety (214).
The highest scores were found for questions Q1 (Did you like receiving feedback from your partner(s)?) and Q2 (How tactfully did your partner(s) give you feedback?) in the receiver feedback category. Meanwhile, in the provider feedback category, the highest scores were for questions Q9 (Did you think about how to tactfully provide feedback to your partner(s)?) and Q11 (Do you think that your criticism of the work was precise/specific enough to help your partner(s) to improve their learning?). Finally, in cognitive feedback, question Q13 (Do you think providing and receiving feedback is useful for improving peer learning?) received the highest scores. An ANOVA without replication applied to the individual questions within the three general categories (receiver, provider, and cognitive feedback) yielded F values above the critical value Fcr for each pre-service teacher’s perception in each general category, thus rejecting the null hypothesis and indicating that the means of the questions were significantly different (at the 99% level of significance). An ANOVA between the three general categories as a whole showed no significant differences, indicating that the pre-service teachers’ perceptions were only significant at the single-question scale.
Amongst the correlations, statistically significant values (even higher than 0.6) [52] were found between some questions. For instance, question Q1 (receiver feedback) was highly correlated with questions Q5 (receiver feedback) and Q7 (provider feedback); question Q3 with questions Q5 (receiver feedback) and Q14 (cognitive feedback); and question Q4 with Q5 (receiver feedback). In the provider feedback category, questions Q8 and Q11 were highly correlated with Q12 (provider feedback) and with Q14 (cognitive feedback). Finally, all three questions concerning cognitive feedback presented high correlations between them. In contrast, question Q9 (provider feedback) presented the lowest correlations with questions Q1, Q3, Q4, Q5, and Q6, all in the receiver feedback category. Likewise, question Q10 presented the lowest correlations with questions Q2 (receiver feedback) and Q8 and Q9 (provider feedback).
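The pairwise-correlation screening reported above can be reproduced in outline with pandas. This is an illustrative sketch only: the responses generated below are random stand-ins, not the study’s data, so the 0.6 threshold will typically select no pairs here:

```python
import numpy as np
import pandas as pd

# Simulated Likert responses: 214 respondents x 15 questions (a random
# stand-in for the real questionnaire data, which is not reproduced here).
rng = np.random.default_rng(0)
responses = pd.DataFrame(
    rng.integers(1, 6, size=(214, 15)),   # integers in 1..5
    columns=[f"Q{i}" for i in range(1, 16)],
)

# 15 x 15 Pearson correlation matrix between question scores.
corr = responses.corr()

# Question pairs whose correlation exceeds a threshold (0.6 in the paper).
threshold = 0.6
high_pairs = [
    (a, b, round(corr.loc[a, b], 2))
    for i, a in enumerate(corr.columns)
    for b in corr.columns[i + 1:]
    if corr.loc[a, b] > threshold
]
```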

4. Discussion

Supportive peer feedback is a learner-driven strategy that forms one of the bases of higher education. Here, learning was promoted through pre-service teacher interaction which, in turn, fostered their co-construction of knowledge and acquisition of initiative skills [2]. We found that by developing pre-service teachers’ peer feedback literacy, they were able to provide and receive multidimensional feedback. The overall objective was to foster their decision-making and initiative skills so they could then develop self-regulated learning and higher levels of cognitive growth. From their perceptions of supportive feedback, we can deduce an interplay between the supportive feedback processes, reflected in the internal correlations found between the categories. Supportive peer feedback encourages collective interaction, not only to develop collaborative activities but also the instructional/engaging tasks used for involving the pre-service teachers in higher-order processes [2,14,20,53]. Opportunities for dialog and for making evaluative judgments are examples of feedback literacy that can strengthen the social-relationship aspect of peer interaction and reduce the power differentials and negative emotional reactions that can arise in a more directed teacher–student feedback process [2,54].
PeerFQuest’s cognitive, provider, and receiver feedback categories presented significant correlations. Among the highest was that between the self-efficacy of pre-service teachers receiving feedback (Q5), i.e., the extent to which the peer instructional strategies and the selected learning tasks are cognitively challenging enough to improve receiver feedback [27], and the pre-service teachers’ peer feedback involvement (Q1), autonomy (Q3), and structure (Q4) categories. Indeed, the cumulative experience of multiple peer feedback occurrences over time may positively affect pre-service teachers’ attitudes, beliefs, and/or performance [17], and therefore favor positive perceptions of self-efficacy and, later, perceptions of improved learning. In fact, the pre-service teachers confirmed this by stating that receiving structured feedback from their peers was a way to improve their learning, their feelings of autonomy, and their sense of being involved. The mutual process of receiving and giving feedback was highly correlated (Q1 with Q7) for all the pre-service teachers, as they felt that by being providers or receivers they were highly involved in the feedback process. For instance, they felt the benefits of providing vs. receiving peer feedback were the same when it came to modifying and improving their writing assignments and performance in terms of content, structure, and style [17]. (That said, there are more gains in writing ability for feedback providers than for feedback receivers [54]). Our results show that the self-efficacy of pre-service teachers in providing feedback (Q12) was highly correlated with their feelings of gaining autonomy (Q8) and the positive feeling of being able to provide precise and specific feedback to peers (Q11) [52]. Student motivation for providing or receiving feedback (Q14) was highly correlated with the self-efficacy of providing feedback (Q12).
Finally, all three questions (Q13, Q14, and Q15) in the cognitive feedback category were closely interrelated and included pre-service teachers being motivated to advance their learning, and to consider feedback as a useful task and as a way to improve and strengthen their relationships with their peers.
Although the theory of self-determination postulates that a learning environment should support basic psychological needs for autonomy, competence, and involvement, our findings show that an optimal process of supportive peer feedback is greatly facilitated by the psychological needs for structure and involvement, which are closely correlated with self-efficacy. In other words, pre-service teachers felt the structured peer-to-peer feedback process was effective during the twofold dialog process, regardless of whether they were giving or receiving feedback [55]. These results align with those describing dialogic feedback processes being improved through a feedback triangle of cognitive dimension, interpersonal learning in feedback negotiation, and the sense of autonomy.
That is, the pre-service teachers perceived that their learning advanced through feedback [54]. Likewise, involvement, correlating with self-efficacy, promotes feelings of relatedness. This occurs because interpersonal emotional bonds with peers are encouraged through the twofold feedback process [53,54,55,56,57]. In other words, feedback providers and receivers may show understanding and/or empathy and be able to offer support (on an individual basis and/or in small or large groups) during the feedback process. The significant correlation of provider and receiver involvement and structure with self-efficacy means that both feedback providers and receivers have the individual capabilities required to bring about desired outcomes, especially concerning student engagement and learning. Although all the activities in the experiment were developed within the frame of cooperative learning, it would be interesting to determine whether there were differences between one technique and another in relation to the quantity and quality of the feedback received by the pre-service teachers. This could also identify which cooperative learning techniques allowed the students a more active role in their development as feedback providers or receivers. It would also be interesting to expand the study to identify which critical skills the pre-service teachers developed in relation to decision-making during the peer feedback process. It would also be worth studying whether the pre-service teachers would prefer, or feel more active in, a more structured feedback process using instruments that foster internal technical feedback, enabling them to then be more critical feedback providers.
The results of this study are consistent with the development of the analysis, although for future research it would be interesting to hold an initial seminar for the pre-service teachers to conceptualize what is understood by good feedback and the characteristics that frame it. Furthermore, scheduling group discussions at the start of the activities would be a good accompaniment for redirecting or improving those aspects that might be considered critical when developing supportive peer feedback. This focus group would be part of the training and shared assessment of the teaching and learning process and would encourage a process of reflection on the part of the student. Needless to say, the analysis of the relevant categories that emerged during the supportive peer feedback activities would be enriched if a larger number of pre-service teachers were included. All in all, this manuscript has shown that the architecture of supportive peer feedback, which is based on collaborative activities, relies on the intercorrelation between cognitive feedback and provider and receiver feedback. Furthermore, there is an intra-correlation between self-efficacy and receiver involvement, autonomy, and structure, as well as provider autonomy. These inter- and intra-correlations between the feedback dimensions deliver an important message for peer feedback: the activities and evaluation should include cognitive, metacognitive, and self-efficacy aspects [58].

5. Limitations of the Study

More research could incorporate longitudinal information on pre-service teachers’ efficacy at both receiving and providing feedback, or include cognitive verification of the activities that later support the feedback process. Likewise, further research could examine which types of activities used to foster initiative skills are the most adequate for both the receivers and providers involved in the feedback processes. The results are limited by the design of the activities themselves. It would be necessary to formulate optimum situations for peer feedback and, in particular, to instruct pre-service teachers in those activities that determine peer feedback effectiveness. On the other hand, while the quantitative analysis presented here is supported by the perceptions of the pre-service teachers, it does not allow us to determine which feedback elements would guarantee the relationships between the feedback dimensions. It would therefore be interesting to obtain information from a qualitative analysis, for example, by setting up a focus group to determine what would be required to improve the quality of peer feedback. This focus group would be part of the instructional and shared assessment of a ‘gold star’ or best-practice teaching and learning process that would favor the process of reflection.

Author Contributions

Conceptualization, D.C. and J.C.; methodology, D.C. and T.S.; validation, T.S. and J.C.; formal analysis, L.N., D.C., and T.S.; investigation, D.C. and J.C.; resources, D.C.; data curation, D.C. and L.N.; writing—original draft preparation, J.C.; writing—review and editing, J.C.; visualization, T.S.; supervision, J.C. and D.C.; funding acquisition, D.C.

Funding

This research was funded by the Josep Pallach Institute of Education Sciences (ICE), University of Girona (ICE-UdG).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Alsina, Á.; Ayllón, S.; Colomer, J.; Fernández-Peña, R.; Fullana, J.; Pallisera, M.; Pérez-Burriel, M.; Serra, L. Improving and evaluating reflective narratives: A rubric for higher education students. Teach. Teach. Educ. 2017, 63, 148–158. [Google Scholar] [CrossRef]
  2. Carless, D.; Boud, D. The development of student feedback literacy: Enabling uptake of feedback. Assess. Eval. High. Educ. 2018, 43, 1315–1325. [Google Scholar] [CrossRef]
  3. Johnson, D.W.; Johnson, R.T. An educational psychology success story: Social interdependence theory and cooperative learning. Educ. Res. 2009, 38, 365–379. [Google Scholar] [CrossRef]
  4. Cañabate, D.; Martínez, G.; Rodríguez, D.; Colomer, J. Analysing emotions and social skills in physical education. Sustainability 2018, 10, 1585. [Google Scholar] [CrossRef]
Table 1. The fifteen questions in the Peer Feedback Questionnaire (PeerFQuest) divided into three categories.
Peer Feedback Categories and PeerFQuest Questions
Receiver feedback:
Q1. Did you like receiving feedback from your partner(s)?
Q2. How tactfully did your partner(s) give you feedback?
Q3. Do you think the feedback helped you to improve your learning?
Q4. Was the feedback you received accurate and specific enough to improve your learning?
Q5. Was the feedback you received helpful/valuable for improving your learning?
Q6. Thanks to the feedback you received, were you able to process/modify your work so that you improved your learning?
Provider feedback:
Q7. Did you enjoy providing feedback to your partners?
Q8. Do you think the feedback that you have provided to your partner(s) was well-received?
Q9. Did you think about how to tactfully provide feedback to your partner(s)?
Q10. To what extent did you use your previous knowledge of that area to provide feedback?
Q11. Do you think that your criticism of the work was precise/specific enough to help your partner(s) to improve their learning?
Q12. Do you feel that the feedback you provided to your partner(s) was useful and improved their learning?
Cognitive feedback:
Q13. Do you think providing and receiving feedback is useful for improving peer learning?
Q14. Do you think providing and receiving feedback has improved your motivation for learning?
Q15. Do you think that the feedback provided and received has improved your relationships with your partner(s)?
Table 2. Mean (M), standard deviation (SD), and correlations between the 15 questions in PeerFQuest. Note: All correlations were significant at the 0.01 level, except those marked with an asterisk, which were significant at the 0.05 level. The correlations marked in bold were higher than 0.6.

     Q1    Q2    Q3    Q4    Q5    Q6    Q7    Q8    Q9    Q10   Q11   Q12   Q13   Q14   Q15
Q1   1     0.55  0.59  0.56  0.76  0.51  0.62  0.43  0.27  0.35  0.44  0.57  0.50  0.48  0.42
Q2         1     0.32  0.41  0.50  0.39  0.30  0.50  0.44  0.22  0.31  0.41  0.36  0.35  0.39
Q3               1     0.48  0.74  0.48  0.52  0.28  0.12* 0.42  0.34  0.50  0.41  0.62  0.46
Q4                     1     0.75  0.43  0.49  0.49  0.17  0.33  0.51  0.53  0.47  0.57  0.48
Q5                           1     0.54  0.56  0.33  0.20  0.43  0.46  0.59  0.48  0.59  0.51
Q6                                 1     0.49  0.37  0.24  0.36  0.47  0.58  0.48  0.56  0.39
Q7                                       1     0.38  0.14* 0.41  0.33  0.56  0.42  0.58  0.37
Q8                                             1     0.36  0.26  0.50  0.60  0.39  0.46  0.36
Q9                                                   1     0.25  0.47  0.36  0.39  0.57* 0.37
Q10                                                        1     0.32  0.31  0.33  0.50  0.60
Q11                                                              1     0.69  0.50  0.46  0.44
Q12                                                                    1     0.54  0.72  0.51
Q13                                                                          1     0.77  0.72
Q14                                                                                1     0.68
Q15                                                                                      1
M    4.21  4.31  3.76  3.53  3.95  3.88  3.97  3.96  4.49  3.78  4.04  3.88  4.51  3.94  3.84
SD   1.25  1.39  1.21  1.85  0.94  1.31  1.62  1.42  1.19  1.66  0.76  1.32  0.72  1.11  1.46
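For readers who wish to reproduce correlations like those in Table 2, each cell is a pairwise Pearson coefficient between two questionnaire items. The sketch below is illustrative only: the study's raw responses are not reproduced here, so the two Likert-scale (1–5) response vectors are invented for the example.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical 1-5 Likert responses from ten respondents to two items
q13 = [5, 4, 5, 3, 4, 5, 4, 5, 3, 5]
q14 = [4, 4, 5, 3, 3, 5, 4, 4, 3, 5]

print(round(pearson_r(q13, q14), 2))  # → 0.83
```

The significance levels reported in the table's note would then come from testing each coefficient against zero (e.g., via a t-test on r with n − 2 degrees of freedom), which dedicated statistics packages perform alongside the coefficient itself.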

Share and Cite

MDPI and ACS Style

Cañabate, D.; Nogué, L.; Serra, T.; Colomer, J. Supportive Peer Feedback in Tertiary Education: Analysis of Pre-Service Teachers’ Perceptions. Educ. Sci. 2019, 9, 280. https://doi.org/10.3390/educsci9040280
