Article

Automated Feedback Is Nice and Human Presence Makes It Better: Teachers’ Perceptions of Feedback by Means of an E-Portfolio Enhanced with Learning Analytics

1 Faculty of Social Sciences, Institute of Education, University of Tartu, 51005 Tartu, Estonia
2 Utrecht Center for Research and Development of Health Professions Education, University Medical Center Utrecht, 3584 CX Utrecht, The Netherlands
* Author to whom correspondence should be addressed.
Educ. Sci. 2021, 11(6), 278; https://doi.org/10.3390/educsci11060278
Submission received: 5 May 2021 / Revised: 27 May 2021 / Accepted: 31 May 2021 / Published: 4 June 2021
(This article belongs to the Special Issue Contemporary Teacher Education: A Global Perspective)

Abstract:
While there is now extensive research on feedback in the context of higher education, including pre-service teacher education, little has been reported on feedback exchanged between in-service teachers. Moreover, literature on the potential advantages that technology, for example electronic portfolios and learning analytics, may have for improving feedback in in-service workplace practices is also sparse. Therefore, the aim of this exploratory case study was to explore how in-service teachers perceived the peer feedback they received and provided through a web-based electronic portfolio during a professional development course carried out in their workplace. Questionnaire and interview data were collected from 38 teachers who received feedback through a learning analytics enhanced electronic portfolio and from 23 teachers who received feedback through the electronic portfolio alone. Additionally, one individual and four focus group interviews were conducted with 15 teachers who were the feedback providers. Several common topics were identified in the interviews with the feedback receivers and providers, involving the benefits and challenges of human interaction and the flexibility of the feedback process that the electronic portfolio offered. The results also revealed a better feedback experience within the group of teachers who received extra feedback by means of learning analytics. It is concluded that although an electronic portfolio provides a useful tool in terms of flexibility in the provision and receipt of feedback, the need for human interaction was acknowledged.

1. Introduction

Teachers’ professional development is a process that starts in pre-service education and continues throughout teachers’ professional lives, involving several supporting features [1]. Effective professional development provides teachers with time to think about and receive input on making changes to their professional activities through reflection and feedback [2]. The importance and impact of feedback have been established in research. In their seminal article on feedback, Hattie and Timperley [3] conceptualise feedback as “information provided by an agent (e.g., teacher, peer, book, parent, self, experience) regarding aspects of one’s performance or understanding” (p. 81). This definition, however, has been criticised for focusing only on the transmission of information and not including any expectation that the feedback receivers respond [4]. Current understandings of feedback in the domain of higher education include the ideas that feedback must have beneficial developmental effects for the feedback receivers [5] and improve their work and learning [6,7]. The same applies to the field of in-service teacher education, where feedback (e.g., received from peers, mentors, school leaders) should feed into the improvement of teachers’ professional development. So far, however, the literature on feedback concerning in-service teachers is rather sparse, focusing mainly on student learning. Consequently, more information is needed [8,9].
There is considerable debate about what makes feedback effective [10]. In their discussion of the conditions under which assessment supports learners’ learning, Gibbs and Simpson [11] have established four conditions that focus on the characteristics of the feedback: quantity, timing, quality, and the use of feedback. In order to be effective, feedback should be understandable, timely, sufficient in detail and quantity, and acted upon by the feedback receivers [11]. The immediacy of feedback and its connection to the feedback content as a characteristic of effective feedback is also discussed by Hattie and Timperley [3]. They argue that the optimal timing of feedback varies depending on the feedback content. Simple error correction may be most effective when delivered immediately, whereas delayed feedback on processes or complex tasks allows the feedback receivers time to carry out the task without interruptions. Scheeler and colleagues [8] also emphasize that immediate feedback can raise concerns about interrupting the flow of instruction. Therefore, feedback providers should investigate ways to provide immediate feedback in the least intrusive manner, but as close to the instructional event as possible [8].
Providing and receiving feedback is often complicated due to several aspects, such as the means through which it is delivered [8], the tasks, the content of the feedback, the context in which feedback is given, and the interplay between these aspects [12,13]. The feedback to support in-service teachers’ learning and development can be provided by various sources, one of which is another teacher. Peer feedback is considered to be successful because the power differentials are minimized [8]. The input for this feedback may come from peer observations. In peer observations, peers should act as “critical friends” and this relies heavily on trust [14]. A study carried out by Parr and Hawe [15] showed that teachers value the possibility to observe one another’s practice especially when these observations are carried out with a clear purpose and in a guided way. The teachers found the observations to be a ‘very useful professional learning activity’ (p. 724).
Technological developments offer new and different forms of professional development for teachers [16]. In order to benefit from these technological tools, it is vital that teachers know how to use them [17]. Therefore, the importance of teachers’ digital or ICT competence in teachers’ professional development is emphasized in several guidelines for teachers. The ICT Competency Framework for Teachers [18] states that ICT-competent teachers guide their students to develop their ICT competencies, as well as use ICT for their own professional improvement. A similar idea can be found in the European Framework for the Digital Competence of Educators: in addition to enhancing teaching, teachers’ digital competence entails fostering their own professional development by using digital technologies [19]. Although digital or ICT competence is considered an important factor in improving teachers’ professional development, many teachers lack qualifications and have insufficient training in ICT [20].
Many technological tools have been applied to support teachers’ professional development. For example, Aubusson and colleagues [21] showed that mobile technologies have the capacity to add new dimensions to teacher professional learning. Their study also supports the literature indicating that teachers are more eager to use technology for their students’ learning rather than for their own learning. In order to support teachers in reflecting on their practice based on everyday evidence and feedback, three design-based research iterations were carried out by Prieto and colleagues [22]. The suggestions that authors give based on their study include, among others, attention to ownership (teachers should be able to personalise the items/behaviours to observe and reflect upon) and design for overload (it should be considered that teachers may lack spare energy and attention during the school day).
Reflection is also at the centre of portfolios [23] and electronic portfolios (e-portfolios), which have been used among teachers to support their professional learning and the provision and receipt of feedback [24,25,26,27,28,29]. There are also studies suggesting that implementing e-portfolios takes too much time and effort [30,31], that the implementation needs a lot of support [24,32], and that it does not necessarily enhance the development of teaching competencies [33]. Therefore, new feedback models should be considered. Pardo [34] argues that new feedback models need to consider combining computer-based and human-produced feedback into a more precise description, involving elements that capture the interaction between learners, computers, instructors and resources at different design levels. One way to do this is through the implementation of learning analytics (LA), i.e., “the measurement, collection, analysis, and reporting of data about learners and their contexts, for the purposes of understanding and optimizing learning and the environments in which it occurs” [35]. LA is conceived as an effective and efficient way to provide immediate feedback [36] and therefore improve the quality of the learning processes [37]. In this way e-portfolios could possibly be used in a more tailored and timely manner [38]. However, Clow [39] emphasises that learning is improved by the LA system only when the feedback given reflects and rewards those aspects of learning that are valued by the learners. What is more, to date most studies have focused on the feedback receiver, especially in the context of peer feedback, and the possible learning benefits of providing feedback in online settings have not been extensively studied [40].
Research has shown that feedback processes, whether carried out in person or through some digital medium, can shape the learner’s behaviour, learning, and experience [41]. The feedback receiver and provider play the central role as the agents of the feedback process. Depending on the context, numerous agents could be involved in the feedback process [42], bringing their own subjective perceptions and experiences into the process [43,44]. The perceptions and experiences of the different agents in the feedback process have been widely researched in the context of higher education [41,45,46,47]. In pre-service education, there is also a large and growing number of published studies about the perceptions and experiences of feedback. For example, Ferguson [48] studied students’ perceptions of quality feedback in teacher education, Dowden and colleagues [46] investigated pre-service teachers’ perceptions of written feedback, and Buhagiar [49] examined mathematics student teachers’ views on tutor feedback during teaching practice. In in-service teacher education, a considerable amount of literature has been published on teachers’ perceptions of the feedback that they provide to their students, e.g., [12]. However, so far, research on feedback processes regarding in-service teachers is still sparse [8]. The same holds for research on in-service teachers’ perceptions of feedback in innovative digital learning environments.
The context in which the feedback is shared also plays an important part in the feedback process. The context of this study is a professional development course aimed at in-service teachers. Parsons and colleagues [16], who studied teachers’ interpretations about their online professional development experiences, found that the teachers indicated several factors that made the online professional development beneficial, especially the possibility to access and complete the course at any time and at their own pace. Powell and Bodur [50] examined teachers’ mixed perceptions of design and implementation features of a job-embedded online teacher professional development experience. On the one hand, the participants in their study saw reflection as a key element of online teacher professional development. On the other hand, the lack of social interaction and collaboration was seen as a weakness in the process.
In sum, although feedback plays an important part in the learning process, several factors make the feedback process challenging. The use of technology, more specifically e-portfolios and LA, may offer a number of potential advantages as a means of providing and receiving feedback in workplace practices. The perceptions of the agents involved in the feedback process shape the potential learning. Few studies have investigated the implementation of technology in in-service teachers’ professional learning. In this exploratory case study, we aim to present teachers’ perceptions of receiving and providing feedback by means of a web-based e-portfolio during a professional development course carried out in their workplace. We seek answers to the following research questions:
  • How do teachers as feedback receivers perceive the feedback by means of an e-portfolio?
  • Do the perceptions of feedback receivers who received extra feedback by means of learning analytics in the e-portfolio differ from those of receivers who did not?
  • How do teachers as feedback providers perceive giving feedback by means of an e-portfolio?

2. Materials and Methods

An exploratory case study methodology [51] was chosen to explore in-service teachers’ perceptions of feedback received and provided by means of an e-portfolio.

2.1. Research Context

The literature describes several types of e-portfolios, such as showcase, assessment, learning, and reflective portfolios [52]. A web-based e-portfolio, named Electronic Portfolio Assessment and Support System (EPASS), was used in this study. It provides users with tools for assessment, feedback and reflection. EPASS enables the collection of specific information about the performance and development of the users within a competency framework. This competency framework was developed in collaboration between Estonian and Dutch teacher educators and then adapted to the Estonian context. The framework was cast in the form of an assessment rubric, containing five professional roles, twelve professional activities and five performance levels for each (see further [53]).
The e-portfolio was enhanced with two LA applications. The Just-in-Time (hereafter JIT) feedback module provided users with automated feedback messages determined by the performance scores in the e-portfolio and defined on the basis of the rubric. Moreover, the users could also see written personalised feedback inserted by the feedback provider in the JIT feedback module. An example of the automated feedback and the written feedback is presented in Table 1.
The second LA application was the Visualisation (hereafter VIZ) module. This module gave the users a visual overview of their development in different ways based on the users’ wishes (e.g., in the form of a line graph, bar chart, spider diagram, table). The different features of the VIZ module are presented in Figure 1.
A four-week in-service teacher professional development course “Lesson observation and analysis” was carried out. The course consisted of two seminars, at the beginning and at the end of the course at the university, and a practical task in the workplace. The practical task involved the deployment of the e-portfolio as a means to help the in-service teachers to gain more effective feedback on their professional activities and therefore enhance their professional development.
Altogether, five groups of teachers were formed: three experimental and two control groups. In the first seminar, the groups were given an overview of the requirements for the course and of the e-portfolio; the teachers were provided with manuals and videos on how to use the e-portfolio as a means for providing and receiving feedback. The three experimental groups were given an overview of the LA applications, whereas the teachers in the control groups did not see the LA applications in the e-portfolio. Then, as a practical task, the teachers were asked to receive or provide feedback on three lessons at their workplace during a period of one month. Teachers who decided to receive feedback on their activities were given access to the e-portfolio and asked to find a colleague who would observe their activities in the lesson and provide feedback through the e-portfolio. Conversely, teachers who wanted to provide feedback were asked to find a colleague whom they would observe and to provide feedback through the e-portfolio. The feedback providers, however, did not have personal accounts in the e-portfolio; they accessed it as external users.
The feedback receivers were asked to fill in the feedback form in the e-portfolio with context information (e.g., the name of the school, subject, etc.) and send it to the feedback provider with the request to fill in the form. The feedback provider then observed the lessons of the feedback receiver, marked the performance levels and provided written feedback in the form in the e-portfolio. Based on the scores the feedback provider marked, the feedback receivers received information about their activities in the e-portfolio. The feedback receivers in the experimental groups also received automated feedback in the JIT feedback module and graphical representations of their professional activities’ scores in the VIZ module. Throughout the course, constant support was provided via e-mail. The course ended with another seminar at the university where the teachers could reflect on their experiences and data were collected from the participants.

2.2. Participants

The overall course enrolment was 135 teachers, with 56% volunteering to participate in the study. Data were collected from two samples. The first sample consisted of 61 in-service teachers who used the e-portfolio to receive feedback from their colleagues. The experimental group of feedback receivers (N = 38, distributed across three groups) used the e-portfolio with LA, and the control group (N = 23, distributed across two groups) used the e-portfolio without the LA. Fifty-nine of the participants were female and two were male. The teachers’ age varied from 25 years to 67 years and the mean age was 44.48, SD = 10.54 (experimental group: 25–67, M = 43.35, SD = 11.06, control group: 27–62, M = 46.30, SD = 9.60). Out of 61 in-service teachers, 33 had previous experiences with e-learning environments (22 in experimental and 11 in the control group). Only 13 teachers had previous experiences with e-portfolios (8 in experimental and 5 in the control group). The second sample consisted of 15 in-service teachers who gave feedback to their colleagues via the e-portfolio.

2.3. Data Collection

To understand whether there was a difference in feedback perception between the experimental and control group of feedback receivers, the adapted Assessment Experience Questionnaire (AEQ) [54] was administered. The AEQ examines the extent to which learners experience various conditions of learning. Three scales in the questionnaire that relate to feedback were used: quantity and timing of feedback (5 items, α = 0.54; e.g., I get plenty of feedback on my professional activities from my colleague via EPASS), quality of feedback (6 items, α = 0.81; e.g., The feedback my colleague provides me, via EPASS, shows me how to do better next time) and how the feedback is used (4 items, α = 0.81; e.g., I use the feedback my colleague provides me, via EPASS, to adapt my behaviour and activities). In total, the questionnaire consisted of 15 items, α = 0.85. For all parts of the questionnaire, the responses were given on a 5-point Likert-type scale (1—fully disagree, 5—fully agree).
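For readers unfamiliar with the reliability coefficients reported above, Cronbach’s alpha can be computed directly from an item-score matrix. The following is a minimal sketch in Python; the Likert responses are made up for illustration and are not the study’s data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses: 6 respondents x 4 items
scores = np.array([
    [4, 4, 5, 4],
    [2, 3, 2, 2],
    [5, 5, 4, 5],
    [3, 3, 3, 4],
    [1, 2, 1, 2],
    [4, 3, 4, 4],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```

Alpha rises when the items covary strongly relative to their individual variances; a subscale value such as 0.54 signals that the items share relatively little common variance.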
Qualitative data from the feedback receivers were gathered with open-ended questions in the questionnaire and focus group interviews. Participants were asked in the interviews what they thought about the feedback they received and how they used the received feedback. The teachers in the experimental group were also asked to comment on the different LA applications—how useful they found the visualised, automated and written feedback, how they understood the feedback they received via the LA applications, and how they used the received feedback.
Data from the feedback providers were collected during the last seminar of the professional development course in the university with interviews, involving one individual and four focus group interviews (number of participants ranging from two to five). The feedback providers were asked about their perceptions about the feedback provision process, how the use of the system affected the feedback they gave, what impact the feedback had on the receiver and how the feedback was used by the receiver in their perception.

2.4. Data Analysis

The Shapiro–Wilk test was used to check the normality of the AEQ scores on the whole scale and the three subscales of the questionnaire. For comparisons of the experimental and control group, the independent-samples t-test was used for normally distributed data, and the Mann–Whitney U-test for non-parametric data. However, the results for one subscale (quantity and timing of feedback) should be interpreted with caution, since the Cronbach’s alpha of this subscale was rather low.
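The test-selection procedure described above can be sketched in a few lines of Python with SciPy. The group scores below are simulated stand-ins (the group sizes mirror those in Section 2.2, but the values are hypothetical, not the study’s data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Simulated whole-scale AEQ means per teacher (hypothetical data)
experimental = rng.normal(loc=3.78, scale=0.45, size=38)
control = rng.normal(loc=3.50, scale=0.60, size=23)

# Step 1: Shapiro-Wilk normality check for each group
normal = all(stats.shapiro(group).pvalue > 0.05
             for group in (experimental, control))

# Step 2: choose the comparison accordingly
if normal:
    stat, p = stats.ttest_ind(experimental, control)   # independent-samples t-test
    test_name = "t-test"
else:
    stat, p = stats.mannwhitneyu(experimental, control)  # Mann-Whitney U-test
    test_name = "Mann-Whitney U"

print(f"{test_name}: statistic = {stat:.2f}, p = {p:.3f}")
```

Both SciPy tests default to a two-sided alternative, matching the non-directional comparisons reported in Section 3.1.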
Open-ended responses in the questionnaire and the focus group interviews were analysed following the inductive thematic analysis procedure [55] in order to find common themes in participants’ responses. This means that data were explored without any predetermined framework and themes were inductively drawn from the data. Two researchers read the open-ended responses, interview notes, and reports several times to acquaint themselves with the data. As the next step, initial codes were generated and compared between two researchers. Based on the similarities, the codes were then grouped into themes. As the final step, a detailed description of the results was written and illustrated with quotations.

3. Results

The results are grouped into two parts. Results about the perceptions of the feedback receivers are based on questionnaire and interview data, whilst the second part, perceptions of the feedback providers, draws only on the qualitative data.

3.1. Perceptions of the Feedback Receivers

The first question in this study sought to explore how the teachers as feedback receivers perceived the feedback by means of an e-portfolio. Based on the qualitative data, the feedback receivers pointed out the ways in which they organised the timing of the feedback process. One possibility for structuring the process was that the feedback provider and the feedback receiver sat together after the observation, had a discussion, and filled in the e-portfolio jointly. One participant noted: “It was best to fill in the e-portfolio together after the lesson—you can discuss with your colleague and insert feedback at the same time”. In the second possibility, the feedback provider and the feedback receiver discussed after the lesson, but the feedback provider filled in the e-portfolio independently after the discussion. This way, the feedback provider could choose the time at which to insert the feedback in the e-portfolio. This structure was criticised by a feedback receiver indicating that “when my colleague gave me oral feedback right after the lesson, it was frustrating for her to fill in the same comments in the e-portfolio afterwards”. Therefore, this option was seen as double the work for feedback providers.
There was also an option to have no discussion at all between the feedback provider and the feedback receiver; instead, the feedback provider inserted the feedback in the e-portfolio independently, either right after the observation or with a delay. This possibility was met with mixed feelings. On the one hand, it gave the participants more flexibility in the feedback process. As one feedback receiver said: “The e-portfolio made the feedback process more compact. When it is really busy at school, we do not have the time to sit and talk. My colleague could fill in the feedback when she had time and I could read it afterwards when I have time”. On the other hand, this way the timing of the feedback depended on both parties. The teachers indicated that although the e-portfolio itself delivered feedback instantly, if the feedback provider did not insert the feedback data in the e-portfolio, the feedback was delayed. The same was true for feedback receivers, as one teacher noted: “It was difficult to understand the feedback. Maybe I read it too long after the lesson”. Thus, it was more beneficial for the feedback receivers to receive and respond to the feedback immediately after the task.
The teachers pointed out that they valued the possibility to choose the activities they wanted to receive feedback on. This made the process more personalised for them. Even though the participants indicated that they valued the comments from their colleagues rather than the scores in the feedback form, the challenges of the written feedback were also addressed. One participant emphasised that because written feedback can be misunderstood, there should always be a face-to-face meeting. As one participant put it: “I am afraid that my colleague was not honest with me”. Therefore, the face-to-face meeting gave the participants the possibility to discuss the feedback further.
The second research question aimed to understand whether there was a difference in perceptions between the experimental group and the control group of teachers, that is, between the two groups of feedback receivers: the experimental group received feedback through the e-portfolio enhanced with LA, while the control group did not see the LA. With regard to the quantity and timing of the feedback, no statistically significant difference in scores was identified between the experimental (M = 3.44, SD = 0.56) and the control group (M = 3.26, SD = 0.55). Although the scores on the quality of feedback scale were higher in the experimental group (Mdn = 3.85) than in the control group (Mdn = 3.67), the difference was not statistically significant. The same held for what the teachers did with the received feedback, where the Mann–Whitney U test showed no statistically significant difference between the experimental (Mdn = 4.10) and the control group (Mdn = 3.88). However, the whole feedback experience was rated significantly higher in the experimental group with LA (M = 3.78, SD = 0.45) than in the control group without LA (M = 3.50, SD = 0.60), t(59) = −2.1, p < 0.05. Following this, the results from the qualitative data are reported in order to understand the feedback receivers’ perceptions in more depth.
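As a quick sanity check, the two-tailed p-value implied by the reported t(59) = −2.1 can be recovered from the t-distribution; this is a sketch of the arithmetic, not the authors’ own computation:

```python
from scipy import stats

t_stat, df = -2.1, 59  # values reported for the whole feedback experience scale
# Two-tailed p-value: twice the upper-tail survival probability
p_two_tailed = 2 * stats.t.sf(abs(t_stat), df)
print(f"p = {p_two_tailed:.3f}")
```

The resulting value falls below the 0.05 threshold, consistent with the significance reported for the whole-scale comparison.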
The teachers who could see the LA applications had mixed feelings about how understandable the feedback in the LA modules was. On the one hand, the participants noted that the JIT feedback module was an “eye-catcher”, but on the other, the feedback in the VIZ module was more useful. As one participant noted: “I did not understand the JIT module. Since my colleague did not write anything in the comments’ section, I saw no value in it. However, I liked the VIZ module, it gave me information and an overview of my activities”. In terms of the quantity of the received feedback, some of the participants in the experimental group indicated that there was too much detailed information in the LA modules. As one participant noted: “The picture was too colourful, there were too many things”. Several teachers who could see the LA would have liked more aggregated feedback and fewer different choices.
Overall, the feedback received through the e-portfolio was perceived as useful for further discussion and reflection. The participants indicated that the e-portfolio enabled them to revise the feedback whenever they wanted as the feedback was together in one system. In order to gain more from the feedback, the participants suggested that more training on how to understand and benefit from the feedback in the LA applications should be provided.

3.2. Perceptions of the Feedback Providers

The third research question set out to explore how the teachers as feedback providers perceived the feedback. Data from the feedback providers were collected with one individual and four focus group interviews.
The feedback providers were asked how the use of the system affected the way they gave the feedback. Similar to the feedback receivers, the answers to these questions fell into two groups. Some feedback providers preferred to discuss their observations right after the lessons and fill in the feedback form together with the colleague they had been observing. By contrast, other teachers took some time after the lesson observation, gathered their thoughts and filled in the feedback in the e-portfolio alone after some time had passed (e.g., in the evening at home). Although the e-portfolio gave the feedback providers enough flexibility over when to fill in the feedback, they all agreed that the feedback entered in the feedback form should always be accompanied by oral feedback because “you cannot write the feedback that you have not said before”.
Almost all feedback providers agreed that using the e-portfolio made their feedback more specific and structured. According to most of the feedback providers, the structure lay in the feedback form that provided specific criteria for the assessment procedure. As one feedback provider put it: “I could see that the professional activities with their level descriptions in the feedback form were based on the teachers’ standard, but the feedback form provided even more precise activities”. However, there were feedback providers who were critical about this rubric, indicating that “it was too general to provide specific feedback or the differences between the levels were too small making it difficult to choose the right one”.
There was no consensus in the perceptions of the written feedback among the participants. Some of the feedback providers appreciated the possibility to write the feedback in the comments section. For example, one interviewee said: “I like that I could add my own feedback after marking the score. I felt I could add something extra to the existing criterion. The existing criterion directed me to think and pay attention to aspects that otherwise may have been unnoticed”. Nevertheless, some feedback providers did not like that writing the comments was obligatory. They felt that some things were already said in the performance level descriptions and therefore they had to use copy and paste or they wrote only a dash in the comments section. Moreover, there were feedback providers who said that they had given all their comments in the discussion and therefore filling in the comments section in the e-portfolio was demotivating.
Several feedback providers stated that in their opinion, the e-portfolio was useful for their colleagues as they benefited a lot from the feedback they received because the e-portfolio was seen as a good tool for reflection. However, there was one feedback provider who found the e-portfolio to be “a distant tool that does not foster discussion or reflection”.
Another interesting topic that emerged was that the feedback providers regarded giving feedback to a colleague as a unique situation in which they felt expected to be friendly. As a result, they avoided overly critical feedback and rated their colleagues with high scores. Overall, the feedback providers were rather positive about their experience of giving feedback via the e-portfolio. The consensus was that using the e-portfolio made giving feedback easier and developed their feedback-providing skills.

4. Discussion and Conclusions

This exploratory case study examined how in-service teachers who received and provided feedback through a web-based e-portfolio during a professional development course in their workplace perceived the feedback process. Several common topics were identified across the two groups.
The importance of oral feedback and discussion in the feedback process was emphasised in almost all interviews by both groups, the feedback receivers and the providers. Although the e-portfolio was seen as a good tool to gather, record and store the feedback, the participants indicated that feedback is created in interaction and that face-to-face discussions should therefore also be part of the process. This topic was also emphasised by Powell and Bodur [50], who showed in their study that the lack of social interaction and collaboration was seen as a weakness of an online teacher professional development experience. This also means that a better balance between computer-based and human-produced feedback should be found [34].
The receipt and provision of feedback in this study took place between peers. Even though the importance of social interaction was emphasised, this peer relationship was noted as a challenge by the participants. The feedback providers reported feeling unable to be critical of their colleagues, and the feedback receivers felt that the providers had not been honest with them. Teachers could themselves choose the peers they wanted to involve in the feedback process, and it therefore seems that they chose the peers they felt most comfortable with. Shortland [14] emphasises the importance of selecting peer observation partners carefully and warns that feedback can be dangerous to relationships: if perceived as critical it may damage the relationship, yet little is gained if the problem is avoided. The peer should therefore act neither merely as a “friend” nor solely as a “critic”, but rather as a “critical friend”. Additionally, taking time to review another person’s work can encourage teachers to examine their own activities and viewpoints, so providing peer feedback has potential learning benefits for the provider as well [40]. Overall, building this sort of relationship takes time and trust [14].
One common topic was the flexibility of the feedback process that the e-portfolio offered, which was seen as its most beneficial aspect. Both groups described the ways in which they had structured the feedback process, and in broad terms three patterns emerged. Although this flexibility is important, each pattern involved both benefits and challenges according to the participants. In the first pattern, the feedback provider and receiver sat together after the observation, had a discussion, and filled in the e-portfolio jointly; this was seen as beneficial but time-consuming. In the second pattern, the provider and receiver sat together after the observation and had a discussion, but the provider filled in the e-portfolio independently afterwards. This allowed the provider time to think through what to write, but the providers often saw it as double work. In the third pattern, the provider and receiver did not have a discussion after the observation, and the provider entered the feedback in the e-portfolio independently, either right after the observation or with a delay. This was the most time-saving pattern, but the lack of oral feedback and discussion was mentioned as its weakness. One criterion for the effective provision of feedback is that the timing of the provision needs to be agreed upon and needs to reflect the balance between the requirements for immediacy and considered reflection [14]. This also supports the idea of Prieto and colleagues [22] that technological tools for teachers’ professional development should be “designed for overload”: teachers may lack spare energy and attention during the school day and can therefore choose the pattern most suitable for them.
The feedback framework that the e-portfolio offered was also mentioned by both groups. The feedback providers valued the structured criteria underlying the e-portfolio and the LA applications as a basis for their observations. Although there were some critical comments about the framework, suggesting that the criteria were too general or the performance levels too similar, having a clear framework as guidance in the feedback process has also been emphasised in the literature [15]. Gašević, Dawson and Siemens [56] have likewise stressed that the LA approach only holds promise to improve learning when it is developed from theoretically established instructional strategies. The feedback receivers saw the main benefit in the personalisation the framework offered: they could choose for themselves the activities they wanted to receive feedback on. This result is consistent with the idea of Clow [39], who pointed out that an LA system only improves learning when the feedback it gives reflects and rewards those aspects of learning that are valued by the learner. If teachers cannot choose the activities themselves, the feedback that the LA system provides is not meaningful to them and does not enhance their learning. This supports the suggestion by Prieto and colleagues [22] that, in designing tools for teacher reflection, special attention should be given to allowing teachers to take ownership of their own learning.
Another interesting result was that although the overall feedback experience was rated significantly higher by the group of teachers who saw the LA, there were no significant differences between the two groups in their perceptions of the quantity, quality and use of feedback. Although Pardo [34] notes that LA could improve learners’ overall experience, the teachers in this study had mixed perceptions of the LA applications, indicating that understanding the information in them took time and that their added value was limited. Although the teachers received training on how to use the system, they reported needing even more support. This may reflect gaps in teachers’ digital skills. In addition, more attention should have been given to supporting their uptake of feedback and their feedback literacy [6,7,57].
When interpreting the results, it must be noted that the teachers in the study volunteered to participate, a characteristic that may have shaped their perceptions. Furthermore, the e-portfolio was tested over a period of only one month, during a professional development course. Implementing new technology takes considerable time and effort before its benefits become apparent, so a longer implementation period should be allocated. A third limitation is that the study examined a particular e-portfolio and particular LA applications, so the findings are not generalisable to other types of e-portfolios and LA applications. A different e-portfolio system and course design might have revealed features that could further contribute to feedback research.
The current study focused on teachers’ perceptions of feedback. Although perceptions of feedback have implications for its acceptance and use [58], we still do not know how the teachers actually used the feedback they received through the e-portfolio or whether their competencies improved. A further study could investigate the actual uptake and use of the feedback, examining how teachers apply it to improve their subsequent activities. Given the diverse sample of in-service teachers, research could also explore whether teachers’ digital or ICT competence affects their perceptions of the feedback received or provided via the e-portfolio. Moreover, feedback on teachers’ activities was not collected from students, the stakeholders whom teachers’ learning should most affect. Students should not be a feature in the background, but rather have an active voice in teachers’ professional development process [22].
To conclude, despite the context-specific nature of our research process, teachers’ perspectives on the feedback they received and provided through the e-portfolio have implications for feedback research among in-service teachers and for the broader field of teachers’ professional development. Although the e-portfolio with learning analytics offered different possibilities (e.g., automated and visualised feedback) to enhance teachers’ learning and development, human presence and interaction were still highly valued by teachers, despite the time and effort they required. To benefit from the feedback, extra attention should be given to training and support in using the system, with a special focus on how to understand the feedback in the LA applications and how to implement it in subsequent activities.

Author Contributions

Conceptualisation, M.v.d.S., P.H. and Ä.L.; methodology, M.v.d.S., P.H. and Ä.L.; formal analysis, P.H.; investigation, P.H.; resources, M.v.d.S. and Ä.L.; data curation, P.H.; writing—original draft preparation, P.H.; writing—review and editing, P.H., Ä.L. and M.v.d.S.; supervision, Ä.L. and M.v.d.S.; project administration, M.v.d.S.; funding acquisition, M.v.d.S. All authors have read and agreed to the published version of the manuscript.

Funding

This study was conducted as part of the “Workplace-Based e-Assessment Technology for competency-Based Higher Multi-Professional Education” (WATCHME) project supported by the European Commission 7th Framework Programme (grant agreement No. 619349).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data are not publicly available due to privacy or ethical restrictions.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Sancar, R.; Atal, D.; Deryakulu, D. A New Framework for Teachers’ Professional Development. Teach. Teach. Educ. 2021, 101. [Google Scholar] [CrossRef]
  2. Darling-Hammond, L.; Hyler, M.E.; Gardner, M. Effective Teacher Professional Development; Learning Policy Institute: Palo Alto, CA, USA, 2017; pp. 14–15. [Google Scholar]
  3. Hattie, J.; Timperley, H. The Power of Feedback. Rev. Educ. Res. 2007, 77, 81–112. [Google Scholar] [CrossRef]
  4. Boud, D.; Molloy, E. Rethinking Models of Feedback for Learning: The Challenge of Design. Assess. Eval. High. Educ. 2013, 38, 698–712. [Google Scholar] [CrossRef]
  5. Winstone, N.E.; Nash, R.A.; Parker, M.; Rowntree, J. Supporting Learners’ Agentic Engagement With Feedback: A Systematic Review and a Taxonomy of Recipience Processes. Educ. Psychol. 2017, 52, 17–37. [Google Scholar] [CrossRef] [Green Version]
  6. Carless, D.; Boud, D. The Development of Student Feedback Literacy: Enabling Uptake of Feedback. Assess. Eval. High. Educ. 2018, 43, 1315–1325. [Google Scholar] [CrossRef] [Green Version]
  7. Carless, D.; Winstone, N. Teacher Feedback Literacy and Its Interplay with Student Feedback Literacy. Teach. High. Educ. 2020, 1–14. [Google Scholar] [CrossRef]
  8. Scheeler, M.C.; Ruhl, K.L.; McAfee, J.K. Providing Performance Feedback to Teachers: A Review. Teach. Educ. Spec. Educ. J. Teach. Educ. Div. Counc. Except. Child. 2004, 27, 396–407. [Google Scholar] [CrossRef]
  9. Thurlings, M.; Vermeulen, M.; Kreijns, K.; Bastiaens, T.; Stijnen, S. Development of the Teacher Feedback Observation Scheme: Evaluating the Quality of Feedback in Peer Groups. J. Educ. Teach. 2012, 38, 193–208. [Google Scholar] [CrossRef] [Green Version]
  10. Shute, V.J. Focus on Formative Feedback. Rev. Educ. Res. 2008, 78, 153–189. [Google Scholar] [CrossRef]
  11. Gibbs, G.; Simpson, C. Conditions Under Which Assessment Supports Students’ Learning. Learn. Teach. High. Educ. 2004, 3–33. [Google Scholar] [CrossRef]
  12. Brown, G.T.L.; Harris, L.R.; Harnett, J. Teacher Beliefs about Feedback within an Assessment for Learning Environment: Endorsement of Improved Learning over Student Well-Being. Teach. Teach. Educ. 2012, 28, 968–978. [Google Scholar] [CrossRef]
  13. Thurlings, M.; Vermeulen, M.; Bastiaens, T.; Stijnen, S. Understanding Feedback: A Learning Theory Perspective. Educ. Res. Rev. 2013, 9, 1–15. [Google Scholar] [CrossRef]
  14. Shortland, S. Feedback within Peer Observation: Continuing Professional Development and Unexpected Consequences. Innov. Educ. Teach. Int. 2010, 47, 295–304. [Google Scholar] [CrossRef]
  15. Parr, J.M.; Hawe, E. Facilitating Real-Time Observation of, and Peer Discussion and Feedback about, Practice in Writing Classrooms. Prof. Dev. Educ. 2017, 43, 709–728. [Google Scholar] [CrossRef]
  16. Parsons, S.A.; Hutchison, A.C.; Hall, L.A.; Parsons, A.W.; Ives, S.T.; Leggett, A.B. U.S. Teachers’ Perceptions of Online Professional Development. Teach. Teach. Educ. 2019, 82, 33–42. [Google Scholar] [CrossRef]
  17. Hennessy, S.; Dragovic, T.; Warwick, P. A Research-Informed, School-Based Professional Development Workshop Programme to Promote Dialogic Teaching with Interactive Technologies. Prof. Dev. Educ. 2018, 44, 145–168. [Google Scholar] [CrossRef]
  18. UNESCO. UNESCO ICT Competency Framework for Teachers; United Nations Educational, Scientific and Cultural Organization: Paris, France, 2018; pp. 1–66. [Google Scholar]
  19. Redecker, C. European Framework for the Digital Competence of Educators: DigCompEdu; Punie, Y., Ed.; Publications Office of the European Union: Luxembourg, 2017; pp. 38–40. [Google Scholar] [CrossRef]
  20. Fernández-Batanero, J.M.; Montenegro-Rueda, M.; Fernández-Cerero, J.; García-Martínez, I. Digital Competences for Teacher Professional Development. Systematic Review. Eur. J. Teach. Educ. 2020, 1–19. [Google Scholar] [CrossRef]
  21. Aubusson, P.; Schuck, S.; Burden, K. Mobile Learning for Teacher Professional Learning: Benefits, Obstacles and Issues. ALT J. 2009, 17, 233–247. [Google Scholar] [CrossRef]
  22. Prieto, L.P.; Magnuson, P.; Dillenbourg, P.; Saar, M. Reflection for Action: Designing Tools to Support Teacher Reflection on Everyday Evidence. Technol. Pedagog. Educ. 2020, 29, 279–295. [Google Scholar] [CrossRef]
  23. van der Schaaf, M.F.; Stokking, K.M. Developing and Validating a Design for Teacher Portfolio Assessment. Assess. Eval. High. Educ. 2008, 33, 245–262. [Google Scholar] [CrossRef]
  24. Granberg, C. E-Portfolios in Teacher Education 2002-2009: The Social Construction of Discourse, Design and Dissemination. Eur. J. Teach. Educ. 2010, 33, 309–322. [Google Scholar] [CrossRef]
  25. Hamilton, M. Evidence-Based Portfolios: A Cross-Sectoral Approach to Professional Development among Teachers. Prof. Dev. Educ. 2018, 1–15. [Google Scholar] [CrossRef]
  26. Kori, K.; Pedaste, M.; Leijen, Ä.; Mäeots, M. Supporting Reflection in Technology-Enhanced Learning. Educ. Res. Rev. 2014, 11, 45–55. [Google Scholar] [CrossRef]
  27. Slepcevic-Zach, P.; Stock, M. ePortfolio as a Tool for Reflection and Self-Reflection. Reflective Pract. 2018, 19, 291–307. [Google Scholar] [CrossRef]
  28. Strudler, N.; Wetzel, K. The Diffusion of Electronic Portfolios in Teacher Education: Issues Of Initiation and Implementation. J. Res. Technol. Educ. 2014, 37, 411–433. [Google Scholar] [CrossRef]
  29. Wray, S. Electronic Portfolios in a Teacher Education Program. E Learn. Digit. Media. 2007, 4, 40–51. [Google Scholar] [CrossRef] [Green Version]
  30. Evans, M.A.; Powell, A. Conceptual and Practical Issues Related to the Design for and Sustainability of Communities of Practice: The Case of E-portfolio Use in Preservice Teacher Training. Technol. Pedagog. Educ. 2007, 16, 199–214. [Google Scholar] [CrossRef]
  31. Kabilan, M.K.; Khan, M.A. Assessing Pre-Service English Language Teachers’ Learning Using e-Portfolios: Benefits, Challenges and Competencies Gained. Comput. Educ. 2012, 58, 1007–1020. [Google Scholar] [CrossRef]
  32. Kankaanranta, M. Constructing Digital Portfolios: Teachers Evolving Capabilities in the Use of Information and Communications Technology. Teach. Dev. 2001, 5, 259–275. [Google Scholar] [CrossRef]
  33. Struyven, K.; Blieck, Y.; De Roeck, V. The Electronic Portfolio as a Tool to Develop and Assess Pre-Service Student Teaching Competences: Challenges for Quality. Stud. Educ. Eval. 2014, 43, 40–54. [Google Scholar] [CrossRef]
  34. Pardo, A. A Feedback Model for Data-Rich Learning Experiences. Assess. Eval. High. Educ. 2018, 43, 428–438. [Google Scholar] [CrossRef]
  35. Siemens, G. Learning Analytics: The Emergence of a Discipline. Am. Behav. Sci. 2013, 57, 1380–1400. [Google Scholar] [CrossRef] [Green Version]
  36. Johnson, L.; Adams Becker, S.; Cummins, M.; Freeman, A.; Ifenthaler, D.; Vardaxis, N. Technology Outlook for Australian Tertiary Education 2013–2018: An NMC Horizon Project Regional Analysis; New Media Consortium: Austin, TX, USA, 2013; pp. 1–28. [Google Scholar]
  37. Greller, W.; Drachsler, H. Translating Learning into Numbers: A Generic Framework for Learning Analytics. Educ. Technol. Soc. 2012, 15, 42–57. [Google Scholar]
  38. Aguiar, E.; Ambrose, G.A.A.; Chawla, N.V.; Goodrich, V.; Brockman, J. Engagement vs. Performance: Using Electronic Portfolios to Predict First Semester Engineering Student Persistence. J. Learn. Anal. 2014, 1, 7–33. [Google Scholar] [CrossRef] [Green Version]
  39. Clow, D. An Overview of Learning Analytics. Teach. High. Educ. 2013, 18, 683–695. [Google Scholar] [CrossRef] [Green Version]
  40. van Popta, E.; Kral, M.; Camp, G.; Martens, R.L.; Simons, P.R.-J. Exploring the Value of Peer Feedback in Online Learning for the Provider. Educ. Res. Rev. 2017, 20, 24–34. [Google Scholar] [CrossRef]
  41. Adams, J.; McNab, N. Understanding Arts and Humanities Students’ Experiences of Assessment and Feedback. Arts Humanit. High. Educ. 2013, 12, 36–52. [Google Scholar] [CrossRef]
  42. Evans, C. Making Sense of Assessment Feedback in Higher Education. Rev. Educ. Res. 2013, 83, 70–120. [Google Scholar] [CrossRef] [Green Version]
  43. Beaumont, C.; O’Doherty, M.; Shannon, L. Reconceptualising Assessment Feedback: A Key to Improving Student Learning? Stud. High. Educ. 2011, 36, 671–687. [Google Scholar] [CrossRef]
  44. Carless, D. Differing Perceptions in the Feedback Process. Stud. High. Educ. 2006, 31, 219–233. [Google Scholar] [CrossRef] [Green Version]
  45. Dawson, P.; Henderson, M.; Mahoney, P.; Phillips, M.; Ryan, T.; Boud, D.; Molloy, E. What Makes for Effective Feedback: Staff and Student Perspectives. Assess. Eval. High. Educ. 2018, 1–12. [Google Scholar] [CrossRef]
  46. Dowden, T.; Pittaway, S.; Yost, H.; McCarthy, R. Students’ Perceptions of Written Feedback in Teacher Education: Ideally Feedback Is a Continuing Two-Way Communication That Encourages Progress. Assess. Eval. High. Educ. 2013, 38, 349–362. [Google Scholar] [CrossRef]
  47. Mulliner, E.; Tucker, M. Feedback on Feedback Practice: Perceptions of Students and Academics. Assess. Eval. High. Educ. 2017, 42, 266–288. [Google Scholar] [CrossRef]
  48. Ferguson, P. Student Perceptions of Quality Feedback in Teacher Education. Assess. Eval. High. Educ. 2011, 36, 51–62. [Google Scholar] [CrossRef]
  49. Buhagiar, M.A. Mathematics Student Teachers’ Views on Tutor Feedback during Teaching Practice. Eur. J. Teach. Educ. 2013, 36, 55–67. [Google Scholar] [CrossRef]
  50. Powell, C.G.; Bodur, Y. Teachers’ Perceptions of an Online Professional Development Experience: Implications for a Design and Implementation Framework. Teach. Teach. Educ. 2019, 77, 19–30. [Google Scholar] [CrossRef]
  51. Yin, R.K. Applications of Case Study Research; Sage Publications, Inc.: Thousand Oaks, CA, USA, 2003. [Google Scholar]
  52. Barrett, H.C.; Garrett, N. Online Personal Learning Environments: Structuring Electronic Portfolios for Lifelong and Life-wide Learning. Horizon 2009, 17, 142–152. [Google Scholar] [CrossRef]
  53. Leijen, Ä.; Slof, B.; Malva, L.; Hunt, P.; van Tartwijk, J.; van der Schaaf, M. Performance-Based Competency Requirements for Student Teachers and How to Assess Them. Int. J. Inf. Educ. Technol. 2017, 7, 190–194. [Google Scholar] [CrossRef] [Green Version]
  54. Gibbs, G.; Simpson, C. Measuring the Response of Students to Assessment: The Assessment Experience Questionnaire. In Proceedings of the 11th Improving Student Learning Symposium, Hinckley, UK, 1–3 September 2003. [Google Scholar]
  55. Braun, V.; Clarke, V. Using Thematic Analysis in Psychology. Qual. Res. Psychol. 2006, 3, 77–101. [Google Scholar] [CrossRef] [Green Version]
  56. Gašević, D.; Dawson, S.; Siemens, G. Let’s Not Forget: Learning Analytics Are about Learning. TechTrends 2015, 59, 64–71. [Google Scholar] [CrossRef]
  57. Molloy, E.; Boud, D.; Henderson, M. Developing a Learning-Centred Framework for Feedback Literacy. Assess. Eval. High. Educ. 2020, 45, 527–540. [Google Scholar] [CrossRef] [Green Version]
  58. Altahawi, F.; Sisk, B.; Poloskey, S.; Hicks, C.; Dannefer, E.F. Student Perspectives on Assessment: Experience in a Competency-Based Portfolio System. Med. Teach. 2012, 34, 221–225. [Google Scholar] [CrossRef] [PubMed]
Figure 1. VIZ Module in the E-portfolio Dashboard (A), Example of the Timeline Graph (B), Example of the Spider Diagram (C), Example of the Bar Chart (D), and Example of the Table (E).
Table 1. Examples of the Feedback in the JIT Feedback Module.
Automated feedback: “You are at level 2 (sufficient) on ‘Plans the execution of learning activities’. In order to achieve the next level, you should: ‘Plan a lesson that is clearly structured: introduction, core and closing. Have an alternative plan for different (to be expected) situations’.”
Written feedback: “You are good at anticipating different and often unexpected situations. You can find solutions really quickly and continue with the lesson at a calm pace. Your lesson is timely planned, but you still have left spare time for unexpected situations.”
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Hunt, P.; Leijen, Ä.; van der Schaaf, M. Automated Feedback Is Nice and Human Presence Makes It Better: Teachers’ Perceptions of Feedback by Means of an E-Portfolio Enhanced with Learning Analytics. Educ. Sci. 2021, 11, 278. https://doi.org/10.3390/educsci11060278