4.1.1. Quantitative Data
Overall, both the quantitative and qualitative data suggest that teachers have generally had some training in language assessment; however, the majority of participants indicated that they received most or all of their LAL informally (Table 2). Three participants left written comments on the survey, two regarding learning about assessment through certification courses and one through in-service training. However, despite the vast majority of participants receiving formal training, more than half also indicated their reliance on compensation strategies. No participant left additional comments about their compensation strategies.
Participants also responded to a question about the modes through which they received formal training; they could select all of the training modes they had undertaken and all of the compensation strategies they engaged in (Table 3).
Participants also indicated whether they had received training in selected assessment content areas, as shown in Table 4.
Of the assessment activities, open-ended tests, closed-answer tests, and class participation received the highest responses. In contrast, fewer teachers indicated that they had received training on writing-related activities, both timed and untimed writing, or on rubric development. With regard to decision-making, more teachers were trained in classroom-specific procedures, such as giving grades or making instructional decisions, than in program-level decisions such as awarding certifications or determining program placement.
4.1.2. Qualitative Results
The present section details participants’ influential learning experiences, which include both formal training and compensation strategies. Interview participants indicated that they had received some formal LAL training. From the analysis of the qualitative data about learning experiences, the broad themes that emerged concerned how participants learned assessment procedures through both formal training and informal means.
Formal training: pre-service. With regard to pre-service training, five participants noted that they learned about language assessment in a structured training setting before they began teaching, most through their degree programs and one through a non-degree training course, which further illustrated the survey responses. However, participants tended to speak in vague terms about what they learned in these courses and did not give specifics. For example, at the bachelor’s degree level, four participants indicated that they learned about language assessment during their pedagogical methods courses. E1 mentioned that in his undergraduate program, they covered “how to assess through different [types] of evaluation, like exams and rubrics, and all the different stages of evaluation like the diagnostic one, continuous one.” Furthermore, one participant, B6, possessed a master’s degree in foreign language teaching. She explained that she took a course specifically on language testing in which they “would talk about strategies to evaluate listening, grammar, reading [and] speaking.” Additionally, E5, whose undergraduate degree was in psychology, learned about assessment through a teacher certification course, where she learned about the “different kinds of evaluation that teachers are supposed to do.” These examples show that the interview data corroborated the survey finding that about a quarter of the sample underwent pre-service training in language assessment, but they also suggest that this training may not have been adequate.
Formal training: in-service. Eleven participants also noted that they had participated in assessment training workshops through their schools in which they learned nuts-and-bolts assessment practices. As with the survey data, a plurality of participants indicated that they attended a short-term training session after they began teaching. Overall, participants agreed that the training was mostly practical rather than theoretical. For example, at workshops offered by their respective schools, B2 mentioned learning how to develop closed-answer and open-ended test items, and E6 learned about rubric development. Two participants, B5 and B6, mentioned learning about the standardized tests they implemented at their school, including Mexico’s Secretary of Public Education exams and the Cambridge Key English Test and Preliminary English Test. Furthermore, when discussing these workshops, seven participants underscored a communicative and individualistic perspective on assessment. To illustrate, B5, who had undergone training for the Cambridge tests, noted that she learned “the way [she] should evaluate students… depending on if you understand them” and that “pronunciation is not the most important thing.” Moreover, E5’s training experiences had taught her to “evaluate students… not [only] through exams” but also to see “the child as a whole.” The qualitative findings diverged from the survey data in that assessing through communicative and engaging activities was the most salient theme of the interviews, while the survey data indicated that a plurality of participants had been trained in paper–pencil-based assessment methods.
Informal learning: experiences as students. Participants mentioned instances of informal learning both as teacher trainees and as language students. First, 16% of survey participants indicated that they informally asked professors to address assessment, and one interview participant illustrated such an interaction. In the methods course of her bachelor’s degree, E2 mentioned that language assessment was “not part of the course,” but “one of [her] classmates made a push” to include it. Therefore, “at that one moment,” her professor “tried to help [them] to design exams, but it was pretty general.” Second, four participants mentioned that their experiences as English learners influenced their classroom assessment practices, especially regarding activities such as presentations, projects, and essay writing. To illustrate, B1 connected much of her language assessment knowledge and practices to her experience with international standardized English tests, stating that she “knew the skills required” because she had “taken a few language tests.” She gave the example that she “forces students to write paragraphs” due to the importance of writing skills in standardized testing. While the survey did not ask participants about their experiences as language learners, this finding reflects previous research indicating that teachers rely on these experiences when assessing their own students.
Informal learning: on-the-job. Themes mentioned by multiple participants were discussions with peers and supervisors, looking up information online, and trial and error, all of which aligned with the survey items. First, in the interview data, the most salient theme of informal, on-the-job learning was discussions with colleagues, which reflects the survey data in that 63% of survey participants indicated that they learned aspects of language assessment from colleagues. In the qualitative data, four participants mentioned talking with colleagues about assessment. B4 gave specific examples of strategies she implemented after discussions with colleagues; specifically, she learned about peer assessment and rubrics from other teachers. She noted that she “implemented [peer assessment] in class” after “someone explained it to [her].” Furthermore, E2 described her coordinators as “very strict”; they asked her to explain “what exactly [she] was checking,” which “helped [her] a lot” with “how to evaluate.” Second, while over half of the survey participants indicated that they used the internet to learn about language assessment, only two interview participants, E5 and B7, discussed their use of online resources. E5 mentioned looking up articles about assessment concepts, and B7 stated that he “discovered all the different types of assessment” and “the importance and parts of feedback” in his personal “research about assessment.” Finally, a quarter of survey participants mentioned learning by trial and error, and one interview participant, E4, noted that he “learn[ed] more by practicing, by experiments” than from his previous training experiences.