Article

Squaring the Circle of Alternative Assessment in Distance Language Education: A Focus on the Young Learner

by
Christina Nicole Giannikas
Department of Rehabilitation Science, Cyprus University of Technology, Limassol 3036, Cyprus
Languages 2022, 7(2), 121; https://doi.org/10.3390/languages7020121
Submission received: 3 February 2022 / Revised: 7 April 2022 / Accepted: 24 April 2022 / Published: 12 May 2022
(This article belongs to the Special Issue Recent Developments in Language Testing and Assessment)

Abstract
Because of the suspension of face-to-face (F2F) teaching activities caused by COVID-19, practitioners are in limbo regarding the assessment of young learners (YLs) in the virtual learning environment, as they are left with minimal guidance and evidence on what can be applicable and effective in the new context. As in many countries worldwide, schools in Greece were closed in March 2020 as a consequence of the COVID-19 pandemic lockdown. Schools began re-opening in September 2020; however, the second wave of COVID-19 struck, and the number of cases began to grow dangerously. Consequently, schools closed for the second time at the beginning of November 2020, causing teachers and students to face significant assessment challenges. The article presents a case study that concentrates on eight YLs aged 8–10 years old. Alternative assessment was applied during the students’ online language lessons as a means for the teacher to assess and evaluate students’ progress and learning of vocabulary and spelling. For the needs of the study, online observations were conducted, and field notes, record sheets and checklists were kept for a period of two months. Two months after the online lessons commenced, the students were interviewed in order to gain a holistic view of their progress and their feelings toward the experience of an alternative form of assessment.

1. Introduction

Research regarding alternative assessment and young learners (YLs) in the language classroom has led to an assessment construct that is grounded in a view of foreign language as a tool for gaining knowledge, meaning, and cognitive processing skills (Donitsa-Schmidt et al. 2004). However, because of the suspension of face-to-face (F2F) teaching activities caused by COVID-19, practitioners are in limbo regarding the assessment of YLs in the virtual learning environment, as they are left with minimal guidance and evidence on what can be applicable and effective in this new context. As in many countries worldwide, as a consequence of the COVID-19 pandemic lockdown, schools in Greece were closed in March 2020. Although schools began re-opening in September, the second wave of COVID-19 struck, and the number of cases began to grow dangerously. Consequently, schools closed for the second time at the beginning of November 2020, causing teachers and students to face significant assessment challenges. The present article aims to present conceptual models of assessment in order to explore how traditional assessment practices can be displaced by alternative forms of evidence-intensive profiling in the YL’s virtual classroom, with a specific focus on spelling and vocabulary. When referring to alternative assessment, the author concentrates on a stress-reduced environment, which provides an atmosphere that helps children perceive the assessment procedure as an integral component of the learning and teaching process and not as a tool to grade them competitively (Shaaban 2001). In other words, and given the characteristics of young children, the assessment procedures used for them should include methods that take into consideration children’s physical, social, and cognitive development; appeal to the age and interest of the children; and include feedback that underlines what they can do instead of what they cannot. 
In the case of traditional assessment practices, the author refers to formal assessment approaches in which students, despite their young age, are assessed on their ability to recollect and reproduce the content they have been taught (see Coombe et al. 2020), mainly in a summative way. Additionally, this paper aims to fill the current gap in the literature regarding virtual alternative assessment and assessment practices in the YL’s online learning environment. The two elements are integral components in foreign language learning and are essential to the development of all basic language skills. The abrupt change in children’s mode of learning and assessment led to the current study, which aspires to shed light on possible approaches when young children are taught foreign languages online. The aims of the research were:
  • To collect data via digital tools that focus on vocabulary and spelling in a meaningful context.
  • To collect data via digital feedback.
  • To observe the students’ progress and learning approach to the newly introduced tools and mode of learning.
  • To analyze findings during a child-friendly assessment process.
The article presents a case study that concentrates on eight YLs aged 8–10. Alternative assessment was applied during the students’ online language lessons as a means for the teacher to assess and evaluate students’ progress and learning of vocabulary and spelling. For the needs of the study, online observations were conducted, and field notes, observation sheets, and checklists were kept for a period of 5 weeks. Two months after the online lessons had commenced, the students were interviewed in order to gain a holistic view of their progress and feelings toward the experience of the alternative form of assessment that took place online. This study answers the following research questions:
RQ1.
To what extent do YLs progress in addition to mastering the core challenges of spelling skills and vocabulary development through online learning environments?
RQ2.
How can the process of alternative assessment facilitate assessment and evaluation in the YL’s online environment?
RQ3.
What are children’s attitudes toward alternative forms of assessment during unprecedented times and lockdown?
The objective was to analyze and discuss the role of alternative assessment in the distance language education of YLs during the COVID-19 outbreak. The paper presents implications for YLs and the positive effects of online alternative assessment, and it suggests designs of suitable assessment opportunities in virtual environments.

1.1. Background Review

This section aims to provide a background review of online learning, online engagement, and alternative assessment. To the author’s knowledge, connecting online (alternative) assessment, YLs, and the COVID-19 language education era has been neglected in the literature. There have been studies conducted to clarify the role of teachers amidst COVID-19 online assessment (Forrester 2020; Chung and Choi 2021; Zhang et al. 2021); however, none have focused on YLs and their learning perspective.

1.2. Emergency Remote Teaching and Assessment

Emergency Remote Teaching (ERT) refers to an abrupt shift of instructional delivery to an online mode in response to an immense catastrophe or crisis, such as COVID-19. This emergency solution is very different from planned online courses, which are initially and purposely designed to be delivered virtually (Mohmmed et al. 2020). Furthermore, because of the lack of planning and time, the literature has associated ERT with poor online teaching infrastructure, teachers’ inexperience, and complex home environments (Zhang et al. 2020), all of which characterize a quarantine situation. A rushed transfer to online education cannot support a well-planned course of any kind, as it is impossible to make a smooth transition under the pressure of such circumstances (Hodges et al. 2020; Bozkurt and Sharma 2020). The pandemic has given the language education community the opportunity to explore and develop coping mechanisms for ERT or for any other emergencies or disasters that may emerge in the future (Ferri et al. 2020). This especially applies to the development of online assessment, which has been in place for many years but had not been challenged as much as it has been since the outbreak of the pandemic. In fact, the transition from physical classroom-based traditional assessment to an online environment caused by COVID-19 has had a far-reaching impact on different aspects of assessment (Ghanbari and Nowroozi 2021). According to the existing literature, different variables have been researched within higher education contexts, such as lack of contact with teachers, poor digital literacy, and lack of effective interaction and feedback mechanisms, all from the teachers’ perspective (Holmes and Gardner 2006; Tarhini et al. 2013).
The greatest challenges have been seen in administrative procedures, in digitizing paper-based systems, and in online testing that includes multiple-choice tests and the assessment of problem-solving skills (Ridgway et al. 2004). When the pandemic broke out, teachers struggled to adapt their assessment practices to online modes; yet, some saw it as an opportunity to re-conceptualize their assessment approaches and try alternative assessment practices that aligned with online teaching and learning. More specifically, Chung and Choi (2021) investigated the adaptations made to assessment at a university language center in South Korea. They found that teachers developed a series of continuous alternative assessments, which involved opportunities for self- and peer-evaluation; these were very different from their previous summative and product-oriented practices. Online teaching was also found to support specific feedback practices. Xu (2021) found positives in the online assessment process, as students favored online written corrective feedback and teachers’ online tutorials and feedback because they could be reviewed at any time, indefinitely. A number of scholars have argued that children are in a continuous state of cognitive development and have assessment needs dissimilar to those of adults. Specifically, McKay (2006) suggests that results from ‘traditional tests’ may not always be ideal for children, as they can have negative effects on children’s motivation, self-esteem and attitude towards language learning. Therefore, investigations into online alternative assessment are needed, as such techniques can present the language education community with a dynamic perspective of children’s online learning progress, rather than a static picture. Throughout the literature, many calls were found for more research on best practices in online language assessment, indicating that a huge gap in our understanding remains.

1.3. Alternative Assessment Online

Barootchi and Keshavarz (2002) have suggested that alternative assessment, also known as nontraditional assessment, is an umbrella term for types of assessment other than standardized traditional tests. Similarly, Bailey (1998) mentioned in earlier research that traditional assessment approaches are one-off, indirect and likely to be inauthentic, as opposed to alternative assessment, which is continuous, longitudinal, and a more direct form of assessment. Alternative assessment has the potential to alter the traditional assessment paradigm of students’ passivity and replace it with student initiative, self-discipline, and choice (Janisch et al. 2007). This can be particularly beneficial to younger age groups, as Hasselgreen (2005) argues that children frequently need input and tasks that consider their maturity and short-term motivation. Alternative assessment provides teachers with the opportunity to gather information on students’ abilities, talents, interests, and potential, since these methods encourage reflection on the students’ performance in pedagogical settings (Barootchi and Keshavarz 2002). Moreover, Shaaban (2001, p. 8) highlights “the need for teachers to use a variety of types of alternative assessment, especially non-threatening informal techniques, with young EFL/ESL learners”. In the case of alternative assessment, activities are part of the ongoing classroom dynamic and learning practices; they provide specific feedback to teachers, which prompts them to reflect on and adjust their teaching so as to meet students’ needs and integrate assessment into their instruction (Shepard 2000). According to Ioannou-Georgiou and Pavlou (2003), such types of alternative assessment can be portfolios, games, demonstrations, self-assessment, peer-assessment, projects and performance tasks. This process of non-threatening and authentic assessment can empower young language learners.
Students learn to appreciate and take ownership for their learning, set appropriate personal goals, and evaluate their own progress toward these goals (Hansen 1998).

1.4. The Context

In line with EU developments in language learning and policy recommendations, in 2010, the Greek Ministry of Education and Religious Affairs introduced EFL in the first and second grades of primary state schools (Karavas 2014). Students within the Greek context continue their English language studies through primary and secondary school (Giannikas 2013). Parallel to state school English language learning, a high percentage of students take lessons that are offered at private language schools, also known as supplementary private education (Bray 2011). This has resulted in a strong private sector of foreign language institutes, which provides intensive foreign language tuition to students as young as 7 years old; the goal of this intensification of language studies is to help and encourage learners to prepare for high-stakes English language exams and, eventually, obtain language certificates (Alexiou and Mattheoudakis 2011; Tsagari 2011).
The phenomenon of private language education is not limited within the Greek context. Many countries worldwide experience supplementary private education (Bray and Lykins 2012; Tsagari and Giannikas 2018). Specifically, Bray’s (2011) research suggests that, every year, families in Europe spend astonishing amounts of capital on private tutoring.
With the outbreak of the pandemic and the abrupt shift to online learning, students and teachers in Greece (and beyond) were shocked by their new norm, as they needed to become familiar with online tools and resources and adjust their teaching/learning approaches overnight (Giannikas 2020). On 10 March 2020, with 89 confirmed cases and no deaths in the country, the government, in collaboration with the Greek National Health Organisation, announced the suspension of the operation of schools and universities across the country (cedefop.europa.eu) in order to avoid spreading the virus. Additionally, the Ministry of Education and Religious Affairs saw this as an opportunity to bring forward long-awaited reforms for the development of digital knowledge and skills (https://gemreportunesco.wordpress.com, accessed on 10 August 2021). Nonetheless, this would be challenging because of the longstanding legacy of the financial crisis that preceded the pandemic. According to Schleicher (2020), poor finances have delayed necessary investments in ICT in the country. According to the data, 1 in 5 students attending Greek state schools do not have access to a computer, while 1 in 10 do not have access to the internet. More than 1 in 3 students attended schools where teachers did not have the necessary equipment and/or pedagogical/digital literacy skills to integrate devices and digital tools into their instruction, which led head teachers to acknowledge that an effective online learning support platform was not possible. This was not the case in the private sector, however. Since foreign language schools across the country are self-funded, it was up to each school owner to decide whether to invest in technology, training, and resources. The owners’ decision was straightforward, as the future of shadow education was at stake.
In the beginning of November 2020, language schools closed, prompting professionals in the field of private language education to adjust to a new norm. In contrast to the general negative climate that followed the virus’s outbreak, private language schools demonstrated quick reflexes in managing the health crisis and remained closed for six months, until their communities reached critical benchmarks for controlling the virus.

2. Materials and Methods

2.1. Participants

Eight students participated in the current case study. The children were beginners between 8 and 10 years old who attended different Greek mainstream schools in Southwestern Greece. Four of the students were siblings and connected online from different rooms of their home. The focus was on the language learners; however, it is important to mention that the teacher had 15 years of classroom experience but had never taught in an online setting. Before and during the lockdown, she underwent training on teaching online and had conducted her own research on alternative assessment. The practitioner holds a BA in English Literature and an MA in TESOL. The teacher introduced vocabulary and spelling assessment activities synchronously and asynchronously, as displayed in Table 1 and Table 2:
The participants were beginners (Junior A class). The course engages YLs in student-centered and interactive activities that enable learners to develop and understand basic linguistic concepts. The objectives of the course are to enhance: (1) problem solving, (2) critical thinking, (3) decision making, (4) collaboration, and (5) communication. Classes took place twice a week, with each lesson lasting one hour and 25 min. The students used EFL course books and additional material the teacher created and provided via email. The students were encouraged to use computers/laptops and/or smart devices to complete the tasks the teacher assigned. Alternative assessment activities took place weekly, and feedback was shared with the students, but not in the form of grades. To increase the validity of the activities used for assessment purposes, the students were encouraged to work on 2–3 activities during each lesson for the 5-week duration of the study. The activities took place in a consistent environment and at a specific time, 40 min into the lesson. Finally, prior to the start of each assessment activity, the teacher ensured the students had clear audio and video.

2.2. Data Collection Tools

For the needs of the study, a total of 8 online observations were conducted over a period of 5 weeks. Each observation lasted 1.5 h, during which field notes, activity records, observation sheets and checklists were kept. Additionally, all participants shared the same L1 (Greek). Other than information about the class (i.e., date, level, aim, etc.), the observation sheet was divided into 4 sections collecting information about important aspects of the lessons observed, as recommended in the literature (Tsagari and Georgiou 2016; Tsagari and Giannikas 2017, 2018). Observation was deemed essential to the specific study, as it is an important data collection method for qualitative inquiry. Through consistent observation, the researcher was able to capture the reality of the assessment process and how the participants interacted with their surroundings (Allwright 1988; Robson 2002; Merriam 2009). Two months after the online lessons had commenced, semi-structured online interviews were conducted, which focused on gaining more in-depth information on the students’ perceptions of their progress and their feelings towards the experience of an alternative form of assessment. The supporting instruments for data analysis, such as the question type form, checklists, and observation sheets, were piloted and modified before the real data collection.
It is important to mention that when the data collection and analysis were complete, the same assessment approaches were applied in other online classes of similar age groups at the same language school in order to validate the findings.

2.3. Data Collection Process

Prior to data collection, parents were provided with letters of informed consent, which they were asked to sign and return to the researcher online. The letters emphasized students’ privacy (with the use of pseudonyms), and that the children would not be identifiable in the publication and dissemination of the findings.
The data collection procedure was as follows:
Step 1:
The researcher created an observation scheme for the study.
Step 2:
There was a selection of the participants of the research.
Step 3:
The researcher visited the online class, sitting in from the beginning to the end of each session and taking notes on the assessment process. The classes were observed as carefully as possible, and the researcher was given permission to record the sessions. The researcher’s camera and microphone were switched off to avoid validity threats.
Step 4:
The students were interviewed.
Step 5:
After the preliminary data collection from both observation and interview, the data were analyzed.

2.4. Data Analysis

In light of Creswell and Plano-Clark’s (2007) clarification that ‘qualitative analysis begins with coding the data, dividing the text into small units (phrases, sentences, and paragraphs), and assigning a label to each unit’ (p. 131), the researcher reviewed what was witnessed and recorded during the observations and synthesized it with the participants’ performance. The observations were conducted to answer RQ1 (To what extent do YLs progress in addition to mastering the core challenges of spelling skills and vocabulary development through online learning environments?). They were transcribed on Atlas.ti and instances were calculated. The seven themes that emerged from the analysis of the observations are the following, and are detailed in Table 3:
  • Responses;
  • Efforts;
  • Accuracy;
  • Participation;
  • Peer empowerment;
  • Reactions to feedback;
  • Changes.
It is important to mention here that the assessment activities introduced online were not used in the classroom. The assessment that was conducted within the classroom was in the form of testing and written dictation.
The interviews were conducted in order to answer RQ2 (How can the process of alternative assessment facilitate assessment and evaluation in the YLs’ online environment?) and RQ3 (What are children’s attitudes towards alternative forms of assessment during unprecedented times and lockdown?). Interviews (see Appendix A) were also transcribed verbatim on Atlas.ti and thematically analyzed. The 5 themes that emerged from the interviews are:
  • Impressions and attitude towards online alternative assessment;
  • Equipment/digital tools;
  • Vocabulary comprehension;
  • Feedback needs;
  • Spelling logic.
The following sections of the article elaborate on the observation findings and outcomes per case, i.e., per student. The interview findings follow and are presented per theme.
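The "instances" described above were converted into the weekly percentage rates reported in the Results section. A minimal sketch of that kind of tally is given below; it assumes hypothetical coded records and per-week denominators, since the article does not publish the actual Atlas.ti export, and all names and figures here are illustrative only:

```python
from collections import defaultdict


def weekly_rates(coded_records, opportunities_per_week):
    """Convert coded observation instances into weekly percentage rates.

    coded_records: iterable of (week, theme) pairs, one per coded instance
        (e.g. each time a "response" was recorded during an observation).
    opportunities_per_week: dict mapping week -> number of assessment
        opportunities observed that week (the denominator for the rate).
    Returns a dict mapping (week, theme) -> integer percentage.
    """
    counts = defaultdict(int)
    for week, theme in coded_records:
        counts[(week, theme)] += 1
    return {
        (week, theme): round(100 * n / opportunities_per_week[week])
        for (week, theme), n in counts.items()
    }


# Illustrative data: 10 opportunities per week, a handful of coded instances.
records = [(1, "responses"), (1, "responses"), (1, "efforts"),
           (2, "responses"), (2, "responses"), (2, "responses")]
rates = weekly_rates(records, {1: 10, 2: 10})
print(rates[(1, "responses")])  # 20
print(rates[(2, "responses")])  # 30
```

Tracking each theme's rate week by week in this way yields the kind of per-case trajectories (e.g., 65%→65%→70%→80%→80%) discussed in the observation findings.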

3. Results

3.1. Observation Findings

The inclusion of qualitative data added valuable insight to the study and placed more emphasis on participants’ practices and perspectives (Taylor et al. 2015). As previously mentioned, the data gathered during the online observations address RQ1 (To what extent do YLs progress in addition to mastering the core challenges of spelling skills and vocabulary development through online learning environments?). More specifically, by studying each case the researcher looked at the progress the YLs made via the assessment process, while in lockdown and while learning in an online environment. The study views learning, teaching and assessment as a unified continuum, and the analysis aims to evaluate each case in depth and state how effective the online assessment approach was with regard to spelling and vocabulary.
Figure 1 displays the progress student C1 (pseudonym) made in each category, i.e., theme, and the extent to which YLs overcame challenges of spelling skills and vocabulary development through online learning environments. In the first case, the most progress was identified in responses, efforts and participation. According to the instances calculated from the observation data and activity records, over the 5-week period the rate of responses increased from 70% to 88%, efforts ratings from 75% to 88% and participation ratings from 70% to 90%. Furthermore, accuracy levels increased from 70% to 85%, and there was a slight increase in responses to feedback from either the teacher or peers (80% to 90% over 5 weeks). There was also an indicative change in spelling and vocabulary activities (70–90% over 5 weeks). The theme that developed least by comparison was peer empowerment (70–80% over 5 weeks) during the online assessment process.
The analysis of Case 2 indicated higher levels, as can be seen in Figure 2. According to the data, most progress was recorded in efforts (95–100% over 5 weeks), participation (90–95% over 5 weeks) and responses to feedback (95–100% over 5 weeks). The levels which were also particularly high were responses, accuracy and changes. However, peer empowerment was distinctly low (60–70% over 5 weeks).
As shown in Figure 3, the analysis of Case 3 showed most progress in efforts (60% to 90% over 5 weeks) and participation (60% to 88% over 5 weeks), and a slight but gradual increase in accuracy (60%→65%→75%→78%→84% over 5 weeks). Nonetheless, according to the data, the themes with the least progress were the overall change in vocabulary and spelling (70%→75%→75%→80%→82% over 5 weeks) and responses (65%→65%→70%→80%→80% over 5 weeks). The low responses could explain why, even though efforts and participation increased, the approach to the proficiency targets used for the needs of assessment saw limited alteration.
Figure 4 displays the analysis of Case 4, which began at high levels and indicated most progress in efforts (80% to 95% over 5 weeks), responses to feedback (90% to 95% over 5 weeks) and overall change in vocabulary and spelling (80% to 90% over 5 weeks). However, it is important to note that responses (85%→88%→92%→95%→90% over 5 weeks) and peer empowerment (80%→80%→80%→75%→75% over 5 weeks) dropped towards the end of the online learning period (see the 4th and 5th weeks of the study). Furthermore, accuracy competency levels, responses to feedback and changes made in spelling and vocabulary activities increased over the time of data collection.
As seen in Figure 5, the data indicate that in the fifth case there was substantial progress in all categories, with the highest levels in changes (55%→55%→60%→70%→72% over 5 weeks) and the lowest in responses (55%→55%→60%→70%→72% over 5 weeks).
According to the data displayed in Figure 6, the competencies assessed in Case 6 showed most progress in responses (65% to 87% over 5 weeks), efforts (60% to 86% over 5 weeks), participation (60% to 86% over 5 weeks) and accuracy (65% to 85% over 5 weeks), and far less in peer empowerment (65%→65%→65%→68%→69% over 5 weeks). Furthermore, there was a slight increase in responses to feedback, and changes made in spelling and vocabulary activities.
Figure 7 shows the competencies assessed in Case 7 and indicates that online assessment had high percentages and substantial progress in efforts (86% to 94% over 5 weeks), but inconsistency in participation (60%→65%→70%→80%→86% over 5 weeks). According to the data, less progress was recorded in responses to feedback (65%→67%→70%→75%→75% over 5 weeks), where levels increased only in the final 2 weeks of the study, and in peer empowerment (65%→65%→65%→68%→69% over 5 weeks), where there was likewise an increase only in the final two weeks.
Finally, as displayed in Figure 8, in Case 8 there was a substantial increase in accuracy (78% to 94% over 5 weeks), responses (70% to 78% over 5 weeks) and responses to feedback (80% to 94% over 5 weeks); however, there were no changes in levels of peer empowerment.

3.2. Interview Findings

As mentioned previously, the interviews were conducted to answer RQ2 (How can the process of alternative assessment facilitate assessment and evaluation in the YLs’ online environment?) and RQ3 (What are children’s attitudes towards alternative forms of assessment during unprecedented times and lockdown?). The findings are presented per theme that emerged: (1) Impressions and attitude towards online alternative assessment; (2) Equipment/digital tools; (3) Vocabulary comprehension; (4) Feedback needs; (5) Spelling logic.

3.2.1. Impressions and Attitude towards Online Alternative Assessment

Under this theme, four sub-themes can be named: digital culture, parental influence, attitudinal factors and assessment culture. A rigorous analysis of the interviews revealed that, soon after the shift to online assessment/learning, the students faced the greatest shock when placed in front of the computer screen that replaced their classroom, which had been a huge part of their social life. All the students stated that they were not familiar with the use of technology for educational purposes, and this confused and scared them at first. In fact, the students had rarely been exposed to technology in the classroom before the pandemic. Besides the students’ digital struggles, the parents had little faith in technology as a reliable assessment/learning tool. Parents supported traditional means of assessment, as Greece is known to hold an exam-oriented culture. According to the students, their parents could not see an alternative process having the same impact online. Their parents’ opinions made the students more reluctant at first. The following sample of extracts describes the days after the shift to online assessment:
Extract 1, C2: “I did not know how we could use the internet to learn because I always use it to play games and watch videos. When we were in lockdown and we were told we had to work online, I did not take it seriously. Then my parents explained that we won’t be able to go outside for a while and this will be the only way to continue school. I couldn’t understand how this would work. I didn’t think we would be tested in my English classes, because we were not tested in Greek school”.
Extract 2, C5: “I was very scared when my teacher told me I would be assessed online because I couldn’t understand how this would happen and how she would grade us. I thought she would send us tests to do at home and then we would send them back to her. But I think what we did at the end was a lot of fun and it was not scary at all”.
Extract 3, C3: “I was very worried about being assessed online because my parents had told me that the teacher won’t be able to assess us all properly online as she would in the classroom. In the classroom she can monitor us, so I was afraid she would not be able to see that I study and try my best. When the teacher explained the way she would assess us we saw it as a game, at first. Now I think it is the kids’ way of testing”.

3.2.2. Equipment/Digital Tools

What is expected is that, in a time of crisis, policy-makers provide digital tools and equipment to those who need them. This can help avoid anxiety that might affect the normal performance of a young student. According to the data, families were caught off-guard at first, as those who were not technologically equipped had to seek laptops online, since retail stores were temporarily closed. The process of finding the right laptop and waiting for it to arrive interfered with the students’ language learning and assessment. Half the students first joined the online classes on a mobile phone, which made the interactive assessment activities challenging and frustrating. Samples of the students’ input are as follows:
Extract 4, C1: “The first two weeks were very difficult for me because we did not have a laptop at home and I had to use my mother’s smartphone. It was difficult for me to follow the lesson and take part in the spelling and vocabulary activities. Thankfully, when the laptop arrived all this changed. It was much easier to work on a laptop. I use it for school, too”.
Extract 5, C5: “I had a laptop at home, so I could participate in all the activities from the start of the lockdown. Some friends from my class couldn’t though and it had an effect on everyone because we had to go slower and repeat a lot. We also had to wait for the other children to receive their laptops so we could do the new activities with our teacher. I felt bad for the other children that did not have a laptop. Laptops are very easy to use”.

3.2.3. Vocabulary Comprehension

The sudden shift to online alternative assessment created a number of pedagogical challenges for the students and their parents, as seen in the previous sections of the interview findings. A major concern was how assessment was to be executed, how the evaluation would take place, how the teacher was going to apply alternative assessment activities online, and whether the results would be credible. These concerns suggest that online alternative assessment introduces new variables into the assessment context, which can contaminate assessment outcomes. In a context where online alternative assessment was a new experience for students, it was important to investigate the matter. In the process, students mentioned the following:
Extract 6, C7: “Before we started working on Vocabulary online, I didn’t pay much attention to the process my teacher explained to us at the start of our online lessons. I didn’t take it very seriously because I didn’t think it would last long. Now, I really love these activities and I always do well. They are really fun and I feel that I learn a lot and have fun at the same time. I can’t do it for very long though because sometimes the bright light of the monitor hurts my eyes and my back”.
Extract 7, C8: “The Vocabulary assessment we do online is very different to what we did in class. In the classroom, we took tests on a piece of paper. Now we use the internet and the tests look like the games I play on my mother’s mobile phone. I do very well and get very high scores. I feel like I remember more words now because of all the pictures in the activities. I translate the words in my head but I still learn them”.
Extract 8, C6: “The Vocabulary assessment online is a game we play now that we are in lockdown. When we go back to the classroom we will go back to the serious tests, although we cannot work together on those”.

3.2.4. Feedback Needs

The matter of feedback could not be overlooked, given the circumstances and conditions of the language learners during quarantine. The students were vulnerable during this time, so the way feedback was provided was key. For this reason, and for the needs of this research, feedback was provided either to the students as a group during class, instantly as part of the digital activity (instant feedback system), or in a private synchronous chat. Since the feedback was gradeless, the children were not intimidated or stressed by the feedback process, and there was a greater focus on reflection and improvement, as can be seen in the sample extracts below:
Extract 8, C2: “The teacher tells us if we do well or not but we don’t get grades. Even if I don’t do well on an activity though, I know I can try again next time”.
Extract 9, C7: “The teacher corrects us and helps us get better. There are no grades so I don’t think it counts if we made mistakes”.
Extract 10, C5: "There is no pressure now that we don't get grades".

3.2.5. Spelling Logic

Language learning and assessment were viewed in a new light when conducted online and in the form of alternative assessment. Because the students were not preoccupied with grades and test-taking, they had the opportunity to reflect on their learning and focus on their improvement. However, as mentioned previously, this process was not considered a 'serious part of learning', which ultimately connects to the assessment culture the children are accustomed to. Learning about the relationships between letters and sounds can be difficult for language learners of all ages. This research shows that the participants relied on visual memory for spelling, which is closely connected to the language processing networks in the brain (International Dyslexia Association 2011), without referring to spelling logic or discovering features of a word. As evidence, the students' spelling logic did not noticeably improve. An alternative to the traditional spelling curriculum can be compatible with alternative assessment because it allows students to comprehend abstract patterns, form connections between old and new knowledge, and build understanding through integrated study. This combination was not supported during the study, which the researcher estimates to be the reason why spelling ability plateaued, as shown in Table 4 and extracts 11, 12 and 13:
Extract 11, C5: “We didn’t do any spelling, only in online games, but we didn’t write dictation. We studied new vocabulary, but the teacher could only ask us what the words meant, not how they were spelled”.
Extract 12, C3: “We didn’t do spelling because we were working online and our teacher couldn’t correct us, so we will continue with spelling when we return to the classroom”.
Extract 13, C4: “We didn’t do any spelling online, only when we did spelling games. We will do more when we return to our school”.

4. Discussion

The results of the present study revealed that the pandemic influenced different aspects of vocabulary and spelling assessment in the particular YLs' context. As seen previously, the article explores how the students progressed and viewed the assessment process amid the challenges that followed the shift to online learning. This emphasizes the role of alternative assessment and the misunderstood credibility of online education and assessment as fluid and dynamic contexts. The findings show that several issues that seemed difficult for students at first gradually resolved: the observations indicate that students' initially inhibiting attitudes toward online assessment improved as they accepted their new norm and came to enjoy the benefits of online assessment. Nonetheless, the cloud of 'serious' assessment still shadows the development of alternative online assessment in the particular context, as the students' interview statements confirm.
However, the research shows that the students adopted strategies and gradually changed their attitude toward online learning and assessment. In other words, despite the shift to an online mode of learning at short notice, the students evaluated their resources and circumstances to cope with the new situation. For example, while the students in this study initially doubted the efficiency of online assessment, they gradually learned to use the benefits it provided to facilitate their progress and enjoy the process. Further research is needed with a larger sample of participants in less stressful circumstances, where students can experience this form of assessment as an effective process rather than the only alternative during lockdown. Few studies have investigated the outcomes of online alternative assessment as a pedagogical resource in a time of crisis such as the COVID-19 pandemic. At a time when many educational institutions were forced into lockdown to avoid the spread of the virus, the integration of digital tools into the evaluation system is of high importance. The findings of the present study are therefore valuable, as they show the efforts of eight different cases and how these learners perceived and handled the impediments of online assessment at the particular time of the COVID-19 pandemic.
In sum, the findings of this study confirmed that YLs can progress in, and even master, the core challenges of spelling skills and vocabulary development through online learning environments. However, squaring the circle of alternative assessment in distance language learning is a process, and there is a need to change attitudes towards online education and assessment in the YL context. The pandemic introduced online education as a solution to a crisis, a framing under which the quality of language learning is expected to be poor; this perception is shared by students and, by extension, by their environment. Under more favorable circumstances, there is no reason to view online education and assessment as an inferior Plan B. On the contrary, language learners across the globe are presented with more options, more exposure to the foreign language and more opportunities to extend their knowledge and gradually evaluate it.
There is room for improvement in the field of language assessment, whether in an online or onsite setting. Specifically, its connection to the learning process can prove to be a revolution in early language learning. The process of alternative assessment in this study not only confirmed that it is age appropriate, but also demonstrated that it can facilitate and validate the assessment and evaluation of YLs in an online learning environment, fulfilling the aims of formative assessment. Alternative assessment in the online setting observed showed the need for new formats for gathering information about students' achievements, and for new processes through which such information is synthesized. The children's attitudes towards alternative forms of assessment during unprecedented times were positive; however, the assessment culture in the specific context can act as a barrier. Although the students were observed to progress and improve, the process was not appreciated as much as traditional test-taking, due to the assessment and test-taking culture of the particular education system. Additionally, because the data were gathered in a time of crisis, the students needed emotional support, encouragement and positivity, in addition to high-quality online education and assessment. Further research is needed in less emotionally critical circumstances, where online alternative assessment can be 'put to the test' in a less stressful scenario. Although its benefits were revealed in the current study, which took place in difficult circumstances, it is the author's estimate that these benefits would continue to develop in improved circumstances and help change the assessment culture in specific contexts and beyond.

5. Conclusions

When we consider the main characteristics of the young distance language learner, alternative assessment appears particularly beneficial. Based on the data of the current study, the ongoing assessment activities in vocabulary and spelling removed pressure from the learner (see also Simonson et al. 2000). The participants' descriptive accounts revealed that they were initially challenged on different technological grounds, but as the lessons proceeded, solutions and strategies were found to adapt assessment practices to the new context. Although they were in disbelief and distrusted the technology at first, the students experimented dynamically and adjusted to their assessment within the existing resources and circumstances. Since the educational system is only beginning to explore different aspects of online assessment, the results of the present study can contribute to the field, help teachers in their practice in similar contexts, and provide new directions for future research. The different challenges and solutions that emerge can be studied separately in larger investigations.
This study also has a number of limitations. First, it was conducted in a single private language school in Greece; future studies should include more institutions. In addition, the study was conducted with a limited sample of students, which restricts the generalizability of the findings beyond the particular context. Finally, as previously mentioned, the study was conducted during a time of crisis, in which students, teachers and parents experienced circumstances they had never faced before. This could have had an impact on the outcomes of the study, and it is vital that future research investigates online alternative assessment in a less stressful context. This would help validate the findings published during the COVID-19 education crisis.

Funding

This research received no external funding.

Institutional Review Board Statement

Ethical review and approval were waived for this study because parental consent was obtained (the participants were children).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy issues.

Conflicts of Interest

The author declares no conflict of interest.

Appendix A

Interview Questions
Translated from Greek to English
  • How do you feel about learning English online?
  • Do you have the equipment to learn English online?
  • Do you like the spelling and vocabulary activities you do?
  • Does it feel like you are being assessed?
  • Which activities were your favorite?
  • Can you spell the words: lucky, ice-cream, friends, helicopter, tunnel, out of, London, tourist?
  • Why do you think you did so well?/didn’t do so well?

References

  1. Alexiou, Thomaï, and Marina Mattheoudakis. 2011. Bridging the gap: Issues of transition and continuity from primary to secondary schools in Greece. In 23rd International Symposium Papers on Theoretical and Applied Linguistics—Selected Papers. Thessaloniki: School of English, Aristotle University of Thessaloniki.
  2. Allwright, Dick R. 1988. Observation in the Language Classroom. London: Longman.
  3. Bailey, Kathleen. 1998. Learning about Language Assessment: Dilemmas, Decisions, and Directions. Boston: Heinle & Heinle.
  4. Barootchi, Nasrin, and Mohammad Hosein Keshavarz. 2002. Assessment of achievement through portfolios and teacher-made tests. Educational Research 44: 279–88.
  5. Bozkurt, Aras, and Ramesh C. Sharma. 2020. Emergency remote teaching in a time of global crisis due to CoronaVirus pandemic. Asian Journal of Distance Education 15: i–vi.
  6. Bray, Mark. 2011. The Challenge of Shadow Education: Private Tutoring and Its Implications for Policy Makers in the European Union. Brussels: European Commission.
  7. Bray, Mark, and Chad Lykins. 2012. Shadow Education: Private Supplementary Tutoring and Its Implications for Policy Makers in Asia. Philippines: Asian Development Bank.
  8. Chung, Sun-Joo, and Lee-Jin Choi. 2021. The development of sustainable assessment during the COVID-19 pandemic: The case of the English language program in South Korea. Sustainability 13: 4499.
  9. Coombe, Christine, Hossein Vafadar, and Hassan Mohebbi. 2020. Language assessment literacy: What do we need to learn, unlearn, and relearn? Language Testing in Asia 10: 3.
  10. Creswell, John W., and Vicki L. Plano-Clark. 2007. Designing and Conducting Mixed Methods Research. Thousand Oaks: SAGE Publications.
  11. Donitsa-Schmidt, Smadar, Ofra Inbar, and Elana Shohamy. 2004. The effects of teaching spoken Arabic on students’ attitudes and motivation in Israel. Modern Language Journal 88: 218–29.
  12. Ferri, Fernando, Patrizia Grifoni, and Tiziana Guzzo. 2020. Online Learning and Emergency Remote Teaching: Opportunities and Challenges in Emergency Situations. Societies 10: 86.
  13. Forrester, Adam. 2020. Addressing the challenges of group speaking assessments in the time of the Coronavirus. International Journal of TESOL Studies 2: 74–88.
  14. Ghanbari, Nasim, and Sima Nowroozi. 2021. The practice of online assessment in an EFL context amidst COVID-19 pandemic: Views from teachers. Language Testing in Asia 11: 27.
  15. Giannikas, Christina N. 2013. The benefits of management and organisation: A case study in a young learners' classroom. CEPS Journal 3: 87–104.
  16. Giannikas, Christina N. 2020. Prioritizing when Education is in Crisis: The language teacher. Humanising Language Teaching 22.
  17. Hansen, Anders. 1998. Content Analysis. In Mass Communication Research Methods. Edited by Anders Hansen, Simon Cottle, Ralph M. Negrine and Chris Newbold. Thousand Oaks: SAGE Publications, pp. 91–129.
  18. Hasselgreen, Angela. 2005. Assessing the Language of Young Learners. Language Testing 22: 337–54.
  19. Hodges, Charles, Stephanie Moore, Barbara Lockee, Torrey Trust, and Mark Bond. 2020. The Difference Between Emergency Remote Teaching and Online Learning. EDUCAUSE Review. Available online: https://er.educause.edu/articles/2020/3/the-difference-between-emergency-remote-teaching-and-online-learning (accessed on 12 August 2020).
  20. Holmes, Bryn, and John Gardner. 2006. E-Learning: Concepts and Practice. London: SAGE Publications.
  21. International Dyslexia Association. 2011. Spelling [Fact Sheet]. Available online: https://app.box.com/s/phcrmtjl4uncu6c6y4qmzml8r41yc06r (accessed on 6 July 2021).
  22. Ioannou-Georgiou, Sophie, and Pavlos Pavlou. 2003. Assessing Young Learners. Oxford: Oxford University Press.
  23. Janisch, Carole, Xiaoming Liu, and Amma Akrofi. 2007. Implementing Alternative Assessment: Opportunities and Obstacles. The Educational Forum 71: 221–30.
  24. Karavas, Evdokia. 2014. Developing an online distance training programme for primary EFL teachers in Greece: Entering a brave new world. Research Papers in Language Teaching and Learning 5: 70–86.
  25. McKay, Penny. 2006. Assessing Young Language Learners. Cambridge: Cambridge University Press.
  26. Merriam, Sharan B. 2009. Qualitative Research: A Guide to Design and Implementation. San Francisco: Jossey-Bass.
  27. Mohmmed, Abdallelah O., Basim A. Khidhir, Abdul Nazeer, and Vigil J. Vijayan. 2020. Emergency remote teaching during Coronavirus pandemic: The current trend and future directive at Middle East College Oman. Innovative Infrastructure Solutions 5: 72.
  28. Ridgway, Jim, Sean McCusker, and Daniel Pead. 2004. Literature Review of E-Assessment. Bristol: NESTA Futurelab.
  29. Robson, Colin. 2002. Real World Research, 2nd ed. Oxford: Blackwell.
  30. Schleicher, Andreas. 2020. The Impact of COVID-19 on Education: Insights from Education at a Glance 2020. Paris: OECD. Available online: https://www.oecd.org/education/the-impact-of-covid-19-on-education-insights-education-at-a-glance-2020.pdf (accessed on 10 August 2021).
  31. Shaaban, A. Karim. 2001. Assessment of Young Learners. FORUM 39: 16–23.
  32. Shepard, Lorrie A. 2000. The Role of Assessment in a Learning Culture. Educational Researcher 29: 4–14.
  33. Simonson, Michael, Sharon E. Smaldino, Michael Albright, and Susan Zvacek. 2000. Assessment for Distance Education. In Teaching and Learning at a Distance: Foundations of Distance Education. Hoboken: Prentice-Hall, chap. 11.
  34. Tarhini, Ali, Kate Hone, and Xiaohui Liu. 2013. Extending the TAM model to empirically investigate the students’ behavioural intention to use e-learning in developing countries. Paper presented at the Science and Information Conference, London, UK, October 7–9; pp. 732–37.
  35. Taylor, Steven J., Robert Bogdan, and Marjorie DeVault. 2015. Introduction to Qualitative Research Methods: A Guidebook and Resource, 4th ed. London: John Wiley & Sons.
  36. Tsagari, Dina. 2011. Investigating the ‘assessment literacy’ of EFL state school teachers in Greece. In Classroom-Based Language Assessment. Edited by Dina Tsagari and Ildikó Csépes. Berlin: Peter Lang Verlag, pp. 169–90.
  37. Tsagari, Dina, and Christina Nicole Giannikas. 2017. To L1 or not to L1, that is the question: Research in young learners’ foreign language classroom. English Language Education Policies and Practices in the Mediterranean Countries and Beyond 11: 131–52.
  38. Tsagari, Dina, and Christina Nicole Giannikas. 2018. Re-evaluating the use of L1 in the Second Language Classroom: Students vs. teachers. Applied Linguistics Review 11: 151–81.
  39. Tsagari, Dina, and Egli Georgiou. 2016. Use of Mother Tongue in Second Language Learning: Voices and Practices in Private Language Education in Cyprus. Mediterranean Language Review 23: 101–26.
  40. Xu, Wen. 2021. Pedagogic Practices, Student Engagement and Equity in Chinese as a Foreign Language Education in Australia and Beyond, 1st ed. London: Routledge.
  41. Zhang, Jingshun, Eunice Jang, and Saad Chahine. 2021. A systematic review of cognitive diagnostic assessment and modeling through concept mapping. Frontiers of Contemporary Education 2: 10–16.
  42. Zhang, Renyi, Yixin Li, Annie L. Zhang, Yuan Wang, and Mario J. Molina. 2020. Identifying airborne transmission as the dominant route for the spread of COVID-19. Proceedings of the National Academy of Sciences of the United States of America 117: 14857–63.
Figure 1. Vocabulary and spelling progress during alternative assessment online/Case 1.
Figure 2. Vocabulary and spelling progress during alternative assessment online/Case 2.
Figure 3. Vocabulary and spelling progress during alternative assessment online/Case 3.
Figure 4. Vocabulary and spelling progress during alternative assessment online/Case 4.
Figure 5. Vocabulary and spelling progress during alternative assessment online/Case 5.
Figure 6. Vocabulary and spelling progress during alternative assessment online/Case 6.
Figure 7. Vocabulary and spelling progress during alternative assessment online/Case 7.
Figure 8. Vocabulary and spelling progress during alternative assessment online/Case 8.
Table 1. Vocabulary assessment activities.
Feature Assessed | Assessment Activity | Form of Execution
Self-Assessment | Quizzes | Synchronous and Asynchronous
Review of Vocabulary | Use of surroundings while on the online platform | Synchronous
Use of Vocabulary | Student recordings | Synchronous and Asynchronous
Vocabulary understanding | Misconception checks | Synchronous
Developing assessment and providing feedback skills | Peer Assessment | Synchronous
Table 2. Spelling assessment activities.
Feature Assessed | Assessment Activity | Form of Execution
Self-assessment and spelling logic | Spelling quizzes | Synchronous and Asynchronous
Memory recall and spelling aloud | Spelling online games | Synchronous
Word identification and spelling memory | Trace-write-remember | Asynchronous
Vocabulary understanding | Sound-it-out | Synchronous
Developing assessment and providing feedback skills | Peer assessment | Synchronous
Table 3. Illustration of emerging themes.
Themes | Illustrations | Data Collected
Responses | To assessment activities | Figures that emerged from automated feedback; instances noted during observations when automated feedback was not available
Efforts | To complete and succeed during the assessment process | Number of questions asked; number of possible responses; number of self-corrections
Accuracy | In spelling and vocabulary activities | Figures that emerged from automated feedback; instances noted during observations when automated feedback was not available
Participation | Willingness to participate | Instances noted during observations when automated feedback was not available
Peer empowerment | Working together/fair and valid peer assessment/peer encouragement | Instances noted in Observation Sheets; instances of requests from students to work together; instances of requests from students to work alone; number of activities/content/automated feedback/peer feedback according to teacher guidelines
Reactions to feedback | From teacher and peers | Instances noted in Observation Sheets; figures that emerged from automated feedback; instances noted during observations when automated feedback was not available
Changes | In the students’ approach to the spelling and vocabulary assessment activities | Instances noted in Observation Sheets
Table 4. Interview spelling activity.
Cases | ‘Lucky’ | ‘Ice-Cream’ | ‘Friends’ | ‘Helicopter’ | ‘Tunnel’ | ‘Out of’ | ‘London’ | ‘Tourist’
C1 | luky | ice-cream | freinds | helicopter | tunel | Out of | London | tourist
C2 | lucky | ice-cream | friends | helicopter | tunel | Out of | London | tourist
C3 | lucky | ice-cream | frends | helikopter | tunnel | Out of | London | turist
C4 | lucky | ice-cream | friends | helikopter | tunnel | Out of | London | tourist
C5 | luky | ice-cream | no response | no response | no response | Out of | Lon | turist
C6 | lucky | ice-cream | friends | helicopter | tunnel | Out of | London | turist
C7 | lucky | ice-cream | freinds | helicopter | tunel | Aut of | London | tourist
C8 | luky | ice-cream | friends | helikopter | tunnel | Out of | London | tourist
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
